Traditional Ones (traditional + ones)

Selected Abstracts


Biological lemon and sweet orange essential oil composition

FLAVOUR AND FRAGRANCE JOURNAL, Issue 6 2004
A. Verzera
Abstract The volatile fraction composition of sweet orange and lemon oils obtained using biological and traditional cultivation is reported. The oils came from Sicily and were industrially obtained. The aim of the research was to establish whether the use of pesticides in citrus cultivation could influence the essential oil composition. The volatile fraction was analysed by HRGC and HRGC-MS. The content of organophosphorus and organochlorine pesticides was determined by HRGC-FPD and HRGC-ECD. Differences in oil composition were found, especially in the content of carbonyl compounds; altogether, the results show that the biological oils are of higher quality, in terms of their composition, than traditional ones. Copyright © 2004 John Wiley & Sons, Ltd. [source]


Flexural Strength Evaluation of Nonconstant Thickness Ceramic Floorings by Means of the Finite-Element Method

INTERNATIONAL JOURNAL OF APPLIED CERAMIC TECHNOLOGY, Issue 2 2010
Beatriz Defez
The ceramic tile industry has become an extremely competitive sector. The entry of new Asian and South American manufacturers into the market is shifting the leadership in production and exports from the traditional clusters of Europe to China, Turkey, and Brazil. In this uncertain environment, enterprises should raise quality and cut costs through new products and processes. Ceramic tiles lightened by carving a deep back relief could give rise to a generation of new, efficient products. These tiles could be manufactured with less raw material than traditional ones, leading to savings in weight and energy. Additionally, a lighter final product improves working conditions on the shopfloor and at the building site. Nevertheless, lightened tiles are structurally different from traditional ones, and so is their mechanical behavior. Because tiles are constructive elements, it is necessary to know their response under typical loads and to ensure that they fulfil the applicable standards. This paper evaluates the flexural strength (R) of lightened ceramic floorings using solid three-dimensional modelling and the finite-element method, establishing a new formula for the application of the international standard ISO 10545 "Ceramic Tiles." To achieve this objective, one reference model and 48 different relief versions were designed and subjected to a simplified computational simulation of the bending test. In accordance with the Rankine criterion, the maximum stresses of each version, as well as their distribution, were calculated. Next, we correlated the results by defining a new parameter, "normalized thickness": the thickness that a carved tile should have to behave like a traditional flooring under flexion. This parameter allows the international standard ISO 10545 to be adapted to this kind of product, facilitating certification and therefore real introduction into the market. Finally, thanks to the collaboration of the company Keros Cerámica S. A., the methodology was verified to be appropriate. [source]
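
As context for the flexural strength values discussed above, a minimal sketch of the three-point bending relations used in ISO 10545-4 is given below; the load and tile dimensions are hypothetical, and the sketch does not reproduce the paper's finite-element models or its normalized-thickness correlation.

```python
# Minimal sketch of the ISO 10545-4 three-point bending relations.
# All numerical values below are hypothetical, not taken from the study.

def breaking_strength(load_n: float, span_mm: float, width_mm: float) -> float:
    """Breaking strength S = F * L / b, in newtons."""
    return load_n * span_mm / width_mm

def modulus_of_rupture(load_n: float, span_mm: float,
                       width_mm: float, thickness_mm: float) -> float:
    """Modulus of rupture (flexural strength) R = 3 * F * L / (2 * b * h^2), in N/mm^2."""
    return 3.0 * load_n * span_mm / (2.0 * width_mm * thickness_mm ** 2)

if __name__ == "__main__":
    F, L, b, h = 1500.0, 270.0, 300.0, 9.0   # hypothetical breaking load and tile geometry
    print(f"S = {breaking_strength(F, L, b):.0f} N")
    print(f"R = {modulus_of_rupture(F, L, b, h):.1f} N/mm^2")
```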


Texture-based parametric active contour for target detection and tracking

INTERNATIONAL JOURNAL OF IMAGING SYSTEMS AND TECHNOLOGY, Issue 3 2009
Ali Reza Vard
Abstract In recent years, active contour models (ACM) have been considered powerful tools for image segmentation and object tracking in computer vision and image processing applications. This article presents a new tracking method based on parametric active contour models. In the proposed method, a new pressure energy, called "texture pressure energy," is added to the energy function of the parametric active contour model to detect and track a textured target object against a textured background. In this scheme, the texture features of the contour are calculated by a moment-based method. Then, by comparing these features with the texture features of the target object, the contour curve is expanded or contracted to adapt to the object boundaries. Experimental results show that the proposed method is more efficient and accurate in tracking objects than traditional methods when both the object and the background are textured. © 2009 Wiley Periodicals, Inc. Int J Imaging Syst Technol, 19, 187-198, 2009 [source]
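
The abstract does not spell out the exact energy terms, so the following is only an illustrative sketch of the general idea: each contour point is pushed along its normal, with the expand/contract decision taken by comparing simple moment-based texture statistics of a local patch against those of the target object. Patch size, feature choice, and thresholds are assumptions, not the authors' implementation.

```python
import numpy as np

# Illustrative sketch only: a pressure-driven contour update in which the
# expand/contract decision depends on moment-based texture statistics.

def patch_moments(image, y, x, half=7):
    """First two moments (mean, variance) of a square patch around (y, x)."""
    h, w = image.shape
    y0, y1 = max(0, y - half), min(h, y + half + 1)
    x0, x1 = max(0, x - half), min(w, x + half + 1)
    patch = image[y0:y1, x0:x1].astype(float)
    return np.array([patch.mean(), patch.var()])

def pressure_step(image, contour, target_moments, step=1.0, tol=0.15):
    """Move each contour point along its normal: outward if the local texture
    matches the target (relative moment distance < tol), inward otherwise.
    `contour` is a float (N, 2) array of (y, x) points, assumed to be ordered
    so that the computed normal points outward."""
    new_contour = contour.copy()
    n = len(contour)
    for i in range(n):
        prev_pt, next_pt = contour[(i - 1) % n], contour[(i + 1) % n]
        tangent = next_pt - prev_pt
        normal = np.array([tangent[1], -tangent[0]])
        norm = np.linalg.norm(normal)
        if norm == 0:
            continue
        normal /= norm
        y, x = np.round(contour[i]).astype(int)
        m = patch_moments(image, y, x)
        dist = np.linalg.norm(m - target_moments) / (np.linalg.norm(target_moments) + 1e-9)
        direction = 1.0 if dist < tol else -1.0   # expand over target texture, contract elsewhere
        new_contour[i] = contour[i] + direction * step * normal
    return new_contour
```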


H-methods in applied sciences

JOURNAL OF CHEMOMETRICS, Issue 3-4 2008
Agnar Höskuldsson
Abstract The author has developed a framework for mathematical modelling within applied sciences. It is characteristic of data from 'nature and industry' that they have reduced rank for inference, meaning that full-rank solutions normally do not give satisfactory results. The basic idea of H-methods is to build up the mathematical model in steps by using weighing schemes. Each weighing scheme produces a score and/or a loading vector that is expected to perform a certain task. Optimisation procedures are used to obtain 'the best' solution at each step; at each step, the optimisation is concerned with finding a balance between the estimation task and the prediction task. The name H-methods has been chosen because of a close analogy with the Heisenberg uncertainty inequality, since a similar trade-off is present in modelling data. The mathematical modelling stops when the prediction aspect of the model cannot be improved. H-methods have been applied to a wide range of fields within applied sciences, and in each case they provide superior solutions compared with the traditional ones. A background for the H-methods is presented. The H-principle of mathematical modelling is explained, and it is shown how the principle leads to well-defined optimisation procedures. This is illustrated in the case of linear regression. The H-methods have been applied in different areas: general linear models, nonlinear models, multi-block methods, path modelling, multi-way data analysis, growth models, dynamic models and pattern recognition. Copyright © 2008 John Wiley & Sons, Ltd. [source]
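
The H-methods themselves are only described verbally here. As a rough illustration of "building the model in steps using weighing schemes," the sketch below runs a PLS1-style sequence of steps (weight vector, score, loading, deflation), which shares the stepwise structure the abstract describes; it is not Höskuldsson's H-method itself, and under the H-principle the number of steps would be chosen by stopping when the prediction aspect (e.g., cross-validated error) no longer improves rather than being fixed in advance.

```python
import numpy as np

def stepwise_components(X, y, n_steps=3):
    """PLS1-style illustration of stepwise modelling with weight vectors.
    Each step: a weight vector w from the covariance X'y (the weighing
    scheme), a score t = Xw, a loading p, a regression coefficient q,
    then deflation of X and y before the next step."""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    W, T, P, Q = [], [], [], []
    for _ in range(n_steps):
        w = Xc.T @ yc
        w /= np.linalg.norm(w)            # weighing scheme for this step
        t = Xc @ w                        # score vector
        p = Xc.T @ t / (t @ t)            # loading vector
        q = yc @ t / (t @ t)              # contribution to prediction
        Xc = Xc - np.outer(t, p)          # deflate: remove the explained part
        yc = yc - q * t
        W.append(w); T.append(t); P.append(p); Q.append(q)
    return np.array(W), np.array(T), np.array(P), np.array(Q)
```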


SIMULATION OF THIN-FILM DEODORIZERS IN PALM OIL REFINING

JOURNAL OF FOOD PROCESS ENGINEERING, Issue 2010
ROBERTA CERIANI
ABSTRACT As the need for healthier fats and oils (natural vitamin and trans fat contents) and the interest in biofuels grow, many changes in the world's vegetable oil market are driving the oil industry to develop new technologies and to recycle traditional ones. Computational simulation is widely used in the chemical and petrochemical industries as a tool for the optimization and design of (new) processes, but that is not the case in the edible oil industry. Thin-film deodorizers are novel equipment developed for the steam deacidification of vegetable oils, and no work on the simulation of this type of equipment could be found in the open literature. This paper tries to fill that gap by presenting results on the effect of processing variables, such as temperature, pressure and percentage of stripping steam, on the final quality of the product (deacidified palm oil) in terms of final oil acidity, tocopherol content and neutral oil loss. The simulation results were evaluated using response surface methodology. The model generated by the statistical analysis for tocopherol retention was validated by matching its results with industrial data published in the open literature. PRACTICAL APPLICATIONS This work is a continuation of our previous works (Ceriani and Meirelles 2004a, 2006; Ceriani et al. 2008) dealing with the simulation of continuous deodorization and/or steam deacidification for a variety of vegetable oils using stage-wise columns and analyzing both countercurrent and cross-flow patterns. In this work, we have studied thin-film deodorizers, novel equipment developed for the steam deacidification of vegetable oils. Here, we highlight issues related to final oil product quality and the corresponding process variables. [source]
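
The fitted response-surface models are not reproduced in the abstract; the sketch below only shows the generic form of such an analysis, fitting a full quadratic surface in temperature, pressure, and stripping-steam percentage to a handful of hypothetical simulated responses (none of the numbers come from the paper, and a real design, such as a central composite design, would contain more points).

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

# Hypothetical design points: temperature (C), pressure (kPa), stripping steam (%).
X = np.array([[240, 0.3, 1.0], [250, 0.3, 2.0], [260, 0.4, 1.5],
              [255, 0.5, 2.5], [245, 0.4, 2.0], [265, 0.5, 1.0],
              [240, 0.5, 2.5], [265, 0.3, 2.5], [250, 0.4, 1.5],
              [260, 0.3, 1.0]])
# Hypothetical simulated tocopherol retention (%); not the paper's data.
y = np.array([88.0, 85.5, 82.0, 79.5, 84.0, 80.5, 83.0, 78.5, 84.5, 82.5])

# Full quadratic response surface: linear, interaction and squared terms.
quad = PolynomialFeatures(degree=2, include_bias=False)
model = LinearRegression().fit(quad.fit_transform(X), y)

new_point = np.array([[252, 0.35, 1.8]])
print("Predicted retention:", model.predict(quad.transform(new_point))[0])
```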


A theoretical explanation for the retention mechanism of ion exclusion chromatography

JOURNAL OF SEPARATION SCIENCE, JSS, Issue 17 2003
Bronisław K. Głód
Abstract Ion exclusion chromatography is classically used for the separation of weak acid anions. Dilute strong acids (e.g. sulphuric or perchloric acid), or just water, are used as eluents. To increase the exclusion effect, strong cation exchangers characterized by a high concentration of functional groups are applied. The inner column volume of commercially available columns is increased by making them larger than traditional ones (usually 300×7.8 mm ID). The description of the retention mechanism of this technique implicitly assumes that both the mobile and stationary phases are typical aqueous solutions and that their dielectric constants are therefore equal; this equality implies the equality of solute dissociation constants in both phases. Another implicit assumption is that the dead and inner volumes of the column are constant and independent of the mobile phase composition. The present paper shows that the stationary and mobile phases are generally characterized by different physicochemical parameters, so they cannot be considered regular aqueous solutions. Additionally, we show that weak cation exchanger resins, which are characterized by a relatively low concentration of functional groups, and weak-acid-based buffers can also be used in IEC. This expands the possible applications of the method and enables, for example, the separation of strong acids (anions). The influence of ionic strength on the retention and on the dead and inner column volumes is also discussed. Finally, we also briefly describe the retention mechanism of Electrostatic Ion Chromatography. [source]
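
For orientation, the classical textbook description that the paper refines relates retention in IEC to the fraction of undissociated acid able to enter the resin phase; a minimal sketch of that simplified relation follows, with the paper's corrections (unequal dielectric constants, ionic-strength-dependent dead and inner volumes) deliberately left out.

```python
def iec_retention_volume(pka: float, ph: float, v_dead: float, v_inner: float) -> float:
    """Classical (simplified) ion-exclusion retention model:
    V_R = V_d + alpha_HA * V_i, where alpha_HA = 1 / (1 + 10**(pH - pKa))
    is the fraction of the acid present in the undissociated, permeating form.
    The paper argues the two phases are not identical aqueous solutions,
    so this simple form is only a first approximation."""
    alpha_ha = 1.0 / (1.0 + 10.0 ** (ph - pka))
    return v_dead + alpha_ha * v_inner

# Hypothetical dead and inner volumes (mL) for a 300 x 7.8 mm ID column;
# acetic acid (pKa ~ 4.76) eluted at pH 3.0.
print(iec_retention_volume(pka=4.76, ph=3.0, v_dead=5.0, v_inner=8.0))
```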


The effects of spacing and titles on judgments of the effectiveness of structured abstracts

JOURNAL OF THE AMERICAN SOCIETY FOR INFORMATION SCIENCE AND TECHNOLOGY, Issue 14 2007
James Hartley
Previous research assessing the effectiveness of structured abstracts has been limited in two respects. First, when comparing structured abstracts with traditional ones, investigators usually have rewritten the original abstracts, and thus confounded changes in the layout with changes in both the wording and the content of the text. Second, investigators have not always included the title of the article together with the abstract when asking participants to judge the quality of the abstracts, yet titles alert readers to the meaning of the materials that follow. The aim of this research was to redress these limitations. Three studies were carried out. Four versions of each of four abstracts were prepared. These versions consisted of structured/traditional abstracts matched in content, with and without titles. In Study 1, 64 undergraduates each rated one of these abstracts on six separate rating scales. In Study 2, 225 academics and research workers rated the abstracts electronically, and in Study 3, 252 information scientists did likewise. In Studies 1 and 3, the respondents rated the structured abstracts significantly more favorably than they did the traditional ones, but the presence or absence of titles had no effect on their judgments. In Study 2, no main effects were observed for structure or for titles. The layout of the text, together with the subheadings, contributed to the higher ratings of effectiveness for structured abstracts, but the presence or absence of titles had no clear effects in these experimental studies. It is likely that this spatial organization, together with the greater amount of information normally provided in structured abstracts, explains why structured abstracts are generally judged to be superior to traditional ones. [source]


Unmodified and Modified Surface Sisal Fibers as Reinforcement of Phenolic and Lignophenolic Matrices Composites: Thermal Analyses of Fibers and Composites

MACROMOLECULAR MATERIALS & ENGINEERING, Issue 4 2006
Jane Maria Faulstich de Paiva
Abstract The study and development of polymeric composite materials, especially those using lignocellulosic fibers, have received increasing attention. This is interesting from the environmental and economic viewpoints, as lignocellulosic fibers are obtained from renewable resources. This work aims to help reduce dependency on materials from nonrenewable sources by utilizing natural fibers (sisal) as reinforcing agents and lignin (a polyphenolic macromolecule obtained from lignocellulosic materials) as a partial substitute for phenol in a phenol-formaldehyde resin. In addition, the work evaluates how modifications applied to sisal fibers influence their properties and those of the composites reinforced with them, mainly their thermal properties. Sisal fibers were modified by (i) mercerization (NaOH 10%), (ii) esterification (succinic anhydride), or (iii) ionized air treatment (discharge current of 5 mA). Composites were made by mould compression of the various sisal fibers in combination with either phenol-formaldehyde or lignin-phenol-formaldehyde resins. Sisal fibers and composites were characterized by thermogravimetry (TG) and DSC to establish their thermal stability. Scanning electron microscopy (SEM) was used to investigate the morphology of the unmodified and surface-modified sisal fibers as well as the fractured composite surfaces. Dynamic mechanical thermal analysis (DMTA) was used to examine the influence of temperature on the composites' mechanical properties. The results obtained for sisal fiber-reinforced phenolic and lignophenolic composites show that using lignin as a partial substitute for phenol in phenolic resins is feasible in applications other than the traditional ones, for instance in uses other than adhesives. [Figure: micrograph of the impact fracture surface of a phenolic composite reinforced with mercerized sisal fiber (500×).] [source]


The Needs and Benefits of Applying Textual Data Mining within the Product Development Process

QUALITY AND RELIABILITY ENGINEERING INTERNATIONAL, Issue 1 2004
Rakesh Menon
Abstract As a result of growing competition in recent years, new trends such as increased product complexity, changing customer requirements and shortening development times have emerged within the product development process (PDP). These trends have added more challenges to the already difficult task of quality and reliability prediction and improvement, and they have given rise to an increase in the number of unexpected events in the PDP. Traditional tools are only partially adequate to cover these unexpected events, so new tools are being sought to complement them. This paper investigates the use of one such tool, textual data mining, for the purpose of quality and reliability improvement. The motivation for this paper stems from the need to handle 'loosely structured textual data' within the product development process. Thus far, most studies on data mining within the PDP have focused on numerical databases. In this paper, the need for the study of textual databases is established. Possible areas within a generic PDP for consumer and professional products where textual data mining could be employed are highlighted. In addition, successful implementations of textual data mining within two large multi-national companies are presented. Copyright © 2003 John Wiley & Sons, Ltd. [source]


A serum metabolomic investigation on hepatocellular carcinoma patients by chemical derivatization followed by gas chromatography/mass spectrometry

RAPID COMMUNICATIONS IN MASS SPECTROMETRY, Issue 19 2008
Ruyi Xue
The purpose of this study was to investigate the serum metabolic difference between hepatocellular carcinoma (HCC, n,=,20) male patients and normal male subjects (n,=,20). Serum metabolome was detected through chemical derivatization followed by gas chromatography/mass spectrometry (GC/MS). The acquired GC/MS data was analyzed by stepwise discriminant analysis (SDA) and support vector machine (SVM). The metabolites including butanoic acid, ethanimidic acid, glycerol, L-isoleucine, L-valine, aminomalonic acid, D-erythrose, hexadecanoic acid, octadecanoic acid, and 9,12-octadecadienoic acid in combination with each other gave the strongest segregation between the two groups. By applying these variables, our method provided a diagnostic model that could well discriminate between HCC patients and normal subjects. More importantly, the error count estimate for each group was 0%. The total classifying accuracy of the discriminant function tested by SVM 20-fold cross validation was 75%. This technique is different from traditional ones and appears to be a useful tool in the area of HCC diagnosis. Copyright © 2008 John Wiley & Sons, Ltd. [source]