Raw Data

Selected Abstracts


Prenatal and early postnatal morphogenesis and growth of human laryngotracheal structures

JOURNAL OF ANATOMY, Issue 2 2008
Pierre Fayoux
Abstract: Advances in neonatal medicine have led to increased management of fetal and neonatal airways. These advances require an exhaustive knowledge of fetal airway anatomy and development. The aim of this study was to characterize the anatomical development of laryngotracheal structures during the fetal and immediate postnatal period and to correlate these observations with other fetal biometric parameters so as to identify developmental particularities of the fetal airway. A prospective anatomical study was based on examination of the larynx and trachea from 300 routine autopsies of fetuses and infants free of malformation and never intubated. Anatomical measurements of the cricoid cartilage, thyroid cartilage, glottis, arytenoid cartilage and trachea were performed using a precision calliper and precision divider. Statistical analysis was performed to describe the growth of anatomical structures and to evaluate correlations with biometric data. Raw data and 10th and 90th percentile curves were fitted satisfactorily by a linear model in gestational age. A linear relationship between laryngotracheal measurements and body weight and height was observed, except for glottis length, interarytenoid distance and anterior cricoid height. The diameter of the cricoid lumen was significantly smaller than that of the trachea and glottis lumen. Sexual dimorphism was noted for thyroid cartilage measurements and interarytenoid distance, with measurements significantly smaller in females. This study reports the anatomical development of normal laryngotracheal structures during the fetal period. Although this study was performed on postmortem material, these observations can be useful for developing criteria, materials and surgical procedures adapted to fetal and neonatal airways, as well as for the early diagnosis and management of laryngotracheal malformations.
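The percentile-curve fitting described here can be sketched in a few lines. The snippet below is illustrative only: it fits a linear model to synthetic measurement-versus-gestational-age data, plus linear 10th/90th percentile curves from binned empirical percentiles. All variable names and values are invented, not the study's.

```python
# A minimal sketch of linear growth curves with 10th/90th percentile bands,
# assuming synthetic data; parameter values are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
ga = rng.uniform(20, 42, 300)                 # gestational age (weeks)
depth = 0.25 * ga + rng.normal(0, 0.5, 300)   # e.g. a laryngeal measurement (mm)

# Linear fit to the raw data (least squares).
slope, intercept = np.polyfit(ga, depth, 1)

# Empirical 10th/90th percentiles in gestational-age bins, each fitted linearly.
bins = np.arange(20, 44, 2)
mid, p10, p90 = [], [], []
for lo, hi in zip(bins[:-1], bins[1:]):
    sel = (ga >= lo) & (ga < hi)
    if sel.sum() >= 5:
        mid.append((lo + hi) / 2)
        p10.append(np.percentile(depth[sel], 10))
        p90.append(np.percentile(depth[sel], 90))

c10 = np.polyfit(mid, p10, 1)   # linear model for the 10th-percentile curve
c90 = np.polyfit(mid, p90, 1)   # linear model for the 90th-percentile curve
print(f"raw fit: {slope:.3f}*GA + {intercept:.3f}")
print(f"P10 slope: {c10[0]:.3f}, P90 slope: {c90[0]:.3f}")
```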


An Assessment of British Science over the Twentieth Century

THE ECONOMIC JOURNAL, Issue 538 2009
Bruce A. Weinberg
The twentieth century saw dramatic international shifts in scientific leadership. Despite these shifts, Britain's position has been remarkably stable and strong. I study these changes using data on Nobel laureates in Chemistry, Medicine, and Physics. Raw data show a slight decline in British science, mainly in physics, but once one accounts for the tremendous increase in the US, British science actually shows strong growth. I show that raw data, and data adjusted for population and gross domestic product (per capita or total), consistently rank Britain as one of the top scientific performers.


ORIGINAL ARTICLE: How Should Data on Murine Spontaneous Abortion Rates be Expressed and Analyzed?

AMERICAN JOURNAL OF REPRODUCTIVE IMMUNOLOGY, Issue 3 2008
David A. Clark
Problem: Spontaneous abortions in the CBA × DBA/2 model are normally reported as the number of resorptions/total number of implantations (R/T), pooling data from individual mice. The significance of differences between groups has been determined using non-parametric statistics (e.g. chi-square or Fisher's exact test) based on a priori predictions. Recently, it has been argued that medians with box plots should replace this accepted standard, but doing so deprives readers of the data needed to verify P-values and leads to inferences incompatible with biological and statistical reality. Method of study: Raw data on 173 individual CBA × DBA/2 matings were analyzed by median and mean, along with R/T data from 18 independent experiments containing 5–10 mice per group. Raw data from 19 CBA × BALB/c matings were similarly analyzed. Results: Individual CBA × DBA/2 mouse resorption rates showed a non-Gaussian distribution, but the mean and median differed by <0.5%. Resorption data from 6 and 12 independent pools of mice were normally distributed. Only the mean enabled a between-group P-value calculation. CBA × BALB/c matings gave a median of 0 and a mean of 5.1%; the data were not normally distributed, but this was because of a bimodal distribution: one group of mice had 0 abortions, the second a mean of 13.9% abortions, and the data from the latter group were normally distributed. Conclusion: Although it is possible to compare individual mice, and even individual implantation sites, in resorption (abortion) studies, the relevant questions are the significance of differences between treatment groups of mice and reproducibility, so the established classical method of reporting R/T should continue to be provided. In CBA × BALB/c matings, where abortion rates are low, using the median is misleading and may obscure the existence of two distinct populations.
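The classical analysis the author defends, pooled R/T compared between groups with Fisher's exact test, looks like this in miniature; the counts below are invented placeholders, not data from the paper.

```python
# Hedged sketch: pooled resorptions/total implantations (R/T) per group and
# Fisher's exact test between groups. Counts are hypothetical.
from scipy.stats import fisher_exact

# (resorptions, total implantations) pooled over mice in each group
control = (25, 160)   # hypothetical CBA x DBA/2 control group
treated = (8, 150)    # hypothetical treated group

table = [
    [control[0], control[1] - control[0]],   # resorbed vs viable, control
    [treated[0], treated[1] - treated[0]],   # resorbed vs viable, treated
]
odds_ratio, p = fisher_exact(table)
print(f"control R/T = {control[0]/control[1]:.1%}, "
      f"treated R/T = {treated[0]/treated[1]:.1%}, P = {p:.4f}")
```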


Multivariate analysis of leaf shape patterns in Asian species of the Uvaria group (Annonaceae)

BOTANICAL JOURNAL OF THE LINNEAN SOCIETY, Issue 3 2003
CONOR MEADE
Multivariate analysis of leaf radian measurements was used to investigate variation in leaf shape among 34 Asian species of the Uvaria group, a large palaeotropical group of climbing Annonaceae characterized by imbricate petals and stellate hairs. Raw data were normalized by conversion into 15 ratio characters and by log10 transformation. All species surveyed showed a unique leaf-shape 'bauplan'. The ratio character with the greatest discriminating power in both the principal components analysis (PCA) and discriminant analysis (DA) results was a measure of the shape of the leaf base. Ratio characters with the highest factor loadings for principal components 1 and 2 clearly separated the sampled taxa when plotted against one another and provided support for the retention of several taxa as distinct species or varieties. Classification of cases into taxa using DA yielded a correct classification rate of only 52% for the ratio-transformed data; however, division of taxa in the dataset into smaller subgroups defined by discrete morphological characters significantly increased the accuracy of case identification, to between 67 and 100% of cases correctly classified, depending on the group. Case identification using DA on log10-transformed data was higher than for the ratio values in the entire dataset (61.7%) and the larger subgroups; however, the rate of correct case assignment was lower in the smaller groups than for the ratio data. © 2003 The Linnean Society of London, Botanical Journal of the Linnean Society, 2003, 143, 231–242.
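A hedged sketch of this preprocessing and classification pipeline (ratio characters, log10 transform, PCA, DA) follows; the leaf measurements and taxon labels are simulated stand-ins, not the study's data.

```python
# Sketch of the ratio/log10 preprocessing and PCA + DA classification,
# assuming a matrix of raw leaf measurements; all data are synthetic.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
raw = rng.uniform(1, 10, size=(170, 16))          # 16 raw leaf measurements
species = rng.integers(0, 5, 170)                 # 5 hypothetical taxa

ratios = raw[:, 1:] / raw[:, [0]]                 # 15 ratio characters
logged = np.log10(raw)                            # log10 transformation

pca = PCA(n_components=2).fit(ratios)
print("PC1/PC2 explained variance:", pca.explained_variance_ratio_)

for name, X in [("ratio", ratios), ("log10", logged)]:
    rate = cross_val_score(LinearDiscriminantAnalysis(), X, species, cv=5).mean()
    print(f"{name} DA correct classification rate: {rate:.1%}")
```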


Towards closing the analysis gap: Visual generation of decision supporting schemes from raw data

COMPUTER GRAPHICS FORUM, Issue 3 2008
T. May
Abstract: The derivation, manipulation and verification of analytical models from raw data is a process that requires a transformation of information across different levels of abstraction. We introduce a concept for the coupling of data classification and interactive visualization in order to make this transformation visible and steerable by the human user. Data classification techniques generate mappings that formally group data items into categories. Interactive visualization includes the user in an iterative refinement process: the user identifies and selects interesting patterns to define these categories. The following step is the transformation of a visible pattern into the formal definition of a classifier. In the last step the classifier is transformed back into a pattern that is blended with the original data in the same visual display. Our approach allows an intuitive assessment of a formal classifier and its model, the detection of outliers and the handling of noisy data using visual pattern-matching. We instantiated the concept using decision trees for classification and KVMaps as the visualization technique. The generation of a classifier from visual patterns and its verification is thereby transformed from a cognitive to a mostly pre-cognitive task.
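A minimal sketch of the classification half of this loop, on synthetic data: a user-brushed pattern becomes training labels for a decision tree, and the tree's predictions are compared back against the selection. This is an illustration of the idea, not the authors' system.

```python
# User-selected points define categories, a decision tree is trained, and its
# predictions are fed back for visual comparison. Data are synthetic.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(2)
data = rng.normal(size=(500, 2))                 # items in a 2-D visual display
selected = data[:, 0] + data[:, 1] > 1           # pattern the user brushed

tree = DecisionTreeClassifier(max_depth=3).fit(data, selected)

# Blend the classifier back into the display: compare its decision regions
# with the user's selection to spot outliers and noisy points.
predicted = tree.predict(data)
disagreement = predicted != selected
print(f"{disagreement.sum()} items where classifier and selection disagree")
```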


Liquid-based cytology: is this the way forward for cervical screening?

CYTOPATHOLOGY, Issue 2 2002
R. P. MOSELEY
Liquid-based cytology (LBC) is currently being marketed as an alternative methodology to replace the conventional Pap smear in cervical cytology. A substantial body of literature exists in support of LBC, some of which is at least partially sponsored by product manufacturers. The majority of published literature in support of LBC employs Bethesda reporting terminology. In this study we have analysed published raw data and presented them in NHSCSP terminology. Claims relating to sensitivity, specificity and smear adequacy have then been considered with reference to these data. Our analysis of existing data does not support the nationwide implementation of LBC at present. Further studies are recommended in order to evaluate the place of this technology within the NHSCSP.


Evaluation of different warping methods for the analysis of CE profiles of urinary nucleosides

ELECTROPHORESIS, Issue 16 2007
Ewa Szyma
Abstract: Nowadays, numerous metabolite concentrations can readily be determined in a given biological sample by high-throughput analytical methods. However, such raw analytical data comprise noninformative components due to the many disturbances normally occurring in analyses of biological material. To eliminate those unwanted components of the original analytical data, advanced chemometric preprocessing methods can help. Here, such methods are applied to electrophoretic nucleoside profiles in urine samples of cancer patients and healthy volunteers. In this study, three warping methods were examined on two sets of electrophoretic data: dynamic time warping (DTW), correlation optimized warping (COW), and parametric time warping (PTW). They were compared in terms of quality of peak alignment, preprocessing time, and ease of customization. The application of warping methods helped to limit peak shifting and enabled objective differentiation between whole electropherograms of healthy subjects and cancer patients by principal component analysis (PCA). The evaluation of preprocessed and raw data by PCA confirms the differences between the applied warping tools and proves their suitability for metabonomic data interpretation.
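Of the three methods, DTW is the simplest to sketch. Below is a toy implementation aligning two synthetic electropherogram peaks; it is not the authors' code, and COW and PTW would need their own routines.

```python
# A compact dynamic time warping (DTW) sketch on two synthetic signals.
import numpy as np

def dtw(a, b):
    """Return the DTW distance and accumulated-cost matrix for 1-D signals."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m], D[1:, 1:]

t = np.linspace(0, 1, 200)
reference = np.exp(-((t - 0.50) / 0.02) ** 2)    # reference peak
shifted   = np.exp(-((t - 0.55) / 0.02) ** 2)    # same peak, shifted in time

dist, _ = dtw(reference, shifted)
print(f"DTW distance between shifted profiles: {dist:.3f}")
```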


Diagnosis Clusters for Emergency Medicine

ACADEMIC EMERGENCY MEDICINE, Issue 12 2003
Debbie A. Travers RN
Objectives: Aggregated emergency department (ED) data are useful for research, ED operations, and public health surveillance. Diagnosis data are widely available as International Classification of Diseases, 9th revision, Clinical Modification (ICD-9-CM) codes; however, there are over 24,000 ICD-9-CM code-descriptor pairs. Standardized groupings (clusters) of ICD-9-CM codes have been developed by other disciplines, including family medicine (FM), internal medicine (IM), inpatient care (Agency for Healthcare Research and Quality [AHRQ]), and vital statistics (National Center for Health Statistics [NCHS]). The purpose of this study was to evaluate the coverage of four existing ICD-9-CM cluster systems for emergency medicine. Methods: In this descriptive study, the four cluster systems were used to group ICD-9-CM final diagnosis data from a southeastern university tertiary referral center. Included were diagnoses for all ED visits in July 2000 and January 2001. In the comparative analysis, the authors determined the coverage of the four cluster systems, defined as the proportion of final diagnosis codes that were placed into clusters, and the frequencies of diagnosis codes in each cluster. Results: The final sample included 7,543 visits with 19,530 diagnoses. Coverage of the ICD-9-CM codes in the ED sample was: AHRQ, 99%; NCHS, 88%; FM, 71%; IM, 68%. Seventy-six percent of the AHRQ clusters were small, defined as grouping <1% of the diagnosis codes in the sample. Conclusions: The AHRQ system provided the best coverage of ED ICD-9-CM codes. However, most of the clusters were small and not significantly different from the raw data.
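The coverage statistic reduces to a lookup against a code-to-cluster table; the sketch below uses a tiny hypothetical mapping in place of the real AHRQ/NCHS/FM/IM systems.

```python
# Coverage = proportion of ED diagnosis codes that fall into any cluster.
# The mapping here is a tiny hypothetical stand-in for a real cluster table.
cluster_system = {                # ICD-9-CM code -> cluster label (invented)
    "786.50": "Chest pain",
    "845.00": "Sprains and strains",
    "780.60": "Fever",
}

ed_diagnoses = ["786.50", "845.00", "780.60", "401.9", "786.50"]  # visit codes

clustered = [code for code in ed_diagnoses if code in cluster_system]
coverage = len(clustered) / len(ed_diagnoses)
print(f"coverage: {coverage:.0%}")               # 4 of 5 codes -> 80%
```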


Wasted fishery resources: discarded by-catch in the USA

FISH AND FISHERIES, Issue 4 2005
Jennie M Harrington
Abstract: Fishery by-catch, especially discarded by-catch, is a serious problem in the world's oceans. Not only are the stocks of discarded species affected, but entire trophic webs and habitats may be disrupted at the ecosystem level. This paper reviews discarding in the marine fisheries of the USA; however, the type, diversity and regulatory mechanisms of these fisheries are similar to those of developed fisheries and management programmes throughout the world. We have compiled current estimates of discarded by-catch for each major marine fishery in the USA using estimates from the existing literature, both published and unpublished. We did not re-estimate discards or discard rates from raw data, nor did we include data on protected species (turtles, mammals and birds), so this study covers discarded by-catch of finfish and fishable invertebrates. For some fisheries, additional calculations were required to transform number data into weight data, typically using length and weight composition data. Specific data for each fishery are referenced in Harrington et al. (Wasted Resources: Bycatch and Discards in US Fisheries, Oceana, Washington, DC, 2005). Overall, our compiled estimates are that 1.06 million tonnes of fish were discarded and 3.7 million tonnes of fish were landed in USA marine fisheries in 2002. This amounts to a nationwide discard-to-landings ratio of 0.28, amongst the highest in the world. Regionally, the southeast had the largest discard-to-landings ratio (0.59), followed closely by the highly migratory species fisheries (0.52) and the northeast fisheries (0.49). The Alaskan and west coast fisheries had the lowest ratios (0.12 and 0.15, respectively). Shrimp fisheries in the southeast were the major contributors to the high discard rate in that region, with discard ratios of 4.56 (Gulf of Mexico) and 2.95 (South Atlantic). By-catch and discarding are a major component of the impact of fisheries on marine ecosystems. There have been substantial efforts to reduce by-catch in some fisheries, but broadly based programmes covering all fisheries are needed within the USA and around the world. In response to international agreements to improve fishery management, by-catch and discard reduction must become a regular part of fishery management planning.
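The headline ratio is simple arithmetic on the reported totals; note that the rounded inputs give 0.29, while the reported 0.28 presumably reflects unrounded figures.

```python
# Discard-to-landings arithmetic using the totals reported in the abstract
# (US marine fisheries, 2002).
discards_t = 1.06e6   # tonnes discarded
landings_t = 3.7e6    # tonnes landed

ratio = discards_t / landings_t
# Prints 0.29 from these rounded inputs; 0.28 was reported from unrounded data.
print(f"nationwide discard-to-landings ratio: {ratio:.2f}")
```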


Use of pharmacokinetics in the coagulation factor treatment of patients with haemophilia

HAEMOPHILIA, Issue 6 2005
A. D. Shapiro
Summary: Dosing decisions for replacement coagulation factors in patients with haemophilia should be made on an individual patient basis, with the required dose dependent on factors including the clinical situation, the severity of the factor deficiency, and the location and extent of bleeding. Moreover, there is considerable variability in the pharmacokinetics of coagulation products that needs to be considered; in particular, with both factor (F) IX and FVIII products, there is considerable inter-patient variability in in vivo recovery and terminal half-life values. In the present report, we provide a practical guide to calculating and applying pharmacokinetic parameters relevant to the optimal dosing of coagulation products. We discuss the conduct of a pharmacokinetic study in an individual patient, how to calculate pharmacokinetic values from raw data, and clinical situations where an individual pharmacokinetic study is helpful. We highlight the importance of considering an individual pharmacokinetic study in all patients starting a new coagulation product.
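Two of the calculations such a guide covers, in vivo recovery and terminal half-life, can be sketched as follows. The formulas are the standard textbook ones, but the dose and plasma levels below are invented.

```python
# Hedged sketch of in vivo recovery and terminal half-life from
# post-infusion factor activity levels. Sample values are invented.
import numpy as np

dose_iu_per_kg = 50.0
baseline, peak = 0.02, 1.05          # factor activity (IU/mL) pre/post infusion

# In vivo recovery expressed as (IU/dL rise) per (IU/kg infused).
recovery = (peak - baseline) * 100 / dose_iu_per_kg
print(f"in vivo recovery: {recovery:.2f} IU/dL per IU/kg")

# Terminal half-life from a log-linear fit to the terminal phase.
t = np.array([12, 24, 36, 48])       # hours post-infusion
c = np.array([0.45, 0.22, 0.11, 0.055])
k = -np.polyfit(t, np.log(c), 1)[0]  # elimination rate constant (1/h)
print(f"terminal half-life: {np.log(2) / k:.1f} h")
```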


Testing for an economic gradient in health status using subjective data

HEALTH ECONOMICS, Issue 11 2008
Michael Lokshin
Abstract: Can self-assessments of health reveal the true health differentials between 'rich' and 'poor'? The potential sources of bias include psychological adaptation to ill-health, socioeconomic covariates of health reporting errors and income measurement errors. We propose an estimation method to reduce the bias by isolating the component of self-assessed health that is explicable in terms of objective health indicators and by allowing for broader dimensions of economic welfare than are captured by current incomes. On applying our method to survey data for Russia, we find a pronounced (nonlinear) economic gradient in health status that is not evident in the raw data. This is largely attributable to the health effects of age, education and location. Copyright © 2008 John Wiley & Sons, Ltd.


Measurement sampling and scaling for deep montane snow depth data

HYDROLOGICAL PROCESSES, Issue 4 2006
S. R. Fassnacht
Abstract: The resolution of snow depth measurements was scaled from a nominal horizontal resolution of approximately 1.5 m to 3, 5, 10, 20, and 30 m using averaging (AVG) and resampling with a uniform random stratified sampling (RSS) scheme. The raw snow depth values were computed from airborne light detection and ranging data by differencing summer elevation measurements from winter snow surface elevations. Three montane study sites from the NASA Cold Land Processes Experiment, each covering an 1100 m × 1100 m area, were used. To examine scaling, log-log semi-variograms with 50 log-width bins were created for both subsetting methods, i.e. RSS and AVG. From the raw data, a scale break, going from a structured to a nearly spatially random system, was observed in each of the log-log variograms. For each site, the scale break was still detectable at slightly greater than the resampling resolution for the RSS scheme, but at approximately twice the subsetting resolution for the AVG scheme. The resolution required to identify the scale break was thus from 5 to 10 m, depending upon the location and sampling method. Copyright © 2006 John Wiley & Sons, Ltd.
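An empirical semi-variogram with log-width bins can be sketched in one dimension; the study worked with two-dimensional lidar-derived depths, so the snippet below only illustrates the binning and how a scale break would be read off.

```python
# 1-D empirical semi-variogram with 50 log-width lag bins; synthetic
# "snow depth" data stand in for the lidar-derived fields.
import numpy as np

rng = np.random.default_rng(3)
x = np.sort(rng.uniform(0, 1100, 800))               # positions (m)
z = np.sin(x / 50) + 0.3 * rng.normal(size=x.size)   # synthetic depths

i, j = np.triu_indices(x.size, k=1)
lags = np.abs(x[i] - x[j])
gamma = 0.5 * (z[i] - z[j]) ** 2

bins = np.logspace(np.log10(lags.min() + 1e-6), np.log10(lags.max()), 51)
which = np.digitize(lags, bins)
centers = [lags[which == b].mean() for b in range(1, 51) if np.any(which == b)]
sv = [gamma[which == b].mean() for b in range(1, 51) if np.any(which == b)]

# Plotting log(sv) vs log(centers) reveals a break where spatial structure
# gives way to nearly random behaviour.
print(f"{len(centers)} bins populated; semi-variance "
      f"{min(sv):.3f} to {max(sv):.3f}")
```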


Reproduction of temporal scaling by a rectangular pulses rainfall model

HYDROLOGICAL PROCESSES, Issue 3 2002
Jonas Olsson
Abstract: The presence of scaling statistical properties in temporal rainfall has been well established in many empirical investigations during the last decade. These properties have increasingly come to be regarded as a fundamental feature of the rainfall process. How best to use the scaling properties for applied modelling remains to be assessed, however, particularly in the case of continuous rainfall time-series. One is therefore forced to use conventional time-series modelling, e.g. based on point process theory, which does not explicitly take scaling into account. In light of this, there is a need to investigate the degree to which point-process models are able to 'unintentionally' reproduce the empirical scaling properties. In the present study, four 25-year series of 20-min rainfall intensities observed in the Arno River basin, Italy, were investigated. A Neyman-Scott rectangular pulses (NSRP) model was fitted to these series, enabling the generation of synthetic time-series suitable for investigation. A multifractal scaling behaviour was found to characterize the raw data within a range of time-scales between approximately 20 min and 1 week. The main features of this behaviour were surprisingly well reproduced in the simulated data, although some differences were observed, particularly at small scales below the typical duration of a rain cell. This suggests the possibility of a combined use of the NSRP model and a scaling approach, in order to extend the NSRP range of applicability for simulation purposes. Copyright © 2002 John Wiley & Sons, Ltd.
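A bare-bones NSRP simulator, under assumed (not fitted) parameter values, shows the model's structure: Poisson storm arrivals, a clustered number of cells per storm, and rectangular pulses with exponential durations and intensities.

```python
# Minimal Neyman-Scott rectangular pulses (NSRP) sketch. Parameter values
# are arbitrary placeholders, not the Arno basin fits.
import numpy as np

rng = np.random.default_rng(4)
T = 24 * 30                                # simulated hours (one month)
lam, nu, beta, eta, mu_x = 0.02, 4, 0.5, 1.5, 2.0

storms = np.cumsum(rng.exponential(1 / lam, 50))   # Poisson storm origins
storms = storms[storms < T]

t_grid = np.arange(0, T, 1 / 3)            # 20-min time steps
intensity = np.zeros_like(t_grid)
for s in storms:
    for _ in range(rng.poisson(nu)):       # cells clustered around the storm
        start = s + rng.exponential(1 / beta)      # cell origin
        dur = rng.exponential(1 / eta)             # cell duration
        x = rng.exponential(mu_x)                  # cell intensity
        intensity += x * ((t_grid >= start) & (t_grid < start + dur))

print(f"wet fraction: {(intensity > 0).mean():.1%}, "
      f"mean intensity: {intensity.mean():.2f}")
```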


Outcome of secondary root canal treatment: a systematic review of the literature

INTERNATIONAL ENDODONTIC JOURNAL, Issue 12 2008
Y.-L. Ng
Abstract: Aims: (i) To investigate the effects of study characteristics on the reported success rates of secondary root canal treatment (2°RCT, or root canal retreatment); and (ii) to investigate the effects of clinical factors on the success of 2°RCT. Methodology: Longitudinal human clinical studies investigating the outcome of 2°RCT published up to the end of 2006 were identified electronically (MEDLINE and Cochrane database, 1966 to December 2006, week 4). Four journals (Dental Traumatology; International Endodontic Journal; Journal of Endodontics; Oral Surgery, Oral Medicine, Oral Pathology, Oral Radiology, and Endodontics), the bibliographies of all relevant papers and review articles were hand-searched. Two reviewers (Y-LN, KG) independently assessed and selected the studies based on specified inclusion criteria and independently extracted the data onto a pre-designed proforma. The criteria were: (i) clinical studies on 2°RCT; (ii) stratified analyses available for 2°RCT where 1°RCT data were included; (iii) sample size given and larger than 10; (iv) at least 6 months of post-operative review; (v) success based on clinical and/or radiographic criteria (strict = absence of apical radiolucency; loose = reduction in size of radiolucency); and (vi) overall success rate given or calculable from the raw data. Three strands of evidence or analyses were used to triangulate a consensus view. The reported findings from individual studies, including those excluded from quantitative analysis, were utilized for the intuitive synthesis, which constituted the first strand of evidence. Secondly, the pooled weighted success rates by each study characteristic and potential prognostic factor were estimated using a random-effects model. Thirdly, the effects of study characteristics and prognostic factors (expressed as odds ratios) on success rates were estimated using fixed- and random-effects meta-analysis with DerSimonian and Laird's method. Meta-regression models were used to explore potential sources of statistical heterogeneity. Study characteristics considered in the meta-regression analyses were: decade of publication, study-specific criteria for success (radiographic, or combined radiographic and clinical), unit of outcome measure (tooth, root), duration after treatment when assessing success ('at least 4 years' or '<4 years'), geographic location of the study (North American, Scandinavian, other countries), and qualification of the operator (undergraduate students, postgraduate students, general dental practitioners, specialists or mixed group). Results: Of the 40 papers identified, 17 studies published between 1961 and 2005 were included; none were published in 2006. The majority of studies were retrospective (n = 12) and only five prospective. The pooled weighted success rate of 2°RCT judged by complete healing was 76.7% (95% CI 73.6%, 89.6%) and by incomplete healing, 77.2% (95% CI 61.1%, 88.1%). The success rates by 'decade of publication' and 'geographic location of study' were not significantly different at the 5% level. Eighteen clinical factors had been investigated in various combinations in previous studies. The most frequently and thoroughly investigated were 'periapical status' (n = 13), 'size of lesion' (n = 7), and 'apical extent of root filling' (n = 5), which were found to be significant prognostic factors. The effect of different aspects of primary treatment history and retreatment procedures has been poorly tested. Conclusions: The pooled estimated success rate of secondary root canal treatment was 77%. The presence of a pre-operative periapical lesion, the apical extent of the root filling and the quality of the coronal restoration proved to be significant prognostic factors, with concurrence between all three strands of evidence, whilst the effects of 1°RCT history and 2°RCT protocol have been poorly investigated.
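The DerSimonian and Laird random-effects pooling named in the Methodology can be sketched on the logit scale as follows; the study counts below are invented, not those of the 17 included papers.

```python
# DerSimonian-Laird random-effects pooling of study success rates on the
# logit scale. Counts are hypothetical placeholders.
import numpy as np

successes = np.array([40, 55, 120, 33, 70])
totals    = np.array([50, 75, 160, 40, 95])

p = successes / totals
y = np.log(p / (1 - p))                       # logit success rates
v = 1 / successes + 1 / (totals - successes)  # approximate logit variances

w = 1 / v                                     # fixed-effect weights
y_fe = (w * y).sum() / w.sum()
Q = (w * (y - y_fe) ** 2).sum()               # Cochran's Q
tau2 = max(0.0, (Q - (len(y) - 1)) / (w.sum() - (w**2).sum() / w.sum()))

w_re = 1 / (v + tau2)                         # random-effects weights
y_re = (w_re * y).sum() / w_re.sum()
pooled = 1 / (1 + np.exp(-y_re))              # back-transform to a proportion
print(f"tau^2 = {tau2:.3f}, pooled success rate = {pooled:.1%}")
```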


Correlation between clinical success and apical dye penetration

INTERNATIONAL ENDODONTIC JOURNAL, Issue 8 2001
C. M. Oliver
Abstract: Aim: This study was undertaken to examine whether a correlation exists between apical dye penetration and the clinical performance of root fillings. Methodology: Apical dye penetration into 116 roots of human teeth that had been root-filled at least 6 months prior to extraction was tested in vitro using a vacuum technique and by measuring the length of dye penetration. Endodontic treatment was classified as clinically successful or unsuccessful, and results for these groups were compared using analysis of variance and Student's t-test. Positive and negative controls were used to test the experimental system. Results: All controls performed as expected. Dye penetrated significantly further in unsuccessful cases, although the raw data suggested little difference. Overall, dye penetrated 99.5% of the specimens, indicating that the presence of dye in the canal is a poor indicator of whether the technique or material will succeed. However, the extent of dye penetration may be related to the clinical outcome. Conclusions: Clinically placed root canal fillings do not provide an apical seal that prevents fluid penetration. The outcome of treatment cannot be predicted from the results of apical dye leakage studies.
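The group comparison in the Methodology reduces to a two-sample test; here is a minimal sketch with fabricated penetration lengths.

```python
# Dye penetration lengths for clinically successful vs unsuccessful cases
# compared with Student's t-test. Measurements are fabricated placeholders.
import numpy as np
from scipy.stats import ttest_ind

successful   = np.array([3.1, 2.8, 3.5, 2.9, 3.3, 3.0])  # mm of penetration
unsuccessful = np.array([3.6, 4.1, 3.8, 4.4, 3.9, 4.2])

t, p = ttest_ind(successful, unsuccessful)
print(f"t = {t:.2f}, P = {p:.4f}")
```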


Muscle Weakness and Falls in Older Adults: A Systematic Review and Meta-Analysis

JOURNAL OF THE AMERICAN GERIATRICS SOCIETY, Issue 7 2004
Julie D. Moreland MSc
Objectives: To evaluate and summarize the evidence on muscle weakness as a risk factor for falls in older adults. Design: Random-effects meta-analysis. Setting: English-language studies indexed in MEDLINE and CINAHL (1985–2002) under the key words aged, accidental falls, and risk factors; bibliographies of retrieved papers. Participants: Studies in which 50% or more of subjects were aged 65 and older; studies of institutionalized and community-dwelling subjects were included. Measurements: Prospective cohort studies that included measurement of muscle strength at inception (in isolation or with other factors) with follow-up for occurrence of falls. Methods: Extracted were sample size, population, setting, measure of muscle strength, length of follow-up, raw data if no risk estimate was given, and odds ratios (ORs), rate ratios, or incidence density ratios. Each study was assessed using the validity criteria: adjustment for confounders, objective definition of fall outcome, reliable method of measuring muscle strength, and blinded outcome measurement. Results: Thirty studies met the selection criteria; data were available from 13. For lower extremity weakness, the combined OR was 1.76 (95% confidence interval (CI) = 1.31–2.37) for any fall and 3.06 (95% CI = 1.86–5.04) for recurrent falls. For upper extremity weakness, the combined OR was 1.53 (95% CI = 1.01–2.32) for any fall and 1.41 (95% CI = 1.25–1.59) for recurrent falls. Conclusion: Muscle strength (especially of the lower extremity) should be one of the factors assessed and treated in older adults at risk for falls. More clinical trials are needed to determine whether muscle-strengthening exercises are effective in preventing falls.


The establishment of an urban bird population

JOURNAL OF ANIMAL ECOLOGY, Issue 5 2008
Christian Rutz
Summary: 1. Despite the accelerating global spread of urbanized habitats and its associated implications for wildlife and humans, surprisingly little is known about the biology of urban ecosystems. 2. Using data from a 60-year study period, this paper provides a detailed description of how the northern goshawk Accipiter gentilis L., generally considered a shy forest species, colonized the city of Hamburg, Germany. Six non-mutually exclusive hypotheses are investigated regarding the environmental factors that may have triggered this invasion. 3. The spatio-temporal analysis of 2556 goshawk chance observations (extracted from a total data set of 1,174,493 bird observations; 1946–2003) showed that hawks regularly visited the city centre decades before the first successful breeding attempts were recorded. Many observations were made in parts of the city where territories were established in later years, demonstrating that these early visitors had encountered, but not used, potential nest sites. 4. Pioneer settlement coincided with: (i) an increase in (legal) hunting pressure on goshawks in nearby rural areas; (ii) an increase in avian prey abundance in the city; and (iii) a succession of severe winters in the Greater Hamburg area. On the other hand, there was no evidence to suggest that the early stages of the invasion were due to: (i) decreasing food availability in rural areas; (ii) major habitat changes in the city; or (iii) rural intraguild dynamics forcing hawks into urban refugia. While breeding numbers of a potential rural source population were at a long-term low when the city was colonized, prior to first settlement there was a sharp increase in goshawk chance observations in the city and its rural periphery. 5. The urban population expanded rapidly, and pair numbers began to stabilize after about 10 years. Ringing data (219 ringed nestlings from 70 urban broods; 1996–2000) demonstrated that most urban recruits had fledged in the city, but also confirmed considerable gene flow between urban and rural habitats. Analysis of chance observations (as raw data or as detrended time series) suggested a tight coupling of population dynamics inside and outside the city. 6. City colonizations such as the one described here provide a valuable opportunity to study some fundamental aspects of population ecology on a scale at which detailed monitoring is logistically feasible. Furthermore, a good understanding of urban ecology has become essential for efficient wildlife conservation in modern, human-altered environments.


A simple and low-cost solution for the automation of X-ray powder diffractometers with chart recorder output

JOURNAL OF APPLIED CRYSTALLOGRAPHY, Issue 4 2006
M. Jayaprakasan
X-ray powder diffraction is an established method for the qualitative identification of crystalline materials and their quantitative analysis. The new generation of X-ray diffraction systems is based on expensive digital/embedded control technology and computer interfaces, yet many laboratories use conventional manually controlled systems with XY strip-chart recorders. Since the output spectrum is a strip chart (hard copy), raw data essential for structural and qualitative analysis are not readily available for further analysis, and upgrading to modern computerized diffractometers is very expensive. The automation design proposed here is intended to enable the conventional diffractometer user to collect, store and analyze data quickly. The design also improves resolution fivefold compared with the conventional setup. For the automation, a PC add-on card has been designed to control and collect the timing and intensity counts from the conventional X-ray diffractometer, and suitable software has been developed to collect, process and present the X-ray diffraction data for both qualitative and quantitative analysis. Moreover, a major advantage of this design is that it does not require any physical modification of the hardware of the conventional setup; it is simply an extension that enhances the performance of collecting raw data, with higher resolution, at desired intervals/timings.


Do bacteria need to be regulated?

JOURNAL OF APPLIED MICROBIOLOGY, Issue 3 2006
P. Silley
Abstract: Additives for use in animal nutrition are regulated under Regulation (EC) No. 1831/2003. This paper addresses the specific microbiological issues relevant to a microbial feed additive containing a Bacillus sp., using as an example a product with the trade name Calsporin®. Bacillus subtilis C-3102 is the active ingredient in Calsporin® and is added to animal feed to favourably affect animal production and performance (growth and feed efficiency) by modulating the gastrointestinal flora. It is not the purpose of this review to present the raw data for Calsporin®, but rather to use Calsporin® as an example of the type of data required by the European regulatory authorities. At the time of preparation of this manuscript, Calsporin® had yet to be reviewed by the authorities. The regulatory system under the auspices of the EFSA FEEDAP Panel is clearly attempting to move in line with developments in scientific opinion and is to be applauded for such efforts. Bacteria do need to be regulated, and the regulations clearly provide adequate and appropriate protection to human health and the environment.


Studies on the adaptability of different Borgen norms applied in self-modeling curve resolution (SMCR) method

JOURNAL OF CHEMOMETRICS, Issue 6 2009
Róbert Rajkó
Abstract: Lawton and Sylvestre, and later Borgen et al., first provided analytical solutions for determining the feasible regions of the self-modeling curve resolution (SMCR) method for two- and three-component systems, respectively. Twenty years later, Rajkó and István revitalized Borgen's method, giving a clear interpretation and an algorithm for drawing Borgen plots using computational geometry tools; subsequently, Rajkó proved the existence of a natural duality in minimally constrained SMCR. In both of these recent studies, the 1-norm was used to normalize the raw data; however, Borgen et al. had introduced a more general class of normalizations. In this paper, the definition and a detailed description of Borgen norms are given, appearing for the first time in the chemical literature. Some theoretical and practical studies on the adaptability of several Borgen norms used for the SMCR method are also provided. Copyright © 2009 John Wiley & Sons, Ltd.
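The 1-norm normalization mentioned above, the special case of the Borgen-norm family used in the earlier papers, is simply closure of each row to unit sum; a toy bilinear data set illustrates it below (all matrices are invented).

```python
# 1-norm (closure) normalization of a bilinear data matrix, the special
# case of Borgen's normalization family noted in the abstract.
import numpy as np

rng = np.random.default_rng(5)
C = rng.uniform(0, 1, size=(20, 2))            # concentrations, 20 mixtures
S = rng.uniform(0, 1, size=(2, 100))           # two pure component spectra
D = C @ S                                      # bilinear raw data

D_norm = D / np.abs(D).sum(axis=1, keepdims=True)   # each row now sums to 1
print(D_norm.sum(axis=1)[:3])                  # -> [1. 1. 1.]
```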


Analysis of video images from a gas–liquid transfer experiment: a comparison of PCA and PARAFAC for multivariate image analysis

JOURNAL OF CHEMOMETRICS, Issue 7 2003
Stephen P. Gurden
Abstract: The use of chemical imaging is a developing area with potential benefits for chemical systems where spatial distribution is important. Examples include processes in which homogeneity is critical, such as polymerizations, pharmaceutical powder blending and surface catalysis, and dynamic processes such as the study of diffusion rates or the transport of environmental pollutants. Whilst single images can be used to determine chemical distribution patterns at a given point in time, dynamic processes can be studied using a sequence of images measured at regular time intervals, i.e. a movie. Multivariate modeling of image data can help to provide insight into the important chemical factors present. However, many issues of how best to apply these models remain unclear, especially when the data arrays involved have four or five different dimensions (height, width, wavelength, time, experiment number, etc.). In this paper we describe the analysis of video images recorded during an experiment to investigate the uptake of CO2 across a free air–water interface. The use of PCA and PARAFAC for the analysis of both single images and movies is described, and some differences and similarities are highlighted. Some other image transformation techniques, such as chemical mapping and histograms, are found to be useful both for pretreatment of the raw data and for dimensionality reduction of the data arrays prior to further modeling. Copyright © 2003 John Wiley & Sons, Ltd.
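For a movie, PCA requires unfolding the three-way array into a matrix, whereas PARAFAC keeps the multiway structure intact. The sketch below shows the unfolding-plus-PCA route on random data standing in for the CO2-uptake movie.

```python
# Unfold a movie (time x height x width) into a matrix and run PCA via SVD.
# PARAFAC would instead decompose the tensor directly. Data are random.
import numpy as np

rng = np.random.default_rng(6)
movie = rng.normal(size=(60, 64, 64))          # 60 frames of 64 x 64 pixels

X = movie.reshape(60, -1)                      # unfold: frames x pixels
X -= X.mean(axis=0)                            # mean-center each pixel

U, s, Vt = np.linalg.svd(X, full_matrices=False)
scores = U[:, :2] * s[:2]                      # temporal scores, 2 components
loadings = Vt[:2].reshape(2, 64, 64)           # spatial loadings as images
print("component variance:", (s[:2] ** 2 / (s ** 2).sum()).round(3))
```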


Advanced relationships between categories analysis as a qualitative research tool

JOURNAL OF CLINICAL PSYCHOLOGY, Issue 7 2010
Merav Rabinovich
Abstract: The authors propose an advanced relationships between categories (RBC) model as an expansion of Tutty, Rothery, and Grinnell's (1996) qualitative tool for classifying RBC patterns as contained, temporal, and causal relationships. It is assumed that identification of the relationships obtained among categories of qualitative data paves the way for construction of a theory, even though few tools have been developed for this purpose to date. The advanced RBC model points to three additional relationship patterns: bilateral, trilateral, and quadrilateral relationships. These relationships reveal how the text itself links its various components. The model serves as an innovative tool for the systematic derivation of explanations based on qualitative raw data, contributing to grounded theory and other interpretive studies. © 2010 Wiley Periodicals, Inc. J Clin Psychol 66:1–11, 2010.


Quantum chemical geometry optimizations in proteins using crystallographic raw data

JOURNAL OF COMPUTATIONAL CHEMISTRY, Issue 11 2002
Ulf Ryde
Abstract: A method is developed for the combination of quantum chemical geometry optimizations and crystallographic structure refinement. The method is implemented by integrating the quantum chemical software Turbomole with the crystallographic software Crystallography and NMR System (CNS), using three small procedures that transfer information between the two programs. The program (COMQUM-X) is used to study the binding of the inhibitor N-methylmesoporphyrin to ferrochelatase, and we show that the method behaves properly and leads to an improvement of the structure of the inhibitor. It allows us to quantify directly, in energy terms, how much the protein distorts the structure of the bound inhibitor compared with the optimum vacuum structure (4–6 kJ/mol). The approach improves on the standard combined quantum chemical and molecular mechanics (QC/MM) approach by guaranteeing that the final structure is in accordance with experimental data (the reflections) and by avoiding the risk of propagating errors in the crystal coordinates. The program can also be seen as an improvement on standard crystallographic refinement, providing an accurate empirical potential function for any group of interest. The results can be directly interpreted in standard crystallographic terms (e.g., R factors or electron density maps). The method can be used to interpret crystal structures (e.g., the protonation status of metal-bound water molecules) and even to locally improve them. © 2002 Wiley Periodicals, Inc. J Comput Chem 23: 1058–1070, 2002


Hyperlink Analyses of the World Wide Web: A Review

JOURNAL OF COMPUTER-MEDIATED COMMUNICATION, Issue 4 2003
Han Woo Park
We have recently witnessed the growth of hyperlink studies in the field of Internet research. Although investigations have been conducted across many disciplines and topics, their approaches can be largely divided into hyperlink network analysis (HNA) and Webometrics. This article is an extensive review of the two analytical methods and a reflection on their application. HNA casts hyperlinks between Web sites (or Web pages) as social and communicational ties, applying standard techniques from social network analysis to this new data source. Webometrics has tended to apply much simpler techniques combined with a more in-depth investigation into the validity of hypotheses about possible interpretations of the results. We conclude that hyperlinks are a highly promising but problematic new source of data that can be mined for previously hidden patterns of information, although much care must be taken in the collection of raw data and in the interpretation of the results. In particular, link creation is an unregulated phenomenon, and so it would not be sensible to assume that the meaning of hyperlinks in any given context is evident without a systematic study of the context of link creation and of the relationship between link counts, among other measurements. Social network analysis tools and techniques form an excellent resource for hyperlink analysis, but should only be used in conjunction with improved techniques for data collection, validation and interpretation.
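In the HNA style described here, sites become nodes and hyperlinks directed ties; a toy example follows, with placeholder site names and assuming the networkx package is available.

```python
# A toy hyperlink network: sites as nodes, links as directed ties, with two
# simple centrality measures. Site names are invented placeholders.
import networkx as nx

links = [("siteA", "siteB"), ("siteA", "siteC"),
         ("siteB", "siteC"), ("siteD", "siteC")]

G = nx.DiGraph(links)
print("in-degree centrality:", nx.in_degree_centrality(G))
print("PageRank:", {k: round(v, 3) for k, v in nx.pagerank(G).items()})
```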


Metabolomics-based systematic prediction of yeast lifespan and its application for semi-rational screening of ageing-related mutants

AGING CELL, Issue 4 2010
Ryo Yoshida
Summary: Metabolomics, the comprehensive analysis of metabolites, was recently used to classify yeast mutants with no overt phenotype, using raw data as metabolic fingerprints or footprints. In this study, we demonstrate the estimation of a complicated phenotype, longevity, and semi-rational screening for relevant mutants using metabolic profiles as strain-specific fingerprints. The fingerprints used in our experiments are profiled data consisting of individually identified and quantified metabolites rather than raw spectral data. We chose yeast replicative lifespan as a model phenotype. Several yeast mutants that affect lifespan were selected for analysis and subjected to metabolic profiling using mass spectrometry. Fingerprinting based on the profiles revealed a correlation between lifespan and metabolic profile, with amino acids and nucleotide derivatives the main contributors to this correlation. Furthermore, we established a multivariate model to predict lifespan from a metabolic profile, which facilitated the identification of putative longevity mutants. This work represents a novel approach to evaluating and screening a complicated, quantitative phenotype by means of metabolomics.
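The multivariate lifespan model might resemble the following partial least squares sketch; the metabolite profiles and lifespans here are simulated, and the paper's actual model may differ.

```python
# PLS regression from metabolite profiles to replicative lifespan, with
# cross-validated predictions to flag putative longevity mutants. Simulated.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(7)
profiles = rng.normal(size=(30, 50))           # 30 strains x 50 metabolites
lifespan = 25 + profiles[:, :5].sum(axis=1) + rng.normal(0, 1, 30)

pls = PLSRegression(n_components=3)
predicted = cross_val_predict(pls, profiles, lifespan, cv=5).ravel()
r = np.corrcoef(predicted, lifespan)[0, 1]
print(f"cross-validated correlation: r = {r:.2f}")
```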


Comparison of genetic (co)variance matrices within and between Scabiosa canescens and S. columbaria

JOURNAL OF EVOLUTIONARY BIOLOGY, Issue 5 2000
Waldmann
In the current study, we used bootstrap analyses and the common principal component (CPC) method of Flury (1988) to estimate and compare the G-matrix of Scabiosa columbaria and S. canescens populations. We found three major patterns in the G-matrices: (i) the magnitude of the (co)variances was more variable among characters than among populations, (ii) different populations showed high (co)variance for different characters, and (iii) there was a tendency for S. canescens to have higher genetic (co)variances than S. columbaria. The hypothesis of equal G-matrices was rejected in all comparisons, and there was no evidence that the matrices differed by a proportional constant in any of the analyses. The two 'species matrices' were found to be unrelated, both for raw data and for data standardized over populations, and there was significant between-population variation in the G-matrix in both species. Populations of S. canescens showed conservation of structure (principal components) in their G-matrices, contrasting with the lack of common structure among the S. columbaria matrices. Given these observations and the results from previous studies, we propose that selection may be responsible for some of the variation between the G-matrices, at least in S. columbaria and at the between-species level.


Facial Soft Tissue Depths in Craniofacial Identification (Part I): An Analytical Review of the Published Adult Data

JOURNAL OF FORENSIC SCIENCES, Issue 6 2008
Carl N. Stephan Ph.D.
Abstract: With the ever-increasing production of average soft tissue depth studies, data are becoming increasingly complex, less standardized, and more unwieldy. So far, no overarching review has been attempted to determine: the validity of continued data collection; the usefulness of the existing data subcategorizations; or whether a synthesis is possible to produce a manageable soft tissue depth library. While a principal components analysis would provide the best foundation for such an assessment, this type of investigation is not currently possible because of a lack of easily accessible raw data (first, many studies are narrow; second, raw data are infrequently published and/or stored, and are not always shared by some authors). This paper provides an alternative means of investigation, using a hierarchical approach to review and compare the effects of single variables on published mean values for adults whilst acknowledging measurement errors and within-group variation. The results revealed: (i) no clear secular trends at frequently investigated landmarks; (ii) wide variation in soft tissue depth measures between different measurement techniques, irrespective of whether living persons or cadavers were considered; (iii) no clear clustering of non-Caucasoid data far from the Caucasoid means; and (iv) minor differences between males and females. Consequently, the data were pooled across studies using weighted means and standard deviations to cancel out random and opposing study-specific errors, producing a single soft tissue depth table with increased sample sizes (e.g., 6786 individuals at pogonion).
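The pooling described in the conclusion, weighted means with a combined standard deviation, can be sketched directly; the three "studies" below are invented stand-ins for soft tissue depths at one landmark.

```python
# Sample-size-weighted pooled mean, with a pooled SD that combines
# within-study variance and between-study spread. Numbers are invented.
import numpy as np

n  = np.array([50, 120, 300])        # study sample sizes
m  = np.array([10.2, 11.0, 10.5])    # study mean depths (mm)
sd = np.array([1.8, 2.1, 1.9])       # study standard deviations

pooled_mean = (n * m).sum() / n.sum()
pooled_var = (n * (sd**2 + (m - pooled_mean) ** 2)).sum() / n.sum()
print(f"pooled: {pooled_mean:.2f} +/- {np.sqrt(pooled_var):.2f} mm "
      f"(n = {n.sum()})")
```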


Information and communication technology for process management in healthcare: a contribution to change the culture of blame

JOURNAL OF SOFTWARE MAINTENANCE AND EVOLUTION: RESEARCH AND PRACTICE, Issue 6-7 2010
Silvana Quaglini
Abstract: Statistics on medical errors and their consequences have, in recent years, astonished both healthcare professionals and ordinary people, and the mass media are becoming increasingly sensitive to medical malpractice. This paper elaborates on the well-known resistance of the medical world to disclosing actions and processes that may have caused harm; it illustrates the possible causes of medical errors and, for some of them, suggests solutions based on information and communication technology. In particular, careflow management systems and process mining techniques are proposed as means to improve the healthcare delivery process: the former by facilitating task assignments and resource management, the latter by discovering not only individuals' errors but also the chains of responsibility that concur to produce errors in a complex patient pathway. Both supervised and unsupervised process mining are addressed: the former compares real processes with a known process model (e.g., a clinical practice guideline or a medical protocol), whereas the latter mines processes from raw data without imposing any model. The potential of these techniques is illustrated by means of examples from stroke patient management. Copyright © 2010 John Wiley & Sons, Ltd.


Development and evolution of a heterogeneous continuous media server: a case study

JOURNAL OF SOFTWARE MAINTENANCE AND EVOLUTION: RESEARCH AND PRACTICE, Issue 2 2005
Dwight J. Makaroff
Abstract: Media server software is complicated to develop and maintain because of the many interface aspects that must be considered. This paper provides a case study of the design, implementation, and evolution of a continuous media file server, with emphasis on the evolution of the software and our approach to maintainability. The user interface is a major consideration, even though the server software would appear isolated from that factor. Since continuous media servers must send raw data to a client application over a network, the protocol considerations, hardware interface, and data storage/retrieval methods are of paramount importance. In addition, the application programmer's interface to the server facilities has an impact on both the internal design and the performance of such a server. We discuss our experiences and insights from developing such software products within a small research-based university environment. We experienced two main types of evolutionary change: requirements changes from the limited user community, and performance enhancements/corrections. While the former were anticipated via a generic interface and modular design structure, the latter were surprising and substantially more difficult to solve. Copyright © 2005 John Wiley & Sons, Ltd.


Image processing pipeline for synchrotron-radiation-based tomographic microscopy

JOURNAL OF SYNCHROTRON RADIATION, Issue 4 2010
C. Hintermüller
With synchrotron-radiation-based tomographic microscopy, three-dimensional structures down to the micrometer level can be visualized. Tomographic data sets typically consist of 1000 to 1500 projections of 1024 × 1024 to 2048 × 2048 pixels and are acquired in 5,15,min. A processing pipeline has been developed to handle this large amount of data efficiently and to reconstruct the tomographic volume within a few minutes after the end of a scan. Just a few seconds after the raw data have been acquired, a selection of reconstructed slices is accessible through a web interface for preview and to fine tune the reconstruction parameters. The same interface allows initiation and control of the reconstruction process on the computer cluster. By integrating all programs and tools, required for tomographic reconstruction into the pipeline, the necessary user interaction is reduced to a minimum. The modularity of the pipeline allows functionality for new scan protocols to be added, such as an extended field of view, or new physical signals such as phase-contrast or dark-field imaging etc. [source]