Original Data (original + data)

Terms modified by Original Data

  • original data set

Selected Abstracts


    Two millennia of male stature development and population health and wealth in the Low Countries

    INTERNATIONAL JOURNAL OF OSTEOARCHAEOLOGY, Issue 4 2005
    G. J. R. Maat
    Article first published online: 5 AUG 2005
    Abstract This paper offers a review of shifts in average male stature and their relationship with health and wealth in the Low Countries from AD 50 to 1997. Twenty-one population samples were studied to cover the full time span. To make data compatible, so-called 'virtual statures' were used, i.e. the statures which adult males were supposed to have had at the end of their growth period, before they started shrinking with ageing. Original data were extracted from 'in situ measured statures', 'calculated statures' and 'corrected cadaveric statures'. If possible, maximum femoral lengths were also collected from the same population samples to check whether trends in stature development were in agreement with raw skeletal data. A long phase of stature decrease from ca. 176 cm to 166 cm, a so-called 'negative secular trend', was noticed from the Roman Period up to and including the first half of the 19th century. This was followed by a sharp and still ongoing increase in stature to 184 cm, a typical 'positive secular trend', from the second half of the 19th century to the present time. General shifts in stature and 'outliers' illustrative of the process are viewed in the context of socio-economic, demographic, health and nutritional factors. Copyright © 2005 John Wiley & Sons, Ltd. [source]


    Premedication with clonidine is superior to benzodiazepines.

    ACTA ANAESTHESIOLOGICA SCANDINAVICA, Issue 4 2010
    A meta-analysis of published studies
    Background: Premedication is considered important in pediatric anesthesia. Benzodiazepines are the most commonly used premedication agents. Clonidine, an α2-adrenoceptor agonist, is gaining popularity among anesthesiologists. The goal of the present study was to perform a meta-analysis of studies comparing premedication with clonidine to benzodiazepines. Methods: A comprehensive literature search was conducted to identify clinical trials comparing clonidine and benzodiazepines for premedication in children. Six reviewers independently assessed each study against the inclusion criteria and extracted data. Original data from each trial were combined to calculate the pooled odds ratio (OR) or mean difference (MD) with 95% confidence intervals [95% CI], and statistical heterogeneity was assessed. Results: Ten publications fulfilling the inclusion criteria were found. Premedication with clonidine, in comparison with midazolam, exhibited a superior effect on sedation at induction (OR=0.49 [0.27, 0.89]), decreased the incidence of emergence agitation (OR=0.25 [0.11, 0.58]) and produced more effective early post-operative analgesia (OR=0.33 [0.21, 0.58]). Compared with diazepam, clonidine was superior in preventing post-operative nausea and vomiting (PONV). Discussion: Premedication with clonidine is superior to midazolam in producing sedation and in decreasing post-operative pain and emergence agitation. However, the superiority of clonidine for PONV prevention remains unclear, as other factors affecting nausea prevention might interfere with this result. [source]
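
    The abstract does not say which pooling model was used; as a minimal illustration of combining per-trial counts into a pooled OR, here is a fixed-effect inverse-variance sketch in Python. The arm counts in the usage line are hypothetical, not data from the study.

    ```python
    import numpy as np

    def pooled_or_fixed_effect(events_t, n_t, events_c, n_c):
        """Fixed-effect (inverse-variance) pooling of per-trial odds ratios.

        events_t/n_t: events and totals in the clonidine arms,
        events_c/n_c: events and totals in the benzodiazepine arms.
        """
        a = np.asarray(events_t, float)
        b = np.asarray(n_t, float) - a
        c = np.asarray(events_c, float)
        d = np.asarray(n_c, float) - c
        log_or = np.log((a * d) / (b * c))   # per-trial log odds ratio
        var = 1 / a + 1 / b + 1 / c + 1 / d  # large-sample variance of log OR
        w = 1.0 / var                        # inverse-variance weights
        pooled = np.sum(w * log_or) / np.sum(w)
        se = np.sqrt(1.0 / np.sum(w))
        lo, hi = np.exp(pooled - 1.96 * se), np.exp(pooled + 1.96 * se)
        return np.exp(pooled), (lo, hi)

    # Hypothetical counts for three trials:
    print(pooled_or_fixed_effect([12, 8, 15], [40, 35, 50],
                                 [20, 14, 22], [40, 36, 48]))
    ```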


    Towards closing the analysis gap: Visual generation of decision supporting schemes from raw data

    COMPUTER GRAPHICS FORUM, Issue 3 2008
    T. May
    Abstract The derivation, manipulation and verification of analytical models from raw data is a process which requires a transformation of information across different levels of abstraction. We introduce a concept for the coupling of data classification and interactive visualization in order to make this transformation visible and steerable for the human user. Data classification techniques generate mappings that formally group data items into categories. Interactive visualization includes the user in an iterative refinement process. The user identifies and selects interesting patterns to define these categories. The following step is the transformation of a visible pattern into the formal definition of a classifier. In the last step the classifier is transformed back into a pattern that is blended with the original data in the same visual display. Our approach allows an intuitive assessment of a formal classifier and its model, the detection of outliers and the handling of noisy data using visual pattern-matching. We instantiated the concept using decision trees for classification and KVMaps as the visualization technique. The generation of a classifier from visual patterns and its verification is transformed from a cognitive to a mostly pre-cognitive task. [source]


    A Screen Space Quality Method for Data Abstraction

    COMPUTER GRAPHICS FORUM, Issue 3 2008
    J. Johansson
    Abstract The rendering of large data sets can result in cluttered displays and non-interactive update rates, leading to time-consuming analyses. A straightforward solution is to reduce the number of items, thereby producing an abstraction of the data set. For the visual analysis to remain accurate, the graphical representation of the abstraction must preserve the significant features present in the original data. This paper presents a screen space quality method, based on distance transforms, that measures the visual quality of a data abstraction. This screen space measure is shown to better capture significant visual structures in data, compared with data space measures. The presented method is implemented on the GPU, allowing interactive creation of high quality graphical representations of multivariate data sets containing tens of thousands of items. [source]
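
    As a rough CPU illustration of the distance-transform idea (not the paper's GPU implementation), the sketch below scores how well a binary rendering of an abstraction covers the structures of the original rendering; the boolean masks and the symmetric averaging are assumptions.

    ```python
    import numpy as np
    from scipy.ndimage import distance_transform_edt

    def screen_space_quality(original_mask, abstracted_mask):
        """Symmetric screen-space distance between two binary renderings.

        Each mask is a boolean image, True where geometry covers a pixel.
        distance_transform_edt gives each pixel's distance to the nearest
        True pixel, so averaging it over the other rendering's pixels
        measures how far apart the two sets of visual structures are.
        """
        d_to_orig = distance_transform_edt(~original_mask)
        d_to_abs = distance_transform_edt(~abstracted_mask)
        spurious = d_to_orig[abstracted_mask].mean()  # abstraction far from data
        missed = d_to_abs[original_mask].mean()       # data structure not covered
        return 0.5 * (spurious + missed)
    ```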


    Short-Term Traffic Volume Forecasting Using Kalman Filter with Discrete Wavelet Decomposition

    COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, Issue 5 2007
    Yuanchang Xie
    Short-term traffic volume data are often corrupted by local noises, which may significantly affect the prediction accuracy of short-term traffic volumes. Discrete wavelet decomposition is used to split the original data into approximation and detail components, so that the Kalman filter model can be applied to the denoised data and the prediction accuracy improved. Two types of wavelet Kalman filter models, based on the Daubechies 4 and Haar mother wavelets, are investigated. Traffic volume data collected from four different locations are used for comparison in this study. The test results show that both proposed wavelet Kalman filter models outperform the direct Kalman filter model in terms of mean absolute percentage error and root mean square error. [source]
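
    A minimal sketch of the two-stage idea, wavelet denoising followed by a Kalman filter, using PyWavelets and a simple random-walk state model; the paper's Kalman formulation is more elaborate, and the input file name is hypothetical.

    ```python
    import numpy as np
    import pywt  # PyWavelets

    def wavelet_denoise(x, wavelet="db4", level=3):
        """Soft-threshold the detail coefficients, keep the approximation."""
        coeffs = pywt.wavedec(x, wavelet, level=level)
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745       # noise scale
        t = sigma * np.sqrt(2 * np.log(len(x)))              # universal threshold
        coeffs[1:] = [pywt.threshold(c, t, mode="soft") for c in coeffs[1:]]
        return pywt.waverec(coeffs, wavelet)[: len(x)]

    def kalman_one_step(z, q=1.0, r=10.0):
        """One-step-ahead forecasts from a random-walk Kalman filter."""
        x, p, preds = z[0], 1.0, []
        for zk in z[1:]:
            p += q                      # time update (random-walk state)
            preds.append(x)             # forecast before seeing zk
            k = p / (p + r)             # measurement update
            x += k * (zk - x)
            p *= 1 - k
        return np.array(preds)

    volume = np.loadtxt("traffic_volume.txt")   # hypothetical count series
    forecast = kalman_one_step(wavelet_denoise(volume))
    ```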


    Testing Models of Low-Frequency Variability

    ECONOMETRICA, Issue 5 2008
    Ulrich K. Müller
    We develop a framework to assess how successfully standard time series models explain low-frequency variability of a data series. The low-frequency information is extracted by computing a finite number of weighted averages of the original data, where the weights are low-frequency trigonometric series. The properties of these weighted averages are then compared to the asymptotic implications of a number of common time series models. We apply the framework to twenty U.S. macroeconomic and financial time series using frequencies lower than the business cycle. [source]
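
    The extraction step can be sketched directly from the abstract: compute a handful of weighted averages of the series, with low-frequency cosines as the trigonometric weights. The cosine basis and the number of averages q are assumptions for illustration.

    ```python
    import numpy as np

    def low_frequency_averages(x, q=12):
        """q weighted averages of x with low-frequency cosine weights.

        Row j of psi holds sqrt(2)*cos(j*pi*t) evaluated at the sample
        points; each average isolates variability at one low frequency.
        """
        T = len(x)
        t = (np.arange(T) + 0.5) / T
        psi = np.sqrt(2) * np.cos(np.pi * np.outer(np.arange(1, q + 1), t))
        return psi @ np.asarray(x, float) / T
    ```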


    Advanced glycation end-products and the kidney

    EUROPEAN JOURNAL OF CLINICAL INVESTIGATION, Issue 8 2010
    Martin Busch
    Eur J Clin Invest 2010; 40 (8): 742–755 Abstract Background: Advanced glycation end-products (AGEs) are increased in situations with hyperglycemia and oxidative stress such as diabetes mellitus. They are products of nonenzymatic glycation and oxidation of proteins and lipids. The kidney plays an important role in the clearance and metabolism of AGEs. Methods: Medline and other relevant databases were searched. In addition, key review articles were scanned for relevant original publications. Finally, original data from our research group were also included. Results: Kidney podocytes and endothelial cells express specific receptors for AGEs. Their activation leads to multiple pathophysiological effects including hypertrophy with cell cycle arrest and apoptosis, altered migration, and generation of proinflammatory cytokines. AGEs have been primarily implicated in the pathophysiology of diabetic nephropathy and diabetic microvascular complications. AGEs are also involved in other primary renal diseases as well as in the development and progression of atherosclerosis. However, serum or plasma concentrations of AGEs do not correlate well with cardiovascular events in patients with chronic kidney disease (CKD). This is likely because serum concentrations fail to correlate with the AGEs deposited in target tissues. Several inhibitors of the AGE-RAGE axis are currently being tested for various indications. Conclusion: AGEs and their receptors are involved in the pathogenesis of vascular and kidney disease. The role of circulating AGEs as biomarkers for cardiovascular risk estimation is questionable. Whether putative inhibitors of AGEs will reach maturity for therapeutic use remains open. [source]


    Analysing soil variation in two dimensions with the discrete wavelet transform

    EUROPEAN JOURNAL OF SOIL SCIENCE, Issue 4 2004
    R. M. Lark
    Summary Complex spatial variation in soil can be analysed by wavelets into contributions at several scales or resolutions. The first applications were to data recorded at regular intervals in one dimension, i.e. on transects. The theory extends readily to two dimensions, but the application to small sets of gridded data such as one is likely to have from a soil survey requires special adaptation. This paper describes the extension of wavelet theory to two dimensions. The adaptation of the wavelet filters near the limits of a region that was successful in one dimension proved unsuitable in two dimensions. We therefore had to pad the data out symmetrically beyond the limits to minimize edge effects. With the above modifications and Daubechies's wavelet with two vanishing moments the analysis is applied to soil thickness, slope gradient, and direct solar beam radiation at the land surface recorded at 100-m intervals on a 60 × 101 square grid in south-west England. The analysis revealed contributions to the variance at several scales and for different directions and correlations between the variables that were not evident in maps of the original data. In particular, it showed how the thickness of the soil increasingly matches the geological structure with increasing dilation of the wavelet, this relationship being local to the strongly aligned outcrops. The analysis reveals a similar pattern in slope gradient, and a negative correlation with soil thickness, most clearly evident at the coarser scales. The solar beam radiation integrates slope gradient and azimuth, and the analysis emphasizes the relations with topography at the various spatial scales and reveals additional effects of aspect on soil thickness. [source]
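
    In PyWavelets terms, the analysis the abstract describes (2-D discrete wavelet transform, Daubechies wavelet with two vanishing moments, symmetric padding beyond the region limits) might look like the following sketch; the input file name is hypothetical.

    ```python
    import numpy as np
    import pywt

    grid = np.loadtxt("soil_thickness.txt")   # hypothetical 60 x 101 grid

    # 'db2' has two vanishing moments; mode='symmetric' pads the data
    # reflectively beyond the region limits to minimize edge effects.
    coeffs = pywt.wavedec2(grid, "db2", mode="symmetric", level=2)
    cA2, (cH2, cV2, cD2), (cH1, cV1, cD1) = coeffs

    # Variance contributions by scale and direction (horizontal, vertical,
    # diagonal), analogous to the scale/direction decomposition in the paper.
    for name, c in [("H1", cH1), ("V1", cV1), ("D1", cD1),
                    ("H2", cH2), ("V2", cV2), ("D2", cD2)]:
        print(name, float(np.var(c)))
    ```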


    Innovation and Innovators Inside Government: From Institutions to Networks

    GOVERNANCE, Issue 4 2007
    MARK CONSIDINE
    Innovation and innovators inhabit an institutional space, which is partially defined by formal positions and partially by informal networks. This article investigates the role of politicians and bureaucrats in fostering innovation inside government and provides an empirical explanation of who the innovators are, whether this is mostly an attribute of position or role, or mostly an effect of certain forms of networking. The study uses original data collected from 11 municipal governments in Australia in order to define and describe the normative underpinnings of innovation inside government and to show the importance of advice and strategic information networks among politicians and senior bureaucrats (n = 947). Social network analysis is combined with conventional statistical analysis in order to demonstrate the comparative importance of networks in explaining who innovates. [source]


    Torsional Motion in (tert-Butyl)ammonium Hemispheraplexes: Rotational Barriers and Energy of Binding

    HELVETICA CHIMICA ACTA, Issue 5 2003
    Emily
    The ADPs (atomic anisotropic displacement parameters) from the single-crystal X-ray studies of nine related TBA+ (TBA+ = (tert-butyl)ammonium) hemispheraplexes are analyzed, and the results compared to the free energy of binding of this guest by the nine hosts. The lipophilic hosts (Fig. 1) were synthesized over a number of years, with increasing pre-organization for and specificity of binding. Structural studies for six of the complexes have been published, but the remaining three structures, including those of the strongest binders of TBA+, are disordered and have only now been completed. New area-detector data have been analyzed for the TBA+ClO4− complexes of 5 and of 8 at two temperatures, while the original data for 9·TBA+SCN− have been treated with a disorder model. In addition, improved models are presented for the complexes of 6 and 7. Methods for assessing the precision of the ADP analyses are discussed. Although most of the structures are imprecise, the TBA+ groups do demonstrate some of the characteristics of independent motion. The general trend in calculated libration amplitudes for the TBA+ group suggests that the guests with the greatest free energy of binding, and the shortest distances from N+ to the ligand plane, are those with the highest barriers to internal rotation. [source]


    Current status of minimally invasive necrosectomy for post-inflammatory pancreatic necrosis

    HPB, Issue 2 2009
    Benoy Idicula Babu
    Abstract Objective: This paper reviews current knowledge on minimally invasive pancreatic necrosectomy. Background: Blunt (non-anatomical) debridement of necrotic tissue at laparotomy is the standard method of treatment of infected post-inflammatory pancreatic necrosis. Recognition that laparotomy may add to morbidity by increasing postoperative organ dysfunction has led to the development of alternative, minimally invasive methods for debridement. This study reports the status of minimally invasive necrosectomy by different approaches. Methods: Searches of MEDLINE and EMBASE for the period 1996–2008 were undertaken. Only studies with original data and information on outcome were included. This produced a final population of 28 studies reporting on 344 patients undergoing minimally invasive necrosectomy, with a median (range) number of patients per study of nine (1–53). Procedures were categorized as retroperitoneal, endoscopic or laparoscopic. Results: A total of 141 patients underwent retroperitoneal necrosectomy, of whom 58 (41%) had complications and 18 (13%) required laparotomy. There were 22 (16%) deaths. Overall, 157 patients underwent endoscopic necrosectomy; major complications were reported in 31 (20%) and death in seven (5%). Laparoscopic necrosectomy was carried out in 46 patients, of whom five (11%) required laparotomy and three (7%) died. Conclusions: Minimally invasive necrosectomy is technically feasible and a body of evidence now suggests that acceptable outcomes can be achieved. There are no comparisons of results, either with open surgery or among different minimally invasive techniques. [source]


    Spatially adaptive color filter array interpolation for noiseless and noisy data

    INTERNATIONAL JOURNAL OF IMAGING SYSTEMS AND TECHNOLOGY, Issue 3 2007
    Dmitriy Paliy
    Abstract Conventional single-chip digital cameras use color filter arrays (CFA) to sample different spectral components. Demosaicing algorithms interpolate these data to complete red, green, and blue values for each image pixel, to produce an RGB image. In this article, we propose a novel demosaicing algorithm for the Bayer CFA. For the algorithm design, we assume that, following the concept proposed in (Zhang and Wu, IEEE Trans Image Process 14 (2005), 2167–2178), the initial interpolation estimates of color channels contain two additive components: the true values of color intensities and the errors, which are treated as additive noise. A specially designed signal-adaptive filter is used to remove this so-called demosaicing noise. This filter is based on the local polynomial approximation (LPA) and the paradigm of the intersection of confidence intervals applied to select varying scales of LPA. This technique is nonlinear and spatially adaptive with respect to the smoothness and irregularities of the image. The presented CFA interpolation (CFAI) technique takes significant advantage of the assumption that the original data are noise-free. Nevertheless, in many applications the observed data are noisy, and the noise must be treated as an important intrinsic degradation of the data. We develop an adaptation of the proposed CFAI for noisy data, integrating the denoising and CFAI into a single procedure. It is assumed that the data are given according to the Bayer pattern and corrupted by signal-dependent noise common to charge-coupled device (CCD) and complementary metal-oxide-semiconductor (CMOS) sensors. The efficiency of the proposed approach is demonstrated by experimental results with simulated and real data. © 2007 Wiley Periodicals, Inc. Int J Imaging Syst Technol, 17, 105–122, 2007 [source]
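
    For orientation only: the initial interpolation stage whose errors the authors model as "demosaicing noise" can be as simple as bilinear interpolation of the Bayer mosaic. The sketch below (RGGB layout assumed) is a baseline, not the paper's LPA-ICI filter.

    ```python
    import numpy as np
    from scipy.ndimage import convolve

    def bilinear_demosaic(cfa):
        """Baseline bilinear interpolation of an RGGB Bayer mosaic."""
        r = np.zeros_like(cfa, dtype=float)
        g = np.zeros_like(cfa, dtype=float)
        b = np.zeros_like(cfa, dtype=float)
        r[0::2, 0::2] = cfa[0::2, 0::2]   # red samples
        g[0::2, 1::2] = cfa[0::2, 1::2]   # green samples (two per 2x2 cell)
        g[1::2, 0::2] = cfa[1::2, 0::2]
        b[1::2, 1::2] = cfa[1::2, 1::2]   # blue samples
        k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0
        k_g = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0
        return np.dstack([convolve(r, k_rb), convolve(g, k_g), convolve(b, k_rb)])
    ```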


    The state of worker protections in the United States: Unregulated work in New York City

    INTERNATIONAL LABOUR REVIEW, Issue 2-3 2008
    Annette BERNHARDT
    Abstract. Using original data gathered in 2003–06, the authors examine the prevalence and types of non-compliance with labour law in New York City. Workplace violations, or "unregulated work", are widespread across a range of low-wage industries and have been driven by a mix of economic factors as well as public policy. The solution, the authors argue, is to strengthen law enforcement and provide for the new types of employment relationship that have resulted from changes in the organization of work and production. [source]


    Robust Methods for the Analysis of Income Distribution, Inequality and Poverty

    INTERNATIONAL STATISTICAL REVIEW, Issue 3 2000
    Maria-Pia Victoria-Feser
    Summary Income distribution embeds a large field of research subjects in economics. It is important to study how incomes are distributed among the members of a population, for example in order to determine tax policies for redistribution to decrease inequality, or to implement social policies to reduce poverty. The available data come mostly from surveys (and not censuses, as is often believed) and are often subject to long debates about their reliability because the sources of errors are numerous. Moreover, the form in which the data are available is not always as one would expect, i.e. complete and continuous (microdata): one may only have data in grouped form (in income classes) and/or truncated data where a portion of the original data has been omitted from the sample or simply not recorded. Because of these data features, it is important to complement classical statistical procedures with robust ones. In this paper such methods are presented, especially for model selection, model fitting with several types of data, inequality and poverty analysis, and ordering tools. The approach is based on the Influence Function (IF) developed by Hampel (1974) and further developed by Hampel, Ronchetti, Rousseeuw & Stahel (1986). It is also shown, through the analysis of real UK and Tunisian data, that robust techniques can give another picture of income distribution, inequality or poverty when compared to classical ones. [source]
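
    To make the bounded-influence idea concrete, here is a small sketch (not from the paper) contrasting the sample mean with a Huber M-estimator of location on income-like data containing one gross recording error; the tuning constant k = 1.345 is the conventional choice.

    ```python
    import numpy as np

    def huber_location(x, k=1.345, tol=1e-8, max_iter=100):
        """Huber M-estimator of location via iteratively reweighted means.

        Its influence function is bounded (flat beyond k scale units), so a
        few gross errors cannot drag the estimate far, unlike the mean.
        """
        x = np.asarray(x, float)
        mu = np.median(x)
        s = np.median(np.abs(x - mu)) / 0.6745   # robust scale from the MAD
        if s == 0:
            s = 1.0
        for _ in range(max_iter):
            r = (x - mu) / s
            w = np.minimum(1.0, k / np.maximum(np.abs(r), 1e-12))
            mu_new = np.sum(w * x) / np.sum(w)
            if abs(mu_new - mu) < tol:
                break
            mu = mu_new
        return mu

    incomes = np.concatenate([np.random.lognormal(10, 0.5, 999), [1e9]])
    print(np.mean(incomes), huber_location(incomes))  # the mean is dragged upward
    ```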


    The analysis of motor vehicle crash clusters using the vector quantization technique

    JOURNAL OF ADVANCED TRANSPORTATION, Issue 3 2010
    Lorenzo Mussone
    Abstract In this paper, a powerful tool for analyzing motor vehicle data based on the vector quantization (VQ) technique is demonstrated. The technique uses an approximation of a probability density function for a stochastic vector without assuming an "a priori" distribution. A self-organizing map (SOM) is used to transform accident data from an N-dimensional space onto a two-dimensional plane. The SOM retains all the original data yet provides an effective visual tool for describing patterns such as the frequency at which a particular category of events occurs. This enables new relationships to be identified. Accident data from three cities in Italy (Turin, Milan, and Legnano) are used to illustrate the usefulness of the technique. Crashes are aggregated and clustered by type, severity, and other dimensions. The paper discusses how this method can be utilized to further improve safety analysis. Copyright © 2010 John Wiley & Sons, Ltd. [source]
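
    A minimal SOM in NumPy, sketching the N-dimensional-to-2-D mapping the abstract describes; the grid size, learning schedule and the numeric encoding of crash records are all assumptions.

    ```python
    import numpy as np

    def train_som(data, grid=(10, 10), iters=5000, lr0=0.5, sigma0=3.0, seed=0):
        """Fit a self-organizing map: rows of `data` -> units on a 2-D grid."""
        rng = np.random.default_rng(seed)
        gy, gx = grid
        w = rng.normal(size=(gy, gx, data.shape[1]))      # codebook vectors
        yy, xx = np.mgrid[0:gy, 0:gx]
        for t in range(iters):
            x = data[rng.integers(len(data))]             # pick a random record
            frac = t / iters
            lr = lr0 * (1 - frac)                         # decaying learning rate
            sigma = sigma0 * (1 - frac) + 0.5             # shrinking neighbourhood
            d = np.linalg.norm(w - x, axis=2)
            by, bx = np.unravel_index(np.argmin(d), d.shape)  # best-matching unit
            h = np.exp(-((yy - by) ** 2 + (xx - bx) ** 2) / (2 * sigma ** 2))
            w += lr * h[..., None] * (x - w)              # pull the neighbourhood
        return w
    ```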


    Classification of GC-MS measurements of wines by combining data dimension reduction and variable selection techniques

    JOURNAL OF CHEMOMETRICS, Issue 8 2008
    Davide Ballabio
    Abstract Different classification methods (Partial Least Squares Discriminant Analysis, Extended Canonical Variates Analysis and Linear Discriminant Analysis), in combination with variable selection approaches (Forward Selection and Genetic Algorithms), were compared, evaluating their capabilities in the geographical discrimination of wine samples. Sixty-two samples were analysed by means of dynamic headspace gas chromatography mass spectrometry (HS-GC-MS) and the entire chromatographic profile was considered to build the dataset. Since variable selection techniques pose a risk of overfitting when a large number of variables is used, a method for coupling data dimension reduction and variable selection was proposed. This approach compresses windows of the original data by retaining only significant components of local Principal Component Analysis models. The subsequent variable selection is then performed on these locally derived score variables. The results confirmed that the classification models achieved on the reduced data were better than those obtained on the entire chromatographic profile, with the exception of Extended Canonical Variates Analysis, which gave acceptable models in both cases. Copyright © 2008 John Wiley & Sons, Ltd. [source]
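
    The compression step can be sketched as follows: split the chromatographic profile into contiguous windows, fit a local PCA per window, and hand only the leading local scores to the variable selection stage. The window size and component count below are assumptions, not the paper's settings.

    ```python
    import numpy as np

    def windowed_pca_scores(X, window=50, n_components=2):
        """Compress each window of variables to its leading local PCA scores.

        X: samples x variables matrix (e.g. chromatographic profiles).
        Returns a much narrower matrix of locally derived score variables.
        """
        blocks = []
        for start in range(0, X.shape[1], window):
            B = X[:, start:start + window]
            B = B - B.mean(axis=0)                       # centre the window
            U, S, Vt = np.linalg.svd(B, full_matrices=False)
            k = min(n_components, len(S))
            blocks.append(U[:, :k] * S[:k])              # local PCA scores
        return np.hstack(blocks)
    ```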


    Hierarchical principal component analysis (PCA) and projection to latent structure (PLS) technique on spectroscopic data as a data pretreatment for calibration

    JOURNAL OF CHEMOMETRICS, Issue 4 2001
    K. Janné
    Abstract Spectroscopic data consist of several hundred to several thousand variables, most of which are autocorrelated. When PCA and PLS techniques are used to interpret these kinds of data, the loading plots are usually complex due to the covariation in the spectrum, and therefore difficult to relate to the corresponding score plot. One of the standard methods used to decrease the influence of light scatter or shifts of the spectra is the multiplicative scatter correction technique. Another is the hierarchical multiblock segmentation technique, where new variables are created from the original data by blocking the spectra into sub-spectra, and then projecting the sub-spectra by PCA. These new variables are then used in the subsequent PCA or PLS calculations. These techniques reduce the random and unwanted signals from, e.g., light scatter, but still conserve all systematic information in the signals; the greatest advantage is that the technique gives an easier interpretation of the correlation between the scores and the loadings. Two examples, attenuated total reflection (ATR) and NIR, are presented, which show the advantages as well as the implementation of the method. Copyright © 2001 John Wiley & Sons, Ltd. [source]


    Adulteration of Chinese herbal medicines with synthetic drugs: a systematic review

    JOURNAL OF INTERNAL MEDICINE, Issue 2 2002
    E. Ernst
    Abstract. Ernst E. (University of Exeter, Exeter, UK). Adulteration of Chinese herbal medicines with synthetic drugs: a systematic review (Review Article). J Intern Med 2002; 252: 107–113. The popularity of Chinese herbal medicines (CHMs) demands a critical analysis of safety issues. The aim of this systematic review is to summarize data regarding adulterations of CHMs with conventional drugs. Literature searches were carried out in six databases. Articles containing original data on adulterations were considered without language restrictions. Eighteen case reports, two case series and four analytical investigations were identified. The list of adulterants contains drugs associated with serious adverse effects like corticosteroids. In several instances, patients were seriously harmed. One report from Taiwan suggests that 24% of all samples were contaminated with at least one conventional pharmacological compound. It is concluded that adulteration of CHMs with synthetic drugs is a potentially serious problem which needs to be addressed by adequate regulatory measures. [source]


    Kernel matching scheme for block bootstrap of time series data

    JOURNAL OF TIME SERIES ANALYSIS, Issue 2 2004
    Tae Yoon Kim
    Abstract. The block bootstrap for time series consists of randomly resampling blocks of the original data with replacement and aligning these blocks into a bootstrap sample. Recently, several matching schemes for the block bootstrap have been suggested to improve its performance by reduction of bias [Bernoulli 4 (1998), 305]. The matching schemes are usually achieved by aligning with higher likelihood those blocks which match at their ends. The kernel matching scheme we consider here takes some of the dependence structure of the data into account and is based on a kernel estimate of the conditional lag-one distribution. In this article, transition probabilities of the kernel matching scheme are investigated in detail by concentrating on a simple case. Our results discuss theoretical properties of the transition probability matrix, including ergodicity, which shows the potential of the matching scheme for bias reduction. [source]
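
    For contrast with the matching schemes discussed, here is the plain (unmatched) block bootstrap in a few lines; a matching scheme would instead choose each successive block with probability increasing in how well its start matches the previous block's end.

    ```python
    import numpy as np

    def block_bootstrap(x, block_len=20, seed=0):
        """Resample fixed-length blocks with replacement and align them
        into a bootstrap series of the original length."""
        rng = np.random.default_rng(seed)
        n = len(x)
        n_blocks = int(np.ceil(n / block_len))
        starts = rng.integers(0, n - block_len + 1, size=n_blocks)
        sample = np.concatenate([x[s:s + block_len] for s in starts])
        return sample[:n]
    ```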


    Consistent interpolation of equidistantly sampled data

    NUMERICAL LINEAR ALGEBRA WITH APPLICATIONS, Issue 4 2009
    Eike Rietsch
    Abstract Let u1, u2, …, uN with un ∈ ℝ denote the values of a function recorded or computed at N real and equidistant abscissa values tn = nΔt + t0 for n = 1, …, N. A consistent interpolation operator L, as defined in this paper, interpolates these function values for N new abscissas t'n = (n + ½)Δt + t0, the first N−1 of which are halfway between those originally given while the last one is outside of the original abscissa range. Application of L to these interpolated function values produces the last N−1 samples u2, u3, …, uN of the original data plus one extrapolated function value uN+1. Hence, L² is essentially a shift operator, but with a prediction component. The difference between various interpolation methods (e.g. polynomials, Fourier series) is now reduced to the way in which uN+1 is determined. This concept not only permits a uniform view of interpolation by quite different classes of functions but also allows the creation of more general interpolation, differentiation, and integration formulas, which can be tailored to particular problems. Copyright © 2008 John Wiley & Sons, Ltd. [source]
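
    One concrete realisation of such an operator L, under the assumption of a periodic (Fourier-series) model, is a half-sample phase shift in the frequency domain; applying it twice advances the samples by one step, matching the abstract's remark that L² is essentially a shift operator.

    ```python
    import numpy as np

    def half_sample_shift(u):
        """Evaluate equidistant samples half a step ahead via the FFT.

        Multiplying the spectrum by exp(2*pi*i*f*0.5) shifts the underlying
        trigonometric interpolant by half a sample; periodicity supplies
        the one 'extrapolated' value beyond the original abscissa range.
        """
        f = np.fft.fftfreq(len(u))                  # cycles per sample
        return np.real(np.fft.ifft(np.fft.fft(u) * np.exp(2j * np.pi * f * 0.5)))
    ```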


    Rock and a Hard Place: Public Willingness to Trade Civil Rights and Liberties for Greater Security

    POLITICS & POLICY, Issue 5 2009
    HANK C. JENKINS-SMITH
    Our research examines the implications of political beliefs for the relationship between preferences for freedom and security. We briefly situate the relationship in historical context and relate it to today's struggle with terrorism. Then we examine the influence of political beliefs on normative preferences for how liberty and security should be related and for perceptions of how they currently are being balanced. Using original data from a national Internet survey of more than 3,000 respondents, we examine causal relationships among core, domain, and policy context beliefs for preferences about balancing freedom and security. [source]


    Re: Occupational exposure to pesticides and pancreatic cancer.

    AMERICAN JOURNAL OF INDUSTRIAL MEDICINE, Issue 2 2001
    To the Editor: In our recent paper describing associations of pancreatic cancer with pesticides, Table I presented JEM scores for selected occupations. We omitted several scores from the table, which indicated less variability across the study than was assessed. We have added all possible scores in Table I to reflect this variability. Also, several scores in the table may need further explanation. The supervisors, food and beverage preparation and food counter, fountain and related occupations had possible exposure to nonagricultural fungicides (i.e. disinfectants). Water and sewage treatment plant operators may handle herbicides to kill algae. Historically, textile mills and some dry cleaning operations applied insecticides to fabric. The subjects with jobs of mixing/blending machine operator/tender, or welder and cutter, who were assigned pesticide exposures had worked in the chemical industry where pesticides may have been manufactured. The omissions were only in the reported data and not in the original data, and thus did not affect the epidemiologic results or conclusions. [source]


    Decisional needs assessment regarding Down syndrome prenatal testing: a systematic review of the perceptions of women, their partners and health professionals

    PRENATAL DIAGNOSIS, Issue 13 2008
    Sylvie St-Jacques
    Abstract Objective To identify decisional needs of women, their partners and health professionals regarding prenatal testing for Down syndrome through a systematic review. Methods Articles reporting original data from real clinical situations on sources of difficulty and/or ease in making decisions regarding prenatal testing for Down syndrome were selected. Data were extracted using a taxonomy adapted from the Ottawa Decision-Support Framework and the quality of the studies was assessed using Qualsyst validated tools. Results In all, 40 publications covering 32 unique studies were included. The majority concerned women. The most often reported sources of difficulty for decision-making in women were pressure from others, emotions and lack of information; in partners, emotion; in health professionals, lack of information, length of consultation, and personal values. The most important sources of ease were, in women, personal values, understanding and confidence in the medical system; in partners, personal values, information from external sources, and income; in health professionals, peer support and scientific meetings. Conclusion Interventions regarding a decision about prenatal testing for Down syndrome should address many decisional needs, which may indeed vary among the parties involved, whether women, their partners or health professionals. Very little is known about the decisional needs of partners and health professionals. Copyright © 2008 John Wiley & Sons, Ltd. [source]


    In-process Control of Design Inspection Effectiveness

    QUALITY AND RELIABILITY ENGINEERING INTERNATIONAL, Issue 1 2004
    Tzvi Raz
    Abstract We present a methodology for the in-process control of design inspection focusing on escaped defects. The methodology estimates the defect escape probability at each phase in the process using the information available at the beginning of that phase. The development of the models is illustrated by a case involving data collected from the design inspections of software components. The data include the size of the product component, as well as the time invested in preparing for the inspection and actually carrying it out. After the original data were smoothed with a clustering algorithm, to compensate for their excessive variability, a series of regression models was obtained, exhibiting increasingly better fits to the data as more information becomes available. We discuss how management can use such models to reduce escape risk as the inspection process evolves. Copyright © 2003 John Wiley & Sons, Ltd. [source]


    Sex differences in anthropoid mandibular canine lateral enamel formation

    AMERICAN JOURNAL OF PHYSICAL ANTHROPOLOGY, Issue 2 2009
    Debbie Guatelli-Steinberg
    Abstract Previous research has demonstrated that great ape and macaque males achieve large canine crown sizes primarily through extended canine growth periods. Recent work has suggested, however, that platyrrhine males may achieve larger canine sizes by accelerating rather than prolonging growth. This study tested the hypothesis that the ontogenetic pathway leading to canine sexual dimorphism in catarrhines differs from that of platyrrhines. To test this hypothesis, males and females of several catarrhine genera (Hylobates, Papio, Macaca, Cercopithecus, and Cercocebus) and three platyrrhine genera (Cebus, Ateles, and Callicebus) were compared in the number and spacing of perikymata (enamel growth increments) on their canine crowns. In addition, perikymata periodicities (the number of days of growth perikymata represent) were determined for five genera (Hylobates, Papio, Macaca, Cebus, and Ateles) using previously published as well as original data gathered for this study. The central findings are as follows: 1) males have more perikymata than females for seven of eight genera (in five of the seven, the differences are statistically significant); 2) in general, the greater the degree of sexual dimorphism, the greater the sex difference in male and female perikymata numbers; 3) there is no evidence of a systematic sex difference in primate periodicities; and 4) there is some evidence that sex differences in enamel formation rates may make a minor contribution to canine sexual dimorphism in Papio and Cercopithecus. These findings strongly suggest that in both catarrhines and platyrrhines prolongation of male canine growth is the primary mechanism by which canine crown sexual dimorphism is achieved. Am J Phys Anthropol, 2009. © 2009 Wiley-Liss, Inc. [source]


    The Government Agenda in Parliamentary Democracies

    AMERICAN JOURNAL OF POLITICAL SCIENCE, Issue 3 2004
    Lanny W. Martin
    Lawmaking is a challenge for coalition governments because it inherently demands cooperation and compromise by parties with divergent policy goals. The jurisdictional system of cabinet government exacerbates the problem by providing parties the means to undermine the coalition bargain in the pursuit of their own policy interests. In this article, I explore whether arrangements that allow partners to police one another induce compromise on one of the most important decisions taken by a government: the organization of the policy agenda. In an analysis of original data on the timing and policy content of over 800 government bills from four European democracies, I show that coalition governments pursue a largely "accommodative" agenda. Policy initiatives dealing with issues that are more attractive to all partners in the coalition are likely to be given priority on the agenda, while those dealing with relatively unattractive issues are likely to be postponed. [source]


    Policing the Bargain: Coalition Government and Parliamentary Scrutiny

    AMERICAN JOURNAL OF POLITICAL SCIENCE, Issue 1 2004
    Lanny W. Martin
    Policymaking by coalition governments creates a classic principal-agent problem. Coalitions are comprised of parties with divergent preferences who are forced to delegate important policymaking powers to individual cabinet ministers, thus raising the possibility that ministers will attempt to pursue policies favored by their own party at the expense of their coalition partners. What is going to keep ministers from attempting to move policy in directions they favor rather than sticking to the "coalition deal"? We argue that parties will make use of parliamentary scrutiny of "hostile" ministerial proposals to overcome the potential problems of delegation and enforce the coalition bargain. Statistical analysis of original data on government bills in Germany and the Netherlands supports this argument. Our findings suggest that parliaments play a central role in allowing multiparty governments to solve intracoalition conflicts. [source]


    Adaptive thinning of atmospheric observations in data assimilation with vector quantization and filtering methods

    THE QUARTERLY JOURNAL OF THE ROYAL METEOROLOGICAL SOCIETY, Issue 613 2005
    T. Ochotta
    Abstract In data assimilation for numerical weather prediction, measurements of various observation systems are combined with background data to define initial states for the forecasts. Current and future observation systems, in particular satellite instruments, produce large numbers of measurements with high spatial and temporal density. Such datasets significantly increase the computational costs of the assimilation and, moreover, can violate the assumption of spatially independent observation errors. To ameliorate these problems, we propose two greedy thinning algorithms, which reduce the number of assimilated observations while retaining the essential information content of the data. In the first method, the number of points in the output set is increased iteratively. We use a clustering method with a distance metric that combines spatial distance with difference in observation values. In a second scheme, we iteratively estimate the redundancy of the current observation set and remove the most redundant data points. We evaluate the proposed methods with respect to a geometric error measure and compare them with a uniform sampling scheme. We obtain good representations of the original data with thinnings retaining only a small portion of observations. We also evaluate our thinnings of ATOVS satellite data using the assimilation system of the Deutscher Wetterdienst. Impact of the thinning on the analysed fields and on the subsequent forecasts is discussed. Copyright © 2005 Royal Meteorological Society [source]
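
    In the spirit of the first (set-growing) algorithm, a greedy farthest-point sketch with a combined space/value metric is shown below; the paper's clustering details are replaced here by a simpler farthest-first rule, and alpha is an assumed trade-off weight between spatial distance and difference in observation values.

    ```python
    import numpy as np

    def greedy_thinning(points, values, n_keep, alpha=1.0):
        """Grow a thinned observation set by repeatedly adding the point
        farthest (in combined spatial/value distance) from those kept."""
        d_space = np.linalg.norm(points[:, None] - points[None], axis=2)
        d_value = np.abs(values[:, None] - values[None])
        d = d_space + alpha * d_value            # combined distance metric
        keep = [0]                               # seed with the first point
        mind = d[0].copy()                       # distance to the kept set
        for _ in range(n_keep - 1):
            nxt = int(np.argmax(mind))           # farthest remaining point
            keep.append(nxt)
            mind = np.minimum(mind, d[nxt])
        return np.array(keep)
    ```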


    What happens to translocated game birds that ,disappear'?

    ANIMAL CONSERVATION, Issue 5 2009
    M. J. Dickens
    Abstract The ultimate goal of most translocation efforts is to create a self-sustaining wild population of a species deliberately moved from one part of their range to another. As follow-up of a translocation attempt is often difficult, causes for failure are relatively unknown. Dispersal away from the release site is one potential source of failure because it decreases the likelihood of the released population establishing itself post-translocation. In this study, we used chukar Alectoris chukar as a surrogate for translocated game birds in order to conduct a large-scale experimental study. We observed that these desert-adapted birds demonstrate a strong fidelity for specific water sources. We also report the propensity for the translocated individuals to either disperse and return to their original water source site or remain at the release site. During two field seasons, we observed opposing behaviors such that the proportion of individuals returning to the capture site, versus those remaining at the release site, shifted between years. We analyzed this change between the years as well as within the years to assess the potential underlying causes such as translocated distance, differences in rainfall between seasons and water source type. We concluded that homing behavior was strong in this non-migratory bird species and that strength of this homing behavior varied, potentially due to conditions surrounding the limiting resource, water availability. The large-scale, original data presented here may help to explain why some releases result in a successfully established population while other releases result in widely dispersed individuals. [source]


    Frequency of the AGT Pro11Leu Polymorphism in Humans: does Diet Matter?

    ANNALS OF HUMAN GENETICS, Issue 1 2010
    Laure Ségurel
    Summary The Pro11Leu substitution in the AGXT gene, which causes primary hyperoxaluria type 1, is found with high frequency in some human populations (e.g., 5–20% in Caucasians). It has been suggested that this detrimental mutation could have been positively selected in populations with a meat-rich diet. In order to test this hypothesis, we investigated the occurrence of Pro11Leu in both herder and agriculturalist populations from Central Asia. We found a lower frequency of this detrimental mutation in herders, whose diet is more meat-rich, as compared to agriculturalists, which therefore challenges the universality of the previous claim. Furthermore, when combining our original data with previously published results, we could show that the worldwide genetic differentiation measured at the Pro11Leu polymorphism does not depart from neutrality. Hence, the distribution of the variation observed in the AGXT gene could be due to demographic history, rather than local adaptation to diet. [source]