Analysis Tools


  Selected Abstracts

    SAVANT analysis of the microelectronics and photonics testbed solar cell data

    Robert J. Walters
    Abstract An analysis of solar array data from the Microelectronic and Photonic Testbed (MPTB) space experiment is presented. The data are analyzed using the displacement damage dose (Dd) methodology developed by the US Naval Research Laboratory (NRL) as implemented in the Solar Array Verification and Analysis Tool (SAVANT). SAVANT is a Windows™-based computer code that predicts the on-orbit performance of a solar cell in a specified Earth orbit. The predicted solar cell performance produced by the SAVANT code is compared with the measured on-orbit data. In addition, the calculated data are compared with onboard dosimeter measurements. The results allow both a validation of the SAVANT code and a comparison of the space environment models with measured on-orbit data. The results show the models to match the measured data within a factor of 2. Published in 2005 by John Wiley & Sons, Ltd. [source]
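The abstract does not spell out the Dd methodology itself. As a rough illustration, displacement-damage-dose analysis commonly fits normalized cell output with a one-parameter logarithmic degradation curve; the functional form below is the standard semi-empirical one, but the constants are illustrative placeholders, not SAVANT's actual parameters:

```python
import math

def degradation(dd, c=0.15, dx=5e9):
    """Normalized solar-cell output P/P0 after a displacement damage
    dose dd, using the semi-empirical fit P/P0 = 1 - C*log10(1 + dd/Dx).
    c and dx are illustrative placeholder constants, not SAVANT values."""
    return 1.0 - c * math.log10(1.0 + dd / dx)

# No damage -> no degradation
assert degradation(0.0) == 1.0
# Output decreases monotonically with dose
assert degradation(1e10) > degradation(1e11)
```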

    Hyperlink Analyses of the World Wide Web: A Review

    Han Woo Park
    We have recently witnessed the growth of hyperlink studies in the field of Internet research. Although investigations have been conducted across many disciplines and topics, their approaches can be largely divided into hyperlink network analysis (HNA) and Webometrics. This article is an extensive review of the two analytical methods, and a reflection on their application. HNA casts hyperlinks between Web sites (or Web pages) as social and communicational ties, applying standard techniques from Social Networks Analysis to this new data source. Webometrics has tended to apply much simpler techniques combined with a more in-depth investigation into the validity of hypotheses about possible interpretations of the results. We conclude that hyperlinks are a highly promising but problematic new source of data that can be mined for previously hidden patterns of information, although much care must be taken in the collection of raw data and in the interpretation of the results. In particular, link creation is an unregulated phenomenon and so it would not be sensible to assume that the meaning of hyperlinks in any given context is evident, without a systematic study of the context of link creation, and of the relationship between link counts, among other measurements. Social Networks Analysis tools and techniques form an excellent resource for hyperlink analysis, but should only be used in conjunction with improved techniques for data collection, validation and interpretation. [source]
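As a toy illustration of the raw material both HNA and Webometrics start from (hypothetical sites and links, not data from the review), inlink and outlink counts can be tallied directly:

```python
from collections import Counter

# Hypothetical hyperlink data: (source_site, target_site) pairs.
links = [("a.org", "b.org"), ("a.org", "c.org"),
         ("b.org", "c.org"), ("d.org", "c.org")]

# Inlink and outlink counts are the simplest measures on which both
# hyperlink network analysis and Webometrics build.
outdegree = Counter(src for src, _ in links)
indegree = Counter(dst for _, dst in links)

assert indegree["c.org"] == 3   # most-linked-to site
assert outdegree["a.org"] == 2
```

As the review stresses, such counts only become meaningful after a systematic study of the context of link creation.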

    Getting more from your multicore: exploiting OpenMP from an open-source numerical scripting language

    Michael S. Noble
    Abstract We introduce SLIRP, a module generator for the S-Lang numerical scripting language, with a focus on its vectorization capabilities. We demonstrate how both SLIRP and S-Lang were easily adapted to exploit the inherent parallelism of high-level mathematical languages with OpenMP, allowing general users to employ tightly coupled multiprocessors in scriptable research calculations while requiring no special knowledge of parallel programming. Motivated by examples in the ISIS astrophysical modeling and analysis tool, performance figures are presented for several machine and compiler configurations, demonstrating beneficial speedups for real-world operations. Copyright © 2008 John Wiley & Sons, Ltd. [source]
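The decomposition that OpenMP applies to a vectorized loop can be sketched in Python (standing in for the S-Lang/SLIRP machinery described above; CPython threads share the GIL, so this illustrates the chunked work division rather than a real speedup):

```python
from concurrent.futures import ThreadPoolExecutor

def vec_sqrt(xs, workers=4):
    """Elementwise sqrt over chunks, mirroring how an OpenMP
    'parallel for' splits a vectorized array operation across
    threads. Illustrative sketch only, not the S-Lang mechanism."""
    n = len(xs)
    bounds = [(i * n // workers, (i + 1) * n // workers)
              for i in range(workers)]
    def chunk(b):
        lo, hi = b
        return [x ** 0.5 for x in xs[lo:hi]]
    out = []
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for part in pool.map(chunk, bounds):   # order is preserved
            out.extend(part)
    return out

assert vec_sqrt([1.0, 4.0, 9.0, 16.0]) == [1.0, 2.0, 3.0, 4.0]
```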

    SCALEA: a performance analysis tool for parallel programs

    Hong-Linh Truong
    Abstract Many existing performance analysis tools lack the flexibility to control instrumentation and performance measurement for code regions and performance metrics of interest. Performance analysis is commonly restricted to single experiments. In this paper we present SCALEA, which is a performance instrumentation, measurement, analysis, and visualization tool for parallel programs that supports post-mortem performance analysis. SCALEA currently focuses on performance analysis for OpenMP, MPI, HPF, and mixed parallel programs. It computes a variety of performance metrics based on a novel classification of overhead. SCALEA also supports multi-experiment performance analysis that allows one to compare and to evaluate the performance outcome of several experiments. A highly flexible instrumentation and measurement system is provided which can be controlled by command-line options and program directives. SCALEA can be interfaced by external tools through the provision of a full Fortran90 OpenMP/MPI/HPF frontend that allows one to instrument an abstract syntax tree at a very high-level with C-function calls and to generate source code. A graphical user interface is provided to view a large variety of performance metrics at the level of arbitrary code regions, threads, processes, and computational nodes for single- and multi-experiment performance analysis. Copyright © 2003 John Wiley & Sons, Ltd. [source]
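A minimal sketch of the kind of overhead metric such tools report (greatly simplified; SCALEA's actual classification breaks this total down into communication, synchronization, control-of-parallelism, and other categories):

```python
def total_overhead(t_parallel, t_seq, p):
    """Temporal overhead of a parallel run: measured time beyond the
    ideal T_seq / p on p processors. A simplified stand-in for the
    overhead classification SCALEA computes per code region."""
    return t_parallel - t_seq / p

# 8 workers, 100 s sequential, measured 16 s -> 3.5 s of overhead
assert total_overhead(16.0, 100.0, 8) == 3.5
```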

    Inflammation reduces HDL protection against primary cardiac risk

    James P. Corsetti
    Eur J Clin Invest 2010; 40 (6): 483–489 Abstract Background: We recently reported high high-density lipoprotein (HDL) cholesterol as a predictor of recurrent risk in a subgroup of postinfarction patients defined by hypercholesterolemia and high C-reactive protein (CRP) levels. We investigated whether a similar high-risk subgroup might exist for incident cardiovascular disease. Material and Methods: A graphical exploratory data analysis tool was used to identify high-risk subgroups in a male population-based cohort (n = 3405) from the Prevention of Renal and Vascular End-stage Disease study by generating 3-dimensional mappings of risk over the HDL-cholesterol/CRP domain, with subsequent use of Kaplan–Meier analysis to verify high risk. Within-subgroup risk was assessed using Cox proportional hazards regression and Kaplan–Meier analysis. Results: Mappings revealed two high-risk subgroups: a low HDL-cholesterol/high CRP subgroup and a high HDL-cholesterol/high CRP subgroup. The low HDL-cholesterol subgroup demonstrated a pattern of metabolic syndrome dyslipidemia, contrasted with a predominantly unremarkable biomarker pattern for the high HDL-cholesterol subgroup. However, in the high HDL-cholesterol subgroup, CRP levels were higher than in the low HDL-cholesterol subgroup; and within the high HDL-cholesterol subgroup, CRP predicted risk. Moreover, in the high HDL-cholesterol subgroup, risk was associated with lower triglyceride levels in conjunction with presumptively larger HDL particles. Conclusions: High HDL-cholesterol and high CRP levels define a subgroup of men at high risk for incident cardiovascular disease. High HDL cholesterol-associated risk likely relates to impaired HDL particle remodelling in the setting of inflammation. This approach may facilitate identification of additional inflammation-related mechanisms underlying high HDL cholesterol-associated risk and potentially influence management of such patients. [source]

    GIS visualisation and analysis of mobile hydroacoustic fisheries data: a practical example

    A. R. COLEY
    Abstract Hydroacoustic remote sensing of fish populations residing in large freshwater bodies has become a widely used and effective monitoring tool. However, easy visualisation and effective analysis of the data are more problematic. The use of GIS-based interpolations enables easy visualisation of survey data and provides an analysis tool for investigating fish populations. Three years of hydroacoustic surveys of Cardiff Bay in South Wales presented an opportunity to develop analysis and visualisation techniques. Inverse distance weighted (IDW) interpolation was used to show the potential of such techniques in analysing survey data both spatially (1-year survey) and temporally (by looking at the spatial changes between years). IDW was fairly successful in visualising the hydroacoustic data for Cardiff Bay. However, other techniques may improve on this initial work and provide improved analysis, total density estimates and statistically derived estimations of prediction error. [source]
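A bare-bones IDW interpolator shows the technique's core idea (illustrative only; GIS packages add search radii, barriers, and anisotropy that a study like this would rely on):

```python
def idw(sample_points, query, power=2.0):
    """Inverse distance weighted estimate at `query` from
    (x, y, value) samples; each sample is weighted by 1/d**power,
    so nearby surveys dominate the interpolated surface."""
    num = den = 0.0
    for x, y, v in sample_points:
        d2 = (x - query[0]) ** 2 + (y - query[1]) ** 2
        if d2 == 0.0:
            return v          # query coincides with a sample point
        w = d2 ** (-power / 2.0)
        num += w * v
        den += w
    return num / den

pts = [(0, 0, 10.0), (2, 0, 30.0)]
# The midpoint is equidistant, so the estimate is the plain mean.
assert idw(pts, (1, 0)) == 20.0
# Nearer to the first sample, the estimate pulls toward 10.
assert idw(pts, (0.5, 0)) < 20.0
```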

    Threatened archaeological, historic, and cultural resources of the Georgia Coast: Identification, prioritization and management using GIS technology

    Michael H. Robinson
    Archaeological sites in beach and estuarine environments are continually threatened by diverse natural marine processes. Shoreline erosion, bluff retreat, and sea level rise all present potential for site destruction. Using historic maps, aerial imagery, and field survey methods in a GIS, 21 potentially significant archaeological sites on Georgia barrier islands were selected for determination of site-specific rates of shoreline change using a powerful, new, moving-boundary GIS analysis tool. A prioritized list of sites, based on the order of site loss from erosion, was generated to assist coastal managers in identifying and documenting sites most at risk. From the original selection of 21 sites, 11 sites were eroding, 8 shorelines were stable, and 2 shorelines were accreting. The methodology outlined here produces critical information on archaeological site loss rates and provides a straightforward means of prioritizing sites for detailed documentation. © 2010 Wiley Periodicals, Inc. [source]
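In its simplest end-point form, a site-specific rate of shoreline change reduces to a dated-position difference along a transect. A hedged sketch (the moving-boundary GIS tool described above is considerably more sophisticated; dates and distances here are invented):

```python
from datetime import date

def shoreline_change_rate(d1, pos1_m, d2, pos2_m):
    """End-point rate of shoreline change (m/yr) between two dated
    shoreline positions measured along one transect; negative values
    indicate erosion toward the site."""
    years = (d2 - d1).days / 365.25
    return (pos2_m - pos1_m) / years

# Shoreline retreated 12 m over 8 years -> -1.5 m/yr
rate = shoreline_change_rate(date(1998, 6, 1), 100.0,
                             date(2006, 6, 1), 88.0)
assert round(rate, 2) == -1.5
```

Ranking sites by such rates (and their distance to the shoreline) is what yields a prioritized list for documentation.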

    Knowledge acquisition for the development of an upper-body work-related musculoskeletal disorders analysis tool

    Isabel Lopes Nunes
    ERGO_X is a fuzzy expert system that supports workstation ergonomic analysis and provides advice on corrective measures aimed at improving the overall quality of the ergonomic design. ERGO_X was designed in a modular way to make further developments easier and to allow the selection of different ergonomic analysis contexts. The modularity feature is mainly a result of the knowledge base's modular structure. Each module was built as a multilevel tree fuzzy relation. This relation reflects the interaction between attributes that are used to evaluate the level of severity of the relevant risk factors present at the analyzed workstation. The aim of this study is to address some aspects of the knowledge acquisition process involved in the development of the ERGO_X knowledge base. In this regard, the author refers to her knowledge engineering activities in the development of a work-related musculoskeletal disorder module. © 2007 Wiley Periodicals, Inc. Hum Factors Man 17: 149–162, 2007. [source]

    Surface wavelets: a multiresolution signal processing tool for 3D computational modelling

    Kevin Amaratunga
    Abstract In this paper, we provide an introduction to wavelet representations for complex surfaces (surface wavelets), with the goal of demonstrating their potential for 3D scientific and engineering computing applications. Surface wavelets were originally developed for representing geometric objects in a multiresolution format in computer graphics. These wavelets share all of the major advantages of conventional wavelets, in that they provide an analysis tool for studying data, functions and operators at different scales. However, unlike conventional wavelets, which are restricted to uniform grids, surface wavelets have the power to perform signal processing operations on complex meshes, such as those encountered in finite element modelling. This motivates the study of surface wavelets as an efficient representation for the modelling and simulation of physical processes. We show how surface wavelets can be applied to partial differential equations, stated either in integral form or in differential form. We analyse and implement the wavelet approach for a model 3D potential problem using a surface wavelet basis with linear interpolating properties. We show both theoretically and experimentally that an O(h) convergence rate, h being the mesh size, can be obtained by retaining only O((log N)^(7/2) N) entries in the discrete operator matrix, where N is the number of unknowns. The principles described here may also be extended to volumetric discretizations. Copyright © 2001 John Wiley & Sons, Ltd. [source]
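The average/detail split that underlies all wavelet analysis can be illustrated in one dimension with a single unnormalized Haar step; surface wavelets generalize exactly this split from uniform 1D grids to irregular triangle meshes:

```python
def haar_step(signal):
    """One level of the (unnormalized) Haar wavelet transform:
    pairwise averages give the coarse approximation, pairwise
    half-differences give the detail coefficients."""
    avg = [(a + b) / 2 for a, b in zip(signal[::2], signal[1::2])]
    det = [(a - b) / 2 for a, b in zip(signal[::2], signal[1::2])]
    return avg, det

avg, det = haar_step([9.0, 7.0, 3.0, 5.0])
assert avg == [8.0, 4.0]
assert det == [1.0, -1.0]
# Perfect reconstruction: even samples = avg + det, odd = avg - det
assert [x + y for x, y in zip(avg, det)] == [9.0, 3.0]
```

Small detail coefficients can be discarded, which is the mechanism behind the sparse operator matrices the paper exploits.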

    Use of image analysis techniques for objective quantification of the efficacy of different hair removal methods

    S. Bielfeldt
    In the field of consumer-used cosmetics for hair removal and hair growth reduction, there is a need for improved quantitative methods to enable the evaluation of efficacy and claim support. Optimized study designs and investigated endpoints are lacking to compare the efficacy of standard methods, like shaving or plucking, with new methods and products, such as depilating instruments or hair-growth-reducing cosmetics. Non-invasive image analysis, using a high-performance microscope combined with an optimized image analysis tool, was investigated to assess hair growth. In one step, high-resolution macrophotographs of the legs of female volunteers after shaving and plucking with cold wax were compared to observe short-term hair regrowth. In a second step, images obtained after plucking with cold wax were taken over a long-term period to assess the time, after which depilated hairs reappeared on the skin surface. Using image analysis, parameters like hair length, hair width, and hair projection area were investigated. The projection area was found to be the parameter most independent of possible image artifacts such as irregularities in skin or low contrast due to hair color. Therefore, the hair projection area was the most appropriate parameter to determine the time of hair regrowth. This point of time is suitable to assess the efficacy of different hair removal methods or hair growth reduction treatments by comparing the endpoint after use of the hair removal method to be investigated to the endpoint after simple shaving. The closeness of hair removal and visible signs of skin irritation can be assessed as additional quantitative parameters from the same images. Discomfort and pain rating by the volunteers complete the set of parameters, which are required to benchmark a new hair removal method or hair-growth-reduction treatment. 
Image analysis combined with high-resolution imaging techniques is a powerful tool to objectively assess parameters like hair length, hair width, and projection area. To achieve reliable data and to reduce well known image-analysis artifacts, it was important to optimize the technical equipment for use on human skin and to improve image analysis by adaptation of the image-processing procedure to the different skin characteristics of individuals, like skin color, hair color, and skin structure. [source]
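A minimal sketch of the projection-area measurement the study found most robust: threshold a grayscale image and count dark (hair) pixels. The threshold and pixel scale below are illustrative assumptions; per the text, tuning that step to individual skin and hair colour is where the real work lies:

```python
def hair_projection_area(image, threshold, pixel_area_mm2=0.0001):
    """Projection area of hair in a grayscale image (nested lists,
    0 = black): count pixels darker than `threshold` and convert
    to mm^2. Threshold and pixel scale are illustrative values."""
    dark = sum(1 for row in image for px in row if px < threshold)
    return dark * pixel_area_mm2

img = [[200, 200, 40],
       [200, 35, 200],
       [30, 200, 200]]   # a diagonal 'hair' of 3 dark pixels
assert hair_projection_area(img, threshold=50) == 3 * 0.0001
```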

    World Vision case study of Sigma, an analysis tool based on the Alterian database engine

    Scott Logie
    One of the major issues charities have been concerned with for many years is the inability to access donor records for marketing analysis purposes: donor information has literally been locked up. Charity database tools such as Raiser's Edge and Alms have been built to provide donor details on a record-by-record basis rather than to provide summary information across the entire base. These tools carry out this function very well and have the added benefit of providing data to call centre staff as they make or receive calls from donors. As each charity has to compete more intensely for its share of donor value, however, more detailed behavioural analysis of donor bases is required. To do this, access to the entire donor data set is essential, not one record at a time, but structured in a way that allows ad hoc querying. This paper discusses the various technologies that can be used to access donor data for analytical purposes and explains the merits of a new database engine developed by Alterian that allows easy and fast access to many records across multiple data tables. It also shows how one organisation has used the engine to develop a bespoke analysis tool for the charity sector, and how a leading relief and development agency, World Vision, is using this tool. Copyright © 2001 Henry Stewart Publications [source]

    Conformational analysis by intersection: CONAN

    Andrew Smellie
    Abstract As high throughput techniques in chemical synthesis and screening improve, more demands are placed on computer assisted design and virtual screening. Many of these computational methods require one or more three-dimensional conformations for molecules, creating a demand for a conformational analysis tool that can rapidly and robustly cover the low-energy conformational spaces of small molecules. A new algorithm of intersection is presented here, which quickly generates (on average <0.5 seconds/stereoisomer) a complete description of the low energy conformational space of a small molecule. The molecule is first decomposed into nonoverlapping nodes N (usually rings) and overlapping paths P with conformations (N and P) generated in an offline process. In a second step the node and path data are combined to form distinct conformers of the molecule. Finally, heuristics are applied after intersection to generate a small representative collection of conformations that span the conformational space. In a study of ~97,000 randomly selected molecules from the MDDR, results are presented that explore these conformations and their ability to cover low-energy conformational space. © 2002 Wiley Periodicals, Inc. J Comput Chem 24: 10–20, 2003 [source]
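The node/path recombination step can be caricatured with a Cartesian product over precomputed fragment libraries (the fragment names below are hypothetical; CONAN's real intersection step also enforces geometric compatibility where fragments overlap):

```python
from itertools import product

# Hypothetical fragment conformer libraries: each ring node and
# linker path contributes a few precomputed low-energy states.
node_confs = {"ringA": ["chair", "boat"]}
path_confs = {"linker": ["anti", "gauche+", "gauche-"]}

def assemble(nodes, paths):
    """Combine fragment conformations into whole-molecule conformers,
    in the spirit of CONAN's second (intersection) step."""
    keys = list(nodes) + list(paths)
    pools = [nodes[k] for k in nodes] + [paths[k] for k in paths]
    return [dict(zip(keys, combo)) for combo in product(*pools)]

confs = assemble(node_confs, path_confs)
assert len(confs) == 2 * 3
assert {"ringA": "chair", "linker": "anti"} in confs
```

The heuristic pruning the abstract mentions would then cut this combinatorial set down to a small representative collection.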

    An ecologist's guide to ecogenomics

    JOURNAL OF ECOLOGY, Issue 1 2007
    N. J. OUBORG
    Summary
    1. Currently, plant ecologists are increasingly adopting approaches and techniques from molecular biology. The new field of ecogenomics aims at understanding the mechanistic basis for adaptation and phenotypic variation by using genomic techniques to investigate the mechanistic and evolutionary basis of species interactions, and focuses on identifying the genes affected by evolution.
    2. While the entire toolbox of genomics is only available for model species such as Arabidopsis thaliana, we describe the options open to ecologists interested in pursuing an ecogenomics research program on ecologically relevant traits or phenomena in non-model species, for which part of the genomic toolbox may be currently unavailable. In these non-model species, a viable ecogenomics research program is possible with relatively modest effort.
    3. Four challenges to the further development of ecogenomics are described and discussed: (i) the ecogenomic study of non-model species; (ii) reconciliation of the experimental languages of ecology and evolutionary biology with molecular biology; (iii) development of specific ecogenomic data analysis tools; and (iv) adoption of a multidisciplinary cooperative research culture.
    4. An important task for ecologists is to provide the necessary ecological input (the 'eco' part) to ecogenomics. [source]

    The acceptability to stakeholders of mandatory nutritional labelling in France and the UK: findings from the PorGrow project

    M. Holdsworth
    Abstract Background: Implementing a European Union (EU)-wide mandatory nutrition labelling scheme has been advocated as part of a multi-pronged strategy to tackle obesity. The type of scheme needs to be acceptable to all key stakeholders. This study explored stakeholders' viewpoints on labelling in two contrasting food cultures (France and the UK) to see whether attitudes were influenced by sectoral interests and/or national context. Methods: Using Multi Criteria Mapping, a decision analysis tool that assesses stakeholder viewpoints, quantitative and qualitative data were gathered during tape-recorded interviews. In France and the UK, 21 comparable stakeholders appraised nutritional labelling with criteria of their own choosing (i.e. feasibility, societal benefits, social acceptability, efficacy in addressing obesity, additional health benefits) and three criteria relating to cost (to industry; the public sector; individuals). When scoring, interviewees provided both optimistic (best case) and pessimistic (worst case) judgements. Results: Overall, mandatory nutritional labelling was appraised least favourably in France. Labelling performed worse under optimistic (best case) scenarios in France for five out of eight sets of criteria. French stakeholders viewed labelling as expensive, as having fewer benefits to society, and as being marginally less effective than UK stakeholders did. However, French interviewees thought implementing labelling was feasible and would provide additional health benefits. British and French stakeholders made similar quantitative judgements on how socially acceptable mandatory labelling would be. Conclusions: There is agreement between some stakeholder groups in the two countries, especially food chain operators. However, cultural differences emerged that could influence the impact of an EU-wide mandatory labelling scheme in both countries. [source]

    Prediction of Polymer Properties in LDPE Reactors

    Gary J. Wells
    Abstract Summary: A new analysis tool is presented that uses the governing kinetic scheme to predict properties of low-density polyethylene (LDPE) such as the detailed shape of the molecular weight distribution (MWD). A model that captures mixing details of autoclave reactor operation is used to provide a new criterion for the onset of MWD shouldering. Kinetic effects are shown to govern the existence of MWD shoulders in LDPE reactors, even when operation is far from perfectly mixed. MWD shoulders occur when the mean reaction environment has a relatively high radical concentration, a high polymer content, and a low temperature. Such conditions maximize long chain formation by polymer transfer and combination-termination, while limiting chain scission. For imperfectly mixed reactors, the blending of polymer distributions produced in different spatial locations has a small effect on the composite MWD. However, for adiabatic LDPE autoclaves, imperfect mixing broadens the stable range of mean reactor conditions, and thereby increases the possibility of MWD shouldering. Polymer MWD produced in an LDPE autoclave reactor by various kinetic mechanisms. [source]

    A New Video Image Analysis System to Study Red Blood Cell Dynamics and Oxygenation in Capillary Networks

    MICROCIRCULATION, Issue 6 2005
    ABSTRACT Objective: The authors present a Measurement and Analysis System for Capillary Oxygen Transport (MASCOT) to study red blood cell (RBC) dynamics and oxygenation in capillary networks. The system enables analysis of capillaries to study geometry and morphology and provides values for capillary parameters such as diameter and segment length. It also serves as an analysis tool for capillary RBC flow characteristics, including RBC velocity, lineal density, and supply rate. Furthermore, the system provides a means of determining the oxygen saturation of hemoglobin contained within RBCs, by analysis of synchronized videotapes containing images at two wavelengths, enabling the quantification of the oxygen content of individual RBCs. Methods: Video recordings of RBC flow at two wavelengths, 420 nm (isosbestic) and 436 nm (oxygen sensitive), are made using a dual camera video microscopy system. The 420-nm recording is used to generate images based on the variance of light intensity fluctuations that help to identify capillaries in a given field of view that are in sharp focus and exhibit flow of individual RBCs separated by plasma gaps. A region of interest enclosing the desired capillary is defined and a fixed number of successive video frames at the two wavelengths are captured. Next, a difference image is created, which delineates the RBC column, whose width is used to estimate the internal diameter of the capillary. The 420-nm images are also used to identify the location and centroid of each RBC within the capillary. A space–time image is generated to compute the average RBC velocity. Lineal density is calculated as the number of RBCs per unit length of a capillary segment. The mean optical density (OD) of each RBC is calculated at both wavelengths, and the average SO2 for each cell is determined from OD436/OD420. 
Results and Conclusions: MASCOT is a robust and flexible system that requires simple hardware, including an SGI workstation fitted with an audio-visual module, a VCR, and an oscilloscope. Since the new system provides information on an individual cell basis from entire capillary segments, the authors believe that results obtained using MASCOT will be more accurate than those obtained from previous systems. Due to its flexibility and ease of extension to other applications, MASCOT has the potential to be applied widely as an analysis tool for capillary oxygen transport measurements. [source]
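The space-time velocity estimate can be sketched as locating the RBC streak in each frame-row and fitting its displacement per frame; this is a simplified, single-streak version of what MASCOT automates, with toy intensity values:

```python
def rbc_velocity(space_time, px_per_um=1.0, fps=30.0):
    """Estimate RBC velocity (um/s) from a space-time image: one row
    of intensity per video frame along the capillary axis. Locates
    the dark RBC streak in each row and least-squares fits its
    displacement per frame. Single-streak illustrative sketch."""
    positions = [row.index(min(row)) for row in space_time]
    n = len(positions)
    mean_t = (n - 1) / 2.0
    mean_x = sum(positions) / n
    num = sum((t - mean_t) * (x - mean_x)
              for t, x in enumerate(positions))
    den = sum((t - mean_t) ** 2 for t in range(n))
    slope_px_per_frame = num / den
    return slope_px_per_frame * fps / px_per_um

# A streak advancing 2 pixels per frame at 30 fps -> 60 um/s
sti = [[9, 0, 9, 9, 9, 9, 9],
       [9, 9, 9, 0, 9, 9, 9],
       [9, 9, 9, 9, 9, 0, 9]]
assert rbc_velocity(sti) == 60.0
```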

    microsatellite analyser (MSA): a platform independent analysis tool for large microsatellite data sets

    Daniel Dieringer
    Abstract In molecular ecology the analysis of large microsatellite data sets is becoming increasingly popular. Here we introduce a new software tool, which is specifically designed to facilitate the analysis of large microsatellite data sets. All common microsatellite summary statistics and distances can be calculated. Furthermore, the microsatellite analyser (msa) software offers an improved method to deal with inbred samples (such as Drosophila isofemale lines). Executables are available for Windows and Macintosh computers. [source]
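One of the summary statistics such software computes, gene diversity (expected heterozygosity), is simple to state; a sketch with toy allele data (MSA itself also offers distances, pairwise statistics, and the inbred-sample correction mentioned above):

```python
from collections import Counter

def expected_heterozygosity(alleles):
    """Gene diversity H = 1 - sum(p_i^2) over allele frequencies
    at one locus: a standard microsatellite summary statistic
    (shown here without small-sample correction)."""
    n = len(alleles)
    counts = Counter(alleles)
    return 1.0 - sum((c / n) ** 2 for c in counts.values())

# Repeat lengths at one locus across sampled chromosomes:
assert expected_heterozygosity([12, 12, 14, 14]) == 0.5
assert expected_heterozygosity([12, 12, 12, 12]) == 0.0  # monomorphic
```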

    The Behavior Engineering Model at Work on a Small Scale: Using Task Clarification, Self-Monitoring, and Public Posting to Improve Customer Service

    John Austin
    ABSTRACT Gilbert's (1978/1996) Behavior Engineering Model (BEM) can enable the success of novice performance engineers by prompting appropriate front-end analysis. This paper describes the third author's first performance improvement project conducted in the customer service department at an insurance agency. Front-end performance analysis informed the design of an intervention package that addressed particular environment and person variables. This package included task clarification, employee self-monitoring, and public posting of group performance. A multiple baseline design across behaviors was used to assess the effects of the intervention. The performance targets were: 1) the percentage of transactions where Customer Service Representatives (CSRs) used customer names, and 2) the percentage of transactions where CSRs suggested additional services available to customers. Average performance during intervention was more than 50% better than average baseline performance for both targets. Results are discussed in terms of the utility of the BEM as a front-end analysis tool that can guide novice performance engineers to build simple and inexpensive, yet effective, performance improvement interventions. [source]

    Capturing Government Policy on the Left–Right Scale: Evidence from the United Kingdom, 1956–2006

    POLITICAL STUDIES, Issue 4 2009
    Armèn Hakhverdian
    The left–right scheme is the most widely used and parsimonious representation of political competition. Yet long time series of the left–right position of governments are sparse. Existing methods are of limited use in dynamic settings due to insufficient time points, which hinders the proper specification of time-series regressions. This article analyses legislative speeches in order to construct an annual left–right policy variable for Britain from 1956 to 2006. Using a recently developed content analysis tool, known as Wordscores, it is shown that speeches yield valid and reliable estimates of the left–right position of British government policy. Long time series such as the one proposed in this article are vital to building dynamic macro-level models of politics. The measure is cross-validated with four independent sources: (1) it compares well to expert surveys; (2) a rightward trend is found in post-war British government policy; (3) Conservative governments are found to be more right wing in their policy outputs than Labour governments; (4) conventional accounts of British post-war politics support the pattern of government policy movement on the left–right scale. [source]
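The core of Wordscores is compact enough to sketch: words inherit scores from reference texts with known positions, and an unscored ("virgin") text is placed at the frequency-weighted mean of its words' scores. The toy texts below are invented, and the published method adds rescaling and uncertainty estimates omitted here:

```python
def wordscores(ref_texts, ref_scores, virgin_text):
    """Minimal Wordscores sketch. ref_texts are word lists with
    known positions ref_scores; each word gets the expected score
    of the references it appears in, and the virgin text is scored
    as the mean score of its recognized words."""
    word_scores = {}
    vocab = set(w for t in ref_texts for w in t)
    for w in vocab:
        # relative frequency of w in each reference text
        freqs = [t.count(w) / len(t) for t in ref_texts]
        total = sum(freqs)
        word_scores[w] = sum(f / total * s
                             for f, s in zip(freqs, ref_scores))
    scored = [word_scores[w] for w in virgin_text if w in word_scores]
    return sum(scored) / len(scored)

left = ["nationalise", "union", "welfare", "welfare"]
right = ["privatise", "market", "market", "tax"]
v = wordscores([left, right], [-1.0, 1.0], ["welfare", "market"])
assert v == 0.0   # one purely 'left' word, one purely 'right' word
```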

    Proteomic profiling reveals comprehensive insights into adrenergic receptor-mediated hypertrophy in neonatal rat cardiomyocytes

    Zijian Li
    Abstract Myocardial adrenergic receptors (ARs) play important roles in cardiac hypertrophy. However, the detailed molecular mechanism of AR-mediated cardiac hypertrophy remains elusive to date. To gain full insight into how ARs are involved in the regulation of cardiac hypertrophy, protein expression profiling was performed with a comparative proteomics approach on neonatal rat cardiomyocytes. Forty-six proteins were identified as differentially expressed in hypertrophic cardiomyocytes induced by AR stimulation. To better understand the biological significance of the obtained proteomic data, we utilized the Ingenuity Pathway Analysis tool to construct biological networks and analyze functions and pathways that might associate with AR-mediated cardiac hypertrophy. Pathway analysis strongly suggested that ROS may be involved in the development of AR-mediated cardiac hypertrophy, which was then confirmed by further experimentation. The results showed that a marked increase in ROS production was detected in AR-mediated cardiac hypertrophy and that blocking of ROS production significantly inhibited AR-mediated cardiac hypertrophy. We further proved that the ROS production was through NADPH oxidase or the mitochondrial electron transport chain and that this ROS accumulation resulted in activation of extracellular signal-regulated kinase 1/2, leading to AR-mediated cardiac hypertrophy. These experimental results support the hypothesis, from the Ingenuity Pathway Analysis, that AR-mediated cardiac hypertrophy is associated with the dysregulation of a complicated oxidative stress-regulatory network. In conclusion, our results provide a basis for understanding the detailed molecular mechanisms of AR-mediated cardiac hypertrophy. [source]

    Construction of statistical shape atlases for bone structures based on a two-level framework

    Chenyu Wu
    Abstract Background The statistical shape atlas is a 3D medical image analysis tool that encodes shape variations between populations. However, efficiency, accuracy and finding the correct correspondence are still unsolved issues during the construction of the atlas. Methods We developed a two-level-based framework that speeds up the registration process while maintaining accuracy of the atlas. We also proposed a semi-automatic strategy to achieve segmentation and registration simultaneously, without knowing any prior information about the shape. Results We have separately constructed the atlas for the femur and spine. The experimental results demonstrate the efficiency and accuracy of our methods. Conclusions Our two-level framework and semi-automatic strategy are able to efficiently construct the atlas for bone structures without losing accuracy. We can handle either 3D surface data or raw DICOM images. Copyright © 2009 John Wiley & Sons, Ltd. [source]

    Heritability of vasculopathy, autoimmune disease, and fibrosis in systemic sclerosis: A population-based study

    ARTHRITIS & RHEUMATISM, Issue 7 2010
    Tracy Frech
    Objective To investigate the familiality of systemic sclerosis (SSc) in relation to Raynaud's phenomenon (RP) (a marker of vasculopathy), other autoimmune inflammatory disease, and fibrotic interstitial lung disease (ILD). Methods A genealogic resource, the Utah Population Database (UPDB), was used to test heritability of RP, other autoimmune disease, and ILD. Diseases were defined by International Classification of Diseases, Ninth Revision codes and identified from statewide discharge data, the University of Utah Health Science Center Enterprise Data Warehouse, and death certificates, and were linked to the UPDB for analysis. Familial standardized incidence ratios (FSIRs), relative risks (RRs) to first-, second-, third-, and fourth-degree relatives for SSc, RP, other autoimmune disease, and ILD (with 95% confidence intervals [95% CIs]), and population attributable risk (PAR) were calculated. Results A software kinship analysis tool was used to analyze 1,037 unique SSc patients. Fifty SSc families had significant FSIRs, ranging from 2.07 to 17.60. The adjusted PAR was ~8%. The RRs were significant for other autoimmune disease in first-degree relatives (2.49 [95% CI 1.99–3.41], P = 2.42 × 10^-15) and second-degree relatives (1.48 [95% CI 1.34–2.39], P = 0.002), for RP in first-degree relatives (6.38 [95% CI 3.44–11.83], P = 4.04 × 10^-9) and second-degree relatives (2.39 [95% CI 1.21–4.74], P = 0.012), and for ILD in first-degree relatives (1.53 [95% CI 1.04–2.26], P = 0.03), third-degree relatives (1.47 [95% CI 1.18–1.82], P = 0.0004), and fourth-degree relatives (1.2 [95% CI 1.06–1.35], P = 0.004). Conclusion These data suggest that SSc pedigrees include more RP, autoimmune inflammatory disease, and ILD than would be expected by chance. In SSc pedigrees, genetic predisposition to vasculopathy is the most frequent risk among first-degree relatives. [source]
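The building block of an FSIR is the standardized incidence ratio: observed cases among relatives divided by the cases expected at the population rate. A hedged sketch with invented numbers (the study's FSIRs additionally weight relatives by degree of kinship):

```python
def standardized_incidence_ratio(observed, person_years, rate):
    """SIR = observed / expected, where expected applies the
    population incidence rate to the relatives' person-years at
    risk. Values above 1 suggest familial excess. Simplified
    illustration; not the UPDB kinship-weighted computation."""
    expected = person_years * rate
    return observed / expected

# 6 cases among relatives over 20,000 person-years, population
# rate 1.5 per 10,000 person-years -> expected 3 cases, SIR = 2
sir = standardized_incidence_ratio(6, 20_000, 1.5 / 10_000)
assert round(sir, 9) == 2.0
```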

    Survey-gap analysis in expeditionary research: where do we go from here?

    V. A. FUNK
    Research expeditions into remote areas to collect biological specimens provide vital information for understanding biodiversity. However, major expeditions to little-known areas are expensive and time consuming, and well-trained people are difficult to find. In addition, processing the collections and obtaining accurate identifications takes time and money. In order to get the maximum return for the investment, we need to choose the locations of collecting expeditions carefully. In this study we used environmental variables and information on existing collecting localities to help determine the sites of future expeditions. Results from other studies were used to aid in the selection of the environmental variables, including variables relating to temperature, rainfall, lithology and distance between sites. A survey gap analysis tool based on 'ED complementarity' was employed to select the sites most likely to contribute the most new taxa. The tool does not evaluate how well collected a previously visited survey site might be; however, collecting effort was estimated based on species accumulation curves. We used the number of collections and/or number of species at each collecting site to eliminate those we deemed poorly collected. Plants, birds, and insects from Guyana were examined using the survey gap analysis tool, and sites for future collecting expeditions were determined. The south-east section of Guyana had virtually no collecting information available. It has been inaccessible for many years for political reasons and, as a result, eight of the first ten sites selected were in that area. In order to evaluate the remainder of the country, and because there are no immediate plans by the Government of Guyana to open that area to exploration, that section of the country was not included in the remainder of the study. The range of the ED complementarity values dropped sharply after the first ten sites were selected. 
For plants, the group for which we had the most records, areas selected included several localities in the Pakaraima Mountains, the border with the south-east, and one site in the north-west. For birds, a moderately collected group, the strongest need was in the north-west followed by the east. Insects had the smallest data set and the largest range of ED complementarity values; the results gave strong emphasis to the southern parts of the country, but most of the locations appeared to be equidistant from one another, most likely because of insufficient data. Results demonstrate that the use of a survey gap analysis tool designed to solve a locational problem using continuous environmental data can help maximize our resources for gathering new information on biodiversity. © 2005 The Linnean Society of London, Biological Journal of the Linnean Society, 2005, 85, 549–567. [source]
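The site selection above relies on 'ED complementarity' over continuous environmental variables. As a rough illustration of the underlying idea, the sketch below greedily picks the candidate site that is environmentally most distant from everything already surveyed or selected. This is a simplified stand-in for, not a reproduction of, the ED complementarity algorithm, and all site coordinates are hypothetical:

```python
import math

def env_distance(a, b):
    """Euclidean distance between two sites in scaled environmental
    space (e.g. standardized temperature and rainfall axes)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def select_survey_gaps(candidates, surveyed, k):
    """Greedy gap filling: repeatedly choose the candidate site whose
    nearest surveyed/selected site is farthest away in environmental
    space, i.e. the site contributing the most 'new' environment."""
    chosen = []
    covered = list(surveyed)
    pool = list(candidates)
    for _ in range(min(k, len(pool))):
        best = max(pool, key=lambda s: min(env_distance(s, c) for c in covered))
        chosen.append(best)
        covered.append(best)
        pool.remove(best)
    return chosen
```

With two surveyed sites clustered near the origin of environmental space, the procedure first selects the most environmentally extreme candidate, then fills in intermediate gaps.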

    Multiobjective flux balancing using the NISE method for metabolic network analysis

    Young-Gyun Oh
    Abstract Flux balance analysis (FBA) is widely acknowledged as an analysis tool for metabolic networks in the framework of metabolic engineering. However, FBA is limited in its ability to solve multiobjective optimization problems that consider multiple conflicting objectives. In this study, we propose a novel multiobjective flux balance analysis method, which adapts the noninferior set estimation (NISE) method (Solanki et al., 1993) for multiobjective linear programming (MOLP) problems. The NISE method can generate an approximation of the Pareto curve for conflicting objectives without redundant iterations of single objective optimization. Furthermore, the flux distributions at each Pareto optimal solution can be obtained for understanding the internal flux changes in the metabolic network. The functionality of this approach is shown by applying it to a genome-scale in silico model of E. coli. Multiple objectives for poly(3-hydroxybutyrate) [P(3HB)] production are considered simultaneously, and relationships among them are identified. The Pareto curve for maximizing succinic acid production vs. maximizing biomass production is used for the in silico analysis of various combinatorial knockout strains. The proposed method accelerates strain improvement in metabolic engineering by reducing the computation time for obtaining the Pareto curve and the time needed to analyze the flux distribution at each Pareto optimal solution. © 2009 American Institute of Chemical Engineers Biotechnol. Prog., 2009 [source]
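The core of the NISE idea is that every noninferior point of a multiobjective LP is the optimum of some weighted-sum objective. The toy sketch below recovers the Pareto points of a tiny two-flux "network" by sweeping fixed weights over hand-enumerated polytope vertices. NISE proper chooses its weights adaptively from the slope between adjacent noninferior points, and a genome-scale FBA model would of course use an LP solver rather than vertex enumeration; the constraint values here are invented:

```python
def pareto_front_weighted(vertices, weights):
    """Approximate the Pareto curve of a 2-objective maximization LP
    whose feasible-region vertices are known, by sweeping weighted-sum
    objectives w*z1 + (1-w)*z2 over the vertex set."""
    front = set()
    for w in weights:
        best = max(vertices, key=lambda v: w * v[0] + (1 - w) * v[1])
        front.add(best)
    return sorted(front)  # sorted by the first objective

# Toy flux polytope: maximize (biomass v1, product v2) subject to
# v1 + v2 <= 10, v1 <= 8, v2 <= 6, v1, v2 >= 0 (vertices listed by hand).
vertices = [(0, 0), (8, 0), (8, 2), (4, 6), (0, 6)]
front = pareto_front_weighted(vertices, [0.2, 0.4, 0.6, 0.8])
print(front)  # the noninferior trade-off points
```

Each point on the returned front is a flux distribution in its own right, which is what allows the per-solution flux analysis described in the abstract.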

    Performance of Buildings under Earthquakes in Barcelona, Spain

    Alex H. Barbat
    The seismic hazard in the area of the city is described by means of the reduced 5% damped elastic response spectrum. Obtaining fragility curves for the most important building types of an urban center requires a substantial amount of information about the structures and the use of nonlinear structural analysis tools. The information on the buildings of Barcelona was obtained by collecting, arranging, improving, and completing the database of housing and existing buildings. The buildings in Barcelona are mainly of two types: unreinforced masonry structures and reinforced concrete buildings with waffled slab floors. In addition, the ArcView software was used to create a GIS tool for managing the collected information to develop seismic risk scenarios. This study shows that the vulnerability of the buildings is significant; therefore, in spite of the medium to low seismic hazard in the area of the city, the expected seismic risk is considerable. [source]

    A test suite for parallel performance analysis tools

    Michael Gerndt
    Abstract Parallel performance analysis tools must be tested as to whether they perform their task correctly, which comprises at least three aspects. First, it must be ensured that the tools neither alter the semantics nor distort the run-time behavior of the application under investigation. Next, it must be verified that the tools collect the correct performance data as required by their specification. Finally, it must be checked that the tools perform their intended tasks and detect relevant performance problems. Focusing on the latter (correctness) aspect, testing can be done using synthetic test functions with controllable performance properties, possibly complemented by real-world applications with known performance behavior. A systematic test suite can be built from synthetic test functions and other components, possibly with the help of tools to assist the user in putting the pieces together into executable test programs. Clearly, such a test suite can be highly useful to builders of performance analysis tools. It is surprising that, up until now, no systematic effort has been undertaken to provide such a suite. In this paper we describe the APART Test Suite (ATS) for checking the correctness (in the above sense) of parallel performance analysis tools. In particular, we describe a collection of synthetic test functions which allows one to easily construct both simple and more complex test programs with desired performance properties. We briefly report on experience with MPI and OpenMP performance tools when applied to the test cases generated by ATS. Copyright © 2006 John Wiley & Sons, Ltd. [source]
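A synthetic test function in the ATS sense is simply a kernel whose performance property is known by construction. The sketch below (Python rather than the suite's MPI/OpenMP C, with invented parameter names) gives each worker a deterministic, controllable amount of work, so the load imbalance a performance tool ought to report can be computed exactly in advance:

```python
def imbalanced_work(rank, size, base_units, skew):
    """Synthetic test kernel with a controllable load imbalance:
    worker `rank` performs base_units * (1 + skew * rank / (size - 1))
    units of work. skew=0 is perfectly balanced; larger skew yields a
    known, tool-detectable imbalance across the `size` workers."""
    units = int(base_units * (1 + skew * rank / max(size - 1, 1)))
    acc = 0.0
    for i in range(units):  # deterministic CPU burn, no I/O or sleeps
        acc += (i % 7) * 1e-9
    return units, acc
```

Because the per-worker work is a closed-form function of the parameters, a tool's measured imbalance can be checked against the designed one, which is exactly the correctness-testing role ATS plays.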

    SCALEA: a performance analysis tool for parallel programs

    Hong-Linh Truong
    Abstract Many existing performance analysis tools lack the flexibility to control instrumentation and performance measurement for code regions and performance metrics of interest. Performance analysis is commonly restricted to single experiments. In this paper we present SCALEA, which is a performance instrumentation, measurement, analysis, and visualization tool for parallel programs that supports post-mortem performance analysis. SCALEA currently focuses on performance analysis for OpenMP, MPI, HPF, and mixed parallel programs. It computes a variety of performance metrics based on a novel classification of overhead. SCALEA also supports multi-experiment performance analysis that allows one to compare and to evaluate the performance outcome of several experiments. A highly flexible instrumentation and measurement system is provided which can be controlled by command-line options and program directives. SCALEA can be interfaced by external tools through the provision of a full Fortran90 OpenMP/MPI/HPF frontend that allows one to instrument an abstract syntax tree at a very high-level with C-function calls and to generate source code. A graphical user interface is provided to view a large variety of performance metrics at the level of arbitrary code regions, threads, processes, and computational nodes for single- and multi-experiment performance analysis. Copyright © 2003 John Wiley & Sons, Ltd. [source]

    Towards Fast Measurement of the Electron Temperature in the SOL of ASDEX Upgrade Using Swept Langmuir Probes

    H.W. Müller
    Abstract On ASDEX Upgrade, first experiments were made using single probes with a voltage sweep frequency of up to 100 kHz. Possibilities and limitations of using fast swept probes with standard diagnostics and analysis tools are discussed. Good agreement was found between the data derived from fast swept single probe characteristics and floating potential as well as saturation current measurements. In a stationary (non-ELMing) plasma, the data of the fast swept probe are compared to standard slow swept probes (kHz range), showing an improvement of the measurement by faster sweeping. While ELM filaments could already be resolved, access to electron temperature fluctuations in small-scale turbulence still has to be improved. Further comparisons are done in ELMy H-mode with combined ball-pen probe/floating potential measurements, which can deliver electron temperatures with 25 μs time resolution at reduced spatial resolution compared to pin probes. During ELMs, the electron temperatures derived from the ball-pen probe and fast swept single probes agreed. (© 2010 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source]
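Extracting an electron temperature from a swept single-probe characteristic conventionally means fitting the ideal exponential I–V model below the plasma potential. The dependency-free sketch below uses a grid search instead of a proper nonlinear least-squares fit, and the synthetic values are illustrative only; the ASDEX Upgrade analysis chain is certainly more elaborate:

```python
import math

def probe_current(V, I_sat, V_f, T_e):
    """Ideal single Langmuir probe characteristic (ion-saturation
    convention): I = I_sat * (1 - exp((V - V_f) / T_e)),
    with T_e in eV and voltages in volts."""
    return I_sat * (1.0 - math.exp((V - V_f) / T_e))

def fit_te(volts, currents, I_sat, V_f, te_grid):
    """Brute-force least-squares estimate of the electron temperature
    from a swept-probe characteristic: pick the T_e on the grid that
    minimizes the sum of squared residuals against the ideal model."""
    def sse(te):
        return sum((probe_current(v, i_meas := 0.0 * v + v * 0 + v, te) - c) ** 2
                   for v, c in zip(volts, currents))
    def sse_clean(te):
        return sum((probe_current(v, I_sat, V_f, te) - c) ** 2
                   for v, c in zip(volts, currents))
    return min(te_grid, key=sse_clean)
```

The faster the voltage sweep, the fewer samples each characteristic contains, which is the trade-off between time resolution and fit quality discussed in the abstract.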

    MRI-based morphometric analysis of typical and atypical brain development

    David N. Kennedy
    Abstract The neuroinformatics landscape in which human brain morphometry occurs has advanced dramatically over the past few years. Rapid advancement in image acquisition methods, image analysis tools and interpretation of morphometric results makes the study of in vivo anatomic analysis both challenging and rewarding. This has revolutionized our expectations for current and future diagnostic and investigative work with the developing brain. This paper will briefly cover the methods of morphometric analysis that are available for neuroanatomic analysis, and tour some sample results from a prototype retrospective database of neuroanatomic volumetric information. From these observations, issues regarding the anatomic variability of developmental maturation of neuroanatomic structures in both typically and atypically developing populations can be discussed. MRDD Research Reviews 2003;9:155–160. © 2003 Wiley-Liss, Inc. [source]

    Vehicle fatigue damage caused by road irregularities

    ABSTRACT Road roughness causes fatigue-inducing loads in travelling vehicles. Road sections with a high degree of roughness are of special interest because these have a significant impact on a vehicle's fatigue life. This study focuses on the statistical description and analysis of vehicle damage caused by road irregularities. Standard statistical analysis tools are not straightforwardly applicable because of the non-stationary nature of the irregularities. However, it is found that the influence of road irregularities on vehicles can be accurately described using a 'local' narrow-band approximation of the fatigue damage intensity. [source]
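For a stationary Gaussian load, the narrow-band approximation referred to above has a closed form: the damage intensity is the mean upcrossing rate times the expected per-cycle damage of a Rayleigh-distributed stress range, given a Basquin-type S–N curve. The sketch below is under those textbook assumptions only (λ₀ and λ₂ are spectral moments of the load; κ and β are material constants), not the paper's local, non-stationary extension:

```python
import math

def narrowband_damage_intensity(lam0, lam2, beta, kappa=1.0):
    """Narrow-band approximation of the fatigue damage intensity for a
    stationary Gaussian load: mean upcrossing rate times the expected
    per-cycle damage kappa * E[S**beta], where the cycle range S is
    Rayleigh-distributed with E[S**beta] = (2*sqrt(2*lam0))**beta
    * Gamma(1 + beta/2)."""
    nu_up = math.sqrt(lam2 / lam0) / (2 * math.pi)  # mean upcrossing rate
    e_range_beta = (2 * math.sqrt(2 * lam0)) ** beta * math.gamma(1 + beta / 2)
    return kappa * nu_up * e_range_beta
```

The 'local' approximation in the study applies this kind of expression piecewise over short, approximately stationary road sections rather than to the route as a whole.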