Small Set (small + set)
Selected Abstracts

A standards-based Grid resource brokering service supporting advance reservations, coallocation, and cross-Grid interoperability
CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 18 2009, Erik Elmroth

Abstract: The problem of Grid-middleware interoperability is addressed by the design and analysis of a feature-rich, standards-based framework for all-to-all cross-middleware job submission. The architecture is designed with focus on generality and flexibility and builds on extensive use, internally and externally, of (proposed) Web and Grid services standards such as WSRF, JSDL, GLUE, and WS-Agreement. The external use provides the foundation for easy integration into specific middlewares, which is performed by the design of a small set of plugins for each middleware. Currently, plugins are provided for integration into Globus Toolkit 4 and NorduGrid/ARC. The internal use of standard formats facilitates customization of the job submission service by replacement of custom components for performing specific well-defined tasks. Most importantly, this enables the easy replacement of resource selection algorithms by algorithms that address the specific needs of a particular Grid environment and job submission scenario. By default, the service implements a decentralized brokering policy, striving to optimize the performance for the individual user by minimizing the response time for each job submitted. The algorithms in our implementation perform resource selection based on performance predictions, and provide support for advance reservations as well as coallocation of multiple resources for coordinated use. The performance of the system is analyzed with focus on overall service throughput (up to over 250 jobs per min) and individual job submission response time (down to under 1 s). Copyright © 2009 John Wiley & Sons, Ltd.
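The decentralized brokering policy described above, choosing the resource that minimizes each job's predicted response time, can be sketched roughly as follows. The resource fields and the prediction formula are illustrative assumptions, not the framework's actual API:

```python
# Minimal sketch of a decentralized brokering policy: rank candidate
# resources by predicted response time (queue wait + data staging +
# execution) and pick the minimum. All names and fields are hypothetical.

def predicted_response_time(resource, job):
    """Predicted seconds until job completion on this resource."""
    run_time = job["work_units"] / resource["speed"]  # execution estimate
    return resource["queue_wait"] + resource["stage_in"] + run_time

def select_resource(resources, job):
    """Broker step: choose the resource minimizing predicted response time."""
    return min(resources, key=lambda r: predicted_response_time(r, job))

resources = [
    {"name": "clusterA", "speed": 10.0, "queue_wait": 120.0, "stage_in": 5.0},
    {"name": "clusterB", "speed": 25.0, "queue_wait": 300.0, "stage_in": 2.0},
]
job = {"work_units": 5000.0}
best = select_resource(resources, job)  # clusterB: 502 s beats clusterA's 625 s
```

Replacing `predicted_response_time` with a different ranking function is the kind of component swap the abstract describes for adapting the broker to a particular Grid environment.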
[source]

Compiling data-parallel programs for clusters of SMPs
CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 2-3 2004, Siegfried Benkner

Abstract: Clusters of shared-memory multiprocessors (SMPs) have become the most promising parallel computing platforms for scientific computing. However, SMP clusters significantly increase the complexity of user application development when using the low-level application programming interfaces MPI and OpenMP, forcing users to deal with both distributed-memory and shared-memory parallelization details. In this paper we present extensions of High Performance Fortran (HPF) for SMP clusters which enable the compiler to adopt a hybrid parallelization strategy, efficiently combining distributed-memory with shared-memory parallelism. By means of a small set of new language features, the hierarchical structure of SMP clusters may be specified. This information is utilized by the compiler to derive inter-node data mappings for controlling distributed-memory parallelization across the nodes of a cluster and intra-node data mappings for extracting shared-memory parallelism within nodes. Additional mechanisms are proposed for specifying inter- and intra-node data mappings explicitly, for controlling specific shared-memory parallelization issues and for integrating OpenMP routines in HPF applications. The proposed features have been realized within the ADAPTOR and VFC compilers. The parallelization strategy for clusters of SMPs adopted by these compilers is discussed as well as a hybrid-parallel execution model based on a combination of MPI and OpenMP. Experimental results indicate the effectiveness of the proposed features. Copyright © 2004 John Wiley & Sons, Ltd.

[source]

A flexible framework for consistency management
CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 1 2002, S. Weber

Abstract: Recent distributed shared memory (DSM) systems provide increasingly more support for the sharing of objects rather than portions of memory. However, like earlier DSM systems these distributed shared object (DSO) systems still force developers to use a single protocol, or a small set of given protocols, for the sharing of application objects. This limitation prevents the applications from optimizing their communication behaviour and results in unnecessary overhead. A current general trend in software systems development is towards customizable systems; for example, frameworks, reflection, and aspect-oriented programming all aim to give the developer greater flexibility and control over the functionality and performance of their code. This paper describes a novel object-oriented framework that defines a DSM system in terms of a consistency model and an underlying coherency protocol. Different consistency models and coherency protocols can be used within a single application because they can be customized, by the application programmer, on a per-object basis. This allows application-specific semantics to be exploited at a very fine level of granularity and with a resulting improvement in performance. The framework is implemented in Java and the speed-up obtained by a number of applications that use the framework is reported. Copyright © 2002 John Wiley & Sons, Ltd.

[source]

An image warping approach to spatio-temporal modelling
ENVIRONMETRICS, Issue 8 2005, Sofia Aberg

Abstract: In this article we present a spatio-temporal dynamic model that can be realized using image warping. Image warping is a non-linear deformation which maps every point in one image plane to a point in another image plane. Using thin-plate splines, these deformations are defined by how a small set of points is mapped, making the method computationally tractable. In our case the dynamics of the process is modelled by thin-plate spline deformations and how they vary in time.
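A minimal sketch of the thin-plate spline machinery behind such deformations, assuming the standard TPS formulation (radial basis U(r) = r² log r plus an affine part) rather than the authors' code:

```python
# 2-D thin-plate spline (TPS) interpolation sketch: given how a small set
# of control points moves, define a smooth deformation of the whole plane.
# Illustrative implementation of the textbook TPS formula, not the paper's.
import math

def U(r):
    # TPS radial basis; U(0) = 0 by continuity
    return r * r * math.log(r) if r > 0 else 0.0

def solve(A, b):
    """Gaussian elimination with partial pivoting for small dense systems."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def tps_fit(src, dst_coord):
    """Fit weights so the spline maps src points to one target coordinate."""
    n = len(src)
    A = [[0.0] * (n + 3) for _ in range(n + 3)]
    b = [0.0] * (n + 3)
    for i, (xi, yi) in enumerate(src):
        for j, (xj, yj) in enumerate(src):
            A[i][j] = U(math.hypot(xi - xj, yi - yj))
        A[i][n:n + 3] = [1.0, xi, yi]                     # affine columns
        A[n][i], A[n + 1][i], A[n + 2][i] = 1.0, xi, yi  # side conditions
        b[i] = dst_coord[i]
    return solve(A, b)

def tps_warp(src, dst):
    """Return a function mapping any plane point under the fitted warp."""
    wx = tps_fit(src, [p[0] for p in dst])
    wy = tps_fit(src, [p[1] for p in dst])
    n = len(src)
    def f(p):
        px, py = p
        bases = [U(math.hypot(px - sx, py - sy)) for sx, sy in src]
        def ev(w):
            return sum(w[i] * bases[i] for i in range(n)) + w[n] + w[n + 1] * px + w[n + 2] * py
        return (ev(wx), ev(wy))
    return f

# control points: unit-square corners, one corner nudged
src = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
dst = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.2, 1.1)]
warp = tps_warp(src, dst)
```

By construction the warp reproduces the control-point displacements exactly and interpolates smoothly in between, which is why a small set of points suffices to parameterize the whole deformation.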
Thus we make no assumption of stationarity in time. Finding the deformation between two images in the space–time series is a trade-off between a good match of the images and a smooth, physically plausible deformation. This is formulated as a penalized likelihood problem, where the likelihood measures how good the match is and the penalty comes from a prior model on the deformation. The dynamic model we suggest can be used to make forecasts and also to estimate the uncertainties associated with these. An introduction to image warping and thin-plate splines is given, as well as an application where the methodology is applied to the problem of nowcasting radar precipitation. Copyright © 2005 John Wiley & Sons, Ltd.

[source]

REVIEW: A comparison of selected quantitative trait loci associated with alcohol use phenotypes in humans and mouse models
ADDICTION BIOLOGY, Issue 2 2010, Cindy L. Ehlers

Abstract: Evidence for genetic linkage to alcohol and other substance dependence phenotypes in areas of the human and mouse genome has now been reported with some consistency across studies. However, the question remains as to whether the genes that underlie the alcohol-related behaviors seen in mice are the same as those that underlie the behaviors observed in human alcoholics. The aims of the current set of analyses were to identify a small set of alcohol-related phenotypes in human and in mouse by which to compare quantitative trait locus (QTL) data between the species using syntenic mapping. These analyses identified that QTLs for alcohol consumption and acute and chronic alcohol withdrawal on distal mouse chromosome 1 are syntenic to a region on human chromosome 1q where a number of studies have identified QTLs for alcohol-related phenotypes.
Additionally, a QTL on human chromosome 15 for alcohol dependence severity/withdrawal identified in two human studies was found to be largely syntenic with a region on mouse chromosome 9, where two groups have found QTLs for alcohol preference. In both of these cases, while the QTLs were found to be syntenic, the exact phenotypes between humans and mice did not necessarily overlap. These studies demonstrate how this technique might be useful in the search for genes underlying alcohol-related phenotypes in multiple species. However, these findings also suggest that trying to match exact phenotypes in humans and mice may not be necessary or even optimal for determining whether similar genes influence a range of alcohol-related behaviors between the two species.

[source]

Generation of a transgenic mouse line expressing GFP-Cre protein from a Hoxb4 neural enhancer
GENESIS: THE JOURNAL OF GENETICS AND DEVELOPMENT, Issue 2 2008, Elena Rivkin

Abstract: Here, we describe a transgenic mouse line in which expression of green fluorescent protein fused to Cre recombinase (GFP-Cre) is directed by the early neuronal enhancer (ENE) of Hoxb4. In E9.0–13.5 transgenic embryos, Cre activity coincided with endogenous Hoxb4 throughout the neural tube up to the r6/r7 boundary in the hindbrain, the dorsal root ganglia, and the Xth cranial ganglia. Unexpectedly, Cre activity was also consistently detected in the trigeminal (Vth) cranial nerve, which is devoid of endogenous Hoxb4 expression. Strong GFP-dependent fluorescence appeared slightly later, in E9.5–E11.5 embryos, and reflected the later expression pattern expected for Hoxb4-ENE directed expression in the neural tube up to the r7/r8 (not r6/r7) boundary. Thus, with the exception of the trigeminal nerve, this reporter faithfully reproduces endogenous embryonic neural Hoxb4 expression, and provides an excellent reagent for in vivo gene manipulations in neuronal Hoxb4-positive cells as well as the developing trigeminal nerve.
This transgenic mouse line should prove especially useful for determining the fate map of neuronal populations arising in rhombomeres 7 and 8, on its own and in combination with the small set of other existing rhombomere-specific Cre recombinase expressing lines. genesis 46:119–124, 2008. © 2008 Wiley-Liss, Inc.

[source]

Efficient Simulation of P Values for Linkage Analysis
GENETIC EPIDEMIOLOGY, Issue 2 2004, Kyunghee K. Song

Abstract: In many genetic linkage analyses, the P value is obtained through simulation since the underlying distribution of the test statistic is complex and unknown. However, this can be very computationally intensive. A "bootstrap/replicate pool" approach has been suggested that generates P values more efficiently in terms of computation by resampling sums from a small set of simulated replicates for each pedigree. The replicate pool idea has been successfully applied but, to our knowledge, has never been theoretically studied. An entirely different method for increasing the computational efficiency of P value simulation is Besag and Clifford's sequential sampling method. We propose an algorithm which combines Besag and Clifford's method with the replicate pool method to efficiently estimate P values for linkage studies. We derive variance expressions for the P value estimates from the replicate pool method and from our proposed hybrid method, and use these to show that the hybrid estimator has a substantial advantage over the other methods in most situations. Genet Epidemiol 26: 88–96, 2004. © 2004 Wiley-Liss, Inc.

[source]

Multi-variable and multi-site calibration and validation of SWAT in a large mountainous catchment with high spatial variability
HYDROLOGICAL PROCESSES, Issue 5 2006, Wenzhi Cao

Abstract: Many methods developed for calibration and validation of physically based distributed hydrological models are time consuming and computationally intensive.
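The hybrid scheme from the linkage-analysis abstract above ("Efficient Simulation of P Values for Linkage Analysis") pairs replicate-pool resampling with Besag and Clifford's sequential stopping rule; a toy sketch under made-up per-pedigree pools, not the authors' algorithm:

```python
# Sketch of the hybrid idea: per-pedigree "replicate pools" provide cheap
# draws of the null statistic (a sum over pedigrees), and Besag-Clifford
# sequential sampling stops early once h simulated statistics exceed the
# observed one. All data and tuning values are hypothetical.
import random

def draw_null_statistic(pools, rng):
    """Resample one null statistic: pick one stored replicate per pedigree."""
    return sum(rng.choice(pool) for pool in pools)

def sequential_p_value(t_obs, pools, h=5, n_max=2000, seed=1):
    """Besag-Clifford: stop after h exceedances; p-hat = h / g."""
    rng = random.Random(seed)
    exceed = 0
    for g in range(1, n_max + 1):
        if draw_null_statistic(pools, rng) >= t_obs:
            exceed += 1
            if exceed == h:
                return h / g            # stopped early: large p, little work
    return (exceed + 1) / (n_max + 1)   # ran to completion: small p

# hypothetical pools: 20 pedigrees, 30 stored null replicates each
rng = random.Random(0)
pools = [[rng.gauss(0, 1) for _ in range(30)] for _ in range(20)]
p_mid = sequential_p_value(0.0, pools)   # unremarkable statistic: large p, early stop
p_far = sequential_p_value(15.0, pools)  # extreme statistic: small p
```

The computational saving is exactly the early stop: unremarkable statistics (the common case in a genome scan) terminate after a handful of cheap pool resamples rather than a full simulation run.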
Only a small set of input parameters can be optimized, and the optimization often results in unrealistic values. In this study we adopted a multi-variable and multi-site approach to calibration and validation of the Soil Water Assessment Tool (SWAT) model for the Motueka catchment, making use of extensive field measurements. Not only were a number of hydrological processes (model components) in a catchment evaluated, but also a number of subcatchments were used in the calibration. The internal variables used were PET, annual water yield, daily streamflow, baseflow, and soil moisture. The study was conducted using an 11-year historical flow record (1990–2000); 1990–94 was used for calibration and 1995–2000 for validation. SWAT generally predicted well the PET, water yield and daily streamflow. The predicted daily streamflow matched the observed values, with a Nash–Sutcliffe coefficient of 0·78 during calibration and 0·72 during validation. However, values for subcatchments ranged from 0·31 to 0·67 during calibration, and 0·36 to 0·52 during validation. The predicted soil moisture remained wet compared with the measurement. About 50% of the extra soil water storage predicted by the model can be ascribed to overprediction of precipitation; the remaining 50% discrepancy was likely to be a result of poor representation of soil properties. Hydrological compensations in the modelling results are derived from water balances in the various pathways and storage (evaporation, streamflow, surface runoff, soil moisture and groundwater) and the contributions to streamflow from different geographic areas (hill slopes, variable source areas, sub-basins, and subcatchments). The use of an integrated multi-variable and multi-site method improved the model calibration and validation and highlighted the areas and hydrological processes requiring greater calibration effort. Copyright © 2005 John Wiley & Sons, Ltd.
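The Nash–Sutcliffe coefficient reported above has a simple closed form; a quick sketch with made-up flow series:

```python
# Nash-Sutcliffe efficiency: 1 - SSE / variance of observations.
# 1.0 is a perfect fit; 0.0 means the model is no better than
# predicting the mean of the observed series. Flow values are invented.

def nash_sutcliffe(observed, simulated):
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)."""
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    var = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / var

obs = [3.0, 4.0, 6.0, 10.0, 7.0, 5.0]   # observed daily streamflow
sim = [2.8, 4.2, 5.5, 9.0, 7.5, 5.2]    # simulated daily streamflow
nse = nash_sutcliffe(obs, sim)          # close to 1: a good fit
```

Values such as the 0·78 (calibration) and 0·72 (validation) quoted in the abstract come from exactly this formula applied to the daily streamflow series.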
[source]

The paucity of multimethod research: a review of the information systems literature
INFORMATION SYSTEMS JOURNAL, Issue 3 2003, John Mingers

Abstract: It has commonly been argued that the use of different research methods within the information system (IS) discipline and within individual pieces of research will produce richer and more reliable results. This paper reports on a survey of the IS literature to discover the extent of multimethod research. The findings are that such work is relatively scarce, and where it occurs involves only a small set of traditional methods. Possible reasons for this are discussed.

[source]

Output feedback stabilization of constrained systems with nonlinear predictive control
INTERNATIONAL JOURNAL OF ROBUST AND NONLINEAR CONTROL, Issue 3-4 2003, Rolf Findeisen

Abstract: We present an output feedback stabilization scheme for uniformly completely observable nonlinear MIMO systems combining nonlinear model predictive control (NMPC) and high-gain observers. The control signal is recalculated at discrete sampling instants by an NMPC controller using a system model for the predictions. The state information necessary for the prediction is provided by a continuous time high-gain observer. The resulting 'optimal' control signal is open-loop implemented until the next sampling instant. With the proposed scheme semi-global practical stability is achieved. That is, for initial conditions in any compact set contained in the region of attraction of the NMPC state feedback controller, the system states will enter any small set containing the origin, if the high-gain observer is sufficiently fast and the sampling time is small enough. In principle the proposed approach can be used for a variety of state feedback NMPC schemes. Copyright © 2003 John Wiley & Sons, Ltd.

[source]

Assessing the merits and faults of holistic and disaggregated judgments
JOURNAL OF BEHAVIORAL DECISION MAKING, Issue 3 2010, Hal R. Arkes

Abstract: Three studies explored both the advantages of and subjects' preferences for a disaggregated judgment procedure and a holistic one. The task in our first two studies consisted of evaluating colleges; the third study asked participants to evaluate job applicants. Holistic ratings consisted of providing an overall evaluation while considering all of the characteristics of the evaluation objects; disaggregated ratings consisted of evaluating each cue independently. Participants also made paired comparisons of the evaluation objects. We constructed preference orders for the disaggregated method by aggregating these ratings (unweighted or weighted characteristics). To compare the holistic, disaggregated, and weighted-disaggregated methods, we regressed the four cues on the participant's holistic rating, on the linearly aggregated disaggregated ratings, and on the average weighted disaggregated rating, using the participant's "importance points" for each cue as weights. Both types of combined disaggregated ratings related more closely to the cues in terms of proportion of variance accounted for in Experiments 1 and 2. In addition, the disaggregated ratings were more closely related to the paired-comparison orderings, but Experiment 2 showed that this was true for a small set (10) but not a large set (60) of evaluation objects. Experiment 3 tested the "gamesmanship" hypothesis: people prefer holistic ratings because it is easier to incorporate illegitimate but appealing criteria into one's judgment. The results suggested that the disaggregated procedure generally produced sharper distinctions between the most relevant and least relevant cues. Participants in all three of these studies preferred the holistic ratings despite their statistical inferiority. Copyright © 2009 John Wiley & Sons, Ltd.

[source]

A common model approach to macroeconomics: using panel data to reduce sampling error
JOURNAL OF FORECASTING, Issue 3 2005, William T. Gavin

Abstract: Is there a common model inherent in macroeconomic data? Macroeconomic theory suggests that market economies of various nations should share many similar dynamic patterns; as a result, individual country empirical models, for a wide variety of countries, often include the same variables. Yet, empirical studies often find important roles for idiosyncratic shocks in the differing macroeconomic performance of countries. We use forecasting criteria to examine the macrodynamic behaviour of 15 OECD countries in terms of a small set of familiar, widely used core economic variables, omitting country-specific shocks. We find this small set of variables and a simple VAR 'common model' strongly support the hypothesis that many industrialized nations have similar macroeconomic dynamics. Copyright © 2005 John Wiley & Sons, Ltd.

[source]

Survey of the year 2005 commercial optical biosensor literature
JOURNAL OF MOLECULAR RECOGNITION, Issue 6 2006, Rebecca L. Rich

Abstract: We identified 1113 articles (103 reviews, 1010 primary research articles) published in 2005 that describe experiments performed using commercially available optical biosensors. While this number of publications is impressive, we find that the quality of the biosensor work in these articles is often pretty poor. It is a little disappointing that there appears to be only a small set of researchers who know how to properly perform, analyze, and present biosensor data. To help focus the field, we spotlight work published by 10 research groups that exemplify the quality of data one should expect to see from a biosensor experiment. Also, in an effort to raise awareness of the common problems in the biosensor field, we provide side-by-side examples of good and bad data sets from the 2005 literature. Copyright © 2006 John Wiley & Sons, Ltd.

[source]

Ethanol-Regulated Genes That Contribute to Ethanol Sensitivity and Rapid Tolerance in Drosophila
ALCOHOLISM, Issue 2 2010, Eric C. Kong

Background: Increased ethanol intake, a major predictor for the development of alcohol use disorders, is facilitated by the development of tolerance to both the aversive and pleasurable effects of the drug. The molecular mechanisms underlying ethanol tolerance development are complex and are not yet well understood. Methods: To identify genetic mechanisms that contribute to ethanol tolerance, we examined the time course of gene expression changes elicited by a single sedating dose of ethanol in Drosophila, and completed a behavioral survey of strains harboring mutations in ethanol-regulated genes. Results: Enrichment for genes in metabolism, nucleic acid binding, olfaction, regulation of signal transduction, and stress suggests that these biological processes are coordinately affected by ethanol exposure. We also detected a coordinate up-regulation of genes in the Toll and Imd innate immunity signal transduction pathways. A multi-study comparison revealed a small set of genes showing similar regulation, including increased expression of 3 genes for serine biosynthesis. A survey of Drosophila strains harboring mutations in ethanol-regulated genes for ethanol sensitivity and tolerance phenotypes revealed roles for serine biosynthesis, olfaction, transcriptional regulation, immunity, and metabolism. Flies harboring deletions of the genes encoding the olfactory co-receptor Or83b or the sirtuin Sir2 showed marked changes in the development of ethanol tolerance. Conclusions: Our findings implicate novel roles for these genes in regulating ethanol behavioral responses.

[source]

Force constant calculation from Raman and IR spectra in the three non-cubic phases of BaCeO3
JOURNAL OF RAMAN SPECTROSCOPY, Issue 5 2001, H. C. Gupta

Raman spectroscopic data for BaCeO3 were modeled with the use of a small set of force constants in its three phases (Pnma, Imma and R3c) using experimental IR and Raman data.
An IR reflectance spectrum of BaCeO3 in its Pnma phase is provided for the first time. Raman band symmetry assignments in the Pnma phase were re-examined and the assignments of Genet et al. were confirmed. The distribution of the Raman bands of the Pnma phase between the zone center and X point in Imma is provided and band assignments were extended to the Imma and R3c phases. Calculated values for both Raman- and IR-active modes are provided in the three phases. The variations in force constants throughout the phase transitions are briefly discussed. Copyright © 2001 John Wiley & Sons, Ltd.

[source]

Species and structural diversity of church forests in a fragmented Ethiopian Highland landscape
JOURNAL OF VEGETATION SCIENCE, Issue 5 2010, Alemayehu Wassie

Abstract: Question: Thousands of small isolated forest fragments remain around churches ("church forests") in the almost completely deforested Ethiopian Highlands. We questioned how the forest structure and composition varied with altitude, forest area and human influence. Location: South Gondar, Amhara National Regional State, Northern Ethiopia. Methods: The structure and species composition were assessed for 810 plots in 28 church forests. All woody plants were inventoried, identified and measured (stem diameter) in seven to 56 10 m × 10 m plots per forest. Results: In total, 168 woody species were recorded, of which 160 were indigenous. The basal area decreased with tree harvest intensity; understorey and middle-storey density (<5 cm DBH trees) decreased with grazing; overstorey density (>5 cm DBH trees) increased with altitude. The dominance of a small set of species increased with altitude and grazing intensity. Species richness decreased with altitude, mainly due to variation in the richness of the overstorey community. Moreover, species richness in the understorey decreased with grazing intensity.
Conclusions: We show how tree harvesting intensity, grazing intensity and altitude contribute to observed variations in forest structure, composition and species richness. Species richness was, however, not related to forest area. Our study emphasizes the significant role played by the remaining church forests in the conservation of woody plant species in the North Ethiopian Highlands, and the need to protect these forests for plant species conservation purposes.

[source]

How much does simplification of probability forecasts reduce forecast quality?
METEOROLOGICAL APPLICATIONS, Issue 1 2008, F. J. Doblas-Reyes

Abstract: Probability forecasts from an ensemble are often discretized into a small set of categories before being distributed to the users. This study investigates how such simplification can affect the forecast quality of probabilistic predictions, as measured by the Brier score (BS). An example from the European Centre for Medium-Range Weather Forecasts (ECMWF) operational seasonal ensemble forecast system is used to show that the simplification of the forecast probabilities reduces the Brier skill score (BSS) by as much as 57% with respect to the skill score obtained with the full set of probabilities issued from the ensemble. This is more obvious for a small number of probability categories and is mainly due to a decrease in forecast resolution of up to 36%. The impact of the simplification as a function of the ensemble size is also discussed. The results suggest that forecast quality should be made available for the set of probabilities that the forecast user has access to, as well as for the complete set of probabilities issued by the ensemble forecasting system.
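The degradation measured above can be reproduced in miniature: compute the Brier score with the full forecast probabilities, then again after rounding them into a few categories. Toy numbers, not the ECMWF data:

```python
# Brier score before and after discretizing forecast probabilities.
# BS = mean (p - o)^2 over forecast/outcome pairs; lower is better.
# The forecast set below is invented for illustration.

def brier_score(probs, outcomes):
    """Mean squared difference between forecast probability and outcome (0/1)."""
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

def discretize(p, n_categories):
    """Round a probability to the midpoint of one of n equally wide bins."""
    width = 1.0 / n_categories
    bin_index = min(int(p / width), n_categories - 1)
    return (bin_index + 0.5) * width

probs = [0.05, 0.9, 0.65, 0.2, 0.75, 0.35, 0.95, 0.1]
outcomes = [0, 1, 1, 0, 1, 0, 1, 0]
bs_full = brier_score(probs, outcomes)
bs_coarse = brier_score([discretize(p, 3) for p in probs], outcomes)
# bs_coarse > bs_full: coarse categories lose resolution, worsening the score
```

The loss is pure resolution, in line with the abstract: binning pulls sharp probabilities toward bin midpoints, so the forecasts discriminate events less well even though each bin remains honestly labelled.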
Copyright © 2008 Royal Meteorological Society

[source]

Tropical storm impact in Central America
METEOROLOGICAL APPLICATIONS, Issue 1 2006, Sabino Palmieri

Abstract: In this study of tropical storm impacts in Central America, the relationship between physical variables (available in 'real time') and damage is explored, and a simple tool for early approximate evaluation of the impact is developed. Land track and energy dissipation appear to be the most interesting parameters that modulate the hurricane impact. Because of the difficulty of attaching a monetary estimate to the damage caused in a large number of cases (as is required in a statistical approach), an 'Impact Index' based on the logarithm of casualties is introduced. Thereafter, within a subset of events in which damage in monetary terms is known, a rough link between damage and the Impact Index is derived. Shortly after a new event, as soon as land track and energy dissipation are known, either by means of an empirical equation or using a contour graph, the Impact Index may be determined. Another empirical equation allows a rough estimate of damage in monetary units, but because this estimate is based on a limited number of cases, it must be treated with caution. The methodology is tested on a small set of independent cases. Vulnerability to tropical cyclones depends not only on natural factors but also on sociopolitical conditions. A coupled sociological and environmental approach is believed to be the best way to improve the early impact estimate methodology. Copyright © 2006 Royal Meteorological Society.

[source]

What can the age composition of a population tell us about the age composition of its out-migrants?
POPULATION, SPACE AND PLACE (PREVIOUSLY: INT JOURNAL OF POPULATION GEOGRAPHY), Issue 1 2007, Jani S. Little

Abstract: Preliminary findings show that the age structure of a population can provide valuable information about the age composition of its out-migrants, and that this relationship can become a key ingredient in the proposed new method for estimating the age profile of out-migrants when accurate data are not available. The method relies on the Rogers-Castro model schedule to consistently and accurately represent age profiles of out-migration, and the results show that variation among these out-migration schedules can be captured by a typology based on a small set of clusters, or families of schedules. Membership of the clusters is then predicted from simple measures of population composition using discriminant function analysis. The investigation is based on data for US states, CMSAs, MSAs and non-metropolitan counties, and their outflows of migrants between 1995 and 2000. The measures of population age composition come from official 1995 intercensal age-specific population estimates for the same geographical units. Copyright © 2007 John Wiley & Sons, Ltd.

[source]

A MARKET UTILITY-BASED MODEL FOR CAPACITY SCHEDULING IN MASS SERVICES
PRODUCTION AND OPERATIONS MANAGEMENT, Issue 2 2003, JOHN C. GOODALE

Only a small set of employee scheduling articles have considered an objective of profit or contribution maximization, as opposed to the traditional objective of cost (including opportunity costs) minimization. In this article, we present one such formulation that is a market utility-based model for planning and scheduling in mass services (MUMS). MUMS is a holistic approach to market-based service capacity scheduling. The MUMS framework provides the structure for modeling the consequences of aligning competitive priorities and service attributes with an element of the firm's service infrastructure.
We developed a new linear programming formulation for the shift-scheduling problem that uses market share information generated by customer preferences for service attributes. The shift-scheduling formulation within the framework of MUMS provides a business-level model that predicts the economic impact of the employee schedule. We illustrated the shift-scheduling model with empirical data, and then compared its results with models using service standard and productivity standard approaches. The result of the empirical analysis provides further justification for the development of the market-based approach. Last, we discuss implications of this methodology for future research.

[source]

Identification of potential substrate proteins for the periplasmic Escherichia coli chaperone Skp
PROTEINS: STRUCTURE, FUNCTION AND BIOINFORMATICS, Issue 23-24 2008, Svenja Jarchow

Abstract: The "seventeen kilodalton protein" (Skp) is a predominant periplasmic chaperone of Escherichia coli, which is involved in the biogenesis of abundant outer membrane proteins (OMPs) such as OmpA, PhoE, and LamB. In this study the substrate profile of Skp was investigated in a proteomics approach. Skp was overexpressed in a deficient E. coli strain as a fusion protein with the Strep-tag and captured, together with any host proteins associated with it, from the periplasmic cell extract under mild conditions via one-step Strep-Tactin affinity chromatography. Copurified substrate proteins were then identified by high-resolution 2-DE with immobilized pH gradients, followed by MALDI-TOF MS. Apart from the known Skp substrates, including OmpA and LamB, more than 30 other interacting proteins were detected, especially from the outer membrane, among these FadL and BtuB, and from the periplasm, such as MalE and OppA. Thus, Skp does not only serve as a specialized chaperone for a small set of OMPs, but it seems to exhibit a broader substrate spectrum, including soluble periplasmic proteins.
These findings should prompt further investigation into the physiological role of Skp and may promote its use for the bacterial production of biochemically active heterologous proteins whose folding requires secretion into the oxidizing milieu of the periplasm.

[source]

Identification of breast cancer biomarkers in transgenic mouse models: A proteomics approach
PROTEOMICS - CLINICAL APPLICATIONS, Issue 6-7 2010, Wendy Rodenburg

Abstract: Purpose: Transgenic mouse models for cancer circumvent many challenges that hamper human studies aimed at biomarker discovery. Lower biological variance among mice, combined with controllable factors such as food uptake and health status, may enable the detection of more subtle protein expression differences. This is envisioned to result in the identification of biomarkers that better discriminate cancer cases from controls. Experimental design: The current study used two innovative mouse models for breast cancer to identify new serum biomarkers. A multi-analyte profiling technique was used to analyze 70 proteins in individual serum samples of non-tumor and mammary tumor-bearing Tg.NK (MMTV/c-neu) mice. Results: A small set of proteins fully differentiated tumor samples from controls. These comprised osteopontin, interleukin-18, cystatin C and CD40 antigen. Comparison of protein expression in another breast-cancer mouse model, the humanized p53.R270H mice, showed common discriminatory expression of osteopontin. However, other biomarkers showed distinct expression in the two different breast-cancer models, indicating that different mammary tumor subtypes with respect to molecular and estrogen receptor status reveal divergent serum biomarker sets. Conclusions and clinical relevance: The current study supports the concept that serum proteins can discriminate mammary tumor cases from controls, and yielded interesting biomarkers that need further testing and validation in human studies.
[source] Estimation and evidence in forensic anthropology: Sex and race AMERICAN JOURNAL OF PHYSICAL ANTHROPOLOGY, Issue 1 2009 Lyle W. Konigsberg Abstract Forensic anthropology typically uses osteological and/or dental data either to estimate characteristics of unidentified individuals or to serve as evidence in cases where there is a putative identification. In the estimation context, the problem is to describe aspects of an individual that may lead to their eventual identification, whereas in the evidentiary context, the problem is to provide the relative support for the identification. In either context, individual characteristics such as sex and race may be useful. Using a previously published forensic case (Steadman et al. (2006) Am J Phys Anthropol 131:15–26) and a large (N = 3,167) reference sample, we show that the sex of the individual can be reliably estimated using a small set of 11 craniometric variables. The likelihood ratio from sex (assuming a 1:1 sex ratio for the "population at large") is, however, relatively uninformative in "making" the identification. Similarly, the known "race" of the individual is relatively uninformative in "making" the identification, because the individual was recovered from an area where the 2000 US census provides a very homogeneous picture of (self-identified) race. Of interest in this analysis is the fact that the individual, who was recovered from Eastern Iowa, classifies very clearly with the Easter Islander sample of Howells (1973. Cranial Variation in Man: A Study by Multivariate Analysis of Patterns of Difference Among Recent Human Populations. Cambridge, MA: Peabody Museum of Archaeology and Ethnology; 1989. Skull Shape and the Map: Craniometric Analyses in the Dispersion of Modern Homo. Cambridge, MA: Harvard University Press) in an analysis with uninformative priors. When the Iowa 2000 Census data on self-reported race are used for informative priors, the individual is clearly identified as "American White."
This analysis shows the extreme importance of an informative prior in any forensic application. Am J Phys Anthropol 2009. © 2009 Wiley-Liss, Inc.

[source] Expression analysis suggests novel roles for the plastidic phosphate transporter Pht2;1 in auto- and heterotrophic tissues in potato and Arabidopsis THE PLANT JOURNAL, Issue 1 2004 Christine Rausch Summary A cDNA encoding Pht2;1 from potato, a new member of the plant Pht2 gene family of low-affinity orthophosphate (Pi) transporters, was isolated. The expression pattern of the corresponding gene, as well as that of its ortholog from Arabidopsis, was analyzed, and the encoded proteins were localized in the two plants. Pht2;1 expression is strongly upregulated by light in potato and Arabidopsis leaf tissue. RNA gel blot analysis, reverse transcription-polymerase chain reaction (RT-PCR), promoter/GUS fusion, and protein/green fluorescent protein (GFP) fusion studies indicate that the gene is expressed in both auto- and heterotrophic tissues and that its encoded protein is localized to the plastids. The similar patterns of Pht2;1 gene regulation in potato and Arabidopsis prompted us to screen publicly available gene expression data from 228 Arabidopsis oligonucleotide microarrays covering 83 different experimental conditions. Modulation of Pht2;1 transcript levels was overall moderate, except for a limited number of experimental conditions where Pht2;1 mRNA concentrations varied between 2- and 3.7-fold. Overall, these analyses suggest involvement of the Pht2;1 protein in cell wall metabolism in young, rapidly growing tissues, independent of other Pi transporters such as the high-affinity Solanum tuberosum Pi transporter 1 (StPT1). Cluster analysis allowed identification of collinear or antiparallel expression profiles of a small set of genes involved in post-translational regulation and photosynthetic carbon metabolism.
These data give clues about the possible biological function of Pht2;1 and shed light on the complex web of interactions in which Pht2;1 could play a role.

[source] Measuring disease activity and functional status in patients with scleroderma and Raynaud's phenomenon ARTHRITIS & RHEUMATISM, Issue 9 2002 Peter A. Merkel Objective To document disease activity and functional status in patients with scleroderma (systemic sclerosis [SSc]) and Raynaud's phenomenon (RP) and to determine the sensitivity to change, reliability, ease of use, and validity of various outcome measures in these patients. Methods Patients with SSc and moderate-to-severe RP participating in a multicenter RP treatment trial completed daily diaries documenting the frequency and duration of RP attacks and recorded a daily Raynaud's Condition Score (RCS). Mean scores for the 2-week periods prior to baseline (week 0), end of trial (week 6), and posttrial followup (week 12) were calculated. At weeks 0, 6, and 12, physicians completed 3 global assessment scales and performed clinical assessments of digital ulcers and infarcts; patients completed the Health Assessment Questionnaire (HAQ), the Arthritis Impact Measurement Scales 2 (AIMS2) mood and tension subscales, 5 specific SSc/RP-related visual analog scales (VAS), and 3 other VAS global assessments. We used these measures to document baseline disease activity and to assess their construct validity, sensitivity to change, and reliability in trial data. Results Two hundred eighty-one patients (248 women, 33 men; mean age 50.4 years [range 18–82 years]) from 14 centers participated. Forty-eight percent had limited cutaneous SSc; 52% had diffuse cutaneous SSc. Fifty-nine patients (21%) had digital ulcers at baseline. Patients had 3.89 ± 2.33 (mean ± SD) daily RP attacks (range 0.8–14.6), with a duration of 82.1 ± 91.6 minutes/attack. The RCS for RP activity (possible range 0–10) was 4.30 ± 1.92.
HAQ scores (0–3 scale) indicated substantial disability at baseline (total disability 0.86, pain 1.19), especially among the subscales pertaining to hand function (grip, eating, dressing). AIMS2 mood and tension scores were fairly high, as were many of the VAS scores. Patients with digital ulcers had worse RCS, pain, HAQ disability (overall, grip, eating, and dressing), physician's global assessment, and tension scores, but no significant difference in the frequency of RP, duration of RP, patient's global assessment, or mood, compared with patients without digital ulcers. VAS scores for digital ulcers as rated by the patients were not consistent with the physicians' ratings. Factor analysis of the 18 measures showed strong associations among variables in 4 distinct domains: disease activity, RP measures, digital ulcer measures, and mood/tension. Reliability of the RCS, HAQ pain and disability scales, and AIMS2 mood and tension subscales was high. The RP measures demonstrated good sensitivity to change (effect sizes 0.33–0.76). Conclusion Our findings demonstrate that the significant activity, disability, pain, and psychological impact of RP and digital ulcers in SSc can be measured by a small set of valid and reliable outcome measures. These outcome measures provide information beyond the quantitative metrics of RP attacks. We propose a core set of measures for use in clinical trials of RP in SSc patients that includes the RCS, patient and physician VAS ratings of RP activity, a digital ulcer/infarct measure, measures of disability and pain (HAQ), and measures of psychological function (AIMS2).

[source] REVERSIBLE JUMP MARKOV CHAIN MONTE CARLO METHODS AND SEGMENTATION ALGORITHMS IN HIDDEN MARKOV MODELS AUSTRALIAN & NEW ZEALAND JOURNAL OF STATISTICS, Issue 2 2010 R. Paroli Summary We consider hidden Markov models with an unknown number of regimes for the segmentation of the pixel intensities of digital images that consist of a small set of colours.
New reversible jump Markov chain Monte Carlo algorithms to estimate both the dimension and the unknown parameters of the model are introduced. Parameters are updated by random walk Metropolis–Hastings moves, without updating the sequence of the hidden Markov chain. The segmentation (i.e. the estimation of the hidden regimes) is a further aim and is performed by means of a number of competing algorithms. We apply our Bayesian inference and segmentation tools to digital images, which are linearized through the Peano–Hilbert scan, and perform experiments and comparisons on both synthetic images and a real brain magnetic resonance image.

[source] Tracing the crimson thread: United Kingdom residents holding probated South Australian assets, 1905–1915 AUSTRALIAN ECONOMIC HISTORY REVIEW, Issue 3 2003 Martin Shanahan At the beginning of the twentieth century, Australia and the United Kingdom enjoyed close economic, cultural and social links. One filament of the 'crimson thread' between the two countries was the personal asset holdings of individuals who possessed wealth in both countries. This article examines the asset holdings of estates probated in South Australia and the United Kingdom, and describes the relative wealth of a small set of trans-national wealth holders. They are found to be an elite group whose wealth placed them in the upper reaches of the wealthy in both the UK and Australia.

[source] Consumer acceptability, sensory properties and expert quality judgements of Australian Cabernet Sauvignon and Shiraz wines AUSTRALIAN JOURNAL OF GRAPE AND WINE RESEARCH, Issue 1 2010 K.A. LATTEY Abstract Background and Aims: This study aimed to determine which sensory attributes most drive consumer and expert acceptance of Cabernet Sauvignon and Shiraz wines. Methods and Results: The sensory attributes of a set of commercial wines were quantified by a trained panel. A subset was assessed blind for liking by 203 consumers and for quality by 67 winemakers.
For the total group of consumers, wines with low levels of 'bitterness', 'hotness', 'metallic', 'smoky' and 'pepper' were preferred. In addition, four consumer clusters were identified, each with different sensory drivers of preference, with the attributes 'red berry', 'floral', 'caramel' and 'vanilla' aroma, 'acidity', 'green' flavour and astringency being of importance in distinguishing the different clusters' acceptance scores. The winemakers' quality scores had little relationship with consumer response, although both groups gave low ratings to wines with Brettanomyces-related flavour. Conclusions: A relatively small set of sensory attributes was of greatest importance to consumer liking, and these generally dominate varietal differences. Winemakers' quality concepts do not closely align with those of the consumers. Significance of the Study: This study identifies sensory properties of red wines which could be maximised as well as those which should be reduced, allowing producers to better meet consumers' preferences.

[source] Micro biochemical engineering to accelerate the design of industrial-scale downstream processes for biopharmaceutical proteins BIOTECHNOLOGY & BIOENGINEERING, Issue 3 2008 N.J. Titchener-Hooker Abstract The article examines how a small set of easily implemented micro biochemical engineering procedures, combined with regime analysis and bioprocess models, can be used to predict the industrial-scale performance of biopharmaceutical protein downstream processing. This approach has been developed in many of our studies of individual operations over the last 10 years and, because of its lower costs, allows preliminary evaluation to be conducted much earlier in the development pathway. It then permits the later large-scale trials to be more highly focused. This means that the risks of delays during bioprocess development and of product launch are reduced.
Here we draw the outcomes of this research together and illustrate its use in a set of typical operations: cell rupture, centrifugation, filtration, precipitation, expanded bed adsorption, and chromatography, and for common sources: E. coli, two yeasts, and mammalian cells (GS-NSO). The general approach to establishing this method for other operations is summarized and new developments are outlined. The technique is placed against the background of the scale-down methods that preceded it and complementary ones that are being examined in parallel. The article concludes with a discussion of the advantages and limitations of the micro biochemical engineering approach versus other methods. Biotechnol. Bioeng. 2008;100:473–487. © 2008 Wiley Periodicals, Inc.

[source] The Small World of Canadian Capital Markets: Statistical Mechanics of Investment Bank Syndicate Networks, 1952–1989 CANADIAN JOURNAL OF ADMINISTRATIVE SCIENCES, Issue 4 2004 Joel A.C. Baum We investigate the structure of investment bank syndicate networks in Canada. We consider two banks to be connected if they have participated in an underwriting syndicate together, and construct networks of such connections using data drawn from the Record of New Issues (Financial Data Group). We show that these interfirm networks form "small worlds", in which banks are both locally clustered and globally connected by short paths of intermediate banks, and are "scale free", in which the connectivity of the network is highly skewed, with most banks tied to a small set of prominent banks. We examine changes over time in the network's small-world and scale-free properties, and demonstrate their theoretical and practical implications for the structure and operation of Canadian capital markets by linking these properties to the network's cliquishness, resilience, and speed of information transmission. Résumé: This study examines the structure of the networks formed by the underwriting syndicates of investment banks in Canada. We posit that two banks are linked if they have participated together in an underwriting syndicate, and we trace the networks of ties using data extracted from the Record of New Issues (Financial Data Group). We show that these interorganizational networks form small worlds, in which banks are both locally clustered and globally connected by short paths of intermediary banks. The networks are also scale free: connectivity within the network is highly unequal, and most banks are tied to a small number of dominant banks. We examine the evolution of the network's small-world and scale-free properties, and we highlight their theoretical and practical implications for the structure and functioning of the Canadian capital market by linking these properties to the network's cliquishness, resilience, and speed of information transmission. [source]
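The two network properties invoked in the last abstract can be made concrete with a minimal sketch. The toy "syndicate" graph below is hypothetical (not the authors' data): two hub banks (A, B) each anchor a local cluster and bridge the two halves, giving high local clustering, short average path lengths, and a skewed degree sequence.

```python
from collections import deque

def clustering(adj, v):
    """Fraction of pairs of v's neighbours that are themselves connected."""
    nbrs = adj[v]
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for a in nbrs for b in nbrs if a < b and b in adj[a])
    return links / (k * (k - 1) / 2)

def avg_path_length(adj):
    """Mean shortest-path length over all ordered connected pairs (BFS)."""
    total, pairs = 0, 0
    for src in adj:
        dist = {src: 0}
        q = deque([src])
        while q:
            u = q.popleft()
            for w in adj[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    q.append(w)
        for v, d in dist.items():
            if v != src:
                total += d
                pairs += 1
    return total / pairs

# Hypothetical syndicate network: hub banks A and B bridge two local clusters.
edges = [("A", "B"),
         ("A", "c1"), ("A", "c2"), ("c1", "c2"),   # cluster around A
         ("B", "d1"), ("B", "d2"), ("d1", "d2")]   # cluster around B
adj = {}
for u, v in edges:
    adj.setdefault(u, set()).add(v)
    adj.setdefault(v, set()).add(u)

print(clustering(adj, "c1"))                 # → 1.0 (locally clustered periphery)
print(avg_path_length(adj))                  # → 1.8 (short paths via the hubs)
print(sorted(len(n) for n in adj.values()))  # → [2, 2, 2, 2, 3, 3] (skewed toward hubs)
```

On real underwriting data one would use a graph library (e.g. NetworkX's equivalent metrics) rather than hand-rolled BFS; the sketch only makes the two definitions, local clustering and short global paths, concrete.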