Realistic

Distribution by Scientific Domains

Terms modified by Realistic

  • realistic alternative
  • realistic application
  • realistic approach
  • realistic assessment
  • realistic assumption
  • realistic boundary condition
  • realistic case
  • realistic condition
  • realistic data
  • realistic description
  • realistic estimate
  • realistic evaluation
  • realistic example
  • realistic expectation
  • realistic goal
  • realistic level
  • realistic model
  • realistic models
  • realistic prediction
  • realistic problem
  • realistic result
  • realistic scenario
  • realistic simulation
  • realistic situation
  • realistic solution
  • realistic understanding
  • realistic value
  • realistic view

Selected Abstracts


    Realistic and efficient rendering of free-form knitwear

    COMPUTER ANIMATION AND VIRTUAL WORLDS (PREV: JNL OF VISUALISATION & COMPUTER ANIMATION), Issue 1 2001
    Hua Zhong
    Abstract We present a method for rendering knitwear on free-form surfaces. This method has three main advantages. First, it renders yarn microstructure realistically and efficiently. Second, the rendering efficiency of yarn microstructure does not come at the price of ignoring the interactions between the neighboring yarn loops. Such interactions are modeled in our system to further enhance realism. Finally, our approach gives the user intuitive control on a few key aspects of knitwear appearance: the fluffiness of the yarn and the irregularity in the positioning of the yarn loops. The result is a system that efficiently produces highly realistic rendering of free-form knitwear with user control on key aspects of visual appearance. Copyright © 2001 John Wiley & Sons, Ltd. [source]


    Raymond Aron: Too Realistic to Be a Realist?

    CONSTELLATIONS: AN INTERNATIONAL JOURNAL OF CRITICAL AND DEMOCRATIC THEORY, Issue 4 2007
    Pierre Hassner
    First page of article [source]


    Coping in Children and Adolescents with Obesity: The Costs and Benefits of Realistic versus Unrealistic Weight Evaluations

    APPLIED PSYCHOLOGY: HEALTH AND WELL-BEING, Issue 2 2010
    Stefanie Meier
    The study analysed differences in coping strategies between obese and non-obese children and adolescents (age: 8–14 years) in response to a social stressor. Physicians' diagnoses of obesity and self-reports on height and weight as well as gender and age were considered. A sample of 757 participants responded to an established German coping questionnaire. In addition to general coping strategies, two more potentially weight-related coping strategies were assessed. Adolescent obese girls who reported height and weight realistically showed particularly little social support-seeking behavior. Media use in general increased with age, but was especially high for adolescent obese boys who evaluated themselves as obese. Finally, girls in general and obese children and adolescents who evaluated themselves as overweight or obese showed higher stress-related eating. With regard to coping it seems to be a disadvantage for obese children and adolescents to see themselves as obese. In contrast, obese children and adolescents who have unrealistically positive self-evaluations of their weight report coping strategies similar to those reported by normal weight children and adolescents. It is assumed that positive self-evaluations of body weight prevent especially obese adolescents from inactivity and social isolation. Findings are relevant for the design of interventions to treat obesity. [source]


    Animating Quadrupeds: Methods and Applications

    COMPUTER GRAPHICS FORUM, Issue 6 2009
    Ljiljana Skrba
    I.3.7 [Computer Graphics]: 3D Graphics and Realism - Animation. Abstract: Films like Shrek, Madagascar, The Chronicles of Narnia and Charlotte's Web all have something in common: realistic quadruped animations. While the animation of animals has been popular for a long time, the technical challenges associated with creating highly realistic, computer-generated creatures have been receiving increasing attention recently. The entertainment, education and medical industries have increased the demand for simulation of realistic animals in the computer graphics area. In order to achieve this, several challenges need to be overcome: gathering and processing data that embodies the natural motion of an animal, which is made more difficult by the fact that most animals cannot be easily motion-captured; building accurate kinematic models for animals, with adapted animation skeletons in particular; and developing either kinematic or physically-based animation methods, by embedding some a priori knowledge about the way that quadrupeds locomote and/or adopting examples of real motion. In this paper, we present an overview of the common techniques used to date for realistic quadruped animation. This includes an outline of the various ways that realistic quadruped motion can be achieved, through video-based acquisition, physics-based models, inverse kinematics or some combination of the above. [source]


    Artificial Animals and Humans: From Physics to Intelligence

    COMPUTER GRAPHICS FORUM, Issue 3 2002
    Demetri Terzopoulos
    The confluence of virtual reality and artificial life, an emerging discipline that spans the computational and biological sciences, has yielded synthetic worlds inhabited by realistic, artificial flora and fauna. Artificial animals are complex synthetic organisms that possess functional biomechanical bodies, sensors, and brains with locomotion, perception, behavior, learning, and cognition centers. Artificial humans and other animals are of interest in computer graphics because they are self-animating characters that dramatically advance the state of the art of production animation and interactive game technologies. More broadly, these biomimetic autonomous agents in their realistic virtual worlds also foster deeper, computationally oriented insights into natural living systems. [source]


    A Multiresolution Model for Soft Objects Supporting Interactive Cuts and Lacerations

    COMPUTER GRAPHICS FORUM, Issue 3 2000
    Fabio Ganovelli
    Performing a truly interactive and physically-based simulation of complex soft objects is still an open problem in computer animation/simulation. Given the application domain of virtual surgery training, a complete model should be quite realistic, interactive and should enable the user to modify the topology of the objects. Recent papers propose the adoption of multiresolution techniques to optimize time performance by representing at high resolution only the object parts considered more important or critical. The speed-up obtainable at simulation time is counterbalanced by the need for a preprocessing phase that is strongly dependent on the topology of the object, with the drawback that performing dynamic topology modification becomes a prohibitive issue. In this paper we present an approach that couples multiresolution and topological modifications, based on the adoption of a particle-system approach to the physical simulation. Our approach is based on a tetrahedral decomposition of the space, chosen both for its suitability to support a particle system and for the ready availability of many techniques recently proposed for the simplification and multiresolution management of 3D simplicial decompositions. The multiresolution simulation system is designed to ensure the required speed-up and to support dynamic changes of the topology, e.g. due to cuts or lacerations of the represented tissue. [source]


    Some benefits of dichotomization in psychiatric and criminological research

    CRIMINAL BEHAVIOUR AND MENTAL HEALTH, Issue 2 2000
    Professor David P. Farrington PhD FBA
    Background The product-moment correlation r is widely used in criminology and psychiatry to measure strength of association. However, most criminological and psychiatric variables contravene its underlying assumptions. Aim To compare statistical measures of association based on dichotomous variables with the use of r. Method Explanatory variables for delinquency are investigated in the Pittsburgh Youth Study using a sample of 506 boys aged 13–14. Results Dichotomization does not necessarily cause a decrease in measured strength of associations. Conclusions about the most important explanatory variables for delinquency were not greatly affected by using dichotomous as opposed to continuous variables, by different dichotomization splits, or by using logistic versus OLS multiple regression. Non-linear relationships, interaction effects and multiple risk factor individuals were easily studied using dichotomous data. Conclusions Dichotomization produces meaningful findings that are easily understandable to a wide audience. Measures of association for dichotomous variables, such as the odds ratio, have many advantages and are often more realistic and meaningful measures of strength of relationship than the product-moment correlation r. Copyright © 2000 Whurr Publishers Ltd. [source]
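
    The contrast the abstract draws can be illustrated numerically. The sketch below uses purely hypothetical scores (not the Pittsburgh Youth Study data) to compute the product-moment correlation r on continuous variables and the odds ratio after dichotomising both at their medians.

```python
# A minimal sketch (hypothetical scores, not the Pittsburgh Youth Study data)
# contrasting the product-moment correlation r on continuous variables with
# the odds ratio obtained after dichotomising both variables at their medians.
import numpy as np

rng = np.random.default_rng(0)
n = 506                                        # sample size quoted in the abstract
risk = rng.normal(size=n)                      # hypothetical explanatory score
delinquency = 0.4 * risk + rng.normal(size=n)  # hypothetical outcome score

r = np.corrcoef(risk, delinquency)[0, 1]       # product-moment correlation

# median splits -> 2x2 table
x = risk > np.median(risk)
y = delinquency > np.median(delinquency)
a, b = np.sum(x & y), np.sum(x & ~y)
c, d = np.sum(~x & y), np.sum(~x & ~y)
odds_ratio = (a * d) / (b * c)

print(f"r = {r:.2f}, odds ratio = {odds_ratio:.2f}")
```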


    Misleading hallucinations in unrecognized narcolepsy

    ACTA PSYCHIATRICA SCANDINAVICA, Issue 4 2003
    A. Sz
    Objective: To describe psychosis-like hallucinatory states in unrecognized narcolepsy. Method: Two patients with hypnagogic/hypnopompic hallucinations are presented. Results: Both patients had realistic and complex (multi-modal and scenic) daytime sexual hallucinations leading, in the first case, to a legal procedure because of false accusation, and in the second, to serious workplace conflicts. Both patients were convinced of the reality of their hallucinatory experiences but later both were able to recognize their hallucinatory character. Clinical data, a multiple sleep latency test, polysomnography, and HLA typing revealed that both patients suffered from narcolepsy. Conclusion: We suggest that in unrecognized narcolepsy with daytime hypnagogic/hypnopompic hallucinations the diagnostic procedure may mistakenly incline towards delusional psychoses. Daytime realistic hypnagogic/hypnopompic hallucinations may also have forensic consequences and mislead legal evaluation. Useful clinical features in differentiating narcolepsy from psychoses are: the presence of other narcoleptic symptoms, features of hallucinations, and response to adequate medication. [source]


    Use of terrain variables for mapping gully erosion susceptibility in Lebanon

    EARTH SURFACE PROCESSES AND LANDFORMS, Issue 12 2007
    Rania Bou Kheir
    Abstract This paper predicts the geographic distribution and size of gullies across central Lebanon using a geographic information system (GIS) and terrain analysis. Eleven primary topographic variables (elevation; upslope contributing area; aspect; slope; plan, profile and tangential curvature; flow direction; flow width; flow path length; rate of change of specific catchment area along the direction of flow) and three secondary ones (steady-state topographic wetness, quasi-dynamic topographic wetness, sediment transport capacity) were generated and used along with digital data collected from other sources (soil, geology) to statistically explain gully erosion field measurements. Three tree-based regression models were developed using (1) all variables, (2) primary topographic variables only and (3) different pairs of variables. The best regression tree model combined the steady-state topographic wetness and sediment transport capacity indices and explained 80% of the variability in field gully measurements. This model proved to be simple, quick, realistic and practical, and it can be applied to other areas of the Mediterranean region with similar environmental conditions, thereby providing a tool to help with the implementation of plans for soil conservation and sustainable management. Copyright © 2007 John Wiley & Sons, Ltd. [source]
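
    The two secondary terrain attributes that the best model combines are standard terrain-analysis quantities. The sketch below shows one common way to compute them (the formulations attributed to Moore and co-workers are assumed); the slope and area values are hypothetical, not the Lebanese DEM data.

```python
# A hedged sketch of the two secondary terrain attributes combined by the best
# model: the steady-state topographic wetness index and a sediment transport
# capacity (LS) index, in commonly used formulations. Grid values are
# hypothetical.
import numpy as np

def wetness_index(spec_catchment_area, slope_rad):
    """ln(As / tan(beta)), with a small floor to avoid division by zero."""
    return np.log(spec_catchment_area / np.maximum(np.tan(slope_rad), 1e-6))

def sediment_transport_index(spec_catchment_area, slope_rad):
    """(As / 22.13)**0.6 * (sin(beta) / 0.0896)**1.3."""
    return (spec_catchment_area / 22.13) ** 0.6 * (np.sin(slope_rad) / 0.0896) ** 1.3

As = np.array([50.0, 400.0, 2500.0])   # specific catchment area, m^2 per m contour
beta = np.radians([2.0, 8.0, 20.0])    # slope angle in degrees -> radians
print(wetness_index(As, beta))
print(sediment_transport_index(As, beta))
```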


    Empirical estimate of fundamental frequencies and damping for Italian buildings

    EARTHQUAKE ENGINEERING AND STRUCTURAL DYNAMICS, Issue 8 2009
    Maria Rosaria Gallipoli
    Abstract The aim of this work is to estimate the fundamental translational frequencies and relative damping of a large number of existing buildings, performing ambient vibration measurements. The first part of the work is devoted to comparing the results obtained with microtremor measurements with those obtained from earthquake recordings using four different techniques: horizontal-to-vertical spectral ratio, standard spectral ratio, non-parametric damping analysis (NonPaDAn) and the half-bandwidth method. We recorded local earthquakes on a five-storey reinforced concrete building with a pair of accelerometers located on the ground and on the top floor, and then collected microtremors at the same locations as the accelerometers. The agreement between the results obtained with microtremors and earthquakes encouraged us to extend ambient noise measurements to a large number of buildings. We analysed the data with the above-mentioned methods to obtain the two main translational frequencies in orthogonal directions and their relative damping for 80 buildings in the urban areas of Potenza and Senigallia (Italy). The frequencies determined with different techniques are in good agreement. We do not have the same satisfactory results for the estimates of damping: the NonPaDAn provides estimates that are less dispersed and grouped around values that appear to be more realistic. Finally, we have compared the measured frequencies with other experimental results and theoretical models. Our results confirm, as reported by previous authors, that the theoretical period–height relationships overestimate the experimental data. Copyright © 2008 John Wiley & Sons, Ltd. [source]
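
    For readers unfamiliar with the half-bandwidth method mentioned above, the sketch below shows the basic calculation on a synthetic single-degree-of-freedom spectrum: the damping ratio is read from the width of the resonance peak at 1/sqrt(2) of its maximum. The spectrum and parameter values are illustrative only.

```python
# A minimal sketch of the half-bandwidth (half-power) damping estimate on a
# synthetic single-degree-of-freedom spectrum; all values are illustrative.
import numpy as np

def half_bandwidth_damping(freqs, amps):
    i0 = np.argmax(amps)
    f0, half_power = freqs[i0], amps[i0] / np.sqrt(2.0)
    above = np.where(amps >= half_power)[0]     # contiguous band around the peak
    f1, f2 = freqs[above[0]], freqs[above[-1]]
    return (f2 - f1) / (2.0 * f0)               # damping ratio estimate

# synthetic amplification curve: resonance at 2.5 Hz, 3% damping
f = np.linspace(0.5, 6.0, 2000)
f0, zeta = 2.5, 0.03
amps = 1.0 / np.sqrt((1 - (f / f0) ** 2) ** 2 + (2 * zeta * f / f0) ** 2)
print(half_bandwidth_damping(f, amps))          # ~0.03
```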


    Parameter identification of framed structures using an improved finite element model-updating method,Part I: formulation and verification

    EARTHQUAKE ENGINEERING AND STRUCTURAL DYNAMICS, Issue 5 2007
    Eunjong Yu
    Abstract In this study, we formulate an improved finite element model-updating method to address the numerical difficulties associated with ill-conditioning and rank deficiency. These complications are frequently encountered in model-updating problems and occur when the identification of a larger number of physical parameters is attempted than is warranted by the information content of the experimental data. Based on the standard bounded-variables least-squares (BVLS) method, which incorporates the usual upper/lower-bound constraints, the proposed method (henceforth referred to as BVLSrc) is equipped with novel sensitivity-based relative constraints. The relative constraints are automatically constructed using the correlation coefficients between the sensitivity vectors of the updating parameters. The veracity and effectiveness of BVLSrc are investigated through the simulated, yet realistic, forced-vibration testing of a simple framed structure using its frequency response function as input data. By comparing the results of BVLSrc with those obtained via the competing pure BVLS and regularization methods, we show that BVLSrc and regularization methods yield approximate solutions with similar and sufficiently high accuracy, while the pure BVLS method yields physically inadmissible solutions. We further demonstrate that BVLSrc is computationally more efficient, because, unlike regularization methods, it does not require laborious a priori calculations to determine an optimal penalty parameter, and its results are far less sensitive to the initial estimates of the updating parameters. Copyright © 2006 John Wiley & Sons, Ltd. [source]
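
    The bounded-variables least-squares step that BVLSrc builds on can be sketched with an off-the-shelf solver. The example below is not the authors' formulation; it only illustrates solving min ||Ax - b|| subject to box bounds, with a hypothetical sensitivity matrix and simulated measurements.

```python
# A hedged sketch of the bounded-variables least-squares (BVLS) step only:
# solve min ||A x - b|| subject to box bounds on the updating parameters.
# The sensitivity matrix A and response vector b are hypothetical stand-ins,
# not the frequency-response data used in the paper.
import numpy as np
from scipy.optimize import lsq_linear

rng = np.random.default_rng(1)
A = rng.normal(size=(40, 6))                        # sensitivities of the residuals
x_true = np.array([1.1, 0.9, 1.05, 0.8, 1.2, 0.95])
b = A @ x_true + 0.01 * rng.normal(size=40)         # simulated measurements

# physical admissibility: correction factors constrained to [0.5, 1.5]
result = lsq_linear(A, b, bounds=(0.5, 1.5))
print(result.x)
```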


    Inelastic earthquake response of single-story asymmetric buildings: an assessment of simplified shear-beam models

    EARTHQUAKE ENGINEERING AND STRUCTURAL DYNAMICS, Issue 12 2003
    K. G. Stathopoulos
    Abstract The inelastic seismic torsional response of simple structures is examined by means of shear-beam type models as well as with plastic hinge idealization of one-story buildings. Using mean values of ductility factors, obtained for groups of ten earthquake motions, as the basic index of post-elastic response, the following topics are examined with the shear-beam type model: mass eccentric versus stiffness eccentric systems, effects of different types of motions and effects of double eccentricities. Subsequently, comparisons are made with results obtained using a more realistic, plastic hinge type model of single-story reinforced concrete frame buildings designed according to a modern Code. The consequences of designing for different levels of accidental eccentricity are also examined for the aforementioned frame buildings. Copyright © 2003 John Wiley & Sons, Ltd. [source]


    Extinction debt on oceanic islands

    ECOGRAPHY, Issue 2 2010
    Kostas A. Triantis
    Habitat destruction is the leading cause of species extinctions. However, there is typically a time-lag between the reduction in habitat area and the eventual disappearance of the remnant populations. These "surviving but ultimately doomed" species represent an extinction debt. Calculating the magnitude of such future extinction events has been hampered by potentially inaccurate assumptions about the slope of species–area relationships, which are habitat- and taxon-specific. We overcome this challenge by applying a method that uses the historical sequence of deforestation in the Azorean Islands, to calculate realistic and ecologically-adjusted species–area relationships. The results reveal dramatic and hitherto unrecognized levels of extinction debt, as a result of the extensive destruction of the native forest: >95% in <600 yr. Our estimations suggest that more than half of the extant forest arthropod species, which have evolved in and are dependent on the native forest, might eventually be driven to extinction. Data on species abundances from Graciosa Island, where only a very small patch of secondary native vegetation still exists, as well as the number of species that have not been found in the last 45 yr, despite the extensive sampling effort, offer support to the predictions made. We argue that immediate action to restore and expand native forest habitat is required to avert the loss of numerous endemic species in the near future. [source]
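
    The species-area bookkeeping that underlies an extinction-debt estimate can be sketched as follows; the z exponents and remaining-forest fraction below are hypothetical placeholders, not the ecologically-adjusted values fitted in the paper.

```python
# A minimal sketch of the species-area bookkeeping behind an extinction-debt
# estimate: with S = c * A**z, the fraction of species eventually retained is
# (A_new / A_old)**z. The z values and remaining-forest fraction are
# hypothetical, not the ecologically-adjusted values fitted in the paper.
def surviving_fraction(area_new, area_old, z):
    return (area_new / area_old) ** z

remaining_forest = 0.05                 # >95% of native forest destroyed
for z in (0.25, 0.35, 0.5):
    lost = 1.0 - surviving_fraction(remaining_forest, 1.0, z)
    print(f"z = {z}: ~{lost * 100:.0f}% of forest specialists eventually lost")
```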


    The importance of interspecific interactions for breeding-site selection: peregrine falcons seek proximity to raven nests

    ECOGRAPHY, Issue 6 2004
    Fabrizio Sergio
    The advent of GIS is initiating a rapid increase in the utilization of wildlife-habitat models as tools for species and habitat management. However, such models rarely include estimates of interspecific interactions among explanatory variables. We tested the importance of such variables by using the peregrine falcon Falco peregrinus, a medium-sized raptor frequently reported to be affected by heterospecifics, as a model species. In an Alpine population, compared to random locations, peregrines selected breeding sites farther from conspecifics, on taller cliffs, with higher availability of farmland and closer to raven Corvus corax nests. Within suitable habitat, peregrines selected sites near ravens and far from elevations associated with golden eagle Aquila chrysaetos nests. Productivity increased with cliff size, farmland availability (rich in the local main prey) and with proximity to ravens, suggesting that the observed choices were adaptive. Finally, at the regional level, peregrine density peaked at low elevation and was positively associated with raven density. The results suggested an active breeding association of peregrines with ravens, which may provide early-warning cues against predators and safe alternative nest-sites. They also confirmed the importance of including estimates of interspecific interactions among explanatory variables, which may: 1) make models more realistic; 2) increase their predictive power by lowering unexplained variance due to unmeasured factors; 3) provide unexpected results such as the cryptic, large-scale breeding association of our study; and 4) stimulate further hypothesis formulation and testing, ultimately leading to deeper ecological knowledge of the study system. [source]


    Estimating the number of alcohol-attributable deaths: methodological issues and illustration with French data for 2006

    ADDICTION, Issue 6 2010
    Grégoire Rey
    ABSTRACT Aims Computing the number of alcohol-attributable deaths requires a series of hypotheses. Using French data for 2006, the potential biases are reviewed and the sensitivity of estimates to various hypotheses evaluated. Methods Self-reported alcohol consumption data were derived from large population-based surveys. The risks of occurrence of diseases associated with alcohol consumption and relative risks for all-cause mortality were obtained through literature searches. All-cause and cause-specific population alcohol-attributable fractions (PAAFs) were calculated. In order to account for potential under-reporting, the impact of adjustment on sales data was tested. The 2006 mortality data were restricted to people aged between 15 and 75 years. Results When alcohol consumption distribution was adjusted for sales data, the estimated number of alcohol-attributable deaths, the sum of the cause-specific estimates, was 20 255. Without adjustment, the estimate fell to 7158. Using an all-cause mortality approach, the adjusted number of alcohol-attributable deaths was 15 950, while the non-adjusted estimate was a negative number. Other methodological issues, such as computation based on risk estimates for all causes for 'all countries' or only 'European countries', also influenced the results, but to a lesser extent. Discussion The estimates of the number of alcohol-attributable deaths varied greatly, depending upon the hypothesis used. The most realistic and evidence-based estimate seems to be obtained by adjusting the consumption data for national alcohol sales, and by summing the cause-specific estimates. However, interpretation of the estimates must be cautious in view of their potentially large imprecision. [source]
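
    The cause-specific calculation summed in the abstract rests on population attributable fractions. A minimal sketch of the standard Levin-type formula is given below; the prevalence and relative-risk values are hypothetical, not the French 2006 inputs.

```python
# A hedged sketch of a cause-specific population alcohol-attributable fraction
# (PAAF) using the standard Levin-type formula over consumption categories.
# The prevalence and relative-risk values are hypothetical, not the French
# 2006 inputs used in the paper.
def paaf(prevalences, relative_risks):
    """Sum_i p_i*(RR_i - 1) / (Sum_i p_i*(RR_i - 1) + 1)."""
    excess = sum(p * (rr - 1.0) for p, rr in zip(prevalences, relative_risks))
    return excess / (excess + 1.0)

p = [0.30, 0.10, 0.05]      # hypothetical prevalence of three drinking levels
rr = [1.2, 1.8, 3.5]        # hypothetical relative risks for one cause of death
deaths_observed = 4000
print(round(paaf(p, rr) * deaths_observed), "deaths attributable to alcohol")
```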


    Plant–soil feedbacks and invasive spread

    ECOLOGY LETTERS, Issue 9 2006
    Jonathan M. Levine
    Abstract Plant invaders have been suggested to change soil microbial communities and biogeochemical cycling in ways that can feed back to benefit themselves. In this paper, we ask when these feedbacks influence the spread of exotic plants. Because answering this question is empirically challenging, we show how ecological theory on 'pushed' and 'pulled' invasions can be used to examine the problem. We incorporate soil feedbacks into annual plant invasion models, derive the conditions under which such feedbacks affect spread, and support our approach with simulations. We show that in homogeneous landscapes, strong positive feedbacks can influence spreading velocity for annual invaders, but that empirically documented feedbacks are not strong enough to do so. Moreover, to influence spread, invaders must modify the soil environment over a spatial scale larger than is biologically realistic. Though unimportant for annual invader spread in our models, feedbacks do affect invader density and potential impact. We discuss how future research might consider the way landscape structure, dispersal patterns, and the time scales over which plant–soil feedbacks develop regulate the effects of such feedbacks on invader spread. [source]
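
    The 'pulled' invasion logic referenced above can be illustrated with the classic reaction-diffusion result that the front speed depends only on the linearised low-density dynamics, which is why feedbacks acting mainly at high density leave spread largely unchanged. The parameter values below are hypothetical.

```python
# A hedged illustration of the 'pulled' spreading-velocity logic: in the
# classic reaction-diffusion limit the front speed is v = 2*sqrt(r*D), set by
# growth when the invader is rare, so feedbacks acting mainly at high density
# leave the speed unchanged. Parameter values are hypothetical.
import math

def pulled_front_speed(r_edge, D):
    """Fisher front speed from the low-density growth rate and dispersal."""
    return 2.0 * math.sqrt(r_edge * D)

r_edge = 0.8    # per-capita growth rate where the invader is rare (1/generation)
D = 2.0         # dispersal coefficient (m^2 per generation)
print(f"spread rate ~ {pulled_front_speed(r_edge, D):.2f} m per generation")
```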


    LAND RICH AND DATA POOR: MODELLING REQUIREMENTS IN AUSTRALIA'S FAR NORTH

    ECONOMIC PAPERS: A JOURNAL OF APPLIED ECONOMICS AND POLICY, Issue 3 2005
    Natalie Stoeckl
    Economic models have long been used as a way of organising and presenting information for policy makers interested in large regions (e.g. nations), and recent advances in information technology make the goal of developing models for decision makers in other locales a realistic one. The research on which this paper focuses was part of a large project investigating the feasibility and desirability of developing a multi-disciplinary computer model of the Australian Savannas. In the large project, researchers were divided into three teams: those considering the biophysical, demographic, and economic aspects of the modelling problem. This paper presents findings from part of the economic component of the investigation: that which sought information from key local 'stakeholders' about the type of information that would be useful to them. Responses indicate that many of Australia's existing economic models are capable of providing the 'right' type of information, but at too coarse a geographic scale for those in remote regions. Evidently, there is a need for developing other models. [source]


    IS EDUCATIONAL POLICY MAKING RATIONAL – AND WHAT WOULD THAT MEAN, ANYWAY?

    EDUCATIONAL THEORY, Issue 5 2009
    Eric Bredo
    In Moderating the Debate: Rationality and the Promise of American Education, Michael Feuer raises concerns about the consequences of basing educational policy on the model of rational choice drawn from economics. Policy making would be better and more realistic, he suggests, if it were based on a newer procedural model drawn from cognitive science. In this essay Eric Bredo builds on Feuer's analysis by offering a more systematic critique of the traditional model of rationality that Feuer criticizes, a more critical evaluation of the procedural model that he favors, and a recommendation that the situational model he does not consider may have some benefits over both. This analysis shows that the traditional model presupposes an actor that cannot learn or develop. While the actor in the procedural model can learn, Bredo contends that it cannot develop, that is, it cannot outgrow its initial assumptions and values. Only the situational model allows for learning and development, important in a model to be used in the field of education. Bredo also considers in his analysis the social-relational assumptions built into the traditional, procedural, and situational models and the likely ethical consequences of acting on them. [source]


    Web-based virtual patients in dentistry: factors influencing the use of cases in the Web-SP system

    EUROPEAN JOURNAL OF DENTAL EDUCATION, Issue 1 2009
    N. Zary
    Abstract We studied the students' acceptance and utilization of virtual patients (VPs) authored by faculty using the Web-SP system over two consecutive years. We also studied factors of importance for the utilization of VPs for self-assessment. Both year-groups studied found the Web-SP system easy to use and their overall opinion of Web-SP was positive (Median: 5, p25-p75: 4-5). They found the VPs engaging, realistic, fun to use, instructive and relevant to their course. Students used, on average, 9.68 VPs per course, which constitutes 43 percent of the available VPs. The number of VPs available seemed to be sufficient for the target course, even if some of the students preferred a higher number of VPs. Of the VPs encountered, 71% (CI: 68-75%) were VPs with feedback, and correspondingly 29% of the VPs chosen were without feedback. The difference in utilization between both types of VPs was significant, at p < 0.001. Thus, the students clearly favoured VPs with feedback compared to VPs without feedback. There were three modes of engagement in which the VP was utilized. Mode 1 was the preferred mode for VPs without feedback, while mode 3 was dominant for VPs with feedback. Whether or not a VP was selected for review during a teacher-led seminar did not affect student behaviour, at least on the surface. Teacher-led seminars may still be of importance to provide credibility to the VPs by integrating them into the curriculum. [source]


    Simplified yet highly accurate enzyme kinetics for cases of low substrate concentrations

    FEBS JOURNAL, Issue 19 2009
    Hanna M. Härdin
    Much of enzyme kinetics builds on simplifications enabled by the quasi-steady-state approximation, which is highly useful when the concentration of the enzyme is much lower than that of its substrate. However, in vivo, this condition is often violated. In the present study, we show that, under conditions of realistic yet high enzyme concentrations, the quasi-steady-state approximation may readily be off by more than a factor of four when predicting concentrations. We then present a novel extension of the quasi-steady-state approximation based on the zero-derivative principle, which requires considerably less theoretical work than did previous such extensions. We show that the first-order zero-derivative principle already describes the true enzyme dynamics much more accurately at enzyme concentrations close to those of their substrates. This should be particularly relevant for enzyme kinetics where the substrate is an enzyme, such as in phosphorelay and mitogen-activated protein kinase pathways. We illustrate this for the important example of the phosphotransferase system involved in glucose uptake, metabolism and signaling. We find that this system, with a potential complexity of nine dimensions, can be understood accurately using the first-order zero-derivative principle in terms of the behavior of a single variable, with all other concentrations constrained to follow that behavior. [source]
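
    The size of the error the authors report can be illustrated by comparing the full mass-action enzyme system with the standard quasi-steady-state (Michaelis-Menten) rate law when enzyme and substrate concentrations are comparable. The sketch below uses hypothetical rate constants, not the phosphotransferase-system model from the paper.

```python
# A minimal sketch contrasting the full mass-action enzyme system with the
# standard quasi-steady-state (Michaelis-Menten) rate law when the enzyme
# concentration is comparable to the substrate concentration. Rate constants
# and concentrations are hypothetical.
import numpy as np
from scipy.integrate import solve_ivp

k1, km1, k2 = 10.0, 1.0, 1.0        # binding, unbinding, catalysis
E0, S0 = 1.0, 1.0                   # enzyme ~ substrate: QSSA becomes doubtful
Km = (km1 + k2) / k1

def full(t, y):
    S, C = y                        # free substrate and enzyme-substrate complex
    return [-k1 * (E0 - C) * S + km1 * C,
             k1 * (E0 - C) * S - (km1 + k2) * C]

def qssa(t, y):
    S = y[0]
    return [-k2 * E0 * S / (Km + S)]  # Michaelis-Menten rate law

t_eval = np.linspace(0.0, 5.0, 50)
S_full = solve_ivp(full, (0, 5), [S0, 0.0], t_eval=t_eval).y[0]
S_qssa = solve_ivp(qssa, (0, 5), [S0], t_eval=t_eval).y[0]
print(f"max discrepancy (fraction of S0): {np.max(np.abs(S_full - S_qssa)) / S0:.2f}")
```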


    Rational Pricing of Internet Companies Revisited

    FINANCIAL REVIEW, Issue 4 2001
    Eduardo S. Schwartz
    G12 Abstract In this article we expand and improve the Internet company valuation model of Schwartz and Moon (2000) in numerous ways. By using techniques from real options theory and modern capital budgeting, the earlier paper demonstrated that uncertainty about key variables plays a major role in the valuation of high-growth Internet companies. Presently, we make the model more realistic by providing for stochastic costs and future financing, and also by including capital expenditures and depreciation in the analysis. Perhaps more importantly, we offer insights into the practical implementation of the model. An important challenge to implementing the original model was estimating the various parameters of the model. Here, we improve the procedure by setting the speed of adjustment parameters equal to one another, by tying the implied half-life of the revenue growth process to analyst forecasts, and by inferring the risk-adjustment parameter from the observed beta of the company's stock price. We illustrate these extensions in a valuation of the company eBay. [source]
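
    The flavour of the valuation approach, simulating a mean-reverting stochastic revenue-growth process and discounting the outcome, can be sketched as below. This is not the authors' calibrated model; all parameter values are illustrative.

```python
# A hedged sketch of the kind of simulation the valuation framework relies on:
# revenue growth follows a mean-reverting stochastic process, paths are
# simulated, and value is the discounted expected terminal cash flow. All
# parameter values are illustrative, not the eBay calibration from the paper.
import numpy as np

rng = np.random.default_rng(42)
n_paths, n_steps, dt = 20_000, 40, 0.25   # 10 years in quarterly steps
kappa = 0.7                               # speed of adjustment of growth
g0, g_bar = 0.40, 0.03                    # initial and long-run growth rates
sigma_g, r = 0.15, 0.06                   # growth volatility, discount rate
margin, multiple = 0.15, 10.0             # terminal margin and exit multiple

rev = np.full(n_paths, 100.0)
g = np.full(n_paths, g0)
for _ in range(n_steps):
    rev *= np.exp(g * dt)
    g += kappa * (g_bar - g) * dt + sigma_g * np.sqrt(dt) * rng.normal(size=n_paths)

terminal_value = multiple * margin * rev
print(f"value ~ {np.exp(-r * n_steps * dt) * terminal_value.mean():.0f}")
```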


    Evaluation of large-scale stocking of early stages of brown trout, Salmo trutta, to angler catches in the French,Swiss part of the River Doubs

    FISHERIES MANAGEMENT & ECOLOGY, Issue 2 2003
    A. Champigneulle
    Abstract Around 500 000 brown trout, Salmo trutta L., alevins are stocked annually in the 24-km section of the River Doubs under study. All the alevins stocked in the period 1994–1996 were identifiable by fluoromarking their otoliths with tetracycline chlorhydrate. Anglers' catches, between June 1997 and September 1998, comprised trout aged 1+ to 7+, but most (90%+) were 2+ to 4+, with the majority at 2+ and 3+. There was no significant difference in size for a given age between marked and unmarked angled trout. The contribution of stocked fish to anglers' catches was around 22% for the 1995 cohort. The contribution of stocking (cohorts 1994 to 1996) to the 1998 catches was around 23% (95% confidence limits: 19–27%). The estimated recapture rate was three to four trout per 1000 alevins stocked for the 1995 cohort. The major contribution (78%) of natural recruitment to anglers' catches suggests that fishery management based on natural recruitment is still realistic in this part of the River Doubs. [source]


    Optimization of segmented linear Paul traps and transport of stored particles

    FORTSCHRITTE DER PHYSIK/PROGRESS OF PHYSICS, Issue 8-10 2006
    S. Schulz
    Abstract Single ions held in linear Paul traps are promising candidates for a future quantum computer. Here, we discuss a two-layer microstructured segmented linear ion trap. The radial and axial potentials are obtained from numeric field simulations and the geometry of the trap is optimized. As the trap electrodes are segmented in the axial direction, the trap allows the transport of ions between different spatial regions. Starting with realistic numerically obtained axial potentials, we optimize the transport of an ion such that the motional degrees of freedom are not excited, even though the transport speed far exceeds the adiabatic regime. In our optimization we achieve a transport within roughly two oscillation periods in the axial trap potential, compared to typical adiabatic transports that take of the order of 10² oscillations. Furthermore, heating due to quantum mechanical effects is estimated and suppression strategies are proposed. [source]


    From ancient genes to modern communities: the cellular stress response and the evolution of plant strategies

    FUNCTIONAL ECOLOGY, Issue 5 2005
    S. PIERCE
    Summary 1. Two major plant strategy theories attempt to explain how phenotype determines community structure. Crucially, CSR plant strategy theory suggests that stress and sporadic resource availability favour conservative phenotypes, whereas the resource-ratio hypothesis views the spatial heterogeneity of resources as selecting for optimal foraging in chronically unproductive habitats. Which view is most realistic? 2. The ecophysiology literature demonstrates that stress comprises two processes: (1) limitation of resource supply to metabolism; and (2) damage to biomembranes, proteins and genetic material (chronic stress). Thus stress is defined mechanistically as the suboptimal performance of metabolism. 3. Adaptations to limitation buffer metabolism against variability in external resource supply; internal storage pools are more consistent. Chronic stress elicits the same ancient cellular stress response in all cellular life: investment in stress metabolites that preserve the integrity and compartmentalization of metabolic components in concert with molecular damage-repair mechanisms. 4. The cellular stress response was augmented by morphological innovations during the Silurian–Devonian terrestrial radiation, during which nutrient limitation appears to have been a principal selection pressure (sensu CSR theory). 5. The modern stress-tolerator syndrome is conservative and supports metabolism in limiting or fluctuating environmental conditions: standing resource pools with high investment/maintenance costs impose high internal diffusion resistances and limit inherent growth rate (sensu CSR theory). 6. The resource-ratio hypothesis cannot account for the cellular stress response or the crucial role of ombrotrophy in primary succession. CSR theory agrees with current understanding of the cellular stress response, terrestrial radiation and modern adaptations recorded in chronically unproductive habitats, and is applicable as CSR classification. [source]


    AVO investigations of shallow marine sediments

    GEOPHYSICAL PROSPECTING, Issue 2 2001
    M. Riedel
    Amplitude-variation-with-offset (AVO) analysis is based on the Zoeppritz equations, which enable the computation of reflection and transmission coefficients as a function of offset or angle of incidence. High-frequency (up to 700 Hz) AVO studies, presented here, have been used to determine the physical properties of sediments in a shallow marine environment (20 m water depth). The properties that can be constrained are P- and S-wave velocities, bulk density and acoustic attenuation. The use of higher frequencies requires special analysis including careful geometry and source and receiver directivity corrections. In the past, marine sediments have been modelled as elastic materials. However, viscoelastic models which include absorption are more realistic. At angles of incidence greater than 40°, AVO functions derived from viscoelastic models differ from those with purely elastic properties in the absence of a critical angle of incidence. The influence of S-wave velocity on the reflection coefficient is small (especially for low S-wave velocities encountered at the sea-floor). Thus, it is difficult to extract the S-wave parameter from AVO trends. On the other hand, P-wave velocity and density show a considerably stronger effect. Attenuation (described by the quality factor Q) influences the reflection coefficient but could not be determined uniquely from the AVO functions. In order to measure the reflection coefficient in a seismogram, the amplitudes of the direct wave and the sea-floor reflection in a common-midpoint (CMP) gather are determined and corrected for spherical divergence as well as source and streamer directivity. At CMP locations showing the different AVO characteristics of a mud and a boulder clay, the sediment physical properties are determined by using a sequential-quadratic-programming (SQP) inversion technique. The inverted sediment physical properties for the mud are: P-wave velocity 1450 ± 25 m/s, S-wave velocity 90 ± 35 m/s, density 1220 ± 45 kg/m³, quality factor for P-waves QP = 15 ± 200, quality factor for S-waves QS = 10 ± 30. The inverted sediment physical properties for the boulder clay are: P-wave velocity 1620 ± 45 m/s, S-wave velocity 360 ± 200 m/s, density 1380 ± 85 kg/m³, QP = 790 ± 660, QS = 25 ± 10. [source]
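
    As a rough plausibility check on the inverted values quoted above, the normal-incidence P-wave reflection coefficient at the sea-floor follows from the acoustic impedance contrast, R = (Z2 - Z1)/(Z2 + Z1). The sea-water velocity and density below are assumed values, not taken from the paper.

```python
# A rough plausibility check using the inverted properties quoted above: the
# normal-incidence P-wave reflection coefficient at the sea-floor from the
# acoustic impedance contrast, R = (Z2 - Z1) / (Z2 + Z1). The sea-water
# velocity and density are assumed values, not taken from the paper.
def normal_incidence_R(v1, rho1, v2, rho2):
    z1, z2 = v1 * rho1, v2 * rho2
    return (z2 - z1) / (z2 + z1)

v_water, rho_water = 1500.0, 1030.0   # assumed sea-water values (m/s, kg/m^3)
print("mud:          R =", round(normal_incidence_R(v_water, rho_water, 1450.0, 1220.0), 3))
print("boulder clay: R =", round(normal_incidence_R(v_water, rho_water, 1620.0, 1380.0), 3))
```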


    Dynamic versus static models in cost-effectiveness analyses of anti-viral drug therapy to mitigate an influenza pandemic

    HEALTH ECONOMICS, Issue 5 2010
    Anna K. Lugnér
    Abstract Conventional (static) models used in health economics implicitly assume that the probability of disease exposure is constant over time and unaffected by interventions. For transmissible infectious diseases this is not realistic, and another class of models is required, so-called dynamic models. This study aims to examine the differences between one dynamic and one static model, estimating the effects of therapeutic treatment with antiviral (AV) drugs during an influenza pandemic in the Netherlands. Specifically, we focus on the sensitivity of the cost-effectiveness ratios to model choice, to the assumed drug coverage, and to the value of several epidemiological factors. Therapeutic use of AV-drugs is cost-effective compared with non-intervention, irrespective of which model approach is chosen. The findings further show that: (1) the cost-effectiveness ratio according to the static model is insensitive to the size of a pandemic, whereas the ratio according to the dynamic model increases with the size of a pandemic; (2) according to the dynamic model, the cost per infection and the life-years gained per treatment are not constant but depend on the proportion of cases that are treated; and (3) the age-specific clinical attack rates affect the sensitivity of the cost-effectiveness ratio to model choice. Copyright © 2009 John Wiley & Sons, Ltd. [source]
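
    The feedback that distinguishes the dynamic model can be illustrated with a simple SIR final-size calculation: treating a fraction of cases lowers onward transmission, which changes the attack rate itself, something a static model cannot capture. The sketch below uses illustrative parameters, not the Dutch pandemic inputs.

```python
# A hedged sketch of the feedback a dynamic model captures: in an SIR
# final-size calculation, treating a fraction of cases reduces onward
# transmission, which changes the attack rate itself. Parameter values are
# illustrative only, not the Dutch pandemic inputs.
import math

def attack_rate(R0):
    """Solve the SIR final-size relation z = 1 - exp(-R0 * z) by iteration."""
    z = 0.5
    for _ in range(100):
        z = 1.0 - math.exp(-R0 * z)
    return z

R0 = 1.8
coverage, infectiousness_reduction = 0.5, 0.6   # assumed AV treatment effects

static_attack = attack_rate(R0)                 # exposure risk treated as fixed
R0_treated = R0 * (1.0 - coverage * infectiousness_reduction)
dynamic_attack = attack_rate(R0_treated)        # treatment feeds back on spread

print(f"attack rate without feedback: {static_attack:.2f}")
print(f"attack rate with feedback:    {dynamic_attack:.2f}")
```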


    Interior layout design of passenger vehicles with RAMSIS

    HUMAN FACTORS AND ERGONOMICS IN MANUFACTURING & SERVICE INDUSTRIES, Issue 2 2005
    Christian Vogt
    The interior of passenger vehicles and the adapting of interior components to the human body are designed with historical guidelines, based on the experiences of the manufacturer. In contrast to this, the aim of the following study is to create a consistent and theoretically justified procedure to design the interior layout. Using the advantages of virtual design, this will be done with the software tool RAMSIS from scratch. First, four theoretical seating concepts are generated, each fixing one point of the human body (eye point, H-point, hand point, or heel point) at fixed coordinates for all anthropometric types. Then, the most practical concept is applied together with the geometry of a given vehicle. To generate a realistic and ergonomic seating concept, studies are made concerning the posture of legs and feet in relation to the pedals of the vehicle. The result is a final seating concept with fields of adjustment for seat and steering wheel. © 2005 Wiley Periodicals, Inc. Hum Factors Man 15: 197–212, 2005. [source]


    Evaluation of in silico splice tools for decision-making in molecular diagnosis,

    HUMAN MUTATION, Issue 7 2008
    Claude Houdayer
    Abstract It appears that all types of genomic nucleotide variations can be deleterious by affecting normal pre-mRNA splicing via disruption/creation of splice site consensus sequences. As it is neither pertinent nor realistic to perform functional testing for all of these variants, it is important to identify those that could lead to a splice defect in order to restrict transcript analyses to the most appropriate cases. Web-based tools designed to provide such predictions are available. We evaluated the performance of six of these tools (Splice Site Prediction by Neural Network [NNSplice], Splice-Site Finder [SSF], MaxEntScan [MES], Automated Splice-Site Analyses [ASSA], Exonic Splicing Enhancer [ESE] Finder, and Relative Enhancer and Silencer Classification by Unanimous Enrichment [RESCUE]-ESE) using 39 unrelated retinoblastoma patients carrying different RB1 variants (31 intronic and eight exonic). These 39 patients were screened for abnormal splicing using puromycin-treated cell lines and the results were compared to the predictions. As expected, 17 variants impacting canonical AG/GT splice sites were correctly predicted as deleterious. A total of 22 variations occurring at loosely defined positions (±60 nucleotides from an AG/GT site) led to a splice defect in 19 cases and 16 of them were classified as deleterious by at least one tool (84% sensitivity). In other words, three variants escaped in silico detection and the remaining three were correctly predicted as neutral. Overall our results suggest that a combination of complementary in silico tools is necessary to guide molecular geneticists (balance between the time and cost required by RNA analysis and the risk of missing a deleterious mutation) because the weaknesses of one in silico tool may be overcome by the results of another tool. Hum Mutat 29(7), 975–982, 2008. © 2008 Wiley-Liss, Inc. [source]


    Incorporating variable source area hydrology into a curve-number-based watershed model

    HYDROLOGICAL PROCESSES, Issue 25 2007
    Elliot M. Schneiderman
    Abstract Many water quality models use some form of the curve number (CN) equation developed by the Soil Conservation Service (SCS; U.S. Department of Agriculture) to predict storm runoff from watersheds based on an infiltration-excess response to rainfall. However, in humid, well-vegetated areas with shallow soils, such as in the northeastern USA, the predominant runoff-generating mechanism is saturation-excess on variable source areas (VSAs). We reconceptualized the SCS-CN equation for VSAs, and incorporated it into the General Watershed Loading Function (GWLF) model. The new version of GWLF, named the Variable Source Loading Function (VSLF) model, simulates the watershed runoff response to rainfall using the standard SCS-CN equation, but spatially distributes the runoff response according to a soil wetness index. We spatially validated VSLF runoff predictions and compared VSLF to GWLF for a subwatershed of the New York City Water Supply System. The spatial distribution of runoff from VSLF is more physically realistic than the estimates from GWLF. This has important consequences for water quality modeling, and for the use of models to evaluate and guide watershed management, because correctly predicting the coincidence of runoff generation and pollutant sources is critical to simulating non-point source (NPS) pollution transported by runoff. [source]
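
    For reference, the standard SCS-CN runoff equation that both GWLF and VSLF start from can be written in a few lines; the curve number and rainfall depths below are hypothetical.

```python
# A minimal sketch of the standard SCS-CN runoff equation the abstract starts
# from (depths in inches, initial abstraction Ia = 0.2*S); the curve number
# and rainfall depths are hypothetical.
def scs_runoff(P, CN):
    S = 1000.0 / CN - 10.0              # potential maximum retention
    Ia = 0.2 * S                        # initial abstraction
    return 0.0 if P <= Ia else (P - Ia) ** 2 / (P - Ia + S)

for P in (0.5, 1.5, 3.0):               # storm rainfall depths, inches
    print(P, "->", round(scs_runoff(P, CN=75), 2), "inches of runoff")
```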


    Designing and Implementing an Information System for the Dental Office of Branckowitz & Young

    ACCOUNTING PERSPECTIVES, Issue 4 2008
    Alex Nikitkov
    ABSTRACT This case provides students with the opportunity to create a functional information system (IS) for a service company. The case facilitates a guided hands-on experience where students learn to analyze a business entity in the context of its environment; recognize what business processes comprise an entity's value chain; and develop, document, and implement a tailor-made IS to support the entity's operation. In order to keep the amount of development realistic and the system transparent for students, the case focuses on a small service company: a dental office. The case uses a resource,events,agents (REA) analytical framework for modeling and Microsoft Access for IS implementation. The case is structured modularly, enabling instructors to either explain material or demonstrate analysis/development of a segment of an IS in class and then challenge the students to complete the module's development following the instructor's example. Instructors have the flexibility to give students fewer (or additional) directions in developing the information system, depending on the students' backgrounds and abilities. Instructors also have a choice to limit the scope of the development and implementation to any number of four business processes. [source]