Unrealistic
Terms modified by Unrealistic: Selected Abstracts

User transparency: a fully sequential programming model for efficient data parallel image processing
CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 6 2004. F. J. Seinstra
Abstract: Although many image processing applications are ideally suited for parallel implementation, most researchers in imaging do not benefit from high-performance computing on a daily basis. Essentially, this is because no parallelization tools exist that truly match the image processing researcher's frame of reference. As it is unrealistic to expect imaging researchers to become experts in parallel computing, tools must be provided to allow them to develop high-performance applications in a highly familiar manner. In an attempt to provide such a tool, we have designed a software architecture that allows transparent (i.e. sequential) implementation of data parallel imaging applications for execution on homogeneous distributed-memory MIMD-style multicomputers. This paper presents an extensive overview of the design rationale behind the software architecture, and gives an assessment of the architecture's effectiveness in providing significant performance gains. In particular, we describe the implementation and automatic parallelization of three well-known example applications that contain many fundamental imaging operations: (1) template matching; (2) multi-baseline stereo vision; and (3) line detection. Based on experimental results we conclude that our software architecture constitutes a powerful and user-friendly tool for obtaining high performance in many important image processing research areas. Copyright © 2004 John Wiley & Sons, Ltd. [source]
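The data-parallel pattern this abstract describes lends itself to a compact illustration. The sketch below is an illustration under assumed names and parameters, not Seinstra et al.'s architecture or API: it distributes template matching over worker processes by splitting the image into overlapping horizontal strips, the kind of decomposition such an architecture would hide from the sequential programmer.

```python
# Minimal sketch of data-parallel template matching: the image is split
# into horizontal strips (overlapping by the template height minus one
# row so no match is missed) and each strip is scored independently,
# mimicking distributed-memory data parallelism on one machine.
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def ssd_map(strip: np.ndarray, template: np.ndarray) -> np.ndarray:
    """Sum-of-squared-differences score for every template position."""
    th, tw = template.shape
    h, w = strip.shape
    out = np.empty((h - th + 1, w - tw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            diff = strip[i:i + th, j:j + tw] - template
            out[i, j] = np.sum(diff * diff)
    return out

def parallel_match(image: np.ndarray, template: np.ndarray, workers: int = 4):
    th = template.shape[0]
    step = image.shape[0] // workers
    strips = [image[k * step: min(image.shape[0], (k + 1) * step + th - 1)]
              for k in range(workers)]
    with ProcessPoolExecutor(workers) as pool:
        maps = list(pool.map(ssd_map, strips, [template] * workers))
    return np.vstack(maps)

if __name__ == "__main__":  # guard needed for process pools on spawn platforms
    rng = np.random.default_rng(0)
    img = rng.random((240, 320))
    tmpl = img[100:108, 150:158].copy()
    scores = parallel_match(img, tmpl)
    print(np.unravel_index(np.argmin(scores), scores.shape))  # -> (100, 150)
```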
Why bartering biodiversity fails
CONSERVATION LETTERS, Issue 4 2009. Susan Walker
Abstract: Regulatory biodiversity trading (or biodiversity "offsets") is increasingly promoted as a way to enable both conservation and development while achieving "no net loss" or even "net gain" in biodiversity, but to date has facilitated development while perpetuating biodiversity loss. Ecologists seeking improved biodiversity outcomes are developing better assessment tools and recommending more rigorous restrictions and enforcement. We explain why such recommendations overlook and cannot correct key causes of failure to protect biodiversity. Viable trading requires simple, measurable, and interchangeable commodities, but the currencies, restrictions, and oversight needed to protect complex, difficult-to-measure, and noninterchangeable resources like biodiversity are costly and intractable. These safeguards compromise trading viability and benefit neither traders nor regulatory officials. Political theory predicts that (1) biodiversity protection interests will fail to counter motivations for officials to resist and relax safeguards to facilitate exchanges and resource development at cost to biodiversity, and (2) trading is more vulnerable than pure administrative mechanisms to institutional dynamics that undermine environmental protection. Delivery of no net loss or net gain through biodiversity trading is thus administratively improbable and technically unrealistic. The proliferation of such programs without credible solutions suggests biodiversity offset programs are successful "symbolic policies", potentially obscuring biodiversity loss and dissipating impetus for action. [source]

What really happens with the electron gas in the famous Franck-Hertz experiment?
CONTRIBUTIONS TO PLASMA PHYSICS, Issue 3-4 2003. F. Sigeneger
Abstract: The interpretation of the anode current characteristics obtained in the famous Franck-Hertz experiment of 1914 led to the verification of Bohr's predictions of quantised atomic states. This fundamental experiment has often been repeated, and nowadays is generally part of the curriculum in modern physics education. However, the interpretation of the experiment is typically based upon significant simplifying assumptions, some quite unrealistic. This is the case especially in relation to the kinetics of the electron gas, which is in reality quite complex, due mainly to non-uniformities in the electric field, caused by a combination of accelerating and retarding components. This non-uniformity leads to a potential energy valley in which the electrons are trapped. The present state of understanding of such effects, and their influence upon the anode characteristics, is quite unsatisfactory. In this article a rigorous study of a cylindrical Franck-Hertz experiment is presented, using mercury vapour, the aim being to reveal and explain what really happens with the electrons under realistic experimental conditions. In particular, the anode current characteristics are investigated over a range of mercury vapour pressures appropriate to the experiment to clearly elaborate the effects of elastic collisions (ignored in typical discussions) on the power budget, and the trapping of electrons in the potential energy valley. [source]

Distribution of Aggregate Utility Using Stochastic Elements of Additive Multiattribute Utility Models
DECISION SCIENCES, Issue 2 2000. Herbert Moskowitz
Abstract: Conventionally, elements of a multiattribute utility model characterizing a decision maker's preferences, such as attribute weights and attribute utilities, are treated as deterministic, which may be unrealistic because assessment of such elements can be imprecise and erroneous, or differ among a group of individuals. Moreover, attempting to make precise assessments can be time-consuming and cognitively demanding. We propose to treat such elements as stochastic variables to account for inconsistency and imprecision in such assessments. Under these assumptions, we develop procedures for computing the probability distribution of aggregate utility for an additive multiattribute utility function (MAUF), based on the Edgeworth expansion. When the distributions of aggregate utility for all alternatives in a decision problem are known, stochastic dominance can then be invoked to filter inferior alternatives. We show that, under certain mild conditions, the aggregate utility distribution approaches normality as the number of attributes increases. Thus, only a few terms from the Edgeworth expansion with a standard normal density as the base function will be sufficient for approximating an aggregate utility distribution in practice. Moreover, the more symmetric the attribute utility distributions, the fewer the attributes needed to achieve normality. The Edgeworth expansion thus can provide a basis for a computationally viable approach for representing an aggregate utility distribution with imprecisely specified attribute weight and utility assessments (or differing weights and utilities across individuals). Practical guidelines for using the Edgeworth approximation are given. The proposed methodology is illustrated using a vendor selection problem. [source]
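A Monte Carlo sketch makes the central idea of the Moskowitz abstract concrete: with stochastic weights and attribute utilities, the aggregate utility of an additive MAUF has a distribution, and with several attributes that distribution is close to normal, so a short Edgeworth series suffices. All distributions and values below are illustrative assumptions, not the paper's procedure.

```python
# Stochastic additive MAUF: weights and single-attribute utilities are
# random to reflect imprecise assessment; the aggregate utility
# distribution is then summarized and compared with its normal
# approximation (the leading term of an Edgeworth expansion).
import numpy as np

rng = np.random.default_rng(0)
n_attr, n_draws = 6, 100_000

# Assumed imprecision: weights jittered around nominal values and
# renormalized; attribute utilities uniform within assessed bounds.
w_nom = np.array([0.25, 0.20, 0.20, 0.15, 0.10, 0.10])
w = np.abs(rng.normal(w_nom, 0.03, size=(n_draws, n_attr)))
w /= w.sum(axis=1, keepdims=True)
u = rng.uniform(0.4, 0.9, size=(n_draws, n_attr))

agg = (w * u).sum(axis=1)            # aggregate utility samples
mu, sd = agg.mean(), agg.std()
skew = ((agg - mu) ** 3).mean() / sd ** 3
print(f"mean={mu:.3f} sd={sd:.3f} skew={skew:.3f}")
# A small skew supports the paper's point: with several attributes the
# aggregate tends toward normality, so a short Edgeworth series (normal
# base density plus low-order corrections) is adequate in practice.
```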
Introduced species as evolutionary traps
ECOLOGY LETTERS, Issue 3 2005. Martin A. Schlaepfer
Abstract: Invasive species can alter environments in such a way that normal behavioural decision-making rules of native species are no longer adaptive. The evolutionary trap concept provides a useful framework for predicting and managing the impact of harmful invasive species. We discuss how native species can respond to changes in their selective regime via evolution or learning. We also propose novel management strategies to promote the long-term co-existence of native and introduced species in cases where the eradication of the latter is either economically or biologically unrealistic. [source]

THE FITNESS EFFECT OF MUTATIONS ACROSS ENVIRONMENTS: A SURVEY IN LIGHT OF FITNESS LANDSCAPE MODELS
EVOLUTION, Issue 12 2006. Guillaume Martin
Abstract: The fitness effects of mutations on a given genotype are rarely constant across environments to which this genotype is more or less adapted, that is, between more or less stressful conditions. This can have important implications, especially for the evolution of ecological specialization. Stress is thought to increase the variance of mutations' fitness effects, their average, or the number of expressed mutations. Although empirical evidence is available for these three mechanisms, their relative magnitude is poorly understood. In this paper, we propose a simple approach to discriminate between these mechanisms, using a survey of empirical measures of mutation effects in contrasted environments. This survey, across various species and environments, shows that stress mainly increases the variance of mutations' effects on fitness, with a much more limited impact on their average effect or on the number of expressed mutations. This pattern is consistent with a simple model in which fitness is a Gaussian function of phenotypes around an environmentally determined optimum. These results suggest that a simple, mathematically tractable landscape model may not be quantitatively as unrealistic as previously suggested. They also suggest that mutation parameter estimates may be strongly biased when measured in stressful environments. [source]
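The "Gaussian function of phenotypes around an optimum" model the survey invokes (essentially Fisher's geometric model) can be simulated in a few lines, with stress represented as the distance between the resident phenotype and the optimum. Parameter values below are arbitrary illustrations, not estimates from the paper.

```python
# Gaussian fitness landscape: mutations are random displacements in a
# multivariate phenotype space, fitness falls off as a Gaussian around
# an optimum, and "stress" is a larger resident-to-optimum distance.
import numpy as np

rng = np.random.default_rng(1)
n_traits, n_mut = 10, 50_000

def fitness(z):
    return np.exp(-0.5 * np.sum(z * z, axis=-1))

for distance in (0.0, 1.0, 2.0):          # 0 = well adapted, 2 = stressed
    z0 = np.zeros(n_traits)
    z0[0] = distance                       # displace resident from optimum
    dz = rng.normal(0, 0.1, size=(n_mut, n_traits))   # random mutations
    s = fitness(z0 + dz) / fitness(z0) - 1.0          # selection coefficients
    print(f"distance {distance}: mean(s)={s.mean():+.4f} var(s)={s.var():.5f}")
# Expected pattern: the variance of s grows with distance to the optimum
# while the mean effect changes far less, matching the survey's finding
# that stress mainly inflates the variance of mutational fitness effects.
```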
Region-specific assessment of greenhouse gas mitigation with different manure management strategies in four agroecological zones
GLOBAL CHANGE BIOLOGY, Issue 12 2009. SVEN G. SOMMER
Abstract: Livestock farming systems are major sources of trace gases contributing to emissions of the greenhouse gases (GHG) nitrous oxide (N2O) and methane (CH4); N2O accounts for 10% and CH4 for 30% of the anthropogenic contributions to net global warming. This paper presents scenario assessments of whole-system effects of technologies for reducing GHG emissions from livestock model farms using slurry-based manure management. Changes in housing and storage practice, mechanical separation, and incineration of the solid fraction derived from separation were evaluated in scenarios for Sweden, Denmark, France, and Italy. The results demonstrated that changes in manure management can induce significant changes in CH4 and N2O emissions and carbon sequestration, and that the effect of introducing environmental technologies may vary significantly with livestock farming practice and interact with climatic conditions. Shortening the in-house manure storage time reduced GHG emissions by 0–40%. The largest GHG reductions, of 49 to, in one case, 82%, were obtained with a combination of slurry separation and incineration, the latter process contributing to a positive GHG balance of the system by substituting fossil fuels. The amount and composition of volatile solids (VS) and nitrogen pools were main drivers in the calculations performed, and requirements to improve the assessment of VS composition and turnover during storage and in the field were identified. Nevertheless, the results clearly showed that GHG emission estimates will be unrealistic if the assumed manure management or climatic conditions do not properly represent a given country or region. The results also showed that the mitigation potential of specific manure management strategies and technologies varied depending on current management and climatic conditions. [source]

Assessing the impact of mixing assumptions on the estimation of streamwater mean residence time
HYDROLOGICAL PROCESSES, Issue 12 2010. Fabrizio Fenicia
Abstract: Catchment streamwater mean residence time (Tmr) is an important descriptor of hydrological systems, reflecting their storage and flow pathway properties. Tmr is typically inferred from the composition of stable water isotopes (oxygen-18 and deuterium) in observed rainfall and discharge. Currently, lumped parameter models based on convolution and sine-wave functions are usually used for tracer simulation. These traditional models are based on simplistic assumptions that are often known to be unrealistic, in particular steady flow conditions, linearity, complete mixing and others. However, the effect of these assumptions on Tmr estimation is seldom evaluated. In this article, we build a conceptual model that overcomes several assumptions made in traditional mixing models. Using data from the experimental Maimai catchment (New Zealand), we compare a complete-mixing (CM) model, where rainfall water is assumed to mix completely and instantaneously with the total catchment storage, with a partial-mixing (PM) model, where the tracer input is divided between an 'active' and a 'dead' storage compartment. We show that the inferred distribution of Tmr is strongly dependent on the treatment of mixing processes and flow pathways. The CM model returns estimates of Tmr that are well identifiable and are in general agreement with previous studies of the Maimai catchment. On the other hand, the PM model, motivated by a priori catchment insights, provides Tmr estimates that appear exceedingly large and highly uncertain. This suggests that water isotope composition measurements in rainfall and discharge alone may be insufficient for inferring Tmr. Given our model hypothesis, we also analysed the effect of different controls on Tmr. It was found that Tmr is controlled primarily by the storage properties of the catchment, rather than by the speed of streamflow response. This provides guidance on the type of information necessary to improve Tmr estimation. Copyright © 2010 John Wiley & Sons, Ltd. [source]

Concentration–discharge relationships reflect chemostatic characteristics of US catchments
HYDROLOGICAL PROCESSES, Issue 13 2009. Sarah E. Godsey
Abstract: Concentration–discharge relationships have been widely used as clues to the hydrochemical processes that control runoff chemistry. Here we examine concentration–discharge relationships for solutes produced primarily by mineral weathering in 59 geochemically diverse US catchments. We show that these catchments exhibit nearly chemostatic behaviour; their stream concentrations of weathering products such as Ca, Mg, Na, and Si typically vary by factors of only 3 to 20 while discharge varies by several orders of magnitude. Similar patterns are observed at the inter-annual time scale.
This behaviour implies that solute concentrations in stream water are not determined by simple dilution of a fixed solute flux by a variable flux of water, and that rates of solute production and/or mobilization must be nearly proportional to water fluxes, on both storm and inter-annual timescales. We compared these catchments' concentration–discharge relationships to the predictions of several simple hydrological and geochemical models. Most of these models can be forced to approximately fit the observed concentration–discharge relationships, but often only by assuming unrealistic or internally inconsistent parameter values. We propose a new model that also fits the data and may be more robust. We suggest possible tests of the new model for future studies. The relative stability of concentration under widely varying discharge may help make aquatic environments habitable. It also implies that fluxes of weathering solutes in streams, and thus fluxes of alkalinity to the oceans, are determined primarily by water fluxes. Thus, hydrology may be a major driver of the ocean-alkalinity feedback regulating climate change. Copyright © 2009 John Wiley & Sons, Ltd. [source]
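A common way to quantify the chemostatic behaviour described above is the exponent b of a power law C = a*Q^b fitted in log-log space: b near 0 means chemostatic, b = -1 means pure dilution of a fixed solute flux. The sketch below uses synthetic data for illustration; it is not the authors' analysis.

```python
# Fit the log-log slope b of C = a * Q**b for a nearly chemostatic solute.
import numpy as np

rng = np.random.default_rng(2)
Q = 10 ** rng.uniform(-1, 2, 500)                     # discharge over 3 orders
C = 12.0 * Q ** -0.05 * rng.lognormal(0, 0.1, 500)    # near-chemostatic Ca (mg/L)

b, log_a = np.polyfit(np.log10(Q), np.log10(C), 1)
print(f"b = {b:+.3f}  (|b| << 1: chemostatic; b = -1: simple dilution)")
# With b = -0.05, a 1000-fold change in Q moves C by only a factor ~1.4,
# consistent with the 3- to 20-fold concentration ranges reported above.
```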
Withdrawal behavior and depression in infancy
INFANT MENTAL HEALTH JOURNAL, Issue 4 2007. Antoine Guedeney
Abstract: This paper describes the history of the concept of infant depression, which stood at the beginning of the discipline of infant mental health, and reviews classification and diagnosis issues, along with some animal models. Several diagnostic criteria have yielded different prevalence rates, some of them unrealistic, but we still do not know when infant depression begins, what its outcome is, and what its different aspects are. It is suggested that infant depression needs a certain amount of emotional and cognitive development to unfold, and that it might not exist before 18–24 months of age, a crossover during which major autoreflexive, cognitive, and emotional abilities emerge. Depression could be an outcome of attachment disorganization in infancy, as depression and disorganization seem to share the same learned-helplessness psychopathological process. Developmental psychopathology considers disorder more from a dimensional point of view than from a categorical one, and more as the result of several factors with a sequential action than as the effect of a genetic disorder with direct expression. Before the limit of 18–24 months, the concept of relational withdrawal seems more applicable and useful. [source]

DEM analysis of bonded granular geomaterials
INTERNATIONAL JOURNAL FOR NUMERICAL AND ANALYTICAL METHODS IN GEOMECHANICS, Issue 17 2008. S. Utili
Abstract: In this paper, the application of the distinct element method (DEM) to frictional cohesive (c, φ) geomaterials is described. A new contact bond model based on the Mohr–Coulomb failure criterion has been implemented in PFC2D. According to this model, the bond strength can be clearly divided into two distinct micromechanical contributions: an intergranular friction angle and a cohesive bond force. A parametric analysis, based on several biaxial tests, has been run to validate the proposed model and to calibrate the micromechanical parameters. Simple relationships between the macromechanical strength parameters (c, φ) and the corresponding micromechanical quantities have been obtained so that they can be used to model boundary value problems with the DEM without need of further calibration. As an example application, the evolution of natural cliffs subject to weathering has been studied. Different weathering scenarios have been considered for an initially vertical cliff. Firstly, the case of uniform weathering was studied. Although unrealistic, this case was considered in order to validate the DEM approach by comparison against analytical predictions available from limit analysis. Secondly, non-uniform weathering was studied. The results obtained clearly show that with the DEM it is possible to realistically model boundary value problems of bonded geomaterials, which would be overwhelmingly difficult to do with other numerical techniques. Copyright © 2008 John Wiley & Sons, Ltd. [source]
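The two micromechanical contributions named in the abstract (an intergranular friction angle and a cohesive bond force) combine in a Mohr–Coulomb bond check of the following general form. This is a generic illustration with assumed values and bookkeeping, not PFC2D's internal implementation.

```python
# Mohr-Coulomb contact bond: shear capacity = cohesive bond force plus
# Coulomb friction on the normal force; the bond breaks when either the
# shear or the tensile limit is exceeded.
import math

def bond_intact(f_n: float, f_s: float, c_bond: float, phi_deg: float,
                t_bond: float) -> bool:
    """f_n > 0 compression; f_s shear magnitude; c_bond cohesive bond force;
    phi_deg intergranular friction angle; t_bond tensile bond strength."""
    if f_n < -t_bond:                      # tensile failure of the bond
        return False
    shear_capacity = c_bond + max(f_n, 0.0) * math.tan(math.radians(phi_deg))
    return f_s <= shear_capacity           # Mohr-Coulomb shear check

# Example: 1 kN cohesive bond, 30 deg friction, 0.5 kN tensile strength.
print(bond_intact(f_n=2.0, f_s=2.1, c_bond=1.0, phi_deg=30.0, t_bond=0.5))
# 2.1 <= 1.0 + 2.0 * tan(30 deg) = 2.15 kN -> True (bond holds)
```

Calibrating c_bond and phi_deg against biaxial tests until the assembly reproduces target macroscopic (c, φ) values is exactly the parametric step the abstract describes.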
Constitutive model for quasi-static deformation of metallic sandwich cores
INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN ENGINEERING, Issue 13 2004. Zhenyu Xue
Abstract: All-metal sandwich construction holds promise for significant improvements in stiffness, strength and blast resistance for built-up plate structures. Analysis of the performance of sandwich plates under various loads, static and dynamic, requires modelling of face sheets and core with some fidelity. While it is possible to model full geometric details of the core for a few selected problems, this is unnecessary and unrealistic for larger complex structures under general loadings. In this paper, a continuum constitutive model is proposed as an alternative means of modelling the core. The constitutive model falls within the framework of a compressible, rate-independent, anisotropic elastic–plastic solid. The general form of the model is presented, along with algorithmic aspects of its implementation in a finite element code, and selected problems are solved which benchmark the code against existing codes for limiting cases and which illustrate features specific to compressible cores. Three core geometries (pyramidal truss, folded plate, and square honeycomb) are considered in some detail. The validity of the approach is established by comparing numerical finite element simulations using the model with those obtained by a full three-dimensional meshing of the core geometry for each of the three types of cores for a clamped sandwich plate subject to uniform pressure load. Limitations of the model are also discussed. Copyright © 2004 John Wiley & Sons, Ltd. [source]

An adaptive clinical Type 1 diabetes control protocol to optimize conventional self-monitoring blood glucose and multiple daily-injection therapy
INTERNATIONAL JOURNAL OF ADAPTIVE CONTROL AND SIGNAL PROCESSING, Issue 5 2009. Xing-Wei Wong
Abstract: The objective of this study was to develop a safe, robust and effective protocol for the clinical control of Type 1 diabetes using conventional self-monitoring blood glucose (SMBG) measurements and multiple daily injection (MDI) with insulin analogues. A virtual patient method is used to develop an in silico simulation tool for Type 1 diabetes using data from a Type 1 diabetes patient cohort (n=40). The tool is used to test two prandial insulin protocols, an adaptive protocol (AC) and a conventional intensive insulin therapy (IIT) protocol (CC), against results from a representative control cohort as a function of SMBG frequency. With the prandial protocols, optimal and suboptimal basal insulin replacement using a clinically validated, forced-titration regimen is also evaluated. A Monte Carlo (MC) analysis using variability and error distributions derived from the clinical and physiological literature is used to test efficacy and robustness. MC analysis is performed for over 1,400,000 simulated patient hours. All results are compared with control data from which the virtual patients were derived. In conditions of suboptimal basal insulin replacement, the AC protocol significantly decreases HbA1c for SMBG frequencies ≥6/day compared with controls and the CC protocol. With optimal basal insulin, mild and severe hypoglycaemia is reduced by 86–100% over controls for all SMBG frequencies. Control with the CC protocol and suboptimal basal insulin replacement saturates at an SMBG frequency of 6/day. The forced-titration regimen requires a minimum SMBG frequency of 6/day to prevent increased hypoglycaemia. Overaggressive basal dose titration with the CC protocol at lower SMBG frequencies is likely caused by uncorrected postprandial hyperglycaemia from the previous night. From the MC analysis, a defined peak in control is achieved at an SMBG frequency of 8/day. However, 90% of the cohort meets American Diabetes Association recommended HbA1c with just 2 measurements a day; a further 7.5% requires 4 measurements a day and only 2.5% (1 patient) requires 6 measurements a day. In safety, the AC protocol is the most robust to applied MC error. Over all SMBG frequencies, the median for severe hypoglycaemia increases from 0 to 0.12%, and that for mild hypoglycaemia by 0–5.19%, compared with the unrealistic no-error simulation. While statistically significant, these figures are still very low and the distributions are well below those of the control group. An adaptive control protocol for Type 1 diabetes is tested in silico under conditions of realistic variability and error. The adaptive (AC) protocol is effective and safe compared with conventional IIT (CC) and controls. As the fear of hypoglycaemia is a large psychological barrier to appropriate glycaemic control, adaptive model-based protocols may represent the next evolution of IIT to deliver increased glycaemic control with increased safety over conventional methods, while still utilizing the most commonly used forms of intervention (SMBG and MDI). The use of MC methods to evaluate them provides a relevant robustness test that is not considered in the no-error analyses of most other studies. Copyright © 2008 John Wiley & Sons, Ltd. [source]
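The Monte Carlo methodology the abstract describes can be illustrated generically: draw measurement errors from assumed distributions, push them through a dosing rule, and tally hypoglycaemia rates with and without error. The rule and every number below are hypothetical placeholders, not the paper's validated protocol; the point is the shape of the robustness test.

```python
# Generic MC robustness sketch: SMBG meter error perturbs a simple
# (hypothetical) correction-dose rule, and hypo rates are tallied.
import numpy as np

rng = np.random.default_rng(3)
n = 100_000
true_bg = rng.normal(8.5, 2.5, n).clip(2.0, 20.0)    # mmol/L before a meal
measured = true_bg * rng.normal(1.0, 0.07, n)        # assumed ~7% CV meter error

# Hypothetical rule: 1 U insulin per mmol/L above 7 mmol/L, capped at 8 U.
dose = np.clip(measured - 7.0, 0.0, 8.0)
post = true_bg - dose * 1.2 + rng.normal(0, 1.0, n)  # assumed 1.2 mmol/L per U

print("severe hypo (<2.2 mmol/L):", f"{np.mean(post < 2.2):.2%}")
print("mild hypo   (<3.3 mmol/L):", f"{np.mean(post < 3.3):.2%}")
# Re-running with the error terms removed separates the rule's intrinsic
# risk from the risk added by measurement noise -- the comparison the
# abstract makes against its "unrealistic no-error simulation".
```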
A review on coal-to-liquid fuels and its coal consumption
INTERNATIONAL JOURNAL OF ENERGY RESEARCH, Issue 10 2010. Mikael Höök
Abstract: Continued reliance on oil is unsustainable, and this has resulted in interest in alternative fuels. Coal-to-liquids (CTL) can supply liquid fuels and has been successfully used in several cases, particularly in South Africa. This article reviews CTL theory and technology. Understanding the fundamental aspects of coal liquefaction technologies is vital for planning and policy-making, as future CTL systems will be integrated in a much larger global energy and fuel utilization system. Conversion ratios for CTL are generally estimated to be between 1 and 2 barrels/ton coal. This puts a strict limitation on future CTL capacity imposed by future coal production volumes, regardless of other factors such as economics, emissions or environmental concerns. Assuming that 10% of world coal production can be diverted to CTL, the contribution to liquid fuel supply will be limited to only a few million barrels per day. This prevents CTL from becoming a viable mitigation plan for liquid fuel shortage on a global scale. However, it is still possible for individual nations to derive significant shares of their fuel supply from CTL, but those nations must also have access to equally significant coal production capacities. It is unrealistic to claim that CTL provides a feasible solution to liquid fuels shortages created by peak oil. For the most part, it can only be a minor contributor and must be combined with other strategies. Copyright © 2009 John Wiley & Sons, Ltd. [source]
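The capacity argument is simple arithmetic and worth making explicit. World coal production is taken here as roughly 7 Gt/yr, a round figure supplied for illustration rather than taken from the review:

```python
# Back-of-envelope version of the review's capacity argument.
world_coal_t = 7.0e9                    # tonnes per year (assumed round figure)
diverted = 0.10 * world_coal_t          # 10% of production diverted to CTL
for bbl_per_t in (1.0, 2.0):            # conversion ratio range from the review
    mbpd = diverted * bbl_per_t / 365 / 1e6
    print(f"{bbl_per_t:.0f} bbl/t -> {mbpd:.1f} million barrels/day")
# Roughly 1.9-3.8 Mb/d, i.e. a few percent of world oil demand of the
# order of 85-90 Mb/d -- which is why the review calls CTL a minor
# contributor at the global scale.
```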
Evaluation of NOC Measures in Home Care Nursing Practice
INTERNATIONAL JOURNAL OF NURSING TERMINOLOGIES AND CLASSIFICATION, Issue 2003. Gail M. Keenan
PURPOSE: To evaluate the reliability, validity, usefulness, and sensitivity of 89 NOC outcomes in two Visiting Nurse Associations in Michigan.
METHODS: Of a total of 190 NOC outcomes, 89 were assigned for testing. Interrater reliability and criterion validity were assessed a total of 50 times per outcome (on 50 different patients) across the study units. The total number of times the reliability and validity were assessed for each of the 89 measures studied ranged from 5 to 45. Three RN research assistants (RNRAs) oversaw and participated in data collection with the help of 15 clinicians. Convenience sampling was used to identify subjects. A roster of outcomes to be studied was maintained and matched with patient conditions whenever possible until the quota of outcomes assigned had been evaluated. Clinicians and RNRAs independently rated the outcomes and indicators applicable to the patient. NANDA diagnoses, NIC interventions, and medical diagnoses were recorded.
FINDINGS: A total of 258 patients (mean age 62) enrolled; 60% were women, 23% were from minority groups, and 78% had no college degree. Thirty-six of the 89 NOC measures were designated "clinically useful." The 10 outcomes with the highest interrater reliability were Caregiver Home Care Readiness; Caregiver Stressors; Caregiving Endurance Potential; Infection Status; Mobility Level; Safety Status: Physical Injury; Self-Care: Activities of Daily Living; Self-Care: Bathing; Self-Care: Hygiene; and Wound Healing: Secondary Intention. Criterion measurement and repeated ratings provided evidence to support the validity and sensitivity of the NOC outcomes. Evidence also suggested that NOC label level ratings could be a feasible, reliable, and valid method of evaluating nursing outcomes under actual use. For some measures, adjustments in the scales and anchors are needed to enhance reliability. For others, it may be unrealistic to score reliably in one encounter, so scoring should be deferred until the clinician has adequate knowledge of the patient.
CONCLUSIONS: Continued study and refinement that are coordinated and systematically integrated are strongly recommended. Comprehensive study in an automated system with a controlled format will increase the efficiency of future studies. [source]

Choanoflagellates, choanocytes, and animal multicellularity
INVERTEBRATE BIOLOGY, Issue 1 2004. Manuel Maldonado
Abstract: It is widely accepted that multicellular animals (metazoans) constitute a monophyletic unit, deriving from ancestral choanoflagellate-like protists that gave rise to simple choanocyte-bearing metazoans. However, a re-assessment of molecular and histological evidence on choanoflagellates, sponge choanocytes, and other metazoan cells reveals that the status of choanocytes as a fundamental cell type in metazoan evolution is unrealistic. Rather, choanocytes are specialized cells that develop from non-collared ciliated cells during sponge embryogenesis. Although choanocytes of adult sponges have no obvious homologue among metazoans, larval cells transdifferentiating into choanocytes at metamorphosis do have such homologues. The evidence reviewed here also indicates that sponge larvae are architecturally closer than adult sponges to the remaining metazoans. This may mean that the basic multicellular organismal architecture from which diploblasts evolved, that is, the putative planktonic archimetazoan, was more similar to a modern poriferan larva lacking choanocytes than to an adult sponge. Alternatively, it may mean that other metazoans evolved from a neotenous larva of ancient sponges. Indeed, the Porifera possess some features of intriguing evolutionary significance: (1) widespread occurrence of internal fertilization and a notable diversity of gastrulation modes; (2) dispersal through architecturally complex lecithotrophic larvae, in which an ephemeral archenteron (in dispherula larvae) and multiciliated and syncytial cells (in trichimella larvae) occur; (3) acquisition of direct development by some groups; and (4) replacement of choanocyte-based filter-feeding by carnivory in some sponges. Together, these features strongly suggest that the Porifera may have a longer and more complicated evolutionary history than traditionally assumed, and also that the simple anatomy of modern adult sponges may have resulted from a secondary simplification. This makes the idea of a neotenous evolution less likely than that of a larva-like choanocyte-lacking archimetazoan. From this perspective, the view that choanoflagellates may be simplified sponge-derived metazoans, rather than protists, emerges as a viable alternative hypothesis. This idea neither conflicts with the available evidence nor can be disproved by it, and must be specifically re-examined by further approaches combining morphological and molecular information. Interestingly, several microbial lineages lacking choanocyte-like morphology, such as Corallochytrea, Cristidiscoidea, Ministeriida, and Mesomycetozoea, have recently been placed at the boundary between fungi and animals, becoming a promising source of information, in addition to the choanoflagellates, in the search for the unicellular origin of animal multicellularity. [source]

X-ray diffraction analysis of stacking and twin faults in f.c.c. metals: a revision and allowance for texture and non-uniform fault probabilities
JOURNAL OF APPLIED CRYSTALLOGRAPHY, Issue 2 2000. L. Velterop
Abstract: A revision is presented of the original description by Warren [X-ray Diffraction (1969), pp. 275–298. Massachusetts: Addison-Wesley] of the intensity distribution of powder-pattern reflections from f.c.c. metal samples containing stacking and twin faults. The assumptions (in many cases unrealistic) that fault probabilities need to be very small and equal for all fault planes, and that the crystallites in the sample have to be randomly oriented, have been removed. To elucidate the theory, a number of examples are given, showing how stacking and twin faults change the shape and position of diffraction peaks. It is seen that significant errors may arise from Warren's assumptions, especially in the peak maximum shift. Furthermore, it is explained how to describe powder-pattern reflections from textured specimens and specimens with non-uniform fault probabilities.
Finally, it is discussed how stacking- and twin-fault probabilities (and crystallite sizes) can be determined from diffraction line-profile measurements. [source]

The continuous smooth hockey stick: a newly proposed spawner-recruitment model
JOURNAL OF APPLIED ICHTHYOLOGY, Issue 6 2008. R. Froese
Summary: Spawner-recruit relationships are important components of fisheries management. The two most widely used models have been criticized for unsatisfactory fits and biologically unreasonable extrapolations. A simple hockey stick model has been shown to provide more robust predictions; however, this model is not widely used, possibly because the abrupt change from density-dependence to density-independence is unrealistic and the piecewise model is difficult to fit. Here I present a continuous two-parameter model that resembles a smoothed hockey stick and provides parameter estimates similar to the piecewise hockey stick. The new model is easily parameterized with regular curve-fitting routines. [source]
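A generic way to obtain the kind of continuous two-parameter curve described above is to smooth the piecewise hockey stick R = R_max * min(S/S_hinge, 1) with a soft minimum. The sketch below is one illustrative smoothing, not necessarily the exact functional form proposed in the paper.

```python
# Smooth hockey stick via a log-sum-exp "soft minimum" of (S/S_hinge, 1).
import numpy as np

def smooth_hockey_stick(S, r_max, s_hinge, k=10.0):
    """Recruitment vs spawner biomass; k controls corner sharpness."""
    x = np.asarray(S, dtype=float) / s_hinge
    soft_min = -np.log(np.exp(-k * x) + np.exp(-k)) / k   # ~min(x, 1)
    return r_max * soft_min

S = np.array([0.1, 0.5, 1.0, 2.0, 5.0])   # spawner biomass, in s_hinge units
print(smooth_hockey_stick(S, r_max=1000.0, s_hinge=1.0))
# Density-dependent rise below the hinge, a smooth bend near S = s_hinge,
# and density-independent (flat) recruitment above it. Being smooth and
# differentiable everywhere, the curve fits with ordinary least-squares
# routines -- the practical obstacle the piecewise model poses.
```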
Offspring-driven local dispersal in female sand lizards (Lacerta agilis)
JOURNAL OF EVOLUTIONARY BIOLOGY, Issue 6 2004. K. Ryberg
Abstract: We report on a field study in which determinants of female breeding dispersal (i.e. the shift in the mean home range coordinates between successive breeding events) were investigated. Offspring were released in full-sib groups (or half-sib ones if there was within-clutch multiple paternity) at a separation distance from the females that varied between 'families'. This allowed for analysis of 'offspring nearness' effects on maternal dispersal. When a female's offspring were released more closely to her, she responded with greater dispersal. Furthermore, when the data set was truncated at 100 m maternal–offspring separation distance at offspring release (because perception at longer distances is likely to be unrealistic), maternal dispersal resulted in greater separation distance between female and offspring in the following year. A corresponding analysis for juveniles revealed no effect of maternal nearness on offspring dispersal but identified a significant effect of clutch size, to our surprise with dispersal declining with increasing clutch size. We discuss this result in the context of the 'public information hypothesis' (reinterpreted for juveniles in a non-social foraging species), suggesting that conspecific abundance perhaps acts as an indicator of local habitat quality. Thus, our analysis suggests a microgeographic structuring of the adult female population driven by genetic factors, either through inbreeding avoidance or from simply avoiding individuals with a similar genotype regardless of their pedigree relatedness, while a non-genetic factor seems more important in their offspring. [source]

Decisional autonomy of planetary rovers
JOURNAL OF FIELD ROBOTICS (FORMERLY JOURNAL OF ROBOTIC SYSTEMS), Issue 7 2007. Félix Ingrand
Abstract: To achieve the ever increasing demand for science return, planetary exploration rovers require more autonomy to successfully perform their missions. Indeed, the communication delays are such that teleoperation is unrealistic. Although the current rovers (such as MER) demonstrate a limited navigation autonomy, and mostly rely on ground mission planning, the next generation (e.g., NASA Mars Science Laboratory and ESA ExoMars) will have to regularly achieve long-range autonomous navigation tasks. However, fully autonomous long-range navigation in partially known planetary-like terrains is still an open challenge for robotics. Navigating hundreds of meters without any human intervention requires the robot to be able to build adequate representations of its environment, to plan and execute trajectories according to the kind of terrain traversed, to control its motions, and to localize itself as it moves. All these activities have to be planned, scheduled, and performed according to the rover context, and controlled so that the mission is correctly fulfilled. To achieve these objectives, we have developed a temporal planner and an execution controller, which exhibit plan repair and replanning capabilities. The planner is in charge of producing plans composed of actions for navigation, science activities (moving and operating instruments), and communication with Earth and with an orbiter or a lander, while managing resources (power, memory, etc.) and respecting temporal constraints (communication visibility windows, rendezvous, etc.). High-level actions also need to be refined and their execution temporally and logically controlled. Finally, in such critical applications, we believe it is important to deploy a component that protects the system against dangerous or even fatal situations resulting from unexpected interactions between subsystems (e.g., moving the robot while the robot arm is unstowed) and/or software components (e.g., taking and storing a picture in a buffer while the previous one is still being processed). In this article we review the aforementioned capabilities, which have been developed, tested, and evaluated on board our rovers (Lama and Dala). After an overview of the architecture design principle adopted, we summarize the perception, localization, and motion generation functions required by autonomous navigation, and their integration and concurrent operation in a global architecture. We then detail the decisional components: a high-level temporal planner that produces the robot activity plan on board, and temporal and procedural execution controllers. We show how some failures or execution delays are taken care of with online local repair or replanning. © 2007 Wiley Periodicals, Inc. [source]

PRESERVATION OF COMMERCIAL FISH BALL QUALITY WITH EDIBLE ANTIOXIDANT-INCORPORATED ZEIN COATINGS
JOURNAL OF FOOD PROCESSING AND PRESERVATION, Issue 5 2009. LIAN-SYUN LIN
ABSTRACT: Fish ball, a surimi product rich in lipid and protein, is a popular food in Taiwan. Because lipid oxidation is one of the major deterioration reactions for fish ball, the feasibility of preserving fish ball quality by the application of antioxidant-incorporated zein coatings was investigated. Three antioxidants, including butylated hydroxyanisole (BHA), butylated hydroxytoluene (BHT) and n-propyl gallate (PG), were used to formulate the antioxidant zein coatings. Infrared spectroscopy was used to confirm the successful incorporation of antioxidant with zein protein; peroxide value (POV), thiobarbituric acid reactive substances (TBARS) and weight loss were used as the quality indicators of fish ball stored at 4°C. While all three types of antioxidant-incorporated zein coatings significantly retarded the quality deterioration, PG-incorporated zein coating exerted better quality preservation effectiveness than BHA- and BHT-incorporated zein coatings.
PRACTICAL APPLICATIONS: Edible coatings have been under research for several decades. However, most studies investigate physicochemical or mechanical properties, usually using simulated food systems.
The lack of applications to commercial food products manufactured in food plants makes edible coatings seem somewhat unrealistic. The fish ball used in the present study was not prepared in a laboratory for academic purposes only, but was a real commercial product. The promising results of antioxidant-incorporated zein coatings on commercial products presented in this report should enhance food manufacturers' confidence in edible coatings. [source]

Virtual colonoscopy: Issues in implementation
JOURNAL OF MEDICAL IMAGING AND RADIATION ONCOLOGY, Issue 1 2005. R Mendelson
Summary: The following issues and requirements related to the implementation of a CT colonography (CTC) service are important: (i) policies are needed regarding the indications for CTC. Concomitant with this is the need for education of potential referrers and patients. Expectations of the procedure, particularly by general practitioners, may be unrealistic, and indications for referral may otherwise be inappropriate. At present there is not general acceptance of CTC for screening asymptomatic persons; (ii) a flexible approach to CT protocols is useful, dependent on the indication for and clinical context of referral, and the age and body habitus of the patient; (iii) attention must be paid to the special skills required by the reporting radiologist. While there is a temptation to regard CTC interpretation as an extension of skills used in interpreting other cross-sectional images, there is a need to realise that there are skills specific to CTC, and there should be adequate provision for training; (iv) matters related to reporting, such as reporting format and lesions that will or will not be reported; and (v) informed consent from the patient. Information should be provided with regard to the limitations of CTC, the implications of a positive finding, and radiation dosage. [source]
A query language for discovering semantic associations, Part I: Approach and formal definition of query primitives
JOURNAL OF THE AMERICAN SOCIETY FOR INFORMATION SCIENCE AND TECHNOLOGY, Issue 11 2007. Timo Niemi
Abstract: In contemporary query languages, the user is responsible for navigation among semantically related data. Because of the huge amount of data and the complex structural relationships among data in modern applications, it is unrealistic to suppose that the user could completely know the content and structure of the available information. There are several query languages whose purpose is to facilitate navigation in unknown database structures. However, the background assumption of these languages is that the user knows how data are related to each other semantically in the structure at hand. So far only little attention has been paid to how unknown semantic associations among available data can be discovered. We address this problem in this article. A semantic association between two entities can be constructed if a sequence of relationships expressed explicitly in a database can be found that connects these entities to each other. This sequence may contain several other entities through which the original entities are connected to each other indirectly. We introduce an expressive and declarative query language for discovering semantic associations. Our query language is able, for example, to discover semantic associations between entities for which only some of the characteristics are known. Further, it integrates the manipulation of semantic associations with the manipulation of documents that may contain information on entities in semantic associations. [source]
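At its core, the discovery operation this abstract formalizes amounts to finding chains of explicitly stored relationships connecting two entities, possibly through intermediates. A plain breadth-first search over a toy entity–relationship graph shows the idea; the schema and data are invented for illustration and are not the paper's query primitives.

```python
# BFS enumeration of relationship paths that semantically associate
# two entities, up to a maximum chain length.
from collections import deque

edges = {   # entity -> [(relationship, entity), ...]
    "Alice":   [("works_at", "AcmeLab")],
    "AcmeLab": [("published", "Paper1"), ("employs", "Bob")],
    "Paper1":  [("cites", "Paper2")],
    "Bob":     [("authored", "Paper2")],
    "Paper2":  [],
}

def associations(start, goal, max_len=4):
    """Yield relationship paths start -> goal of at most max_len hops."""
    queue = deque([(start, [])])
    while queue:
        node, path = queue.popleft()
        if node == goal and path:
            yield path
            continue
        if len(path) >= max_len:
            continue
        for rel, nxt in edges.get(node, []):
            queue.append((nxt, path + [(node, rel, nxt)]))

for p in associations("Alice", "Paper2"):
    print(" -> ".join(f"{a}-[{r}]-{b}" for a, r, b in p))
# Prints both chains associating Alice with Paper2: via the lab's
# publication record and via a colleague's authorship.
```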
SUSTAINING LOCAL WATERSHED INITIATIVES: LESSONS FROM LANDCARE AND WATERSHED COUNCILS
JOURNAL OF THE AMERICAN WATER RESOURCES ASSOCIATION, Issue 5 2002. Allan Curtis
ABSTRACT: In the last decade, watershed groups (WG) established through government initiatives have become an important part of the natural resource management landscape in developed economies. In this paper, the authors reflect upon their research and experience with Landcare in Victoria, and to a lesser extent with Watershed Councils in Oregon, to identify the principles that appear fundamental to sustaining effective WG. In the first instance, these groups must be established at a local scale using social as well as biophysical boundaries. It is also critical that WG are embedded within a supportive institutional framework that identifies realistic roles for private landowners, local organizations such as WG, and regional planning bodies. Without broad stakeholder representation, the perceived benefits of participation are quickly forfeited. It is simply unrealistic to expect an effective network of WG to be sustained without substantial investment by government to provide for program management, group coordination, and cost sharing for on-ground work. There must also be the commitment and skills within a program to establish processes that build trust and competency amongst citizens and agencies. These principles should also provide a foundation for the critical evaluation of WG programs. [source]

The judicial duty to give reasons
LEGAL STUDIES, Issue 1 2000. H L Ho
Abstract: The desirability of having a general duty to give reasons for court decisions has been much debated in Commonwealth jurisdictions. In England, a series of recent cases has consistently upheld the duty, albeit with qualifications. The existence of this general duty is defensible in principle. However, exactly what is required to comply with the duty is not clear. The explanation the judge is expected to give may be analysed in terms of its structure, contents and standard. These aspects are dependent on many factors, such as the rationale underlying the duty, the limitations faced by the judicial system, the nature of the decision-making process, and the significance of the decision. While one can identify the major considerations that operate at a general level, the scope and extent of the duty to explain a particular decision are dependent on the circumstances of the case. This variability makes it difficult to be certain as to when a breach of the duty has occurred. The duty must meet the purposes for which it is imposed and at the same time must not be too unrealistic in its demands. [source]

Showing and telling: The Difference that makes a Difference
LITERACY, Issue 3 2001. David Lewis
Abstract: In this article I attempt to clarify an essential difference between the ways in which pictures and words convey meaning. Despite the fact that the distinction between showing and telling is widely understood and clearly marked in ordinary language, it is often ignored when writers and researchers provide accounts of how children's picturebooks work. As a result, such accounts are often unrealistic, providing distorted images of picturebook text. I briefly examine one such attempt to differentiate and characterise various types of picturebook and then conclude by showing how Anthony Browne exploits the distinction between showing and telling to create the atmosphere of uncertainty and mystery in his classic book Gorilla. [source]

Class in the Classroom: Engaging Hidden Identities
METAPHILOSOPHY, Issue 4 2001. Peter W. Wakefield
Abstract: Using Marcuse's theory of the total mobilization of advanced technological society along the lines of what he calls "the performance principle," I attempt to describe the complex composition of class oppression in the classroom. Students conceive of themselves as economic units, customers pursuing neutral interests in a morally neutral, socio-economic system of capitalist competition. The classic, unreflective conception of the classroom responds to this by implicitly endorsing individualism and ideals of humanist citizenship. While racism and cultural diversity have come to count as elements of liberal intelligence in most college curricula, attempts to theorize these aspects of social and individual identity and place them in the broader context of class appear radical and inconsistent with the humanistic notion that we all have control over who we are and what we achieve. But tags such as "radical" and "unrealistic" mark a society based on the performance principle. Marcuse allows us to recognize a single author behind elements of psychology, metaphysics, and capitalism. The fact that bell hooks hits upon a similar notion suggests that we might use Marcuse's theory of the truly liberatory potential of imagination to transform and reconceive our classrooms so that the insidious effects of class, racism, and individualistic apathy might be subverted. Specifically, I outline and place into this theoretical context three concrete pedagogical practices: (a) the use of the physical space of the classroom; (b) the performance of community through group readings and short full-class ceremonies; and (c) the symbolic modeling represented by interdisciplinary approaches to teaching. All three of these practices engage students in ways that co-curricularly subvert class (and, incidentally, race divisions) and allow students to imagine, and so engage in, political action for justice as they see it. [source]

Wage Hikes as Supply and Demand Shock
METROECONOMICA, Issue 4 2003. Jürgen Jerger
ABSTRACT: Wage hikes affect production costs and hence are usually analysed as supply shocks. There is a long-standing debate, however, about demand effects of wage variations. In this paper, we bring together these two arguments in a Kaldorian model with group-specific saving rates and a production technology that allows for redistribution between workers and entrepreneurs following a wage hike. We thereby pinpoint the conditions under which (a) wage variations affect aggregate demand and (b) the positive demand effects of wage hikes may even overcompensate the negative supply effects on aggregate employment (the 'purchasing power argument'). We conclude by noting that, whereas demand effects are very likely to occur, the conditions under which the purchasing power argument does indeed hold are very unrealistic. [source]
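The demand channel of the Jerger abstract can be made concrete with a toy Kaldorian calculation using group-specific saving rates. All numbers below are illustrative assumptions, not the paper's calibration.

```python
# Toy Kaldorian demand effect: a wage hike shifts income toward workers,
# who save less than profit earners, so aggregate consumption rises.
s_w, s_p = 0.10, 0.40          # saving rates: workers vs profit earners
income = 100.0
wage_share_before, wage_share_after = 0.60, 0.62   # hike: +2 points

def consumption(wage_share):
    wages = income * wage_share
    profits = income - wages
    return (1 - s_w) * wages + (1 - s_p) * profits

demand_effect = consumption(wage_share_after) - consumption(wage_share_before)
print(f"extra consumption demand: {demand_effect:+.2f}")   # +0.60
# Demand rises by (s_p - s_w) * d(wage bill) = 0.30 * 2 = 0.6 units, so a
# demand effect exists whenever saving rates differ; whether it outweighs
# the cost-side employment loss is the 'purchasing power' condition the
# paper shows to require very unrealistic parameter values.
```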
Complications of type 1 diabetes: new molecular findings
MOUNT SINAI JOURNAL OF MEDICINE: A JOURNAL OF PERSONALIZED AND TRANSLATIONAL MEDICINE, Issue 4 2008. Alin Stirban MD
Abstract: Interventions targeting the treatment of diabetic complications have not been nearly as successful as initially estimated, despite a marked improvement in therapeutic options for diabetes. The need to understand why some very promising interventions have failed demands a closer look at the pathomechanisms of the complications. Great strides have been made in understanding the pathology, and several important hypotheses have emerged in recent years. On this basis, Brownlee and coworkers suggested a unifying hypothesis integrating various mechanisms discussed in past years, with an overproduction of reactive oxygen species as an initiating cause. This hypothesis and further hypotheses, as well as mechanisms, are highlighted in this article. The field of pathomechanisms of diabetic complications is very wide, and any attempt to cover it completely within a single article is unrealistic. Therefore, our purpose is to present the most relevant concepts underlying diabetic complications in an attempt to contribute to a better understanding and to pinpoint areas that warrant further research. Mt Sinai J Med 75:328–351, 2008. © 2008 Mount Sinai School of Medicine [source]

Opisthobranch molluscs from the Tertiary of the Aquitaine Basin (south-western France), with descriptions of seven new species and a new genus
PALAEONTOLOGY, Issue 3 2000. Ángel Valdés
Abstract: An exceptionally well-preserved collection of Tertiary opisthobranch molluscs from the Aquitaine Basin, France, includes species of the order Notaspidea [Umbraculum sanctipaulensis sp. nov., Tylodina perversa (Gmelin), Spiricella unguiculus Rang and Des Moulins, Berthella aquitaniensis sp. nov., Berthella ateles sp. nov.], of the order Anaspidea [Akera cf. bullata Müller, Floribella corrugata (Cossmann), Floribella cossmanni sp. nov., Floribella rozieri sp. nov., Limondinia ornata gen. et sp. nov.] and of the order Sacoglossa [Volvatella faviae sp. nov.]. Berthella aquitaniensis, B. ateles and V. faviae are the first fossil records of the families Volvatellidae and Pleurobranchidae. Floribella plicifera (Cossmann) and F. corrugata, originally assigned to the genus Philine, belong to the genus Floribella and constitute the oldest records of this genus. The fossil evidence indicates that in Umbraculum laudunensis and U. sanctipaulensis the shell probably covered most of the animal, whereas in the Recent U. umbraculum the shell only covers the central portion of the body. Tylodina perversa could be an old species that appeared during the early Miocene, more than 21 Ma. The Recent shells of Akera bullata are indistinguishable from fossils as old as the mid Eocene, but it may be biologically unrealistic to consider them to be the same species. The European species of Floribella evolved from the bullomorph shells of the early Eocene forms to the elongate shells of the early Miocene. The genus Volvatella is another example of marine tropical disjoint distributions and an excellent ecological indicator. [source]