Forms Used (form + used)


Selected Abstracts


Pre-operative screening for excessive alcohol consumption among patients scheduled for elective surgery

DRUG AND ALCOHOL REVIEW, Issue 2 2007
SWATI SHOURIE
Abstract Pre-operative intervention for excessive alcohol consumption among patients scheduled for elective surgery has been shown to reduce complications of surgery. However, successful intervention depends upon an effective and practical screening procedure. This study examines current screening practices for excessive alcohol consumption amongst patients scheduled for elective surgery in general hospitals. It also examines the appropriateness of potential sites and staff for pre-operative screening. Forms used routinely to assess alcohol consumption in the pre-admission clinics (PAC) of eight Sydney hospitals were examined. In addition, the appropriateness of six staff categories (surgeons, surgeons' secretaries, junior medical officer, anaesthetists, nurses and a research assistant) and of two sites (surgeons' office and PAC) in conducting additional screening was assessed at two hospitals. Outcomes included observed advantages and disadvantages of sites and personnel, and number of cases with excessive drinking identified. There was duplication in information collected routinely on alcohol use in the PACs in eight Sydney Hospitals. Questions on alcohol consumption in patient self-completion forms were not validated. The PAC provided for efficient screening but time to surgery was typically too short for successful intervention in many cases. A validated tool and efficient screening procedure is required to detect excessive drinking before elective surgery. Patients often present to the PAC too close to the time of surgery for any change in drinking to reverse alcohol's effects. The role of the referring general practitioner and of printed advice from the surgeon in preparing patients for surgery needs further investigation. [source]


Solving time-dependent PDEs using the material point method, a case study from gas dynamics

INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN FLUIDS, Issue 7 2010
L. T. Tran
Abstract The material point method (MPM) developed by Sulsky and colleagues is currently being used to solve many challenging problems involving large deformations and/or fragmentation with some success. In order to understand the properties of this method, an analysis of the computational properties of MPM is undertaken in the context of model problems from gas dynamics. MPM, in the form used here, is shown both theoretically and computationally to have first-order accuracy for a standard gas dynamics test problem. Copyright © 2009 John Wiley & Sons, Ltd. [source]
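The first-order accuracy claim above is the kind of result typically verified by measuring errors at successive grid resolutions. The sketch below is a generic illustration of that measurement, not the authors' code or test problem: it runs a deliberately first-order upwind advection solver at three resolutions and estimates the observed order from the error ratios.

```python
# Generic sketch (not the authors' code): estimate the observed order of
# accuracy from errors at successive resolutions, using a deliberately
# first-order upwind scheme for linear advection as a stand-in test problem.
import numpy as np

def upwind_advection_error(n_cells, c=1.0, t_final=0.5, cfl=0.5):
    """Return the L1 error of first-order upwind advection of a smooth profile."""
    dx = 1.0 / n_cells
    dt = cfl * dx / c
    x = (np.arange(n_cells) + 0.5) * dx
    u = np.sin(2 * np.pi * x)                        # smooth periodic initial condition
    t = 0.0
    while t < t_final - 1e-12:
        step = min(dt, t_final - t)
        u = u - c * step / dx * (u - np.roll(u, 1))  # first-order upwind update
        t += step
    exact = np.sin(2 * np.pi * (x - c * t_final))
    return dx * np.sum(np.abs(u - exact))            # discrete L1 norm of the error

errors = {n: upwind_advection_error(n) for n in (100, 200, 400)}
for n_coarse, n_fine in ((100, 200), (200, 400)):
    order = np.log(errors[n_coarse] / errors[n_fine]) / np.log(2.0)
    print(f"N = {n_coarse} -> {n_fine}: observed order ~ {order:.2f}")
```

Doubling the resolution should roughly halve the L1 error, giving an observed order near one.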


Does the source of nitrogen affect the response of subterranean clover to prolonged root hypoxia?

JOURNAL OF PLANT NUTRITION AND SOIL SCIENCE, Issue 2 2010
Faouzi Horchani
Abstract Nitrogen (N) is taken up by most plant species in the form of nitrate (NO3-) or ammonium (NH4+). The plant response to continuous ammonium nutrition is species-dependent. In this study, the effects of the source of N nutrition (NO3-, NH4+, or a mixture of NO3- and NH4+) on the response of clover (Trifolium subterraneum L. cv. 45C) plants to prolonged root hypoxia were studied. Under aerobic conditions, plant growth was strongly depressed by NH4+ compared to NO3- or mixed N nutrition, as indicated by the significant decrease in root and shoot dry-matter production (DW), root and shoot water contents (WC), leaf chlorophyll concentration, and chlorophyll fluorescence parameters (F0, Fv/Fm). However, the N source had no effect on the chlorophyll a-to-chlorophyll b ratio. Under hypoxic conditions, the negative effects of root hypoxia on plant-growth parameters (DW and WC), leaf chlorophyll concentration, and chlorophyll fluorescence parameters were alleviated by NH4+ rather than NO3- supply. Concomitantly, the shoot DW-to-root DW ratio and the root and leaf NH4+ concentrations were significantly decreased, whereas root and leaf carbohydrate concentrations, glutamine synthetase activities, and protein concentrations were remarkably increased. The present data reveal that the N source (NO3- or NH4+) is a major factor affecting clover responses to hypoxic stress, with plants being more tolerant when NH4+ is the N form used. The different sensitivity is discussed in terms of a competition for energy between nitrogen assimilation and plant growth. [source]


IMPACT ASSESSMENT MODEL FOR CLEAR WATER FISHES EXPOSED TO EXCESSIVELY CLOUDY WATER

JOURNAL OF THE AMERICAN WATER RESOURCES ASSOCIATION, Issue 3 2003
Charles P. Newcombe
ABSTRACT: A new type of empirical model described here enables real-time assessment of impacts caused by excessive water cloudiness as a function of (a) reduced visual clarity (excessive cloudiness) and (b) duration of exposure to cloudy conditions, in fisheries or fish life stages adapted to life in clear-water ecosystems. This model takes the familiar form used in earlier suspended-sediment dose-effect models, in which z is severity of ill effect (SEV), x is duration of exposure (h), y is black disk sighting range (yBD, m), a measure of water clarity, a is the intercept, and b and c are slope coefficients; the model was calibrated in this study. Severity of ill effect is ranked on a 15-step scale that ranges from 0 to 14, where zero represents nil effect and 14 represents 100 percent mortality. This model, based on peer consultation and limited meta-analysis of peer-reviewed reports, accomplishes the following: (a) identifies the threshold of the onset of ill effects among clear-water fishes; (b) postulates the rate at which serious ill effects are likely to escalate as a function of reduced visual clarity and persistence; (c) provides a context (the "visual clarity" matrix, with its cell coordinates) to share and compare information about impacts as a function of the visual clarity "climate"; (d) demonstrates changes in predator-prey interactions at exposures greater than and less than the threshold of direct ill effects; (e) calibrates trout reactive distance (cm) as a function of water clarity, in a form in which y represents reactive distance (cm), x represents visual clarity (black disk sighting range, cm), and a and b are intercept and slope, respectively; (f) identifies black disk sighting range, in meters, and its reciprocal, beam attenuation, as preferred monitoring variables; and (g) provides two additional optical-quality variables (Secchi disk extinction distance and turbidity) which, suitably calibrated as they have been in this study, expand the range of monitoring options in situations in which the preferred technology, beam attenuation equipment or black disk sighting equipment, is unavailable or impractical to use. This new model demonstrates the efficacy of peer collaboration and defines new research horizons for its refinement. [source]
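Because the calibrated equations are not reproduced in this abstract, the sketch below only illustrates the general log-linear dose-effect form assumed from the earlier suspended-sediment models, z = a + b·ln(x) + c·ln(y), clamped to the 0–14 severity scale; the coefficients are hypothetical placeholders, not the calibration reported in the paper.

```python
# Illustration only: the general log-linear dose-effect form assumed from the
# earlier suspended-sediment models. The coefficients below are hypothetical
# placeholders, NOT the calibration reported in the paper (whose equations are
# not reproduced in this abstract).
import math

def severity_of_ill_effect(duration_h, black_disk_range_m, a=4.0, b=0.6, c=-2.0):
    """Hypothetical severity-of-ill-effect (SEV) score, clamped to the 0-14 scale."""
    z = a + b * math.log(duration_h) + c * math.log(black_disk_range_m)
    return max(0.0, min(14.0, z))   # 0 = nil effect, 14 = 100 percent mortality

# Example: 24 h of exposure at a black disk sighting range of 0.5 m.
print(severity_of_ill_effect(24.0, 0.5))
```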


A static mapping heuristics to map parallel applications to heterogeneous computing systems

CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 13 2005
Ranieri Baraglia
Abstract In order to minimize the execution time of a parallel application running on a heterogeneous distributed computing system, an appropriate mapping scheme is needed to allocate the application tasks to the processors. The general problem of mapping tasks to machines is a well-known NP-hard problem, and several heuristics have been proposed to approximate its optimal solution. In this paper we propose a static graph-based mapping algorithm, called Heterogeneous Multi-phase Mapping (HMM), which permits a suboptimal mapping of a parallel application onto a heterogeneous distributed computing system by using a local search technique together with a tabu search meta-heuristic. HMM allocates parallel tasks by exploiting the information embedded in the parallelism forms used to implement an application, and by considering an affinity parameter that identifies which machine in the heterogeneous computing system is most suitable to execute a task. We compare HMM with some leading techniques and with an exhaustive mapping algorithm. We also give an example of the mapping of two real applications using HMM. Experimental results show that HMM performs well, demonstrating the applicability of our approach. Copyright © 2005 John Wiley & Sons, Ltd. [source]
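As a rough illustration of the kind of local search with a tabu list described above, the sketch below maps tasks to heterogeneous machines so as to reduce makespan over an assumed expected-time-to-compute (ETC) matrix. It is not the HMM algorithm itself: the multi-phase structure, the graph-based parallelism-form information, and the affinity parameter are not modeled.

```python
# Generic sketch of local search with a tabu list for mapping tasks to
# heterogeneous machines (not the HMM algorithm). etc[t][m] is an assumed
# "expected time to compute" of task t on machine m; the objective is makespan.
import random

def makespan(mapping, etc):
    """Completion time of the most loaded machine under a task -> machine mapping."""
    loads = [0.0] * len(etc[0])
    for task, machine in enumerate(mapping):
        loads[machine] += etc[task][machine]
    return max(loads)

def tabu_map(etc, iters=300, neigh=10, tabu_len=15, seed=0):
    rng = random.Random(seed)
    n_tasks, n_machines = len(etc), len(etc[0])
    current = [rng.randrange(n_machines) for _ in range(n_tasks)]
    best, best_cost = current[:], makespan(current, etc)
    tabu = []                                   # recently undone (task, machine) pairs
    for _ in range(iters):
        candidates = []
        for _ in range(neigh):                  # sample a small move neighborhood
            task = rng.randrange(n_tasks)
            machine = rng.randrange(n_machines)
            if machine == current[task] or (task, machine) in tabu:
                continue
            cand = current[:]
            cand[task] = machine
            candidates.append((makespan(cand, etc), task, cand))
        if not candidates:
            continue
        cost, task, cand = min(candidates, key=lambda item: item[0])
        tabu.append((task, current[task]))      # forbid moving the task straight back
        if len(tabu) > tabu_len:
            tabu.pop(0)
        current = cand                          # accept the best sampled move, even if worse
        if cost < best_cost:
            best, best_cost = cand[:], cost
    return best, best_cost

# Toy example: 6 tasks on 3 heterogeneous machines.
etc = [[4, 2, 8], [3, 7, 2], [6, 5, 4], [2, 9, 3], [5, 4, 6], [7, 3, 2]]
print(tabu_map(etc))
```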


Impact of US and Canadian precursor regulation on methamphetamine purity in the United States

ADDICTION, Issue 3 2009
James K. Cunningham
ABSTRACT Aims Reducing drug purity is a major, but largely unstudied, goal of drug suppression. This study examines whether US methamphetamine purity was impacted by the suppression policy of US and Canadian precursor chemical regulation. Design Autoregressive integrated moving average (ARIMA)-intervention time-series analysis. Setting Continental United States and Hawaii (1985–May 2005). Interventions US federal regulations targeting the precursors ephedrine and pseudoephedrine in forms used by large-scale producers were implemented in November 1989, August 1995 and October 1997. US regulations targeting precursors in forms used by small-scale producers (e.g. over-the-counter medications) were implemented in October 1996 and October 2001. Canada implemented federal precursor regulations in January 2003 and July 2003 and an essential chemical (e.g. acetone) regulation in January 2004. Measurements Monthly median methamphetamine purity series. Findings US regulations targeting large-scale producers were associated with purity declines of 16–67 points; those targeting small-scale producers had little or no impact. Canada's precursor regulations were associated with purity increases of 13–15 points, while its essential chemical regulation was associated with a 13-point decrease. Hawaii's purity was consistently high, and appeared to vary little with the 1990s/2000s regulations. Conclusions US precursor regulations targeting large-scale producers were associated with substantial decreases in continental US methamphetamine purity, while regulations targeting over-the-counter medications had little or no impact. Canada's essential chemical regulation was also associated with a decrease in continental US purity. However, Canada's precursor regulations were associated with purity increases: these regulations may have impacted primarily producers of lower-quality methamphetamine, leaving higher-purity methamphetamine on the market by default. Hawaii's well-known preference for 'ice' (high-purity methamphetamine) may have helped to constrain purity there to a high, attenuated range, possibly limiting its sensitivity to precursor regulation. [source]
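For readers unfamiliar with the design, an ARIMA-intervention analysis adds a step (level-shift) regressor, switched on at the regulation date, to an ARIMA model of the series; the regressor's coefficient estimates the purity change attributable to the intervention. The sketch below is generic: the monthly series, the intervention date, and the ARIMA order are simulated placeholders, not the study's data or specification.

```python
# Generic sketch of an ARIMA-intervention analysis: a step (level-shift)
# regressor switched on at a hypothetical regulation date. The monthly series
# below is simulated, not the study's purity data, and the ARIMA order is
# illustrative only.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(0)
dates = pd.date_range("1990-01-01", periods=120, freq="MS")      # 10 years, monthly
step = pd.Series((dates >= "1995-08-01").astype(float),
                 index=dates, name="regulation_step")             # intervention indicator
purity = pd.Series(70 - 20 * step.to_numpy() + rng.normal(0, 3, 120), index=dates)

model = SARIMAX(purity, exog=step, order=(1, 0, 1))
result = model.fit(disp=False)
print(result.params["regulation_step"])   # estimated level shift at the intervention
```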


Iron and Calcium Bioavailability of Fortified Foods and Dietary Supplements

NUTRITION REVIEWS, Issue 11 2002
Susan J. Fairweather-Tait DSc
Bioavailability is a key consideration when developing strategies for preventing mineral deficiencies through improved dietary supply. Factors that affect the bioavailability of iron and calcium, forms used for fortification and supplementation, and methods used to assess bioavailability are described. Illustrations of the impact of introducing iron-fortified foods in developing and industrialized countries are given, and the alternative approach of supplementation with iron and calcium is discussed. [source]


Efficient Calculation of P-value and Power for Quadratic Form Statistics in Multilocus Association Testing

ANNALS OF HUMAN GENETICS, Issue 3 2010
Liping Tong
Summary We address the asymptotic and approximate distributions of a large class of test statistics with quadratic forms used in association studies. The statistics of interest take the general form D = X^T A X, where A is a general similarity matrix which may or may not be positive semi-definite, and X follows the multivariate normal distribution with mean μ and variance matrix Σ, where Σ may or may not be singular. We show that D can be written as a linear combination of independent χ2 random variables with a shift. Furthermore, its distribution can be approximated by a χ2 or the difference of two χ2 distributions. In the setting of association testing, our methods are especially useful in two situations. First, when the required significance level is much smaller than 0.05, such as in a genome scan, the estimation of p-values using permutation procedures can be challenging. Second, when an EM algorithm is required to infer haplotype frequencies from unphased genotype data, the computation can be intensive for a permutation procedure. In either situation, an efficient and accurate estimation procedure would be useful. Our method can be applied to any quadratic form statistic and therefore should be of general interest. [source]
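A minimal sketch of the mean-zero case of the decomposition described above: with X ~ N(0, Σ), the quadratic form D = X^T A X equals a weighted sum of independent χ2(1) variables whose weights are the eigenvalues of Σ^(1/2) A Σ^(1/2). The matrices below are randomly generated for illustration, and the shifted (nonzero-mean) case handled in the paper is not covered.

```python
# Minimal sketch of the mean-zero case: D = X^T A X with X ~ N(0, Sigma) equals
# a weighted sum of independent chi-square(1) variables, with weights given by
# the eigenvalues of Sigma^{1/2} A Sigma^{1/2}. Checked here by Monte Carlo;
# the shifted (nonzero-mean) case in the paper is not covered.
import numpy as np

rng = np.random.default_rng(1)
p = 4
A = rng.normal(size=(p, p))
A = (A + A.T) / 2                                  # symmetric, not necessarily PSD
B = rng.normal(size=(p, p))
Sigma = B @ B.T                                    # positive-definite covariance

w, V = np.linalg.eigh(Sigma)                       # Sigma^{1/2} via its eigendecomposition
Sigma_half = V @ np.diag(np.sqrt(w)) @ V.T
lam = np.linalg.eigvalsh(Sigma_half @ A @ Sigma_half)   # chi-square weights

n = 200_000
X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)
D_direct = np.einsum("ij,jk,ik->i", X, A, X)       # quadratic form X^T A X, row-wise
D_mix = rng.chisquare(1, size=(n, p)) @ lam        # weighted sum of chi-square(1) draws

print(np.mean(D_direct), np.mean(D_mix))                      # both approach trace(A @ Sigma)
print(np.quantile(D_direct, 0.95), np.quantile(D_mix, 0.95))  # matching upper-tail quantiles
```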