Practical Solution
Selected Abstracts

Book review: Corrosion Prevention and Protection: Practical Solutions, by V. S. Sastri, E. Ghali and M. Elboujdaini. CHEMIE-INGENIEUR-TECHNIK (CIT), Issue 8 2007. No abstract is available for this article. [source]

Using pulsed gradient spin echo NMR for chemical mixture analysis: How to obtain optimum results. CONCEPTS IN MAGNETIC RESONANCE, Issue 4 2002. Brian Antalek. Abstract: Pulsed gradient spin echo NMR is a powerful technique for measuring diffusion coefficients. When coupled with appropriate data processing schemes, it becomes an exceptionally valuable tool for mixture analysis, separating components on the basis of molecular size. Extremely fine differentiation may be possible in the diffusion dimension, but only with high-quality data. For fully resolved resonances, components with diffusion coefficients that differ by less than 2% may be distinguished in mixtures. For highly overlapped resonances, the resolved spectra of pure components with diffusion coefficients that differ by less than 30% may be obtained. In order to achieve the best possible data quality, one must be aware of the primary sources of artifacts and incorporate the means necessary to alleviate them. The origins of these artifacts are described, along with the methods necessary to observe them. Practical solutions are presented, and examples demonstrate the effects of the artifacts on the acquired data set. Many mixture analysis problems may be addressed with conventional high-resolution pulsed field gradient probe technology delivering less than 0.5 T m–1 (50 G cm–1). © 2002 Wiley Periodicals, Inc. Concepts Magn Reson 14: 225–258, 2002. [source]

Environmental impact assessment: Practical solutions to recurrent problems, part 1. ENVIRONMENTAL QUALITY MANAGEMENT, Issue 4 2005. David P.
Lawrence. First page of article. [source]

Water-soluble vitamins in fish ontogeny. AQUACULTURE RESEARCH, Issue 5 2010. Rune Waagbø. Abstract: Studies of vitamin requirements at early life stages are difficult and vary in quality, owing both to the scientific approach and to the vitamin analysis. Focus has been on the water-soluble vitamins whose deficiencies cause dramatic losses of offspring in practical farming situations or in the wild, such as vitamin C and thiamine respectively. Practical solutions, including vitamin administration through broodstock and larval diets, have confirmed and corrected these deficiencies. For the other water-soluble vitamins, the situation is less clear. Descriptive studies of folate and vitamin B6 during fish ontogeny have shown a net loss of vitamin during endogenous feeding, a steady transfer of vitamin from the yolk sac into the body compartment and, finally, dramatic increases in body vitamin levels after the start of feeding. The kinetics of mass transfer during ontogeny appear, however, to differ between vitamins. Starting fish larvae on live or formulated feeds presents several challenges with respect to water-soluble vitamins, including live feed enrichment and stability, micro-diet leaching, variable feed intake, an immature gastrointestinal tract, variable bioavailability of vitamins and limited larval vitamin storage capacity. Consequently, exact minimum requirements are difficult to estimate, and vitamin recommendations need to take such conditions into account. [source]

Connecting patient needs with treatment management. ACTA PSYCHIATRICA SCANDINAVICA, Issue 2009. R. Kerwin. Objective: To propose ideas for the development of a core strategy for monitoring patients with schizophrenia to ensure physical health and optimal treatment provision. Method: A panel of European experts in the field of schizophrenia met in Bordeaux in June 2006 to discuss 'Patient management optimisation through improved treatment monitoring.'
Results: The key consensus of the discussion was that weight gain, oral health and ECG parameters are core baseline parameters to be monitored in all patients with schizophrenia. Further, identification of a patient's own barriers to treatment, alongside local health service strategies, might form elements of an individualised management strategy contributing to treatment optimisation. Any monitoring strategy should be kept simple to encourage physician compliance. Conclusion: A practical solution to the difficulties of providing holistic patient care would be to suggest a limited set of physical parameters to be monitored by physicians on a regular basis. [source]

Testing of fine motor skills in dental students. EUROPEAN JOURNAL OF DENTAL EDUCATION, Issue 1 2000. Olaf Luck. Manual skills form only a part of the capabilities required of future dentists, but they are a very important component that should be tested. With regard to the dental specialities, the present study tested speciality-independent fine motor skills, for which no objective, practical solution has been found to date. In this longitudinal study, 88 dental students and, as a control group, 23 medical students were examined. Four fine motor tests were carried out at the beginning of the 2nd and 6th semesters: the tremometer test, the tremometer test with a mirror, the two-hand sinusoid test and archery using the SEGA Game Gear. The test devices primarily assess components of movement accuracy, indirect working methods and eye-hand coordination. On test day A, the medical students' performance was noticeably better. As testing progressed, results showed stagnation in the performance of the medical students and a significant improvement in the performance of the dental students. This means that the test system can be used to track progress over the course of study, but not as an initial aptitude test.
[source]

Semantic confusion regarding the development of multisensory integration: a practical solution. EUROPEAN JOURNAL OF NEUROSCIENCE, Issue 10 2010. Barry E. Stein. Abstract: There is now a good deal of data from neurophysiological studies in animals and behavioral studies in human infants regarding the development of multisensory processing capabilities. Although the conclusions drawn from these different datasets sometimes appear to conflict, many of the differences are due to the use of different terms to mean the same thing and, more problematically, the use of similar terms to mean different things. Semantic issues are pervasive in the field and complicate communication among groups using different methods to study similar issues. Achieving clarity of communication among different investigative groups is essential for each to make full use of the findings of others, and an important step in this direction is to identify areas of semantic confusion. In this way investigators can be encouraged to use terms whose meaning and underlying assumptions are unambiguous because they are commonly accepted. Although this issue is of obvious importance to the large and very rapidly growing number of researchers working on multisensory processes, it is perhaps even more important to the non-cognoscenti: those who wish to benefit from the scholarship in this field but are unfamiliar with the issues identified here are the most likely to be confused by semantic inconsistencies. The current discussion attempts to document some of the more problematic of these, begin a discussion about the nature of the confusion and suggest some possible solutions.
[source]

A heterogeneous computing system for data mining workflows in multi-agent environments. EXPERT SYSTEMS, Issue 5 2006. Ping Luo. Abstract: The computing-intensive data mining (DM) process calls for the support of a heterogeneous computing system, which consists of multiple computers with different configurations connected by a high-speed large-area network for increased computational power and resources. The DM process can be described as a multi-phase pipeline process, and each phase may offer many optional methods. This makes the DM workflow complex enough that it can be modeled only by a directed acyclic graph (DAG). A heterogeneous computing system needs an effective and efficient scheduling framework that orchestrates all the computing hardware to perform multiple competing DM workflows. Motivated by the need for a practical solution to the scheduling problem for DM workflows, this paper proposes a dynamic DAG scheduling algorithm based on the characteristics of an execution-time estimation model for DM jobs. Using an approximate estimate of job execution time, the algorithm first maps DM jobs to machines in a decentralized and "diligent" (defined in this paper) manner; the performance of this initial mapping can then be improved through job migration when necessary. The scheduling heuristic considers both the minimal completion time criterion and the critical path in the DAG. We implement this system in an established multi-agent system environment, in which existing DM algorithms are reused by encapsulating them into agents. The system evaluation and its use in oil well logging analysis are also discussed. [source]

Real Options: Meeting the Georgetown Challenge. JOURNAL OF APPLIED CORPORATE FINANCE, Issue 2 2005. Thomas E.
Copeland. In response to the demand for a single, generally accepted real options methodology, this article proposes a four-step process leading to a practical solution for most applications of real option analysis. The first step is familiar: calculate the standard net present value of the project assuming no managerial flexibility, which yields a value estimate (and a "branch" of a decision tree) for each year of the project's life. The second step estimates the volatility of the value of the project and produces a value tree designed to capture the main sources of uncertainty. Note that the authors focus on the uncertainty about overall project value, which is driven by uncertainty in revenue growth, operating margins, operating leverage, input costs and technology. The key point here is that, in contrast to many real options approaches, none of these variables taken alone is assumed to be a reliable surrogate for the uncertainty of the project itself. For example, in assessing the option value of a proven oil reserve, the relevant measure of volatility is not the volatility of oil prices but that of the value of the operating entity, that is, the project value without leverage. The third step captures managerial flexibility using a decision tree that lays out the decisions to be made, their possible outcomes and the corresponding probabilities. The article illustrates various kinds of applications, including a phased investment in a chemical plant (treated as a compound option) and an investment in a peak-load power plant (a switching option with changing variance, which precludes the use of constant risk-neutral probabilities as in standard decision tree analysis). The fourth and final step uses a "no-arbitrage" approach to form a replicating portfolio with the same payouts as the real option. For most corporate investment projects, it is impossible to locate a "twin security" that trades in the market.
In the absence of such a security, the conventional NPV of the project (again, without flexibility) is the best candidate for a perfectly correlated underlying asset, because it represents management's best estimate of value based on the expected cash flows of the project. [source]

Adaptation and evaluation of the Randox full-range CRP assay on the Olympus AU2700®. JOURNAL OF CLINICAL LABORATORY ANALYSIS, Issue 1 2007. A.M. Dupuy. Abstract: The implementation of a high-sensitivity CRP (hs-CRP) assay as a routine laboratory parameter may be necessary. A single CRP method that could yield reliable results over the whole concentration range (0.1–200 mg/L) would be the most practical solution for the laboratory setting. The aim of this study was to assess the Randox full-range CRP assay on the Olympus AU2700® biochemistry analyzer and to evaluate its analytical performance on serum and heparin plasma samples. The Randox turbidimetric CRP assay was compared with the existing CRP assay used routinely on the Olympus AU2700®. The analytical performance of the Randox CRP relative to both Olympus CRP reagents (CRP for normal application and hs-CRP) was good. We found that the Randox CRP method was closely correlated with the Olympus CRP and hs-CRP over the range 0.5–160 mg/L for serum samples. According to a Bland-Altman analysis, serum and heparinized samples showed excellent agreement in CRP concentrations throughout the entire range (mean difference = −0.035 ± 1.806 mg/L), as well as at CRP levels <10 mg/L. Our data indicate that Randox full-range CRP measurements using an immunoturbidimetric assay on Olympus systems perform as well for routine diagnostics as other high-sensitivity applications using serum or heparin plasma. J. Clin. Lab. Anal. 21:34–39, 2007. © 2007 Wiley-Liss, Inc. [source]

Computerized cognitive-behaviour therapy for anxiety and depression: a practical solution to the shortage of trained therapists. JOURNAL OF PSYCHIATRIC & MENTAL HEALTH NURSING, Issue 5 2004. S.
VAN DEN BERG, BSc. Computerized cognitive-behaviour therapy (CCBT) programmes have been developed to help meet the enormous need for evidence-based psychological treatment of common mental health problems in the context of a severe shortage of trained therapists to meet that need. Randomized controlled trials have confirmed the efficacy of such programmes. We present the experience of a community mental health team (CMHT) resource centre with one such programme, Beating the Blues, together with outcome data on a small sample of its clients. We conclude that this experience and these data, taken together, demonstrate the practical benefits of CCBT in routine practice. [source]

Laser Welding of Plastics – a Neat Thing. LASER TECHNIK JOURNAL, Issue 5 2010. The story of a popular laser application. Industry has been dealing with the joining of plastics for over half a century; the wish for an economically viable method of joining components was already there when the injection molding process was developed. With the advent of industrial laser technology, laser welding developed into a practical solution for many plastics joining problems. [source]

cDNA-AFLP reveals genes differentially expressed during the hypersensitive response of cassava. MOLECULAR PLANT PATHOLOGY, Issue 2 2005. BENJAMIN P. KEMP. Summary: The tropical staple cassava is subject to several major diseases, such as cassava bacterial blight, caused by Xanthomonas axonopodis pv. manihotis. Disease-resistant genotypes afford the only practical solution, yet despite the global importance of this crop, little is known about its defence mechanisms. cDNA-AFLP was used to isolate cassava genes differentially expressed during the hypersensitive reaction (HR) of leaves in response to an incompatible Pseudomonas syringae pathovar. Seventy-eight transcript-derived fragments (TDFs) showing differential expression (c. 75% up-regulated, 25% down-regulated) were identified.
Many encoded putative homologues of known defence-related genes involved in signalling (e.g. calcium transport and binding, ACC oxidases and a WRKY transcription factor), cell wall strengthening (e.g. cinnamoyl coenzyme A reductase and peroxidase), programmed cell death (e.g. proteases, the 26S proteasome), antimicrobial activity (e.g. proteases and β-1,3-glucanases) and the production of antimicrobial compounds (e.g. DAHP synthase and cytochrome P450s). Full-length cDNAs, including a probable matrix metalloprotease and a WRKY transcription factor, were isolated for six TDFs. RT-PCR or Northern blot analysis showed that HR-induced TDFs were maximally expressed at 24 h, although some were produced by 6 h; some were also induced, albeit more slowly, in response to wounding. This work begins to reveal potential defence-related genes of this understudied major crop. [source]

Toward faster algorithms for dynamic traffic assignment. NETWORKS: AN INTERNATIONAL JOURNAL, Issue 1 2003. Abstract: The first in a three-part series promising a practical solution to the user-equilibrium dynamic traffic assignment problem, this paper devises a parametric quickest-path tree algorithm whose model makes three practical assumptions: (i) the traversal time of an arc i → j is a piecewise linear function of the arrival time at its i-node; (ii) the traversal time of a path is the sum of its arcs' traversal times; and (iii) the FIFO constraint holds, that is, later departure implies later arrival. The algorithm finds a quickest path, and its associated earliest arrival time, to every node for every desired departure time from the origin. Its parametric approach transforms a min-path tree for one departure-time interval into another for the next adjacent interval, whose shared boundary the algorithm determines on the fly. By building relatively few trees, it provides the topology explicitly, and the arrival times implicitly, of all min-path trees.
Tests show the algorithm running upward of 10 times faster than the conventional brute-force approach, which explicitly builds a min-path tree for every departure time. Besides dynamic traffic assignment, these findings have utility in applications such as traffic control planning, vehicle routing and scheduling, and real-time highway route guidance. © 2002 Wiley Periodicals, Inc. [source]

Evaluation of photovoltaic modules based on sampling inspection using smoothed empirical quantiles. PROGRESS IN PHOTOVOLTAICS: RESEARCH & APPLICATIONS, Issue 1 2010. Ansgar Steland. Abstract: An important issue for end users and distributors of photovoltaic (PV) modules is the inspection of the power output specification of a shipment. The question is whether or not the modules satisfy the specifications given in the data sheet, namely the nominal power output under standard test conditions, relative to the power output tolerance. Since collecting control measurements of all modules is usually unrealistic, decisions have to be based on random samples. In many cases, one has access to flash data tables of final power output measurements (flash data) from the producer. We propose to rely on the statistical acceptance sampling approach as an objective decision framework, which takes into account both the end user's and the producer's risk of a false decision. A practical solution to the problem, recently found by the authors, is discussed; it consists of estimates of the required optimal sample size and the associated critical value, where the estimation uses the information contained in the additional flash data. We propose and examine an improved solution that yields even more reliable estimated sampling plans, as substantiated by a Monte Carlo study. This is achieved by employing advanced statistical estimation techniques. Copyright © 2009 John Wiley & Sons, Ltd.
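The photovoltaic abstract above rests on the classical acceptance-sampling trade-off between the producer's and the consumer's risk. The paper's own estimator uses smoothed empirical quantiles of the flash data; as a much simpler illustration of the underlying idea only, the sketch below computes a classical attributes single-sampling plan (n, c): the smallest sample size n and acceptance number c such that a shipment with out-of-spec rate p1 is accepted with probability at least 1 - alpha, while one with rate p2 is accepted with probability at most beta. All numbers are invented for illustration and are not from the paper.

```python
from math import comb

def binom_cdf(k, n, p):
    # P(X <= k) for X ~ Binomial(n, p)
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def single_sampling_plan(p1, p2, alpha=0.05, beta=0.10, n_max=2000):
    """Smallest (n, c): sample n modules, accept the shipment iff at most
    c are out of specification.  Producer's risk <= alpha at defect rate
    p1; consumer's risk <= beta at defect rate p2 (requires p1 < p2)."""
    for n in range(1, n_max + 1):
        for c in range(n + 1):
            accept_bad = binom_cdf(c, n, p2)   # prob. of accepting a bad lot
            if accept_bad > beta:
                break  # larger c only raises the consumer's risk
            if binom_cdf(c, n, p1) >= 1 - alpha:
                return n, c                    # producer's risk also met
    return None

# e.g. 1% out-of-spec acceptable, 5% rejectable:
print(single_sampling_plan(0.01, 0.05))
```

The paper's contribution is, in effect, estimating such a plan more reliably by exploiting the producer's flash data rather than relying on fixed nominal rates.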
[source]

Mapping Systems and GIS: A Case Study using the Ghana National Grid. THE GEOGRAPHICAL JOURNAL, Issue 4 2000. GRAHAM THOMAS. The problem of incompatible projections and conversion between mapping systems is of general concern to those involved in the collection of natural resources data. The Ghana National Grid (GNG) is an example of a mapping system that is not defined in image processing and GIS software and for which the transformation parameters are not readily available in the literature. Consequently, integrating GNG topographic map data within a GIS alongside data derived from other sources can be problematic. This paper demonstrates a practical solution for deriving the transformation parameters required to convert from the World Geodetic System of 1984 (WGS84) to the GNG system. The method uses a single geodetic control point, available 1:50 000 topographic maps and a SPOT satellite panchromatic image geo-referenced to the GNG. The resultant parameters are applied to road survey data in Universal Transverse Mercator (UTM) format for overlay with the SPOT image. Despite the approximations made in applying the method, comparison against official estimates of the datum transformation parameters shows that this relatively simple procedure yields estimates that appear acceptable for combining data sets at a nominal scale of 1:50 000. [source]

Recommendations for the Assessment and Reporting of Multivariable Logistic Regression in Transplantation Literature. AMERICAN JOURNAL OF TRANSPLANTATION, Issue 7 2010. A. C. Kalil. Multivariable logistic regression is an important method for evaluating risk factors and prognosis in the solid organ transplant literature. We aimed to assess the quality of this method in six major transplantation journals. Eleven analytical criteria and four documentation criteria were analyzed for each selected article that used logistic regression.
A total of 106 studies (6%) out of 1,701 original articles published from January 1, 2005 to January 1, 2006 used logistic regression analyses. The analytical criteria and their respective reporting percentages across the six journals were: linearity (25%); beta coefficients (48%); interaction tests (19%); main estimates (98%); overfitting prevention (84%); goodness-of-fit (3.8%); multicollinearity (4.7%); internal validation (3.8%); external validation (8.5%). The documentation criteria were reported as follows: selection of independent variables (73%); coding of variables (9%); fitting procedures (49%); statistical program (65%). No significant differences in reporting quality were found among journals, or between general and subspecialty journals. We found the reporting of logistic regression in transplantation journals to be unsatisfactory. Because our findings may have major consequences for the care of transplant patients and for the design of transplant clinical trials, we recommend a practical solution for the use and reporting of logistic regression in transplantation journals. [source]

Support of Daily ECG Procedures in a Cardiology Department via the Integration of an Existing Clinical Database and a Commercial ECG Management System. ANNALS OF NONINVASIVE ELECTROCARDIOLOGY, Issue 3 2002. Franco Chiarugi, Dott. Background: In the context of HYGEIAnet, the regional health telematics network of Crete, a clinical cardiology database (CARDIS) has been installed in several hospitals. The large number of resting ECGs recorded daily made computerized support for the entire ECG procedure a priority. Methods: Starting in late 2000, ICS-FORTH and Mortara Instrument, Inc. collaborated to integrate the Mortara E-Scribe/NT ECG management system with CARDIS in order to support daily ECG procedures. CARDIS was extended to allow automatic ordering of daily ECGs via E-Scribe/NT.
The ECG order list is downloaded to the electrocardiographs and executed; the recorded ECGs are transmitted to E-Scribe/NT, where confirmed ECG records are linked back to CARDIS. A thorough testing period was used to identify and correct problems, and an ECG viewer/printer was extended to read ECG files in E-Scribe/NT format. Results: The integration of E-Scribe/NT and CARDIS, enabling automatic scheduling of ECG orders and immediate availability of confirmed ECG records for viewing and printing in the clinical database, took approximately 4 man-months. The performance of the system is highly satisfactory, and it is now ready for deployment in the hospital. Conclusions: Integration of a commercially available ECG management system with an existing clinical database can provide a rapid, practical solution that requires no major modifications to either software component. The success of this project makes us optimistic about extending CARDIS to support additional examination procedures such as digital coronary angiography and ultrasound examinations. A.N.E. 2002;7(3):263–270. [source]

High-Dimensional Cox Models: The Choice of Penalty as Part of the Model Building Process. BIOMETRICAL JOURNAL, Issue 1 2010. Axel Benner. Abstract: The Cox proportional hazards regression model is the most popular approach to modelling covariate information for survival times. In this context, the development of high-dimensional models, where the number of covariates is much larger than the number of observations, is an ongoing challenge. A practicable approach in such situations is ridge-penalized Cox regression. Besides finding the best prediction rule, one is often interested in determining a subset of covariates that are the most important for prognosis; this could be a gene set in the biostatistical analysis of microarray data. Covariate selection can then, for example, be done by L1-penalized Cox regression using the lasso (Tibshirani (1997),
Statistics in Medicine 16, 385–395). Several approaches beyond the lasso that incorporate covariate selection have been developed in recent years. These include modifications of the lasso as well as nonconvex variants such as smoothly clipped absolute deviation (SCAD) (Fan and Li (2001), Journal of the American Statistical Association 96, 1348–1360; Fan and Li (2002), The Annals of Statistics 30, 74–99). The purpose of this article is to implement them practically in the model building process when analyzing high-dimensional data with the Cox proportional hazards model. To evaluate penalized regression models beyond the lasso, we included SCAD variants and the adaptive lasso (Zou (2006), Journal of the American Statistical Association 101, 1418–1429), and we compare them with "standard" applications such as ridge regression, the lasso and the elastic net. Predictive accuracy, features of variable selection and estimation bias are studied to assess the practical use of these methods. We observed that the performance of SCAD and the adaptive lasso is highly dependent on nontrivial preselection procedures, and a practical solution to this problem does not yet exist. Since there is a high risk of missing relevant covariates when SCAD or the adaptive lasso is applied after an inappropriate initial selection step, we recommend staying with the lasso or the elastic net in actual data applications. But given the promising results for truly sparse models, we see some advantage for SCAD and the adaptive lasso if better preselection procedures become available. This requires further methodological research. [source]

The Log Multinomial Regression Model for Nominal Outcomes with More than Two Attributes. BIOMETRICAL JOURNAL, Issue 6 2007. L. Blizzard. Abstract: An estimate of the risk or prevalence ratio, adjusted for confounders, can be obtained from a log binomial model (binomial errors, log link) fitted to binary outcome data.
We propose a modification of the log binomial model to obtain relative risk estimates for nominal outcomes with more than two attributes (the "log multinomial model"). Extensive data simulations were undertaken to compare the performance of the log multinomial model with that of an expanded-data multinomial logistic regression method based on the approach proposed by Schouten et al. (1993) for binary data, and with that of separate fits of a Poisson regression model based on the approach proposed by Zou (2004) and Carter, Lipsitz and Tilley (2005) for binary data. Log multinomial regression resulted in "inadmissible" solutions (out-of-bounds probabilities) exceeding 50% in some data settings. Coefficient estimates from the alternative methods produced out-of-bounds probabilities for the log multinomial model in up to 27% of samples to which a log multinomial model had been successfully fitted. The log multinomial coefficient estimates generally had smaller relative bias and mean squared error than those of the alternative methods. The practical utility of the log multinomial regression model was demonstrated with a real data example. The log multinomial model offers a practical solution to the problem of obtaining adjusted estimates of the risk ratio in the multinomial setting, but must be used with some care and attention to detail. (© 2007 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source]

Solid-State Structures and Properties of Europium and Samarium Hydrides. EUROPEAN JOURNAL OF INORGANIC CHEMISTRY, Issue 18 2010. Holger Kohlmann. Abstract: The structural chemistry of europium and samarium hydrides in the solid state is very rich, ranging from typical ionic hydrides following the hydride-fluoride analogy to complex transition metal hydrides and interstitial hydrides.
While crystal structure, electrical and magnetic properties suggest that europium is divalent in all hydrides investigated so far, samarium is easily transformed to a trivalent oxidation state in its hydrides and shows similarities to other lanthanide(III) hydrides. The problem of neutron absorption by europium and samarium, which hampers crystal structure solution and limits the available structural information, is discussed in detail, and practical solutions for neutron diffraction experiments are given. [source]

Channel estimation and physical layer adaptation techniques for satellite networks exploiting adaptive coding and modulation. INTERNATIONAL JOURNAL OF SATELLITE COMMUNICATIONS AND NETWORKING, Issue 2 2008. Stefano Cioni. Abstract: The exploitation of adaptive coding and modulation (ACM) techniques for broadband multi-beam satellite communication networks operating at Ka-band and above has been shown, in theory, to provide large system capacity gains. In this paper, the problems of accurately estimating the time-variant channel and of adapting the physical layer while accounting for estimator errors and (large) satellite propagation delays are analyzed, and practical solutions for both the forward and the reverse link are proposed. A novel pragmatic solution to reverse-link physical layer channel estimation in the presence of time-variant bursty interference has been devised. Physical layer adaptation algorithms, together with design rules for hysteresis thresholds, have been analytically derived. The impact of imperfect physical layer channel estimation on overall system capacity has finally been derived by means of an original semi-analytical approach. Comprehensive system simulations for a realistic study case show that the devised adaptation algorithms are able to successfully track critical Ka-band fading time series with a limited impact on system capacity while satisfying the link outage probability requirement.
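The hysteresis thresholds mentioned in the satellite ACM abstract above exist to stop the physical layer from oscillating between modulation and coding schemes (ModCods) when the estimated SNR hovers near a switching boundary. The toy controller below illustrates only the general idea; the threshold values and margin are invented for illustration and are not taken from the paper. A switch up requires the SNR estimate to clear the next threshold plus a margin, while a switch down happens as soon as the current scheme is no longer supported.

```python
# Toy adaptive coding and modulation (ACM) controller with hysteresis.
# Thresholds (dB) and margin are illustrative, not from the cited study.
THRESHOLDS_DB = [0.0, 3.0, 6.5, 10.0]   # minimum SNR for ModCod 0..3
HYSTERESIS_DB = 0.5                     # extra margin required to switch up

def next_modcod(current: int, snr_db: float) -> int:
    """Return the ModCod index to use, given the current index and the
    latest SNR estimate."""
    mc = current
    # Step down immediately while the current scheme is unsupported.
    while mc > 0 and snr_db < THRESHOLDS_DB[mc]:
        mc -= 1
    # Step up only if the SNR clears the next threshold plus the margin.
    while mc + 1 < len(THRESHOLDS_DB) and snr_db >= THRESHOLDS_DB[mc + 1] + HYSTERESIS_DB:
        mc += 1
    return mc

# A fading trace hovering around the 6.5 dB boundary: without the margin,
# the link would flip between ModCods 1 and 2 on nearly every estimate.
mc = 0
for snr in [7.2, 6.6, 6.4, 6.9, 7.1]:
    mc = next_modcod(mc, snr)
    print(f"SNR {snr:.1f} dB -> ModCod {mc}")
```

The asymmetry (margin on the way up, none on the way down) is a common design choice: it trades a little spectral efficiency for robustness against estimation noise and the propagation delay of the feedback loop.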
Copyright © 2008 John Wiley & Sons, Ltd. [source]

Outcomes of the International Union of Crystallography Commission on Powder Diffraction Round Robin on Quantitative Phase Analysis: samples 2, 3, 4, synthetic bauxite, natural granodiorite and pharmaceuticals. JOURNAL OF APPLIED CRYSTALLOGRAPHY, Issue 4 2002. Nicola V. Y. Scarlett. The International Union of Crystallography (IUCr) Commission on Powder Diffraction (CPD) has sponsored a round robin on the determination of quantitative phase abundance from diffraction data. The aims of the round robin have been detailed by Madsen et al. [J. Appl. Cryst. (2001), 34, 409–426]. In summary, they were (i) to document the methods and strategies commonly employed in quantitative phase analysis (QPA), especially those involving powder diffraction, (ii) to assess levels of accuracy, precision and lower limits of detection, (iii) to identify specific problem areas and develop practical solutions, (iv) to formulate recommended procedures for QPA using diffraction data, and (v) to create a standard set of samples for future reference. The first paper (Madsen et al., 2001) covered the results for sample 1 (a simple three-phase mixture of corundum, fluorite and zincite). The remaining samples used in the round robin covered a wide range of analytical complexity and presented a series of different problems to the analysts, including preferred orientation (sample 2), the analysis of amorphous content (sample 3), microabsorption (sample 4), complex synthetic and natural mineral suites, and pharmaceutical mixtures with and without an amorphous component. This paper forms the second part of the round-robin study and reports the results for samples 2 (corundum, fluorite, zincite, brucite), 3 (corundum, fluorite, zincite, silica flour) and 4 (corundum, magnetite, zircon), synthetic bauxite, natural granodiorite and the synthetic pharmaceutical mixtures (mannitol, nizatidine, valine, sucrose, starch).
The outcomes of this second part of the round robin support the findings of the initial study. The presence of increased analytical problems within these samples has only served to exacerbate the difficulties experienced by many operators with the sample 1 suite. The major difficulties are caused by lack of operator expertise, which becomes more apparent with these more complex samples. Some of these samples also introduced the requirement for skill and judgement in sample preparation techniques. This second part of the round robin concluded that the greatest physical obstacle to accurate QPA for X-ray-based methods is the presence of absorption contrast between phases (microabsorption), which may prove to be insurmountable in some circumstances. [source] Outcomes of the International Union of Crystallography Commission on Powder Diffraction Round Robin on Quantitative Phase Analysis: samples 1a to 1h. JOURNAL OF APPLIED CRYSTALLOGRAPHY, Issue 4 2001, Ian C. Madsen. The International Union of Crystallography (IUCr) Commission on Powder Diffraction (CPD) has sponsored a round robin on the determination of quantitative phase abundance from diffraction data. Specifically, the aims of the round robin were (i) to document the methods and strategies commonly employed in quantitative phase analysis (QPA), especially those involving powder diffraction, (ii) to assess levels of accuracy, precision and lower limits of detection, (iii) to identify specific problem areas and develop practical solutions, (iv) to formulate recommended procedures for QPA using diffraction data, and (v) to create a standard set of samples for future reference. Some of the analytical issues which have been addressed include (a) the type of analysis (integrated intensities or full-profile, Rietveld or full-profile, database of observed patterns) and (b) the type of instrument used, including geometry and radiation (X-ray, neutron or synchrotron).
While the samples used in the round robin covered a wide range of analytical complexity, this paper reports the results for only the sample 1 mixtures. Sample 1 is a simple three-phase system prepared with eight different compositions covering a wide range of abundance for each phase. The component phases were chosen to minimize sample-related problems, such as the degree of crystallinity, preferred orientation and microabsorption. However, these were still issues that needed to be addressed by the analysts. The results returned indicate a great deal of variation in the ability of the participating laboratories to perform QPA of this simple three-component system. These differences result from such problems as (i) use of unsuitable reference intensity ratios, (ii) errors in whole-pattern refinement software operation and in interpretation of results, (iii) operator errors in the use of the Rietveld method, often arising from a lack of crystallographic understanding, and (iv) application of excessive microabsorption correction. Another major area for concern is the calculation of errors in phase abundance determination, with wide variations in reported values between participants. Few details of methodology used to derive these errors were supplied and many participants provided no measure of error at all. [source] Understanding and Preventing the Edge Effect. JOURNAL OF INTERVENTIONAL CARDIOLOGY, Issue 1 2003, EDOUARD CHENEAU M.D. Edge stenosis, combining neointimal proliferation and negative remodeling, remains a serious limitation of vascular brachytherapy. This review comprehensively presents terminology, definitions, mechanisms, and treatment strategies to better understand the complexities of edge narrowing. The major contributors to this phenomenon are known; understanding the practical solutions will enable us to further minimize the problem of the edge effect.
(J Interven Cardiol 2003;16:1–7) [source] Supplementary prescribing: potential ways to reform hospital psychiatric care. JOURNAL OF PSYCHIATRIC & MENTAL HEALTH NURSING, Issue 2 2006, A. JONES PhD RN. The objective of this study was to explore perceptions held by nurses and psychiatrists towards the potential application of supplementary prescribing on acute psychiatric wards. Six focus groups were conducted with 19 nurses and seven psychiatrists who worked on three wards. Two major themes were identified: first, ways in which patients could receive care and treatment through supplementary prescribing and in new forms of partnership; and second, ways by which nurses and psychiatrists could be organized to deliver their care through a supplementary prescribing framework. Nurses and psychiatrists were generally positive about the advent of prescribing, offered positive views on how patient care could be improved, and expressed a general willingness for nurses to adapt and work differently. Findings from this exploratory study offer practical solutions to how supplementary prescribing could work on acute psychiatric wards. [source] Conservation of the Eurasian beaver Castor fiber: an olfactory perspective. MAMMAL REVIEW, Issue 4 2010, Róisín CAMPBELL-PALMER. ABSTRACT 1. Chemical communication in mammals includes an array of specific behaviours that are often ignored in terms of their potential relevance to conservation. Often used during territorial or social interactions between animals, chemical communication can also be used as a tool in reintroduction programmes. Reintroductions still exhibit high failure rates and methods to improve success should be investigated. The Eurasian beaver Castor fiber has been widely reintroduced across Europe after its near extinction in the 19th century.
2. Using olfactory studies in the beaver, we aim to demonstrate how scent transfers a range of information about the sender which can be used to monitor social and territorial behaviour along with general well-being. Scent manipulation can be used to reduce human–beaver conflicts, and aid reintroduction success through reducing stress and territorial conflicts, and by influencing dispersal and settlement. 3. Two species of beavers, the Eurasian beaver and the North American beaver Castor canadensis, occupy freshwater habitats throughout North America and in parts of South America, most of Europe and parts of Asia. Most of the reviewed literature concerns the wild Eurasian beaver, its chemical communication and conservation; however, captive studies and those addressing North American beavers are also included. 4. Chemical communication is advanced and has been well documented in this highly territorial species. However, few studies directly link olfaction with conservation practices. 5. Olfactory studies in beavers can provide non-invasive methods to monitor translocated animals and indicators of health. We conclude that chemical analysis, olfactory studies and behavioural manipulations involving semiochemicals have important impacts on conservation and can generate practical solutions to conservation problems including aiding animal capture, captive stress reduction, breeding pair formation and release site fidelity. [source] Challenges and Strategies Related to Hearing Loss Among Dairy Farmers. THE JOURNAL OF RURAL HEALTH, Issue 4 2005, Louise Hass-Slavin MSc. ABSTRACT: Context: Farming is often imagined to be a serene and idyllic business based on historical images of a man, a horse, and a plow. However, machinery and equipment on farms, such as older tractors, grain dryers, and vacuum pumps, can have noise levels which may be dangerous to hearing with prolonged, unprotected exposure.
Purpose: This qualitative study in Ontario, Canada, explored the challenges and coping strategies experienced by dairy farmers with self-reported hearing loss and communication difficulties. Through in-depth interviews, 13 farmers who experience significant hearing loss were questioned about the challenges they face as a result of hearing loss and the strategies they use to overcome or compensate for problems. Findings: The 2 major challenges encountered by dairy farmers with a hearing loss were: (1) obtaining information from individuals, within groups, and through electronic media; and (2) working with animals, machinery, and noise. To cope with these challenges, participants used strategies identified as problem- and emotion-focused. Conclusions: Four themes arose from analysis of the challenges encountered and strategies used: (1) Hearing loss is experienced as a "familiar," but "private," problem for dairy farmers. (2) Communication difficulties can negatively affect the quality of relationships on the farm. (3) Safety and risk management are issues when farming with a hearing loss. (4) The management or control of excessive noise is a complex problem, because there are no completely reliable yet practical solutions.
Conclusion: Intermediate endpoints are respected in other public health areas: reductions in risk factors such as high blood cholesterol or smoking are acceptable study endpoints for research aimed at reducing heart disease or lung cancer. Likewise, practical endpoints can be valuable in studies investigating interventions to reduce identified and potential intermediate risk factors for obesity, such as soft drink consumption. Implications: Reduced obesity is the global aim, but obesity is not caused by one exposure and will not be solved by a single-modality intervention. A wider debate about endpoint selection may assist research which identifies individual building blocks of obesity prevention in the same way as individual gene mapping contributed to the Human Genome Project. [source] Nucleation and Expansion During Extrusion and Microwave Heating of Cereal Foods. COMPREHENSIVE REVIEWS IN FOOD SCIENCE AND FOOD SAFETY, Issue 4 2003, C.I. Moraru. ABSTRACT Expansion of biopolymer matrices is the basis for the production of a wide variety of cereal foods. A limited number of manufacturing processes provide practical solutions for the development of an impressive variety of expanded products, just by changing process variables. It is therefore essential that the mechanisms involved in expansion are well known and controlled. This paper summarizes the knowledge of nucleation and expansion in extruded and microwaved products available to date. The effects of processing conditions and the properties of the biopolymeric matrix on nucleation and expansion are discussed. Moisture content enables the glassy polymeric matrix to turn into a rubbery state at process temperatures, which allows superheated steam bubbles to form at nuclei and then expand, expansion being governed by the biaxial extensional viscosity of the matrix. Nucleation and expansion theories are presented along with quantitative data that support them.