Practical Tool

Selected Abstracts


Variation Mode and Effect Analysis: a Practical Tool for Quality Improvement

QUALITY AND RELIABILITY ENGINEERING INTERNATIONAL, Issue 8 2006
Per Johansson
Abstract This paper describes a statistically based engineering method, variation mode and effect analysis (VMEA), that facilitates an understanding of variation and highlights the product/process areas in which improvement efforts should be targeted. An industrial application is also described to illustrate how the VMEA can be used for quality improvement purposes. Copyright © 2006 John Wiley & Sons, Ltd. [source]
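The core quantitative step in a variation mode and effect analysis can be illustrated with a hedged sketch: variation in noise factors is propagated through sensitivities to rank where improvement effort should be targeted. The factor names, sensitivities, and standard deviations below are invented for illustration, not taken from the paper.

```python
# Hedged sketch of VMEA-style variation bookkeeping: each noise factor's
# contribution to the variance of a product characteristic is approximated
# as (sensitivity * sigma)^2, and the resulting shares rank improvement targets.
def variation_contributions(factors):
    """factors: name -> (sensitivity, std_dev). Returns name -> share of total variance."""
    contrib = {name: (s * sd) ** 2 for name, (s, sd) in factors.items()}
    total = sum(contrib.values())
    return {name: round(v / total, 2) for name, v in contrib.items()}

# Invented example: three noise factors acting on a critical product dimension
factors = {
    "material_batch": (1.5, 0.20),   # (sensitivity, standard deviation)
    "tool_wear": (0.8, 0.50),
    "temperature": (0.3, 0.10),
}
print(variation_contributions(factors))
# → {'material_batch': 0.36, 'tool_wear': 0.64, 'temperature': 0.0}
```

Here tool wear dominates the transmitted variation, so it would be the first target for improvement effort.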


Data Envelopment Analysis: A Practical Tool to Measure Performance

AUSTRALIAN ACCOUNTING REVIEW, Issue 2 2010
Paul Rouse
This paper describes a productivity method, data envelopment analysis (DEA), and how it can be used to measure performance using multiple performance measures. DEA compares organisations or parts of organisations that share common goals and use similar resources to produce similar products, and calculates the technical efficiency with which firms convert bundles of inputs into bundles of outputs. DEA has been used in both public and private settings, and the paper describes some of its applications within Australasia. A case study of New Zealand dairy farms is used to demonstrate the benchmarking capability of DEA. While built upon solid theoretical foundations, DEA is essentially a practical tool that can be used by academics for research as well as by managers and practitioners for improved performance measurement and accountability. [source]
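The technical-efficiency calculation the abstract describes can be sketched as a linear program. This is a minimal input-oriented CCR DEA model under constant returns to scale (a standard formulation, not code from the paper); the three "farms" and their input/output figures are invented.

```python
# Illustrative sketch: input-oriented CCR DEA efficiency for one
# decision-making unit (DMU), solved as min theta s.t. X'lam <= theta*x_k,
# Y'lam >= y_k, lam >= 0.
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, k):
    """Technical efficiency of DMU k. X: (n_dmus, n_inputs), Y: (n_dmus, n_outputs)."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.zeros(n + 1)
    c[0] = 1.0                                     # minimise theta
    A_in = np.hstack([-X[k].reshape(m, 1), X.T])   # inputs: X'lam - theta*x_k <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])    # outputs: Y'lam >= y_k
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([np.zeros(m), -Y[k]]),
                  bounds=[(0, None)] * (n + 1))
    return res.fun

# Three hypothetical dairy farms: one input (labour), one output (milk)
X = np.array([[2.0], [4.0], [4.0]])
Y = np.array([[1.0], [2.0], [1.0]])
print([round(ccr_efficiency(X, Y, k), 2) for k in range(3)])  # → [1.0, 1.0, 0.5]
```

Farm 3 produces the same output as farm 1 from twice the input, so DEA scores it 0.5 relative to the efficient frontier formed by the other two.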


Pose Controlled Physically Based Motion

COMPUTER GRAPHICS FORUM, Issue 4 2006
Raanan Fattal
Abstract In this paper we describe a new method for generating and controlling physically-based motion of complex articulated characters. Our goal is to create motion from scratch, where the animator provides a small amount of input and gets in return a highly detailed and physically plausible motion. Our method relieves the animator from the burden of enforcing physical plausibility, but at the same time provides full control over the internal DOFs of the articulated character via a familiar interface. Control over the global DOFs is also provided by supporting kinematic constraints. Unconstrained portions of the motion are generated in real time, since the character is driven by joint torques generated by simple feedback controllers. Although kinematic constraints are satisfied using an iterative search (shooting), this process is typically inexpensive, since it only adjusts a few DOFs at a few time instances. The low expense of the optimization, combined with the ability to generate unconstrained motions in real time yields an efficient and practical tool, which is particularly attractive for high inertia motions with a relatively small number of kinematic constraints. [source]
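The "simple feedback controllers" that drive the character's joints are, in the common formulation, proportional-derivative (PD) controllers. The sketch below shows one such controller steering a single joint toward a target pose; the unit-inertia model and the gains are illustrative assumptions, not values from the paper.

```python
# Hedged sketch: a PD feedback controller emitting a joint torque that
# drives a single joint angle toward a target pose (1-DOF, unit inertia).
def pd_torque(theta, theta_dot, target, kp=50.0, kd=8.0):
    """Proportional term pulls toward the target; derivative term damps motion."""
    return kp * (target - theta) - kd * theta_dot

# Integrate the joint with semi-implicit Euler for 5 simulated seconds
theta, theta_dot, dt = 0.0, 0.0, 0.001
for _ in range(5000):
    tau = pd_torque(theta, theta_dot, target=1.0)
    theta_dot += tau * dt
    theta += theta_dot * dt
print(round(theta, 2))  # → 1.0
```

Because the torque comes from local feedback rather than global optimization, unconstrained portions of the motion can be generated in real time, as the abstract notes.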


Decentralized Parametric Damage Detection Based on Neural Networks

COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, Issue 3 2002
Zhishen Wu
In this paper, based on the concept of decentralized information structures and artificial neural networks, a decentralized parametric identification method for damage detection of structures with multiple degrees of freedom (MDOF) is presented. First, a decentralized approach for damage detection in the substructures of an MDOF structural system using neural networks is described. The displacement and velocity measurements from a substructure of a healthy structural system, together with the restoring force corresponding to that substructure, are used to train the decentralized detection neural networks to identify the corresponding substructure. Using the trained detection networks, the difference in interstory restoring force between the damaged and undamaged substructures can be calculated. An evaluation index, the relative root mean square (RRMS) error, is introduced to evaluate the condition of each substructure for health monitoring purposes. Although neural networks have been widely used for nonparametric identification, in this paper the decentralized evaluation neural networks for substructures are trained for parametric identification. Based on the trained decentralized evaluation networks and the RRMS error of the substructures, the stiffness parameter of each subsystem can be forecast with high accuracy. The effectiveness of the decentralized parametric identification is evaluated through numerical simulations, which show that the method has the potential to become a practical damage detection tool for smart civil structures whose structural parameters are unknown. [source]
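The RRMS evaluation index used above can be sketched as follows. One plausible definition is the root mean square of the prediction error normalized by the root mean square of the measured signal; the paper's exact normalization may differ, and the test signals are invented.

```python
# Hedged sketch: relative root mean square (RRMS) error between a
# network-predicted restoring force and the measured one.
import numpy as np

def rrms(predicted, measured):
    """RMS of the error, normalized by the RMS of the measured signal."""
    predicted = np.asarray(predicted, dtype=float)
    measured = np.asarray(measured, dtype=float)
    return np.sqrt(np.mean((predicted - measured) ** 2)) / np.sqrt(np.mean(measured ** 2))

# Identical signals give 0; a 10%-scaled prediction gives a 10% RRMS error.
t = np.linspace(0, 1, 200)
f = np.sin(2 * np.pi * 3 * t)
print(round(rrms(f, f), 3), round(rrms(1.1 * f, f), 3))  # → 0.0 0.1
```

A substructure whose RRMS error rises well above the healthy baseline would be flagged as a candidate damage location.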


IS A NEW AND GENERAL THEORY OF MOLECULAR SYSTEMATICS EMERGING?

EVOLUTION, Issue 1 2009
Scott V. Edwards
The advent and maturation of algorithms for estimating species trees (phylogenetic trees that allow gene tree heterogeneity and whose tips represent lineages, populations and species, as opposed to genes) represent an exciting confluence of phylogenetics, phylogeography, and population genetics, and usher in a new generation of concepts and challenges for the molecular systematist. In this essay I argue that to better deal with the large multilocus datasets brought on by phylogenomics, and to better align the fields of phylogeography and phylogenetics, we should embrace the primacy of species trees, not only as a new and useful practical tool for systematics, but also as a long-standing conceptual goal of systematics that, largely due to the lack of appropriate computational tools, has been eclipsed in the past few decades. I suggest that phylogenies as gene trees are a "local optimum" for systematics, and review recent advances that will bring us to the broader optimum inherent in species trees. In addition to adopting new methods of phylogenetic analysis (and ideally reserving the term "phylogeny" for species trees rather than gene trees), the new paradigm suggests shifts in a number of practices, such as sampling data to maximize not only the number of accumulated sites but also the number of independently segregating genes; routinely using coalescent or other models in computer simulations to allow gene tree heterogeneity; and understanding better the role of concatenation in influencing topologies and confidence in phylogenies. By building on the foundation laid by concepts of gene trees and coalescent theory, and by taking cues from recent trends in multilocus phylogeography, molecular systematics stands to be enriched.
Many of the challenges and lessons learned for estimating gene trees will carry over to the challenge of estimating species trees, although adopting the species tree paradigm will clarify many issues (such as the nature of polytomies and the star tree paradox), raise conceptually new challenges, or provide new answers to old questions. [source]


Application of a one-dimensional thermal flame spread model on predicting the rate of heat release in the SBI test

FIRE AND MATERIALS, Issue 2 2001
Tuula Hakkarainen
A one-dimensional thermal flame spread model was applied to predict the rate of heat release in the single burning item (SBI) test on the basis of the cone calorimeter data. The input parameters were selected according to the features of the SBI test and using particle board as a model tuning material. The features of the measured and calculated rate of heat release curves were compared for a series of 33 building products. The fire growth rate (FIGRA) indices were calculated to predict the classification in the forthcoming Euroclass system. The model gave correct classification for 90% of the products studied. An essential feature of the model is that only one cone calorimeter test at the exposure level of 50 kW m⁻² is needed. The model, therefore, provides a practical tool for product development and quality control. Copyright © 2001 John Wiley & Sons, Ltd. [source]
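The FIGRA index used for classification above is, in simplified form, the maximum of the heat release rate divided by elapsed time. The sketch below omits the threshold conditions and running averages of the full SBI standard, and the HRR curve is invented.

```python
# Hedged sketch: simplified FIGRA index, max(HRR/t), converted to W/s.
def figra(times_s, hrr_kw):
    """times_s: seconds from ignition; hrr_kw: heat release rate in kW."""
    return max(1000.0 * q / t for t, q in zip(times_s, hrr_kw) if t > 0)

# Hypothetical HRR curve peaking at 60 kW after 120 s
times = [30, 60, 120, 300, 600]
hrr = [5.0, 20.0, 60.0, 40.0, 10.0]
print(figra(times, hrr))  # → 500.0
```

A fast, early peak in heat release drives the index up, which is why FIGRA discriminates well between Euroclass bands.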


A methodological framework of preparing economic evidence for selection of medicines in the Chinese setting

JOURNAL OF EVIDENCE BASED MEDICINE, Issue 3 2010
Xin Sun
Medicines are becoming a major component of health expenditure in China. Selection of effective and cost-effective medicines represents an important effort to improve medicines use. A guideline on cost-effectiveness studies is available in China; this guideline, however, has not served as a practical tool for preparing and critically appraising economic evidence. This article discusses, in the Chinese context, the approach to integrating an economic component into medicines selection, and elaborates the methods of producing economic evidence, including conducting economic reviews and primary economic studies. [source]


Sequential immunohistochemical p53 expression in biopsies of oral lichen planus undergoing malignant evolution

JOURNAL OF ORAL PATHOLOGY & MEDICINE, Issue 3 2001
Guido Valente
Abstract: Transformation into squamous cell carcinoma (SCC) may occur in a small percentage of patients affected by oral lichen planus (OLP), but the pathogenesis remains to be elucidated. Overexpression of p53 protein was investigated immunohistochemically in 28 cases of OLP, followed up by sequential biopsies for up to 96 months. In 15 cases (Group 1), no dysplastic changes or neoplastic transformation occurred during the follow-up period; in 7 cases, OLP and SCC were synchronously observed (Group 2), whereas in another 6 cases (Group 3) SCC developed several months or years after diagnosis of OLP. The percentage of p53-positive epithelial cells at first diagnosis was significantly higher in the cases of Groups 2 and 3 than in those of Group 1. In contrast, evaluation of growth fraction by MIB-1 monoclonal antibody did not show any statistical differences among the three groups. Although no conclusions can be drawn about the molecular pathway leading to neoplastic transformation of OLP, or about the role of p53, the results indicate that immunohistochemical evaluation of p53 expression may be a practical tool to select cases of OLP with a high risk of neoplastic transformation. [source]


Imaging pharmaceutical tablets with optical coherence tomography

JOURNAL OF PHARMACEUTICAL SCIENCES, Issue 1 2010
Jakob M.A. Mauritz
Abstract Optical coherence tomography (OCT) is a recently developed optical technique that produces depth profiles of three-dimensional objects. It is a nondestructive interferometric method responding to refractive index variation in the sample under study and can reach a penetration depth of a few millimetres. OCT employs near-infrared (NIR) light and therefore provides a link between NIR spectroscopy and Terahertz (THz) measurements that are often used to characterise tablets. In this article we assess the potential of OCT as a reliable and practical tool in the analysis of pharmaceutical tablets and coatings. A variety of tablets were tested with different shapes, formulations and coatings. We consider the origins of contrast in the obtained images and demonstrate that it correlates strongly with the expected tablet structure. The influence of absorption and scattering is considered for the wavelength ranges used. The results show that OCT is a promising diagnostic tool with an important role to play in tablet and coating technologies. The high measurement speed of OCT and its relative ease of implementation also make it an attractive candidate technology for in-line quality control during manufacturing. © 2009 Wiley-Liss, Inc. and the American Pharmacists Association J Pharm Sci 99:385-391, 2010 [source]


Developing clinical skills: a simple and practical tool

MEDICAL EDUCATION, Issue 11 2004
Ben Lawton
No abstract is available for this article. [source]


Simple Measures to Monitor β-Cell Mass and Assess Islet Graft Dysfunction

AMERICAN JOURNAL OF TRANSPLANTATION, Issue 2 2007
R. N. Faradji
The aim of this study was to develop a simple test for the assessment of islet graft dysfunction based on measures involving fasting C-peptide. Calculations were made to account for the dependence of C-peptide secretion on glucose concentration (C-peptide/glucose ratio [CP/G]) and adjusted for renal function by calculating the C-peptide/glucose-creatinine ratio (CP/GCr). Values from 22 recipients were analyzed at different times post-last islet infusion. Receiver operating characteristic curves were used to determine which of these measures best predicts high 90-minute glucose (90 min-Glc; >10 mmol/L) after a Mixed Meal Tolerance Test (MMTT). In this initial analysis, CP/G was found to be superior predicting high 90 min-Glc with a larger area under the ROC curve than C-peptide (p = 0.01) and CP/GCr (p = 0.06). We then correlated C-peptide and CP/G with islet equivalents (IEQ/kg) infused, 90 min-Glc after MMTT and clinical outcome (β-score). C-peptide and CP/G in the first 3 months post-last islet infusion correlated with IEQ/kg infused. CP/G correlated with 90 min-Glc and β-score. C-peptide and CP/G are good indicators of islet mass transplanted. CP/G is more indicative of graft dysfunction and clinical outcome than C-peptide alone. The ease of calculation and the good correlation with other tests makes this ratio a practical tool when monitoring and managing islet transplant recipients. [source]
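The ROC comparison the abstract describes can be sketched with the rank-sum identity for the area under the ROC curve. The CP/G values and outcome labels below are invented; note that a *low* ratio predicts dysfunction, so the score is negated before ranking.

```python
# Hedged sketch: comparing candidate measures by area under the ROC curve,
# with AUC computed via the Mann-Whitney rank-sum identity. Data are invented.
def auc(scores, labels):
    """AUC = fraction of (positive, negative) pairs the score ranks correctly."""
    pos = [s for s, lab in zip(scores, labels) if lab]
    neg = [s for s, lab in zip(scores, labels) if not lab]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

cp_g = [0.05, 0.08, 0.20, 0.25, 0.30]   # invented fasting CP/G ratios
high_glc = [1, 1, 0, 0, 0]              # 90 min-Glc > 10 mmol/L after MMTT
print(auc([-x for x in cp_g], high_glc))  # low CP/G predicts dysfunction → 1.0
```

A larger AUC for CP/G than for C-peptide alone is exactly the comparison the study used to conclude that the ratio is the better predictor.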


A new practical tool for deriving a functional signature for herbaceous vegetation

APPLIED VEGETATION SCIENCE, Issue 2 2004
R. Hunt
Abstract Hypothesis: For any one time and place a 'functional signature' can be derived for a sample of herbaceous vegetation in a way that concisely represents the balance between the different clusters of functional attributes that are present among component species. Methods: We developed a spreadsheet-based tool for calculating functional signatures within the context of the C-S-R system of plant functional types. We used the tool to calculate and compare signatures for specimen British vegetation samples which differed in management regime and location in time. Conclusion: The integrative power of the 'C-S-R signature' is useful in comparative studies involving widely differing samples. Movements in the signature can be used to indicate degree of resistance, resilience, eutrophication and dereliction. Systems of plant functional types other than C-S-R might also be approached in this way. Availability: The tool can be downloaded free of charge from the first author's web pages or from the journal's electronic archive. [source]
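One simple way to aggregate such a signature is as the abundance-weighted mean of each species' (C, S, R) coordinates; the published tool's exact algorithm may differ, and the sample below is invented.

```python
# Hedged sketch: a 'C-S-R signature' as the abundance-weighted mean of the
# component species' (C, S, R) coordinates, each summing to 1 per species.
def csr_signature(species):
    """species: list of (abundance, (C, S, R)) tuples. Returns the sample signature."""
    total = sum(a for a, _ in species)
    return tuple(round(sum(a * coords[i] for a, coords in species) / total, 2)
                 for i in range(3))

sample = [(60, (0.2, 0.7, 0.1)),   # dominant stress-tolerator
          (30, (0.6, 0.2, 0.2)),   # competitor
          (10, (0.1, 0.1, 0.8))]   # ruderal
print(csr_signature(sample))  # → (0.31, 0.49, 0.2)
```

Tracking how this triple moves over time is how shifts toward, say, eutrophication (rising C) or dereliction would be read off the signature.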


Re-constructing the urban landscape through community mapping: an attractive prospect for sustainability?

AREA, Issue 2 2009
Frances Fahy
Community mapping is a relatively new tool with considerable potential in giving practical effect at the local level to sustainable development rhetoric. As a repository of socially constructed knowledge, it has considerable value in democratizing information both in terms of what is recorded and public access to it, in a manner that facilitates more meaningful participation of non-experts in planning and advocacy processes. Focusing on a community mapping project in Galway, Ireland, this research paper explores how the city's municipal authority is employing community mapping not just to record and promote the city's social, environmental, economic and cultural assets but also as a practical tool to bolster public participation in policy-making and to improve local communities' trust in the municipal authority, thereby shaping sustainability practices through enhanced governance. [source]


Prospects for de novo phasing with de novo protein models

ACTA CRYSTALLOGRAPHICA SECTION D, Issue 2 2009
Rhiju Das
The prospect of phasing diffraction data sets `de novo' for proteins with previously unseen folds is appealing but largely untested. In a first systematic exploration of phasing with Rosetta de novo models, it is shown that all-atom refinement of coarse-grained models significantly improves both the model quality and performance in molecular replacement with the Phaser software. Fifteen new cases of diffraction data sets that are unambiguously phased with de novo models are presented. These diffraction data sets represent nine space groups and span a large range of solvent contents (33-79%) and asymmetric unit copy numbers (1-4). No correlation is observed between the ease of phasing and the solvent content or asymmetric unit copy number. Instead, a weak correlation is found with the length of the modeled protein: larger proteins required somewhat less accurate models to give successful molecular replacement. Overall, the results of this survey suggest that de novo models can phase diffraction data for approximately one sixth of proteins with sizes of 100 residues or less. However, for many of these cases, `de novo phasing with de novo models' requires significant investment of computational power, much greater than 10³ CPU days per target. Improvements in conformational search methods will be necessary if molecular replacement with de novo models is to become a practical tool for targets without homology to previously solved protein structures. [source]


Going soft and SAD with manganese

ACTA CRYSTALLOGRAPHICA SECTION D, Issue 1 2005
Paula S. Salgado
SAD phasing has been revisited recently, with experiments being carried out using previously unconventional sources of anomalous signal, particularly lighter atoms and softer X-rays. A case study is reported using the 75 kDa RNA-dependent RNA polymerase of bacteriophage φ6, which binds an Mn atom and crystallizes with three molecules in the asymmetric unit. X-ray diffraction data were collected at a wavelength of 1.89 Å and although the calculated anomalous signal from the three Mn atoms was only 1.2%, SHELXD and SOLVE were able to locate these atoms. SOLVE/RESOLVE used this information to obtain SAD phases and automatically build a model for the core region of the protein, which possessed the characteristic features of the right-hand polymerase motif. These results demonstrate that with modern synchrotron beamlines and software, manganese phasing is a practical tool for solving the structure of large proteins. [source]


TOWARD CASE-BASED REASONING FOR DIABETES MANAGEMENT: A PRELIMINARY CLINICAL STUDY AND DECISION SUPPORT SYSTEM PROTOTYPE

COMPUTATIONAL INTELLIGENCE, Issue 3 2009
Cindy Marling
This paper presents a case-based decision support system prototype to assist patients with Type 1 diabetes on insulin pump therapy. These patients must vigilantly maintain blood glucose levels within prescribed target ranges to prevent serious disease complications, including blindness, neuropathy, and heart failure. Case-based reasoning (CBR) was selected for this domain because (a) existing guidelines for managing diabetes are general and must be tailored to individual patient needs; (b) physical and lifestyle factors combine to influence blood glucose levels; and (c) CBR has been successfully applied to the management of other long-term medical conditions. An institutional review board (IRB) approved preliminary clinical study, involving 20 patients, was conducted to assess the feasibility of providing case-based decision support for these patients. Fifty cases were compiled in a case library, situation assessment routines were encoded to detect common problems in blood glucose control, and retrieval metrics were developed to find the most relevant past cases for solving current problems. Preliminary results encourage continued research and work toward development of a practical tool for patients. [source]
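Case retrieval of the kind the abstract mentions is commonly implemented as weighted nearest-neighbour matching. The sketch below uses that standard CBR technique; the features, weights, and case library are invented, not taken from the study.

```python
# Hedged sketch: weighted nearest-neighbour case retrieval. Features are
# normalized to [0, 1]; similarity is a weighted average of per-feature closeness.
def similarity(problem, case, weights):
    """Return a similarity score in [0, 1] between a problem and a stored case."""
    total = sum(weights.values())
    score = sum(w * (1.0 - abs(problem[f] - case[f])) for f, w in weights.items())
    return score / total

weights = {"glucose_trend": 2.0, "carb_intake": 1.0, "exercise": 1.0}  # invented importances
library = {
    "post_meal_spike": {"glucose_trend": 0.9, "carb_intake": 0.8, "exercise": 0.1},
    "exercise_low":    {"glucose_trend": 0.2, "carb_intake": 0.3, "exercise": 0.9},
}
problem = {"glucose_trend": 0.85, "carb_intake": 0.7, "exercise": 0.2}
best = max(library, key=lambda name: similarity(problem, library[name], weights))
print(best)  # → post_meal_spike
```

The most similar past case would then supply the candidate solution to adapt, which is the usual retrieve-reuse step of the CBR cycle.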


On Constraining Pilot Point Calibration with Regularization in PEST

GROUND WATER, Issue 6 2009
Michael N. Fienen
Ground water model calibration has made great advances in recent years with practical tools such as PEST being instrumental for making the latest techniques available to practitioners. As models and calibration tools get more sophisticated, however, the power of these tools can be misapplied, resulting in poor parameter estimates and/or nonoptimally calibrated models that do not suit their intended purpose. Here, we focus on an increasingly common technique for calibrating highly parameterized numerical models: pilot point parameterization with Tikhonov regularization. Pilot points are a popular method for spatially parameterizing complex hydrogeologic systems; however, the additional flexibility offered by pilot points can become problematic if not constrained by Tikhonov regularization. The objective of this work is to explain and illustrate the specific roles played by control variables in the PEST software for Tikhonov regularization applied to pilot points. A recent study encountered difficulties implementing this approach, but through examination of that analysis, insight into underlying sources of potential misapplication can be gained and some guidelines for overcoming them developed. [source]
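The stabilizing effect of Tikhonov regularization can be illustrated with a generic least-squares sketch (not PEST itself): a penalty pulls the estimated parameters toward preferred values, making an otherwise under-determined pilot-point fit well-posed. The toy Jacobian, data, and prior below are invented.

```python
# Illustrative sketch: Tikhonov regularization adds lam^2 * ||p - p0||^2 to
# the data misfit ||J p - d||^2, pulling parameters toward preferred values p0.
import numpy as np

def tikhonov_solve(J, d, p0, lam):
    """Closed-form minimiser of ||J p - d||^2 + lam^2 ||p - p0||^2."""
    n = J.shape[1]
    A = J.T @ J + lam**2 * np.eye(n)   # regularization makes A invertible
    b = J.T @ d + lam**2 * p0
    return np.linalg.solve(A, b)

# Under-determined toy problem: 2 observations, 3 parameters
J = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0]])
d = np.array([2.0, 3.0])
p0 = np.ones(3)                        # preferred (prior) parameter values
p = tikhonov_solve(J, d, p0, lam=0.1)
print(np.round(J @ p, 2))              # fits the data while staying near p0
```

With no regularization the 2-by-3 system has infinitely many solutions; the penalty selects the one closest to the prior, which is the role the PEST control variables tune in practice.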


Waiting for scheduled services in Canada: development of priority-setting scoring systems

JOURNAL OF EVALUATION IN CLINICAL PRACTICE, Issue 1 2003
T. W. Noseworthy MD MSc MPH FRCPC FACP FCCP FCCM CHE
Abstract
Rationale, aims and objectives: An Achilles' heel of Canadian Medicare is long waits for elective services. The Western Canada Waiting List (WCWL) project is a collaboration of 19 partner organizations committed to addressing this issue and influencing the way waiting lists are structured and managed. The focus of the WCWL project has been to develop and refine practical tools for prioritizing patients on scheduled waiting lists.
Methods: Scoring tools for priority setting were developed through extensive clinical input and highly iterative exchange by clinical panels constituted in five clinical areas: cataract surgery; general surgery procedures; hip and knee replacement; magnetic resonance imaging (MRI) scanning; and children's mental health. Several stages of empirical work were conducted to formulate and refine criteria and to assess and improve their reliability and validity. To assess the acceptability and usability of the priority-setting tools and to identify issues pertaining to implementation, key personnel in the seven regional health authorities (RHAs) participated in structured interviews. Public opinion focus groups were conducted in the seven western cities.
Results: Point-count scoring systems were constructed in each of the clinical areas. Participating clinicians confirmed that the tools offered face validity and that the scoring systems appeared practical for implementation and use in clinical settings. Reliability was strongest for the general surgery and hip and knee criteria, and weakest for the diagnostic MRI criteria. Public opinion focus groups wholeheartedly endorsed the application of point-count priority measures. Regional health authorities were generally supportive, though cautiously optimistic towards implementation.
Conclusions: While the WCWL project has not 'solved' the problem of waiting lists and times, having a standardized, reliable means of assigning priority for services is an important step towards improved management in Canada and elsewhere. [source]
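A point-count scoring system of the kind described reduces to weighted criterion ratings summed into a priority total. The criteria, weights, and patient ratings below are invented for illustration and are not the WCWL panels' actual instruments.

```python
# Hypothetical sketch of a point-count priority score: each clinical criterion
# contributes weighted points, and totals rank patients on the waiting list.
WEIGHTS = {"pain": 3, "functional_limitation": 2, "progression_risk": 4}

def priority_score(ratings):
    """ratings: criterion -> severity rating (0-5). Returns total priority points."""
    return sum(WEIGHTS[c] * r for c, r in ratings.items())

patients = {
    "A": {"pain": 4, "functional_limitation": 3, "progression_risk": 1},  # 22 points
    "B": {"pain": 2, "functional_limitation": 5, "progression_risk": 2},  # 24 points
}
ranked = sorted(patients, key=lambda p: priority_score(patients[p]), reverse=True)
print(ranked)  # → ['B', 'A']
```

The appeal of such a scheme, as the abstract notes, is that the same standardized score can be applied reliably across clinicians and regions.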


Meeting the Preteen Vaccine Law: A Pilot Program in Urban Middle Schools

JOURNAL OF SCHOOL HEALTH, Issue 2 2000
Lynda Boyer-Chuanroong
ABSTRACT: California, the most populous state in the nation, is one of many states that implemented vaccination requirements for preteens. While kindergarten requirements are well-established and accepted by parents, implementation of preteen vaccination requirements requires inter- and intra-institutional adjustments, educational and public relations efforts, and an augmentation of vaccination delivery systems. This article describes a pilot program in two middle schools in an urban school district and offers planning strategies and practical tools to assist school nurses and health providers to implement preteen requirements. [source]


A new class of models for bivariate joint tails

JOURNAL OF THE ROYAL STATISTICAL SOCIETY: SERIES B (STATISTICAL METHODOLOGY), Issue 1 2009
Alexandra Ramos
Summary., A fundamental issue in applied multivariate extreme value analysis is modelling dependence within joint tail regions. The primary focus of this work is to extend the classical pseudopolar treatment of multivariate extremes to develop an asymptotically motivated representation of extremal dependence that also encompasses asymptotic independence. Starting with the usual mild bivariate regular variation assumptions that underpin the coefficient of tail dependence as a measure of extremal dependence, our main result is a characterization of the limiting structure of the joint survivor function in terms of an essentially arbitrary non-negative measure that must satisfy some mild constraints. We then construct parametric models from this new class and study in detail one example that accommodates asymptotic dependence, asymptotic independence and asymmetry within a straightforward parsimonious parameterization. We provide a fast simulation algorithm for this example and detail likelihood-based inference including tests for asymptotic dependence and symmetry which are useful for submodel selection. We illustrate this model by application to both simulated and real data. In contrast with the classical multivariate extreme value approach, which concentrates on the limiting distribution of normalized componentwise maxima, our framework focuses directly on the structure of the limiting joint survivor function and provides significant extensions of both the theoretical and the practical tools that are available for joint tail modelling. [source]