Systematic Use


Selected Abstracts


Developing A Database to Describe the Practice Patterns of Adult Nurse Practitioner Students

JOURNAL OF NURSING SCHOLARSHIP, Issue 1 2000
Nancy A. O'Connor
Purpose: To describe the practice patterns of adult nurse practitioner students using a database composed of core health data elements and standardized nursing language. Design: Descriptive study of 3,733 patient visits documented by 19 adult nurse practitioner students in the academic year 1996–1997. Methods: A database was designed for documenting the full scope of practice of adult nurse practitioner students by use of core health data elements and the standardized nursing languages of NANDA and NIC. Nurse practitioner students used the database to document every clinical encounter during their final clinical year of study. Most visits occurred in ambulatory care settings in a midwestern American city. Findings: Based on the American Medical Association's Evaluation/Management coding system, 50% of visits were classified as problem focused, 31.9% as expanded, 10% as detailed, and 8.1% as comprehensive. The most frequently occurring NANDA diagnoses were pain, health-seeking behavior, altered health maintenance, and knowledge deficit. The most frequently reported Nursing Interventions Classification (NIC) interventions were patient education, drug management, information management, and risk management. Conclusions: Using standardized nursing language to describe clinical encounters made visible the complex clinical decision-making patterns of adult nurse practitioner students. Systematic use of a database designed for documenting the full scope of practice of nurse practitioner students showed the applicability of standardized nursing language to advanced practice nursing contexts. [source]
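
As a hedged illustration only (field names, codes and counts below are invented, not taken from the study), the following sketch shows the kind of visit record such a database might hold and how simple practice-pattern summaries could be tallied from it:

```python
from dataclasses import dataclass, field
from collections import Counter

# Hypothetical visit record combining core health data elements with
# standardized nursing language codes (NANDA diagnoses, NIC interventions)
# and an Evaluation/Management visit level.
@dataclass
class VisitRecord:
    student_id: str
    setting: str                                   # e.g. "ambulatory care"
    em_level: str                                  # "problem focused", "expanded", ...
    nanda_diagnoses: list = field(default_factory=list)
    nic_interventions: list = field(default_factory=list)

visits = [
    VisitRecord("NP01", "ambulatory care", "problem focused",
                ["pain"], ["patient education"]),
    VisitRecord("NP02", "ambulatory care", "expanded",
                ["knowledge deficit"], ["patient education", "drug management"]),
]

# Practice-pattern summaries of the kind reported in the study
em_counts = Counter(v.em_level for v in visits)
nanda_counts = Counter(d for v in visits for d in v.nanda_diagnoses)
print(em_counts.most_common(), nanda_counts.most_common())
```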


A meso-level approach to the 3D numerical analysis of cracking and fracture of concrete materials

FATIGUE & FRACTURE OF ENGINEERING MATERIALS AND STRUCTURES, Issue 12 2006
A. CABALLERO
ABSTRACT A meso-mechanical model for the numerical analysis of concrete specimens in 3D has recently been proposed. In this approach, concrete is represented as a composite material with the larger aggregates embedded in a mortar-plus-aggregates matrix. Both continuum-type components are considered linear elastic, while failure is represented through the systematic use of zero-thickness interface elements equipped with a cohesive fracture constitutive law. These elements are inserted along all potential crack planes in the mesh prior to the analysis. In this paper, the basic features of the model are summarized and results of calculations are presented, including uniaxial tension and compression loading of a 14-aggregate cubical specimen along the X, Y and Z axes. The results confirm the consistency of the approach with physical phenomena and well-known features of concrete behaviour, and show low scatter when different loading directions are considered. These cases can also be regarded as different specimens subjected to the same type of loading. [source]
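
For illustration only, the sketch below shows a generic linear-softening, mode-I cohesive (traction–separation) law of the kind assigned to zero-thickness interface elements; the parameter values are arbitrary and this is not the authors' constitutive formulation:

```python
import numpy as np

def cohesive_traction(opening, ft=3.0e6, Gf=100.0):
    """Normal traction [Pa] for a given crack opening [m].
    ft: tensile strength [Pa], Gf: fracture energy [J/m^2].
    Linear softening: traction falls from ft to zero at w_c = 2*Gf/ft."""
    w_c = 2.0 * Gf / ft                     # critical opening at full debonding
    if opening <= 0.0:
        return 0.0                          # contact/compression handled elsewhere
    if opening >= w_c:
        return 0.0                          # fully debonded, no cohesive traction
    return ft * (1.0 - opening / w_c)       # softening branch

openings = np.linspace(0.0, 1.0e-4, 5)
print([cohesive_traction(w) for w in openings])
```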


Robust monotone gradient-based discrete-time iterative learning control

INTERNATIONAL JOURNAL OF ROBUST AND NONLINEAR CONTROL, Issue 6 2009
D. H. Owens
Abstract This paper considers the use of matrix models and the robustness of a gradient-based iterative learning control (ILC) algorithm using both fixed learning gains and nonlinear data-dependent gains derived from parameter optimization. The philosophy of the paper is to ensure monotonic convergence with respect to the mean-square value of the error time series. The paper provides a complete and rigorous analysis for the systematic use of the well-known matrix models in ILC. Matrix models provide necessary and sufficient conditions for robust monotonic convergence. They also permit the construction of accurate sufficient frequency domain conditions for robust monotonic convergence on finite time intervals for both causal and non-causal controller dynamics. The results are compared with recently published results for robust inverse-model-based ILC algorithms and it is seen that the algorithm has the potential to improve the robustness to high-frequency modelling errors, provided that resonances within the plant bandwidth have been suppressed by feedback or series compensation. Copyright © 2008 John Wiley & Sons, Ltd. [source]
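
As a rough sketch of the underlying idea (not the paper's exact algorithm; the plant, gains and function names are assumptions), the following code builds the lifted matrix model of a discrete-time plant over a finite trial and applies the gradient ILC update u_{k+1} = u_k + beta_k * G^T e_k, using either a fixed learning gain or a data-dependent gain chosen to minimize the next trial's error norm, which gives monotonic decay of the mean-square error:

```python
import numpy as np

def lifted_model(markov, N):
    """Build the N x N lower-triangular Toeplitz matrix G from the plant's
    Markov (impulse-response) parameters, so that y = G @ u over one trial."""
    G = np.zeros((N, N))
    for i in range(N):
        for j in range(i + 1):
            G[i, j] = markov[i - j]
    return G

def gradient_ilc(G, r, u0, trials, beta=None):
    """Run gradient ILC: u_{k+1} = u_k + beta_k * G.T @ e_k.
    If beta is None, use a data-dependent gain that minimizes the next
    trial's error norm for this update direction."""
    u = u0.copy()
    rms_errors = []
    for _ in range(trials):
        e = r - G @ u                        # tracking error on this trial
        g = G.T @ e                          # gradient (steepest-descent) direction
        if beta is None:
            denom = np.linalg.norm(G @ g) ** 2
            bk = (np.linalg.norm(g) ** 2) / denom if denom > 0 else 0.0
        else:
            bk = beta                        # fixed learning gain
        u = u + bk * g
        rms_errors.append(np.linalg.norm(e) / np.sqrt(len(e)))
    return u, rms_errors

# Example: hypothetical plant with Markov parameters h[i] = 0.5**i
N = 50
G = lifted_model(0.5 ** np.arange(N), N)
r = np.ones(N)                               # step reference over the trial
u, errs = gradient_ilc(G, r, np.zeros(N), trials=20)
print(errs[0], errs[-1])                     # RMS error decreases monotonically
```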


Morphology of deltoid origin and end tendons – a generic model

JOURNAL OF ANATOMY, Issue 6 2008
J. N. A. L. Leijnse
Abstract This study provides a model of the complex deltoid origin and end tendons as a basis for further anatomical, biomechanical and clinical research. Although the deltoid is used in transpositions for upper limb paralysis, its detailed morphology and segmentation have not been the object of much study. Morphologically, the deltoid faces two distinct challenges. It closely envelops a ball joint, and it reduces its width over a short distance from a very wide origin along the clavicle, acromion and scapular spine to an insertion as narrow as the humerus. These challenges necessitate specific morphological tendon adaptations. A qualitative model for these tendons is developed by the stepwise transformation of a unipennate muscle model into a functional deltoid muscle. Each step is the solution to one of the mentioned morphological challenges. The final model is of an end tendon consisting of a continuous succession of bipennate end tendon blades centrally interspaced by unipennate tendon parts. The origin tendon consists of lamellae that interdigitate with the end tendon blades, creating a natural segmentation. The model is illustrated by qualitative dissection results. In addition, in view of the proliferation of terms found in the literature to describe deltoid tendons, tendon concepts are reviewed and the systematic use of a single, simple terminology of 'origin and end tendons' is proposed. [source]


Evidence-based human resource management: a study of nurse leaders' resource allocation

JOURNAL OF NURSING MANAGEMENT, Issue 4 2009
LISBETH FAGERSTRÖM RN
Aims: To illustrate how the RAFAELA system can be used to facilitate evidence-based human resource management. Background: The theoretical framework of the RAFAELA system is based on a holistic view of humankind and a view of leadership founded on human resource management. Methods: Nine wards from three central hospitals in Finland participated in the study. The data, stemming from 2006–2007, were taken from the critical indicators (ward-related and nursing intensity information) for national benchmarking used in the RAFAELA system. The data were analysed descriptively. Results: The daily nursing resources per classified patient ratio is a more specific measure than the nurse-to-patient ratio. For four wards, the nursing intensity per nurse surpassed the optimal level on 34% to 62.2% of days. Resource allocation was clearly improved in that a better balance between patients' care needs and available nursing resources was maintained. Conclusions: The RAFAELA system provides a rational, systematic and objective foundation for evidence-based human resource management. Implications for nursing management: Data from the systematic use of the RAFAELA system offer objective facts and motives for evidence-based decision making in human resource management, and will therefore enhance nurse leaders' evidence-based, scientific way of working. [source]
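
Purely as an illustration of the two ratios contrasted above (the field names and the optimal level are hypothetical, not RAFAELA specifications), the following sketch computes nursing intensity per nurse for each ward day and the share of days on which it exceeds an agreed optimal level:

```python
from dataclasses import dataclass

@dataclass
class WardDay:
    nurses: float               # nursing resources available that day
    patients: int               # number of patients cared for
    intensity_points: float     # summed patient classification (nursing intensity)

def nurse_to_patient_ratio(day: WardDay) -> float:
    return day.nurses / day.patients

def intensity_per_nurse(day: WardDay) -> float:
    return day.intensity_points / day.nurses

def share_above_optimum(days, optimal_level: float) -> float:
    """Fraction of days on which nursing intensity per nurse exceeded the optimum."""
    over = sum(1 for d in days if intensity_per_nurse(d) > optimal_level)
    return over / len(days)

days = [WardDay(nurses=6, patients=24, intensity_points=130),
        WardDay(nurses=5, patients=22, intensity_points=95)]
print(share_above_optimum(days, optimal_level=20.0))   # -> 0.5 for these data
```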


How to track and assess genotyping errors in population genetics studies

MOLECULAR ECOLOGY, Issue 11 2004
A. BONIN
Abstract Genotyping errors occur when the genotype determined after molecular analysis does not correspond to the real genotype of the individual under consideration. Virtually every genetic data set includes some erroneous genotypes, but genotyping errors remain a taboo subject in population genetics, even though they can greatly bias final conclusions, especially in studies based on individual identification. Here, we consider four case studies representing a large variety of population genetics investigations differing in their sampling strategies (noninvasive or traditional), in the type of organism studied (plant or animal) and in the molecular markers used [microsatellites or amplified fragment length polymorphisms (AFLPs)]. In these data sets, the estimated genotyping error rate ranges from 0.8% for microsatellite loci from bear tissues to 2.6% for AFLP loci from dwarf birch leaves. The main sources of error were allelic dropouts for microsatellites and differences in peak intensities for AFLPs, but in both cases human factors were non-negligible error generators. Therefore, tracking genotyping errors and identifying their causes are necessary to clean up data sets and validate the final results to the precision required. In addition, we propose the outline of a protocol designed to limit and quantify genotyping errors at each step of the genotyping process. In particular, we recommend (i) several effective precautions to prevent contamination and technical artefacts; (ii) systematic use of blind samples and automation; (iii) experience and rigor in laboratory work and scoring; and (iv) systematic reporting of the error rate in population genetics studies. [source]
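
As a minimal sketch of the blind-sample idea (not the authors' protocol; sample and locus names are invented), the following code estimates a genotyping error rate by comparing two independent genotyping runs of the same samples:

```python
def error_rate(genotypes_a, genotypes_b):
    """genotypes_a, genotypes_b: dicts mapping (sample, locus) -> genotype,
    e.g. ('bear_07', 'G10L') -> ('180', '184'). Missing data are stored as None."""
    compared = mismatched = 0
    for key, g1 in genotypes_a.items():
        g2 = genotypes_b.get(key)
        if g1 is None or g2 is None:
            continue                     # skip failed amplifications
        compared += 1
        if sorted(g1) != sorted(g2):     # allele order is irrelevant
            mismatched += 1
    return mismatched / compared if compared else float('nan')

run1 = {('bear_07', 'G10L'): ('180', '184'),
        ('bear_07', 'G10B'): ('150', '150')}
run2 = {('bear_07', 'G10L'): ('180', '180'),   # allelic dropout of allele 184
        ('bear_07', 'G10B'): ('150', '150')}
print(f"estimated error rate: {error_rate(run1, run2):.1%}")   # -> 50.0%
```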


Re-refinement from deposited X-ray data can deliver improved models for most PDB entries

ACTA CRYSTALLOGRAPHICA SECTION D, Issue 2 2009
Robbie P. Joosten
The deposition of X-ray data along with the customary structural models defining PDB entries makes it possible to apply large-scale re-refinement protocols to these entries, thus giving users the benefit of improvements in X-ray methods that have occurred since the structure was deposited. Automated gradient refinement is an effective method to achieve this goal, but real-space intervention is most often required in order to adequately address problems detected by structure-validation software. In order to improve the existing protocol, automated re-refinement was combined with structure validation and difference-density peak analysis to produce a catalogue of problems in PDB entries that are amenable to automatic correction. It is shown that re-refinement can be effective in producing improvements, which are often associated with the systematic use of the TLS parameterization of B factors, even for relatively new and high-resolution PDB entries, while the accompanying manual or semi-manual map analysis and fitting steps show good prospects for eventual automation. It is proposed that the potential for simultaneous improvements in methods and in re-refinement results be further encouraged by broadening the scope of depositions to include refinement metadata and ultimately primary rather than reduced X-ray data. [source]
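
As a hedged illustration of one bookkeeping step such a protocol might include (the threshold and field names are invented, not part of the authors' pipeline), the sketch below flags entries whose re-refined model improves the free R factor by more than a chosen margin:

```python
def classify_rerefinement(entries, min_rfree_gain=0.005):
    """entries: dicts with 'pdb_id', 'rfree_original', 'rfree_rerefined'.
    Returns (improved, unchanged) lists of PDB identifiers."""
    improved, unchanged = [], []
    for e in entries:
        gain = e['rfree_original'] - e['rfree_rerefined']
        (improved if gain >= min_rfree_gain else unchanged).append(e['pdb_id'])
    return improved, unchanged

entries = [{'pdb_id': '1abc', 'rfree_original': 0.262, 'rfree_rerefined': 0.241},
           {'pdb_id': '2xyz', 'rfree_original': 0.231, 'rfree_rerefined': 0.230}]
print(classify_rerefinement(entries))   # -> (['1abc'], ['2xyz'])
```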