Data Analysis Methods


Selected Abstracts


The End of an Era: What Became of the "Managed Care Revolution" in 2001?

HEALTH SERVICES RESEARCH, Issue 1p2 2003
Cara S. Lesser
Objective. To describe how the organization and dynamics of health systems changed between 1999 and 2001, in the context of expectations from the mid-1990s when managed care was in ascendance, and assess the implications for consumers and policymakers. Data Sources/Study Setting. Data are from the Community Tracking Study site visits to 12 communities that were randomly selected to be nationally representative of metropolitan areas with 200,000 people or more. The Community Tracking Study is an ongoing effort that began in 1996 and is fielded every two years. Study Design. Semistructured interviews were conducted with 50-90 stakeholders and observers of the local health care market in each of the 12 communities every two years. Respondents include leaders of local hospitals, health plans, and physician organizations and representatives of major employers, state and local governments, and consumer groups. First-round interviews were conducted in 1996-1997, and subsequent rounds of interviews were conducted in 1998-1999 and 2000-2001. A total of 1,690 interviews were conducted between 1996 and 2001. Data Analysis Methods. Interview information was stored and coded in qualitative data analysis software. Data were analyzed to identify patterns and themes within and across study sites and conclusions were verified by triangulating responses from different respondent types, examining outliers, searching for disconfirming evidence, and testing rival explanations. Principal Findings. Since the mid-1990s, managed care has developed differently than expected in local health care markets nationally. Three key developments shaped health care markets between 1999 and 2001: (1) unprecedented, sustained economic growth that resulted in extremely tight labor markets and made employers highly responsive to employee demands for even fewer restrictions on access to care; (2) health plans increasingly moved away from core strategies in the "managed care toolbox"; and (3) providers gained leverage relative to managed care plans and reverted to more traditional strategies of competing for patients based on services and amenities. Conclusions. Changes in local health care markets have contributed to rising costs and created new access problems for consumers. Moreover, the trajectory of change promises to make the goals of cost-control and quality improvement more difficult to achieve in the future. [source]
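
The triangulation step described in the data analysis methods lends itself to a simple tabular check. Below is a minimal sketch, using hypothetical site, respondent, and theme labels rather than the study's actual coding scheme, that counts how many distinct respondent types support each theme within a site, so weakly corroborated themes can be flagged for the kind of disconfirming-evidence search the abstract describes.

```python
# A minimal sketch (not the authors' pipeline) of cross-respondent triangulation:
# tabulate coded interview excerpts by site and respondent type, then flag themes
# supported by only one respondent type for follow-up.
import pandas as pd

# Hypothetical coded excerpts; in the study these would come from the
# qualitative data analysis software mentioned in the abstract.
coded = pd.DataFrame([
    {"site": "Site A", "respondent": "health plan", "theme": "looser networks"},
    {"site": "Site A", "respondent": "hospital",    "theme": "looser networks"},
    {"site": "Site A", "respondent": "employer",    "theme": "rising premiums"},
    {"site": "Site B", "respondent": "hospital",    "theme": "provider leverage"},
    {"site": "Site B", "respondent": "health plan", "theme": "provider leverage"},
])

# Number of distinct respondent types mentioning each theme in each site.
support = (coded.groupby(["site", "theme"])["respondent"]
                .nunique()
                .rename("respondent_types"))
print(support)
print(support[support < 2])   # weakly triangulated themes to re-examine
```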


The integrative review: updated methodology

JOURNAL OF ADVANCED NURSING, Issue 5 2005
Robin Whittemore PhD APRN
Aim. The aim of this paper is to distinguish the integrative review method from other review methods and to propose methodological strategies specific to the integrative review method to enhance the rigour of the process. Background. Recent evidence-based practice initiatives have increased the need for and the production of all types of reviews of the literature (integrative reviews, systematic reviews, meta-analyses, and qualitative reviews). The integrative review method is the only approach that allows for the combination of diverse methodologies (for example, experimental and non-experimental research), and has the potential to play a greater role in evidence-based practice for nursing. With respect to the integrative review method, strategies to enhance data collection and extraction have been developed; however, methods of analysis, synthesis, and conclusion drawing remain poorly formulated. Discussion. A modified framework for research reviews is presented to address issues specific to the integrative review method. Issues related to specifying the review purpose, searching the literature, evaluating data from primary sources, analysing data, and presenting the results are discussed. Data analysis methods of qualitative research are proposed as strategies that enhance the rigour of combining diverse methodologies as well as empirical and theoretical sources in an integrative review. Conclusion. An updated integrative review method has the potential to allow for diverse primary research methods to become a greater part of evidence-based practice initiatives. [source]


A comparison of modern data analysis methods for X-ray and neutron specular reflectivity data

JOURNAL OF APPLIED CRYSTALLOGRAPHY, Issue 5 2007
A. Van Der Lee
Data analysis methods for specular X-ray or neutron reflectivity are compared. The methods that have been developed over the years can be classified into different types. The so-called classical methods are based on Parratt's or Abelès' formalism and rely on minimization using more or less evolved Levenberg-Marquardt or simplex routines. A second class uses the same formalism, but optimization is carried out using simulated annealing or genetic algorithms. A third class uses alternative expressions for the reflectivity, such as the Born approximation or distorted Born approximation. This makes it easier to invert the specular data directly, coupled or not with classical least-squares or iterative methods using over-relaxation or charge-flipping techniques. A fourth class uses mathematical methods founded in scattering theory to determine the phase of the scattered waves, but has to be coupled in certain cases with (magnetic) reference layers. The strengths and weaknesses of a number of these methods are evaluated using simulated and experimental data. It is shown that genetic algorithms are by far superior to traditional and advanced least-squares methods, but that they fail when the layers are less well defined. In the latter case, the methods from the third or fourth class are the better choice, because they permit at least a first estimate of the density profile to be obtained that can be refined using the classical methods of the first class. It is also shown that different analysis programs may calculate different reflectivities for a similar chemical system. One reason for this is that the representation of the layers is either described by chemical composition or by scattering length or electronic densities, between which the conversion of the absorptive part is not straightforward. A second important reason is that routines that describe the convolution with the instrumental resolution function are not identical. [source]
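
As a concrete illustration of the classical formalism that the comparison starts from, the following is a minimal sketch of the Parratt recursion for a single film on a substrate. The layer thicknesses and scattering length densities are illustrative placeholders rather than values from the paper, and absorption and interfacial roughness are ignored for brevity.

```python
# A minimal sketch of specular reflectivity from the Parratt recursion for a
# layer stack (ambient / film / substrate). Illustrative parameters only.
import numpy as np

def parratt_reflectivity(q, sld, thickness):
    """q: wavevector transfers (1/Angstrom); sld: scattering length densities
    (1/Angstrom^2) ordered ambient -> layers -> substrate; thickness: same
    order (ambient and substrate entries are ignored)."""
    q = np.asarray(q, dtype=complex)
    kz0 = q / 2.0
    # Vertical wavevector in each medium, refracted relative to the ambient.
    kz = [np.sqrt(kz0**2 - 4.0 * np.pi * (s - sld[0])) for s in sld]
    R = np.zeros(q.shape, dtype=complex)   # amplitude ratio, zero below the substrate
    for j in range(len(sld) - 2, -1, -1):
        r = (kz[j] - kz[j + 1]) / (kz[j] + kz[j + 1])      # Fresnel coefficient at interface j/j+1
        phase = np.exp(2j * kz[j + 1] * thickness[j + 1])  # propagation through layer j+1
        R = (r + R * phase) / (1.0 + r * R * phase)        # Parratt recursion
    return np.abs(R) ** 2

q = np.linspace(0.01, 0.3, 300)        # 1/Angstrom
sld = [0.0, 3.5e-6, 2.0e-6]            # ambient / film / substrate, illustrative values
thickness = [0.0, 200.0, 0.0]          # Angstrom
print(parratt_reflectivity(q, sld, thickness)[:5])
```

In practice this forward model is wrapped in a Levenberg-Marquardt, simplex, or genetic-algorithm optimizer, which is exactly where the classes of methods compared in the paper diverge.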


Measurement and data analysis methods for field-scale wind erosion studies and model validation

EARTH SURFACE PROCESSES AND LANDFORMS, Issue 11 2003
Ted M. Zobeck
Abstract Accurate and reliable methods of measuring windblown sediment are needed to confirm, validate, and improve erosion models, assess the intensity of aeolian processes and related damage, determine the source of pollutants, and for other applications. This paper outlines important principles to consider in conducting field-scale wind erosion studies and proposes strategies of field data collection for use in model validation and development. Detailed discussions include consideration of field characteristics, sediment sampling, and meteorological stations. The field shape used in field-scale wind erosion research is generally a matter of preference and in many studies may not have practical significance. Maintaining a clear non-erodible boundary is necessary to accurately determine erosion fetch distance. A field length of about 300 m may be needed in many situations to approach transport capacity for saltation flux in bare agricultural fields. Field surface conditions affect the wind profile and other processes such as sediment emission, transport, and deposition and soil erodibility. Knowledge of the temporal variation in surface conditions is necessary to understand aeolian processes. Temporal soil properties that impact aeolian processes include surface roughness, dry aggregate size distribution, dry aggregate stability, and crust characteristics. Use of a portable 2 m tall anemometer tower should be considered to quantify variability of friction velocity and aerodynamic roughness caused by surface conditions in field-scale studies. The types of samplers used for sampling aeolian sediment will vary depending upon the type of sediment to be measured. The Big Spring Number Eight (BSNE) and Modified Wilson and Cooke (MWAC) samplers appear to be the most popular for field studies of saltation. Suspension flux may be measured with commercially available instruments after modifications are made to ensure isokinetic conditions at high wind speeds. Meteorological measurements should include wind speed and direction, air temperature, solar radiation, relative humidity, rain amount, soil temperature and moisture. Careful consideration of the climatic, sediment, and soil surface characteristics observed in future field-scale wind erosion studies will ensure maximum use of the data collected. Copyright © 2003 John Wiley & Sons, Ltd. [source]
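
Under neutral stability, the suggestion to quantify friction velocity and aerodynamic roughness from a short anemometer tower reduces to fitting the logarithmic wind profile u(z) = (u*/k) ln(z/z0). A minimal sketch with made-up measurement heights and mean speeds:

```python
# A minimal sketch (illustrative values, not from the paper) of estimating
# friction velocity u* and roughness length z0 from a multi-height anemometer
# profile by fitting the neutral log-law u(z) = (u*/k) ln(z/z0).
import numpy as np

KARMAN = 0.4                                    # von Karman constant

heights = np.array([0.25, 0.5, 1.0, 2.0])       # anemometer heights (m), hypothetical
speeds = np.array([4.1, 4.9, 5.6, 6.4])         # mean wind speeds (m/s), hypothetical

# Regress u on ln(z): slope = u*/k, intercept = -(u*/k) ln(z0).
slope, intercept = np.polyfit(np.log(heights), speeds, 1)
u_star = KARMAN * slope
z0 = np.exp(-intercept / slope)

print(f"friction velocity u* = {u_star:.2f} m/s, roughness length z0 = {z0*1000:.1f} mm")
```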


Data Mining for Bioprocess Optimization

ENGINEERING IN LIFE SCIENCES (ELECTRONIC), Issue 3 2004
S. Rommel
Abstract Although developed for completely different applications, data analysis methods known as "data mining" have increasingly been recognized as an efficient way to identify optimization potential and to troubleshoot within many areas of process technology. This paper presents the successful application of data mining methods to the optimization of a fermentation process and discusses diverse characteristics of data mining for biological processes. For the optimization of biological processes, a large number of possibly relevant process parameters exists. These input variables can be parameters from devices as well as process control parameters. The main challenge of such optimizations is to robustly identify relevant combinations of parameters within this large set. For the underlying process, we found using data mining methods that the time at which a particular carbohydrate component is added has a strong impact on the formation of secondary components. The yield could also be increased by using 2 m3 fermentors instead of 1 m3 fermentors. [source]
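
The screening task described above, isolating a few influential parameters from many candidate process variables, is commonly handled with a tree-ensemble importance ranking. The sketch below uses synthetic batch data and hypothetical variable roles (a carbohydrate addition time and a fermentor-size indicator) purely to illustrate that workflow; it is not the authors' actual model.

```python
# A minimal sketch of data-mining-style parameter screening: rank many candidate
# process parameters by their influence on yield with a tree ensemble.
# All data and variable roles are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n_batches, n_params = 80, 30
X = rng.normal(size=(n_batches, n_params))          # e.g. temperatures, feed rates, pH, ...
carb_addition_time = X[:, 3]                        # hypothetical "carbohydrate added at" variable
fermentor_volume = (X[:, 7] > 0).astype(float)      # hypothetical 1 m3 vs 2 m3 indicator
y = 2.0 * carb_addition_time + 1.0 * fermentor_volume + rng.normal(scale=0.5, size=n_batches)

model = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, y)
ranking = np.argsort(model.feature_importances_)[::-1]
print("top parameters by importance:", ranking[:5])
```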


Sexual boundary violations in residential drug-free therapeutic community treatment

INTERNATIONAL JOURNAL OF APPLIED PSYCHOANALYTIC STUDIES, Issue 4 2008
William H. Gottdiener
Abstract The purpose of this study was to investigate the phenomenon of sexual boundary violations in residential therapeutic community (TC) programs for substance abusers using secondary data analysis methods. One hundred and ninety-seven participants who had been treated in TCs in New York City were interviewed 30 days post-discharge. Sexual boundary violations were reported by approximately one-quarter of the sample. Possible psychodynamic causes (e.g. enactments, group dynamics) of sexual boundary violations are discussed. Implications for research, theory, and for prevention of sexual boundary violations in TC programs are addressed. Copyright © 2008 John Wiley & Sons, Ltd. [source]


Phase composition depth profiles using spatially resolved energy dispersive X-ray diffraction

JOURNAL OF APPLIED CRYSTALLOGRAPHY, Issue 6 2004
Andrew C. Jupe
Spatially resolved energy dispersive X-ray diffraction, using high-energy synchrotron radiation (~35-80 keV), was used nondestructively to obtain phase composition profiles along the radii of cylindrical cement paste samples to characterize the progress of the chemical changes associated with sulfate attack on the cement. Phase distributions were acquired to depths of ~4 mm below the specimen surface with sufficient spatial resolution to discern features less than 200 µm thick. The experimental and data analysis methods employed to obtain quantitative composition profiles are described. The spatial resolution that could be achieved is illustrated using data obtained from copper cylinders with a thin zinc coating. The measurements demonstrate that this approach is useful for nondestructively visualizing the sometimes complex transformations that take place during sulfate attack on cement-based materials. These transformations can be spatially related to microstructure as seen by computed microtomography. [source]
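
One routine step when analysing energy-dispersive diffraction patterns of this kind is converting detected photon energies to lattice d-spacings at the fixed diffraction angle, using Bragg's law in its energy-dispersive form d = hc/(2E sin(theta)). The angle and peak energies in the sketch below are illustrative, not values from the study.

```python
# A minimal sketch of one step in energy-dispersive diffraction analysis:
# convert detected photon energies (keV) to d-spacings (Angstrom) at a fixed
# diffraction angle via Bragg's law, d = hc / (2 E sin(theta)).
import numpy as np

HC_KEV_ANGSTROM = 12.398              # h*c in keV*Angstrom

def energy_to_d(energy_kev, two_theta_deg):
    theta = np.radians(two_theta_deg / 2.0)
    return HC_KEV_ANGSTROM / (2.0 * np.asarray(energy_kev) * np.sin(theta))

energies = np.array([40.0, 55.0, 70.0])          # keV, hypothetical peak positions
print(energy_to_d(energies, two_theta_deg=5.0))  # d-spacings to assign to cement phases
```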


Advanced solution scattering data analysis methods and their applications

JOURNAL OF APPLIED CRYSTALLOGRAPHY, Issue 3-1 2000
D. I. Svergun
A method for ab initio low-resolution shape and internal structure retrieval from contrast variation solution scattering data is described. The method uses a multiphase model of a particle built from densely packed dummy atoms and employs simulated annealing to find a compact interconnected configuration of phases that fits the available experimental data. In the particular case of a single-phase particle (shape determination), the method is compared to another ab initio method using low-resolution envelope functions. Examples of the shape determination of several proteins from experimental X-ray scattering data are presented. [source]
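
A core ingredient of dummy-atom modelling of this kind is a fast forward calculation of the scattering curve for a trial bead configuration, which the simulated-annealing search compares against the experimental curve at each step. The sketch below evaluates the Debye formula for a random set of beads; the coordinates, bead count, and q-range are arbitrary placeholders, not part of the published method.

```python
# A minimal sketch of the Debye formula, I(q) = sum_ij sin(q r_ij)/(q r_ij),
# for a dummy-atom (bead) model with unit scattering weights. Intensities are
# on an arbitrary scale; the bead positions here are random placeholders.
import numpy as np

def debye_intensity(q, coords):
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)  # pair distances
    qr = np.asarray(q)[:, None, None] * d[None, :, :]
    sinc = np.sinc(qr / np.pi)         # sin(x)/x with the 0/0 limit handled
    return sinc.sum(axis=(1, 2))

rng = np.random.default_rng(1)
beads = rng.uniform(-20, 20, size=(50, 3))   # 50 dummy atoms inside a 40 A box
q = np.linspace(0.01, 0.3, 100)              # 1/Angstrom
print(debye_intensity(q, beads)[:3])
```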


Characterization of drug-protein interactions in blood using high-performance affinity chromatography

JOURNAL OF SEPARATION SCIENCE, JSS, Issue 5-6 2009
David S. Hage
Abstract The binding of drugs with proteins in blood, serum, or plasma is an important process in determining the activity, distribution, rate of excretion, and toxicity of drugs in the body. High-performance affinity chromatography (HPAC) has received a great deal of interest as a means for studying these interactions. This review examines the various techniques that have been used in HPAC to examine drug-protein binding and discusses the types of information that can be obtained through this approach. A comparison of these techniques with traditional methods for binding studies (e.g., equilibrium dialysis and ultrafiltration) will also be presented. The use of HPAC with specific serum proteins and binding agents will then be discussed, including HSA and α1-acid glycoprotein (AGP). Several examples from the literature are provided to illustrate the applications of such research. Recent developments in this field are also described, such as the use of improved immobilization techniques, new data analysis methods, techniques for working directly with complex biological samples, and work with immobilized lipoproteins. The relative advantages and limitations of the methods that are described will be considered and the possible use of these techniques in the high-throughput screening or characterization of drug-protein binding will be discussed. [source]
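
To make the data-analysis side concrete: for a system with single-site binding, frontal affinity chromatography data are often linearized as 1/m_app = 1/(Ka mL [A]) + 1/mL, so a plot of 1/m_app against 1/[A] yields the association constant Ka and the active binding capacity mL. The sketch below fits synthetic breakthrough data under that assumed model; the numbers are illustrative and not drawn from the review.

```python
# A minimal sketch (synthetic numbers, single-site model assumed) of the
# double-reciprocal analysis of frontal affinity chromatography data:
# 1/m_app = 1/(Ka*mL*[A]) + 1/mL, fitted by linear regression.
import numpy as np

conc = np.array([1e-6, 2e-6, 5e-6, 1e-5, 2e-5])              # applied drug conc. [A] (M), hypothetical
m_app = np.array([9.1e-10, 1.7e-9, 3.3e-9, 5.0e-9, 6.7e-9])  # moles at breakthrough, hypothetical

slope, intercept = np.polyfit(1.0 / conc, 1.0 / m_app, 1)
mL = 1.0 / intercept                  # active binding capacity (mol)
Ka = intercept / slope                # association constant (1/M)
print(f"mL = {mL:.2e} mol, Ka = {Ka:.2e} M^-1")
```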


A sensitive two-color electrophoretic mobility shift assay for detecting both nucleic acids and protein in gels

PROTEINS: STRUCTURE, FUNCTION AND BIOINFORMATICS, Issue 7 2003
Debra Jing
Abstract DNA-binding proteins are key to the regulation and control of gene expression, replication and recombination. The electrophoretic mobility shift assay (or gel shift assay) is considered an essential tool in modern molecular biology for the study of protein-nucleic acid interactions. As typically implemented, however, the technique suffers from a number of shortcomings, including the handling of hazardous 32P-labeled DNA probes, and difficulty in quantifying the amount of DNA and especially the amount of protein in the gel. A new detection method for mobility-shift assays is described that represents a significant improvement over existing techniques. The assay is fast, simple, does not require the use of radioisotopes and allows independent quantitative determination of: (i) free nucleic acid, (ii) bound nucleic acid, (iii) bound protein, and (iv) free protein. Nucleic acids are detected with SYBR® Green EMSA dye, while proteins are subsequently detected with SYPRO® Ruby EMSA dye. All fluorescence staining steps are performed after the entire gel-shift experiment is completed, so there is no need to prelabel either the DNA or the protein and no possibility of the fluorescent reagents interfering with the protein-nucleic acid interactions. The ability to independently quantify each molecular species allows more rigorous data analysis methods to be applied, especially with respect to the mass of protein bound per nucleic acid. [source]
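
Because each species is quantified independently, the bound fraction of nucleic acid can be computed lane by lane and fitted to a binding isotherm. The sketch below uses invented band intensities and a simple single-site model f = [P]/(Kd + [P]); it illustrates the kind of analysis the two-color stain enables rather than a protocol from the paper.

```python
# A minimal sketch (synthetic band intensities) of quantitative gel-shift
# analysis: compute the fraction of nucleic acid bound at each protein
# concentration, then fit a single-site isotherm to estimate an apparent Kd.
import numpy as np
from scipy.optimize import curve_fit

protein = np.array([5e-9, 1e-8, 2e-8, 5e-8, 1e-7, 2e-7])     # titrated [P] (M), hypothetical
free_band = np.array([900., 800., 650., 400., 250., 130.])   # intensity of the free-DNA band
bound_band = np.array([110., 210., 360., 610., 770., 880.])  # intensity of the shifted band

frac_bound = bound_band / (bound_band + free_band)

def isotherm(p, kd):
    return p / (kd + p)

popt, _ = curve_fit(isotherm, protein, frac_bound, p0=[1e-8])
print(f"apparent Kd ~ {popt[0]:.2e} M")
```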