Several Methods

Selected Abstracts


Statistical Analysis of Surgical Dog-Ear Regression

DERMATOLOGIC SURGERY, Issue 8 2008
KYUNG SUK LEE MD
BACKGROUND Several methods have been developed to prevent or correct dog-ears. Most of these methods, however, result in prolonged scars and operative times. OBJECTIVE We observed dog-ears without correction to examine the regression of dog-ears with time. METHODS The study was performed on 43 cases of dog-ears in 26 patients. Linear regression analysis was performed to examine the correlation between various factors and the height of the dog-ears (%). We produced a regression equation to allow prediction of the height of the dog-ears (%). In addition, we estimated the initial height of the dog-ears that should be removed during surgery. RESULTS The height of dog-ears regressed with time, and this response was better in younger and female patients. The predicted time for a dog-ear to reduce to 50% of its original height was 20.697 days; the median time at which dog-ears completely regressed was 132 days. The odds of regression of dog-ears with an initial height of <8 mm were 4.667 times greater than those of larger dog-ears. CONCLUSIONS If the height of a dog-ear is <8 mm, we recommend observation rather than immediate surgical removal. [source]
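
As a concrete illustration of the kind of analysis described, the sketch below fits a linear regression of dog-ear height on time and patient factors. All data, variable names, and coefficients here are invented placeholders, not the study's dataset or its published equation:

```python
import numpy as np

# Hypothetical illustration of the regression described above: predicting
# remaining dog-ear height (%) from days since surgery, patient age, and sex.
# All values are invented placeholders, not the study's data.
rng = np.random.default_rng(0)
days = rng.uniform(0, 180, 43)
age = rng.uniform(20, 70, 43)
female = rng.integers(0, 2, 43)
height_pct = 100 - 0.5 * days - 0.2 * age - 5 * female + rng.normal(0, 5, 43)

# Ordinary least squares via numpy's least-squares solver
X = np.column_stack([np.ones_like(days), days, age, female])
beta, *_ = np.linalg.lstsq(X, height_pct, rcond=None)
print("intercept, b_days, b_age, b_female =", beta)

# Predicted time to regress to 50% height, holding age and sex fixed
# (analogous in spirit to the paper's 20.697-day estimate)
t50 = (beta[0] + beta[2] * 40 + beta[3] * 1 - 50) / -beta[1]
print("days to reach 50% height:", t50)
```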


The Corset Platysma Repair: A Technique Revisited

DERMATOLOGIC SURGERY, Issue 3 2002
Carolyn I. Jacob MD
Background. Platysma banding along with excess submental adipose tissue and sagging skin can lead to an aged appearance. Several methods for improving neck and submental contours exist, including neck liposuction, bilateral platysma plication, midline platysma plication with transection of distal fibers, necklift with skin excision, and botulinum toxin injection for platysma relaxation. With the current interest in minimally invasive procedures, surgeons and patients are searching for techniques that produce maximal improvement with minimal intervention. Objective. To present a modified technique for maximizing neck contouring, discuss possible complications of the procedure, and describe appropriate candidates for the corset platysmaplasty. Methods. We performed a retrospective analysis of 10 consecutive patients who underwent neck liposuction with concomitant corset platysmaplasty at our institution. Results. All 10 patients achieved good to excellent submental and jawline contouring, determined by both physician and patient assessment, with no visible platysma banding at 6 months' follow-up. No major complications were noted. Conclusion. Use of corset platysmaplasty is a safe and effective method for neck rejuvenation. This variation of platysmaplasty can be used in conjunction with neck liposuction to maximize jawline and neck contour enhancement. [source]


Perfusional evaluation of postesophagectomy gastroplasty with a radioisotopic study

DISEASES OF THE ESOPHAGUS, Issue 6 2008
G. Gabiatti
SUMMARY. Anastomotic fistula represents one of the frequent causes of postoperative morbidity and mortality following transhiatal esophageal resections. The main etiological factor is ischemia of the gastric tube created for digestive transit reconstruction. Evidence suggests that peroperative hypoperfusion can persist or even worsen after the surgery. Several methods have been employed in an attempt to assess the blood perfusion of the gastric flap, but they all pose limitations. Moreover, there is a chronological mismatch between perfusion assessments, which are almost exclusively performed peroperatively, and the occurrence of a leak, which commonly appears several days after the surgery. The authors have developed a method of gastric perfusion evaluation by single photon emission computed tomography scintigraphy that corrects this temporal mismatch, allowing estimation of postoperative gastric perfusion. It is noninvasive and low cost, and it can be applied within the time frame when most fistulas occur. A high correlation between fistula occurrence and low radiotracer uptake could be demonstrated in the group of patients studied. A role in research on perfusion evaluation of different types of esophageal reconstruction is suggested. [source]


Inelastic deformation response of SDOF systems subjected to earthquakes

EARTHQUAKE ENGINEERING AND STRUCTURAL DYNAMICS, Issue 3 2002
Rafael Riddell
Abstract Performance-based seismic design requires reliable methods to predict earthquake demands on structures, and particularly inelastic deformations, to ensure that specific damage-based criteria are met. Several methods based on the response of equivalent linear single-degree-of-freedom (SDOF) systems have been proposed to estimate the response of multi-degree-of-freedom structures. These methods do not offer advantages over the traditional Veletsos-Newmark-Hall (VNH) procedure; indeed, they have been shown to be inaccurate. In this study, the VNH method is revised, considering the inelastic response of elastoplastic, bilinear, and stiffness-degrading systems with 5% damping subjected to two sets of earthquake ground motions. One is an ensemble of 51 earthquake records in the Circumpacific Belt, and the other is a group of 44 records in California. A statistical analysis of the response data provides factors for constructing VNH inelastic spectra. Such factors show that the 'equal-displacement' and 'equal-energy' rules for relating elastic and inelastic responses are unconservative for high ductilities in the acceleration- and velocity-sensitive regions of the spectrum. It is also shown that, on average, the effect of the type of force-deformation relationship of non-linear systems is not significant, and responses can be conservatively predicted using the simple elastoplastic model. Copyright © 2001 John Wiley & Sons, Ltd. [source]
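
For orientation, the two classical rules the abstract refers to are usually stated as relations between the strength-reduction factor R and the displacement ductility μ. These are the standard Newmark-Hall textbook forms, quoted here for context; the paper's own statistical factors supersede them:

```latex
\begin{align*}
  R &= \mu               && \text{(equal-displacement rule, long periods)} \\
  R &= \sqrt{2\mu - 1}   && \text{(equal-energy rule, intermediate periods)}
\end{align*}
```

The abstract's finding is that both relations permit too much strength reduction (i.e., are unconservative) at high μ in the acceleration- and velocity-sensitive spectral regions.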


State of the Art in the Field of Electronic and Bioelectronic Tongues , Towards the Analysis of Wines

ELECTROANALYSIS, Issue 23 2009
Jiri Zeravik
Abstract This review compares various types of (bio)electronic tongues. The design and principles of potentiometric and voltammetric electronic tongues are discussed together with applications in food and environmental analysis. Different approaches towards bioelectronic tongues are presented. Several methods for evaluation and interpretation of the measured data are described. Finally, the potential of such devices for the analysis of wine is discussed. [source]


Application of Exchangeable Biochemical Reactors with Oxidase-Catalase-Co-immobilizates and Immobilized Microorganisms in a Microfluidic Chip-Calorimeter

ENGINEERING IN LIFE SCIENCES (ELECTRONIC), Issue 5 2008
M. Leifheit
Abstract Several methods for the quantitative detection of different compounds, e.g., L-amino acids, sugars, or alcohols in liquid media were developed by application of an automatic measuring unit including a fluid chip-calorimeter FCC-21. For this purpose, enzymes were immobilized covalently on the inner and outer surface of CPG (controlled porous glass) spherules with an outer diameter of 100 µm and filled into a micro flow-through reaction chamber (VR = 20 µL). The design of the measuring cell allows pre-fabricated measuring cells, prepared and stored in series, to be easily inserted into the calorimeter device. These cells can be filled with different enzyme immobilizates. Different oxidases were used and co-immobilized with catalase to improve the detection sensitivity. A signal amplification of up to a factor of 3.5 could be achieved with this configuration. β-D-glucose, ethanol, and L-lysine could be detected in a range of 0.25-1.75 mM using glucose oxidase, alcohol oxidase, and lysine oxidase. The group of oxidases in combination with the enzymatic decomposition of the intermediate H2O2 allows the quantitative detection of a large number of analytes. Good measurement and storage stability could be achieved for several weeks with this immobilization method. In addition to enzyme-based detection reactions, it was shown that living microorganisms can be immobilized in the reaction chamber. Thus, the system can be used as a whole-cell biosensor. The quantitative detection of phenol in the range of 10-100 µM could be performed using the actinomycete Rhodococcus sp. immobilized on glass beads by means of embedding into polymers. [source]
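
The amplification principle can be sketched with glucose oxidase as an example. The reaction scheme is standard; the enthalpy figures below are approximate literature values, not measurements from this work:

```latex
\begin{align*}
  \text{glucose} + \mathrm{O_2}
    &\xrightarrow{\text{glucose oxidase}} \text{gluconolactone} + \mathrm{H_2O_2}
    && \Delta H_1 \approx -80\ \mathrm{kJ\,mol^{-1}} \\
  \mathrm{H_2O_2}
    &\xrightarrow{\text{catalase}} \mathrm{H_2O} + \tfrac{1}{2}\,\mathrm{O_2}
    && \Delta H_2 \approx -100\ \mathrm{kJ\,mol^{-1}}
\end{align*}
```

Decomposing the intermediate H2O2 in situ adds ΔH2 to the heat released per analyte molecule, and the catalase step regenerates half of the consumed oxygen, which helps keep the oxidase reaction from becoming O2-limited in the microreactor.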


Soil metaproteomics: a review of an emerging environmental science. Significance, methodology, perspectives

EUROPEAN JOURNAL OF SOIL SCIENCE, Issue 6 2009
Summary Soil is a dynamic system in which microorganisms perform important tasks in organic matter transformations and nutrient cycles. Recently, some studies have started to focus on soil metaproteomics as a tool for understanding the function and the role of members of the microbial community. The aim of our work was to provide a review of soil proteomics by looking at the methodologies used, in order to illustrate the challenges and gaps in this field, and to provide a broad perspective on the use and meaning of soil metaproteomics. The development of soil metaproteomics is influenced strongly by the extraction methods. Several methods are available, but only a few provide an identification of soil proteins, while others extract proteins and are able to separate them by electrophoresis but do not provide an identification. The extraction of humic compounds together with proteins interferes with the latter's separation and identification, although some methods can avoid these chemical interferences. Nevertheless, the major problems regarding protein identification reside in the fact that soil is a poor source of proteins and that there is not enough sequence-database information for the identification of proteins by mass spectrometric analysis. Once these pitfalls have been overcome, the identification of soil proteins may provide information about the biogeochemical potential of soils and pollutant degradation, and act as an indicator of soil quality, identifying which proteins and microorganisms are affected by a degradation process. The development of soil metaproteomics opens the way to proteomic studies in other complex substrates, such as organic wastes. These studies can be a source of knowledge about the possibility of directed soil restoration in polluted and degraded areas with low organic matter content, and even about the identification of enzymes and proteins with potential biotechnological value. [source]


Velocity analysis based on data correlation

GEOPHYSICAL PROSPECTING, Issue 6 2008
T. Van Leeuwen
ABSTRACT Several methods exist to automatically obtain a velocity model from seismic data via optimization. Migration velocity analysis relies on an imaging condition and seeks the velocity model that optimally focuses the migrated image. This approach has been proven to be very successful. However, most migration methods use simplified physics to make them computationally feasible and herein lies the restriction of migration velocity analysis. Waveform inversion methods use the full wave equation to model the observed data and more complicated physics can be incorporated. Unfortunately, due to the band-limited nature of the data, the resulting inverse problem is highly nonlinear. Simply fitting the data in a least-squares sense by using a gradient-based optimization method is sometimes problematic. In this paper, we propose a novel method that measures the amount of focusing in the data domain rather than the image domain. As a first test of the method, we include some examples for 1D velocity models and the convolutional model. [source]
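
To make the idea of measuring focusing in the data domain concrete, here is a minimal 1D toy: a single-reflector convolutional model, with trial velocities scored by how well the predicted trace correlates with the observed one. This is an illustration of the general idea under invented numbers, not the authors' algorithm:

```python
import numpy as np

# Toy 1D velocity scan under the convolutional model d = w * r.
# Reflector depth, velocities, and wavelet frequency are all hypothetical.
def ricker(f, dt, n):
    t = (np.arange(n) - n // 2) * dt
    return (1 - 2 * (np.pi * f * t) ** 2) * np.exp(-(np.pi * f * t) ** 2)

dt, n = 0.004, 256
w = ricker(25.0, dt, 64)
depth, v_true = 600.0, 2000.0          # single hypothetical reflector

def trace(v):
    r = np.zeros(n)
    r[int(2 * depth / v / dt)] = 1.0   # two-way traveltime -> sample index
    return np.convolve(r, w, mode="same")

d_obs = trace(v_true)
vels = np.arange(1500.0, 2500.0, 25.0)
# normalized correlation of observed and predicted data for each trial v
score = [np.dot(d_obs, trace(v)) /
         (np.linalg.norm(d_obs) * np.linalg.norm(trace(v)) + 1e-12)
         for v in vels]
print("best-correlating velocity:", vels[int(np.argmax(score))])
```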


Hypoxia in head and neck cancer: How much, how important?

HEAD & NECK: JOURNAL FOR THE SCIENCES & SPECIALTIES OF THE HEAD AND NECK, Issue 7 2005
H. L. Janssen MD
Abstract Background. Hypoxia develops in tumors because of a less ordered, often chaotic, and leaky vascular supply compared with that in normal tissues. In preclinical models, hypoxia has been shown to be associated with treatment resistance and increased malignant potential. In the clinic, several reports show the presence and extent of tumor hypoxia to be a negative prognostic indicator. This article reviews the biology and importance of hypoxia in head and neck cancer. Methods. A review of the literature was carried out and combined with our own experience of hypoxia measurements using exogenous and endogenous markers. Results. Hypoxia can increase resistance to radiation and cytotoxic drugs and lead to malignant progression, affecting all treatment modalities, including surgery. Hypoxia measurements using electrodes, exogenous bioreductive markers, or endogenous markers show the presence of hypoxia in most head and neck cancers, and correlations with outcome, although limited, consistently indicate hypoxia as an important negative factor. Each hypoxia measurement method has disadvantages, and no "gold standard" yet exists. Distinctions among chronic, acute, and intermediate hypoxia need to be made, because their biology and relevance to treatment resistance differ. Reliable methods for measuring these different forms in the clinic are still lacking. Several methods to overcome hypoxia have been tested clinically, with radiosensitizers (nimorazole), hypoxic cytotoxins (tirapazamine), and carbogen showing some success. New treatments such as hypoxia-mediated gene therapy await proper clinical testing. Conclusions. The hypoxia problem in head and neck cancer needs to be addressed if improvements in current treatments are to be made. Increased knowledge of the molecular biology of intermediate, severe, and intermittent hypoxia is needed to assess their relevance and indicate strategies for overcoming their negative influence. © 2005 Wiley Periodicals, Inc. [source]


Extracting Parameters from the Current-Voltage Characteristics of Organic Field-Effect Transistors

ADVANCED FUNCTIONAL MATERIALS, Issue 11 2004
G. Horowitz
Abstract Organic field-effect transistors were fabricated with vapor-deposited pentacene on aluminum oxide insulating layers. Several methods are used to extract the mobility and threshold voltage from the transfer characteristic of the devices. In all cases, the mobility is found to depend on the gate voltage. The first method consists of deriving the drain current as a function of gate voltage (transconductance), leading to the so-called field-effect mobility. In the second method, we assume a power-law dependence of the mobility on gate voltage together with a constant contact resistance. The third method is the so-called transfer line method, in which several devices with various channel lengths are used. It is shown that the mobility is significantly enhanced by modifying the aluminum oxide layer with carboxylic acid self-assembled monolayers prior to pentacene deposition. The methods used to extract parameters yield threshold voltages with an absolute value of less than 2 V. It is also shown that there is a shift of the threshold voltage after modification of the aluminum oxide layer. These features seem to confirm the validity of the parameter-extraction methods. [source]
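
The first (transconductance) method can be sketched as follows: in the linear regime the standard FET expression I_D = (W/L)·μ·C_i·(V_G − V_T)·V_D gives the field-effect mobility from the derivative dI_D/dV_G. The device parameters and transfer data below are hypothetical, not the paper's measurements:

```python
import numpy as np

# Sketch of the transconductance (field-effect mobility) extraction.
# Linear regime: I_D = (W/L) * mu * C_i * (V_G - V_T) * V_D.
# All device numbers below are hypothetical.
W, L = 1e-3, 50e-6        # channel width / length (m)
Ci = 3.5e-4               # insulator capacitance per area (F/m^2)
VD = -1.0                 # small drain voltage (V), linear regime

VG = np.linspace(0, -20, 201)
mu_true, VT = 0.5e-4, -2.0          # m^2/Vs, V (to generate fake data)
ID = np.where(VG < VT, (W / L) * mu_true * Ci * (VG - VT) * VD, 0.0)

gm = np.gradient(ID, VG)            # transconductance dI_D/dV_G
mu_fet = gm * L / (W * Ci * VD)     # gate-voltage-dependent mobility
print("extracted mobility at VG = -20 V: %.2f cm^2/Vs" % (mu_fet[-1] * 1e4))
```

A gate-voltage dependence of mu_fet, as reported in the abstract, shows up directly as a non-constant extracted curve rather than the flat line an ideal MOSFET model would give.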


Simultaneously recorded EEG-fMRI: Removal of gradient artifacts by subtraction of head-movement-related average artifact waveforms

HUMAN BRAIN MAPPING, Issue 10 2009
Limin Sun
Abstract Electroencephalograms (EEGs) recorded simultaneously with functional magnetic resonance imaging (fMRI) are corrupted by large repetitive artifacts generated by the switched MR gradients. Several methods have been proposed to remove these distortions by subtraction of averaged artifact templates from the ongoing EEG. Here, we present a modification of this approach which accounts for head movements to improve the extracted template. Using the fMRI analysis package statistical parametric mapping (SPM; FIL London), the head displacement is determined at each half fMRI volume. The basic idea is to apply a moving-average algorithm for template extraction but to include only epochs that were obtained at the same head position as the artifact to be removed. This approach was derived from phantom EEG measurements demonstrating substantial variations of the artifact waveform in response to movements of the phantom in the MRI magnet. To further reduce the residual noise, we applied a resampling algorithm which aligns the EEG samples in a strictly adaptive manner to the fMRI timing. Finally, we propose a new algorithm to suppress residual artifacts such as those occasionally observed in the case of brief, strong movements, which are not reflected by the movement indicator because of the limited temporal resolution of the fMRI sequence. On the basis of EEG recordings of six subjects, these measures combined reduce the residual artifact activity, quantified in terms of the spectral power at the gradient repetition rate and its harmonics, by roughly 20 to 50% (depending on the amount of movement), predominantly at frequencies beyond 30 Hz. Hum Brain Mapp, 2009. © 2009 Wiley-Liss, Inc. [source]
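
The position-matched template idea can be illustrated with a small sketch: average artifact epochs only over past epochs recorded at (nearly) the same head position, then subtract that template. This is a toy rendering of the approach, not the authors' implementation; the tolerance, window, and data are invented:

```python
import numpy as np

# Toy position-matched moving-average template subtraction.
def clean_epoch(epochs, positions, i, tol=0.3, window=20):
    """epochs: (n_epochs, n_samples) gradient-aligned EEG segments;
    positions: (n_epochs,) head-displacement indicator per half volume."""
    lo = max(0, i - window)
    sel = [j for j in range(lo, i)
           if abs(positions[j] - positions[i]) < tol]
    if not sel:                                   # no position match:
        sel = list(range(lo, max(lo + 1, i)))     # plain moving average
    template = np.mean(epochs[sel], axis=0)
    return epochs[i] - template

# fake data: sinusoidal artifact whose amplitude jumps with head position
rng = np.random.default_rng(1)
art = np.sin(np.linspace(0, 40 * np.pi, 500))
epochs = np.array([art * (1.0 if k < 50 else 1.4)
                   + 0.05 * rng.standard_normal(500) for k in range(100)])
positions = np.array([0.0] * 50 + [1.0] * 50)
residual = clean_epoch(epochs, positions, 80)
print("residual RMS after subtraction:", np.sqrt(np.mean(residual ** 2)))
```

Because only same-position epochs enter the average, the template tracks the movement-dependent waveform change that a plain moving average would smear out.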


On a multilevel preconditioning module for unstructured mesh Krylov solvers: two-level Schwarz

INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN BIOMEDICAL ENGINEERING, Issue 6 2002
R. S. Tuminaro
Abstract Multilevel methods offer the best promise to attain both fast convergence and parallel efficiency in the numerical solution of parabolic and elliptic partial differential equations. Unfortunately, they have not been widely used, in part because of implementation difficulties for unstructured mesh solvers. To facilitate use, a multilevel preconditioner software module, ML, has been constructed. Several methods are provided, requiring relatively modest programming effort on the part of the application developer. This report discusses the implementation of one method in the module: a two-level Krylov-Schwarz preconditioner. To illustrate the use of these methods in computational fluid dynamics (CFD) engineering applications, we present results for 2D and 3D CFD benchmark problems. Copyright © 2002 John Wiley & Sons, Ltd. [source]
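
A minimal dense sketch of the method class named here, a two-level additive Schwarz preconditioner on a 1D Poisson matrix, is shown below. It illustrates the structure (overlapping local solves plus a coarse-grid correction) only; the ML module itself targets unstructured meshes in parallel, and all sizes here are arbitrary:

```python
import numpy as np

# Two-level additive Schwarz on a 1D Poisson matrix (dense toy version).
n, nsub, overlap = 64, 4, 2
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)

# overlapping subdomains for the local solves
size = n // nsub
doms = [np.arange(max(0, s * size - overlap), min(n, (s + 1) * size + overlap))
        for s in range(nsub)]

# coarse space: piecewise-constant restriction to nsub coarse dofs
R0 = np.zeros((nsub, n))
for s in range(nsub):
    R0[s, s * size:(s + 1) * size] = 1.0
A0 = R0 @ A @ R0.T

def M_inv(r):
    """Additive Schwarz: coarse-grid correction plus local subdomain solves."""
    z = R0.T @ np.linalg.solve(A0, R0 @ r)
    for d in doms:
        z[d] += np.linalg.solve(A[np.ix_(d, d)], r[d])
    return z

# conditioning check: eigenvalues of M^{-1} A cluster far better than A's
Minv = np.column_stack([M_inv(e) for e in np.eye(n)])
evA = np.linalg.eigvalsh(A)
evMA = np.sort(np.real(np.linalg.eigvals(Minv @ A)))
print("cond(A) ~ %.0f,  cond(M^-1 A) ~ %.1f"
      % (evA[-1] / evA[0], evMA[-1] / evMA[0]))
```

In practice M_inv is handed to a Krylov solver (hence "Krylov-Schwarz"); the coarse solve is what makes the preconditioner scalable as the number of subdomains grows.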


A test of two methods of radiographically deriving long bone cross-sectional properties compared to direct sectioning of the diaphysis

INTERNATIONAL JOURNAL OF OSTEOARCHAEOLOGY, Issue 5 2002
Jay T. Stock
Abstract Numerous studies have made use of cross-sectional geometry to describe the distribution of cortical bone in long bone diaphyses. Several methods can be used to measure or estimate cross-sectional contours. Direct sectioning (DSM) of the diaphysis is not appropriate in most curatorial contexts, and is commonly substituted with methods based upon bi-planar radiography: a latex cast method (LCM) or an eccentric elliptical method (EEM). Previous studies have demonstrated that the EEM provides accurate estimates of area measurements, while providing less accurate estimates of second moments of area (Biknevicius & Ruff, 1992; Runestad et al., 1993; Lazenby, 1997). The LCM has been commonly employed as a way to estimate section contours more accurately, yet the validity of this method has not been adequately documented. This study measures the agreement of these methods against DSM of long bone diaphyses using 21 sections of canine tibiae derived from a study of total hip arthroplasty. The accuracy and agreement of these methods are evaluated using reduced major axis regression, paired sample t-tests and tests for agreement (Bland & Altman, 1986). The results illustrate that the LCM provides a reasonable estimate of cross-sectional dimensions, producing cross-sectional properties that are on average within 5% of properties derived from the DSM. The EEM is found to provide adequate estimates of true cross-sectional areas, but poor estimates of second moments of area. The use of the LCM is supported for all cross-sectional properties, but the EEM is only accurate in total area, cortical area and percent cortical area estimates. Copyright © 2002 John Wiley & Sons, Ltd. [source]
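
For readers outside biomechanics, the properties being compared are the standard mechanics-of-materials quantities (textbook definitions, not this study's data). For a section of area A with centroidal axes x, y:

```latex
\[
  CA = \int_A \mathrm{d}A, \qquad
  I_x = \int_A y^2\,\mathrm{d}A, \qquad
  I_y = \int_A x^2\,\mathrm{d}A, \qquad
  J = I_x + I_y .
\]
```

The eccentric elliptical method models the section as a hollow ellipse with outer semi-axes (a, b) and inner semi-axes (a', b'), giving, for example, I_x = (π/4)(ab³ − a'b'³). Second moments weight material by distance squared from the axis, so they are far more sensitive to the assumed contour than areas are, which is consistent with the EEM's areas being adequate while its second moments are poor.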


Sampling Procedures for Coordinating Stratified Samples: Methods Based on Microstrata

INTERNATIONAL STATISTICAL REVIEW, Issue 3 2008
Desislava Nedyalkova
Summary The aim of sampling coordination is to maximize or minimize the overlap between several samples drawn successively in a population that changes over time. Therefore, the selection of a new sample will depend on the samples previously drawn. In order to obtain a larger (or smaller) overlap of the samples than the one obtained by independent selection of samples, a dependence between the samples must be introduced. This dependence will emphasize (or limit) the number of common units in the selected samples. Several methods for coordinating stratified samples, such as the Kish & Scott method, the Cotton & Hesse method, and the Rivière method, have already been developed. Using simulations, we compare the optimality of these methods and their quality of coordination. We present six new methods based on permanent random numbers (PRNs) and microstrata. These new methods have the advantage of allowing us to choose between positive or negative coordination with each of the previous samples. Simulations are run to test the validity of each of them. [source]
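
The basic PRN mechanism the microstrata methods build on can be shown in a few lines (generic textbook form, not one of the paper's six methods): each unit keeps one permanent uniform number across occasions; selecting units whose number falls below the inclusion probability gives positive coordination, while using 1 − u on the next occasion gives negative coordination:

```python
import numpy as np

# Permanent random number (PRN) coordination, Bernoulli-sampling toy.
rng = np.random.default_rng(42)
N = 1000
u = rng.uniform(size=N)          # PRNs: fixed per unit across occasions

pi1, pi2 = 0.10, 0.12            # inclusion probabilities on two occasions
s1 = u < pi1                     # occasion 1 sample
s2_pos = u < pi2                 # occasion 2, positive coordination
s2_neg = (1 - u) < pi2           # occasion 2, negative coordination

print("expected overlap if independent:", round(N * pi1 * pi2, 1))
print("overlap, positive coordination:", int(np.sum(s1 & s2_pos)))
print("overlap, negative coordination:", int(np.sum(s1 & s2_neg)))
```

Positive coordination drives the overlap toward its maximum N·min(π1, π2), negative coordination toward zero, which is exactly the "larger (or smaller) overlap than independent selection" the summary describes.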


Assessing mothers' concerns in the postpartum period: methodological issues

JOURNAL OF ADVANCED NURSING, Issue 3 2004
Helen I. Lugina MN PhD RN RM
Aim. This paper reports a study evaluating the sensitivity of a semi-structured interview schedule and card sort methods in assessing postpartum concerns of women. Background. Several methods have been used to assess postpartum maternal concerns and the process of becoming a mother, but few studies have evaluated the methods with respect to their sensitivity for obtaining information. Method. A cohort of mothers was followed up at one (n = 110) and 6 weeks (n = 83) after childbirth in Dar es Salaam, Tanzania. Women with a minimum of 7 years of primary education were interviewed and also sorted cards. Those with fewer than 7 years of primary education were interviewed only. The methods were used in alternate order to assess method interaction. Results. In the interviews at 1 week, mothers more often expressed worry and interest related to the baby or themselves when they had sorted cards first. The extent to which women expressed worry and interest about specific baby- and mother-related topics was generally higher for women who had sorted cards before the interview at both 1 and 6 weeks. Independent of whether they were interviewed only, interviewed after sorting cards, or interviewed before, mothers more often expressed a higher degree of interest than of worry about the baby and self at both 1 and 6 weeks. The order of the data collection methods did not influence the way women sorted cards as worries and interests. Conclusion. Compared to interview with a semi-structured interview schedule, our findings suggest that the card sort is more sensitive in obtaining information about women's concerns. Although the interview method has the advantage of reaching less educated people, the card sort is associated with fewer barriers and is a more participatory method for those who can use it. [source]


FREE CASH FLOW AND PUBLIC GOVERNANCE: THE CASE OF ALASKA

JOURNAL OF APPLIED CORPORATE FINANCE, Issue 3 2000
Dwight R. Lee
In a widely cited 1986 article in the American Economic Review, Michael Jensen gave the concept of free cash flow (FCF) a new twist by redefining it as cash flow in excess of that required to fund all projects with positive net present values. Put another way, FCF represents funds available in the firm that managers may choose to hold as idle cash, return to shareholders, or invest in projects with returns below the firm's cost of capital. In redefining FCF in this way, Jensen converted FCF from a measure of economic income and value into a measure of corporate assets available for discretionary, and potentially value-destroying, use by firm managers. And, as he argued in his important article, managers in mature businesses with substantial free cash flow have a tendency to destroy value by plowing too much capital back into those businesses or, often worse, making ill-advised acquisitions in unrelated businesses. Several methods have been developed in financial markets and internal corporate governance systems to discourage managers from wasting FCF. Better monitoring by boards of directors, large ownership blocks, and properly aligned management compensation contracts are all parts of the solution. And the extraordinary increase in stock repurchases in recent years, invariably applauded by investors, is another illustration of the market's success in encouraging companies to address their free cash flow problems. But if the "FCF problem" of the private sector has attracted considerable attention from finance scholars, the problem is even more acute in the public sector, where FCF can be thought of as tax revenue in excess of what is required to finance well-defined and generally accepted levels of public services. Unlike the private sector, in the public sector there are neither measures nor mechanisms by which to monitor and constrain wasteful spending by elected officials. In this article, the authors attempt to measure the costs to taxpayers of government FCF using the case of Alaska, which since 1969 has received a huge windfall of tax revenue from North Slope oil leases. After examining the state's public finances from 1968 through 1993, the authors offer $25 billion as a conservative estimate of the social losses from Alaska's waste of free cash flow during that 25-year period. [source]


Investigation of bone and cartilage by synchrotron scanning-SAXS and -WAXD with micrometer spatial resolution

JOURNAL OF APPLIED CRYSTALLOGRAPHY, Issue 3-1 2000

Biological materials such as bone or wood are hierarchically structured to optimize mechanical and other properties. Several methods and experimental techniques are usually needed to study these materials on different length scales. We developed a device for small-angle X-ray scattering (SAXS) and wide-angle X-ray diffraction (WAXD), optimized for position-resolved investigations of bone sections using synchrotron radiation. Thin samples can be scanned in 20 µm steps, acquiring two-dimensional SAXS or WAXD patterns at every point. The system was tested by performing one-dimensional scans across bone-cartilage interfaces, revealing information about the size, shape, and orientation of nanometer-sized mineral particles as well as about the crystal type and texture of these particles. [source]


The effect of viscosity on surface tension measurements by the drop weight method

JOURNAL OF APPLIED POLYMER SCIENCE, Issue 3 2007
T. Kaully
Abstract Viscosity is one of the parameters affecting measured surface tension, because fluid mechanics affects the measurement process in conventional methods. Several methods, including the selected planes method (SPM) and WDSM, which combines the weight drop method (WDM) and SPM, are applied to surface tension measurement of highly viscous liquids. Yet none of them treats the viscosity effect separately. The current publication presents a simple, easy-to-apply empirical approach of satisfactory accuracy for evaluating the surface tension of liquids having a wide range of viscosities, up to 10 Pa s. The proposed method is based on Tate's law and the "drop weight" method, using calibration curves of known liquids having similar surface tensions but different viscosities. The drop weight of liquids with viscosity ≥0.05 Pa s was found to be significantly affected by the liquid viscosity. The shape factor, f, of high-viscosity liquids was found to correlate linearly with the logarithm of viscosity, pointing to the importance of viscosity correction. The experimental correlation presented in the current work can be used as a tool for the evaluation of surface tension for high-viscosity liquids such as prepolymers. © 2007 Wiley Periodicals, Inc. J Appl Polym Sci, 2007 [source]
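
Tate's law, the basis of the drop weight method, is classically written as:

```latex
\[
  m g \;=\; 2 \pi r \gamma \, f ,
\]
```

where m is the mass of the detached drop, r the tip radius, γ the surface tension, and f the Harkins-Brown shape (correction) factor, conventionally taken as a function of r/V^{1/3} with V the drop volume. The abstract's contribution is the observation that for viscous liquids f also varies linearly with log(viscosity); the classical form above is quoted for context and the paper's fitted correlation is not reproduced here.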


Cultured human keratinocytes for optical transmission measurement

JOURNAL OF BIOPHOTONICS, Issue 3 2010
David Schaaf
Abstract The challenges of measuring optical properties of human tissues include the thickness of the sample, homogenization, or crystallization from freezing of the tissue. This investigation demonstrates a method to avoid these problems by growing optically thin samples of human keratinocytes as a substitute for ex vivo epidermis samples. Several methods of growth were investigated. Resulting samples were measured on a spectrophotometer for transmission between 300 nm and 2600 nm. The efficacy of the cell growth was confirmed with histological examination of several cultured keratinocyte samples. Limitations were the requirement to measure samples immediately after removal from the incubation environment, and the absence of the irregular structures of normal skin such as hair and glands. (© 2010 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source]


Value of Different Follow-Up Strategies to Assess the Efficacy of Circumferential Pulmonary Vein Ablation for the Curative Treatment of Atrial Fibrillation

JOURNAL OF CARDIOVASCULAR ELECTROPHYSIOLOGY, Issue 12 2005
CHRISTOPHER PIORKOWSKI M.D.
Background: The objective of this study was to compare transtelephonic ECG every 2 days and serial 7-day Holter as two methods of follow-up after atrial fibrillation (AF) catheter ablation for the judgment of ablation success. Patients with highly symptomatic AF are increasingly treated with catheter ablation. Several methods of follow-up have been described, and judgment on ablation success often relies on patients' symptoms. However, the optimal follow-up strategy for objectively detecting most AF recurrences is as yet unclear. Methods: Thirty patients with highly symptomatic AF were selected for circumferential pulmonary vein ablation. During follow-up, a transtelephonic ECG was transmitted once every 2 days for half a year. Additionally, a 7-day Holter was recorded before ablation, after ablation, and after 3 and 6 months, respectively. With both procedures, symptoms and actual rhythm were correlated thoroughly. Results: A total of 2,600 transtelephonic ECGs were collected, with 216 of them showing AF; 25% of those episodes were asymptomatic. On a Kaplan-Meier analysis, 45% of the patients with paroxysmal AF were still in continuous SR after 6 months. Simulating a follow-up based on symptomatic recurrences only, that number would have increased to 70%. Using serial 7-day ECG, 113 Holter recordings with over 18,900 hours of ECG were acquired. After 6 months the percentage of patients classified as free from AF was 50%. Of the patients with recurrences, 30-40% were completely asymptomatic. The percentage of asymptomatic AF episodes increased stepwise from 11% prior to ablation to 53% 6 months after. Conclusions: The success rate in terms of freedom from AF was 70% on a symptom-only-based follow-up; using serial 7-day Holter it decreased to 50%, and on transtelephonic monitoring to 45%, respectively. Transtelephonic ECG and serial 7-day Holter were equally effective for objectively determining long-term success and detecting asymptomatic patients. [source]
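
For reference, the Kaplan-Meier estimate used for "freedom from AF" has a simple product-limit form; the sketch below implements it on invented toy data (simplified handling of ties, not the study's patients):

```python
import numpy as np

# Minimal Kaplan-Meier (product-limit) estimator sketch.
def kaplan_meier(time, event):
    """time: follow-up days; event: 1 = AF recurrence, 0 = censored."""
    order = np.argsort(time)
    t, e = np.asarray(time)[order], np.asarray(event)[order]
    at_risk, surv, curve = len(t), 1.0, []
    for i in range(len(t)):
        if e[i]:                      # recurrence observed at t[i]
            surv *= 1 - 1 / at_risk
        at_risk -= 1                  # subject leaves the risk set
        curve.append((int(t[i]), surv))
    return curve

days  = [30, 45, 60, 90, 120, 150, 170, 180, 180, 180]
recur = [1,  1,  0,  1,  1,   0,   1,   0,   0,   0]
for t, s in kaplan_meier(days, recur):
    print(f"day {t:3d}: freedom from AF = {s:.2f}")
```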


Participatory land-use planning and conservation in northern Tanzania rangelands

AFRICAN JOURNAL OF ECOLOGY, Issue 2009
Abiud L. Kaswamila
Abstract In developing countries, participatory land-use planning is seen as a panacea to mitigate land-use conflicts and enhance land productivity. This assumption has not been thoroughly tested in wildlife corridors. Three villages were selected for this study. Several methods were used to provide an indication of the performance of the plans against their stated objectives of mitigating conflicts and conserving wildlife corridors. Three hundred and fifty-eight households and eight park and extension workers were interviewed. In addition, focus group discussions with the nomadic Barabeig, field assessment, and a review of land-use plan/general management plan reports were carried out. Results reveal that the land-use plans failed to achieve their set objectives; for example, 75% of the households held this view. Major causes of failure were insufficient participation by stakeholders in the planning process; lack of robust, transparent, and accountable implementation strategies; a shortage of qualified staff; and the lack of a 'holistic approach' to the planning process. Taking these findings into account, an improved buffer zone land-use planning framework is suggested. For the framework to enhance both conservation and development, enabling policies and legislation, equitable benefit sharing, conservation education, initiation of compensation schemes for depredation caused by wild animals, and intensification of patrols are required. [source]


Correlation between the Individual and the Combined Width of the Six Maxillary Anterior Teeth

JOURNAL OF ESTHETIC AND RESTORATIVE DENTISTRY, Issue 3 2009
LUIZ CARLOS GONÇALVES DMD
ABSTRACT Purpose: There is a consensus in the dental research community that the selection of undersized artificial maxillary anterior teeth gives a denture an unnatural appearance. Several methods to select the adequate width of these teeth are of questionable validity, and many dentures have an obviously artificial appearance. This article assessed the relationship between the individual and the combined width of maxillary anterior teeth. Materials and Methods: Impressions were made of the anterior dentition of 69 dentate undergraduate students with silicone rubber impression material, and casts were formed. The individual widths of the maxillary anterior teeth were measured using a digital caliper (SC-6 digital caliper, Mitutoyo Corporation, Tokyo, Japan), and the combined width was registered both by adding the individual widths and by using a flexible millimeter ruler. Results: Student's t-test showed significant differences between analogous teeth on different sides of the maxillary dental arch (p = 0.001), with the exception of the central incisor (p = 0.984). Pearson's product moment correlation coefficient showed significant positive correlation between all the measurements compared (p = 0.000). Linear regression analysis yielded three mathematical equations to obtain the individual tooth width after measuring the combined width of the six maxillary anterior teeth with a flexible millimeter ruler. Conclusions: The individual tooth width can be determined if the combined width of the maxillary anterior teeth is obtained using a flexible millimeter ruler. CLINICAL SIGNIFICANCE The adequate selection of each maxillary anterior tooth width can offer variance and individuality to the denture, particularly for partially dentate patients. By offering an adequate tooth-to-tooth relationship, the esthetic result of the oral rehabilitation treatment can be improved. [source]
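
The three prediction equations described have the generic simple-regression structure below; only the form is shown, since the fitted coefficients a_i, b_i are reported in the paper and not reproduced here:

```latex
\[
  w_i \;=\; a_i + b_i\,W_6, \qquad
  i \in \{\text{central incisor},\ \text{lateral incisor},\ \text{canine}\},
\]
```

where W_6 is the combined width of the six maxillary anterior teeth measured with a flexible millimeter ruler and w_i is the predicted individual tooth width.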


An improved, simple nest-box trap

JOURNAL OF FIELD ORNITHOLOGY, Issue 1 2008
Scott L. Friedman
ABSTRACT The success of ornithological studies often hinges on a researcher's ability to capture individuals quickly and efficiently. Sometimes it is necessary to capture the same individual multiple times, as is the case in many metabolic, ecotoxicological, and immunocompetence studies. Several methods of capturing cavity-nesting birds at their nest boxes have been described. However, these methods proved inefficient when attempting to catch wary individuals that had already been captured previously. Here we describe a simple and inexpensive method for capturing cavity-nesting birds using a square plate of sheet metal (5.8 × 5.8 × 0.2 cm), a drinking straw, a piece of duct tape, and a monofilament line. This method has the advantages of allowing selective capture of one, but not both, members of a pair and of being nearly invisible to trap-shy birds. [source]


The Relationship Between Self-Reported Drinking and BAC Level in Emergency Room Injury Cases: Is it a Straight Line?

ALCOHOLISM, Issue 6 2010
Jason Bond
Background: While the validity of self-reported consumption based on blood alcohol concentration (BAC) has been found to be high in emergency room (ER) samples, little research exists on the estimated number of drinks consumed given a BAC level. Such data would be useful in establishing a dose-response relationship between drinking and risk (e.g., of injury) in those studies for which the number of drinks consumed is not available but BAC is. Methods: Several methods were used to estimate the number of drinks consumed in the 6 hours prior to injury based on BAC obtained at the time of ER admission of n = 1,953 patients who self-reported any drinking in the 6 hours prior to their injury and who arrived at the ER within 6 hours of the event, from the merged Emergency Room Collaborative Alcohol Analysis Project (ERCAAP) and the World Health Organization Collaborative Study on Alcohol and Injury across 16 countries. Results: The relationship between self-reported consumption and averaged BAC within each consumption level appeared to be fairly linear up to about 7 drinks and a BAC of approximately 100 mg/dl. Above about 7 reported drinks, BAC appeared to have no relationship with drinking, possibly representing consumption periods longer than the 6 hours before injury for those reporting higher quantities. Both the volume estimate from the bivariate BAC to self-report relationship and a Widmark calculation using BAC and time from last drink to arrival at the ER indicated a somewhat weak relationship to the actual number of self-reported drinks. Conclusions: Future studies may benefit from investigating the factors suspected to be driving the weak relationships between these measures, including the actual time over which the reported alcohol was consumed and the pattern of drinking over the consumption period. [source]
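
The Widmark calculation mentioned here is the standard forensic relation (classical form; the parameter values quoted are typical literature constants, not this study's):

```latex
\[
  \mathrm{BAC} \;=\; \frac{A}{r\,W} \;-\; \beta\, t ,
\]
```

where A is the mass of alcohol consumed, W body weight, r the Widmark distribution factor (roughly 0.68 for men and 0.55 for women), β the elimination rate (around 15 mg/dl per hour), and t the time since drinking began. Inverting for A given an ER-admission BAC and the reported time since the last drink yields an estimated amount consumed, which can then be converted to standard drinks.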


ICK Classification System for Partially Edentulous Arches

JOURNAL OF PROSTHODONTICS, Issue 6 2008
Sulieman S. Al-Johany BDS
Abstract Several methods of classification of partially edentulous arches have been proposed and are in use. The most familiar classifications are those originally proposed by Kennedy, Cummer, and Bailyn. None of these classification systems include implants, simply because most of them were proposed before implants became widely accepted. At this time, there is no classification system for partially edentulous arches incorporating implants placed or to be placed in the edentulous spaces for a removable partial denture (RPD). This article proposes a simple classification system for partially edentulous arches with implants based on the Kennedy classification system, with modification, to be used for RPDs. It incorporates the number and positions of implants placed or to be placed in the edentulous areas. A different name, Implant-Corrected Kennedy (ICK) Classification System, is given to the new classification system to be differentiated from other partially edentulous arch classification systems. [source]


An Alternative Technique for Fabrication of an Occlusal Device

JOURNAL OF PROSTHODONTICS, Issue 5 2008
FACP, Mijin Choi DDS
Abstract Several methods have been described for fabrication of occlusal devices, but many require complex and time-consuming laboratory procedures. In this article, an alternative fabrication method for a hard occlusal device while maintaining the articulation of the cast is described. [source]


Options for Sustaining School-Based Health Centers

JOURNAL OF SCHOOL HEALTH, Issue 4 2004
Susan M. Swider
ABSTRACT: Several methods exist for financing and sustaining operations of school-based health centers (SBHCs). Promising sources of funds include private grants, federal grants, and state funding. Recently, federal regulation changes mandated that federal funding specifically for SBHCs go only to SBHCs affiliated with a Federally Qualified Health Center (FQHC). Becoming a FQHC allows a SBHC to bill Medicaid at a higher rate, be notified about federal grants, and access the federal drug-pricing program. However, FQHCs must bill for services, including a sliding-fee scale based on ability to pay; develop a governance board with a majority of consumer members; provide a set of designated primary care services; and serve all people regardless of ability to pay. Private grants impose fewer restrictions and usually provide start-up and demonstration funds for specific program needs. Such funds are generally time limited, so new programs need to be incorporated into the operational budget of the center. State funding has proved relatively stable, but fiscal challenges in some states have made these funds less available. Using a variety of funding sources will enable ongoing provision of health care to students. Overall, SBHCs should consider infrastructure development that allows a variety of funding options, including formalizing existing partnership commitments, engaging in a needs assessment and strategic planning process, developing the infrastructure for FQHC status, and implementing a billing system for client services. [source]


COMPARISON OF METHODS FOR ANALYZING REPLICATED PREFERENCE TESTS

JOURNAL OF SENSORY STUDIES, Issue 6 2005
CHUN-YEN CHANG COCHRANE
ABSTRACT Preference testing is commonly used in consumer sensory evaluation. Traditionally, it is done without replication, effectively leading to a single 0/1 (binary) measurement on each panelist. However, to understand the nature of the preference, replicated preference tests are a better approach, resulting in binomial counts of preferences for each panelist. Variability among panelists then leads to overdispersion of the counts when the binomial model is used and to an inflated Type I error rate for statistical tests of preference. Overdispersion can be adjusted for by a Pearson correction or by other models such as the correlated binomial or beta-binomial. Several methods for analyzing replicated preference tests are suggested or reviewed in this study, and their Type I error rates and power are compared. Simulation studies show that all methods have reasonable Type I error rates and similar power. Among them, the binomial model with Pearson adjustment is probably the safest way to analyze replicated preference tests, while a normal model in which the binomial distribution is not assumed is the easiest. [source]
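
The Pearson adjustment can be sketched in a few lines: estimate a dispersion factor from the Pearson chi-square over panelists and inflate the standard error accordingly. This is the generic quasi-binomial correction, not the paper's exact simulation settings; panel size, replicates, and the beta mixing distribution below are invented:

```python
import numpy as np
from scipy import stats

# Replicated preference data: n panelists, m replicated A-vs-B choices each.
# Panelist heterogeneity (p_i varies) overdisperses the pooled counts.
rng = np.random.default_rng(7)
n, m = 100, 4
p_i = rng.beta(2, 2, size=n)            # panelist-specific preference probs
x = rng.binomial(m, p_i)                # preferences for A per panelist

p_hat = x.sum() / (n * m)
# Pearson chi-square estimate of the dispersion factor
chi2 = np.sum((x - m * p_hat) ** 2 / (m * p_hat * (1 - p_hat)))
c_hat = chi2 / (n - 1)

# test H0: p = 0.5, with and without the overdispersion correction
se_naive = np.sqrt(p_hat * (1 - p_hat) / (n * m))
se_adj = se_naive * np.sqrt(c_hat)
for name, se in [("naive binomial", se_naive), ("Pearson-adjusted", se_adj)]:
    z = (p_hat - 0.5) / se
    print(f"{name}: z = {z:.2f}, p = {2 * stats.norm.sf(abs(z)):.3f}")
```

The naive binomial test understates the standard error whenever c_hat > 1, which is the mechanism behind the inflated Type I error rate the abstract describes.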


Behaviour of carbamate pesticides in gas chromatography and their determination with solid-phase extraction and solid-phase microextraction as preconcentration steps

JOURNAL OF SEPARATION SCIENCE, JSS, Issue 16 2005
Rita Carabias-Martínez
Abstract This work reports a study of the chromatographic behaviour of seven carbamate pesticides (aldicarb, carbetamide, propoxur, carbofuran, carbaryl, methiocarb, and pirimicarb) by gas chromatography-mass spectrometry (GC-MS). Variables such as injector temperature, solvent, injection mode, and the degree of ageing of the chromatographic column were studied. One of the aims of this work was to achieve a controlled decomposition of carbamates in a solid-phase microextraction (SPME) preconcentration step with a polyacrylate fibre, in order to obtain reproducible chromatographic signals of the degradation products. Optimisation of the SPME process was accomplished by means of experimental design. Several methods using ultrapure water were developed with different preconcentration configurations: SPME-GC-MS, SPE followed by SPME-GC-MS, and SPE plus GC-MS. For all the pesticides studied, method detection limit (MDL) values below 0.1 µg/L were reached in at least one of the proposed configurations. [source]


Industrial tools for the feature location problem: an exploratory study

JOURNAL OF SOFTWARE MAINTENANCE AND EVOLUTION: RESEARCH AND PRACTICE, Issue 6 2006
Sharon Simmons
Abstract Software engineers who maintain and enhance large systems often encounter the feature location problem: where in the many thousands of lines of code is a particular user feature implemented? Several methods of addressing the problem have been proposed, most of which involve tracing the execution of the system and analyzing the traces. Some supporting academic tools are available. However, companies that depend on the successful evolution of large systems are more likely to use new methods if they are supported by industrial-strength tools of known reliability. This article describes a study performed with Motorola, Inc. to see whether there were any pitfalls in using Metrowerks CodeTEST and Klocwork inSight for feature location on message-passing software similar to systems that Motorola maintains. These two tools were combined with TraceGraph, an academic trace comparison tool. The study identified two main problems. First, some 'glue' code and workarounds were needed to get CodeTEST to generate a trace for an interval of time in which the feature was operating. Second, getting information out of TraceGraph and into inSight was needlessly complicated for a user. However, with a moderate amount of work, the tool combination was effective in locating, understanding, and documenting features. Study participants completed these steps in typically 3-4 hours per feature, studying only a few hundred lines out of a 200,000-line system. An ongoing project with Motorola is focused on improving tool integration with the hope of making feature location common practice at Motorola. [source]
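
The trace-comparison idea behind tools of this kind reduces to a set operation: functions hit in every trace that exercises the feature, but in no trace that avoids it, are candidate feature locations. The sketch below is a generic software-reconnaissance form with invented function names, not the actual output or logic of TraceGraph, CodeTEST, or inSight:

```python
# Generic trace-differencing sketch for feature location.
def feature_candidates(feature_traces, background_traces):
    """Each trace is the set of function names executed during a scenario."""
    hit_with = set.intersection(*map(set, feature_traces))
    hit_without = set.union(*map(set, background_traces))
    return hit_with - hit_without

with_feature = [{"main", "dispatch", "encode_msg", "send"},
                {"main", "dispatch", "encode_msg", "send", "log"}]
without_feature = [{"main", "dispatch", "recv", "log"}]
print(feature_candidates(with_feature, without_feature))
# -> {'encode_msg', 'send'}: a small set of functions to study first
```

This narrowing is what lets maintainers study "only a few hundred lines out of a 200,000-line system," as the study reports.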