Different Techniques

Selected Abstracts


    Volume fraction based miscible and immiscible fluid animation

    COMPUTER ANIMATION AND VIRTUAL WORLDS (PREV: JNL OF VISUALISATION & COMPUTER ANIMATION), Issue 3-4 2010
    Kai Bao
    Abstract We propose a volume fraction based approach to effectively simulate miscible and immiscible flows simultaneously. In this method, a volume fraction is introduced for each fluid component, and the mutual interactions between different fluids are simulated by tracking the evolution of the volume fractions. Different techniques are employed to handle the miscible and immiscible interactions, and special treatments are introduced to handle flows involving multiple fluids and different kinds of interactions at the same time. With this method, second-order accuracy is preserved in both space and time. The experimental results show that the proposed method handles both immiscible and miscible interactions between fluids well and generates much richer mixing detail. The method also shows good controllability: different mixing effects can be obtained by adjusting the dynamic viscosities and diffusion coefficients. Copyright © 2010 John Wiley & Sons, Ltd. [source]
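The volume-fraction bookkeeping the abstract describes can be illustrated with a minimal one-dimensional sketch (Python chosen for illustration; the grid, diffusion coefficient `D`, and explicit update are assumptions of this sketch, not the paper's solver). Each cell stores the fraction of fluid A, miscible mixing diffuses that fraction, and the pair of fractions always sums to one:

```python
import numpy as np

def diffuse_volume_fractions(f, D, dx, dt, steps):
    """Explicit 1D diffusion of a volume-fraction field f in [0, 1].

    f is the fraction of fluid A, (1 - f) the fraction of fluid B.
    Stability of the explicit scheme requires dt <= dx**2 / (2 * D).
    """
    f = f.astype(float).copy()
    r = D * dt / dx**2
    for _ in range(steps):
        # Zero-flux boundaries: pad with edge values before the stencil.
        fp = np.pad(f, 1, mode="edge")
        f = f + r * (fp[2:] - 2.0 * f + fp[:-2])
    return f

# Two fluids initially separated at the midpoint of the domain.
f0 = np.where(np.arange(100) < 50, 1.0, 0.0)
f1 = diffuse_volume_fractions(f0, D=1.0, dx=1.0, dt=0.25, steps=200)
```

With zero-flux boundaries the update conserves the total amount of each fluid exactly, which is the property that makes volume fractions a convenient bookkeeping device for multi-fluid interactions.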


    Use of Honey as an Adjunct in the Healing of Split-Thickness Skin Graft Donor Site

    DERMATOLOGIC SURGERY, Issue 2 2003
    Aykut Misirlioglu MD
    BACKGROUND Different techniques are used in the treatment of split-thickness skin graft donor sites; however, no widely accepted method has been established for these partial-thickness wounds. It is well known that honey is very effective in the treatment of various types of wounds, but there is no information in the literature about the use of honey as a split-thickness skin graft donor site dressing. OBJECTIVE To evaluate and compare the effectiveness of honey-impregnated gauzes, hydrocolloid dressings, and, as a conventional dressing, saline-soaked gauzes for skin graft donor sites. METHODS This is a nonrandomized, prospective, open-label (noncontrolled), side-by-side comparison trial of various options available for second-intention healing of split-thickness skin graft donor sites. Eighty-eight patients who underwent skin grafting were observed in two groups. In the first group, the donor site was divided into two equal halves, one half being treated with honey-soaked gauzes and the other half with paraffin gauzes (group 1A), hydrocolloid dressings (group 1B), or saline-soaked gauzes (group 1C) alternately. In the second group, two separate donor sites were formed, one treated with honey-impregnated gauzes (groups 2A–C) and the other treated with either paraffin gauzes (group 2A), hydrocolloid dressings (group 2B), or saline-soaked gauzes (group 2C). Healing time, rate of infection, and pain were evaluated. RESULTS In the treatment of split-thickness skin graft donor sites, honey-impregnated gauzes showed a faster epithelization time and lower pain than paraffin gauzes and saline-soaked gauzes. There was no significant difference between honey-impregnated gauzes and hydrocolloid dressings with regard to epithelization time and pain. CONCLUSION The use of honey-impregnated gauzes is effective, safe, and practical. 
Honey can be an alternative material for the split-thickness skin graft donor site treatment. [source]


    Molecular analysis of ammonia oxidation and denitrification in natural environments

    FEMS MICROBIOLOGY REVIEWS, Issue 5 2000
    Hermann Bothe
    Abstract This review summarizes aspects of the current knowledge about the ecology of ammonia-oxidizing and denitrifying bacteria. The development of molecular techniques has contributed enormously to the rapid recent progress in the field. Different techniques for doing so are discussed. The characterization of ammonia-oxidizing and -denitrifying bacteria by sequencing the genes encoding 16S rRNA and functional proteins opened the possibility of constructing specific probes. It is now possible to monitor the occurrence of a particular species of these bacteria in any habitat and to get an estimate of the relative abundance of different types, even if they are not culturable as yet. These data indicate that the composition of nitrifying and denitrifying communities is complex and apparently subject to large fluctuations, both in time and in space. More attempts are needed to enrich and isolate those bacteria which dominate the processes, and to characterize them by a combination of physiological, biochemical and molecular techniques. While PCR and probing with nucleotides or antibodies are primarily used to study the structure of nitrifying and denitrifying communities, studies of their function in natural habitats, which require quantification at the transcriptional level, are currently not possible. [source]


    Techniques for liver parenchymal transection: a meta-analysis of randomized controlled trials

    HPB, Issue 4 2009
    Viniyendra Pamecha
    Abstract Background: Different techniques of liver parenchymal transection have been described, including the finger fracture, sharp dissection and clamp-crush methods and, more recently, the Cavitron ultrasonic surgical aspirator (CUSA), the hydrojet and the radiofrequency dissection sealer (RFDS). This review assesses the benefits and risks associated with the various techniques. Methods: Randomized clinical trials were identified from the Cochrane Library Trials Register, MEDLINE, EMBASE, Science Citation Index Expanded and reference lists. Odds ratios (ORs), mean differences (MDs) and standardized mean differences (SMDs) were calculated with 95% confidence intervals based on intention-to-treat analysis or available-case analysis. Results: We identified seven trials including a total of 556 patients. Blood transfusion requirements were lower with the clamp-crush technique than with the CUSA or hydrojet. The clamp-crush technique was quicker than the CUSA, hydrojet or RFDS. Infective complications and transection blood loss were greater with the RFDS than with the clamp-crush method. There was no significant difference between techniques in mortality, morbidity, liver dysfunction or intensive therapy unit and hospital stay. Conclusions: The clamp-crush technique is more rapid and is associated with lower rates of blood loss and otherwise similar outcomes when compared with other methods of parenchymal transection. It represents the reference standard against which new methods may be compared. [source]
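For intuition about the pooled statistics, an odds ratio and its Woolf (log-normal) 95% confidence interval can be computed from a single trial's 2x2 table as follows (Python for illustration; the counts are hypothetical, not taken from the seven trials):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf 95% CI for a 2x2 table
    [[a, b], [c, d]] = [[events_t, non-events_t], [events_c, non-events_c]].
    Assumes all four cells are non-zero."""
    or_value = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_value) - z * se_log)
    hi = math.exp(math.log(or_value) + z * se_log)
    return or_value, (lo, hi)

# Hypothetical counts: 10/100 complications in one arm vs 20/100 in the other.
or_, (lo, hi) = odds_ratio_ci(10, 90, 20, 80)
```

Per-trial log-ORs weighted by their inverse variance are what a fixed-effect meta-analysis then pools; a CI that crosses 1 (as here) indicates no significant difference for that single trial.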


    Radiofrequency ablation-assisted liver resection: review of the literature and our experience

    HPB, Issue 4 2006
    Peng Yao
    Abstract Background: Surgical resection is the best established treatment known to provide long-term survival and the possibility of cure for liver malignancy. Intraoperative blood loss has been the major concern during major liver resections, and the mortality and morbidity of surgery are clearly associated with the amount of blood loss. Different techniques have been developed to minimize intraoperative blood loss during liver resection. The radiofrequency ablation (RFA) technique has been used widely in the treatment of unresectable liver tumors. This review concentrates on the use of RFA to provide an avascular liver resection plane. Methods and results: The review considers two types of RFA device used during liver resection: the single-needle probe and the In-Line RFA device. Conclusion: Liver resection assisted by RFA is safe and is associated with very limited blood loss. [source]


    Micromanipulation of single cells from tissue imprints is an alternative to laser-assisted microdissection

    JOURNAL OF CUTANEOUS PATHOLOGY, Issue 7 2005
    Tilmann C. Brauns
    Different techniques have been developed to obtain single cells from solid tissue. Currently, the most frequently used technique is laser-assisted microdissection (LAM). However, LAM of tissues cannot exclude contamination of the targeted cells by underlying cell fragments. Moreover, this technique can only be performed if a laser microscope is available. Thus, we developed a method to obtain single cells of fresh solid tissue by the simple technique of tissue imprints. After immunostaining of the imprints, single cells were transferred to a reaction tube using a 27-gauge needle guided by a mechanical micromanipulator. Consequently, we used these cells in a single cell PCR. [source]


    The pH effects on the growth rate of KDP (KH2PO4) crystal by investigating Raman active lattice modes

    JOURNAL OF RAMAN SPECTROSCOPY, Issue 9 2007
    M. Badrouj
    Abstract We report on the dependence of the growth rate of KDP single crystals on the pH value of the solution. Extensive experimental work has been carried out to find the optimum pH ranges for growing KDP single crystals of suitable size and high optical quality. Different techniques, including micro-Raman back-scattering spectroscopy, UV/vis/IR transmission spectroscopy and X-ray diffraction, have been employed for this investigation. Deuterated substituted single crystals of KDP and DKDP have also been grown for the investigation of growth rates and for Raman active mode identification. The molecular vibration modes of the grown crystals, including the internal modes of the PO4 tetrahedra, the external optical phonon modes and the hydrogen bonding modes, have been determined exactly by micro-Raman back-scattering spectroscopy. The best pH values for KDP crystal growth with reasonably high growth rates from aqueous solutions supersaturated at a temperature range of 30–50 °C have been found to lie in the range 3.2–5.4. Copyright © 2007 John Wiley & Sons, Ltd. [source]


    A new gastric-emptying mouse model based on in vivo non-invasive bioluminescence imaging

    NEUROGASTROENTEROLOGY & MOTILITY, Issue 10 2010
    A. Roda
    Abstract Background: Different techniques were used to assess gastric emptying (GE) in small animals; most of them require sophisticated equipment and animal sacrifice, and are expensive. In the present investigation a simple, non-invasive method based on bioluminescence imaging (BLI) is reported for studying GE, using light-emitting Escherichia coli cells as a marker of the gastric content. Methods: A new thermostable red-emitting luciferase was chosen as the reporter gene to transform E. coli cells. Bioluminescent (BL) bacteria were administered to fasting mice, after a solid meal, and in response to different doses of metoclopramide (MET) and hyoscine butylbromide (HY). Bioluminescence imaging allowed evaluation of the real-time 2D spatial and temporal distribution of the bacteria along the gastrointestinal tract and calculation of the GE rate in basal conditions and following pharmacological stimulation. Key Results: The administered BL bacteria were easily imaged and localized in the stomach and subsequently followed into the duodenum and upper intestine, allowing GE to be calculated accurately. Gastric emptying after the test meal was significantly slower (T1/2 16 ± 3 min) than in fasting conditions (T1/2 2 ± 1 min); administration of HY (1 mg kg−1 b.w.) significantly (P < 0.05) increased T1/2, which was delayed up to 25 ± 4 min; MET (1 mg kg−1 b.w.) significantly (P < 0.05) accelerated T1/2, which was reached within 8 ± 2 min. Conclusion & Inferences: The reported model is simple, inexpensive, reliable, sensitive and accurate; it can detect both acceleration and slowing of GE. The model is useful in the investigation of new drug-induced alterations of gastric motility and allows the number of experimental animals to be reduced. [source]


    Stoichiometry related defects in CdTe crystals

    PHYSICA STATUS SOLIDI (C) - CURRENT TOPICS IN SOLID STATE PHYSICS, Issue 4 2004
    F. Bissoli
    Abstract In this work, we report on the structural analyses of undoped CdTe samples grown by the vapor phase and the Bridgman methods. Different techniques were used for determining the structural defects: wet etching, high resolution X-ray diffraction, double crystal X-ray topography and monochromatic SEM-cathodoluminescence mapping. The density and the nature of the structural defects were found to be correlated to the stoichiometry of the samples, as determined by a detailed analysis of the temperature dependence of the partial pressure of the vapors in equilibrium with the solid. (© 2004 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source]


    Neck Nerve Trunks Schwannomas: Clinical Features and Postoperative Neurologic Outcome,

    THE LARYNGOSCOPE, Issue 9 2008
    Carlos Eugenio Nabuco de Araujo MD
    Abstract Objectives/Hypothesis: To analyze the clinical and epidemiological features of neck nerve schwannomas, with emphasis on the neurologic outcome after surgical excision sparing as many nerve fibers as possible with the enucleation technique. Study Design: Retrospective study. Methods: Review of medical records from 1987 to 2006 of patients with neck nerve schwannomas treated in a single institution. Results: Twenty-two patients were identified. Gender distribution was equal and age ranged from 15 to 61 years (mean: 38.6 years). Seven vagal, four brachial plexus, four sympathetic trunk, three cervical plexus, and two lesions at other sites were identified. The most common symptom was a neck mass. Local or radiating pain also occurred in five cases. The median growth rate of the tumors was 3 mm per year. Nerve paralysis was noted twice (one vagal paralysis from a vagal schwannoma and one hypoglossal paralysis from compression by a vagal schwannoma). Different techniques were employed, and seven out of nine patients kept their nerve function (78%) after enucleation. No recurrence was observed at follow-up. Conclusions: Schwannomas should be treated surgically because of their growth potential, which leads to local and neural compression symptoms. When possible, enucleation, which was employed in 10 patients of this series, is the recommended surgical option, allowing neural function preservation or restoration in most instances. This is especially important in the head and neck, where denervation may have a significant impact on quality of life. [source]


    Use of gene transfer technology for functional studies in grapevine

    AUSTRALIAN JOURNAL OF GRAPE AND WINE RESEARCH, Issue 2010
    J.R. VIDAL
    Abstract The understanding of the genetic determinism of plant phenotypes requires the functional annotation of genes governing specific traits, including the characterisation of their regulatory networks. A striking feature of the grapevine genome and proteome lies in the existence of large families related to wine attributes that have a higher gene copy number than in other sequenced plants. During speciation, the appearance of new adaptive functions is often based on the evolution of orthologous genes, eventually associated with duplication (paralogous sequences), leading to new proteins and expression profiles. The presence of original features in grapevine, including perennial status, vegetative architecture, inflorescence/tendril, flower organisation (corolla), and fleshy fruit of considerable acidity with various flavonoid compounds, makes functional genomics an essential approach to link a gene to a trait. For grapevine, the current lack of high-throughput genetic techniques (e.g. induced mutant collections) and the difficulties associated with genetic mapping (allele diversity, chimerism, generation time) highlight the critical role of transgenic technology for characterising gene function. Different techniques are available to obtain information about gene function, but the choice of a particular approach depends on the process investigated (e.g. metabolism, development, pathogen response) and the experimental purpose (e.g. induction of ectopic functions, promoter studies, subcellular localisation). After a brief overview of the development of grapevine biotechnology, this paper reviews the state of the art in gene transfer technology for grapevine and gives detailed examples where transgenic technology has proven useful for studying gene function. [source]


    Data Preparation for Real-time High Quality Rendering of Complex Models

    COMPUTER GRAPHICS FORUM, Issue 3 2006
    Reinhard Klein
    The capability of current 3D acquisition systems to digitize the geometry and reflection behaviour of objects, as well as the sophisticated application of CAD techniques, leads to rapidly growing digital models which pose new challenges for interaction and visualization. Due to the sheer size of the geometry, texture and reflection data, which are often in the range of several gigabytes, efficient techniques for analyzing, compressing and rendering are needed. In this talk I will present some of the research done in our graphics group over the past years, motivated by industrial partners, in order to automate the data preparation step and allow for real-time high quality rendering, e.g. in the context of VR applications. Strengths and limitations of the different techniques will be discussed and future challenges identified. The presentation will include live demonstrations. [source]


    A New Approach for Health Monitoring of Structures: Terrestrial Laser Scanning

    COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, Issue 1 2007
    H. S. Park
    Three-dimensional (3D) coordinates of a target structure acquired using terrestrial laser scanning (TLS) can have maximum errors of about 10 mm, which is insufficient for the purpose of structural health monitoring. A displacement measurement model is presented to improve the accuracy of the measurement. The model is tested experimentally on a simply supported steel beam. Measurements were made using three different techniques: (1) linear variable displacement transducers (LVDTs), (2) electric strain gages, and (3) a long-gage fiber optic sensor. The maximum deflections estimated by the TLS model are less than 1 mm and within 1.6% of those measured directly by LVDT. Whereas GPS methods allow measurement of displacements only at the GPS receiver antenna location, the proposed TLS method allows measurement of the entire building's or bridge's deformed shape, and thus provides a realistic solution for monitoring structures at both the structure and member level. Furthermore, it can be used to automatically create a 3D finite element model of a structural member, or of the entire structure, at any instant of time. Through periodic measurements of the deformations of a structure or structural member, and inverse structural analyses with the measured 3D displacements, the health of the structure can be monitored continuously. [source]


    Using parallelization and hardware concurrency to improve the performance of a genetic algorithm

    CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 4 2007
    Vijay Tirumalai
    Abstract Genetic algorithms (GAs) are powerful tools for solving many problems requiring the search of a solution space having both local and global optima. The main drawback for GAs is the long execution time normally required for convergence to a solution. This paper discusses three different techniques that can be applied to GAs to improve overall execution time. A serial software implementation of a GA designed to solve a task scheduling problem is used as the basis for this research. The execution time of this implementation is then improved by exploiting the natural parallelism present in the algorithm using a multiprocessor. Additional performance improvements are provided by implementing the original serial software GA in dedicated reconfigurable hardware using a pipelined architecture. Finally, an advanced hardware implementation is presented in which both pipelining and duplicated hardware modules are used to provide additional concurrency leading to further performance improvements. Copyright © 2006 John Wiley & Sons, Ltd. [source]
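As a sketch of the first of the three techniques, the fitness evaluations of a GA are mutually independent and map cleanly onto a worker pool. This toy one-max GA is illustrative only (the objective, rates, and population sizes are assumptions of this sketch, not the paper's task-scheduling problem):

```python
import random
from concurrent.futures import ThreadPoolExecutor

def fitness(ind):
    # Toy objective ("one-max"): maximize the number of 1-bits.
    return sum(ind)

def evolve(pop_size=40, length=32, generations=30, seed=1):
    """Minimal generational GA with pooled fitness evaluation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    with ThreadPoolExecutor() as pool:
        for _ in range(generations):
            # Evaluations are independent, so they parallelize naturally.
            scores = list(pool.map(fitness, pop))
            ranked = [p for _, p in sorted(zip(scores, pop), reverse=True)]
            parents = ranked[: pop_size // 2]  # truncation selection
            children = []
            while len(children) < pop_size:
                a, b = rng.sample(parents, 2)
                cut = rng.randrange(1, length)      # one-point crossover
                child = a[:cut] + b[cut:]
                if rng.random() < 0.05:             # occasional bit-flip mutation
                    child[rng.randrange(length)] ^= 1
                children.append(child)
            pop = children
    return max(pop, key=fitness)
```

A thread pool is used here for portability of the sketch; swapping in `ProcessPoolExecutor` gives true parallel speed-up for CPU-bound fitness functions, at the cost of requiring picklable arguments and a `__main__` guard on spawn-based platforms.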


    Towards a framework and a benchmark for testing tools for multi-threaded programs

    CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 3 2007
    Yaniv Eytani
    Abstract Multi-threaded code is becoming very common, both on the server side, and very recently for personal computers as well. Consequently, looking for intermittent bugs is a problem that is receiving more and more attention. As there is no silver bullet, research focuses on a variety of partial solutions. We outline a road map for combining the research within the different disciplines of testing multi-threaded programs and for evaluating the quality of this research. We have three main goals. First, to create a benchmark that can be used to evaluate different solutions. Second, to create a framework with open application programming interfaces that enables the combination of techniques in the multi-threading domain. Third, to create a focus for the research in this area around which a community of people who try to solve similar problems with different techniques can congregate. We have started creating such a benchmark and describe the lessons learned in the process. The framework will enable technology developers, for example, developers of race detection algorithms, to concentrate on their components and use other ready made components (e.g. an instrumentor) to create a testing solution. Copyright © 2006 John Wiley & Sons, Ltd. [source]


    Aerosols and gaseous contrast agents for magnetic resonance imaging of the lung

    CONTRAST MEDIA & MOLECULAR IMAGING, Issue 5 2008
    Karim Mosbah
    Abstract Magnetic resonance imaging of the lungs and the investigation of pulmonary pathologies with this technique are limited by low proton spin density, degraded magnetic field homogeneity and motion. Inhaled contrast agents (gases or aerosols) can improve the diagnostic value of lung MRI. Paramagnetic contrast agents such as aerosolized gadolinium chelates or dioxygen gas increase the relaxivity of protons in the lung parenchyma and can be used to assess the ventilated fraction of the bronchoalveolar space. Similarly, inhalation of non-proton MRI nuclei such as perfluorinated gases or hyperpolarized gases (3He or 129Xe) can provide functional ventilation images. In this review paper, the principles, practical implementation, limitations and possible safety issues of these different techniques are summarized. The main pre-clinical and clinical applications of these approaches based on inhaled contrast agents are reviewed and illustrated with cutting-edge lung MRI studies. Copyright © 2008 John Wiley & Sons, Ltd. [source]


    Sclerosing Foam in the Treatment of Varicose Veins and Telangiectases: History and Analysis of Safety and Complications

    DERMATOLOGIC SURGERY, Issue 1 2002
    Alessandro Frullini MD
    objective. To review the use of sclerosing foam in the treatment of varicose veins, to describe the different techniques of foam preparation, and to report the complications of our 3-year experience with this treatment. method. From November 1997 to the end of October 2000, 453 patients were treated with a sclerosing foam for large, medium, and minor varicosities with sodium tetradecylsulfate (STS) or polidocanol (POL). A first group of 257 patients (90 for minor varicosities and 167 for medium to large veins) received a sclerosing foam prepared according to the Monfreux technique. From December 1999 to October 2000, 196 patients were treated with a sclerosing foam prepared according to Tessari's method (36 for minor-size veins or telangiectasias and 170 for medium to large veins). Every patient was studied with (color-flow) duplex scanning before and after the treatment, and large-vein injections were administered under duplex guidance. results. The immediate success rate was 88.1% in the first group for the medium to large veins. In the same districts we registered an early success rate of 93.3% for the patients treated with Tessari's method. The complication rate (mostly minor complications) was 8.5% in the first group and 7.1% in the second group. conclusion. The use of sclerosing foam may become an established therapy in the treatment of varicose veins, with a high success rate, low cost, and low major complication rate. According to our present experience and knowledge, the safe amount of foam should not exceed the 3-ml limit, but further advances could come from standardization of the foam preparation technique. [source]


    Experience architecture: A framework for designing personalized customer interactions

    DESIGN MANAGEMENT REVIEW, Issue 2 2001
    David Rose
    Know your customers; design accordingly. Thus David Rose advocates the personalization of digital relationships. He stresses the value of integrating electronic communications with other consumer contacts. He reviews different techniques for enhancing loyalty and commitment. He recommends distilling "patterns" and precisely measuring the return from investments in different types of consumers and communication strategies. [source]


    Assessment of different techniques for subcutaneous glucose monitoring in Type 1 diabetic patients during ,real-life' glucose excursions

    DIABETIC MEDICINE, Issue 3 2010
    J. K. Mader
    Diabet. Med. 27, 332–338 (2010) Abstract Aims: To compare the accuracy of two marketed subcutaneous glucose monitoring devices (Guardian RT, GRT; GlucoDay S, GDS) and standard microdialysis (CMA60; MD) in Type 1 diabetic patients. Methods: Seven male Type 1 diabetic patients were investigated over a period of 26 h simulating real-life meal glucose excursions. Catheters of the three systems were inserted into the subcutaneous adipose tissue of the abdominal region. For MD, interstitial fluid was sampled at 30- to 60-min intervals for offline glucose determination. Reference samples were taken at 15- to 60-min intervals. All three systems were prospectively calibrated to reference. Median differences, median absolute relative differences (MARD), median absolute differences (MAD), Bland–Altman plots and Clarke error grid analysis were used to determine accuracy. Results: Bland–Altman analysis indicated a mean glucose difference (2 standard deviations) between reference and interstitial glucose of −10.5 (41.8)% for GRT, 20.2 (55.9)% for GDS and 6.5 (35.2)% for MD, respectively. Overall MAD (interquartile range) was 1.07 (0.39; 2.04) mmol/l for GRT, 1.59 (0.54; 3.08) mmol/l for GDS and 0.76 (0.26; 1.58) mmol/l for MD. Overall MARD was 15.0 (5.6; 23.4)% (GRT), 19.7 (6.1; 37.6)% (GDS) and 8.7 (4.1; 18.3)% (MD), respectively. Total sensor failure occurred in two subjects using GRT and one subject using GDS. Conclusions: The three investigated technologies had comparable performance. Whereas GRT underestimated actual blood glucose, GDS and MD overestimated blood glucose. Considerable deviations from reference glucose during daily-life meal glucose excursions were observed for all three investigated technologies. Present technologies may require further improvement before individual data can lead to direct and automated generation of therapeutic advice in diabetes management. [source]
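For readers unfamiliar with the accuracy metrics, MAD and MARD are the medians of the absolute (respectively, relative) sensor-versus-reference differences over all paired readings. A minimal sketch with made-up numbers (Python for illustration; these are not the study's data):

```python
from statistics import median

def agreement_stats(sensor, reference):
    """MAD (same units as the readings) and MARD (%) between
    paired sensor and reference glucose values."""
    diffs = [abs(s - r) for s, r in zip(sensor, reference)]
    rel = [100.0 * abs(s - r) / r for s, r in zip(sensor, reference)]
    return median(diffs), median(rel)

# Hypothetical paired readings in mmol/l:
reference = [5.0, 7.5, 10.0, 4.0]
sensor = [5.5, 7.0, 11.0, 4.4]
mad, mard = agreement_stats(sensor, reference)  # 0.5 mmol/l, 10.0%
```

Because both metrics are medians, a single gross sensor failure inflates them far less than a mean would, which is why they are favoured for continuous glucose monitor evaluation.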


    Intraoperative evaluation of sentinel lymph nodes in breast carcinoma by imprint cytology, frozen section and rapid immunohistochemistry

    DIAGNOSTIC CYTOPATHOLOGY, Issue 12 2009
    Sharma Upender M.D.
    Abstract Sentinel lymph nodes (SLNs) isolated in 40 patients with breast carcinoma (stage T1/T2) were evaluated intraoperatively by imprint cytology and frozen section. Rapid immunohistochemistry (IHC) was done in cases where both imprint smears and frozen sections were negative for any metastatic tumor deposits. The results of these different techniques were compared with postoperative paraffin sections taken as the "gold standard." The Nottingham modification of the Bloom–Richardson scoring system was used for grading the tumors. Further, the correlation of SLN status with tumor size, grade, and lymphovascular invasion was studied. The sensitivity, specificity, and overall accuracy of imprint cytology were 91.7, 100, and 95%, respectively, and those of frozen section were 95.8, 100, and 97.5%, respectively. Examination of multiple serial sections improved the sensitivity and overall accuracy of frozen section. Results of intraoperative rapid IHC were equivalent to final paraffin sections. Histological grade and lymphovascular invasion were in direct correlation with SLN metastasis (P < 0.05). The risk of lymphovascular invasion increased from 22.2% in grade I tumors to 85.7% in grade III tumors. SLN biopsy is a reliable method to evaluate the status of the axillary lymph nodes. Imprint cytology can be used reliably where frozen-section facilities are not available. Diagn. Cytopathol. 2009. © 2009 Wiley-Liss, Inc. [source]
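The reported figures follow from the standard 2x2 confusion-matrix definitions. A short sketch (Python for illustration; the counts of 24 node-positive and 16 node-negative cases are an assumption, chosen only because they are consistent with the imprint-cytology percentages above):

```python
def diagnostic_accuracy(tp, fn, tn, fp):
    """Sensitivity, specificity and overall accuracy of a test
    against a gold standard, from 2x2 confusion counts."""
    sensitivity = tp / (tp + fn)            # true positives among diseased
    specificity = tn / (tn + fp)            # true negatives among healthy
    accuracy = (tp + tn) / (tp + fn + tn + fp)
    return sensitivity, specificity, accuracy

# 22 of 24 node-positive and all 16 node-negative cases called correctly:
sens, spec, acc = diagnostic_accuracy(tp=22, fn=2, tn=16, fp=0)
# sens = 0.9167, spec = 1.0, acc = 0.95
```

A specificity of 100% means no false positives: a positive intraoperative result can be acted on immediately, while a negative result still carries the small false-negative risk captured by the sensitivity.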


    A study comparing tolerability, satisfaction and acceptance of three different techniques for esophageal endoscopy: sedated conventional, unsedated peroral ultra thin, and esophageal capsule

    DISEASES OF THE ESOPHAGUS, Issue 5 2009
    G. Nakos
    SUMMARY Three methods of esophagoscopy are currently available: sedated conventional endoscopy, unsedated ultrathin endoscopy, and esophageal capsule endoscopy. The three methods carry comparable diagnostic accuracy and different complication rates. Although all of them have been found to be well accepted by patients, no comparative study comprising the three techniques has been published. The aim of this study was to compare the three methods of esophagoscopy regarding tolerability, satisfaction, and acceptance. Twenty patients with large esophageal varices and 10 with gastroesophageal reflux disease were prospectively included. All patients consecutively underwent sedated conventional endoscopy, unsedated ultrathin endoscopy, and esophageal capsule endoscopy. After each procedure, patients completed a seven-item questionnaire. The total positive attitude of patients toward all methods was high. However, statistical analysis revealed the following differences in favor of esophageal capsule endoscopy: (i) the total positive attitude was higher (χ2 = 18.2, df = 2, P = 0.00), (ii) fewer patients felt pain (χ2 = 6.9, df = 2, P = 0.03) and discomfort (χ2 = 22.1, df = 2, P = 0.00), (iii) fewer patients experienced difficulty (χ2 = 13.7, df = 2, P = 0.01), and (iv) more patients were willing to undergo esophageal capsule endoscopy in the future (χ2 = 12.1, df = 2, P = 0.002). Esophageal capsule endoscopy was characterized by a more positive general attitude and caused less pain and discomfort. Sedated conventional endoscopy was found more difficult. More patients would repeat esophageal capsule endoscopy in the future. Patients' overall attitude toward all three available techniques for esophageal endoscopy was excellent, which renders the observed advantage of esophageal capsule endoscopy over both sedated conventional and unsedated ultrathin endoscopy a statistical finding without a real clinical benefit. [source]


    Empirical estimate of fundamental frequencies and damping for Italian buildings

    EARTHQUAKE ENGINEERING AND STRUCTURAL DYNAMICS, Issue 8 2009
    Maria Rosaria Gallipoli
    Abstract The aim of this work is to estimate the fundamental translational frequencies and relative damping of a large number of existing buildings by performing ambient vibration measurements. The first part of the work is devoted to comparing the results obtained with microtremor measurements with those obtained from earthquake recordings using four different techniques: horizontal-to-vertical spectral ratio, standard spectral ratio, non-parametric damping analysis (NonPaDAn) and the half bandwidth method. We recorded local earthquakes in a five-storey reinforced concrete building with a pair of accelerometers located on the ground and top floors, and then collected microtremors at the same locations as the accelerometers. The agreement between the results obtained with microtremors and earthquakes encouraged us to extend the ambient noise measurements to a large number of buildings. We analysed the data with the above-mentioned methods to obtain the two main translational frequencies in orthogonal directions and their relative damping for 80 buildings in the urban areas of Potenza and Senigallia (Italy). The frequencies determined with the different techniques are in good agreement. The damping estimates are less satisfactory: the NonPaDAn provides estimates that are less dispersed and grouped around values that appear more realistic. Finally, we compared the measured frequencies with other experimental results and theoretical models. Our results confirm, as reported by previous authors, that the theoretical period–height relationships overestimate the experimental data. Copyright © 2008 John Wiley & Sons, Ltd. [source]
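Of the four techniques, the half bandwidth method has a particularly compact form: the damping ratio is the width of the spectral peak between its half-power points divided by twice the resonant frequency. A minimal sketch (Python for illustration; the frequency values are synthetic, not the study's data):

```python
def half_bandwidth_damping(f0, f_lo, f_hi):
    """Half bandwidth (half-power) damping estimate: f_lo < f0 < f_hi
    are the frequencies where the spectral amplitude falls to
    1/sqrt(2) of the resonant peak at f0."""
    return (f_hi - f_lo) / (2.0 * f0)

# Sanity check on a synthetic case: for a lightly damped oscillator with
# damping ratio zeta, the half-power points sit near f0 * (1 - zeta) and
# f0 * (1 + zeta), so the estimator should recover zeta.
zeta = 0.05
f0 = 2.0  # Hz, a plausible fundamental frequency for a mid-rise building
estimate = half_bandwidth_damping(f0, f0 * (1 - zeta), f0 * (1 + zeta))
```

In practice the half-power frequencies must be picked off a noisy amplitude spectrum, which is one reason damping estimates scatter more than frequency estimates, as the abstract reports.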


    A Comparison of Echocardiographic Techniques in Determination of Arterial Elasticity in the Pediatric Population

    ECHOCARDIOGRAPHY, Issue 5 2009
    Michael Fahey M.D.
    Background: Many methods are used to measure arterial elasticity in children using echocardiography. There are no data to support the equivalence of the different techniques. The goal of this study was to evaluate the reproducibility of several techniques used to measure arterial elasticity using echocardiography. Methods: Aortic distension at two different sites (arterial distension) through the cardiac cycle was measured by four two-dimensional (2D) and M-mode echocardiographic techniques in 20 children without significant structural heart disease. These measurements, combined with noninvasive blood pressure measurements, were used to calculate arterial elastic indices. Arterial elasticity was expressed in terms of distensibility and stiffness. Data were collected by two sonographers and interpreted by two reviewers. Paired Student's t-test and Pitman's test for equality of variance for correlated observations were used to detect differences between different sonographers, different reviewers, and different techniques. Results: No significant difference in the measured elasticity between sonographers or reviewers was observed. There was a somewhat increased variance in two of the four techniques evaluated. There was no significant difference in elasticity measured using different techniques to evaluate the same arterial site, although a significantly decreased elasticity was noted from measurements taken in the proximal ascending aorta as compared with the distal ascending aorta. Conclusions: Many echocardiographic techniques produce reproducible measurements of arterial elasticity. There may be intrinsic differences in arterial elasticity between different segments of the ascending aorta, which have not been previously described in children with normal cardiac anatomy. Comparisons of data from separate studies must take these differences into account. [source]
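    Distensibility and stiffness indices of this kind are derived from systolic/diastolic arterial dimensions together with cuff blood pressure. The abstract does not give the study's exact formulas; the sketch below uses two common textbook definitions, with hypothetical measurement values:

```python
import math

def distensibility(d_sys, d_dia, sbp, dbp):
    """Arterial distensibility: 2 * (Ds - Dd) / (Dd * pulse pressure)."""
    return 2.0 * (d_sys - d_dia) / (d_dia * (sbp - dbp))

def stiffness_index(d_sys, d_dia, sbp, dbp):
    """Beta stiffness index: ln(SBP/DBP) / ((Ds - Dd) / Dd)."""
    return math.log(sbp / dbp) / ((d_sys - d_dia) / d_dia)

# Hypothetical aortic diameters (cm) and cuff pressures (mmHg).
dist = distensibility(2.2, 2.0, 110, 70)   # units: 1/mmHg
beta = stiffness_index(2.2, 2.0, 110, 70)  # dimensionless
```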


    A retrospective audit exploring the use of relaxation as an intervention in oncology and palliative care

    EUROPEAN JOURNAL OF CANCER CARE, Issue 5 2008
    J. MILLER
    The benefits of relaxation in cancer care have been well documented within the literature, with the majority of research being undertaken by nursing professionals. However, evidence of the effectiveness of relaxation interventions by occupational therapists is lacking. Occupational therapists are in an ideal situation to provide information and practical relaxation sessions. Although in numerical terms the outcome of relaxation interventions is small, the functional outcome related to quality of life and independence in activities of daily living is immeasurable. This article reports the findings of a retrospective audit exploring relaxation-specific referrals to occupational therapy, and identifies the effectiveness of a variety of different techniques currently employed within this specific programme. Patients with a primary diagnosis of breast cancer were the most frequently seen, and this prevalence is reflected in current national statistics. Similarly, those between 50 and 59 years of age comprised the largest group. Guided visualization was the most commonly used technique, although there appeared to be very little change in perceived tension between all the techniques. Further study of the impact relaxation has on occupational performance would be worthwhile. [source]


    Nanoscale Grain Refinement and H-Sorption Properties of MgH2 Processed by High-Pressure Torsion and Other Mechanical Routes,

    ADVANCED ENGINEERING MATERIALS, Issue 8 2010
    Daniel Rodrigo Leiva
    MgH2 is a promising material for solid-state hydrogen storage due to its high gravimetric and volumetric storage capacity and its relatively low cost. Severe plastic deformation (SPD) processing techniques are being explored as an alternative to high-energy ball-milling (HEBM) in order to obtain more air-resistant materials and reduce processing times. In this work, Mg, MgH2, and MgH2–Fe mixtures were severely mechanically processed by different techniques such as high-pressure torsion (HPT), extensive cold forging, and cold rolling. A very significant grain refinement was achieved when using MgH2 instead of Mg as raw material. The mean crystallite sizes observed ranged from 10 to 30 nm, depending on the processing conditions. Enhanced H-sorption properties were observed for the MgH2-based nanocomposites processed by HPT when compared with MgH2 mixtures. Additionally, cold forging and cold rolling also proved effective in nanostructuring MgH2. These results suggest a high potential for innovative application with the use of low-cost mechanical processing routes to produce Mg-based nanomaterials with attractive hydrogen storage properties. [source]
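    Mean crystallite sizes in the quoted 10 to 30 nm range are typically extracted from X-ray diffraction peak broadening. The abstract does not state the estimation method used; one common choice is the Scherrer equation, sketched below with illustrative values (Cu Kα radiation and a hypothetical MgH2 reflection near 2θ = 28°):

```python
import math

def scherrer_size(wavelength_nm, fwhm_deg, two_theta_deg, k=0.9):
    """Crystallite size D = K * lambda / (beta * cos(theta)), where beta is
    the diffraction peak full width at half maximum in radians."""
    beta = math.radians(fwhm_deg)
    theta = math.radians(two_theta_deg / 2.0)
    return k * wavelength_nm / (beta * math.cos(theta))

# Cu K-alpha wavelength 0.15406 nm, 0.5 deg FWHM broadening -> ~16 nm
d_nm = scherrer_size(0.15406, 0.5, 28.0)
```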


    Analysis of the behaviour of the structural concrete after the fire at the Windsor Building in Madrid

    FIRE AND MATERIALS, Issue 2 2010
    E. Menéndez
    Abstract The analysis of concrete subjected to high temperatures is usually undertaken by means of tests specifically designed and carried out in the laboratory, or by theoretical approaches using standardized curves for theoretical fires. An analysis using different techniques has been carried out on structural concrete exposed to the real fire at the Windsor Building in Madrid, which was severely damaged by fire in 2005. These techniques are X-ray diffraction, differential thermal and thermogravimetric analysis, and backscattered electron microscopy with energy-dispersive X-ray microanalysis. Samples of the concrete were taken from different floors in the building and analyses were carried out at different depths starting from the surface exposed to the fire itself. The analysis allows the damaged area to be delimited as well as situating the 500 °C isotherm in the concrete element. In accordance with the results obtained, the damage is limited to just a few centimeters from the surface exposed to the fire, in spite of its prolonged exposure. This would justify that the concrete has demonstrated suitable resistant behaviour. Likewise, it can be deduced from the results obtained that the fire to which the concrete was subjected can be qualified as severe. Also, these results confirm that the calculation hypothesis in the project is correct in relation to the fire resistance exigencies of the concrete. Copyright © 2009 John Wiley & Sons, Ltd. [source]


    Hydroacoustic target strength validation using angling creel census data

    FISHERIES MANAGEMENT & ECOLOGY, Issue 6 2002
    P. A. FREAR
    Validation of hydroacoustic in-situ target strength is problematic in large, deep lowland rivers, which cannot be sampled easily by conventional methods such as netting or electric fishing. A sampling programme involving three different techniques (split-beam sonar, angling census and post-angling competition data collection) was conducted to examine methodologies suitable for target strength validation. This combination of techniques also assessed the relative merits of each method for best describing fish populations and the stocks exploited in a recreational coarse fishery. The sonar estimated the greatest number of fish of the three techniques, with a strong positive size correlation with the other two methods. The angling census and post-competition census accounted for more large fish (>26 cm) than were detected acoustically, indicating a stratification of species that were exploited by angling but not detected by horizontal sonar. The combined techniques demonstrated a suitable, cost-effective, hydroacoustic validation method for large UK rivers, which supports recreational coarse fisheries management, with the added advantage of species identification. [source]


    Analytical and 3-D numerical modelling of Mt. Etna (Italy) volcano inflation

    GEOPHYSICAL JOURNAL INTERNATIONAL, Issue 2 2005
    A. Bonaccorso
    SUMMARY Since 1993, geodetic data obtained by different techniques (GPS, EDM, SAR, levelling) have detected a consistent inflation of the Mt. Etna volcano. The inflation, culminating with the 1998–2001 strong explosive activity from summit craters and recent 2001 and 2002 flank eruptions, is interpreted in terms of magma ascent and refilling of the volcanic plumbing system and reservoirs. We have modelled the 1993–1997 EDM and GPS data by 3-D pressurized sources to infer the position and dimension of the magma reservoir. We have performed analytical inversions of the observed deformation using both spheroidal and ellipsoidal sources embedded in a homogeneous elastic half-space and by applying different inversion methods. Solutions for these types of sources show evidence of a vertically elongated magma reservoir located 6 km beneath the summit craters. The maximum elevation of topography is comparable to such depth and strong heterogeneities are inferred from seismic tomography; in order to assess their importance, further 3-D numerical models, employing source parameters extracted from analytical models, have been developed using the finite-element technique. The deformation predicted by all the models considered shows a general agreement with the 1993–1997 data, suggesting the primary role of a pressure source, while the complexities of the medium play a minor role under elastic conditions. However, major discrepancies between data and models are located in the SE sector, suggesting that sliding along potential detachment surfaces may contribute to amplify deformation during the inflation. For the first time realistic features of Mt. Etna are studied by a 3-D numerical model characterized by the topography and lateral variations of elastic structure, providing a framework for a deeper insight into the relationships between internal sources and tectonic structures. [source]
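    The pressurized-source inversions described above have a classic, much simpler relative: the Mogi point-source approximation, which gives the surface deformation of a small spherical pressure source in an elastic half-space. The sketch below (illustrative parameters, not the paper's inversion) evaluates the vertical displacement profile for a source at 6 km depth, the depth inferred for Etna's reservoir:

```python
import numpy as np

def mogi_uz(r, depth, radius, dP, shear_mod=30e9, nu=0.25):
    """Vertical surface displacement of a Mogi point source:
    uz = (1 - nu) * a^3 * dP / G * d / (r^2 + d^2)^(3/2)."""
    alpha = (1.0 - nu) * radius**3 * dP / shear_mod
    return alpha * depth / (r**2 + depth**2) ** 1.5

# Source at 6 km depth, 500 m radius, 10 MPa overpressure (hypothetical).
r = np.linspace(0.0, 15e3, 301)  # radial distance from source axis (m)
uz = mogi_uz(r, depth=6e3, radius=500.0, dP=10e6)
```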


    BIOMOD , optimizing predictions of species distributions and projecting potential future shifts under global change

    GLOBAL CHANGE BIOLOGY, Issue 10 2003
    Wilfried Thuiller
    Abstract A new computation framework (BIOMOD: BIOdiversity MODelling) is presented, which aims to maximize the predictive accuracy of current species distributions and the reliability of future potential distributions using different types of statistical modelling methods. BIOMOD capitalizes on the different techniques used in static modelling to provide spatial predictions. It computes, for each species and in the same package, the four most widely used modelling techniques in species predictions, namely Generalized Linear Models (GLM), Generalized Additive Models (GAM), Classification and Regression Tree analysis (CART) and Artificial Neural Networks (ANN). BIOMOD was applied to 61 species of trees in Europe using climatic quantities as explanatory variables of current distributions. On average, all the different modelling methods yielded very good agreement between observed and predicted distributions. However, the relative performance of different techniques was idiosyncratic across species, suggesting that the most accurate model varies between species. The results of this evaluation also highlight that slight differences between current predictions from different modelling techniques are exacerbated in future projections. Therefore, it is difficult to assess the reliability of alternative projections without validation techniques or expert opinion. It is concluded that rather than using a single modelling technique to predict the distribution of several species, it would be more reliable to use a framework assessing different models for each species and selecting the most accurate one using both evaluation methods and expert knowledge. [source]
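    The per-species workflow — fit several techniques, then keep the most accurate — can be sketched with scikit-learn stand-ins for BIOMOD's model families (logistic regression for GLM, a decision tree for CART, a small neural network for ANN; GAM is omitted for lack of a standard scikit-learn implementation). Data and settings below are illustrative, not the paper's:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression  # GLM analogue
from sklearn.tree import DecisionTreeClassifier      # CART
from sklearn.neural_network import MLPClassifier     # ANN
from sklearn.model_selection import cross_val_score

# Synthetic presence/absence data for one species: three climatic predictors.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 3))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

models = {
    "GLM": LogisticRegression(),
    "CART": DecisionTreeClassifier(max_depth=4, random_state=0),
    "ANN": MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0),
}
# Cross-validated accuracy per technique; keep the best model for this species.
scores = {name: cross_val_score(m, X, y, cv=5).mean() for name, m in models.items()}
best = max(scores, key=scores.get)
```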


    Partial laryngectomy for recurrent glottic carcinoma after radiotherapy

    HEAD & NECK: JOURNAL FOR THE SCIENCES & SPECIALTIES OF THE HEAD AND NECK, Issue 2 2005
    Aniel Sewnaik MD
    Abstract Background. Early laryngeal cancer is treated with surgery or radiotherapy. A partial laryngectomy instead of a total laryngectomy can be used for treating patients with radiation failures. Methods. Patients were grouped by the two types of partial laryngectomies we performed: group I, endoscopic laser surgery (n = 42); and group II, frontolateral partial laryngectomy (n = 21). Results. With CO2 laser treatment, 14 of 24 patients (no involvement of the anterior commissure) and eight of 18 patients (involvement of the anterior commissure) were cured. With the frontolateral partial laryngectomy, we achieved local control in 15 of 21 patients. Conclusions. If the surgeon is familiar with the different techniques of, and indications for, partial laryngectomy, this can be a good and satisfying treatment in selected patients with radiation failure for glottic cancer. © 2004 Wiley Periodicals, Inc. Head Neck 27: 101–107, 2004 [source]