Future Use (future + use)

Distribution by Scientific Domains
Distribution within Medical Sciences

Kinds of Future Use

  • possible future use


Selected Abstracts


    Evidence-based practice-focused interactive teaching strategy: a controlled study

    JOURNAL OF ADVANCED NURSING, Issue 6 2009
    Son C. Kim
    Abstract Title. Evidence-based practice-focused interactive teaching strategy: a controlled study. Aim. This paper is a report of a study to evaluate the effectiveness of the evidence-based practice (EBP)-focused interactive teaching (E-FIT) strategy. Background. Although EBP is a mandatory competency for all healthcare professionals, little is known about the effectiveness of E-FIT in nursing. Methods. A quasi-experimental, controlled, pre- and post-test study involving senior, 4th-year nursing students (N = 208) at two nursing schools in the USA was carried out from August 2007 to May 2008. The experimental group (n = 88) received the E-FIT strategy intervention and the control group (n = 120) received standard teaching. A Knowledge, Attitudes and Behaviors Questionnaire for Evidence-Based Practice was used to assess the effectiveness of the E-FIT strategy. Results. Independent t-tests showed that the experimental group had statistically significant higher post-test Evidence-Based Practice Knowledge (mean difference = 0·25; P = 0·001) and Evidence-Based Practice Use (mean difference = 0·26; P = 0·015) subscale scores compared to the control group, but showed no statistically significant differences in Attitudes toward Evidence-Based Practice and Future Use of Evidence-Based Practice (mean difference = −0·12; P = 0·398 and mean difference = 0·13; P = 0·255 respectively). Hierarchical multiple regression analyses of the post-test data indicated that the intervention explained 7·6% and 5·1% of variance in Evidence-Based Practice Knowledge and Evidence-Based Practice Use respectively. Conclusion. The EBP-focused interactive teaching strategy was effective in improving the knowledge and use of EBP among nursing students but not attitudes toward or future use of EBP. [source]
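
    For reference, the sketch below runs an independent-samples t-test of the kind used to compare the two groups above; the simulated scores, group means and spread are hypothetical placeholders, not the study's data.

```python
# Minimal sketch of an independent-samples t-test comparing two groups.
# The simulated subscale scores below are hypothetical, not study data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
e_fit_group = rng.normal(loc=3.75, scale=0.60, size=88)     # experimental group (n = 88)
control_group = rng.normal(loc=3.50, scale=0.60, size=120)  # control group (n = 120)

t_stat, p_value = stats.ttest_ind(e_fit_group, control_group)
mean_difference = e_fit_group.mean() - control_group.mean()
print(f"mean difference = {mean_difference:.2f}, t = {t_stat:.2f}, P = {p_value:.3f}")
```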


    Seeking Help a Second Time: Parents'/Caregivers' Characterizations of Previous Experiences With Mental Health Services for Their Children and Perceptions of Barriers to Future Use

    AMERICAN JOURNAL OF ORTHOPSYCHIATRY, Issue 2 2006
    Dara Kerkorian PhD
    This study examines the relationship between urban parents'/caregivers' previous experiences obtaining mental health care for their children and their perceptions of barriers to their children's use of services in the future. Assessments of prior treatment outcome and aspects of relationships with former providers were linked to endorsements of doubt about the utility of treatment as a potential barrier to the children's use of services in the future and the number of barriers parents endorsed. Implications for urban child mental health service delivery are drawn. [source]


    Uptake kinetics and subcellular compartmentalization of cadmium in acclimated and unacclimated earthworms (Eisenia andrei)

    ENVIRONMENTAL TOXICOLOGY & CHEMISTRY, Issue 7 2010
    Shuo Yu
    Abstract Acclimation to cadmium (Cd) levels exceeding background concentrations may influence the ability of earthworms to accumulate Cd with minimum adverse effects. In the present study, earthworms (Eisenia andrei) were acclimated by exposure to 20 mg/kg Cd (dry wt) in Webster soil for 28 d. A 224-d bioaccumulation test was subsequently conducted with both acclimated and unacclimated worms exposed in Webster soils spiked with 20 mg/kg and 100 mg/kg Cd (dry wt). Uptake kinetics and subcellular compartmentalization of Cd were examined. Results suggest that acclimated earthworms accumulated more Cd and required a longer time to reach steady state than unacclimated worms. Most of the Cd was present in the metallothionein (MT) fraction. Cadmium in the MT fraction increased approximately linearly with time and required a relatively longer time to reach steady state than Cd in cell debris and granule fractions, which quickly reached steady state. Cadmium in the cell debris fraction is considered potentially toxic, but low steady state concentrations observed in the present study would not suggest the potential for adverse effects. Future use of earthworms in ecological risk assessment should take into consideration pre-exposure histories of the test organisms. A prolonged test period may be required for a comprehensive understanding of Cd uptake kinetics and compartmentalization. Environ. Toxicol. Chem. 2010;29:1568–1574. © 2010 SETAC [source]
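
    As a brief illustration of the uptake kinetics discussed above, the sketch below fits a one-compartment first-order bioaccumulation model, a form commonly used for earthworm uptake data; the rate constants and body-burden values are invented for illustration and are not results from this study.

```python
# One-compartment first-order uptake model often used for earthworm
# bioaccumulation data: C(t) = (ku / ke) * C_soil * (1 - exp(-ke * t)).
# All values below are illustrative assumptions, not results from the study.
import numpy as np
from scipy.optimize import curve_fit

def uptake(t, ku, ke, c_soil=20.0):                 # c_soil in mg/kg dry wt
    return (ku / ke) * c_soil * (1.0 - np.exp(-ke * t))

t_days = np.array([0, 14, 28, 56, 112, 168, 224], dtype=float)
body_burden = np.array([0, 18, 32, 52, 70, 78, 80], dtype=float)  # hypothetical mg/kg

(ku, ke), _ = curve_fit(uptake, t_days, body_burden, p0=[0.05, 0.02])
print(f"ku = {ku:.3f} /d, ke = {ke:.4f} /d, predicted steady state = {ku / ke * 20.0:.1f} mg/kg")
```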


    Glucagon-like peptide-1 receptor is present on human hepatocytes and has a direct role in decreasing hepatic steatosis in vitro by modulating elements of the insulin signaling pathway

    HEPATOLOGY, Issue 5 2010
    Nitika Arora Gupta
    Glucagon-like peptide 1 (GLP-1) is a naturally occurring peptide secreted by the L cells of the small intestine. GLP-1 functions as an incretin and stimulates glucose-mediated insulin production by pancreatic β cells. In this study, we demonstrate that exendin-4/GLP-1 has a cognate receptor on human hepatocytes and that exendin-4 has a direct effect on the reduction of hepatic steatosis in the absence of insulin. Both glucagon-like peptide 1 receptor (GLP-1R) messenger RNA and protein were detected on primary human hepatocytes, and receptor was internalized in the presence of GLP-1. Exendin-4 increased the phosphorylation of 3-phosphoinositide-dependent kinase-1 (PDK-1), AKT, and protein kinase C ζ (PKC-ζ) in HepG2 and Huh7 cells. Small interfering RNA against GLP-1R abolished the effects on PDK-1 and PKC-ζ. Treatment with exendin-4 quantitatively reduced triglyceride stores compared with control-treated cells. Conclusion: This is the first report that the G protein-coupled receptor GLP-1R is present on human hepatocytes. Furthermore, it appears that exendin-4 has the same beneficial effects in vitro as those seen in our previously published in vivo study in ob/ob mice, directly reducing hepatocyte steatosis. Future use for human nonalcoholic fatty liver disease, either in combination with dietary manipulation or other pharmacotherapy, may be a significant advance in treatment of this common form of liver disease. (HEPATOLOGY 2010) [source]


    The role of medical simulation: an overview

    THE INTERNATIONAL JOURNAL OF MEDICAL ROBOTICS AND COMPUTER ASSISTED SURGERY, Issue 3 2006
    Kevin Kunkler
    Abstract Robotic surgery and medical simulation have much in common: both use a mechanized interface that provides visual "patient" reactions in response to the actions of the health care professional (although simulation also includes touch feedback); both use monitors to visualize the progression of the procedure; and both use computer software applications through which the health care professional interacts. Both technologies are experiencing rapid adoption and are viewed as modalities that allow physicians to perform increasingly complex minimally invasive procedures while enhancing patient safety. A review of the literature and industry developments concludes that medical simulators can be useful tools in determining a physician's understanding and use of best practices, management of patient complications, appropriate use of instruments and tools, and overall competence in performing procedures. Future use of these systems depends on their impact on patient safety, procedure completion time and cost efficiency. The sooner simulation training can be used to support developing technologies and procedures, the earlier, and typically the better, the results. Continued studies are needed to identify and ensure the ongoing applicability of these systems for both training and certification. Copyright © 2006 John Wiley & Sons, Ltd. [source]


    Baseline indicators for measuring progress in preventing falls injury in older people

    AUSTRALIAN AND NEW ZEALAND JOURNAL OF PUBLIC HEALTH, Issue 5 2009
    Annaliese M. Dowling
    Abstract Objective: Over recent years, there has been increasing attention given to preventing falls and falls injury in older people through policy and other initiatives. This paper presents a baseline set of fall injury outcome indicators against which these preventive efforts can be assessed in terms of monitoring the rate of fall-related deaths and hospitalisations. Methods: ICD-10-AM coded hospital separations, Australian Bureau of Statistics (ABS) mortality and ABS population data were used to determine the rate of fall-related injury mortality and hospitalisations occurring in people aged 65+ years in New South Wales (NSW), Australia, over the six-year period from 1998/99 to 2003/04, inclusive. Results: Baseline trends for one fatality and five separations-based metrics are presented. Overall, fall mortality rates increased over the six years, with higher rates in males. Falls hospitalisation rates also increased slightly, with higher rates in females. The rates of hip fracture and pelvic fracture hospital separations generally declined over the six years and were highest in females. The level of unspecified and missing information about the place where falls occur increased by 1.5%. Conclusion: Baseline trends in fall injury outcome metrics highlight the severity and frequency of fall injuries before wide scale implementation of the Management Policy to Reduce Fall Injury Among Older People in NSW. Implications: Future use of these metrics will help to evaluate and monitor the progress of falls prevention in older people in NSW. They could also be adopted in other jurisdictions. [source]
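
    As a simple illustration of the metrics behind these indicators, the sketch below computes an annual fall-injury rate per 100,000 population aged 65+ from an event count and a population denominator; the counts shown are invented placeholders, not NSW figures.

```python
# Minimal sketch of a population-based rate of the kind used for the baseline
# indicators above: events per 100,000 persons aged 65+ per year.
# The counts below are illustrative placeholders, not NSW data.
def rate_per_100k(event_count, population):
    return event_count / population * 100_000

fall_hospitalisations = 21_500   # hypothetical annual separations, persons 65+
population_65_plus = 850_000     # hypothetical population aged 65+

print(f"{rate_per_100k(fall_hospitalisations, population_65_plus):.1f} per 100,000 per year")
```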


    Increasing the Homogeneity of CAT's Item-Exposure Rates by Minimizing or Maximizing Varied Target Functions While Assembling Shadow Tests

    JOURNAL OF EDUCATIONAL MEASUREMENT, Issue 3 2005
    Yuan H. Li
    A computerized adaptive testing (CAT) algorithm that has the potential to increase the homogeneity of CAT's item-exposure rates without significantly sacrificing the precision of ability estimates was proposed and assessed in the shadow-test (van der Linden & Reese, 1998) CAT context. This CAT algorithm was formed by a combination of maximizing or minimizing varied target functions while assembling shadow tests. There were four target functions to be separately used in the first, second, third, and fourth quarter test of CAT. The elements to be used in the four functions were associated with (a) a random number assigned to each item, (b) the absolute difference between an examinee's current ability estimate and an item difficulty, (c) the absolute difference between an examinee's current ability estimate and an optimum item difficulty, and (d) item information. The results indicated that this combined CAT fully utilized all the items in the pool, reduced the maximum exposure rates, and achieved more homogeneous exposure rates. Moreover, its precision in recovering ability estimates was similar to that of the maximum item-information method. The combined CAT method resulted in the best overall results compared with the other individual CAT item-selection methods. The findings from the combined CAT are encouraging. Future uses are discussed. [source]
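
    A highly simplified sketch of the quarter-by-quarter switch in selection targets is given below; the published method embeds these targets as objective functions within shadow-test assembly, which this toy selector does not reproduce, and the handling of the "optimum difficulty" target is an assumption.

```python
# Toy sketch of switching item-selection targets across the four quarters of a
# CAT, loosely following the combined method summarized above. The published
# approach optimizes these targets inside shadow-test assembly (a constrained
# test-assembly model), which this simplified selector omits.
import math
import random

def item_information(theta, a, b):
    """Fisher information of a 2PL item at ability estimate theta."""
    p = 1.0 / (1.0 + math.exp(-a * (theta - b)))
    return a * a * p * (1.0 - p)

def select_next_item(quarter, theta, items, administered, optimum_offset=0.0):
    """items: list of dicts with discrimination 'a' and difficulty 'b'."""
    pool = [i for i in range(len(items)) if i not in administered]
    if quarter == 1:   # random element assigned to each candidate item
        return random.choice(pool)
    if quarter == 2:   # difficulty closest to the current ability estimate
        return min(pool, key=lambda i: abs(theta - items[i]["b"]))
    if quarter == 3:   # difficulty closest to an "optimum" target (interpretation assumed)
        return min(pool, key=lambda i: abs((theta + optimum_offset) - items[i]["b"]))
    return max(pool, key=lambda i: item_information(theta, items[i]["a"], items[i]["b"]))

item_pool = [{"a": 1.0 + 0.1 * k, "b": -2.0 + 0.2 * k} for k in range(20)]
print(select_next_item(quarter=4, theta=0.3, items=item_pool, administered={0, 5}))
```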


    Rare Earth Deposits of North America

    RESOURCE GEOLOGY, Issue 4 2008
    Stephen B. Castor
    Abstract Rare earth elements (REE) have been mined in North America since 1885, when placer monazite was produced in the southeast USA. Since the 1960s, however, most North American REE have come from a carbonatite deposit at Mountain Pass, California, and most of the world's REE came from this source between 1965 and 1995. After 1998, Mountain Pass REE sales declined substantially due to competition from China and to environmental constraints. REE are presently not mined at Mountain Pass, and shipments were made from stockpiles in recent years. Chevron Mining, however, restarted extraction of selected REE at Mountain Pass in 2007. In 1987, Mountain Pass reserves were calculated at 29 Mt of ore with 8.9% rare earth oxide based on a 5% cut-off grade. Current reserves are in excess of 20 Mt at similar grade. The ore mineral is bastnasite, and the ore has high light REE/heavy REE (LREE/HREE). The carbonatite is a moderately dipping, tabular 1.4-Ga intrusive body associated with ultrapotassic alkaline plutons of similar age. The chemistry and ultrapotassic alkaline association of the Mountain Pass deposit suggest a different source than that of most other carbonatites. Elsewhere in the western USA, carbonatites have been proposed as possible REE sources. Large but low-grade LREE resources are in carbonatite in Colorado and Wyoming. Carbonatite complexes in Canada contain only minor REE resources. Other types of hard-rock REE deposits in the USA include small iron-REE deposits in Missouri and New York, and vein deposits in Idaho. Phosphorite and fluorite deposits in the USA also contain minor REE resources. The most recently discovered REE deposit in North America is the Hoidas Lake vein deposit, Saskatchewan, a small but incompletely evaluated resource. Neogene North American placer monazite resources, both marine and continental, are small or in environmentally sensitive areas, and thus unlikely to be mined. Paleoplacer deposits also contain minor resources. Possible future uranium mining of Precambrian conglomerates in the Elliott Lake–Blind River district, Canada, could yield by-product HREE and Y. REE deposits occur in peralkaline syenitic and granitic rocks in several places in North America. These deposits are typically enriched in HREE, Y, and Zr. Some also have associated Be, Nb, and Ta. The largest such deposits are at Thor Lake and Strange Lake in Canada. A eudialyte syenite deposit at Pajarito Mountain in New Mexico is also probably large, but of lower grade. Similar deposits occur at Kipawa Lake and Lackner Lake in Canada. Future uses of some REE commodities are expected to increase, and growth is likely for REE in new technologies. World reserves, however, are probably sufficient to meet international demand for most REE commodities well into the 21st century. Recent experience shows that Chinese producers are capable of large amounts of REE production, keeping prices low. Most refined REE prices are now at approximately 50% of the 1980s price levels, but there has been recent upward price movement for some REE compounds following Chinese restriction of exports. Because of its grade, size, and relatively simple metallurgy, the Mountain Pass deposit remains North America's best source of LREE. The future of REE production at Mountain Pass is mostly dependent on REE price levels and on domestic REE marketing potential. The development of new REE deposits in North America is unlikely in the near future. Undeveloped deposits with the most potential are probably large, low-grade deposits in peralkaline igneous rocks. Competition with established Chinese HREE and Y sources and a developing Australian deposit will be a factor. [source]


    Measurement Equivalence Using Generalizability Theory: An Examination of Manufacturing Flexibility Dimensions

    DECISION SCIENCES, Issue 4 2008
    Manoj K. Malhotra
    ABSTRACT As the field of decision sciences in general and operations management in particular has matured from theory building to theory testing over the past two decades, it has witnessed an explosion in empirical research. Much of this work is anchored in survey-based methodologies in which data are collected from the field in the form of scale items that are then analyzed to measure latent unobservable constructs. It is important to assess the invariance of scales across groups in order to reach valid, scientifically sound conclusions. Because studies have often been conducted in the field of decision sciences with small sample sizes, it further exacerbates the problem of reaching incorrect conclusions. Generalizability theory can more effectively test for measurement equivalence in the presence of small sample sizes than the confirmatory factor analysis (CFA) tests that have been conventionally used for assessing measurement equivalency across groups. Consequently, we introduce and explain the generalizability theory (G-theory) in this article to examine measurement equivalence of 24 manufacturing flexibility dimension scales that have been published in prior literature and also compare and contrast G-theory with CFA. We show that all the manufacturing flexibility scales tested in this study were invariant across the three industry SIC groups from which data were collected. We strongly recommend that G-theory should always be used for determining measurement equivalence in empirical survey-based studies. In addition, because using G-theory alone does not always reveal the complete picture, CFA techniques for establishing measurement equivalence should also be invoked when sample sizes are large enough to do so. Implications of G-theory for practice and its future use in operations management and decision sciences research are also presented. [source]


    Adipogenic Differentiation of Human Adipose Tissue–Derived Stem Cells Obtained from Cryopreserved Adipose Aspirates

    DERMATOLOGIC SURGERY, Issue 7 2010
    JUNG EUN LEE MS
    BACKGROUND Although frozen adipose tissue is frequently used for soft tissue augmentation, the viability of frozen fat remains a controversy. The cryopreservation of adipose tissue is important for the future use of adipose-derived stem cells (ASCs) and adipocytes. OBJECTIVE To determine whether optimal cryopreservation techniques with regard to the addition of cryopreservative agents and preservation temperature is essential for the long-term storage of adipose tissue and whether ASCs from cryopreserved adipose aspirates are reliable for use in adipogenic differentiation. MATERIALS AND METHODS Adipose tissue was frozen directly or with cryoprotectant at −20°C or −80°C for 1 year. The viability of adipose aspirates and the differentiation of ASCs isolated from adipose tissue were evaluated. RESULTS The viability of adipose aspirates frozen with dimethyl sulfoxide at −80°C was approximately 87% after 2 months of storage. Moreover, ASCs from adipose tissue stored with cryoprotectant survived successfully for 1 year and differentiated into adipocytes, although ASCs were not detected in the directly frozen adipose tissue. CONCLUSION Adipose tissue cryopreserved with cryoprotectant and stored at optimal temperature might prove to be a reliable source of human ASCs and adipocytes. The authors have indicated no significant interest with commercial supporters. [source]


    A retrospective study of the diagnostic accuracy of fine-needle aspiration for breast lesions and implications for future use

    DIAGNOSTIC CYTOPATHOLOGY, Issue 12 2008
    Christina Day M.D.
    Abstract In recent years, the use of fine-needle aspiration (FNA) in the diagnosis of breast lesions has declined in many institutions. We sought to evaluate the role of FNA for breast lesions and the annual rate of the procedure at our institution over a 4½ year period (May 2002–October 2006). A total of 831 FNAs were performed, with 258 (31%) having histologic follow-up. The number of FNAs obtained was 159 from 5/02 to 4/03, 192 from 5/03 to 4/04, 194 from 5/04 to 4/05, 191 from 5/05 to 4/06, and 95 from 5/06 to 10/06. Each case was placed into one of four categories: nondiagnostic (9%), benign (77.5%), atypical/suspicious (5.5%), or malignant (8%). Surgical tissue was available for 37% of nondiagnostic cases, 22% of benign cases, 80% of atypical/suspicious cases, and 72% of malignant cases. The overall sensitivity and specificity for FNA was 83 and 92% respectively. The overall positive and negative predictive values were 83 and 92% respectively. There were no false-positive cases, indicating a positive predictive value of 100% for a diagnosis of malignancy. For cases with surgical follow-up, the false-negative rate was 5.4%. Although there is a national trend away from FNAs of breast lesions, this has not been the experience at our institution. Although FNA may not be ideal in the initial evaluation of suspicious lesions, we argue that FNA for clinically benign palpable lesions and recurrent carcinomas has significant value. Diagn. Cytopathol. 2008. © 2008 Wiley-Liss, Inc. [source]
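
    For reference, the sketch below shows how accuracy metrics of this kind follow from a 2×2 cross-tabulation of cytologic result against histologic follow-up; the counts used are placeholders chosen for illustration, not the study's data.

```python
# Minimal sketch of diagnostic accuracy metrics from a 2x2 table of
# FNA result versus histologic follow-up. Counts are placeholders only.
def diagnostic_accuracy(tp, fp, fn, tn):
    sensitivity = tp / (tp + fn)   # malignant cases correctly called positive
    specificity = tn / (tn + fp)   # benign cases correctly called negative
    ppv = tp / (tp + fp)           # positive predictive value
    npv = tn / (tn + fn)           # negative predictive value
    return sensitivity, specificity, ppv, npv

sens, spec, ppv, npv = diagnostic_accuracy(tp=50, fp=5, fn=10, tn=60)
print(f"sensitivity {sens:.0%}, specificity {spec:.0%}, PPV {ppv:.0%}, NPV {npv:.0%}")
```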


    An "Omics" view of drug development

    DRUG DEVELOPMENT RESEARCH, Issue 2 2004
    Russ B. Altman
    Abstract The pharmaceutical industry cannot be blamed for having a love/hate relationship with the fields of pharmacogenetics and pharmacogenomics. At the same time that pharmacogenetics and pharmacogenomics promise to save pipeline drugs by identifying subsets of the population for which they work best, they also threaten to increase the complexity of new drug applications, fragment markets, and create uncertainty for prescribers who simply do not understand or have time to master "personalized medicine." Most importantly, the logical case for genetics-specific drug selection and dosing is much more mature than the practical list of drugs for which outcomes are demonstrably improved. Understandably, pharmaceutical developers and regulators have been careful in creating strategies for using genetics in drug development, and only recently has the FDA begun to establish preliminary rules for pharmacogenetic testing. A growing public academic effort in pharmacogenetics and pharmacogenomics is helping flesh out the basic science underpinnings of the field, and this should combine with extensive efforts of industry to create a solid foundation for future use of genetics in drug development. Two grand challenges to accelerate our capabilities include the characterization of all human genes involved in the basic pharmacokinetics of drugs, and the detailed study of the genes and pathways associated with G-protein-coupled receptors and how they are affected by genetic variation. Drug Dev. Res. 62:81,85, 2004. © 2004 Wiley-Liss, Inc. [source]


    Cationic and anionic lipid-based nanoparticles in CEC for protein separation

    ELECTROPHORESIS, Issue 11 2010
    Christian Nilsson
    Abstract The development of new separation techniques is an important task in protein science. Herein, we describe how anionic and cationic lipid-based liquid crystalline nanoparticles can be used for protein separation. The potential of the suggested separation methods is demonstrated on green fluorescent protein (GFP) samples for future use on more complex samples. Three different CEC-LIF approaches for protein separation are described. (i) GFP and GFP N212Y, which are equally charged, were separated with high resolution by using anionic nanoparticles suspended in the electrolyte and adsorbed to the capillary wall. (ii) High efficiency (800,000 plates/m) and peak capacity were demonstrated separating GFP samples from Escherichia coli with cationic nanoparticles suspended in the electrolyte and adsorbed to the capillary wall. (iii) Three single amino-acid-substituted GFP variants were separated with high resolution using an approach based on a physically attached double-layer coating of cationic and anionic nanoparticles combined with anionic lipid nanoparticles suspended in the electrolyte. The soft and porous lipid-based nanoparticles were synthesized by a one-step procedure based on the self-assembly of lipids, and were biocompatible with a large surface-to-volume ratio. The methodology is still under development and the optimization of the nanoparticle chemistry and separation conditions can further improve the separation system. In contrast to conventional LC, a new interaction phase is introduced for every analysis, which minimizes carry-over and time-consuming column regeneration. [source]


    Screening for Novel Industrial Biocatalysts

    ENGINEERING IN LIFE SCIENCES (ELECTRONIC), Issue 6 2004
    P. Lorenz
    Abstract Biocatalysis, the use of microbial cells or isolated enzymes in the production of fine chemicals, is steadily moving towards becoming accepted as an indispensable tool in the inventory of modern synthetic chemistry [1]. It is estimated that in 10% of the cases biocatalysis will provide an overall superior synthetic strategy over traditional organic chemistry [2]. This remarkable development in a field coined "white biotechnology" is due to the growing recognition in the industry of the capabilities and performance of enzymes as exemplified in a growing number of implemented processes [3, 4], examples running at a scale of >1000 tons product/year. Breakthroughs in the key biotechnological areas of a) genetic resource access (explicitly the explorability of non-cultivated microorganisms), b) enzyme screening and discovery and c) in vitro evolution of proteins to find and optimize enzymes to become near-ideally suited biocatalysts have been instrumental in pushing industrial biocatalysis to where it stands today [5, 6]. With these technological options it seems that future use of biocatalysis is limited only by the availability of the biocatalyst [3], the screening for which is subject of this review. [source]


    Integrated environmental product innovation and impacts on company competitiveness: a case study of the automotive industry in the region of Munich

    ENVIRONMENTAL POLICY AND GOVERNANCE, Issue 1 2008
    Ursula Triebswetter
    Abstract This paper examines the impact of integrated environmental product innovations on company competitiveness. In a regional case study about automotive, rail and commercial vehicle firms in Southern Germany it is found that integrated environmental product innovation is driven by factors such as regulatory pressure, the search for competitive advantages and technological lead as well as customer pressure. Regulatory pressure includes sector policies, such as emission standards, and wider non-sector energy conservation issues, at both national and international levels. For instance, EU directives on future use of renewable energy as well as national goals for reaching the Kyoto protocol play an important role in driving innovation. The study finds that integrated environmental product innovations driven by regulatory pressure produce similar competitiveness impacts as innovations undertaken voluntarily by companies. Such results yield supporting evidence for the so-called 'Porter hypothesis', which assumes that environmental legislation stimulates innovation and leads to 'win–win' situations – the simultaneous reduction of pollution and increase in productivity. Copyright © 2008 John Wiley & Sons, Ltd and ERP Environment. [source]


    RESEARCH FOCUS ON COMPULSIVE BEHAVIOUR IN ANIMALS: An animal model of compulsive food-taking behaviour

    ADDICTION BIOLOGY, Issue 4 2009
    Andrea Heyne
    ABSTRACT The increase in the incidence of obesity and eating disorders has promoted research aimed at understanding the aetiology of abnormal eating behaviours. Apart from metabolic factors, obesity is caused by overeating. Clinical reports have led to the suggestion that some individuals may develop addictive-like behaviours when consuming palatable foods, and compulsive eating plays a similar dominant role in obesity as compulsive drug taking does in drug addiction. The progress made in the development of treatment strategies for obesity is limited, in part, because the physiological and neurological causes and consequences of compulsive eating behaviour are not clearly understood and cannot readily be studied in human subjects. We have developed experimental approaches that reflect the functioning of the components of eating control, including compulsive food taking in rats. Rats that are given free choice between standard chow and a palatable, chocolate-containing 'Cafeteria Diet' (CD) develop distinct signs of compulsive food taking that appear at an early stage. These include the inability to adapt intake behaviour in periods of limited or bitter-tasting CD access, continued food intake during resting phases and changes in fine structure of feeding (duration, distribution and recurrence of feeding bouts). The model will help examine the neurobiological underpinnings of compulsive food seeking and food taking and provides a possibility to study the effects of novel anti-obesity compounds on compulsive eating and other components of food-taking behaviour in detail. For future use of genetic models, the possibility of a transfer to a mouse was discussed. [source]


    A reversibly immortalized human hepatocyte cell line as a source of hepatocyte-based biological support

    ADDICTION BIOLOGY, Issue 4 2001
    Naoya Kobayashi
    The application of hepatocyte transplantation (HTX) is increasingly envisioned for temporary metabolic support during acute liver failure and provision of specific liver functions in inherited liver-based metabolic diseases. Compared with whole liver transplantation, HTX is a technically simple procedure and hepatocytes can be cryopreserved for future use. A major limitation of this form of therapy in humans is the worldwide shortage of human livers for isolating an adequate number of transplantable human hepatocytes when needed. Furthermore, the numbers of donor livers available for hepatocyte isolation is limited by competition for their use in whole organ transplantation. Considering the cost of hepatocyte isolation and the need for immediate preparation of consistent and functional cells, it is unlikely that human hepatocytes can be obtained on such a scale to treat a large number of patients with failing liver functions. The utilization of xenogenic hepatocytes will result in additional concerns regarding transmission of infectious pathogens and immunological and physiological incompatibilities between animals and humans. An attractive alternative to primary human hepatocytes is the use of tightly regulated human hepatocyte cell lines. Such cell lines can provide the advantages of unlimited availability, sterility and uniformity. We describe here methods for creating transplantable human hepatocyte cell lines using currently available cell cultures and gene transfer technology. [source]


    INTEGRATED LANDSCAPE ANALYSES OF CHANGE OF MIOMBO WOODLAND IN TANZANIA AND ITS IMPLICATION FOR ENVIRONMENT AND HUMAN LIVELIHOOD

    GEOGRAFISKA ANNALER SERIES A: PHYSICAL GEOGRAPHY, Issue 1 2009
    LENNART STRÖMQUIST
    ABSTRACT. Landscapes bear witness to past and present natural and societal processes influencing the environment and human livelihoods. By analysing landscape change at different spatial scales over time the effects on the environment and human livelihoods of various external and internal driving forces of change can be studied. This paper presents such an analysis of miombo woodland surrounding the Mkata plains in central Tanzania. The rich natural landscape diversity of the study area in combination with its historical and political development makes it an ideal observation ground for this kind of study. The paper focuses on long-term physical and biological changes, mainly based on satellite information but also on field studies and a review of documents and literature. The miombo woodlands are highly dynamic semi-arid ecosystems found on a number of nutrient-poor soil groups. Most of the woodlands are related to an old, low-relief geomorphology of erosion surfaces with relatively deep and leached soils, or to a lesser extent also on escarpments and steep Inselberg slopes with poor soils. Each period in the past has cast its footprints on the landscape development and its potential for a sustainable future use. On a regional level there has been a continual decrease in forest area over time. Expansion of agriculture around planned villages, implemented during the 1970s, in some cases equals the loss of forest area (Mikumi-Ulaya), whilst in other areas (Kitulangalo), the pre-independence loss of woodland was small; the agricultural area was almost the same during the period 1975–1999, despite the fact that forests have been lost at an almost constant rate over the same period. Illegal logging and charcoal production are likely causes because of the proximity to the main highway running through the area. Contrasting to the general regional pattern are the conditions in a traditional village (Ihombwe), with low immigration of people and a maintained knowledge of the resource potential of the forest with regards to edible plants and animals. In this area the local community has control of the forest resources in a Forest Reserve, within which the woody vegetation has increased in spite of an expansion of agriculture on other types of village land. The mapping procedure has shown that factors such as access to transport and lack of local control have caused greater deforestation of certain areas than during the colonial period. Planned villages have furthermore continued to expand over forest areas well after their implementation, rapidly increasing the landscape fragmentation. One possible way to maintain landscape and biodiversity values is by the sustainable use of traditional resources, based on local knowledge of their management as illustrated by the little change observed in the traditionally used area. [source]


    A Wet/Wet Differential Pressure Sensor for Measuring Vertical Hydraulic Gradient

    GROUND WATER, Issue 1 2010
    Brad G. Fritz
    Vertical hydraulic gradient is commonly measured in rivers, lakes, and streams for studies of groundwater–surface water interaction. While a number of methods with subtle differences have been applied, these methods can generally be separated into two categories: measuring surface water elevation and pressure in the subsurface separately or making direct measurements of the head difference with a manometer. Making separate head measurements allows for the use of electronic pressure sensors, providing large datasets that are particularly useful when the vertical hydraulic gradient fluctuates over time. On the other hand, using a manometer-based method provides an easier and more rapid measurement with a simpler computation to calculate the vertical hydraulic gradient. In this study, we evaluated a wet/wet differential pressure sensor for use in measuring vertical hydraulic gradient. This approach combines the advantage of high-temporal frequency measurements obtained with instrumented piezometers with the simplicity and reduced potential for human-induced error obtained with a manometer board method. Our results showed that the wet/wet differential pressure sensor provided results comparable to more traditional methods, making it an acceptable method for future use. [source]
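
    As a brief illustration of the quantity being measured, the sketch below converts a wet/wet differential pressure reading into a vertical hydraulic gradient (head difference divided by the vertical separation); the sensor reading, screen depth and fluid properties are illustrative assumptions, not the study's configuration.

```python
# Minimal sketch: convert a differential pressure reading (Pa) into a vertical
# hydraulic gradient, VHG = dh / dL. All numbers are illustrative assumptions.
RHO_WATER = 1000.0   # density of water, kg/m^3
G = 9.81             # gravitational acceleration, m/s^2

def vertical_hydraulic_gradient(dp_pascals, screen_depth_m):
    """Head difference between surface water and the piezometer screen,
    divided by the screen depth below the streambed."""
    head_difference_m = dp_pascals / (RHO_WATER * G)   # dh = dP / (rho * g)
    return head_difference_m / screen_depth_m

print(vertical_hydraulic_gradient(dp_pascals=49.0, screen_depth_m=0.5))  # ≈ 0.01
```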


    Tumor microenvironment in head and neck squamous cell carcinomas: Predictive value and clinical relevance of hypoxic markers.

    HEAD & NECK: JOURNAL FOR THE SCIENCES & SPECIALTIES OF THE HEAD AND NECK, Issue 6 2007
    A review
    Abstract Background. Hypoxia and tumor cell proliferation are important factors determining the treatment response of squamous cell carcinomas of the head and neck. Successful approaches have been developed to counteract these resistance mechanisms although usually at the cost of increased short- and long-term side effects. To provide the best attainable quality of life for individual patients and the head and neck cancer patient population as a whole, it is of increasing importance that tools be developed that allow a better selection of patients for these intensified treatments. Methods. A literature review was performed with special focus on the predictive value and clinical relevance of endogenous hypoxia-related markers. Results. New methods for qualitative and quantitative assessment of functional microenvironmental parameters such as hypoxia, proliferation, and vasculature have identified several candidate markers for future use in predictive assays. Hypoxia-related markers include hypoxia inducible factor (HIF)-1α, carbonic anhydrase IX, glucose transporters, erythropoietin receptor, osteopontin, and others. Although several of these markers and combinations of markers are associated with treatment outcome, their clinical value as predictive factors remains to be established. Conclusions: A number of markers and marker profiles have emerged that may have potential as a predictive assay. Validation of these candidate assays requires testing in prospective trials comparing standard treatment against experimental treatments targeting the related microregional constituent. © 2007 Wiley Periodicals, Inc. Head Neck, 2007 [source]


    Migraine Pain and Nociceptor Activation – Where Do We Stand?

    HEADACHE, Issue 5 2010
    Dan Levy PhD
    The mechanisms underlying the genesis of migraine pain remain enigmatic largely because of the absence of any identifiable cephalic pathology. Based on numerous indirect lines of evidence, 2 nonmutually exclusive hypotheses have been put forward. The first theorizes that migraine pain originates in the periphery and requires the activation of primary afferent nociceptive neurons that innervate cephalic tissues, primarily the cranial meninges and their related blood vessels. The second maintains that nociceptor activation may not be required and that the headache is promoted primarily as a result of abnormal processing of sensory signals in the central nervous system. This paper reviews the evidence leading to these disparate theories while siding with the primacy of nociceptor activation in the genesis of migraine headache. The paper further examines the potential future use of established human models of migraine for addressing the origin of migraine headache. [source]


    Visualization of Helicobacter Species Within the Murine Cecal Mucosa Using Specific Fluorescence In Situ Hybridization

    HELICOBACTER, Issue 2 2005
    Vivian Chan
    ABSTRACT Background. Members of the genus Helicobacter have been associated with colitis development in a number of immunodeficient animal models. While it is known that these organisms can initiate colitis development, the location and spatial distribution of these bacteria within the intestinal tract is currently unknown. In this study, we developed and optimized fluorescence in situ hybridization (FISH) probes specifically for Helicobacter species. Materials and Methods. Based on 16S rRNA gene alignments, two probes specific for the entire family Helicobacteraceae and two probes specific for Helicobacter ganmani and Helicobacter hepaticus were designed. Evaluation of these probes was determined using ATCC reference strains and cecum samples from ten IL-10 knockout mice. The presence of Helicobacter species was determined using FISH and verified using PCR-DGGE and microscopic examination of silver stained sections. Results. Analysis of the ATCC reference strains revealed that the probes HEL274/HEL717 were specific for the family Helicobacteraceae, while HEP642 was specific for H. hepaticus and GAN1237 for H. ganmani. Using these probes, a pattern of spatial localization of the two different Helicobacter species was observed in the cecum tissues of IL-10 knockout mice. This consistently showed that H. ganmani was localized to the lower regions and H. hepaticus to the mid-upper regions of the crypts. Conclusion. We have developed FISH probes specific for the family Helicobacteraceae as well as two individual Helicobacter species. This study will allow the future use of FISH to better understand host-pathogen interactions in vitro. [source]


    An international survey of the use and effectiveness of modern manufacturing practices

    HUMAN FACTORS AND ERGONOMICS IN MANUFACTURING & SERVICE INDUSTRIES, Issue 2 2002
    C. W. Clegg
    We describe a survey of the use and effectiveness of 12 manufacturing practices. The survey was administered to a random, stratified sample of companies with 150 or more employees in the United Kingdom, Australia, Japan, and Switzerland, yielding a total sample of 898 companies. We report findings on the extent of use of the practices, when they were introduced, their predicted future use, their effectiveness, and the correlates of their use and effectiveness. The data are examined for differences by country of location and country of ownership, as well as by industrial sector. © 2002 Wiley Periodicals, Inc. [source]


    Harvesting haemopoietic stem cells for future use: rainy days, real or imagined?

    INTERNAL MEDICINE JOURNAL, Issue 4 2008
    J. Szer
    No abstract is available for this article. [source]


    The second generation of human security: lessons from the UN and EU experience

    INTERNATIONAL AFFAIRS, Issue 1 2010
    MARY MARTIN
    The concept of human security, while much contested in both academic and policy debates, and highly fragmented across different meanings and forms of implementation, offers a potential locus around which global security discourse might converge, particularly in light of current shifts in US security thinking. However, key pioneers of human security, such as the United Nations and Canada, appear to be losing their enthusiasm for the concept, just at the moment when others, such as the European Union, are advancing a human security agenda. This article examines the divergence of human security narratives between the UN and the EU. It argues that the UN's use of the concept ran aground owing to a triple problematic of lack of clarity, confusion between previously distinct policy streams on human rights and human development and conceptual overstretch. After assessing the EU experience with the concept to date, the article argues that future use of human security will require greater focus on how it deepens ideas of individual security, rather than treating it as an agenda for broadening security. As well as a need to project clarity on the conceptual definition of human security, there is also a need to associate human security with greater clarity of intent. If successful, this would contribute to establishing second generation human security as a new policy paradigm. [source]


    EVIDENCE SYNTHESIS: Systematic review of current executive function measures in adults with and without cognitive impairments

    INTERNATIONAL JOURNAL OF EVIDENCE BASED HEALTHCARE, Issue 3 2010
    Sabrina Pickens PhDc MSN ANP-BC GNP-BC
    Abstract Background: Executive function pertains to higher cognitive processes historically linked to frontal lobes. Several measures are available to screen for executive function; however, no gold standard exists. The difficulty in assessing executive function is the existence of its many subsets. Objectives: To evaluate the psychometric properties of executive function measures and determine the most effective measure(s) through a systematic review of the literature. Search strategy: The search strategy utilised a comprehensive literature review of articles written in the English language published from January 2003 to September 2009. The following electronic databases were searched: SCOPUS, PUBMED, Medline Ovid, PsychArticles and CINAHL Plus. Initial key words used were 'executive function', 'measures', 'reliability' and 'validity' followed by the addition of 'traumatic brain injury'. The initial search elicited 226 articles, of which 28 were retrieved. After further exclusion 19 were included in the review. Results: Eight measures underwent factor analysis and 18 underwent various forms of reliability and/or validity testing. Factor analysis showed different aspects of executive functions. According to preset evaluation criteria, only the Test of Practical Judgment performed all of the recommended reliability and validity testing. Reviewer's conclusion: Of the recently developed measures, several show promise for future use yet further validity and reliability testing is warranted. Future tool development should measure all subsets of executive function rather than only a few and include the recommended components of reliability and validity testing. [source]


    REVIEW: Questionnaires in ecology: a review of past use and recommendations for best practice

    JOURNAL OF APPLIED ECOLOGY, Issue 3 2005
    PIRAN C. L. WHITE
    Summary 1. Questionnaires, or social surveys, are used increasingly as a means of collecting data in ecology. We present a critical review of their use and give recommendations for good practice. 2. We searched for papers in which questionnaires were used in 57 ecological academic journals, published over the period 1991–2003 inclusive. This provided a total sample size of 168 questionnaires from 127 papers published in 22 academic journals. 3. Most questionnaires were carried out in North America and western Europe, and addressed species-level issues, principally focusing on mammals. The majority were concerned with impacts of species and/or their conservation, and just under half with human–wildlife interactions. 4. Postal survey was the method used most frequently to carry out the questionnaires, followed by in-person interviews. Some questionnaires were conducted by telephone, and none was web-based. 5. Most questionnaires were concerned with obtaining factual information or perceptions of facts. Ground-truthing (independent verification of the facts) was carried out in less than 10% of questionnaires. 6. The mean (± SE) sample size (number of respondents) per questionnaire was 1422 ± 261 and the average (± SE) response rate was 63 ± 3%. These figures varied widely depending on the methods used to conduct the questionnaire. 7. The analysis of data was mostly descriptive. Simple univariate methods were the most frequently used statistical tools, and data from a third of questionnaires were not subjected to any analysis beyond simple descriptions of the results. 8. Synthesis and applications. We provide recommendations for best practice in the future use of questionnaires in ecology, as follows: (i) the definition of the target population, any hypotheses to be tested and procedures for the selection of participants should be clearly documented; (ii) questionnaires should be piloted prior to their use; (iii) the sample size should be sufficient for the statistical analysis; (iv) the rationale for the choice of survey method should be clearly stated; (v) the number of non-respondents should be minimized; (vi) the question and answer format should be kept as simple as possible; (vii) the structure of the questionnaire and the data emerging from it should be unambiguously shown in any publication; (viii) bias arising from non-response should be quantified; (ix) the accuracy of data should be assessed by ground-truthing where relevant; (x) the analysis of potentially interrelated data should be done by means of modelling. Researchers should also consider whether alternative, interpretative methods, such as in-depth interviews or participatory approaches, may be more appropriate, for example where the focus is on elucidating motivations or perceptions rather than testing factual hypotheses. [source]


    Reproductive Autonomy Rights and Genetic Disenhancement: Sidestepping the Argument from Backhanded Benefit

    JOURNAL OF APPLIED PHILOSOPHY, Issue 2 2004
    Martin Harvey
    abstract John Robertson has famously argued that the right to reproductive autonomy is exceedingly broad in scope. That is, as long as a particular reproductive preference such as having a deaf child is "determinative" of the decision to reproduce then such preferences fall under the protective rubric of reproductive autonomy rights. Importantly, the deafness in question does not constitute a harm to the child thereby wrought since unless the child could be born deaf he or she would otherwise never have existed, his or her prospective parents would simply have chosen to abort. As such, for this child, being born deaf counts as a benefit, albeit of the "backhanded" variety, since the only other practical alternative is nonexistence. In what follows, I want to investigate this argument in detail. The target of my investigation will be the possible future use of gene therapy technology to "disenhance" one's offspring. I intend to show that the apparently unlimited right to reproductive autonomy, that is, the right to choose both the quantity and qualities of future offspring, entailed by the argument from backhanded benefit can in fact be "sidestepped" through considering what sorts of reproductive practices we as a society ought to allow. [source]


    Fishes as models in studies of sexual selection and parental care

    JOURNAL OF FISH BIOLOGY, Issue 2003
    T. Amundsen
    Fishes are by far the most diverse group of vertebrates. This fact is in no way, however, reflected in their use as model organisms for understanding sexual selection or parental care. Why is this so? Is it because fishes are actually poor models? The usefulness of fishes as models for sexual selection and parental care is discussed by emphasizing some problems inherent in fish studies, along with a number of reasons why fishes are indeed excellently suited. The pros and cons of fishes as models are discussed mainly by comparison with birds, the most popular model organisms in animal behaviour. Difficulties include a lack of background knowledge for many species, and the problems of marking and observing fishes in their natural environment. Positive attributes include the diversity of lifestyles among fishes, and the ease with which they can be studied experimentally in the laboratory. How useful fish models can be is briefly illustrated by the impressive and broadly relevant advances derived from studies of guppies Poecilia reticulata and three-spined sticklebacks Gasterosteus aculeatus. A selection of topics is highlighted where fish studies have either advanced, or could greatly enhance, the understanding of processes fundamental to animal reproductive dynamics. Such topics include sex role dynamics, the evolution of female ornamentation and mate choice copying. Finally, a number of potential pitfalls in the future use of fish as models for sexual selection and parental care are discussed. Researchers interested in these issues are recommended to make much more extensive use of fish models, but also to adopt a wider range of models among fishes. [source]