Extensive Use
Selected Abstracts

Aerobic biodegradation of the chloroethenes: pathways, enzymes, ecology, and evolution
FEMS MICROBIOLOGY REVIEWS, Issue 4 2010
Timothy E. Mattes
Abstract: Extensive use and inadequate disposal of chloroethenes have led to prevalent groundwater contamination worldwide. The occurrence of the lesser chlorinated ethenes [i.e. vinyl chloride (VC) and cis-1,2-dichloroethene (cDCE)] in groundwater is primarily a consequence of incomplete anaerobic reductive dechlorination of the more highly chlorinated ethenes (tetrachloroethene and trichloroethene). VC and cDCE are toxic, and VC is a known human carcinogen. Therefore, their presence in groundwater is undesirable. In situ cleanup of VC- and cDCE-contaminated groundwater via oxidation by aerobic microorganisms is an attractive and potentially cost-effective alternative to physical and chemical approaches. Of particular interest are aerobic bacteria that use VC or cDCE as growth substrates (known as the VC- and cDCE-assimilating bacteria). Bacteria that grow on VC are readily isolated from contaminated and uncontaminated environments, suggesting that they are widespread and influential in aerobic natural attenuation of VC. In contrast, only one cDCE-assimilating strain has been isolated, suggesting that their environmental occurrence is rare. In this review, we will summarize the current knowledge of the physiology, biodegradation pathways, genetics, ecology, and evolution of VC- and cDCE-assimilating bacteria. Techniques (e.g. PCR, proteomics, and compound-specific isotope analysis) that aim to determine the presence, numbers, and activity of these bacteria in the environment will also be discussed. [source]

Differential mRNA expression levels and gene sequences of carboxylesterase in both deltamethrin-resistant and susceptible strains of the cotton aphid, Aphis gossypii
INSECT SCIENCE, Issue 3 2008
Chuan-Wang Cao
Abstract: Extensive use of insecticides on cotton has prompted resistance development in the cotton aphid, Aphis gossypii (Glover), in China. A deltamethrin-selected population of cotton aphids from Xinjiang Uygur Autonomous Region, China, with 228.59-fold higher resistance to deltamethrin was used to examine how carboxylesterase conferred resistance to this pyrethroid insecticide. The carboxylesterase activity in the deltamethrin-resistant strain was 3.67-, 2.02- and 1.16-fold that of the susceptible strain when using α-naphthyl acetate (α-NA), β-naphthyl acetate (β-NA) and β-naphthyl butyrate (β-NB) as substrates, respectively. Carboxylesterase cDNA was cloned and sequenced from both deltamethrin-resistant and susceptible strains. The cDNA contained a 1581 bp open reading frame (ORF) encoding a 526 amino acid protein. Only one amino acid substitution (Val87-Ala) was observed between the deltamethrin-resistant and susceptible strains, but analysis of the catalytic triad and signature motifs indicated that this substitution is not linked to resistance. Real-time polymerase chain reaction analysis indicated that the resistant strain had a 6.61-fold higher level of carboxylesterase mRNA than the susceptible strain. The results revealed that up-regulation of the carboxylesterase gene, rather than modified gene structure, may be responsible for the development of resistance in cotton aphids to deltamethrin.
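To make the real-time PCR result above concrete, the sketch below shows how a relative expression value such as the reported 6.61-fold difference is commonly derived with the 2^-ΔΔCt method; the function name and Ct values are invented for illustration and are not taken from the study, which may have used a different quantification scheme.

```python
# Hypothetical illustration of the 2^-ddCt calculation behind a fold-change
# estimate; the Ct values below are invented, not the study's data.
def fold_change(ct_target_res, ct_ref_res, ct_target_sus, ct_ref_sus):
    """Relative carboxylesterase expression, resistant vs. susceptible strain."""
    d_ct_res = ct_target_res - ct_ref_res   # normalize target to reference gene
    d_ct_sus = ct_target_sus - ct_ref_sus
    dd_ct = d_ct_res - d_ct_sus             # difference between strains
    return 2.0 ** (-dd_ct)

print(round(fold_change(21.3, 18.0, 24.0, 18.0), 2))  # ~6.5, the order of the reported 6.61-fold
```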
[source]

Authority and leadership: the evolution of nursing management in 19th century teaching hospitals
JOURNAL OF NURSING MANAGEMENT, Issue 1 2008
Carol Helmstadter, BA (Hons), RN (Retd)
Aim: This study shows why some 19th century nursing managers were successful and some were not. Background: With the exception of Florence Nightingale, almost nothing has been written about 19th century nursing managers. Method: Classical historical method is used. Extensive use is made of secondary sources. Primary sources are found in the archives of the 12 London teaching hospitals, the Radcliffe Infirmary, the Convents of St John the Divine and the All Saints Sisters, and 16 000 Nightingale documents in the Collected Works of Florence Nightingale. Results: Success in delivering a highly competent nursing service depended on the matron's leadership and legitimate authority, but she also had to have the support of her hospital board to gain access to allocation of scarce resources. Implications for nursing management: While the 19th century hospital environment was very different, how nurses directed under different circumstances clarifies our knowledge of successful nursing management in 2007. [source]

Pharmacokinetic aspects of biotechnology products
JOURNAL OF PHARMACEUTICAL SCIENCES, Issue 9 2004
Lisa Tang
Abstract: In recent years, biotechnologically derived peptide and protein-based drugs have developed into mainstream therapeutic agents. Peptide and protein drugs now constitute a substantial portion of the compounds under preclinical and clinical development in the global pharmaceutical industry. Pharmacokinetic and exposure/response evaluations for peptide and protein therapeutics are frequently complicated by their similarity to endogenous peptides and proteins as well as protein nutrients. The first challenge frequently comes from a lack of sophistication in various analytical techniques for the quantification of peptide and protein drugs in biological matrices. However, advancements in bioassays and immunoassays, along with a newer generation of mass spectrometry-based techniques, can often provide capabilities for both efficient and reliable detection. Selection of the most appropriate route of administration for biotech drugs requires comprehensive knowledge of their absorption characteristics beyond physicochemical properties, including chemical and metabolic stability at the absorption site, immunoreactivity, passage through biomembranes, and active uptake and exsorption processes. Various distribution properties dictate whether peptide and protein therapeutics can reach optimum target site exposure to exert the intended pharmacological response. This poses a potential problem, especially for large protein drugs, with their typically limited distribution space. Binding phenomena and receptor-mediated cellular uptake may further complicate this issue. Elimination processes, a critical determinant for the drug's systemic exposure, may follow a combination of numerous pathways, including renal and hepatic metabolism routes as well as generalized proteolysis and receptor-mediated endocytosis. Pharmacokinetic/pharmacodynamic (PK/PD) correlations for peptide and protein-based drugs are frequently convoluted by their close interaction with endogenous substances and physiologic regulatory feedback mechanisms.
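As a reminder of the basic quantities behind the exposure arguments above (distribution volume, clearance, elimination), here is a minimal one-compartment sketch; the dose and parameter values are hypothetical placeholders, not values from the article.

```python
# Minimal one-compartment IV-bolus model: C(t) = (Dose/V) * exp(-(CL/V) * t).
# All numbers are hypothetical, chosen only to illustrate the shape of the curve.
import numpy as np

def concentration(t_h, dose_mg=100.0, v_L=5.0, cl_L_per_h=1.0):
    k_el = cl_L_per_h / v_L                       # first-order elimination rate constant (1/h)
    return (dose_mg / v_L) * np.exp(-k_el * t_h)  # plasma concentration in mg/L

t = np.linspace(0.0, 24.0, 7)                     # sample the first 24 h
print(np.round(concentration(t), 2))
```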
Extensive use of pharmacokinetic and exposure/response concepts in all phases of drug development has in the past been identified as a crucial factor for the success of a scientifically driven, evidence-based, and thus accelerated drug development process. Thus, PK/PD concepts are likely to continue and expand their role as a fundamental factor in the successful development of biotechnologically derived drug products in the future. © 2004 Wiley-Liss, Inc. and the American Pharmacists Association. J Pharm Sci 93:2184–2204, 2004 [source]

Combat Stress Casualties in Iraq. Part 1: Behavioral Health Consultation at an Expeditionary Medical Group
PERSPECTIVES IN PSYCHIATRIC CARE, Issue 3 2008
PURPOSE: We review the role of military mental health professionals in consulting with inpatient medical patients and staff at a combat hospital and aeromedical evacuation staging facility in Iraq. CONCLUSIONS: Behavioral health consultation with medical and surgical patients during hospitalization and prior to aeromedical evacuation can help identify patients with combat stress exposure who may require future mental health follow-up. PRACTICE IMPLICATIONS: Extensive use of civilian mental health practitioners, including nurse psychotherapists and psychiatric nurse practitioners, will be needed to provide psychiatric care for the large number of U.S. veterans who return from deployment with combat stress related disorders. [source]

Diclofenac and acute myocardial infarction in patients with no major risk factors
BRITISH JOURNAL OF CLINICAL PHARMACOLOGY, Issue 5 2007
Susan S. Jick
What is already known about this subject:
- We recently published the results of a study on the risk of acute myocardial infarction (AMI) in users of five nonsteroidal anti-inflammatory drugs during the years 2001 to 2005.
- The results demonstrated, as has been reported in randomized trials, that rofecoxib and celecoxib increase the risk of AMI when taken for at least 10 months.
- As expected, ibuprofen and naproxen did not materially increase the risk.
- However, long-term users of diclofenac were at an increased risk of AMI similar to that of users of rofecoxib and celecoxib.
What this study adds:
- Extensive use of diclofenac, similarly to rofecoxib and celecoxib, substantially increases the risk of AMI.
- There is little suggestion of such an effect in users of ibuprofen and naproxen.
Aims: To explore further a recent finding that long-term users of diclofenac are at increased risk of acute myocardial infarction (AMI) similar to users of rofecoxib and celecoxib.
Methods: Using the General Practice Research Database, we conducted three separate nested case-control studies of three nonsteroidal anti-inflammatory drugs (NSAIDs) whose use started after 1 January 1993: diclofenac, ibuprofen and naproxen. Cases of AMI were identified between 1 January 1993 and 31 December 2000. Relative risk (RR) estimates for AMI in patients with no major clinical risk factors were determined for each NSAID according to the number of prescriptions received, compared with one prescription. Results were adjusted for variables possibly related to risk of AMI.
Results: There was no material elevation of AMI risk according to the number of prescriptions for ibuprofen [RRs and 95% confidence intervals (CIs) 1.0 (0.6, 1.6) and 1.7 (0.9, 3.1) for use of 10–19 and 20+ prescriptions, respectively, compared with one prescription] or naproxen [RRs 1.0 (0.5, 2.2) and 2.0 (0.9, 4.5) for use of 10–19 and 20+ prescriptions, respectively].
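For orientation only, the snippet below shows how an unadjusted odds-ratio estimate of relative risk and its 95% confidence interval can be obtained from a 2x2 case-control table (Woolf's method); the counts are invented, and the study's published estimates were additionally adjusted for covariates.

```python
# Unadjusted odds ratio and Woolf 95% CI from invented 2x2 counts;
# not the study's data and not its adjusted estimates.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a, b = exposed/unexposed cases; c, d = exposed/unexposed controls."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR), Woolf's method
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

print(tuple(round(x, 2) for x in odds_ratio_ci(40, 400, 20, 380)))  # (1.9, 1.09, 3.31)
```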
However, a substantially increased risk, similar to that obtained in our prior study, was found in patients who received ≥10 prescriptions for diclofenac [RRs 1.9 (1.3, 2.7) and 2.0 (1.3, 3.0) for use of 10–19 and 20+ prescriptions, respectively].
Conclusions: Extensive use of diclofenac substantially increases the risk of AMI. There is little suggestion of such an effect in users of ibuprofen and naproxen. [source]

Interactive Web-based package for computer-aided learning of structural behavior
COMPUTER APPLICATIONS IN ENGINEERING EDUCATION, Issue 3 2002
X. F. Yuan
Abstract: This paper presents an innovative Web-based package named CALSB for computer-aided learning of structural behavior. The package was designed to be widely accessible through the Internet, user-friendly by the automation of many input functions and the extensive use of cursor movements, and dynamically interactive by linking all input and output data to a single graphical display on the screen. The package includes an analysis engine based on the matrix stiffness method, so the response of any two-dimensional skeletal structure can be predicted and graphically displayed. The package thus provides a virtual laboratory environment in which the user can "build and test" two-dimensional skeletal structures of unlimited choices to enhance his understanding of structural behavior. In addition, the package includes two other innovative features, structural games and paradoxes. The structural games in the package represent perhaps the first attempt at intentionally combining the learning of structural behavior with joy and excitement, while the structural paradoxes provide a stimulating environment conducive to the development of the creative problem-solving skills of the user. © 2002 Wiley Periodicals, Inc. Comput Appl Eng Educ 10: 121–136, 2002; Published online in Wiley InterScience (www.interscience.wiley.com); DOI 10.1002/cae.10020 [source]

The Readability of Path-Preserving Clusterings of Graphs
COMPUTER GRAPHICS FORUM, Issue 3 2010
Daniel Archambault
Abstract: Graph visualization systems often exploit opaque metanodes to reduce visual clutter and improve the readability of large graphs. This filtering can be done in a path-preserving way based on attribute values associated with the nodes of the graph. Despite extensive use of these representations, as far as we know, no formal experimentation exists to evaluate whether they improve the readability of graphs. In this paper, we present the results of a user study that formally evaluates how such representations affect the readability of graphs. We also explore the effect of graph size and connectivity in terms of this primary research question. Overall, for our tasks, we did not find a significant difference when this clustering is used. However, if the graph is highly connected, these clusterings can improve performance. Also, if the graph is large enough and can be simplified into a few metanodes, benefits in performance on global tasks are realized. Under these same conditions, however, performance on local attribute tasks may be reduced. [source]

A standards-based Grid resource brokering service supporting advance reservations, coallocation, and cross-Grid interoperability
CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 18 2009
Erik Elmroth
Abstract: The problem of Grid-middleware interoperability is addressed by the design and analysis of a feature-rich, standards-based framework for all-to-all cross-middleware job submission.
The architecture is designed with focus on generality and flexibility and builds on extensive use, internally and externally, of (proposed) Web and Grid services standards such as WSRF, JSDL, GLUE, and WS-Agreement. The external use provides the foundation for easy integration into specific middlewares, which is performed by the design of a small set of plugins for each middleware. Currently, plugins are provided for integration into Globus Toolkit 4 and NorduGrid/ARC. The internal use of standard formats facilitates customization of the job submission service by replacement of custom components for performing specific well-defined tasks. Most importantly, this enables the easy replacement of resource selection algorithms by algorithms that address the specific needs of a particular Grid environment and job submission scenario. By default, the service implements a decentralized brokering policy, striving to optimize the performance for the individual user by minimizing the response time for each job submitted. The algorithms in our implementation perform resource selection based on performance predictions, and provide support for advance reservations as well as coallocation of multiple resources for coordinated use. The performance of the system is analyzed with focus on overall service throughput (up to over 250 jobs per min) and individual job submission response time (down to under 1 s). Copyright © 2009 John Wiley & Sons, Ltd. [source]

Simple verification technique for complex Java bytecode subroutines
CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 7 2004
Alessandro Coglio
Abstract: Java is normally compiled to bytecode, which is verified and then executed by the Java Virtual Machine. Bytecode produced via compilation must pass verification. The main cause of complexity for bytecode verification is subroutines, used by compilers to generate more compact code. The techniques to verify subroutines proposed in the literature reject certain programs produced by mundane compilers, are difficult to realize within an implementation of the Java Virtual Machine, or are relatively complicated. This paper presents a novel technique which is very simple to understand, implement and prove sound. It is also very powerful: the set of accepted programs has a simple characterization which most likely includes all the code produced by current compilers and which enables future compilers to make more extensive use of subroutines. Copyright © 2004 John Wiley & Sons, Ltd. [source]

Communications skills in dental education: a systematic research review
EUROPEAN JOURNAL OF DENTAL EDUCATION, Issue 2 2010
J. A. Carey
Abstract: Communication is an essential element of the relationship between patient and dentist. Dental schools are required to ensure that undergraduates are adequately trained in communication skills, yet little evidence exists to suggest what constitutes appropriate training and how competency can be assessed. This review aimed to explore the scope and quality of evidence relating to communication skills training for dental students. Eleven papers fitted the inclusion criteria. The review found extensive use amongst studies of didactic learning and clinical role-play using simulated patients. Reported assessment methods focus mainly on observer evaluation of student interactions at consultation. Patient involvement in training appears to be minimal.
This review recommends that several areas of methodology be addressed in future studies, that the scope of research be extended to include intra-operative communication, and that the role of real patients in the development of communication skills be active rather than passive. [source]

Stereoselective Construction of a Highly Functionalized Taxoid ABC-Ring System: the C2–C9 Oxa-Bridge Approach
EUROPEAN JOURNAL OF ORGANIC CHEMISTRY, Issue 19 2005
Sylvain Hamon
Abstract: The goal of this investigation is to assemble the 20-carbon unit 1 of the taxoid diterpene skeleton with a high level of stereocontrol by means of a three-reaction sequence developed in this laboratory. The strategy involves seven C–C bond-forming operations together with eighteen functional group transformations, circumventing the stereoselectivity issue altogether. Furthermore, there is no isomer formation and hence no need for chromatographic separation. A temporary oxa-bridge (C2/C9) was used as a problem-solving approach. The key step in the planned sequence was based on achieving the last C–C bonding between C11 and C12, following a successful C11 functionalization. X-ray analyses of 8b, 17, 18, 19, 20, and 21, together with extensive use of 800 MHz 1H (200 MHz 13C) NMR spectra, support the suggested structures. (© Wiley-VCH Verlag GmbH & Co. KGaA, 69451 Weinheim, Germany, 2005) [source]

Looking for Coherence within the European Community
EUROPEAN LAW JOURNAL, Issue 2 2005
Stefano Bertea
It focuses on a specific dimension of this relationship and shows how the appeals to coherence made by the European Court of Justice have shaped a particular branch of the European legal order, namely, the judicial review of Community acts. The analysis of the Court of Justice's case law in this field shows that in its extensive use of coherence the Court of Justice explored and brought into play different types of coherence and, while it failed to distinguish between them, it made use of sorts of coherence that thus far legal theorists have disregarded. The article concludes that a closer collaboration between legal theory and legal practice would be profitable for both legal theorists and Community law specialists. [source]

Computation of time delay margin for power system small-signal stability
EUROPEAN TRANSACTIONS ON ELECTRICAL POWER, Issue 7 2009
Saffet Ayasun
Abstract: With the extensive use of phasor measurement units (PMU) in wide-area measurement/monitoring systems (WAMS), time delays have become unavoidable in power systems. This paper presents a direct and exact method to compute the delay margin of power systems with single and commensurate time delays. The delay margin is the maximum amount of time delay that the system can tolerate before it becomes unstable for a given operating point. First, without using any approximation or substitution, the transcendental characteristic equation is converted into a polynomial without the transcendentality such that its real roots coincide with the imaginary roots of the characteristic equation exactly. The resulting polynomial also enables us to easily determine the delay dependency of the system stability and the sensitivities of crossing roots with respect to time delay. Then, an expression in terms of system parameters and the imaginary root of the characteristic equation is derived for computing the delay margin. The proposed method is applied to a single-machine-infinite bus (SMIB) power system with an exciter.
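To fix ideas, the generic single-delay relations behind such a computation can be written as follows; this is illustrative notation only, not the paper's exact formulation.

```latex
P(j\omega_c) + Q(j\omega_c)\,e^{-j\omega_c\tau} = 0
\;\Longrightarrow\;
e^{-j\omega_c\tau} = -\frac{P(j\omega_c)}{Q(j\omega_c)},\qquad
\left|P(j\omega_c)\right| = \left|Q(j\omega_c)\right|,\qquad
\tau^{*} = \frac{1}{\omega_c}\,\angle\!\left(-\frac{P(j\omega_c)}{Q(j\omega_c)}\right)
\in \Big[0,\tfrac{2\pi}{\omega_c}\Big)
```

Here ω_c is a crossing frequency obtained from the delay-free magnitude condition (the real roots of the auxiliary polynomial), and the smallest τ* over all crossings gives the delay margin.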
Delay margins are computed for a wide range of system parameters including generator mechanical power, damping and transient reactance, exciter gain, and transmission line reactance. The results indicate that the delay margin decreases as the mechanical power, exciter gain and line reactance increase, while it increases with increasing generator transient reactance. Additionally, the relationship between the delay margin and generator damping is found to be relatively complex. Finally, the theoretical delay margin results are validated using time-domain simulations in Matlab. Copyright © 2008 John Wiley & Sons, Ltd. [source]

The Dialoguer: An Interactive Bilingual Interface to a Network Operating System
EXPERT SYSTEMS, Issue 3 2001
Emad Al-Shawakfa
We have developed a bilingual interface to the Novell network operating system, called the Dialoguer. This system carries on a conversation with the user in Arabic or English or a combination of the two and attempts to help the user use the Novell network operating system. Learning to use an operating system is a major barrier in starting to use computers. There is no single standard for operating systems, which makes it difficult for novice users to learn a new operating system. With the proliferation of client-server environments, users will eventually end up using one network operating system or another. These problems motivated our choice of an area to work in and they have made it easy to find real users to test our system. This system is both an expert system and a natural language interface. The system embodies expert knowledge of the operating system commands and of a large variety of plans that the user may want to carry out. The system also contains a natural language understanding component and a response generation component. The Dialoguer makes extensive use of case frame tables in both components. Algorithms for handling a bilingual dialogue are one of the important contributions of this paper, along with the Arabic case frames. [source]

Early pyrotechnology in the Near East: Experimental lime-plaster production at the Pre-Pottery Neolithic B site of Kfar HaHoresh, Israel
GEOARCHAEOLOGY: AN INTERNATIONAL JOURNAL, Issue 6 2008
Y. Goren
A characteristic hallmark of the Pre-Pottery Neolithic B (PPNB) in the southern Levant was the extensive use of lime plaster for architectural and other purposes, yet no obvious kilns have been identified in archaeological contexts. Here we present details of an experimental pit-kiln modeling lime-plaster production based on observed burnt stone accumulations in pits at the PPNB site of Kfar HaHoresh in the lower Galilee. The experimental kiln was loaded in layers with ~500 kg of limestone (pebbles and stones) and ~1000 kg of fuel (branches and dung). Fired for 24 hours, and reaching a maximum of 870°C, the kiln yielded almost 250 kg of quicklime (calcium oxide, CaO). Micromorphological samples, general observations, and scaled plan view drawings made immediately following and nine years after ignition demonstrate that the original shape of the kiln and residual quicklime within and around it rapidly dissipated through bioturbation, trampling by animals, erosion, rain, and exposure to the elements. This could account for the seeming absence of kilns within sites, although they were probably located close to where lime plaster was applied, given the unstable nature and toxic effects of handling quicklime.
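A rough stoichiometric check of the reported yield (assuming, purely for illustration, that the 500 kg of stone were pure calcite):

```latex
\mathrm{CaCO_3}\;\xrightarrow{\;\Delta\;}\;\mathrm{CaO} + \mathrm{CO_2},\qquad
m_{\mathrm{CaO}}^{\mathrm{theor.}}\approx 500\,\mathrm{kg}\times\frac{56.1}{100.1}\approx 280\,\mathrm{kg},\qquad
\frac{250\,\mathrm{kg}}{280\,\mathrm{kg}}\approx 0.89
```

That is, the ~250 kg of quicklime obtained corresponds to roughly 90% of the theoretical maximum, consistent with some incompletely calcined or non-carbonate material in the charge.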
Calculations of the manpower and fuel involved indicate that PPNB lime-plaster production may have been less labor intensive and less detrimental to the environment than previously portrayed. © 2008 Wiley Periodicals, Inc. [source]

Skiing Less Often in a Warmer World: Attitudes of Tourists to Climate Change in an Australian Ski Resort
GEOGRAPHICAL RESEARCH, Issue 2 2010
Catherine Marina Pickering
Abstract: Climate change will affect tourism destinations that are dependent on natural resources, such as snow. Currently there is limited research into attitudes, intentions and actual visitation patterns of skiers in response to reduced snow cover. Therefore the awareness of, and attitudes towards, climate change of 351 ski tourists were assessed in the largest ski resort in Australia in 2007, repeating a survey conducted in 1996. Ninety percent of skiers in 2007 would ski less often in Australian resorts if the next five years had low natural snow, up from 75% of skiers surveyed in 1996: 69% would ski less often, 5% would give up and 16% would ski at the same levels but overseas. Nearly all skiers thought that climate change would affect the ski industry (87% compared with 78% in 1996), and that this would occur sooner than in the 1996 survey. Visitation in a poor snow year (2006: +0.85°C average annual temperature, 54% less natural snow) was 13.6% below the long-term average, indicating that poor natural snow resulted in decreased visitation despite extensive use of snow making. The implications of changes in climate conditions and tourist attitudes for Australian ski resorts are assessed, including for snow making and summer tourism. [source]

Fire regimes of China: inference from statistical comparison with the United States
GLOBAL ECOLOGY, Issue 5 2009
Meg A. Krawchuk
Aim: Substantial overlap in the climate characteristics of the United States and China results in similar land-cover types and weather conditions, especially in the eastern half of the two countries. These parallels suggest similarities in fire regimes as well, yet relatively little is known about the historical role of fire in Chinese ecosystems. Consequently, we aimed to infer fire regime characteristics for China based on our understanding of climate–fire relationships in the United States. Location: The conterminous United States and the People's Republic of China. Methods: We used generalized additive models to quantify the relationship between reference fire regime classes adopted by the LANDFIRE initiative in the United States and a global climate data set. With the models, we determined which climate variables best described the distribution of fire regimes in the United States, then used these models to predict the spatial distribution of fire regimes in China. The fitted models were validated quantitatively using receiver operating characteristic area under the curve (AUC). We validated the predicted fire regimes in China by comparison with palaeoecological fire data and satellite-derived estimates of current fire activity. Results: Quantitative validation using the AUC indicated good discrimination of the distribution of fire regimes by models for the United States. Overall, fire regimes with more frequent return intervals were more likely in the east than in the west. The resolution of available historical and prehistorical fire data for China, including sediment cores, allowed only coarse, qualitative validation, but provided supporting evidence that fire has long been a part of ecosystem function in eastern China.
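Purely as an illustration of the AUC validation step described above, the stand-in below fits a simple climate-based classifier (logistic regression instead of the study's generalized additive models) to synthetic data and scores it with the ROC area under the curve.

```python
# Stand-in for the AUC validation step: logistic regression on synthetic
# "climate" covariates instead of the study's GAMs and real data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))      # stand-ins for e.g. temperature, precipitation, seasonality
y = (X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.8, size=500) > 0).astype(int)

model = LogisticRegression().fit(X, y)
auc = roc_auc_score(y, model.predict_proba(X)[:, 1])
print(f"AUC = {auc:.2f}")          # values near 1.0 indicate good discrimination of the class
```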
MODIS satellite data illustrated that fire frequency within the last decade supported the classification of much of western China as relatively fire-free; however, much of south-eastern China experiences more fire activity than predicted with our models, probably as a function of the extensive use of fire by people. Conclusions: While acknowledging there are many cultural, environmental and historical differences between the United States and China, our fire regime models based on climate data demonstrate potential historical fire regimes for China, and propose that large areas of China share historical fire–vegetation–climate complexes with the United States. [source]

Administrative Reform: Changing Leadership Roles?
GOVERNANCE, Issue 4 2001
Tom Christensen
Three interwoven change elements characterize New Public Management: substantial horizontal and vertical specialization, substituting an integrated sector model for a fragmented functional model, and extensive use of contracts as part of a "make the manager manage" kind of incentive system. This article discusses the effects and implications of these reform elements on political-democratic processes in general, and on political, administrative, and public enterprise leadership roles more specifically. Examples from Norway and New Zealand illustrate the discussion. [source]

Pesticides in Ground Water of the United States, 1992–1996
GROUND WATER, Issue 6 2000
Dana W. Kolpin
During the first cycle of the National Water Quality Assessment (1992–1996), ground water in 20 of the nation's major hydrologic basins was analyzed for 90 pesticide compounds (pesticides and degradates). One or more of the pesticide compounds examined were detected at 48.4% of the 2485 ground water sites sampled. However, at approximately 70% of the sites where pesticides were detected, two or more of the pesticide compounds analyzed were present, documenting the prevalence of pesticide mixtures in ground water. The pesticide concentrations encountered were generally low, with the median total concentration (summation of concentrations for the 90 pesticide compounds) being 0.046 µg/L. Pesticides were commonly detected in shallow ground water beneath both agricultural (60.4%) and urban (48.5%) areas. This is an important finding because, although agricultural activities have long been associated with pesticide contamination, urban areas have only recently been recognized as a potential source of these types of compounds. Pesticides with higher frequencies of detection were generally those with more extensive use, greater environmental persistence, and greater mobility in ground water (lower soil-water partition coefficients). [source]

How States Augment the Capabilities of Technology-Pioneering Firms
GROWTH AND CHANGE, Issue 2 2002
Maryann P. Feldman
State governments offer a variety of programs to assist technology-intensive entrepreneurial firms, yet there is a limited understanding of how firms use these programs. This paper provides a framework for categorizing state technology programs and uses detailed case studies to examine how these programs augment firms' capabilities. It is concluded that firms made extensive use of state programs that provide access to university intellectual property and research facilities. In addition, firms participated in programs that provided incentives for faculty to conduct joint research with industry. Finally, state venture capital programs, though small relative to federal R&D grants or venture capital, appear to nurture firms' development.
[source]

Coding Response to a Case-Mix Measurement System Based on Multiple Diagnoses
HEALTH SERVICES RESEARCH, Issue 4p1 2004
Colin Preyra
Objective. To examine the hospital coding response to a payment model using a case-mix measurement system based on multiple diagnoses and the resulting impact on a hospital cost model. Data Sources. Financial, clinical, and supplementary data for all Ontario short-stay hospitals from years 1997 to 2002. Study Design. Disaggregated trends in hospital case-mix growth are examined for five years following the adoption of an inpatient classification system making extensive use of combinations of secondary diagnoses. Hospital case mix is decomposed into base and complexity components. The longitudinal effects of coding variation on a standard hospital payment model are examined in terms of payment accuracy and impact on adjustment factors. Principal Findings. Introduction of the refined case-mix system provided incentives for hospitals to increase reporting of secondary diagnoses and resulted in growth in highest-complexity cases that was not matched by increased resource use over time. Despite a pronounced coding response on the part of hospitals, the increase in measured complexity and case mix did not reduce the unexplained variation in hospital unit cost, nor did it reduce the reliance on the teaching adjustment factor, a potential proxy for case mix. The main implication was changes in the size and distribution of predicted hospital operating costs. Conclusions. Jurisdictions introducing extensive refinements to standard diagnosis-related group (DRG)-type payment systems should consider the effects of induced changes to hospital coding practices. Assessing model performance should include analysis of the robustness of classification systems to hospital-level variation in coding practices. Unanticipated coding effects imply that case-mix models hypothesized to perform well ex ante may not meet expectations ex post. [source]

Cardiovascular dialysis instability and convective therapies
HEMODIALYSIS INTERNATIONAL, Issue 2006
Antonio Santoro
Abstract: Acute hypotension is a frequent hemodialysis complication. Intratreatment vascular instability is a multifactorial process in which procedure-related and patient-related factors may influence the decrease in plasma volume and induce an impairment of cardiovascular regulatory mechanisms. Identification of the most susceptible patients and of the various risk factors may contribute to significantly improving cardiovascular stability during dialysis. In some high-risk patients, monitoring and biofeedback of the various hemodynamic variables, together with an extensive use of convection, can prevent the appearance of symptomatic hypotension and help in averting its onset. [source]

Ultrathin Carbon Nanotube Mat Electrodes for Enhanced Amperometric Detection
ADVANCED MATERIALS, Issue 30 2009
Ioana Dumitrescu
A cCVD method for producing transparent, conducting carbon nanotube (CNT) mats of near-complete surface coverage is described. Disk-shaped UMEs using the CNT mats reveal reversible electrochemistry for outer-sphere redox species and remarkably low capacitance. CNT mat UMEs can be used for the electrochemical detection of dopamine at micromolar concentrations in albumin solution, with no decrease in electrode performance even after extensive use.
[source]

The evolution of mathematical immunology
IMMUNOLOGICAL REVIEWS, Issue 1 2007
Yoram Louzoun
Summary: The types of mathematical models used in immunology and their scope have changed drastically in the past 10 years. Classical models were based on ordinary differential equations (ODEs), difference equations, and cellular automata. These models focused on the 'simple' dynamics obtained between a small number of reagent types (e.g. one type of receptor and one type of antigen or two T-cell populations). With the advent of high-throughput methods, genomic data, and unlimited computing power, immunological modeling shifted toward the informatics side. Many current applications of mathematical models in immunology are now focused around the concepts of high-throughput measurements and system immunology (immunomics), as well as the bioinformatics analysis of molecular immunology. The types of models have shifted from mainly ODEs of simple systems to the extensive use of Monte Carlo simulations. The transition to a more molecular and more computer-based attitude is similar to the one occurring over all the fields of complex systems analysis. An interesting additional aspect in theoretical immunology is the transition from an extreme focus on the adaptive immune system (that was considered more interesting from a theoretical point of view) to a more balanced focus taking into account the innate immune system also. We here review the origin and evolution of mathematical modeling in immunology and the contribution of such models to many important immunological concepts. [source]

The decline of incentive pay in British manufacturing
INDUSTRIAL RELATIONS JOURNAL, Issue 4 2010
James Arrowsmith
Abstract: Motivation theories and the strategic pay literature envisage that the management of employees can be well served by financial incentives and other forms of pay flexibility. Traditionally, UK manufacturing has made extensive use of variable payments systems (VPS), notably piece-work and bonuses, but these have declined at the same time as managerial control over pay-setting has increased. Evidence from six case studies suggests that a focus on pay is only part of the picture. Increased competition and change makes the design of VPS more complex, and new forms of work organisation become the focus of performance. In this context, firms have (i) abandoned individual incentive pay and (ii) aggregated VPS in support of broader objectives. [source]

Developing a dynamic project learning and cross-project learning capability: synthesizing two perspectives
INFORMATION SYSTEMS JOURNAL, Issue 6 2008
Sue Newell
Abstract: Driven by the complexity of new products and services, project work has become increasingly common in all types of organizations. However, research on project learning suggests that often project teams do not meet their stated objectives and, moreover, there is limited organizational learning from the experiences of project work. We use the dynamic capabilities framework to argue that building a dynamic project learning capability is useful for organizations that make extensive use of projects. We use both survey and interview data to discuss the key ways in which such a dynamic capability can be built. Our survey data demonstrate the importance of documenting project learning, but our interview data show that teams are often remiss at documenting their learning.
The results from the two different approaches are synthesized using Boland & Tenkasi's notions of perspective-making and perspective-taking. Importantly, combining the results from the two sets of data suggests that organizations need to emphasize the benefits from project reviews and documentation and explore ways in which the documents produced can be made more useful as boundary objects to encourage the sharing of learning across projects. [source]

IT for niche companies: is an ERP system the solution?
INFORMATION SYSTEMS JOURNAL, Issue 1 2007
Kai A. Olsen
Abstract: Niche companies are by definition idiosyncratic. They survive in a competitive world by mastering a small market niche, providing what their customers need. This often requires a flexible organization and the ability to customize products. To be more efficient, many of these companies rely on extensive use of IT, often by installing general Enterprise Resource Planning (ERP) systems. These systems have grown from isolated systems that handle planning based on incoming orders and the component structure of the various products, to systems with ambitions to embrace the total functioning of the company, including vendor and customer relation management. In this paper, we present four case studies. One company is a part of a large enterprise, but performs niche functions within this enterprise. The other three are small- or medium-sized enterprises. Each of these performs in small niche markets. Common to all is the fact that they encounter problems with the utilization of their ERP systems. The major problem seems to be that the ERP system has an inherent business model that may not conform to the needs of the company. Without a good understanding of the underlying models and the constraints under which the fundamental algorithms operate, it is difficult to use these systems correctly. Even excellent systems may give bad results if they are applied to situations where they are not suited. Further, the monolithic structure of an ERP system, with a rather complicated parameter setting, is often insufficient to mould the system to the needs of a niche company. We discuss these problems based on our four case studies, and offer alternative approaches that may be considered. [source]

Directional response of a reconstituted fine-grained soil – Part I: experimental investigation
INTERNATIONAL JOURNAL FOR NUMERICAL AND ANALYTICAL METHODS IN GEOMECHANICS, Issue 13 2006
Daniele Costanzo
Abstract: This paper discusses the results of a large experimental program designed to investigate in a systematic manner the main features of the incremental response of fine-grained soils. The results are obtained from triaxial stress probing experiments carried out on a French silty clay (Beaucaire Marl). All the tests have been performed on reconstituted specimens, normally consolidated to an initial state which is either isotropic or anisotropic. In the interpretation of the experimental results, extensive use is made of the concept of strain response envelope. The response envelopes obtained for different stress increment magnitudes are remarkably consistent with each other and indicate an inelastic and irreversible material response, i.e. a strong dependence on the stress increment direction, also at relatively small strain levels.
A companion paper (Int. J. Numer. Anal. Meth. Geomech., this issue, 2006) assesses the performance of some advanced constitutive models in reproducing the behaviour of reconstituted Beaucaire Marl as observed in this experimental program. Copyright © 2006 John Wiley & Sons, Ltd. [source]

Classical and advanced multilayered plate elements based upon PVD and RMVT. Part 2: Numerical implementations
INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN ENGINEERING, Issue 3 2002
Abstract: This paper presents numerical evaluations related to the multilayered plate elements which were proposed in the companion paper (Part 1). Two-dimensional modellings with linear and higher-order (up to fourth-order) expansion in the z plate/layer thickness direction have been implemented for both displacements and transverse stresses. Layer-wise as well as equivalent single-layer modellings are considered in both frameworks of the principle of virtual displacements and the Reissner mixed variational theorem. Such a variety has led to the implementation of 22 plate theories. As far as finite element approximation is concerned, three quadrilaterals have been considered (four-, eight- and nine-noded plate elements). As a result, 22×3 different finite plate elements have been compared in the present analysis. The automatic procedure described in Part 1, which made extensive use of indicial notations, has herein been referred to in the considered computer implementations. An assessment has been made as far as convergence rates, numerical integrations and comparison to correspondent closed-form solutions are concerned. Extensive comparison to early and recently available results has been made for sample problems related to laminated and sandwich structures. Classical formulations, full mixed, hybrid, as well as three-dimensional solutions have been considered in such a comparison. Numerical substantiation of the importance of the fulfilment of zig-zag effects and interlaminar equilibria is given. The superiority of RMVT-formulated finite elements over those related to PVD has been concluded. Two test cases are proposed as 'desk-beds' to establish the accuracy of the several theories. Results related to all the developed theories are presented for the first test case. The second test case, which is related to sandwich plates, restricts the comparison to the most significant implemented finite elements. It is proposed to refer to these test cases to establish the accuracy of existing or new higher-order, refined or improved finite elements for multilayered plate analyses. Copyright © 2002 John Wiley & Sons, Ltd. [source]