Selected Abstracts

A Flexible Approach to Measurement Error Correction in Case–Control Studies
BIOMETRICS, Issue 4 2008
A. Guolo

Summary. We investigate the use of prospective likelihood methods to analyze retrospective case–control data where some of the covariates are measured with error. We show that prospective methods can be applied and the case–control sampling scheme can be ignored if one adequately models the distribution of the error-prone covariates in the case–control sampling scheme. Indeed, subject to this, the prospective likelihood methods result in consistent estimates and information standard errors are asymptotically correct. However, the distribution of such covariates is not the same in the population and under case–control sampling, dictating the need to model the distribution flexibly. In this article, we illustrate the general principle by modeling the distribution of the continuous error-prone covariates using the skew-normal distribution. The performance of the method is evaluated through simulation studies, which show satisfactory results in terms of bias and coverage. Finally, the method is applied to the analysis of two data sets which refer, respectively, to a cholesterol study and a study on breast cancer. [source]

A Flexible Approach to (S)-5-Alkyl Tetramic Acid Derivatives: Application to the Asymmetric Synthesis of (+)-Preussin and Protected (3S,4S)-AHPPA
CHEMINFORM, Issue 12 2004
Pei-Qiang Huang

Abstract. For Abstract see ChemInform Abstract in Full Text. [source]

The scope of nursing in Australia: a snapshot of the challenges and skills needed
JOURNAL OF NURSING MANAGEMENT, Issue 2 2003
Jacqueline Jones RN

Contemporary nursing is an increasingly complex concept encompassing and encapsulating wide variation under the broad rubric of the nursing workplace.
This paper reports on a study that was designed to contribute to understandings of nursing practice by describing what nurses in Australia do every day in various practice and work settings, the types of skills they need, the challenges they face and the interactions nurses have with other health workers. Drawing on the research which informed the National Review of Nurse Education in Australia in 2001, the paper raises issues critical to the management of contemporary nursing practice. Flexible approaches both to the day-to-day management of nurses and nursing, and to educational preparation in partnership with key stakeholders, are a necessity if the management of nursing is to keep pace with constant change in health care systems and facilitate the attraction and retention of nurses in those systems. [source]

Tunable scheduling in a GridRPC framework
CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 9 2008
A. Amar

Abstract. Among existing grid middleware approaches, one simple, powerful, and flexible approach consists of using servers available in different administrative domains through the classic client–server or remote procedure call paradigm. Network Enabled Servers (NES) implement this model, also called GridRPC. Clients submit computation requests to a scheduler, whose goal is to find a server available on the grid using some performance metric. The aim of this paper is to give an overview of a NES middleware developed in the GRAAL team, the Distributed Interactive Engineering Toolbox (DIET), and to describe recent developments around plug-in schedulers, workflow management, and tools. DIET is a hierarchical set of components used for the development of applications based on computational servers on the grid. Copyright © 2007 John Wiley & Sons, Ltd. [source]

Coaxial Aerodynamically Assisted Bio-jets: A Versatile Paradigm for Directly Engineering Living Primary Organisms
ENGINEERING IN LIFE SCIENCES (ELECTRONIC), Issue 6 2007
S. Irvine

Abstract. In this paper, a coaxial jetting methodology is demonstrated as a first example of a completely non-electric-field-driven approach, run by aerodynamic forces brought about by the application of a differential pressure, for the safe handling of primary living organisms by means of jets as encapsulated droplets. Previously, this jetting technique in this configuration had only been investigated for processing combinations of liquid–liquid and liquid–gas systems. These developmental studies into aerodynamically assisted jets (AAJ) have unearthed a versatile bio-jetting approach referred to here as coaxial aerodynamically assisted bio-jetting (CAABJ). In the current work, this flexible approach is demonstrated to handle two primary cell types for dropping and placing them onto several different substrates. Furthermore, the study assesses the viability of the post-treated cells in comparison to controls by way of flow cytometry. These first steps demonstrate the promise this protocol holds for creating biologically viable structures that encapsulate cells, which would be useful as a direct tissue-engineering complement to the immuno-hiding methodology in bio-repair and therapeutics. These investigations therefore place CAABJ alongside bio-electrosprays in the cell-jetting pursuit, a field set to undergo explosive developmental research. [source]

On the use of generalized linear models for interpreting climate variability
ENVIRONMETRICS, Issue 7 2005
Richard E. Chandler

Abstract. Many topical questions in climate research can be reduced to either of two related problems: understanding how various components of the climate system affect each other, and quantifying changes in the system. This article aims to justify the addition of generalized linear models to the climatologist's toolkit by demonstrating that they offer an intuitive and flexible approach to such problems.
In particular, we provide some suggestions as to how 'typical' climatological data structures may be represented within the GLM framework. Recurring themes include methods for space–time data and the need to cope with large datasets. The ideas are illustrated using a dataset of monthly U.S. temperatures. Copyright © 2005 John Wiley & Sons, Ltd. [source]

A case-based reasoning approach to derive object-oriented models from software architectures
EXPERT SYSTEMS, Issue 4 2010
German L. Vazquez

Abstract. Software architectures are very important for capturing early design decisions and reasoning about quality attributes of a system. Unfortunately, there are mismatches between the quality attributes prescribed by the architecture and those realized by its object-oriented implementation. These mismatches decrease the ability to reason architecturally about the system. Developing an object-oriented materialization that conforms to the original architecture depends both on the application of the right patterns and on the developer's expertise. Since the space of allowed materializations can be very large, tool support for assisting the developer in the exploration of alternative materializations is of great help. In previous research, we developed a prototype for generating quality-preserving implementations of software architectures, using pre-compiled knowledge about architectural styles and frameworks. In this paper, we present a more flexible approach, called SAME, which focuses on the architectural connectors as the pillars of the materialization process. The SAME design assistant applies a case-based reasoning (CBR) metaphor to deal with connector-related materialization experiences and quality attributes. The CBR engine is able to recall and adapt past experiences to solve new materialization problems; thus SAME can take advantage of developers' knowledge.
Preliminary experiments have shown that this approach can improve the exploration of object-oriented solutions that remain faithful to the architectural prescriptions. [source]

2D data modelling by electrical resistivity tomography for complex subsurface geology
GEOPHYSICAL PROSPECTING, Issue 2 2006
E. Cardarelli

ABSTRACT. A new tool for two-dimensional apparent-resistivity data modelling and inversion is presented. The study is developed according to the idea that the best way to deal with the ill-posedness of geoelectrical inverse problems lies in constructing algorithms which allow flexible control of the physical and mathematical elements involved in the resolution. The forward problem is solved through a finite-difference algorithm, whose main features are a versatile user-defined discretization of the domain and a new approach to the solution of the inverse Fourier transform. The inversion procedure is based on an iterative smoothness-constrained least-squares algorithm. As mentioned, the code is constructed to ensure flexibility in resolution. This is first achieved by starting the inversion from an arbitrarily defined model. In our approach, a Jacobian matrix is calculated at each iteration, using a generalization of Cohn's network sensitivity theorem. Another versatile feature is the introduction of a priori information about the solution. Regions of the domain can be constrained to vary between two limits (lower and upper bounds) by using inequality constraints. A second possibility is to include the starting model in the objective function used to determine an improved estimate of the unknown parameters, and to constrain the solution to that model. Furthermore, the possibility either of defining a discretization of the domain that exactly fits the underground structures or of refining the mesh of the grid leads to more accurate solutions. Control of the mathematical elements in the inversion algorithm is also allowed.
The smoothness matrix can be modified in order to penalize roughness in any one direction. An empirical way of assigning the regularization parameter (damping) is defined, but the user can also assign it manually at each iteration. An appropriate tool was constructed for handling the inversion results, for example to correct reconstructed models and to check the effects of such changes on the calculated apparent resistivity. Tests on synthetic and real data, in particular in handling indeterminate cases, show that this flexible approach is a good way to build a detailed picture of the prospected area. [source]

BEYOND COMPARISON: HISTOIRE CROISÉE AND THE CHALLENGE OF REFLEXIVITY
HISTORY AND THEORY, Issue 1 2006
MICHAEL WERNER

ABSTRACT. This article presents, in a programmatic way, the histoire croisée approach, its methodological implications and its empirical developments. Histoire croisée draws on the debates about comparative history, transfer studies, and connected or shared history that have been carried out in the social sciences in recent years. It invites us to reconsider the interactions between different societies or cultures, erudite disciplines or traditions (more generally, between social and cultural productions). Histoire croisée focuses on empirical intercrossings consubstantial with the object of study, as well as on the operations by which researchers themselves cross scales, categories, and viewpoints. The article first shows how this approach differs from purely comparative or transfer studies. It then develops the principles of pragmatic and reflexive induction as a major methodological principle of histoire croisée. While underlining the need for, and the methods of, a historicization of both the objects and categories of analysis, it calls for a reconsideration of the way history can combine empirical and reflexive concerns into a dynamic and flexible approach. [source]
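The smoothness-constrained least-squares inversion described in the geophysical prospecting abstract above can be illustrated with a minimal sketch. This is not the authors' code: the toy setup (an identity forward operator, a first-difference roughness matrix and a single damping parameter `lam`) is an illustrative assumption, showing only how a roughness penalty trades data fit for smoothness.

```python
# Minimal sketch of smoothness-constrained (Tikhonov) least squares:
# minimize ||d - G m||^2 + lam * ||L m||^2, solved via the normal
# equations (G^T G + lam L^T L) m = G^T d.
# Toy setup only: G is the identity and L a first-difference operator;
# a real resistivity code would use a finite-difference forward model
# and a Jacobian recomputed at each iteration.

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    return [list(col) for col in zip(*A)]

def solve(A, b):
    """Gaussian elimination with partial pivoting (A is n x n)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for j in range(c, n + 1):
                M[r][j] -= f * M[c][j]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][j] * x[j] for j in range(r + 1, n))) / M[r][r]
    return x

def smooth_inversion(d, lam):
    """Solve the damped normal equations for a toy identity forward model."""
    n = len(d)
    G = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    L = [[(-1.0 if j == i else 1.0 if j == i + 1 else 0.0)
          for j in range(n)] for i in range(n - 1)]
    Gt, Lt = transpose(G), transpose(L)
    A = [[g + lam * l for g, l in zip(gr, lr)]
         for gr, lr in zip(matmul(Gt, G), matmul(Lt, L))]
    rhs = [sum(Gt[i][k] * d[k] for k in range(n)) for i in range(n)]
    return solve(A, rhs)

def roughness(m):
    return sum((m[i + 1] - m[i]) ** 2 for i in range(len(m) - 1))

data = [1.0, 5.0, 1.0, 1.0]        # toy data with a spike
m0 = smooth_inversion(data, 0.0)   # no damping: reproduces the data
m1 = smooth_inversion(data, 10.0)  # heavy damping: a much smoother model
```

Raising `lam` penalizes roughness and pulls the recovered model toward a flat profile; the abstract's directional penalties correspond to weighting the rows of `L` differently.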
McGrath v Riddell: A flexible approach to the insolvency distribution rules?
INTERNATIONAL INSOLVENCY REVIEW, Issue 1 2010
Blanca Mamutse

The rules relating to the division of the insolvent estate assume considerable importance in the field of international insolvencies, where different legal systems interact. International instruments, including the European Insolvency Regulation and the UNCITRAL Model Law on Cross-Border Insolvency, have provided a framework which governs the relationship between local and foreign distribution schemes. For English lawyers, however, questions remain regarding the future role of the courts' statutory power to cooperate with the courts of 'relevant' countries or territories, and of the common law principle of universalism. An important issue connected to the determination of such questions is the established judicial approach to the pari passu rule in the application of domestic law. This paper examines the manifestation of this tension in the litigation arising from the collapse of the HIH Casualty & General Insurance group of companies. It notes the scope which remains for continued resort to the statutory power of cooperation, and the potential for the Cross-Border Insolvency Regulations 2006 to encourage a more flexible approach to resolving differences between distribution schemes. Copyright © 2010 John Wiley & Sons, Ltd. [source]

A dynamic bandwidth resource allocation based on neural networks in the euroskyway multimedia satellite system
INTERNATIONAL JOURNAL OF COMMUNICATION SYSTEMS, Issue 1 2003
P. Camarda

Abstract. Advanced traffic management based on dynamic resource assignment allows a broadband satellite system such as euroskyway (ESW) to dynamically assign the resources of connections. The mechanisms of dynamic assignment exploit variations in the burstiness exhibited by real-time and non-real-time variable bit rate traffic sources to perform an optimized resource redistribution.
The efficiency of the dynamic bandwidth allocation capability (DBAC) depends on the accuracy of the traffic source description; an inaccurate assessment of the arrival process will cause overhead and degraded utilisation of system resources. In this paper a flexible traffic burstiness predictor for dynamic bandwidth resource allocation based on a neural network is presented. The approach performs an online estimation of expected resource requests, implementing traffic resource assignment by using a sub-symbolic adaptive representation of the traffic source. The achieved results prove that the flexible approach is more effective than approaches based on fixed schemes designed using an analytical traffic source description, when applied to the satellite terminal component of the ESW system. Copyright © 2003 John Wiley & Sons, Ltd. [source]

A flexible approach to evaluating soft conditions with unequal preferences in fuzzy databases
INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, Issue 7 2007
Gloria Bordogna

A flexible model for evaluating soft queries with unequal preferences in fuzzy databases is proposed. We assume that conditions with unequal preferences have an exclusive meaning, as in the request "find a holiday accommodation such that big apartments are preferred to high-rating hotels." In this case it is assumed that the aggregator of the soft conditions is an implicit OR. Conversely, conditions with unequal importance have an inclusive meaning, as in the query "find a house to rent that is cheap (most important), big (important), new (fairly important)." In this case the implicit aggregator is an AND. What we propose in this article is to model preferences as modifiers of the semantics of the evaluation function of the conditions. Because the soft conditions are aggregated by an OR, the more a soft condition is preferred, the more its evaluation function tolerates a greater undersatisfaction of the soft condition.
The proposed approach is formalized by considering two alternative semantics for the evaluation function: the first defines the evaluation function by means of a generalized fuzzy inclusion measure, and the second as a generalized similarity measure. These functions are parameterized so that their modification is achieved simply by tuning the functions' parameters. © 2007 Wiley Periodicals, Inc. Int J Int Syst 22: 665–689, 2007. [source]

Establishing contact and gaining trust: an exploratory study of care avoidance
JOURNAL OF ADVANCED NURSING, Issue 2 2010
Gert Schout

Schout G., De Jong G. & Zeelen J. (2010) Establishing contact and gaining trust: an exploratory study of care avoidance. Journal of Advanced Nursing 66(2), 324–333.

Abstract
Aim. This paper is a report of a study conducted to explore the competencies – especially deep-rooted personal qualities – of care providers who succeed in making contact and gaining trust with clients who are inclined to avoid the care they need.
Background. Demands, thresholds and fragmentation of services hinder the accessibility of health care, such that some severely mentally ill people do not receive the treatment they need or avoid healthcare services. Methods of establishing contact and gaining trust in mental health care include practical assistance, realistic expectations, establishing long-term goals, empathy and a client-centred and flexible approach.
Method. A public mental healthcare practice in The Netherlands with outstanding performance was studied from 2002 to 2007 using participant observation, interviews with experienced care providers, and interviews with clients with a long history of avoiding care facilities, conflicts and troubled relationships with care providers.
Findings. A number of personal qualities are vital for establishing contact and gaining trust with these clients: altruism, a degree of compassion, loyalty, involvement, tenacity, a critical attitude to the mainstream, flexibility, optimism, diplomacy, patience, creativity, and a certain degree of immunity to stress.
Conclusion. Care providers who establish contact and win trust employ 'non-judgemental appreciation'. They start from the acceptance of what is and try to connect with the client and their world. These professionals use their initial actions to identify and praise qualities and achievements of clients. This style of work is supported by a set of deep-rooted personal qualities we can summarize as 'empathy'. [source]

Schematic representation of case study research designs
JOURNAL OF ADVANCED NURSING, Issue 4 2007
John P. Rosenberg

Abstract
Aim. The paper is a report of a study to demonstrate how the use of schematics can provide procedural clarity and promote rigour in the conduct of case study research.
Background. Case study research is a methodologically flexible approach to research design that focuses on a particular case – whether an individual, a collective or a phenomenon of interest. It is known as the 'study of the particular' for its thorough investigation of particular, real-life situations, and is gaining increased attention in nursing and social research. However, the methodological flexibility it offers can leave the novice researcher uncertain of the procedural steps required to ensure methodological rigour.
Method. This article provides a real example of a case study research design that utilizes schematic representation, drawn from a doctoral study of the integration of health promotion principles and practices into a palliative care organization.
Discussion. The issues discussed are: (1) the definition and application of case study research design; (2) the application of schematics in research; (3) the procedural steps and their contribution to the maintenance of rigour; and (4) the benefits and risks of schematics in case study research.
Conclusion. The inclusion of visual representations of design with accompanying explanatory text is recommended in reporting case study research methods. [source]

An empowerment approach to needs assessment in health visiting practice
JOURNAL OF CLINICAL NURSING, Issue 5 2002
ANNA M. HOUSTON BSc

- This paper examines the usefulness of an integrated approach to needs assessment using an empowerment framework, within a health visitor/client interaction, in the home setting.
- It is intended to demonstrate the existence of a flexible approach to assessing need that is based on research about the processes necessary for carrying out health visiting.
- The design of the tool described in this paper allows the use of professional judgement as well as fulfilling commissioning requirements to address health outcomes.
- Health promotion and empowerment are central to health visiting practice and should be reflected in the way needs are assessed.
- Many NHS trusts have introduced a system of targeting and prioritizing health visiting through a system of questioning to assess needs. This may reveal the work that health visitors do, but may also inhibit the open, listening approach required for client empowerment.
- Different methods of assessing need can be used that do not compromise the commissioning requirements, the health visitor's duty of care or professional accountability.
- The empowerment approach is key to the philosophy of health visiting.
- There are ways of approaching needs assessment that do not compromise the ethos of partnership-working in a health-promoting way. [source]

Knowing – in Medicine
JOURNAL OF EVALUATION IN CLINICAL PRACTICE, Issue 5 2008
Joachim P. Sturmberg MBBS DORACOG MFM PhD FRACGP

Abstract. In this paper we argue that knowledge in health care is a multidimensional dynamic construct, in contrast to the prevailing idea of knowledge as an objective state. Polanyi demonstrated that knowledge is personal, that knowledge is discovered, and that knowledge has explicit and tacit dimensions. Complex adaptive systems science views knowledge simultaneously as a thing and a flow, constructed as well as in constant flux. The Cynefin framework is one model that helps our understanding of knowledge as a personal construct achieved through sense-making. Specific knowledge aspects temporarily reside in one of four domains – the known, knowable, complex or chaotic – but new knowledge can only be created by challenging the known, moving it into and looping it through the other domains. Medical knowledge is simultaneously explicit and implicit, with certain aspects already well known and easily transferable, and others that are not yet fully known and must still be learned. At the same time, certain knowledge aspects are predominantly concerned with content, whereas others deal with context. Though in clinical care we may operate predominantly in one knowledge domain, we will also operate some of the time in the others. Medical knowledge is inherently uncertain, and we require a context-driven, flexible approach to knowledge discovery and application, in clinical practice as well as in health service planning. [source]

Evaluating effectiveness of preoperative testing procedure: some notes on modelling strategies in multi-centre surveys
JOURNAL OF EVALUATION IN CLINICAL PRACTICE, Issue 1 2008
Dario Gregori PhD

Abstract
Rationale. In technology assessment in health-related fields, the construction of a model for interpreting the economic implications of the introduction of a technology is only a part of the problem.
The most important part is often the formulation of a model that can be used for selecting patients to submit to the new cost-saving procedure or medical strategy. The model is usually complicated by the fact that data are often non-homogeneous with respect to some uncontrolled variables and are correlated. The most typical example is the so-called hospital effect in multi-centre studies.
Aims and objectives. We show the implications of different choices in modelling strategies when evaluating the usefulness of preoperative chest radiography, an exam performed before surgery, usually with the aim of detecting unsuspected abnormalities that could influence the anaesthetic management and/or surgical plan.
Method. We analyze the data from a multi-centre study including more than 7000 patients. We use about 6000 patients to fit regression models using both a population-averaged and a subject-specific approach. We explore the limitations of these models when used for predictive purposes using a validation set of more than 1000 patients.
Results. We show the importance of taking into account the heterogeneity among observations and the correlation structure of the data, and propose an approach for integrating a population-averaged and a subject-specific approach into a single modelling strategy. We find that the hospital represents an important variable causing heterogeneity that influences the probability of a useful POCR.
Conclusions. We find it preferable to start with a marginal model, evaluate the shrinkage effect and eventually move to a more detailed model for the heterogeneity. This kind of flexible approach seems to be more informative at various phases of the model-building strategy. [source]
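The tension the abstract above describes between population-averaged and subject-specific (hospital-level) modelling can be illustrated with a toy partial-pooling sketch. All counts, names and the shrinkage weight below are illustrative assumptions, not the study's data or method: each hospital's raw rate of useful preoperative exams is pulled toward the pooled rate in proportion to how little data that hospital contributes.

```python
# Toy illustration of the "hospital effect": raw per-hospital rates
# (the subject-specific extreme) versus the pooled rate (the
# population-averaged extreme), with a simple shrinkage compromise.
# Hypothetical counts: (useful exams, total exams) per hospital.

hospitals = {
    "A": (12, 40),    # small centre, high raw rate
    "B": (35, 400),   # large centre
    "C": (2, 25),     # small centre, low raw rate
}

pooled_rate = (sum(u for u, n in hospitals.values())
               / sum(n for _, n in hospitals.values()))

def shrunk_rate(useful, total, prior_weight=50.0):
    """Pull the raw hospital rate toward the pooled rate; the fewer
    exams a hospital has, the stronger the pull (prior_weight acts
    like a pseudo-sample of pooled-rate exams)."""
    return (useful + prior_weight * pooled_rate) / (total + prior_weight)

raw = {h: u / n for h, (u, n) in hospitals.items()}
shrunk = {h: shrunk_rate(u, n) for h, (u, n) in hospitals.items()}
```

A marginal model reports only `pooled_rate`; a fully subject-specific model reports `raw`; the shrinkage estimates sit between the two, closer to the raw rate for hospitals that contribute more data, which is the "shrinkage effect" the conclusions recommend evaluating.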
Virtual colonoscopy: Issues in implementation
JOURNAL OF MEDICAL IMAGING AND RADIATION ONCOLOGY, Issue 1 2005
R Mendelson

Summary. The following issues and requirements related to the implementation of a CT colonography (CTC) service are important: (i) policies are needed regarding the indications for CTC. Concomitant with this is the need to educate potential referrers and patients: expectations of the procedure, particularly among general practitioners, may be unrealistic, and indications for referral may otherwise be inappropriate. At present there is no general acceptance of CTC for screening asymptomatic persons; (ii) a flexible approach to CT protocols is useful, depending on the indication for and clinical context of referral, and on the age and body habitus of the patient; (iii) attention to the special skills required of the reporting radiologist. While there is a temptation to regard CTC interpretation as an extension of the skills used in interpreting other cross-sectional images, some skills are specific to CTC, and there should be adequate provision for training; (iv) matters related to reporting, such as the reporting format and which lesions will or will not be reported; and (v) informed consent from the patient. Information should be provided with regard to the limitations of CTC, the implications of a positive finding, and radiation dosage. [source]

A Web-Based Interactive Database System for a Transcranial Doppler Ultrasound Laboratory
JOURNAL OF NEUROIMAGING, Issue 1 2006
Mark J. Gorman MD

ABSTRACT
Background. Variations between individual laboratories in transcranial Doppler (TCD) examination performance techniques and interpretive paradigms are a common challenge in the practice of TCD. Demand for rapid access to patient ultrasound examination data and reports for use in intensive care settings has necessitated a more flexible approach to data management.
Both of these issues may benefit from a computerized approach.
Methods. We describe the application of a World Wide Web-based database system for use in an ultrasound laboratory.
Results. Databasing information while generating a TCD report is efficient. Web accessibility allows rapid and flexible communication of time-sensitive report information and interpretation for more expeditious clinical decision making.
Conclusions. Web-based applications can extend the reach and efficiency of traditionally structured medical laboratories. [source]

Validating the organizational climate measure: links to managerial practices, productivity and innovation
JOURNAL OF ORGANIZATIONAL BEHAVIOR, Issue 4 2005
Malcolm G. Patterson

This paper describes the development and validation of a multidimensional measure of organizational climate, the Organizational Climate Measure (OCM), based upon Quinn and Rohrbaugh's Competing Values model. A sample of 6869 employees across 55 manufacturing organizations completed the questionnaire. The 17 scales contained within the measure had acceptable levels of reliability and were factorially distinct. Concurrent validity was measured by correlating employees' ratings with managers' and interviewers' descriptions of managerial practices and organizational characteristics. Predictive validity was established using measures of productivity and innovation. The OCM also discriminated effectively between organizations, demonstrating good discriminant validity. The measure offers researchers a relatively comprehensive and flexible approach to the assessment of organizational members' experience, and promises applied and theoretical benefits. Copyright © 2005 John Wiley & Sons, Ltd. [source]

A flexible approach to the design of new potent substance P receptor ligands
JOURNAL OF PHARMACY AND PHARMACOLOGY: AN INTERNATIONAL JOURNAL OF PHARMACEUTICAL SCIENCE, Issue 7 2001
R. Millet

The development of small-molecule antagonists of the substance-P-preferring tachykinin NK1 receptor offers an excellent opportunity to exploit these molecules as novel therapeutic agents in diverse pathologies such as depression, emesis or asthma. GR71251 has previously been identified as a potent and selective substance-P-receptor antagonist. We have therefore undertaken the synthesis of new pseudopeptidic analogues based on the C-terminal sequence of GR71251. The evaluation of binding affinities towards NK1 and NK2 receptors has enabled us to propose new selective NK1 ligands with high affinity. Structure–activity relationships showed that the Trp-OBzl(CF3)2 moiety is essential for NK1 affinity and that the introduction of building units such as spirolactam, lactam or proline, leading to a constrained peptide, increased selectivity for NK1 receptors. These compounds constitute a useful starting point for new substance P antagonists and represent an attractive lead series for further studies on the design of specific NK1 antagonists. [source]

Flexible modelling of neuron firing rates across different experimental conditions: an application to neural activity in the prefrontal cortex during a discrimination task
JOURNAL OF THE ROYAL STATISTICAL SOCIETY: SERIES C (APPLIED STATISTICS), Issue 4 2006
Carmen Cadarso-Suárez

Summary. In many electrophysiological experiments the main objectives include estimation of the firing rate of a single neuron, as well as a comparison of its temporal evolution across different experimental conditions. To accomplish these two goals, we propose a flexible approach based on the logistic generalized additive model including condition-by-time interactions. If an interaction of this type is detected in the model, we then establish that the use of temporal odds ratio curves is very useful in discriminating the conditions under which the firing probability is higher.
Bootstrap techniques are used for testing for interactions and for constructing pointwise confidence bands for the true odds ratio curves. Finally, we apply the new methodology to assessing relationships between neural response and decision-making in movement-selective neurons in the prefrontal cortex of behaving monkeys. [source]

The coming of age of agroforestry
JOURNAL OF THE SCIENCE OF FOOD AND AGRICULTURE, Issue 9 2007
PK Ramachandran Nair

The success of modern agricultural and forestry production can be largely attributed to monoculture systems using a few select species. In the drive to maximize yield and profit, the age-old tradition of using combined farming systems was essentially abandoned, and in some cases this has resulted in environmental problems such as land and water degradation and increased land clearing. During the last 30 years, however, the positive benefits of agroforestry to the producer and the environment have been increasingly recognized. Combining trees and crops in spatial or temporal arrangements has been shown to improve food and nutritional security and mitigate environmental degradation, offering a sustainable alternative to monoculture production. By providing supportive and complementary roles with a flexible approach, agroforestry can offer specific social and environmental benefits across a range of landscapes and economies. More research and effort are needed to explore the full potential of agroforestry applications and to fuel awareness. As the plethora of benefits of agroforestry is realized, modern land-use systems are evolving towards a more sustainable and holistic approach to land management.
Copyright © 2007 Society of Chemical Industry [source] C2 monitoring of cyclosporine in de novo liver transplant recipients: The clinician's perspective LIVER TRANSPLANTATION, Issue 5 2004 Federico Villamil Adjusting cyclosporine (CsA) dose based on blood concentration at 2 hours after dose (C2) has been shown in prospective clinical trials to reduce the risk of rejection compared with conventional trough monitoring. In addition, it provides equivalent efficacy to tacrolimus in liver transplant patients, with a favorable safety profile. Target C2 should be defined on an individual basis depending on adjunctive therapy and the level of exposure required. It appears less critical to achieve target C2 in the first few days after liver transplantation than was previously believed. Achieving target C2 exposure in the initial period after transplant requires that changes in the proportion of cyclosporine absorbed from the gut be taken into account to avoid risk of overexposure. In addition, if a starting dose of 10–15 mg/day is used, it is advisable to delay increasing the dose until a trend in C2 level indicates this to be necessary. Immediate dose reduction is required if C2 exceeds the target range. In patients with low C2 values, cyclosporine concentration at a later time point should be measured to establish whether the patient is a poor absorber or a delayed absorber of cyclosporine, and dose adjustments should be undertaken accordingly. In conclusion, this more flexible approach to C2 monitoring allows the dose of cyclosporine to be individualized effectively for each patient, which results in significant efficacy benefits while minimizing the risk of toxicity. (Liver Transpl 2004;10:577–583.) [source] Interpreting the significance of drinking by alcohol-dependent liver transplant patients: Fostering candor is the key to recovery LIVER TRANSPLANTATION, Issue 6 2000 Robert M.
Weinrieb Few studies have examined the value of treating alcohol addiction either before or after liver transplantation. Nevertheless, most liver transplant programs and many insurance companies require 6 months to 1 year of abstinence from alcohol as a condition of eligibility for liver transplantation (the 6-month rule). We believe there are potentially harsh clinical consequences to the implementation of this rule. For example, the natural history of alcohol use disorders often involves brief fallbacks to drinking ("slips"), but when alcoholic liver transplant candidates slip, most are removed from consideration for transplantation or are required to accrue another 6 months of sobriety. Because there is no alternative treatment to liver transplantation for most patients with end-stage liver disease, the 6-month rule could be lethal in some circumstances. In this review, we survey the literature concerning the ability of the 6-month rule to predict drinking by alcoholic patients who undergo liver transplantation and examine its impact on the health consequences of drinking before and after liver transplantation. We believe that fostering candor between the alcoholic patient and the transplant team is the key to recovery from alcoholism. We conclude that it is unethical to force alcoholic liver patients who have resumed alcohol use while waiting for or after transplantation to choose between hiding their drinking to remain suitable candidates for transplantation and risking death by asking for treatment of alcoholism. Consequently, we advocate a flexible approach to clinical decision making for the transplant professional caring for an alcoholic patient who has resumed drinking and provide specific guidelines for patient management. [source] USING NETWORK ANALYSIS TO CHARACTERIZE FOREST STRUCTURE NATURAL RESOURCE MODELING, Issue 2 2008 MICHAEL M.
FULLER Abstract Network analysis quantifies different structural properties of systems of interrelated parts using a single analytical framework. Many ecological phenomena have network-like properties, such as the trophic relationships of food webs, the geographic structure of metapopulations, and species interactions in communities. Therefore, our ability to understand and manage such systems may benefit from the use of network-analysis techniques. But network analysis has not been applied extensively to ecological problems, and its suitability for ecological studies is uncertain. Here, we investigate the ability of network analysis to detect spatial patterns of species association in a tropical forest. We use three common graph-theoretic measures of network structure to quantify the effect of understory tree size on the spatial association of understory species with trees in the canopy: the node degree distribution (NDD), characteristic path length (CPL), and clustering coefficient (CC). We compute the NDD, CPL, and CC for each of seven size classes of understory trees. For significance testing, we compare the observed values to frequency distributions of each statistic computed from randomized data. We find that the ability of network analysis to distinguish observed patterns from those representing randomized data strongly depends on which aspects of structure are investigated. Analysis of NDD found no significant difference between randomized and observed networks. However, analysis of CPL and CC detected nonrandom patterns in three and one of the seven size classes, respectively. Network analysis is a very flexible approach that holds promise for ecological studies, but more research is needed to better understand its advantages and limitations. [source] Mixed-mode chromatography/isotope ratio mass spectrometry RAPID COMMUNICATIONS IN MASS SPECTROMETRY, Issue 5 2010 James S. O.
McCullagh Liquid chromatography coupled to molecular mass spectrometry (LC/MS) has been a standard technique since the early 1970s, but liquid chromatography coupled to high-precision isotope ratio mass spectrometry (LC/IRMS) has only been available commercially since 2004. This development has, for the first time, enabled natural abundance and low enrichment δ13C measurements to be applied to individual analytes in aqueous mixtures, creating new opportunities for IRMS applications, particularly for the isotopic study of biological molecules. A growing number of applications have been published in a range of areas including amino acid metabolism, carbohydrate studies, quantification of cellular and plasma metabolites, dietary tracer and nucleic acid studies. There is strong potential to extend these to new compounds and complex matrices, but several challenges face the development of LC/IRMS methods. To achieve accurate isotopic measurements, HPLC separations must provide baseline resolution between analyte peaks; however, the design of current liquid interfaces places severe restrictions on compatible flow rates and, in particular, mobile phase compositions. These restrictions create a significant challenge on which previous LC/IRMS reports have not focused. Accordingly, this paper will address aspects of chromatography in the context of LC/IRMS, in particular focusing on mixed-mode separations and their benefits in light of these restrictions. It aims to provide an overview of mixed-mode stationary phases and of ways to improve high aqueous separations through manipulation of parameters such as column length, temperature and mobile phase pH. The results of several practical experiments are given using proteinogenic amino acids and nucleosides, both of which are of noted importance in the LC/IRMS literature.
This communication aims to demonstrate that mixed-mode stationary phases provide a flexible approach given the constraints of LC/IRMS interface design and to act as a practical guide for the development of new chromatographic methods compatible with LC/IRMS applications. Copyright © 2010 John Wiley & Sons, Ltd. [source] PRIVACY, THE INDIVIDUAL AND GENETIC INFORMATION: A BUDDHIST PERSPECTIVE BIOETHICS, Issue 7 2009 SORAJ HONGLADAROM ABSTRACT Bioinformatics is a new field of study whose ethical implications involve a combination of bioethics, computer ethics and information ethics. This paper is an attempt to view some of these implications from the perspective of Buddhism. Privacy is a central concern in both computer/information ethics and bioethics, and with information technology being increasingly utilized to process biological and genetic data, the issue has become even more pronounced. Traditionally, privacy presupposes the individual self, but as Buddhism does away with the ultimate conception of an individual self, it has to find a way to analyse and justify privacy that does not presuppose such a self. It does this through a pragmatic conception that does not depend on positing a substantial self, which is then found to be unnecessary for an effective protection of privacy. As it may be possible one day to link genetic data to individuals, the Buddhist conception perhaps offers a more flexible approach, as what is considered to be integral to an individual person is not fixed in objectivity but depends on convention. [source] Bayesian Hierarchical Functional Data Analysis Via Contaminated Informative Priors BIOMETRICS, Issue 3 2009 Bruno Scarpa Summary A variety of flexible approaches have been proposed for functional data analysis, allowing both the mean curve and the distribution about the mean to be unknown. Such methods are most useful when there is limited prior information.
Motivated by applications to modeling of temperature curves in the menstrual cycle, this article proposes a flexible approach for incorporating prior information in semiparametric Bayesian analyses of hierarchical functional data. The proposed approach is based on specifying the distribution of functions as a mixture of a parametric hierarchical model and a nonparametric contamination. The parametric component is chosen based on prior knowledge, while the contamination is characterized as a functional Dirichlet process. In the motivating application, the contamination component allows unanticipated curve shapes in unhealthy menstrual cycles. Methods are developed for posterior computation, and the approach is applied to data from a European fecundability study. [source] Sensitivity Analysis for Nonrandom Dropout: A Local Influence Approach BIOMETRICS, Issue 1 2001 Geert Verbeke Summary. Diggle and Kenward (1994, Applied Statistics 43, 49–93) proposed a selection model for continuous longitudinal data subject to nonrandom dropout. It has provoked a large debate about the role of such models. The original enthusiasm was followed by skepticism about the strong but untestable assumptions on which this type of model invariably rests. Since then, the view has emerged that these models should ideally be made part of a sensitivity analysis. This paper presents a formal and flexible approach to such a sensitivity assessment based on local influence (Cook, 1986, Journal of the Royal Statistical Society, Series B 48, 133–169). The influence of perturbing a missing-at-random dropout model in the direction of nonrandom dropout is explored. The method is applied to data from a randomized experiment on the inhibition of testosterone production in rats. [source]
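The three graph statistics named in the forest-structure abstract above (node degree distribution, characteristic path length, and clustering coefficient) are standard graph-theoretic measures. The following sketch is illustrative only, not the authors' code; the toy graph and function names are assumptions. It computes all three for an undirected graph stored as an adjacency dict, in plain Python:

```python
from collections import deque

def degree_distribution(adj):
    """NDD: map from degree to the number of nodes with that degree."""
    dist = {}
    for nbrs in adj.values():
        dist[len(nbrs)] = dist.get(len(nbrs), 0) + 1
    return dist

def characteristic_path_length(adj):
    """CPL: mean shortest-path length over all connected ordered pairs (BFS)."""
    total = pairs = 0
    for src in adj:
        seen = {src: 0}                      # BFS distances from src
        queue = deque([src])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in seen:
                    seen[v] = seen[u] + 1
                    queue.append(v)
        total += sum(seen.values())          # distance to src itself is 0
        pairs += len(seen) - 1
    return total / pairs if pairs else 0.0

def clustering_coefficient(adj):
    """CC: mean local clustering over nodes of degree >= 2."""
    local = []
    for node, nbrs in adj.items():
        k = len(nbrs)
        if k < 2:
            continue
        # Count edges among this node's neighbors.
        links = sum(1 for i, a in enumerate(nbrs)
                    for b in nbrs[i + 1:] if b in adj[a])
        local.append(2.0 * links / (k * (k - 1)))
    return sum(local) / len(local) if local else 0.0

# Toy undirected graph: square 1-2-3-4 plus the diagonal 1-3.
adj = {1: [2, 3, 4], 2: [1, 3], 3: [1, 2, 4], 4: [1, 3]}
ndd = degree_distribution(adj)         # {3: 2, 2: 2}
cpl = characteristic_path_length(adj)  # 7/6; only the pair (2, 4) is 2 apart
cc = clustering_coefficient(adj)       # 5/6
```

The significance-testing step described in the abstract would then compare these observed values against the frequency distribution of the same statistics computed on many randomized versions of the network.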