Alternative Methods (alternative + methods)
Selected Abstracts

ALTERNATIVE METHODS IN THE ENDOSCOPIC MANAGEMENT OF DIFFICULT COMMON BILE DUCT STONES
DIGESTIVE ENDOSCOPY, Issue 2010
Dong Ki Lee

The endoscopic approach is accepted as the first-line treatment modality in the management of extrahepatic bile duct stones. Most large stones can be removed with a basket and mechanical lithotripsy after endoscopic sphincterotomy. Currently, in treating large extrahepatic bile duct stones, endoscopic papillary large balloon dilation with mid-incision endoscopic sphincterotomy is often performed instead of mechanical lithotripsy after full endoscopic sphincterotomy. Herein, we describe the concepts, proper indications, methods and complications of endoscopic papillary large balloon dilation with regard to currently published reports. In addition, intracorporeal lithotripsy by peroral cholangioscopy with an ultra-slim upper endoscope is introduced, which is more convenient than conventional intracorporeal lithotripsy using mother-baby endoscopy or percutaneous transhepatic cholangioscopy. Lastly, biliary stenting with administration of a choleretic agent is briefly reviewed as an alternative treatment option for frail and elderly patients with large impacted common bile duct stones.

Neural Network Earnings per Share Forecasting Models: A Comparative Analysis of Alternative Methods
DECISION SCIENCES, Issue 2 2004
Wei Zhang

ABSTRACT In this paper, we compare the forecasting accuracy of univariate and multivariate linear models that incorporate fundamental accounting variables (i.e., inventory, accounts receivable, and so on) with the forecast accuracy of neural network models.
Unique to this study is the focus of our comparison on the multivariate models, to examine whether neural network models incorporating the fundamental accounting variables can generate more accurate forecasts of future earnings than models assuming a linear combination of these same variables. We investigate four types of models: univariate-linear, multivariate-linear, univariate-neural network, and multivariate-neural network, using a sample of 283 firms spanning 41 industries. This study shows that the application of the neural network approach incorporating fundamental accounting variables results in forecasts that are more accurate than those of linear forecasting models. The results also reveal limitations in the forecasting capacity of investors in the security market when compared with neural network models.

An Investigation of Alternative Methods for Item Mapping in the National Assessment of Educational Progress
EDUCATIONAL MEASUREMENT: ISSUES AND PRACTICE, Issue 2 2001
Rebecca Zwick

What is item mapping and how does it aid test score interpretation? Which item mapping technique produces the most consistent results and most closely matches expert opinion?

New Alternative Methods to Teach Surgical Techniques for Veterinary Medicine Students despite the Absence of Living Animals. Is that an Academic Paradox?
ANATOMIA, HISTOLOGIA, EMBRYOLOGIA, Issue 3 2007

Summary Owing to heightened ethical awareness, veterinary schools are pursuing methods to preserve animal corpses used for surgical technique classes, in an attempt to reduce the use of living animals for teaching. Generally speaking, animal and human bodies are usually preserved with 10% aqueous formalin solution, especially for descriptive anatomy classes. Other possibilities include the use of glycerol, alcohol and phenol. At present, new fixatives have been developed to allow better and longer preservation of animal corpses in order to maintain organoleptic characteristics, i.e.
colour and texture, as close as possible to those students will encounter in living animals. Since 2004, surgical technique classes in our college have no longer used living animals for students' training. Instead, canine corpses chemically preserved with modified Larssen (MLS) and Laskowski (LS) solutions are used. The purpose of this study was to compare the biological quality of preservation achieved by these two solutions and to evaluate students' learning and acceptance of this new teaching method. Although both fixatives maintain body flexibility, LS solution failed to keep an ordinary tissue colouration (cadavers were intensely red) and tissue preservation was not adequate. By contrast, MLS solution did not alter the colouration of cadavers, which was fairly similar to that normally found in living animals. A remarkable characteristic was a very strong and unpleasant sugary odour in LS-preserved animals, and therefore the MLS solution was the method chosen to preserve cadavers for surgical technique classes. The students' feedback on the use of Larssen-preserved cadavers was very satisfactory: 96.6% of students were in favour of the use of cadavers for surgical training, and on average 91.8% (2002–2003) of students preferred the MLS solution as the chemical preservative, whereas only 8.2% chose LS solution for teaching purposes. From the students' point of view (95.1%), the ideal class would be initial training on MLS cadavers followed by classes with animals admitted to the Veterinary Hospital.

Alternative Methods for Developmental Toxicity Testing
BASIC AND CLINICAL PHARMACOLOGY & TOXICOLOGY, Issue 5 2006
Aldert H. Piersma

The aims of these investigations have been to reduce animal experimentation, to refine effect assessment and mechanistic studies, and to accelerate and simplify safety testing in an area of toxicology that uses relatively many animals.
Many alternatives have been developed over the years, of differing complexity, using biological material ranging from continuous cell lines to complete embryos. The validation of alternatives and their application in testing strategies is still in its infancy, although significant steps towards these aims are currently being made. The introduction of genomics technology is a promising emerging area in developmental toxicity testing in vitro. Future application of alternatives in testing strategies for developmental toxicity may gain significantly from the inclusion of gene expression studies, given the unique programme of gene expression changes in embryonic and foetal development.

Evolution of latex and its constituent defensive chemistry in milkweeds (Asclepias): a phylogenetic test of plant defense escalation
ENTOMOLOGIA EXPERIMENTALIS ET APPLICATA, Issue 1 2008
Anurag A. Agrawal

Abstract A tremendous diversity of plants exude sticky and toxic latex upon tissue damage, and its production has been widely studied as a defensive adaptation against insect herbivores. Here, we address variation in latex production and its constituent chemical properties (cardenolides and cysteine proteases) in 53 milkweeds [Asclepias spp. (Apocynaceae)], employing a phylogenetic approach to test macroevolutionary hypotheses of defense evolution. Species were highly variable for all three traits, and they showed little evidence for strong phylogenetic conservatism. Latex production and the constituent chemical defenses are thus evolutionarily labile and may evolve rapidly. Nonetheless, in phylogenetically independent analyses, we show that the three traits show some correlations (and thus share a correlated evolutionary history), including a positive correlation between latex exudation and cysteine protease activity. Conversely, latex exudation and cysteine protease activity both showed a trade-off with cardenolide concentrations in latex.
We also tested whether these traits have increased in their phenotypic values as the milkweeds diversified, as predicted by plant defense escalation theory. Alternative methods of testing this prediction gave conflicting results: there was an overall negative correlation between amount of evolutionary change and amount of latex exudation; however, ancestral state reconstructions indicated that most speciation events were associated with increases in latex. We conclude by (i) summarizing the evidence of milkweed latex itself as a multivariate defense including the amount exuded and toxin concentrations within, (ii) assessing the coordinated evolution of latex traits and how this fits with our previous notion of 'plant defense syndromes', and finally, (iii) proposing a novel hypothesis that includes an 'evolving community of herbivores' that may promote the escalation or decline of particular defensive strategies as plant lineages diversify.

Community Leader Education to Increase Epilepsy Attendance at Clinics in Epworth, Zimbabwe
EPILEPSIA, Issue 8 2000
D. E. Ball

Summary: Objective: To determine whether educating community leaders about epilepsy would lead to an increase in epilepsy cases being diagnosed and treated at primary health centers. Methods: This was a single-arm cohort study performed in Epworth, a periurban township outside Harare, Zimbabwe. The subjects were Epworth community leaders (Local Board members, teachers, nurses, police officers, traditional healers, prophets). Educational workshops were given on epilepsy, its cause, and its management, and the number of new epilepsy cases on local primary health clinic registers 6 months after the workshops was measured. Results: Six new cases were recorded, all among patients previously diagnosed with epilepsy. This was a significant increase (p = 0.02) compared with the null hypothesis.
Conclusion: Although there was a significant increase in new cases, these did not represent newly diagnosed patients. Significant prejudice within the community may still prevent identified patients with epilepsy from seeking treatment. Alternative methods must be sought to increase the awareness of epilepsy within low-income communities and to reach "hidden" people with epilepsy.

Efficient generation of human hepatocytes by the intrahepatic delivery of clonal human mesenchymal stem cells in fetal sheep
HEPATOLOGY, Issue 6 2007
Jason Chamberlain

Alternative methods to whole liver transplantation require a suitable cell that can be expanded to obtain the numbers required for successful transplantation while maintaining the ability to differentiate into hepatocytes. Mesenchymal stem cells (MSCs) possess several advantageous characteristics for cell-based therapy and have been shown to be able to differentiate into hepatocytes. Thus, we investigated whether the intrahepatic delivery of human MSCs is a safe and effective method for generating human hepatocytes and whether the route of administration influences the levels of donor-derived hepatocytes and their pattern of distribution throughout the parenchyma of the recipient's liver. Human clonally derived MSCs were transplanted by an intraperitoneal (n = 6) or intrahepatic (n = 6) route into preimmune fetal sheep. The animals were analyzed 56–70 days after transplantation by immunohistochemistry, enzyme-linked immunosorbent assay, and flow cytometry. The intrahepatic injection of human MSCs was safe and resulted in more efficient generation of hepatocytes (12.5% ± 3.5% versus 2.6% ± 0.4%). The animals that received an intrahepatic injection exhibited a widespread distribution of hepatocytes throughout the liver parenchyma, whereas an intraperitoneal injection resulted in a preferential periportal distribution of human hepatocytes that produced higher amounts of albumin.
Furthermore, hepatocytes were generated from MSCs without the need first to migrate to and lodge in the bone marrow and give rise to hematopoietic cells. Conclusion: Our studies provide evidence that MSCs are a valuable source of cells for liver repair and regeneration and that, by alteration of the site of injection, the generation of hepatocytes occurs in different hepatic zones, suggesting that a combined transplantation approach may be necessary to successfully repopulate the liver with these cells. (HEPATOLOGY 2007.)

An evaluation of two Rapid Access Chest Pain Clinics in central Lancashire, UK
JOURNAL OF EVALUATION IN CLINICAL PRACTICE, Issue 3 2007
Arif Rajpura BSc MBChB MBA MPH FFPH DRCOG DFFP

Abstract Aim: The aim of the project was to assess critically, using Maxwell's six dimensions, the quality of the services provided by the two Rapid Access Chest Pain Clinics (RACPCs) in Central Lancashire. Methods: Data on the actual use of the clinics were obtained from the two RACPCs. A record linkage exercise between the database of patients from the RACPCs and HES/mortality data was performed. Expected use of the clinics was established from the performance of other RACPCs and from published angina incidence figures. Patient and general practitioner views were obtained by conducting questionnaire surveys. Key recommendations: (1) The clinics provide a valuable service and should be continued. (2) A standardized database should be created which includes ethnicity and final diagnosis. (3) Alternative methods for rapid diagnosis and management of chest pain need to be provided for patients who are not suitable for the exercise electrocardiogram. (4) Referral criteria should be redrafted in order to remove the exclusion criteria for patients with chest pain of longer duration than 3 months. (5) Further resources need to be targeted at cardiology outpatients and revascularizations, as waiting times for patients with a positive test are felt to be too long.
Generalized least-squares parameter estimation from multiequation implicit models
AICHE JOURNAL, Issue 10 2003
Simon L. Marshall

Maximum likelihood fitting of nonlinear, implicit, multiple-response models to data containing normally distributed random errors can be carried out by a combination of the Gauss-Newton generalized nonlinear least-squares algorithm first described by Britt and Luecke in 1973 with a Fletcher-Reeves conjugate gradient search for initial parameter estimates. The convergence of the algorithm is further improved by adding a step-limiting procedure that ensures a reduction in the objective function at each iteration. Multiple-equation regression methods appropriate to the solution of explicit fixed-regressor models are derived from this general treatment as special cases. These include weighted nonlinear least squares (where the covariance matrix of the responses is known) and uniformly weighted nonlinear least squares (where the responses are uncorrelated and characterized by a single common variance). Alternative methods for fixed-regressor fits of explicit multiequation models with an unknown covariance matrix of the responses are also considered. The moment-matrix determinant criterion appropriate in such situations is also efficiently minimized by use of the conjugate-gradient algorithm, which is considerably less sensitive to the accuracy of the initial parameter estimates than the more usual Gauss-Newton methods. The performance of the new algorithm for models defined by one, two, and three implicit functional constraints per point is illustrated by random-regressor fits of isothermal p–X and p–X–Y vapor–liquid equilibrium data, and ternary liquid–liquid equilibrium data, respectively.
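The fixed-regressor special case above (weighted nonlinear least squares solved by Gauss-Newton iteration with a step-limiting line search) can be sketched as follows. This is a minimal illustration on a hypothetical exponential model with a finite-difference Jacobian, not the Britt-Luecke implicit-model algorithm itself:

```python
import numpy as np

def gauss_newton(model, x, y, w, beta0, tol=1e-10, max_iter=50):
    """Weighted nonlinear least squares by Gauss-Newton iteration.

    Minimizes sum_i w_i * (y_i - model(x_i, beta))**2, using a
    finite-difference Jacobian and simple step-halving (a crude
    stand-in for the step-limiting procedure that guarantees the
    objective decreases at every iteration)."""
    beta = np.asarray(beta0, dtype=float)

    def objective(b):
        r = y - model(x, b)
        return float(np.sum(w * r * r))

    for _ in range(max_iter):
        r = y - model(x, beta)
        # Finite-difference Jacobian: J[i, j] = d model_i / d beta_j
        h = 1e-7
        J = np.empty((y.size, beta.size))
        for j in range(beta.size):
            bp = beta.copy()
            bp[j] += h
            J[:, j] = (model(x, bp) - model(x, beta)) / h
        # Normal equations of the weighted, linearized problem
        W = np.diag(w)
        step = np.linalg.solve(J.T @ W @ J, J.T @ W @ r)
        # Step-halving: shrink the step until the objective decreases
        f0, lam = objective(beta), 1.0
        while objective(beta + lam * step) > f0 and lam > 1e-8:
            lam *= 0.5
        beta = beta + lam * step
        if np.linalg.norm(lam * step) < tol:
            break
    return beta
```

Fitting the hypothetical model y = a·exp(b·x) with unit weights recovers the generating parameters on noise-free data; the uniformly weighted case mentioned in the abstract corresponds to a constant w.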
Limitations of amorphous content quantification by isothermal calorimetry using saturated salt solutions to control relative humidity: Alternative methods
JOURNAL OF PHARMACEUTICAL SCIENCES, Issue 4 2010
Nawel Khalef

Abstract Despite the high sensitivity of isothermal calorimetry (IC), reported measurements of amorphous content by this technique show significant variability, even for the same compound. An investigation into the reasons behind such variability is presented, using amorphous lactose and salbutamol sulfate as model compounds. An analysis was carried out on the heat evolved as a result of the exchange of water vapor between the solid sample during crystallization and the saline solution reservoir. The use of saturated salt solutions as a means of controlling the vapor pressure of water within sealed ampoules bears inherent limitations that lead in turn to the variability associated with the IC technique. We present an alternative IC method, based on an open cell configuration, that effectively addresses the limitations encountered with the sealed ampoule system. The proposed approach yields an integral whose value is proportional to the amorphous content of the sample, thus enabling reliable and consistent quantification. © 2009 Wiley-Liss, Inc. and the American Pharmacists Association J Pharm Sci 99: 2080–2089, 2010

Neural Networks in 3D QSAR
MOLECULAR INFORMATICS, Issue 5 2003
David

Abstract 3D QSAR data analysis typically deals with large numbers of descriptors, and various methods have been used to cope with this multivariate problem. One popular method, CoMFA, makes use of Partial Least Squares (PLS) regression. Alternative methods of data analysis to PLS have been explored, including artificial neural networks (ANNs). Within 3D QSAR, ANNs have been successful in producing models showing reasonable predictive capability. This review examines recent studies in 3D QSAR and QSPR using ANN methods.
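The PLS regression mentioned above can be illustrated with a minimal one-component, NIPALS-style sketch on synthetic data. This shows only the latent-variable idea; CoMFA's field descriptors and full multi-component PLS are beyond the scope of this abstract:

```python
import numpy as np

def pls1_fit(X, y):
    """One-component PLS regression.

    Projects the centered predictors onto the single direction of
    maximal covariance with the response, then regresses the response
    on that latent score. Returns a prediction function."""
    Xm, ym = X.mean(axis=0), y.mean()
    Xc, yc = X - Xm, y - ym
    w = Xc.T @ yc                    # covariance-maximizing weight vector
    w = w / np.linalg.norm(w)
    t = Xc @ w                       # latent scores
    b = (t @ yc) / (t @ t)           # inner regression coefficient
    return lambda Xnew: ((Xnew - Xm) @ w) * b + ym
```

With many collinear descriptors and few compounds, ordinary least squares is ill-posed, whereas the single latent direction here remains estimable; real 3D QSAR work would cross-validate the number of components.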
Within-individual discrimination on the Concealed Information Test using dynamic mixture modeling
PSYCHOPHYSIOLOGY, Issue 2 2009
Izumi Matsuda

Abstract Whether an examinee has information about a crime is determined on the Concealed Information Test from autonomic differences between the crime-related item and other control items. Multivariate quantitative statistical methods have been proposed for this determination. However, these require specific databases of responses, which are problematic for field application. Alternative methods, using only an individual's data, are preferable, but traditionally such within-individual approaches have limitations because of small data sample size. The present study proposes a new within-individual judgment method, the hidden Markov discrimination method, in which time-series data are modeled with dynamic mixture distributions. This method was applied to experimental data and showed sufficient potential in discriminating guilty from innocent examinees in a mock theft experiment, compared with the performance of previous methods.

A privacy challenge to longitudinal study methods: patient-derived codes
AUSTRALIAN AND NEW ZEALAND JOURNAL OF PUBLIC HEALTH, Issue 4 2006
Fiona J. Clay

Recent changes to privacy legislation in Australia have resulted in more stringent requirements with respect to maintaining the confidentiality of patient health information. We describe a method employed to de-identify health information collected in a longitudinal study using codes. Using a patient-derived code that did not change during the life of the study follow-up resulted in errors in a quarter of the follow-up surveys. This may introduce bias that could compromise the validity of the study. Alternative methods of coding may alleviate some of these issues. However, removal of some of the constraints imposed by interpretations of privacy legislation may be the best way forward.
For debate: problems with the DMF index pertinent to dental caries data analysis
COMMUNITY DENTISTRY AND ORAL EPIDEMIOLOGY, Issue 6 2005
J. M. Broadbent

Abstract The Decayed, Missing, Filled (DMF) index has been used for over 50 years and is well established as the key measure of caries experience in dental epidemiology. Despite its long history of use, there is debate about the most appropriate number of surfaces to include for a missing tooth. Assigning the maximum possible value for the 'M' component of DMFS (Surfaces) leads to overestimation of an individual's caries experience, and hence biases any associated comparisons of caries experience, whereas assigning the minimum possible value for the 'M' component has the opposite effect. Alternative methods of assigning the number of caries-affected surfaces for an extracted tooth are considered. The net caries increment and adjusted caries increment (common methods of correcting the crude increment measure for reversals) are discussed, along with incidence density, a measure of caries extent. Problems exist with the adjusted caries increment, particularly among cohorts with low mean baseline caries experience. Development of an alternative method of estimating the relationship of 'true' and 'examiner' reversals is advocated, as well as greater utilization of incidence density in dental epidemiology.

Categorizing Urgency of Infant Emergency Department Visits: Agreement between Criteria
ACADEMIC EMERGENCY MEDICINE, Issue 12 2006
Rakesh D. Mistry MD

Abstract Background: The lack of valid classification methods for emergency department (ED) visit urgency has resulted in large variation in reported rates of nonurgent ED utilization. Objectives: To compare four methods of defining ED visit urgency with the criterion standard, implicit criteria, for infant ED visits.
Methods: This was a secondary data analysis of a prospective birth cohort of Medicaid-enrolled infants who made at least one ED visit in the first six months of life. Complete ED visit data were reviewed to assess urgency via implicit criteria. The explicit criteria (adherence to prespecified criteria via complete ED charts), ED triage, diagnosis, and resources methods were also used to categorize visit urgency. Concordance and agreement (κ) between the implicit criteria and alternative methods were measured. Results: A total of 1,213 ED visits were assessed. Mean age was 2.8 (SD ± 1.78) months, and the most common diagnosis was upper respiratory infection (21.0%). Using implicit criteria, 52.3% of ED visits were deemed urgent. Urgent visits using the other methods were as follows: explicit criteria, 51.8%; ED triage, 60.6%; diagnosis, 70.3%; and resources, 52.7%. Explicit criteria had the highest concordance (78.3%) and agreement (κ = 0.57) with implicit criteria. Of the limited-data methods, resources demonstrated the best concordance (78.1%) and agreement (κ = 0.56), while ED triage (67.9%) and diagnosis (71.6%) exhibited lower concordance and agreement (κ = 0.35 and κ = 0.42, respectively). Explicit criteria and resources equally misclassified urgency for 11.1% of visits; ED triage and diagnosis tended to overclassify visits as urgent. Conclusions: The explicit criteria and resources methods best approximate implicit criteria in classifying ED visit urgency in infants younger than six months of age. If confirmed in further studies, resources utilized has the potential to be an inexpensive, easily applicable method for urgency classification of infant ED visits when limited data are available.

Comparing risk profiles of individuals diagnosed with diabetes by OGTT and HbA1c: The Danish Inter99 study
DIABETIC MEDICINE, Issue 8 2010
R. Borg

Diabet. Med.
27, 906–910 (2010) Abstract Aims: Glycated haemoglobin (HbA1c) has been proposed as an alternative to the oral glucose tolerance test for diagnosing diabetes. We compared the cardiovascular risk profile of individuals identified by these two alternative methods. Methods: We assessed the prevalence of cardiovascular risk factors in individuals with undiagnosed diabetes according to the World Health Organization classification or by the newly proposed HbA1c level ≥ 6.5% among 6258 participants of the Danish Inter99 study. Receiver operating characteristic curve analysis assessed the ability of fasting and 2-h plasma glucose and HbA1c to distinguish between individuals at high and low risk of ischaemic heart disease, as predicted by the PRECARD program. Results: Prevalence of undiagnosed diabetes was 4.1% [95% confidence interval (CI) 3.7–4.7%] by the current oral glucose tolerance test definition, whereas 6.6% (95% CI 6.0–7.2%) had diabetes by HbA1c levels. HbA1c-defined individuals were relatively older, with higher proportions of men, smokers, lipid abnormalities and macro-albuminuria, but they were leaner and had lower blood pressure. HbA1c was better than fasting and 2-h plasma glucose at distinguishing between individuals at high and low predicted risk of ischaemic heart disease; however, the difference between HbA1c and fasting and 2-h plasma glucose was not statistically significant. Conclusions: Compared with the current oral glucose tolerance test definition, more individuals were classified as having diabetes based on the HbA1c criteria. This group had as unfavourable a risk profile as those identified by the oral glucose tolerance test.

Antireflux stents for palliation of malignant esophagocardial stenosis
DISEASES OF THE ESOPHAGUS, Issue 2 2007
K. Schoppmeyer

SUMMARY: Placement of self-expanding metal stents (SEMS) for palliation of malignant stenoses at the gastroesophageal junction is often associated with stent migration and reflux symptoms.
SEMS with an antireflux mechanism have been developed to overcome the latter problem. The aim of this study was to evaluate the safety and efficacy of antireflux Z-stents. Patients with advanced squamous cell or adenocarcinoma of the distal esophagus or cardia suffering from dysphagia received an antireflux Z-stent. Technical success, complications of the procedure, clinical symptoms before and after stent placement, reinterventions and survival were recorded. Follow-up was accomplished by patient interviews and a standardized questionnaire for primary care physicians. Eighteen consecutive patients received an antireflux Z-stent. Seventeen of the 18 stents were placed technically successfully in a single endoscopic procedure. Mean dysphagia score improved from 2.2 to 0.6. Four patients (22%) had permanent reflux symptoms, and an additional nine (50%) were taking proton pump inhibitors on a regular basis. In 10 patients, a reintervention was necessary, mainly due to dislocation of the stent. To ensure adequate nutrition, three patients received a percutaneous gastrostomy and two a jejunostomy. Median survival from stent insertion was 54 days (range, 3–201). Although placement of an antireflux Z-stent is technically feasible, its application is hampered by frequent stent migration and insufficient prevention of gastroesophageal reflux. Further technical improvements of stents or alternative methods such as brachytherapy are required for satisfactory palliation of malignant gastroesophageal stenosis.

Structure–activity relationships for the mutagenicity and carcinogenicity of simple and α,β-unsaturated aldehydes
ENVIRONMENTAL AND MOLECULAR MUTAGENESIS, Issue 3 2003
Romualdo Benigni

Abstract Aldehydes are important industrial compounds that are used for the synthesis of chemicals and pharmaceuticals and as solvents, food additives, and disinfectants.
Because of their reactivity, aldehydes are able to interact with electron-rich biological macromolecules, and adverse health effects have been reported, including general toxicity, allergic reactions, mutagenicity, and carcinogenicity. The cost, time, and number of animals necessary to screen these chemicals adequately place serious limitations on the number of aldehydes whose health effects can be studied and point to the need for alternative methods of assessing, at least in a preliminary way, the risks associated with the use of aldehydes. A method of choice is the study of quantitative structure–activity relationships (QSARs). In the present work, we present QSAR models for the mutagenicity and carcinogenicity of simple and α,β-unsaturated aldehydes. The models point to the role of electrophilicity, bulkiness, and hydrophobicity in the genotoxic activity of the aldehydes and lend themselves to the prediction of the activity of other untested chemicals of the same class. Environ. Mol. Mutagen. 42:136–143, 2003. © 2003 Wiley-Liss, Inc.

A methodology for inferring the causes of observed impairments in aquatic ecosystems
ENVIRONMENTAL TOXICOLOGY & CHEMISTRY, Issue 6 2002
Glenn W. Suter II

Abstract Biological surveys have become a common technique for determining whether aquatic communities have been injured. However, their results are not useful for identifying management options until the causes of apparent injuries have been identified. Techniques for determining causation have been largely informal and ad hoc. This paper presents a logical system for causal inference. It begins by analyzing the available information to generate causal evidence; available information may include spatial or temporal associations of potential cause and effect, field or laboratory experimental results, and diagnostic evidence from the affected organisms.
It then uses a series of three alternative methods to infer the cause: elimination of causes, diagnostic protocols, and analysis of the strength of evidence. If the cause cannot be identified with sufficient confidence, the reality of the effects is examined; if the effects are determined to be real, more information is obtained and the process is reiterated.

Partial regression method to fit a generalized additive model
ENVIRONMETRICS, Issue 6 2007
Shui He

Abstract Generalized additive models (GAMs) have been used as a standard analytic tool in studies of air pollution and health during the last decade. The air pollution measure is usually assumed to be linearly related to the health indicator, and the effects of other covariates are modeled through smooth functions. A major statistical concern is the appropriateness of fitting GAMs in the presence of concurvity. Generalized linear models (GLMs) with natural cubic splines as smoothers (GLM + NS) have been shown to perform better than GAMs with smoothing splines (GAM + S) with regard to bias and variance estimates using standard model-fitting methods. As nonparametric smoothers are attractive for their flexibility and easy implementation, a search for alternative methods to fit GAM + S is warranted. In this article, we propose a method using partial residuals to fit GAM + S and call it the "partial regression" method. Simulation results indicate better performance of the proposed method compared to the gam.exact function in S-plus, the standard tool in air pollution studies, with regard to bias and variance estimates. In addition, the proposed method is less sensitive to the degree of smoothing and accommodates asymmetric smoothers. Copyright © 2007 John Wiley & Sons, Ltd.
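The use of partial residuals to fit one term at a time can be sketched with a minimal backfitting-style loop on synthetic data. A crude boxcar smoother stands in for splines here; the paper's actual estimator and the gam.exact comparison are not reproduced:

```python
import numpy as np

def fit_partial_regression(x_lin, x_smooth, y, n_iter=20):
    """Fit y = beta * x_lin + f(x_smooth) + error by backfitting:
    each term is refitted to the partial residuals left by the other.
    A running-mean (boxcar) smoother stands in for a spline smoother."""
    def smooth(x, r, bandwidth=0.2):
        # crude boxcar smoother evaluated at each observed x
        return np.array([r[np.abs(x - xi) <= bandwidth].mean() for xi in x])

    f_hat = np.zeros_like(y)
    beta = 0.0
    for _ in range(n_iter):
        # linear coefficient from the partial residuals of the smooth term
        r_lin = y - f_hat
        beta = (x_lin @ r_lin) / (x_lin @ x_lin)
        # smooth term from the partial residuals of the linear term
        r_smooth = y - beta * x_lin
        f_hat = smooth(x_smooth, r_smooth)
        f_hat = f_hat - f_hat.mean()   # identifiability: centre the smooth
    return beta, f_hat
```

On data generated as y = 2·x1 + sin(πx2) with independent covariates, the loop recovers the linear coefficient near 2 and a smooth close to the sine term; concurvity (dependence between x1 and x2) is exactly the situation in which this simple scheme degrades.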
Sub-Optimality of Income Statement-Based Methods for Measuring Operational Risk under Basel II: Empirical Evidence from Spanish Banks
FINANCIAL MARKETS, INSTITUTIONS & INSTRUMENTS, Issue 4 2007
Enrique Bonsón

The New Basel Capital Accord (Basel II) was created with the intention of establishing a framework in which financial entities can manage their risks in a more detailed and efficient way. Within this general reform movement, Operational Risk (OR) emerges as a fundamental variable. OR can be managed by three alternative methods: the Basic Indicator Approach, the Standard Approach and the Advanced Measurement Approach. The choice of which method to adopt has become of supreme interest for senior banking managers. This study analyzes the exactitude of the underlying implicit hypotheses that support each method, distinguishing between income statement-based methods and the management accounting-based method. In the present study, the non-optimal character of the two income statement-based methods is empirically confirmed in the light of the data provided by Spanish financial entities.

Tracing energy flow in stream food webs using stable isotopes of hydrogen
FRESHWATER BIOLOGY, Issue 5 2010
JACQUES C. FINLAY

Summary
1. Use of the natural ratios of carbon and nitrogen stable isotopes as tracers of trophic interactions has some clear advantages over alternative methods for food web analyses, yet is limited to situations where the organic materials of interest have adequate isotopic separation between potential sources. This constrains the use of natural abundance stable isotope approaches to a subset of ecosystems with biogeochemical conditions favourable to source separation.
2. Recent studies suggest that stable hydrogen isotopes (δD) could provide a robust tracer to distinguish contributions of aquatic and terrestrial production in food webs, but variation in δD of consumers and their organic food sources is poorly known.
To explore the utility of the stable hydrogen isotope approach, we examined variation in δD in stream food webs in a forested catchment where variation in δ13C has been described previously. 3. Although algal δD varied by taxa and, to a small degree, between sites, we found consistent and clear separation (by an average of 67‰) from terrestrial carbon sources. Environmental conditions known to affect algal δ13C, such as water velocity and stream productivity, did not greatly influence algal δD, and there was no evidence of seasonal variation. In contrast, algal δ13C was strongly affected by environmental factors both within and across sites, was seasonally variable at all sites, and partially overlapped with terrestrial δ13C in all streams with catchment areas larger than 10 km². 4. While knowledge of isotopic exchange with water and trophic fractionation of δD for aquatic consumers is limited, consistent source separation in streams suggests that δD may provide a complementary food web tracer to δ13C in aquatic food webs. Lack of significant seasonal or spatial variation in δD is a distinct advantage over δ13C for applications in many aquatic ecosystems. [source]

Influence of Race on Household Residential Utility
GEOGRAPHICAL ANALYSIS, Issue 3 2000
M William Sermons

Residential location choice models are an important tool employed by urban geographers, planners, and transportation engineers for understanding household residential location behavior and for predicting future residential location activity. Racial segregation and residential racial preferences have been studied extensively using a variety of analysis techniques in social science research, but racial preferences have generally not been adequately incorporated into residential location choice models. This research develops residential location choice model specifications with a variety of alternative methods of addressing racial preferences in residential location decisions.
The research tests whether social class, family structure, and in-group racial preferences are sufficient to explain household sensitivity to neighborhood racial composition. The importance of the interaction between the proportion of in-group race neighbors and other-race neighbors is also evaluated. Models for the San Francisco Bay metropolitan area are estimated, and evidence of significant avoidance behavior by households of all races is found. The results suggest that social class differences, family structure differences, and in-group racial preferences alone are not sufficient to explain household residential racial preference and that households of all races practice racial avoidance behavior. Particularly pronounced avoidance of black neighbors by Asian households, of Hispanic neighbors by black households, and of Asian neighbors by white households is found. Evidence of a decrease in household racial avoidance intensity in neighborhoods with large numbers of own-race neighbors is also found. [source]

On the Application of Inductive Machine Learning Tools to Geographical Analysis
GEOGRAPHICAL ANALYSIS, Issue 2 2000
Mark Gahegan

Inductive machine learning tools, such as neural networks and decision trees, offer alternative methods for classification, clustering, and pattern recognition that can, in theory, extend to the complex or "deep" data sets that pervade geography. By contrast, traditional statistical approaches may fail, due to issues of scalability and flexibility. This paper discusses the role of inductive machine learning as it relates to geographical analysis.
The discussion presented is not based on comparative results or on mathematical description, but instead focuses on the often subtle ways in which the various inductive learning approaches differ operationally, describing (1) the manner in which the feature space is partitioned or clustered, (2) the search mechanisms employed to identify good solutions, and (3) the different biases that each technique imposes. The consequences arising from these issues, when considering complex geographic feature spaces, are then described in detail. The overall aim is to provide a foundation upon which reliable inductive analysis methods can be constructed, instead of depending on piecemeal or haphazard experimentation with the various operational criteria that inductive learning tools call for. Often, it would appear that these criteria are not well understood by practitioners in the geographic sphere, which can lead to difficulties in configuration and operation, and ultimately to poor performance. [source]

Rafts in oligodendrocytes: Evidence and structure-function relationship
GLIA, Issue 6 2006
Ellen Gielen

Abstract The plasma membrane of eukaryotic cells exhibits lateral inhomogeneities, mainly containing cholesterol and sphingomyelin, which provide liquid-ordered microdomains (lipid "rafts") that segregate membrane components. Rafts are thought to modulate the biological functions of molecules that become associated with them, and as such, they appear to be involved in a variety of processes, including signal transduction, membrane sorting, cell adhesion and pathogen entry. Although still a matter of ongoing debate, evidence in favor of the presence of these microdomains is gradually accumulating, but a consensus on issues like their size, lifetime, composition, and biological significance has yet to be reached.
Here, we provide an overview of the evidence supporting the presence of rafts in oligodendrocytes, the myelin-producing cells of the central nervous system, and discuss their functional significance. The myelin membrane differs fundamentally from the plasma membrane, both in lipid and protein composition. Moreover, since myelin membranes are unusually enriched in glycosphingolipids, questions concerning the biogenesis and functional relevance of microdomains appear of special interest in oligodendrocytes. The current picture of rafts in oligodendrocytes is mainly based on detergent methods. The robustness of such data is discussed, and alternative methods that may provide complementary data are indicated. © 2006 Wiley-Liss, Inc. [source]

Institutional Review Boards and Multisite Studies in Health Services Research: Is There a Better Way?
HEALTH SERVICES RESEARCH, Issue 1 2005
Jennifer L. Gold

Objective. The following paper examines whether the current system for ethics review of multisite health services research protocols is adequate, or whether there exist alternative methods that should be considered. Principal Findings. (1) Investigators at different sites in a multisite project often have very different experiences with respect to the requirements and requests of the review board. Other problems include the waste of time and resources spent on document preparation for review boards, and delays in the commencement of research activities. (2) There are several possible reasons for the variability in ethics review. These include the absence of standardized forms, differences in the background and experiences of board members, the influence of institutional or professional culture, and regional thinking. (3) The limited benefits derived from the variability in the recommendations of multiple boards, together with the numerous problems encountered in seeking ethics approval from multiple boards, suggest that some sort of reform is in order. Conclusions.
The increasing number of multisite health services research studies calls for a centralized system of ethics review. The local review model is simply not conducive to multisite studies and jeopardizes the integrity of the research process. Centralized multisite review boards, together with standardized documents and procedures, electronic access to documentation, and training for board members, are all possible solutions. Changes to the current system are necessary not only to facilitate the conduct of multisite research, but also to preserve the integrity of the ethics approval process in general. [source]

A Relational Approach to Measuring Competition Among Hospitals
HEALTH SERVICES RESEARCH, Issue 2 2002
Min-Woong Sohn

Objective. To present a new, relational approach to measuring competition in hospital markets and to compare this relational approach with alternative methods of measuring competition. Data Sources. The California Office of Statewide Health Planning and Development patient discharge abstracts and financial disclosure files for 1991. Study Design. Patient discharge abstracts for an entire year were used to derive patient flows, which were combined to calculate the extent of overlap in patient pools for each pair of hospitals. This produces a cross-sectional measure of market competition among hospitals. Principal Findings. The relational approach produces measures of competition between each and every pair of hospitals in the study sample, allowing us to examine a much more "local" as well as dyadic effect of competition. Preliminary analyses show the following: (1) Hospital markets are smaller than thought. (2) For-profit hospitals received considerably more competition from their neighbors than either nonprofit or government hospitals. (3) The size of a hospital does not matter in the amount of competition received, but larger hospitals generated significantly more competition than smaller ones.
Comparisons of this method to the other methods show considerable differences in identifying competitors, indicating that these methods are not as comparable as previously thought. Conclusion. The relational approach measures competition in a more detailed way and allows researchers to conduct more fine-grained analyses of market competition. This approach allows one to model market structure in a manner that goes far beyond the traditional categories of monopoly, oligopoly, and perfect competition. It also opens up an entirely new range of analytic possibilities in examining the effect of competition on hospital performance, the price of medical care, changes in the market, technology acquisition, and many other phenomena in the health care field. [source]

A cut-cell non-conforming Cartesian mesh method for compressible and incompressible flow
INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN ENGINEERING, Issue 11 2007
J. Pattinson

Abstract This paper details a multigrid-accelerated cut-cell non-conforming Cartesian mesh methodology for the modelling of inviscid compressible and incompressible flow. This is done via a single equation set that describes sub-, trans-, and supersonic flows. Cut-cell technology is developed to furnish body-fitted meshes, with an overlapping mesh as starting point, in a manner which is insensitive to surface definition inconsistencies. Spatial discretization is effected via an edge-based vertex-centred finite volume method. An alternative dual-mesh construction strategy, similar to the cell-centred method, is developed. Incompressibility is dealt with via an artificial compressibility algorithm, and stabilization is achieved with artificial dissipation. In compressible flow, shocks are captured via pressure-switch-activated upwinding. The solution process is accelerated with full approximation storage (FAS) multigrid, where coarse meshes are generated automatically via a volume agglomeration methodology.
This is the first time that the proposed discretization and solution methods have been employed to solve a single compressible-incompressible equation set on cut-cell Cartesian meshes. The developed technology is validated by numerical experiments. The standard discretization and the alternative methods were found equivalent in accuracy and computational cost. The multigrid implementation achieved decreases in CPU time of up to one order of magnitude. Copyright © 2007 John Wiley & Sons, Ltd. [source]

Moving element method for train-track dynamics
INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN ENGINEERING, Issue 11 2003
C. G. Koh

Abstract This paper presents a new approach, called the moving element method, for the dynamic analysis of train-track systems. By discretizing the rail beam on a viscoelastic foundation into elements that 'flow' with the moving vehicle, the proposed method eliminates the need to keep track of the vehicle position with respect to the track model. The governing equations are formulated in a co-ordinate system travelling at a constant velocity, and a class of conceptual elements (as opposed to physical elements) is derived for the rail beams. In the numerical study, four cases of moving vehicles are presented, taking into consideration the effects of moving load and rail corrugation. The method is shown to work for varying vehicle velocity and multiple contact points, and it has several advantages over the finite element method. The numerical solutions compare favourably with results obtained by alternative methods. Copyright © 2003 John Wiley & Sons, Ltd. [source]
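The moving element formulation itself cannot be reconstructed from the abstract. As a point of reference, the quasi-static limit of the moving-load problem, an infinite Euler-Bernoulli beam on a Winkler (elastic) foundation under a point load P, has the classical closed-form deflection w(x) = (Pλ/2k) e^(-λ|x|)(cos λ|x| + sin λ|x|) with λ = (k/4EI)^(1/4). The sketch below evaluates it with illustrative rail parameters, none of which come from the paper.

```python
import numpy as np

# Illustrative parameters (not from the paper)
E = 210e9      # Young's modulus of rail steel [Pa]
I = 3.0e-5     # second moment of area of the rail [m^4]
k = 5.0e7      # Winkler foundation modulus [N/m^2]
P = 1.0e5      # static wheel load [N]

lam = (k / (4.0 * E * I)) ** 0.25   # characteristic wavenumber [1/m]
x = np.linspace(-5.0, 5.0, 1001)    # distance from the load [m]
ax = np.abs(x)
w = (P * lam / (2.0 * k)) * np.exp(-lam * ax) * (np.cos(lam * ax) + np.sin(lam * ax))

# Peak deflection occurs directly under the load: w(0) = P*lam/(2*k)
print(f"max deflection {w.max() * 1e3:.2f} mm at x = {x[np.argmax(w)]:.1f} m")
```

In the low-speed limit, a steady-state moving-load solution approaches this static profile translated with the vehicle; the value of formulations like the moving element method is in handling dynamics, varying velocity, and multiple contact points, which this closed form cannot.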