Serious Drawback



Selected Abstracts


CE frontal analysis based on simultaneous UV and contactless conductivity detection: A general setup for studying noncovalent interactions

ELECTROPHORESIS, Issue 3 2007
Dr. Henrik Jensen
Abstract CE frontal analysis (CE-FA) has been established as a powerful tool to study noncovalent interactions between macromolecules and small molecules such as drug substances or pharmaceutical excipients. However, when using traditional commercial CE instrumentation, a serious drawback is that only UV-active compounds can be studied. In recent years, contactless conductivity detection has become an attractive alternative to UV detection in CE due to its high versatility. In this study, we combine contactless conductivity detection and UV detection in a highly versatile setup for profiling noncovalent interactions between low-molecular-weight molecules and macromolecules. In the case of molecules having a chromophore, the setup allows determination of binding constants using two independent detectors. The new contactless conductivity detection cell is compatible with commercial CE instrumentation and is therefore easily implemented in any analysis laboratory with CE expertise. [source]
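In CE frontal analysis, the plateau height of the free-ligand zone gives the unbound ligand concentration, from which a binding constant follows by mass balance. A minimal sketch for a 1:1 complex, assuming a simple association model; the function name and the concentrations are illustrative, not taken from the paper:

```python
# Hypothetical sketch: 1:1 binding constant from a CE-FA free-ligand measurement.

def binding_constant(ligand_total, protein_total, ligand_free):
    """Association constant K = [PL] / ([P][L]) for a 1:1 complex, in 1/M."""
    complex_conc = ligand_total - ligand_free      # [PL] by ligand mass balance
    protein_free = protein_total - complex_conc    # [P] by protein mass balance
    return complex_conc / (protein_free * ligand_free)

# Example: 100 uM total drug, 50 uM protein, 80 uM free drug measured at the plateau
K = binding_constant(100e-6, 50e-6, 80e-6)  # ~8.3e3 1/M
```

The same calculation applies to either detector's plateau reading, which is what makes the dual UV/conductivity setup a consistency check on the fitted constant.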


Teaching Foreign Policy with Memoirs

INTERNATIONAL STUDIES PERSPECTIVES, Issue 2 2002
Terry L. Deibel
Excerpts from the memoirs of high foreign policy officials, if carefully selected and structured, can be a valuable resource in the teaching of diplomatic history, American foreign policy, and international relations. Two decades of teaching a memoirs-only course to mid-career military officers and foreign affairs professionals in a seminar discussion format reveal many of their advantages. Memoirs make for interesting reading that rarely fails to engage a reader's attention; they impart detailed knowledge of historical events; they provide a rich understanding of process and the neglected area of policy implementation; like case studies, they let students build vicarious experience in policymaking and execution; and they often provide what Alexander George called "policy-relevant generalizations." While lack of objectivity can be a serious drawback of first-person accounts, it provides its own lessons on the nature of history and can be offset by using multiple accounts of the same events and by combining memoirs with documents, historical works, or countervailing analytical studies. Although picking the most interesting and worthwhile excerpts, getting them into students' hands, and accommodating their length within the boundaries of a standard college course are additional challenges, professors who take them on should find that memoirs add a new level of excitement and realism to their courses. [source]


Optimizing the tuning parameters of least squares support vector machines regression for NIR spectra

JOURNAL OF CHEMOMETRICS, Issue 5 2006
T. Coen
Abstract Partial least squares (PLS) is one of the most widely used tools in chemometrics. Other data analysis techniques, such as artificial neural networks and least squares support vector machines (LS-SVMs), have however made their entry into the field of chemometrics. These techniques can also model nonlinear relations, but the presence of tuning parameters is a serious drawback. These parameters balance the risk of overfitting against the ability to model the underlying nonlinear relation. In this work, a methodology is proposed to initialize and optimize those tuning parameters for LS-SVMs with a radial basis function (RBF) kernel, based on a statistical interpretation. In this way, these methods become much more appealing for new users. The presented methods are applied to manure spectra. Although this dataset is only slightly nonlinear, good results were obtained. Copyright © 2007 John Wiley & Sons, Ltd. [source]
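For readers unfamiliar with the method, the following is a minimal sketch of LS-SVM regression with an RBF kernel, making explicit the two tuning parameters the abstract refers to: the regularization parameter gamma and the kernel width sigma. This is the generic textbook formulation (the dual linear system), not the authors' implementation:

```python
import numpy as np

def rbf_kernel(X1, X2, sigma):
    """Gaussian RBF kernel matrix; sigma is the kernel width to be tuned."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma**2))

def lssvm_fit(X, y, gamma, sigma):
    """Solve the LS-SVM dual system; gamma is the regularization parameter.
    Returns the bias b and the support values alpha."""
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf_kernel(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]

def lssvm_predict(X_train, b, alpha, sigma, X_new):
    return rbf_kernel(X_new, X_train, sigma) @ alpha + b
```

Large gamma lets the model fit the training data closely (risking overfitting); small gamma smooths it, which is exactly the trade-off the proposed initialization procedure addresses.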


Use of dispersal-vicariance analysis in biogeography – a critique

JOURNAL OF BIOGEOGRAPHY, Issue 1 2010
Ullasa Kodandaramaiah
Abstract Aim: Analytical methods are commonly used to identify historical processes of vicariance and dispersal in the evolution of taxa. Currently, dispersal-vicariance analysis implemented in the software diva is the most widely used method. Despite some recognized shortcomings of the method, it has been treated as error-free in many cases and used extensively as the sole method to reconstruct histories of taxa. In light of this, an evaluation of the limitations of the method is needed, especially in relation to several newer alternatives. Methods: In an approach similar to simulation studies in phylogenetics, I use hypothetical taxa evolving in specific geological scenarios and test how well diva reconstructs their histories. Results: diva reconstructs histories accurately when evolution has been simple, that is, where speciation is driven mainly by vicariance. Ancestral areas are wrongly identified under several conditions, including complex patterns of dispersals and within-area speciation events. Several potentially serious drawbacks in using diva for inferences in biogeography are discussed. These include the inability to distinguish between contiguous range expansions and across-barrier dispersals, a low probability of invoking extinctions, incorrect constraints set on the maximum number of areas by the user, and analysing the ingroup taxa without sister groups. Main conclusions: Most problems with inferences based on diva are linked to the inflexibility and simplicity of the assumptions used in the method. These are frequently invalid, resulting in spurious reconstructions. I argue that it might be dangerous to rely solely on diva optimization to infer the history of a group. I also argue that diva is not ideally suited to distinguishing between dispersal and vicariance because it cannot a priori take into account the age of divergences relative to the timing of barrier formation. I suggest that other alternative methods can be used to corroborate the findings in diva, increasing the robustness of biogeographic hypotheses. I compare some important alternatives and conclude that model-based approaches are promising. [source]


Transient behavior of time-between-failures of complex repairable systems

QUALITY AND RELIABILITY ENGINEERING INTERNATIONAL, Issue 4 2002
J. Bert Keats
Abstract It is well known that for complex repairable systems (with as few as four components), regardless of the time-to-failure (TTF) distribution of each component, the time-between-failures (TBF) tends toward the exponential. This is a long-term or 'steady-state' property. Aware of this property, many of those modeling such systems tend to base spares provisioning, maintenance personnel availability and other decisions on an exponential TBF distribution. Such a policy may suffer serious drawbacks. A non-homogeneous Poisson process (NHPP) accounts for these intervals for some time prior to 'steady-state'. Using computer simulation, the nature of transient TBF behavior is examined. The number of system failures until the exponential TBF assumption is valid is of particular interest. We show, using a number of system configurations and failure and repair distributions, that the transient behavior quickly drives the TBF distribution to the exponential. We feel comfortable with achieving exponential results for the TBF after 30 system failures. This number may be smaller for configurations with more components. However, at this point, we recommend 30 as the system-failure threshold for using the exponential assumption. Copyright © 2002 John Wiley & Sons, Ltd. [source]
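The convergence described above is easy to reproduce in a short simulation: superposing the failure streams of a few components with non-exponential TTF distributions yields system-level TBF values whose coefficient of variation sits near 1, the exponential's signature. A hypothetical sketch, assuming identical Weibull components and instantaneous repair (the parameters are illustrative, not taken from the paper's simulations):

```python
import random

def simulate_system_tbf(n_components=4, n_failures=2000, seed=1):
    """Superpose the failure streams of identical components with
    Weibull(scale=1, shape=2) time-to-failure and instantaneous repair;
    return the system-level times between successive failures."""
    rng = random.Random(seed)
    # next scheduled failure time of each component, starting new at t = 0
    next_fail = [rng.weibullvariate(1.0, 2.0) for _ in range(n_components)]
    tbf, last = [], 0.0
    for _ in range(n_failures):
        i = min(range(n_components), key=lambda j: next_fail[j])
        tbf.append(next_fail[i] - last)
        last = next_fail[i]
        next_fail[i] += rng.weibullvariate(1.0, 2.0)  # component restored as new
    return tbf

def cv(xs):
    """Coefficient of variation; equals 1 for an exponential distribution."""
    m = sum(xs) / len(xs)
    return (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5 / m
```

A single Weibull(shape=2) component has an interval CV of about 0.52, clearly non-exponential; the superposed system stream should land noticeably closer to 1, illustrating why the exponential assumption becomes reasonable once enough failures have accumulated.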