Wide Applicability
Selected Abstracts

Abductive Diagnosis Using Time-Objects: Criteria for the Evaluation of Solutions
COMPUTATIONAL INTELLIGENCE, Issue 1 2001
Elpida T. Keravnou

Diagnostic problem solving aims to account for, or explain, a malfunction of a system (human or other). Any plausible potential diagnostic solution must satisfy some minimum criteria relevant to the application. Often there will be several plausible solutions, and further criteria will be required to select the "best" explanation. Expert diagnosticians may employ different, complex criteria at different stages of their reasoning. These criteria may be combinations of more primitive criteria, which therefore should be represented separately and explicitly to permit their flexible and transparent combined use. In diagnostic reasoning there is a tight coupling between the formation of potential solutions and their evaluation; this is the essence of abductive reasoning. This article presents an abductive framework for diagnostic problem solving. Time-objects, each an association of a property with an existence, are used as the representation formalism, and a number of primitive, general evaluation criteria that integrate time are defined. Each criterion provides an intuitive yardstick for evaluating the space of potential solutions. The criteria can be combined as appropriate for particular applications to define plausible and best explanations. The central principle is that when time is diagnostically significant, it should be modeled explicitly to enable a more accurate formulation and evaluation of diagnostic solutions. The integration of time and primitive evaluation criteria is illustrated through the Skeletal Dysplasias Diagnostician (SDD) system, a diagnostic expert system for a real-life medical domain. SDD's notions of plausible and best explanation are reviewed to show the difficulties in formalizing such notions.
Although we illustrate our work with medical problems, it has been motivated by consideration of problems in a number of other domains (fermentation monitoring, air and ground traffic control, power distribution) and is intended to be of wide applicability. [source]

Model updating using noisy response measurements without knowledge of the input spectrum
EARTHQUAKE ENGINEERING AND STRUCTURAL DYNAMICS, Issue 2 2005
Ka-Veng Yuen

A new probabilistic model identification methodology is proposed using measured response time histories only. The proposed approach requires that the number of independent measurements be larger than the number of independent excitations. Under this condition, no input measurements or any information regarding the stochastic model of the input is required. Specifically, the method does not require the response to be stationary and does not assume any knowledge of the parametric form of the spectral density of the input. Therefore, the method has very wide applicability. The proposed approach allows one to obtain not only the most probable values of the updated model parameters but also their associated uncertainties, using only one set of response data. It is found that the updated probability distribution can be well approximated by a Gaussian distribution centered at the most probable values of the parameters. Examples are presented to illustrate the proposed method. Copyright © 2004 John Wiley & Sons, Ltd. [source]

Source-based morphometry: The use of independent component analysis to identify gray matter differences with application to schizophrenia
HUMAN BRAIN MAPPING, Issue 3 2009
Lai Xu

We present a multivariate alternative to the voxel-based morphometry (VBM) approach, called source-based morphometry (SBM), for studying gray matter differences between patients and healthy controls. The SBM approach begins with the same preprocessing procedures as VBM.
Next, independent component analysis is used to identify naturally grouping, maximally independent sources. Finally, statistical analyses are used to determine the significant sources and their relationship to other variables. The identified "source networks," groups of spatially distinct regions with common covariation among subjects, provide information about the localization of gray matter changes and their variation among individuals. In this study, we first compared VBM and SBM via a simulation and then applied both methods to real data obtained from 120 chronic schizophrenia patients and 120 healthy controls. SBM identified five gray matter sources as significantly associated with schizophrenia. These included sources in the bilateral temporal lobes, thalamus, basal ganglia, parietal lobe, and frontotemporal regions. None of these showed an effect of sex. Two sources, in the bilateral temporal and parietal lobes, showed age-related reductions. The most significant source of schizophrenia-related gray matter changes identified by SBM occurred in the bilateral temporal lobe, while the most significant change found by VBM occurred in the thalamus. The SBM approach found changes not identified by VBM in the basal ganglia and the parietal and occipital lobes. These findings show that SBM is a multivariate alternative to VBM, with wide applicability to studying changes in brain structure. Hum Brain Mapp, 2009. © 2008 Wiley-Liss, Inc. [source]

Shearing flows of a dry granular material – hypoplastic constitutive theory and numerical simulations
INTERNATIONAL JOURNAL FOR NUMERICAL AND ANALYTICAL METHODS IN GEOMECHANICS, Issue 14 2006
Chung Fang

In the present study, the Goodman–Cowin theory is extended to incorporate plastic features, constructing an elasto-visco-plastic constitutive model for flowing dry granular materials. A thermodynamic analysis, based on the Müller–Liu entropy principle, is performed to derive the equilibrium expressions of the constitutive variables.
Non-equilibrium responses are proposed by use of a quasi-linear theory; in particular, a hypoplastic-type relation is introduced to model the internal friction and plastic effects. It is illustrated that the Goodman–Cowin theory can appropriately be extended to include frictional effects in the evolution equation of the volume fraction (i.e. the so-called balance of equilibrated force) and in the equilibrium expression of the Cauchy stress tensor. The implemented model is applied to investigate conventional steady isothermal granular flows with incompressible grains, namely simple plane shear, inclined gravity-driven, and vertical channel flows, respectively. Numerical results show that the hypoplastic effect plays a significant role in the behaviour of a flowing granular material. The obtained profiles of the velocity and the volume fraction with hypoplastic features are usually sharper, and the shear-thinning effect is more significant, than without such plastic effects. This points to the possible wide applicability of the present model in the fields of granular materials and soil mechanics. In addition, the present paper provides a framework within which the hypoplastic theories can be further extended. Copyright © 2006 John Wiley & Sons, Ltd. [source]

Z+ fading memory and extensions of input–output maps
INTERNATIONAL JOURNAL OF CIRCUIT THEORY AND APPLICATIONS, Issue 4 2001
Irwin W. Sandberg

An Erratum for this article has been published in the International Journal of Circuit Theory and Applications 30(4) 2002, 179. Much is known about time-invariant non-linear systems with inputs and outputs defined on Z+ that possess approximately-finite memory. For example, under mild additional conditions, they can be approximated arbitrarily well by the maps of certain interesting simple structures.
An important fact that gives meaning to results concerning such systems is that the approximately-finite-memory condition is known to be often met. Here we consider the known proposition that if a causal time-invariant discrete-time input–output map H has fading memory on a set of bounded functions defined on all of the integers Z, then H can be approximated arbitrarily well by a finite Volterra series operator. We show that in a certain sense, involving the existence of extensions of system maps, this result too has wide applicability. Copyright © 2001 John Wiley & Sons, Ltd. [source]

Karl Pearson: The Scientific Life in a Statistical Age by Theodore M. Porter – A Review
INTERNATIONAL STATISTICAL REVIEW, Issue 1 2009
Herbert A. David

Porter presents an excellent account of the young Karl Pearson and his extraordinarily varied activities. These ranged from the Cambridge Mathematical Tripos exams to German history and folklore, and included free thought, socialism, the woman's question, and the law. Returning to science, Pearson produced the famous Grammar of Science. He decided on a career in statistics only at age 35. Porter emphasizes Pearson's often acrimonious but largely successful battles to show the wide applicability and importance of statistics in many areas of science and public affairs. Eugenics became a passion for Pearson. Avoiding all formulas, however, Porter fails to give any concrete idea of even Pearson's most important contributions to statistical theory. We try to sketch these here. [source]

Time-dependent density functional theory for nonadiabatic processes
ISRAEL JOURNAL OF CHEMISTRY, Issue 1-2 2005
Roi Baer

Time-dependent density functional theory (TDDFT) is a general and robust method allowing the study of electron dynamics, whether induced by nuclear motion or by external fields. We give a brief overview of the theory and some numerical methods, together with recent applications stressing the generality and wide applicability of the method.
We also discuss recent attempts to extend the present TDDFT by incorporating memory terms into the exchange–correlation potentials. [source]

Darwinism – a new paradigm for organizational behavior?
JOURNAL OF ORGANIZATIONAL BEHAVIOR, Issue 2 2006
Nigel Nicholson

The Special Issue reflects a growing interest in Darwinian ideas and their increasing application to work and organizational issues; it analyzes factors that have impeded the paradigm's adoption and considers the prospects for future growth. After a brief introduction to key concepts in the new Darwinism, some of its history and controversies are traced. Causes of the particularly slow uptake of the paradigm in Organizational Behavior (OB) are discussed, as well as some of the common misconceptions and incorrect attributions that have been leveled at evolutionary theory. The paper then overviews the scope and contents of the Special Issue (SI) papers, and concludes by considering future prospects for the field. The authors argue that the paradigm has compelling significance and wide applicability to the full range of OB topics and interests. Copyright © 2006 John Wiley & Sons, Ltd. [source]

Bayesian change-point analysis for atomic force microscopy and soft material indentation
JOURNAL OF THE ROYAL STATISTICAL SOCIETY: SERIES C (APPLIED STATISTICS), Issue 4 2010
Daniel Rudoy

Material indentation studies, in which a probe is brought into controlled physical contact with an experimental sample, have long been a primary means by which scientists characterize the mechanical properties of materials. More recently, the advent of atomic force microscopy, which operates on the same fundamental principle, has in turn revolutionized the nanoscale analysis of soft biomaterials such as cells and tissues.
The paper addresses the inferential problems associated with material indentation and atomic force microscopy through a framework for the change-point analysis of pre-contact and post-contact data that is applicable to experiments across a variety of physical scales. A hierarchical Bayesian model is proposed to account for experimentally observed change-point smoothness constraints and measurement error variability, with efficient Monte Carlo methods developed and employed to realize inference via posterior sampling for parameters such as Young's modulus, a key quantifier of material stiffness. These results are the first to provide the materials science community with rigorous inference procedures and quantification of uncertainty, via optimized and fully automated high-throughput algorithms, implemented as the publicly available software package BayesCP. To demonstrate the consistent accuracy and wide applicability of this approach, results are shown for a variety of data sets from both macromaterials and micromaterials experiments (including silicone, neurons, and red blood cells) conducted by the authors and others.
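A minimal numerical illustration of the pre-contact/post-contact change-point idea can be given. The sketch below is not the authors' hierarchical model or their BayesCP package; it computes a simple discrete posterior over the contact point under assumed Gaussian noise, a flat prior on the change-point location, a constant pre-contact segment, and a straight-line post-contact segment (all simplifying assumptions, with illustrative sample size, slope, and noise level):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic indentation-style trace: a flat pre-contact baseline followed by
# a linear post-contact rise, with Gaussian measurement noise.
n, cp_true = 200, 120
z = np.arange(n, dtype=float)
force = np.where(z < cp_true, 0.0, 0.2 * (z - cp_true))
force += rng.normal(0.0, 0.05, size=n)

# Discrete posterior over the change point: for each candidate tau, fit a
# constant pre-contact segment and a straight-line post-contact segment,
# and score the fit with a profiled Gaussian log-likelihood (flat prior
# over tau, so the posterior is the normalized likelihood).
log_like = np.full(n, -np.inf)
for tau in range(10, n - 10):
    pre, post = force[:tau], force[tau:]
    resid_pre = pre - pre.mean()
    design = np.column_stack([np.ones(n - tau), z[tau:] - z[tau]])
    coef, *_ = np.linalg.lstsq(design, post, rcond=None)
    resid_post = post - design @ coef
    rss = resid_pre @ resid_pre + resid_post @ resid_post
    log_like[tau] = -0.5 * n * np.log(rss / n)  # noise variance profiled out

post_prob = np.exp(log_like - log_like.max())
post_prob /= post_prob.sum()
cp_map = int(post_prob.argmax())
print("MAP change point:", cp_map)
```

A full treatment along the lines of the paper would additionally place priors on the segment parameters and enforce smoothness at the contact point, and would sample the posterior by Monte Carlo rather than enumerating it.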
[source]

A self-consistent scattering model for cirrus. I: The solar region
THE QUARTERLY JOURNAL OF THE ROYAL METEOROLOGICAL SOCIETY, Issue 629 2007

In this paper a self-consistent scattering model for cirrus is presented. The model consists of an ensemble of ice crystals, where the smallest ice crystal is represented by a single hexagonal ice column. As the overall ice crystal size increases, the ice crystals become progressively more complex through the arbitrary attachment of other hexagonal elements, until a chain-like ice crystal is formed, representing the largest ice crystal in the ensemble. The ensemble consists of six ice crystal members whose aspect ratios (ratios of the major to minor axes of the circumscribed ellipse) are allowed to vary between unity and 1.84 for the smallest and largest ice crystals, respectively. The ensemble model's prediction of parameters fundamental to solar radiative transfer through cirrus, such as ice water content and the volume extinction coefficient, is tested using in situ data obtained from the midlatitudes and the Tropics. It is found that the ensemble model is generally able to predict the ice water content and extinction measurements within a factor of two. Moreover, the ensemble model's predictions of cirrus spherical albedo and polarized reflection are tested against a space-based instrument using one day of global measurements. The space-based instrument is able to sample the scattering phase function between scattering angles of approximately 60° and 180°, and a total of 37 581 satellite pixels were used in the present analysis, covering latitude bands between 43.75°S and 76.58°N. It is found that the ensemble model phase function is able to significantly reduce differences between satellite-based measurements of spherical albedo and the ensemble model's predictions of spherical albedo.
The satellite-based measurements of polarized reflection are found to be reasonably described by the simpler members of the ensemble. The ensemble model presented in this paper should find wide applicability in the remote sensing of cirrus, in more fundamental solar radiative transfer calculations through cirrus, and in providing improved solar optical properties for climate and Numerical Weather Prediction models. Copyright © 2007 Royal Meteorological Society [source]

Fitting Semiparametric Additive Hazards Models using Standard Statistical Software
BIOMETRICAL JOURNAL, Issue 5 2007
Douglas E. Schaubel

The Cox proportional hazards model has become the standard in biomedical studies, particularly for settings in which the estimation of covariate effects (as opposed to prediction) is the primary objective. In spite of the obvious flexibility of this approach and its wide applicability, the model is not usually chosen for its fit to the data, but by convention and for reasons of convenience. It is quite possible that the covariates add to, rather than multiply, the baseline hazard, making an additive hazards model a more suitable choice. Typically, proportionality is assumed, with the potential for additive covariate effects not evaluated or even seriously considered. Contributing to this phenomenon is the fact that many popular software packages (e.g., SAS, S-PLUS/R) have standard procedures to fit the Cox model (e.g., proc phreg, coxph), but as yet no analogous procedures to fit its additive analog, the Lin and Ying (1994) semiparametric additive hazards model. In this article, we establish the connections between the Lin and Ying (1994) model and both Cox and least squares regression. We demonstrate how SAS's phreg and reg procedures may be used to fit the additive hazards model, after some straightforward data manipulations.
We then apply the additive hazards model to examine the relationship between Model for End-stage Liver Disease (MELD) score and mortality among patients wait-listed for liver transplantation. (© 2007 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source]
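The Lin and Ying (1994) estimator has a closed form, which makes the least-squares connection above easy to illustrate. The following Python sketch is not the SAS phreg/reg recipe described in the article; it is a direct, illustrative implementation of the closed-form estimator on synthetic data, under assumed conditions (constant baseline hazard, one binary covariate, independent exponential censoring, no tied event times):

```python
import numpy as np

def lin_ying(time, event, Z):
    """Closed-form Lin-Ying additive hazards estimate (illustrative sketch,
    no tie handling and no variance estimate)."""
    time = np.asarray(time, dtype=float)
    event = np.asarray(event, dtype=bool)
    Z = np.asarray(Z, dtype=float)
    order = np.argsort(time)
    time, event, Z = time[order], event[order], Z[order]
    n, p = Z.shape
    A = np.zeros((p, p))   # covariate deviations integrated over risk sets
    b = np.zeros(p)        # covariate deviations summed at event times
    prev = 0.0
    for i in range(n):
        risk = Z[i:]                   # subjects still at risk just before time[i]
        zbar = risk.mean(axis=0)
        dev = risk - zbar
        A += dev.T @ dev * (time[i] - prev)
        if event[i]:
            b += Z[i] - zbar
        prev = time[i]
    return np.linalg.solve(A, b)

# Synthetic data from an additive hazard lambda(t | Z) = 1.0 + 0.5 * Z with
# a single binary covariate and independent exponential censoring.
rng = np.random.default_rng(2)
n, beta_true = 2000, 0.5
Z = rng.binomial(1, 0.5, size=(n, 1)).astype(float)
T = rng.exponential(1.0 / (1.0 + beta_true * Z[:, 0]))
C = rng.exponential(1.0, size=n)
time, event = np.minimum(T, C), T <= C

beta_hat = lin_ying(time, event, Z)
print("estimated beta:", beta_hat[0])
```

The article's point is that this same estimate can be recovered from standard Cox and least-squares procedures after suitable data manipulation; the sketch above simply evaluates the estimator directly.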