Many Situations (many + situation)


Selected Abstracts


The role of reactive oxygen species and nitric oxide in mast cell-dependent inflammatory processes

IMMUNOLOGICAL REVIEWS, Issue 1 2007
Emily J. Swindle
Summary: Reactive oxygen species (ROS) and reactive nitrogen oxide species (RNOS), including nitric oxide, are produced in cells by a variety of enzymatic and non-enzymatic mechanisms. At high levels, both types of oxidants are used to kill ingested organisms within phagocytes. At low levels, RNOS may diffuse outside cells, where they affect the vasculature and nervous system. Recent evidence suggests that low levels of ROS produced within cells are involved in cell signaling. Alongside these physiological roles, there are many pathological conditions in which detrimental high levels of ROS and RNOS are produced. Many situations in which ROS/RNOS are implicated also involve mast cell activation. In innate immunity, mast cells are involved in the immune response to pathogens. In acquired immunity, activation of mast cells by cross-linking of receptor-bound immunoglobulin E causes the release of mediators involved in the allergic inflammatory response. In this review, we describe the principal pathways for ROS and RNOS generation by cells and discuss the existence of such pathways in mast cells. In addition, we examine the evidence for a functional role for ROS and RNOS in mast cell secretory responses and discuss evidence for a direct relationship between ROS, RNOS, and mast cells in mast cell-dependent inflammatory conditions. [source]


Deadlock detection in MPI programs

CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 11 2002
Glenn R. Luecke
Abstract The Message-Passing Interface (MPI) is commonly used to write parallel programs for distributed memory parallel computers. MPI-CHECK is a tool developed to aid in the debugging of MPI programs that are written in free or fixed format Fortran 90 and Fortran 77. This paper presents the methods used in MPI-CHECK 2.0 to detect many situations where actual and potential deadlocks occur when using blocking and non-blocking point-to-point routines as well as when using collective routines. Copyright © 2002 John Wiley & Sons, Ltd. [source]
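
By way of illustration, the sketch below (written with mpi4py rather than the Fortran targeted by MPI-CHECK, and not taken from the paper) shows the classic head-to-head blocking-send pattern that such tools classify as a potential deadlock: it may complete if the MPI implementation buffers the messages, and hang otherwise.

```python
# Hypothetical two-rank example of a potential deadlock with blocking
# point-to-point calls; run with: mpiexec -n 2 python deadlock_demo.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
peer = 1 - rank            # assumes exactly two ranks

# Both ranks send first and receive second.  Whether this completes depends
# on whether the MPI implementation buffers the message -- the "potential
# deadlock" case that deadlock-detection tools are designed to flag.
comm.send({"payload": rank}, dest=peer, tag=0)
incoming = comm.recv(source=peer, tag=0)
print(f"rank {rank} received {incoming}")

# A safe ordering: even ranks send then receive, odd ranks receive then send,
# or use non-blocking isend/irecv and wait on the requests.
```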


Onychomycosis: diagnosis and topical therapy

DERMATOLOGIC THERAPY, Issue 2 2002
Philip Fleckman
Onychomycosis (true fungal infection of the nail plate) is a common malady that may present in several clinical patterns. Because many noninfectious disorders of the nail may masquerade as onychomycosis, the clinical diagnosis must be confirmed by wet mount (potassium hydroxide [KOH] examination), culture, or histology before treatment is begun. Although systemic therapy of onychomycosis with the newer drugs is more effective, the prospect of effective topical therapy is a welcome alternative in many situations. Choices for topical therapy are limited in the United States at this time. As new, improved choices are added to the therapeutic armamentarium, topical therapy may supersede systemic therapy. In addition, the potential for synergism with systemic therapy and for prophylaxis of cleared infections is only now being explored. [source]


Measurement and data analysis methods for field-scale wind erosion studies and model validation,

EARTH SURFACE PROCESSES AND LANDFORMS, Issue 11 2003
Ted M. Zobeck
Abstract Accurate and reliable methods of measuring windblown sediment are needed to confirm, validate, and improve erosion models, assess the intensity of aeolian processes and related damage, determine the source of pollutants, and for other applications. This paper outlines important principles to consider in conducting field-scale wind erosion studies and proposes strategies of field data collection for use in model validation and development. Detailed discussions include consideration of field characteristics, sediment sampling, and meteorological stations. The field shape used in field-scale wind erosion research is generally a matter of preference and in many studies may not have practical significance. Maintaining a clear non-erodible boundary is necessary to accurately determine erosion fetch distance. A field length of about 300 m may be needed in many situations to approach transport capacity for saltation flux in bare agricultural fields. Field surface conditions affect the wind profile and other processes such as sediment emission, transport, and deposition and soil erodibility. Knowledge of the temporal variation in surface conditions is necessary to understand aeolian processes. Temporal soil properties that impact aeolian processes include surface roughness, dry aggregate size distribution, dry aggregate stability, and crust characteristics. Use of a portable 2 m tall anemometer tower should be considered to quantify variability of friction velocity and aerodynamic roughness caused by surface conditions in field-scale studies. The types of samplers used for sampling aeolian sediment will vary depending upon the type of sediment to be measured. The Big Spring Number Eight (BSNE) and Modified Wilson and Cooke (MWAC) samplers appear to be the most popular for field studies of saltation. Suspension flux may be measured with commercially available instruments after modifications are made to ensure isokinetic conditions at high wind speeds. Meteorological measurements should include wind speed and direction, air temperature, solar radiation, relative humidity, rain amount, soil temperature and moisture. Careful consideration of the climatic, sediment, and soil surface characteristics observed in future field-scale wind erosion studies will ensure maximum use of the data collected. Copyright © 2003 John Wiley & Sons, Ltd. [source]


The Effects of Random and Discrete Sampling when Estimating Continuous-Time Diffusions

ECONOMETRICA, Issue 2 2003
Yacine Aït-Sahalia
High-frequency financial data are not only discretely sampled in time but the time separating successive observations is often random. We analyze the consequences of this dual feature of the data when estimating a continuous-time model. In particular, we measure the additional effects of the randomness of the sampling intervals over and beyond those due to the discreteness of the data. We also examine the effect of simply ignoring the sampling randomness. We find that in many situations the randomness of the sampling has a larger impact than the discreteness of the data. [source]


Stochastic modelling of global solar radiation measured in the state of Kuwait

ENVIRONMETRICS, Issue 7 2002
S. A. Al-Awadhi
Abstract Two stochastic models that capture the main features of daily global solar radiation exposure in Kuwait are proposed. The development of these models is based on removing the annual periodicity and seasonal variation of solar radiation. Thus the daily radiation is decomposed as the sum of a trend component and a stochastic component. In many situations there are dramatic changes in the radiation series through the year due to weather conditions, as is the case for the data from Kuwait. This would affect the accuracy of the model, and therefore the series is divided into two regimes: one corresponding to clear days, where the global radiation takes normal values, and the other to non-clear days, where the global radiation is very low. The trend component is then expressed as a Fourier series taking into account such apparent breaks in the series. The stochastic component is first tested for linearity and Gaussianity and it is found that it does not satisfy these assumptions. Therefore, a linear time series model (ARMA modelling) may not be adequate and, to overcome this problem, a bilinear time series is used to model the stochastic component of daily global radiation in Kuwait. The method proposed considers first fitting an AR model to the data and then seeing whether a further reduction in the mean sum of squares can be achieved by introducing extra bilinear terms. The Akaike Information Criterion (AIC) is used to select the best model. Copyright © 2002 John Wiley & Sons, Ltd. [source]
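
As a rough illustration of the model-building loop described above (fit an autoregression first, then ask whether extra bilinear terms buy a further reduction in the fit criterion), the sketch below uses synthetic noise in place of the Kuwait radiation series; the AR order, the single bilinear term and the AIC form are illustrative assumptions, not the authors' specification.

```python
# Toy sketch of AR-then-bilinear model selection on a stand-in series.
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(500)              # stand-in for the de-trended series
p = 2                                     # AR order (illustrative)

def aic(resid, n_params):
    n = len(resid)
    return n * np.log(np.mean(resid ** 2)) + 2 * n_params

# --- step 1: ordinary AR(p) fit by least squares -------------------------
X_ar = np.column_stack([x[p - k - 1: len(x) - k - 1] for k in range(p)])
y = x[p:]
coef_ar, *_ = np.linalg.lstsq(X_ar, y, rcond=None)
resid_ar = y - X_ar @ coef_ar
print("AR(2) AIC            :", aic(resid_ar, p))

# --- step 2: add one bilinear term e_{t-1} * x_{t-1} ---------------------
bilin = np.zeros_like(resid_ar)
bilin[1:] = resid_ar[:-1] * x[p:-1]       # lagged residual times lagged x
X_bl = np.column_stack([X_ar, bilin])
coef_bl, *_ = np.linalg.lstsq(X_bl, y, rcond=None)
resid_bl = y - X_bl @ coef_bl
print("AR(2) + bilinear AIC :", aic(resid_bl, p + 1))
# Keep the bilinear term only if it lowers the information criterion.
```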


Computational significance of transient dynamics in cortical networks

EUROPEAN JOURNAL OF NEUROSCIENCE, Issue 1 2008
Daniel Durstewitz
Abstract Neural responses are most often characterized in terms of the sets of environmental or internal conditions or stimuli with which increases or decreases in their firing rate are correlated. Their transient (nonstationary) temporal profiles of activity have received comparatively less attention. Similarly, the computational framework of attractor neural networks puts most emphasis on the representational or computational properties of the stable states of a neural system. Here we review a couple of neurophysiological observations and computational ideas that shift the focus to the transient dynamics of neural systems. We argue that there are many situations in which the transient neural behaviour, while hopping between different attractor states or moving along 'attractor ruins', carries most of the computational and/or behavioural significance, rather than the attractor states eventually reached. Such transients may be related to the computation of temporally precise predictions or the probabilistic transitions among choice options, accounting for Weber's law in decision-making tasks. Finally, we conclude with a more general perspective on the role of transient dynamics in the brain, promoting the view that brain activity is characterized by a high-dimensional chaotic ground state from which transient spatiotemporal patterns (metastable states) briefly emerge. Neural computation has to exploit the itinerant dynamics between these states. [source]


Selective exposure to information: the impact of information limits

EUROPEAN JOURNAL OF SOCIAL PSYCHOLOGY, Issue 4 2005
Peter Fischer
In research on selective exposure to information, people have been found to predominantly seek information supporting rather than conflicting with their opinion. In most of these studies, participants were allowed to search for as many pieces of information as they liked. However, in many situations, the amount of information that people can search for is restricted. We report four experiments addressing this issue. Experiment 1 suggests that objective limits on the maximum number of pieces of information the participants could search for increase the preference for selecting supporting over conflicting information. In Experiment 2, just giving participants a cue about information scarcity induces the same effect, even in the absence of any objective restrictions. Finally, Experiments 3 and 4 clarify the underlying psychological process by showing that information limits increase selective exposure to information because information search is guided by the expected information quality, which is basically biased towards supporting information, and information limits act to reinforce this tendency. Copyright © 2005 John Wiley & Sons, Ltd. [source]


On dichotomizing phenotypes in family-based association tests: quantitative phenotypes are not always the optimal choice

GENETIC EPIDEMIOLOGY, Issue 5 2007
David Fardo
Abstract In family-based association studies, quantitative traits are thought to provide higher statistical power than dichotomous traits. Consequently, it is standard practice to collect quantitative traits and to analyze them as such. However, in many situations, continuous measurements are more difficult to obtain and/or need to be adjusted for other factors/confounding variables which also have to be measured. In such scenarios, it can be advantageous to record and analyze a "simplified/dichotomized" version of the original trait. Under fairly general circumstances, we derive here rules for the dichotomization of quantitative traits that maintain power levels that are comparable to the analysis of the original quantitative trait. Using simulation studies, we show that the proposed rules are robust against phenotypic misclassification, making them an ideal tool for inexpensive phenotyping in large-scale studies. The guidelines are illustrated by an application to an asthma study. Genet. Epidemiol. 2007. © 2007 Wiley-Liss, Inc. [source]
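
To make the power trade-off concrete, the toy simulation below (not the authors' derivation; the effect size, sample size, dichotomization threshold and test choices are arbitrary assumptions) compares a test on the quantitative trait with a test on a dichotomized version of it.

```python
# Toy power comparison: quantitative trait vs. dichotomized trait.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

def one_trial(n=400, effect=0.3, threshold=0.0):
    g = rng.binomial(1, 0.5, size=n)              # carrier / non-carrier
    y = effect * g + rng.standard_normal(n)       # quantitative trait
    # quantitative analysis: two-sample t-test between genotype groups
    p_quant = stats.ttest_ind(y[g == 1], y[g == 0]).pvalue
    # dichotomized analysis: "affected" if the trait exceeds the threshold
    d = (y > threshold).astype(int)
    table = [[np.sum((g == a) & (d == b)) for b in (0, 1)] for a in (0, 1)]
    p_dich = stats.chi2_contingency(table)[1]
    return p_quant < 0.05, p_dich < 0.05

results = np.array([one_trial() for _ in range(2000)])
print("power, quantitative trait :", results[:, 0].mean())
print("power, dichotomized trait :", results[:, 1].mean())
```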


Borehole-guided AVO analysis of P-P and P-S reflections: Quantifying uncertainty on density estimates

GEOPHYSICAL PROSPECTING, Issue 5 2006
Hugues A. Djikpesse
ABSTRACT Seismic properties of isotropic elastic formations are characterized by the three parameters: acoustic impedance, Poisson's ratio and density. Whilst the first two are usually well estimated by analysing the amplitude variation with angle (AVA) of reflected P-P waves, density is known to be poorly resolved. However, density estimates would be useful in many situations encountered in oil and gas exploration, in particular, for minimizing risks in looking ahead while drilling. We design a borehole seismic experiment to investigate the reliability of AVA extracted density. Receivers are located downhole near the targeted reflectors and record reflected P-P and converted P-S waves. A non-linear, wide-angle-based Bayesian inversion is then used to access the a posteriori probability distributions associated with the estimation of the three isotropic elastic parameters. The analysis of these distributions suggests that the angular variation of reflected P-S amplitudes provides additional substantial information for estimating density, thus reducing the estimate uncertainty variance by more than one order of magnitude, compared to using only reflected P-waves. [source]


Multiple displacement amplification to create a long-lasting source of DNA for genetic studies,

HUMAN MUTATION, Issue 7 2006
Lovisa Lovmar
Abstract In many situations there may not be sufficient DNA collected from patient or population cohorts to meet the requirements of genome-wide analysis of SNPs, genomic copy number polymorphisms, or acquired copy number alterations. When the amount of available DNA for genotype analysis is limited, high performance whole-genome amplification (WGA) represents a new development in genetic analysis. It is especially useful for analysis of DNA extracted from stored histology slides, tissue samples, buccal swabs, or blood stains collected on filter paper. The multiple displacement amplification (MDA) method, which relies on isothermal amplification using the DNA polymerase of the bacteriophage φ29, is a recently developed technique for high performance WGA. This review addresses new trends in the technical performance of MDA and its applications to genetic analyses. The main challenge of WGA methods is to obtain balanced and faithful replication of all chromosomal regions without the loss of or preferential amplification of any genomic loci or allele. In multiple comparisons to other WGA methods, MDA appears to be most reliable for genotyping, with the most favorable call rates, best genomic coverage, and lowest amplification bias. Hum Mutat 27(7), 603–614, 2006. © 2006 Wiley-Liss, Inc. [source]


Emotional Intelligence: Toward Clarification of a Concept

INDUSTRIAL AND ORGANIZATIONAL PSYCHOLOGY, Issue 2 2010
CARY CHERNISS
There has been much confusion and controversy concerning the concept of emotional intelligence (EI). Three issues have been particularly bothersome. The first concerns the many conflicting definitions and models of EI. To address this issue, I propose that we distinguish between definitions and models and then adopt a single definition on which the major theorists already seem to agree. I further propose that we more clearly distinguish between EI and the related concept of emotional and social competence (ESC). The second issue that has generated concern is the question of how valid existing measures are. After reviewing the research on the psychometric properties of several popular tests, I conclude that although there is some support for many of them, they all have inherent limitations. We need to rely more on alternative measurement strategies that have been available for some time and also develop new measures that are more sensitive to context. The third area of contention concerns the significance of EI for outcomes such as job performance or leadership effectiveness. Recent research, not available to earlier critics, suggests that EI is positively associated with performance. However, certain ESCs are likely to be stronger predictors of performance than EI in many situations. Also, EI is likely to be more important in certain kinds of situations, such as those involving social interaction or significant levels of stress. Context makes a difference. [source]


'Right' way to 'do' illness?

INTERNAL MEDICINE JOURNAL, Issue 10 2006
Thinking critically about positive thinking
Abstract Exhortations to 'be positive' accompany many situations in life, either as a general injunction or in difficult situations where people are facing pressure or adversity. It is particularly evident in health care, where positive thinking has become an aspect of the way people are expected to 'do' illness in developed society. Positive thinking is framed both as a moral injunction and as a central belief system. It is thought to help patients cope emotionally with illness and to provide a biological benefit. Yet, the meanings, expectations and outcomes of positive thinking are infrequently questioned and the risks of positive thinking are rarely examined. We outline some of the latter and suggest that health professionals should exercise caution in both 'prescribing' positive thinking and in responding to patients and carers whose belief systems and feelings of obligation rest on it. [source]


Numerical characteristics of a simple finite element formulation for consolidation analysis

INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN BIOMEDICAL ENGINEERING, Issue 10 2004
Guofu Zhu
Abstract The spatial oscillation of values in the consolidation analysis when using small time increments has been a common problem for most existing methods. In this paper, the numerical characteristics of a simple finite element formulation for 1-D consolidation analysis recently proposed by the authors have been examined in detail. This paper proves that the commonly encountered phenomenon of spatial oscillation due to small time increments does not occur in the simple finite element formulation. The criterion of minimum time step used in most existing methods can be eliminated, at least for linear situations, by using the simple formulation proposed by the authors. Thus, the consolidation analysis can be carried out easily for many situations, such as one involving a relatively impermeable clay layer sandwiched between sandy layers. Copyright © 2004 John Wiley & Sons, Ltd. [source]


Polynomial control: past, present, and future

INTERNATIONAL JOURNAL OF ROBUST AND NONLINEAR CONTROL, Issue 8 2007
Vladimír Kučera
Abstract Polynomial techniques have made important contributions to systems and control theory. Engineers in industry often find polynomial and frequency domain methods easier to use than state equation-based techniques. Control theorists show that results obtained in isolation using either approach are in fact closely related. Polynomial system description provides input-output models for linear systems with rational transfer functions. These models display two important system properties, namely poles and zeros, in a transparent manner. A performance specification in terms of polynomials is natural in many situations; see pole allocation techniques. A specific control system design technique, called the polynomial equation approach, was developed in the 1960s and 1970s. The distinguishing feature of this technique is a reduction of controller synthesis to a solution of linear polynomial equations of a specific (Diophantine or Bézout) type. In most cases, control systems are designed to be stable and meet additional specifications, such as optimality and robustness. It is therefore natural to design the systems step by step: stabilization first, then the additional specifications each at a time. For this it is obviously necessary to have any and all solutions of the current step available before proceeding any further. This motivates the need for a parametrization of all controllers that stabilize a given plant. In fact this result has become a key tool for the sequential design paradigm. The additional specifications are met by selecting an appropriate parameter. This is simple, systematic, and transparent. However, the strategy suffers from an excessive growth of the controller order. This article is a guided tour through polynomial control system design. The origins of the parametrization of stabilizing controllers, called the Youla–Kučera parametrization, are explained. Standard results on reference tracking, disturbance elimination, pole placement, deadbeat control, H2 control, l1 control and robust stabilization are summarized. New and exciting applications of the Youla–Kučera parametrization are then discussed: stabilization subject to input constraints, output overshoot reduction, and fixed-order stabilizing controller design. Copyright © 2006 John Wiley & Sons, Ltd. [source]
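
For readers unfamiliar with the polynomial equation approach summarized above, the LaTeX fragment below restates its textbook core (a paraphrase of standard results, not an excerpt from the article); the symbols a, b, c, x, y, t, P and C are notation introduced here for illustration.

```latex
% Textbook form of the polynomial (Diophantine) design equation and its
% solution set; a paraphrase of standard results, not quoted from the article.
% Plant in coprime polynomial fraction form:
\[
  P(s) = \frac{b(s)}{a(s)}, \qquad \gcd\bigl(a(s),\,b(s)\bigr) = 1 .
\]
% A controller C(s) = y(s)/x(s) assigning a prescribed stable closed-loop
% characteristic polynomial c(s) is obtained from the linear Diophantine
% (Bezout) equation
\[
  a(s)\,x(s) + b(s)\,y(s) = c(s).
\]
% If (x_0, y_0) is any particular solution, every solution is
\[
  x = x_0 + b\,t, \qquad y = y_0 - a\,t,
\]
% with the free polynomial t playing the role of the Youla--Kucera parameter
% that is then tuned to meet the additional specifications step by step.
```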


Radio resource management across multiple protocol layers in satellite networks: a tutorial overview

INTERNATIONAL JOURNAL OF SATELLITE COMMUNICATIONS AND NETWORKING, Issue 5 2005
Paolo Barsocchi
Abstract Satellite transmissions have an important role in telephone communications, television broadcasting, computer communications, maritime navigation, and military command and control. Moreover, in many situations they may be the only possible communication set-up. Trends in telecommunications indicate that four major growth market/service areas are messaging and navigation services (wireless and satellite), mobility services (wireless and satellite), video delivery services (cable and satellite), and interactive multimedia services (fibre/cable, satellite). When using geostationary satellites (GEO), the long propagation delay may have great impact, given the end-to-end delay requirements of the relevant applications; moreover, atmospheric conditions may seriously affect data transmission. Since satellite bandwidth is a relatively scarce resource compared to the terrestrial one (e.g. in optical transport networks), and the environment is harsher, resource management of the radio segment plays an important role in the system's efficiency and economy. The radio resource management (RRM) entity is responsible for the utilization of the air interface resources, and covers power control, handover, admission control, congestion control, bandwidth allocation, and packet scheduling. RRM functions are crucial for the best possible utilization of the capacity. RRM functions can be implemented in different ways, thus having an impact on the overall system efficiency. This tutorial aims to provide an overview of satellite transmission aspects at various OSI layers, with emphasis on the MAC layer; some cross-layer solutions for bandwidth allocation are also indicated. Far from being an exhaustive survey (mainly due to the extensive nature of the subject), it offers the readers an extensive bibliography, which could be used for further research on specific aspects. Copyright © 2005 John Wiley & Sons, Ltd. [source]


A Foundational Justification for a Weighted Likelihood Approach to Inference

INTERNATIONAL STATISTICAL REVIEW, Issue 3 2004
Russell J. Bowater
Summary: Two types of probability are discussed, one of which is additive whilst the other is non-additive. Popular theories that attempt to justify the importance of the additivity of probability are then critically reviewed. By making assumptions, the two types of probability put forward are utilised to justify a method of inference which involves betting preferences being revised in light of the data. This method of inference can be viewed as a justification for a weighted likelihood approach to inference, where the plausibility of different values of a parameter θ, based on the data X, is measured by the quantity q(θ) = l(X; θ)w(θ), where l(X; θ) is the likelihood function and w(θ) is a weight function. Even though, unlike Bayesian inference, the method has the disadvantageous property that the measure q(θ) is generally non-additive, it is argued that the method has other properties which may be considered very desirable and which have the potential to imply that, when everything is taken into account, the method is a serious alternative to the Bayesian approach in many situations. The methodology that is developed is applied to both a toy example and a real example. Résumé (translated): Two types of probability are discussed, one additive and the other non-additive. Popular theories that attempt to justify the importance of the additivity of probability are critically analysed. Under stated assumptions, the two proposed types of probability are used to justify a method of inference in which betting preferences are revised in light of the data. This method of inference can be viewed as a justification of the weighted likelihood approach, in which the plausibility of different values of a parameter θ is measured. Although the likelihood measure in this method is not additive, it has other attractive properties that allow it to be considered a serious alternative to the Bayesian approach in many situations. The methodology is applied to both a fictitious example and a real example. [source]
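
A minimal numerical sketch of the quantity q(θ) = l(X; θ)w(θ) is given below; the normal model, the Gaussian weight function and the parameter grid are illustrative assumptions, not choices made in the paper.

```python
# Evaluate the weighted-likelihood measure q(theta) on a grid (illustrative).
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
X = rng.normal(loc=1.0, scale=1.0, size=20)       # observed sample

theta = np.linspace(-2, 4, 601)                   # grid of parameter values

def likelihood(theta):
    # product of N(theta, 1) densities over the sample, for each grid value
    return np.prod(stats.norm.pdf(X[:, None], loc=theta, scale=1.0), axis=0)

def weight(theta):
    # weight function encoding prior betting preferences (illustrative choice)
    return stats.norm.pdf(theta, loc=0.0, scale=2.0)

q = likelihood(theta) * weight(theta)
print("most plausible theta on the grid:", theta[np.argmax(q)])
# Note: q need not integrate to one; unlike a Bayesian posterior it is
# generally a non-additive measure of plausibility.
```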


Muscle-derived stem cells: Implications for effective myoblast transfer therapy

IUBMB LIFE, Issue 11 2005
Tracey F. Lee-Pullen
Abstract Stem cells have been proposed as a wonder solution for tissue repair in many situations and have attracted much attention in the media for both their therapeutic potential and ethical implications. In addition to the excitement generated by embryonic stem cells, research has now identified a number of stem cells within adult tissues which pose much more realistic targets for therapeutic interventions. Myoblast transfer therapy (MTT) has long been viewed as a potential therapy for the debilitating muscle-wasting disorder Duchenne Muscular Dystrophy. This technique relies on the transplantation of committed muscle precursor cells directly into the muscle fibres but has had little success in clinical trials. The recent discovery of a population of cells within adult muscle with stem cell-like characteristics has interesting implications for the future of such putative cell transplantation therapies. This review focuses on the characterization and application of these potential muscle-derived stem cells (MDSC) to MTT. IUBMB Life, 57: 731-736, 2005 [source]


The nature of advocacy vs. paternalism in nursing: clarifying the ,thin line'

JOURNAL OF ADVANCED NURSING, Issue 8 2009
Meg Zomorodi
Abstract Title. The nature of advocacy vs. paternalism in nursing: clarifying the 'thin line'. Aim. This paper is an exploration of the concepts of advocacy and paternalism in nursing and discusses the thin line between the two. Background. Nurses are involved in care more than any other healthcare professionals and they play a central role in advocating for patients and families. It is difficult to obtain a clear definition of advocacy, yet the concepts of advocacy and paternalism must be compared, contrasted, and discussed extensively. In many situations, only a thin line distinguishes advocacy from paternalism. Data sources. A literature search was conducted using PubMed and CINAHL databases (2000–2008) as well as a library catalogue for texts. Discussion. Four case stories were described in order to discuss the 'thin line' between advocacy and paternalism and develop communication strategies to eliminate ambiguity. Weighing the ethical principles of beneficence and autonomy helps to clarify advocacy and paternalism and provides an avenue for discussion among nurses practicing in a variety of settings. Implications for nursing. Advocacy and paternalism should be discussed at interdisciplinary rounds, and taken into consideration when making patient care decisions. It is difficult to clarify advocacy vs. paternalism, but strategies such as knowing the patient, clarifying information, and educating all involved are initial steps in distinguishing advocacy from paternalism. Conclusion. Truly 'knowing' patients, their life experiences, values, beliefs and wishes can help clarify the 'thin line' and gain a grasp of these difficult-to-distinguish theoretical concepts. [source]


A comparison of methods for analysing regression models with both spectral and designed variables

JOURNAL OF CHEMOMETRICS, Issue 10 2004
Kjetil Jørgensen
Abstract In many situations one performs designed experiments to find the relationship between a set of explanatory variables and one or more responses. Often there are other factors that influence the results in addition to the factors that are included in the design. To obtain information about these so-called nuisance factors, one can sometimes measure them using spectroscopic methods. The question then is how to analyze this kind of data, i.e. a combination of an orthogonal design matrix and a spectroscopic matrix with hundreds of highly collinear variables. In this paper we introduce a method that is an iterative combination of partial least squares (PLS) and ordinary least squares (OLS) and compare its performance with other methods such as direct PLS, OLS and a combination of principal component analysis and least squares. The methods are compared using two real data sets and using simulated data. The results show that the incorporation of external information from spectroscopic measurements gives more information from the experiment and lower variance in the parameter estimates. We also find that the introduced algorithm separates the information from the spectral and design matrices in a nice way. It also has some advantages over PLS in showing lower bias and being less influenced by the relative weighting of the design and spectroscopic variables. Copyright © 2005 John Wiley & Sons, Ltd. [source]
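
The sketch below shows one backfitting-style way to combine OLS on the designed factors with PLS on the spectral matrix, in the spirit of the iterative combination described above; the simulated data, the number of PLS components and the fixed number of alternations are illustrative assumptions rather than a reproduction of the authors' algorithm.

```python
# Illustrative alternation between OLS (design factors) and PLS (spectra).
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)
n, p_spec = 40, 200
D = rng.choice([-1.0, 1.0], size=(n, 3))          # designed factors
S = rng.standard_normal((n, p_spec))              # spectra of nuisance factors
y = (D @ np.array([1.0, -0.5, 0.2])
     + 0.1 * S[:, :5].sum(axis=1)
     + 0.1 * rng.standard_normal(n))

ols = LinearRegression()
pls = PLSRegression(n_components=3)

fit_spec = np.zeros(n)
for _ in range(20):                               # alternate until stable
    ols.fit(D, y - fit_spec)                      # design part on current residual
    fit_design = ols.predict(D)
    pls.fit(S, y - fit_design)                    # spectral part on its residual
    fit_spec = pls.predict(S).ravel()

print("estimated design effects:", ols.coef_)
```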


Capillary forces between two solid spheres linked by a concave liquid bridge: Regions of existence and forces mapping

AICHE JOURNAL, Issue 5 2009
David Megias-Alguacil
Abstract This article focuses on the capillary interactions arising when two spherical particles are connected by a concave liquid bridge. This scenario is found in many situations where particles are partially wetted by a liquid, such as liquid films stabilized with nanoparticles. We analyze different parameters governing the liquid bridge: interparticle separation, wetting angle and liquid volume. The results are compiled in a liquid volume-wetting angle diagram in which the regions of existence (stability) or non-existence (instability) of the bridge are outlined, together with the maximum and minimum particle distances at which the liquid bridge may be found. Calculations of the capillary forces discriminate those conditions for which such force is repulsive or attractive. The results are plotted in the form of maps that allow an easy understanding of the stability of a liquid bridge and the conditions at which it may be produced for the two-particle model. © 2009 American Institute of Chemical Engineers AIChE J, 2009 [source]


A new reconstruction of multivariate normal orthant probabilities

JOURNAL OF THE ROYAL STATISTICAL SOCIETY: SERIES B (STATISTICAL METHODOLOGY), Issue 1 2008
Peter Craig
Summary. A new method is introduced for geometrically reconstructing orthant probabilities for non-singular multivariate normal distributions. Orthant probabilities are expressed in terms of those for auto-regressive sequences, and an efficient method is developed for numerical approximation of the latter. The approach allows more efficient and accurate evaluation of the multivariate normal cumulative distribution function than was previously possible, for many situations where the original distribution arises from a graphical model. An implementation is available as a package for the statistical software R and an application is given to multivariate probit models. [source]
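
As a point of reference for what is being computed, the snippet below evaluates a three-dimensional orthant probability by a direct call to SciPy's multivariate normal CDF; the covariance matrix is an arbitrary example, and this brute-force route simply stands in for the more efficient geometric reconstruction the paper develops.

```python
# Orthant probability P(X1 <= 0, X2 <= 0, X3 <= 0) for a zero-mean MVN.
import numpy as np
from scipy.stats import multivariate_normal

cov = np.array([[1.0, 0.5, 0.3],
                [0.5, 1.0, 0.4],
                [0.3, 0.4, 1.0]])
mvn = multivariate_normal(mean=np.zeros(3), cov=cov)

print(mvn.cdf(np.zeros(3)))   # probability mass of the negative orthant
```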


A Generalized Portmanteau Test For Independence Of Two Infinite-Order Vector Autoregressive Series

JOURNAL OF TIME SERIES ANALYSIS, Issue 4 2006
Chafik Bouhaddioui
Primary 62M10; secondary 62M15. Abstract. In many situations, we want to verify the existence of a relationship between multivariate time series. Here, we propose a semiparametric approach for testing the independence between two infinite-order vector autoregressive (VAR(∞)) series, which is an extension of Hong's [Biometrika (1996c) vol. 83, 615–625] univariate results. We first filter each series by a finite-order autoregression and the test statistic is a standardized version of a weighted sum of quadratic forms in the residual cross-correlation matrices at all possible lags. The weights depend on a kernel function and on a truncation parameter. Using a result of Lewis and Reinsel [Journal of Multivariate Analysis (1985) vol. 16, pp. 393–411], the asymptotic distribution of the test statistic is derived under the null hypothesis and its consistency is also established for a fixed alternative of serial cross-correlation of unknown form. Apart from standardization factors, the multivariate portmanteau statistic proposed by Bouhaddioui and Roy [Statistics and Probability Letters (2006) vol. 76, pp. 58–68] that takes into account a fixed number of lags can be viewed as a special case by using the truncated uniform kernel. However, many kernels lead to a greater power, as shown in an asymptotic power analysis and by a small simulation study in finite samples. A numerical example with real data is also presented. [source]
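
A heavily simplified, univariate sketch of the test's ingredients is shown below (the paper works with residual cross-correlation matrices of VAR(∞) series); the AR(1) filters, the truncation parameter and the truncated uniform kernel are illustrative choices, not the paper's recommended settings.

```python
# Filter each series, then aggregate squared residual cross-correlations.
import numpy as np

rng = np.random.default_rng(4)
n = 300
x = 0.1 * rng.standard_normal(n).cumsum() + rng.standard_normal(n)
y = rng.standard_normal(n)

def ar1_residuals(z):
    """Residuals after filtering a series by a least-squares AR(1) fit."""
    phi = np.dot(z[1:], z[:-1]) / np.dot(z[:-1], z[:-1])
    return z[1:] - phi * z[:-1]

u, v = ar1_residuals(x), ar1_residuals(y)

def cross_corr(u, v, k):
    """Sample cross-correlation of the residuals at lag k (v lagging u)."""
    m = len(u)
    u0, v0 = u - u.mean(), v - v.mean()
    return np.sum(u0[k:] * v0[: m - k]) / (m * u0.std() * v0.std())

M = 10                                  # truncation parameter (illustrative)
weights = np.ones(M + 1)                # truncated uniform kernel
Q = len(u) * sum(w * cross_corr(u, v, k) ** 2 for k, w in enumerate(weights))
print("portmanteau-type statistic:", Q)
```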


Soil organic matter decline and compositional change associated with cereal cropping in southern Tanzania

LAND DEGRADATION AND DEVELOPMENT, Issue 1 2001
J. F. McDonagh
Abstract The spatial analogue method and 13C analytical techniques were used to reveal medium- to long-term changes in soil organic matter (SOM) in farmers' fields under maize in southern Tanzania. Aerial photography and detailed farmer interviews were used to relate land-use history to declines in SOM concentration and changes in composition. The research attempted to measure the rate of SOM decline and the extent to which farmers' residue management practice was allowing cereal residues to contribute to SOM. The combination of research methods employed in this study proved to be highly complementary. Results indicate that native SOM decreased on average by 50 per cent after 25 years of cultivation. Under current residue management, with cereal residues mostly grazed and burnt, there is only a relatively modest contribution from cereal residues to SOM. When cereal residues are retained in the field it is likely they will contribute significantly to SOM, but they are much less likely to build SOM in the medium to long term. The paper concludes that in many situations it is probably best for farmers to allow the majority of the residues to be eaten by cattle in these systems rather than attempt to build SOM or risk nitrogen immobilization in cropped fields. The greater importance of inputs of high-quality (e.g. legume) residues for nutrient supply in the short term is highlighted, in contrast to inputs of poor-quality (e.g. cereal) residues in an attempt to build SOM in the longer term. Copyright © 2001 John Wiley & Sons, Ltd. [source]


Molecular identification of prey in predator diets

MOLECULAR ECOLOGY, Issue 4 2002
W. O. C. Symondson
Abstract In many situations prey choice by predators in the field cannot be established or quantified using direct observation. The remains of some prey may be visually identified in the guts and faeces of predators, but not all predators ingest such hard remains, and even those that do consume them may also ingest soft-bodied prey that leave no recognizable remnants. The result is, at best, a biased picture of prey choice. A range of molecular techniques and applications are reviewed that allow prey remains to be identified, often to the species and even stage level. These techniques, all of which are still in use, include enzyme electrophoresis, a range of immunological approaches using polyclonal and monoclonal antibodies to detect protein epitopes, and recently developed polymerase chain reaction (PCR)-based methods for detecting prey DNA. Analyses may be postmortem, on invertebrate and vertebrate predators collected from the field, or noninvasive assays of the remains in regurgitated bird pellets or vertebrate faeces. It was concluded that although monoclonal antibodies are currently the most effective method in use today, PCR-based techniques have proved to be highly effective and versatile in recent laboratory trials and are likely to rapidly displace all other approaches. [source]


The Wrong Mental Image of Settlement

NEGOTIATION JOURNAL, Issue 1 2001
Christopher Honeyman
Negotiation participants usually think of "settlement" as the official end of a conflict; the author points out that this mental image is inaccurate in many situations, where a settlement is followed by additional eruptions of conflict. He uses the recent Good Friday peace accord in Northern Ireland as an example of the continuing nature of many conflicts; theorizes as to why we have this incorrect mental image in general; and suggests ways we can present a more accurate representation of a conflict's life cycle. [source]


A bicriterion approach for routing problems in multimedia networks

NETWORKS: AN INTERNATIONAL JOURNAL, Issue 4 2003
João C. N. Clímaco
Abstract Routing problems in communication networks supporting multiple services, namely, multimedia applications, involve the selection of paths satisfying multiple constraints (of a technical nature) and seeking simultaneously to "optimize" the associated metrics. Although traditional models in this area are single-objective, in many situations it is important to consider different, potentially conflicting, objectives. In this paper, we consider a bicriterion model dedicated to calculating nondominated paths for specific traffic flows (associated with video services) in multiservice high-speed networks. The mathematical formulation of the problem and the bicriterion algorithmic approach developed for its resolution are presented together with computational tests regarding an application to video-traffic routing in a high-speed network. The algorithmic approach is an adaptation of recent work by Ernesto Martins and his collaborators, namely, the MPS algorithm. © 2003 Wiley Periodicals, Inc. [source]
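
The bicriterion notion the model rests on can be illustrated in a few lines: among candidate paths evaluated on two metrics (say cost and delay, both to be minimized), only the nondominated ones are of interest. The filter below is a toy illustration of that notion, not the MPS-based algorithm used in the paper.

```python
# Toy Pareto filter over (cost, delay) path evaluations; illustrative only.
def nondominated(paths):
    """Return the paths not dominated on both criteria (lower is better)."""
    def dominates(q, p):
        return q[0] <= p[0] and q[1] <= p[1] and q != p
    return [p for p in paths if not any(dominates(q, p) for q in paths)]

# Example: (5, 5) and (7, 8) are dominated by (4, 4) and are filtered out.
print(nondominated([(3, 9), (4, 4), (6, 3), (5, 5), (7, 8)]))
```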


From the state to the family: reconfiguring the responsibility for long-term nursing care at home

NURSING INQUIRY, Issue 1 2002
Kristin Björnsdóttir
This paper discusses the implications of the shift in the location of the provision of healthcare services from healthcare institutions to the home, which has occurred or is projected to occur in coming years. It is argued that the responsibility for providing the care and assistance needed by the elderly and by people with long-term conditions living at home has shifted from public services to the family. Studies of care-givers have shown that in many situations they experience tremendous burdens, financial difficulties and health problems. Their social lives have been confined to the home, and contacts with friends and neighbors have been significantly reduced. This situation needs to be addressed by nurses, who in many cases serve as the bridge between the home and the official healthcare system. Using Foucault's exploration of power, particularly his idea of governmentality, a genealogy of care-giving in the home in Icelandic healthcare has been constructed. The main findings were that, although this is occurring somewhat later than in many other countries, the state is withdrawing from its previously defined responsibility for the health and well-being of the nation. At the same time the citizen's responsibility for maintaining health is emphasized. Based on these findings, the argument is made that nurses in Iceland can have a profound influence on policy-making in relation to the organization of services provided in those homes. Suggestions are made as to how this can be done, which may be of interest to nurses in other countries. [source]


On E-Auctions for Procurement Operations

PRODUCTION AND OPERATIONS MANAGEMENT, Issue 4 2007
Michael H. Rothkopf
Electronic auctions have revolutionized procurement in the last decade. In many situations, they have replaced negotiations for supplier selection and price setting. While they have often greatly reduced transaction costs and increased competition, they have also encountered problems and resistance from suppliers resenting their intrusion on cooperative supplier/buyer relationships. In response to these issues, procurement auctions have evolved in radical new directions. Buyers use business rules to limit adverse changes. Some procurement auctions allow bidders to offer variants in the specifications of products to be supplied. Most important, some suppliers are allowing bidders to bid on packages of items, not just individual items. This tends to change procurement auctions from zero-sum fights over supplier profit margins to win-win searches for synergies. These changes have opened up many new research areas. Researchers are trying to improve how to deal with the computational issues involved in package auctions and to analyze the new auctions forms that are evolving. In general, equilibrium incentives are not known, and dealing with ties in package auctions is an issue. Computer scientists are analyzing the use of computerized bidding agents. Mechanisms that combine auctions with fixed buy prices or with negotiations need to be analyzed. [source]


Government engagement with non-state providers of water and sanitation services

PUBLIC ADMINISTRATION & DEVELOPMENT, Issue 3 2006
Kevin Sansom
Abstract Increasingly, governments in developing countries recognise that the public sector alone cannot provide adequate water and sanitation services to all. Non-state providers (NSPs) including both formal and informal private providers, as well as civil society institutions, also have important roles to play. There are clear challenges for governments intending to work with NSPs, not least of which is the institutional compatibility between bureaucratic agencies and informal water and sanitation NSPs. However, positive examples of government agencies working effectively with NSPs are emerging in many countries. Government engagement with water and sanitation NSPs can be split into five main types: recognition, dialogue, facilitation/collaboration, contracting and regulation. In many situations, a lack of formal recognition of water or sanitation NSPs is an impediment to more productive forms of engagement. There are a number of potential intervention options within each of the five types of engagement that government agencies should carefully consider when supporting the development of NSP water and sanitation services. Copyright © 2006 John Wiley & Sons, Ltd. [source]