Causal

Terms modified by Causal: Selected Abstracts

Patent ductus arteriosus and cystic periventricular leucomalacia in preterm infants
ACTA PAEDIATRICA, Issue 3 2001. P. Pladys.
Aim: To test the association between early disturbances in hemodynamics induced by left-to-right shunting through the duct and cystic periventricular leucomalacia. Patients: Forty-six preterm infants (27-32 wk) admitted to the neonatal intensive care unit with risk criteria. Methods: Patent ductus arteriosus was evaluated on days 1 and 4, and was significant (sPDA) in cases of absent or reversed end-diastolic flow in the subductal aorta. Resistance index was measured in the anterior cerebral artery and in the subductal aorta. Main outcome: Diagnosis of cystic periventricular leucomalacia between day 10 and day 50. Results: The 12 infants who developed cystic periventricular leucomalacia were compared with those who did not. On day 1, sPDA was more frequent (64% vs 26%; p = 0.03) in the cystic periventricular leucomalacia group, left ventricular output was higher (median = 341 vs 279 ml·kg⁻¹·min⁻¹; p = 0.005), and rescue surfactant was more frequently used (83% vs 47%; p = 0.03). This latter association was confirmed by multivariate analysis. Resistance index in the anterior cerebral artery was increased in cases of significant patent ductus arteriosus (p < 0.01) and was correlated with resistance index in the subductal aorta. Conclusion: On day 1 in this selected population, sPDA has an effect on blood flow velocity waveform in cerebral arteries and is associated with an increase in the emergence of cystic periventricular leucomalacia. This association could be casual rather than causal.

Childhood trauma, psychosis and schizophrenia: a literature review with theoretical and clinical implications
ACTA PSYCHIATRICA SCANDINAVICA, Issue 5 2005. J. Read.
Objective: To review the research addressing the relationship of childhood trauma to psychosis and schizophrenia, and to discuss the theoretical and clinical implications. Method: Relevant studies and previous review papers were identified via computer literature searches. Results: Symptoms considered indicative of psychosis and schizophrenia, particularly hallucinations, are at least as strongly related to childhood abuse and neglect as many other mental health problems. Recent large-scale general population studies indicate the relationship is a causal one, with a dose-effect. Conclusion: Several psychological and biological mechanisms by which childhood trauma increases risk for psychosis merit attention. Integration of these different levels of analysis may stimulate a more genuinely integrated bio-psycho-social model of psychosis than currently prevails. Clinical implications include the need for staff training in asking about abuse and the need to offer appropriate psychosocial treatments to patients who have been abused or neglected as children. Prevention issues are also identified.

Bayes nets and babies: infants' developing statistical reasoning abilities and their representation of causal knowledge
DEVELOPMENTAL SCIENCE, Issue 3 2007. David M. Sobel.
A fundamental assumption of the causal graphical model framework is the Markov assumption, which posits that learners can discriminate between two events that are dependent because of a direct causal relation between them and two events that are independent conditional on the value of another event(s). Sobel and Kirkham (2006) demonstrated that 8-month-old infants registered conditional independence information among a sequence of events; infants responded according to the Markov assumption in such a way that was inconsistent with models that rely on simple calculations of associative strength.
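The conditional-independence contrast behind the Markov assumption can be shown with a toy simulation (an illustration only, not the authors' procedure; the variable names and probabilities here are invented): two binary events X and Y driven by a common cause Z are associated overall, yet independent within each stratum of Z.

```python
import random

def simulate(n=20000, seed=1):
    """Common-cause structure Z -> X, Z -> Y: X and Y are marginally
    dependent but independent conditional on Z."""
    rng = random.Random(seed)
    data = []
    for _ in range(n):
        z = rng.random() < 0.5
        x = z if rng.random() < 0.8 else (not z)  # noisy copy of Z
        y = z if rng.random() < 0.8 else (not z)  # independent noisy copy of Z
        data.append((x, y, z))
    return data

def p_y(data, x=None, z=None):
    """Empirical P(Y=1), optionally conditioning on X and/or Z."""
    rows = [yi for xi, yi, zi in data
            if (x is None or xi == x) and (z is None or zi == z)]
    return sum(rows) / len(rows)

data = simulate()
# Marginally, X carries information about Y (analytically the gap is 0.36)...
marginal_gap = p_y(data, x=True) - p_y(data, x=False)
# ...but once Z is held fixed, X carries essentially none (gap near 0).
conditional_gap = p_y(data, x=True, z=True) - p_y(data, x=False, z=True)
```

A learner relying only on pairwise associative strength would register the marginal dependence between X and Y; a learner respecting the Markov assumption would treat them as unrelated once the common cause is accounted for.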
The present experiment extends these findings to younger infants, and demonstrates that such responses potentially develop during the second half of the first year of life. These data are discussed in terms of a developmental trajectory between associative mechanisms and causal graphical models as representations of infants' causal and statistical learning.

The relation between different dimensions of alcohol consumption and burden of disease: an overview
ADDICTION, Issue 5 2010. Jürgen Rehm.
Aims: As part of a larger study to estimate the global burden of disease and injury attributable to alcohol: to evaluate the evidence for a causal impact of average volume of alcohol consumption and pattern of drinking on diseases and injuries; to quantify relationships identified as causal based on published meta-analyses; to separate the impact on mortality versus morbidity where possible; and to assess the impact of the quality of alcohol on burden of disease. Methods: Systematic literature reviews were used to identify alcohol-related diseases, birth complications and injuries, using standard epidemiological criteria to determine causality. The extent of the risk relations was taken from meta-analyses. Results: Evidence of a causal impact of average volume of alcohol consumption was found for the following major diseases: tuberculosis; mouth, nasopharynx, other pharynx and oropharynx cancer; oesophageal cancer; colon and rectum cancer; liver cancer; female breast cancer; diabetes mellitus; alcohol use disorders; unipolar depressive disorders; epilepsy; hypertensive heart disease; ischaemic heart disease (IHD); ischaemic and haemorrhagic stroke; conduction disorders and other dysrhythmias; lower respiratory infections (pneumonia); cirrhosis of the liver; preterm birth complications; and fetal alcohol syndrome.
Dose-response relationships could be quantified for all disease categories except for depressive disorders, with the relative risk increasing with increased level of alcohol consumption for most diseases. Both average volume and drinking pattern were linked causally to IHD, fetal alcohol syndrome and unintentional and intentional injuries. For IHD, ischaemic stroke and diabetes mellitus, beneficial effects were observed for patterns of light to moderate drinking without heavy drinking occasions (as defined by 60+ g pure alcohol per day). For several disease and injury categories, the effects were stronger on mortality compared to morbidity. There was insufficient evidence to establish whether quality of alcohol had a major impact on disease burden. Conclusions: Overall, these findings indicate that alcohol impacts many disease outcomes causally, both chronic and acute, and injuries. In addition, a pattern of heavy episodic drinking increases risk for some disease and all injury outcomes. Future studies need to address a number of methodological issues, especially the differential role of average volume versus drinking pattern, in order to obtain more accurate risk estimates and to understand more clearly the nature of alcohol-disease relationships.

Policy options for alcohol price regulation: the importance of modelling population heterogeneity
ADDICTION, Issue 3 2010. Petra Sylvia Meier.
Context and aims: Internationally, the repertoire of alcohol pricing policies has expanded to include targeted taxation, inflation-linked taxation, taxation based on alcohol-by-volume (ABV), minimum pricing policies (general or targeted), bans of below-cost selling and restricting price-based promotions. Policy makers clearly need to consider how options compare in reducing harms at the population level, but are also required to demonstrate proportionality of their actions, which necessitates a detailed understanding of policy effects on different population subgroups.
This paper presents selected findings from a policy appraisal for the UK government and discusses the importance of accounting for population heterogeneity in such analyses. Method: We have built a causal, deterministic, epidemiological model which takes account of differential preferences by population subgroups defined by age, gender and level of drinking (moderate, hazardous, harmful). We consider purchasing preferences in terms of the types and volumes of alcoholic beverages, prices paid and the balance between bars, clubs and restaurants as opposed to supermarkets and off-licenses. Results: Age, sex and level of drinking fundamentally affect beverage preferences, drinking location, prices paid, price sensitivity and tendency to substitute for other beverage types. Pricing policies vary in their impact on different product types, price points and venues, thus having distinctly different effects on subgroups. Because population subgroups also have substantially different risk profiles for harms, policies are differentially effective in reducing health, crime, work-place absence and unemployment harms. Conclusion: Policy appraisals must account for population heterogeneity and complexity if resulting interventions are to be well considered, proportionate, effective and cost-effective.

Parental Separation and Children's Educational Attainment: A Siblings Analysis on Swedish Register Data
ECONOMICA, Issue 292 2006. Anders Björklund.
This paper analyses whether the commonly found negative relationship between parental separation in childhood and educational outcomes is causal or due mainly to selection. We use data on about 100,000 Swedish full biological siblings, born in 1948-63, and perform cross-section and sibling-difference estimations. Outcomes are measured as educational attainment in 1996.
Our cross-section analysis shows the expected negative and significant relationship, while the relationship is not significant, though precisely estimated, in the sibling-difference analysis. This finding was robust to the sensitivity tests performed and is consistent with selection, rather than causation, being the explanation for the negative relationship.

Aggregate Investment and Political Instability: An Econometric Investigation
ECONOMICA, Issue 279 2003. Nauro F. Campos.
Although in theory the long-run effect of uncertainty on investment is ambiguous, available econometric evidence widely supports a negative association between aggregate investment and political instability. A shortcoming of this body of evidence is that it has failed to investigate the existence and direction of causality between these two variables. This paper fills this gap by testing for such a causal and negative long-run relationship between political instability and investment. We find there is a causal relation going from instability to investment, but it is positive and particularly strong in low-income countries. This finding is robust to various sensitivity checks.

Variations in α-hexachlorocyclohexane enantiomer ratios in relation to microbial activity in a temperate estuary
ENVIRONMENTAL TOXICOLOGY & CHEMISTRY, Issue 7 2003. Tiruponithura V. Padma.
Changes in the enantiomer ratios (ERs) of chiral pollutants in the environment are often considered evidence of biological alteration despite the lack of data on causal or mechanistic relationships between microbial parameters and ER values. Enantiomer ratios that deviate from 1:1 in the environment provide evidence for the preferential microbial degradation of one enantiomer, whereas ER values equal to 1 provide no evidence for microbial degradation and may mistakenly be interpreted as evidence that biodegradation is not important.
In an attempt to link biological and geochemical information related to enantioselective processes, we measured the ERs of the chiral pesticide α-hexachlorocyclohexane (α-HCH) and bacterial activity (normalized to abundance) in surface waters of the York River (VA, USA) bimonthly throughout one year. Despite lower overall α-HCH concentrations, α-HCH ER values were unexpectedly close to 1:1 in the freshwater region of the estuary with the highest bacterial activity. In contrast, ER values were nonracemic (ER ≠ 1) and α-HCH concentrations were significantly higher in the higher-salinity region of the estuary, where bacterial activity was lower. Examination of these data may indicate that racemic environmental ER values are not necessarily reflective of a lack of biodegradation or recent input into the environment, and that nonenantioselective biodegradation may be important in certain areas.

Development and evaluation of consensus-based sediment effect concentrations for polychlorinated biphenyls
ENVIRONMENTAL TOXICOLOGY & CHEMISTRY, Issue 5 2000. Donald D. MacDonald.
Sediment-quality guidelines (SQGs) have been published for polychlorinated biphenyls (PCBs) using both empirical and theoretical approaches. Empirically based guidelines have been developed using the screening-level concentration, effects range, effects level, and apparent effects threshold approaches. Theoretically based guidelines have been developed using the equilibrium-partitioning approach. Empirically based guidelines were classified into three general categories, in accordance with their original narrative intents, and used to develop three consensus-based sediment effect concentrations (SECs) for total PCBs (tPCBs), including a threshold effect concentration, a midrange effect concentration, and an extreme effect concentration.
Consensus-based SECs were derived because they estimate the central tendency of the published SQGs and, thus, reconcile the guidance values that have been derived using various approaches. Initially, consensus-based SECs for tPCBs were developed separately for freshwater sediments and for marine and estuarine sediments. Because the respective SECs were statistically similar, the underlying SQGs were subsequently merged and used to formulate more generally applicable SECs. The three consensus-based SECs were then evaluated for reliability using matching sediment chemistry and toxicity data from field studies, dose-response data from spiked-sediment toxicity tests, and SQGs derived from the equilibrium-partitioning approach. The results of this evaluation demonstrated that the consensus-based SECs can accurately predict both the presence and absence of toxicity in field-collected sediments. Importantly, the incidence of toxicity increases incrementally with increasing concentrations of tPCBs. Moreover, the consensus-based SECs are comparable to the chronic toxicity thresholds that have been estimated from dose-response data and equilibrium-partitioning models. Therefore, consensus-based SECs provide a unifying synthesis of existing SQGs, reflect causal rather than correlative effects, and accurately predict sediment toxicity in PCB-contaminated sediments.

Risk factors for epiploic foramen entrapment colic in a UK horse population: a prospective case-control study
EQUINE VETERINARY JOURNAL, Issue 4 2008. D. C. Archer.
Reasons for performing study: Epiploic foramen entrapment (EFE) is a common cause of small intestinal strangulation in the horse and its epidemiology requires further investigation. Objectives: To identify horse- and management-level risk factors for EFE and to explore reasons for the apparent seasonality of this condition.
Hypothesis: Horses exhibiting certain behaviours and those exposed to particular management practices that vary seasonally are at increased risk of EFE. Methods: A prospective unmatched, multicentre case-control study was conducted over 24 months in the UK. Data on 77 cases and 216 control horses were obtained from 9 collaborating clinics, and logistic regression was used to identify associations between horse and management variables and the likelihood of EFE. Results: In a final multivariable model, crib-biting/windsucking behaviour was associated with the largest increase in likelihood of EFE. A history of colic in the previous 12 months, increased stabling in the previous 28 days and height of the horse also increased the likelihood of EFE. Horses with access to a mineral/salt lick, those easily frightened and horses not fed at the same time as others were at reduced risk of EFE. Conclusions: Horses exhibiting certain behaviours, those with a previous history of colic and horses of greater height appear to be at inherently greater risk of EFE. The increase in likelihood of EFE with increased duration of stabling may explain the apparent seasonality of this condition. Potential relevance: These findings assist identification of horses at high risk of EFE and provide information on management strategies that may reduce this risk. If the observed associations are causal, avoiding sudden increases in duration of stabling, not feeding horses in the same group at the same time and providing a mineral/salt lick may reduce the likelihood of EFE. The risk factors identified in this study provide important clues to the aetiology of EFE.

Patients' health beliefs and coping prior to autologous peripheral stem cell transplantation
EUROPEAN JOURNAL OF CANCER CARE, Issue 2 2007. E. Frick, MD.
The aim of this study was to determine the associations between health locus of control (LoC), causal attributions and coping in tumour patients prior to autologous peripheral blood stem cell transplantation. Patients completed the Questionnaire of Health Related Control Expectancies, the Questionnaire of Personal Illness Causes (QPIC), and the Freiburg Questionnaire of Coping with Illness. A total of 126 patients (45% women; 54% suffering from a multiple myeloma, 29% from non-Hodgkin lymphomas, and 17% from other malignancies) participated in the study. Cluster analysis yielded four LoC clusters: 'fatalistic external', 'powerful others', 'yeah-sayer' and 'double external'. Self-blaming QPIC items were positively correlated with depressive coping, and 'fate or destiny' attributions with religious coping (P < 0.001). The highest scores were found for 'active coping' in the LoC clusters 'powerful others' and 'yeah-sayer'. External LoC and an active coping style prevail before undergoing autologous peripheral blood stem cell transplantation, whereas depressive coping is less frequent and is associated with self-blaming causal attributions. Health beliefs include causal and control attributions, which can improve or impair the patient's adjustment. A mixture of internal and external attributions seems to be most adaptive.

Hypertrophic cardiomyopathy: from genetics to treatment
EUROPEAN JOURNAL OF CLINICAL INVESTIGATION, Issue 4 2010. Ali J. Marian.
Eur J Clin Invest 2010; 40 (4): 360-369. Background: Hypertrophic cardiomyopathy (HCM) is the prototypic form of pathological cardiac hypertrophy. HCM is an important cause of sudden cardiac death in the young and a major cause of morbidity in the elderly. Design: We discuss the clinical implications of recent advances in the molecular genetics of HCM. Results: The current diagnosis of HCM is neither adequately sensitive nor specific.
Partial elucidation of the molecular genetic basis of HCM has raised interest in genetic-based diagnosis and management. Over a dozen causal genes have been identified. MYH7 and MYBPC3 mutations account for about 50% of cases. The remaining known causal genes are uncommon and some are rare. Advances in DNA sequencing techniques have made genetic screening practical. The difficulty, particularly in sporadic cases and in small families, is to discern the causal from the non-causal variants. Overall, the causal mutations alone have limited implications in risk stratification and prognostication, as the clinical phenotype arises from complex and often non-linear interactions between various determinants. Conclusions: The clinical phenotype of 'HCM' results from mutations in sarcomeric proteins and subsequent activation of multiple cellular constituents, including signal transducers. We advocate that HCM, despite its current recognition and management as a single disease entity, involves multiple partially independent mechanisms, despite similarity in the ensuing phenotype. To treat HCM effectively, it is necessary to delineate the underlying fundamental mechanisms that govern the pathogenesis of the phenotype and apply these principles to the treatment of each subset of clinically recognized HCM.

Therapeutic human papillomavirus DNA vaccination strategies to control cervical cancer
EUROPEAN JOURNAL OF IMMUNOLOGY, Issue 2 2007. T.-C. Wu.
A persistent human papillomavirus (HPV) infection is considered causal and necessary for the continued growth of cervical cancer. Thus, vaccination against HPV represents a plausible approach to prevent and treat cervical cancer.
A report in the current issue of the European Journal of Immunology describes a therapeutic HPV DNA vaccination strategy using the HPV-16 E7 antigen fused to the invariant chain to enhance the E7-specific CD8+ and CD4+ T cell immune responses, resulting in a potent anti-tumor effect against E7-expressing tumors. Continued exploration of HPV therapeutic DNA vaccines may lead to eventual clinical application. See accompanying article http://dx.doi.org/10.1002/eji.200636233

Partial imaginary precursor cancelling in DFE for BPSK and GMSK modulations
EUROPEAN TRANSACTIONS ON TELECOMMUNICATIONS, Issue 6 2001. Dan Raphaeli.
This paper examines the effect of partial imaginary precursor cancelling on the performance of a system which uses a decision-feedback equalizer (DFE) and employs a one-dimensional modulation. As opposed to QAM, BPSK or GMSK (after proper manipulations) require minimization of the mean square (MS) of the error's real part only. Therefore, a lower MSE is achieved if the DFE coefficients are computed taking this fact into account, by trying to minimize as few imaginary precursors as possible. However, this introduces a delay, since the imaginary part of the feedback filter (FBF) is no longer causal. In this paper we investigate the influence of the length of imaginary precursor cancelling on system performance, and its effect on carrier phase tracking. The equations for the computation of the DFE coefficients for none or partially cancelled imaginary precursors are derived, and performance on GMSK channels is presented.

THE CONTRIBUTION OF AN HOURGLASS TIMER TO THE EVOLUTION OF PHOTOPERIODIC RESPONSE IN THE PITCHER-PLANT MOSQUITO, WYEOMYIA SMITHII
EVOLUTION, Issue 10 2003. W. E. Bradshaw.
Photoperiodism, the ability to assess the length of day or night, enables a diverse array of plants, birds, mammals, and arthropods to organize their development and reproduction in concert with the changing seasons in temperate climatic zones.
For more than 60 years, the mechanism controlling photoperiodic response has been debated. Photoperiodism may be a simple interval timer, that is, an hourglass-like mechanism that literally measures the length of day or night, or, alternatively, may be an overt expression of an underlying circadian oscillator. Herein, we test experimentally whether the rhythmic response in Wyeomyia smithii indicates a causal, necessary relationship between circadian rhythmicity and the evolutionary modification of photoperiodic response over the climatic gradient of North America, or may be explained by a simple interval timer. We show that a day-interval timer is sufficient to predict the photoperiodic response of W. smithii over this broad geographic range and conclude that rhythmic responses observed in classical circadian-based experiments alone cannot be used to infer a causal role for circadian rhythmicity in the evolution of photoperiodic time measurement. More importantly, we argue that the pursuit of circadian rhythmicity as the central mechanism that measures the duration of night or day has distracted researchers from consideration of the interval-timing processes that may actually be the target of natural selection linking internal photoperiodic time measurement to the external seasonal environment.

Flux compactification of M-theory on compact manifolds with Spin(7) holonomy
FORTSCHRITTE DER PHYSIK/PROGRESS OF PHYSICS, Issue 11-12 2005. D. Constantin.
At the leading order, M-theory admits minimal supersymmetric compactifications if the internal manifold has exceptional holonomy. The inclusion of non-vanishing fluxes in M-theory and string theory compactifications induces a superpotential in the lower-dimensional theory, which depends on the fluxes. In this work, we check the conjectured form of this superpotential in the case of warped M-theory compactifications on Spin(7) holonomy manifolds.
We perform a Kaluza-Klein reduction of the eleven-dimensional supersymmetry transformation for the gravitino, and we find the superpotential expression by direct comparison. We check the conjecture for the heterotic string compactified on a Calabi-Yau three-fold as well. The conjecture can be checked indirectly by inspecting the scalar potential obtained after the compactification of M-theory on Spin(7) holonomy manifolds with non-vanishing fluxes. The scalar potential can be written in terms of the superpotential, and we show that this potential stabilizes all the moduli fields describing deformations of the metric except for the radial modulus. All the above analyses require the knowledge of the minimal supergravity action in three dimensions. Therefore we calculate the most general causal N = 1 three-dimensional, gauge-invariant action coupled to matter in superspace and derive its component form using Ectoplasmic integration theory. We also show that the three-dimensional theory which results from the compactification is in agreement with the more general supergravity construction. The compactification procedure takes into account higher-order quantum correction terms in the low-energy effective action. We analyze the properties of these terms on a Spin(7) background. We derive a perturbative set of solutions which emerges from a warped compactification on a Spin(7) holonomy manifold with non-vanishing flux for the M-theory field strength, and we show that in general the Ricci flatness of the internal manifold is lost, which means that the supergravity vacua are deformed away from the exceptional holonomy. Using the superpotential form we identify the supersymmetric vacua out of this general set of solutions.
PERSPECTIVE: Rethinking the value of high wood density
FUNCTIONAL ECOLOGY, Issue 4 2010. Markku Larjavaara.
Summary:
1. Current thinking holds that wood density mediates a tradeoff between strength and economy of construction, with higher wood density providing higher strength but at higher cost.
2. Yet the further away wood fibres are from the central axis of the trunk, the more they increase the strength of the trunk; thus, a fat trunk of low-density wood can achieve greater strength at lower construction cost than a thin trunk of high-density wood.
3. What then are the countervailing advantages of high wood density?
4. We hypothesize that high wood density is associated with lower maintenance costs due to lower trunk surface area, as surface area correlates with maintenance respiration.
5. This advantage would be particularly important to long-lived trees and could in part explain why they tend to have high wood density.
6. High wood density has also been associated with lower risk of trunk breakage, xylem implosion and pathogen invasion, but we argue that these relationships are not causal and instead reflect correlated selection on other traits of value to long-lived trees.
7. This revaluation of the costs and benefits of high wood density has important implications for understanding tree life-history evolution, functional diversity, forest carbon stocks and the impacts of global change.

Fitness consequences of temperature-mediated egg size plasticity in a butterfly
FUNCTIONAL ECOLOGY, Issue 6 2003. K. Fischer.
Summary:
1. By randomly dividing adult females of the butterfly Bicyclus anynana, reared in a common environment, among high and low temperatures, it is demonstrated that oviposition temperature induces a plastic response in egg size. Females kept at a lower temperature laid significantly larger eggs than those ovipositing at a higher temperature.
2. Cross-transferring the experimentally manipulated eggs between temperatures and investigating hatching success showed that a lower rearing temperature is more detrimental for the smaller eggs produced at a higher temperature than for the larger eggs produced at a lower temperature, supporting an adaptive explanation.
3. However, when examining two potential mechanisms for an increased fitness of larger offspring (higher desiccation resistance of larger eggs and higher starvation resistance of larger hatchlings), no direct link between egg size and offspring fitness was found. Throughout, i.e. even under benign conditions, larger offspring had a higher fitness.
4. Therefore, egg size should be viewed as a conveniently measurable proxy for the plastic responses induced by temperature, but caution is needed before implying that egg size per se is causal in influencing offspring traits.

Perils and pitfalls of permutation tests for distinguishing the effects of neighbouring polymorphisms
GENETIC EPIDEMIOLOGY, Issue 7 2006. Joanna M. Biernacka.
In a small region several marker loci may be associated with a trait, either because they directly influence the trait or because they are in linkage disequilibrium (LD) with a causal variant. Having established a potentially causal effect at a primary variant, we may ask if any other variants in the region appear to further contribute to the trait, indicating that the additional variant is either causal or is in LD with another causal locus. Methods of approaching this problem using case-parent trio data include the stepwise conditional logistic regression approach described by Cordell and Clayton ([2002] Am. J. Hum. Genet. 70:124-141), and a constrained-permutation method recently proposed by Spijker et al. ([2005] Ann. Hum. Genet. 69:90-101). Through simulation we demonstrate that the procedure described by Spijker et al.
[2005], as well as unconditional logistic regression with "affected family-based controls" (AFBACs), can lead to inflated type 1 errors in situations when haplotypes are not inferable for all trios, whereas the conditional logistic regression approach gives correct significance levels. We propose an alternative to the permutation method of Spijker et al. [2005], which does not rely on haplotyping and results in correct type 1 errors and potentially high power when assumptions of random mating, Hardy-Weinberg equilibrium, and multiplicative effects of disease alleles are satisfied. Genet. Epidemiol. 2006. © 2006 Wiley-Liss, Inc.

A Structural Equation Approach to Models with Spatial Dependence
GEOGRAPHICAL ANALYSIS, Issue 2 2008. Johan H. L. Oud.
We introduce the class of structural equation models (SEMs) and corresponding estimation procedures into a spatial dependence framework. SEM allows both latent and observed variables within one and the same (causal) model. Compared with models with observed variables only, this feature makes it possible to obtain a closer correspondence between theory and empirics, to explicitly account for measurement errors, and to reduce multicollinearity. We extend the standard SEM maximum likelihood estimator to allow for spatial dependence and propose easily accessible SEM software like LISREL 8 and Mx. We present an illustration based on Anselin's Columbus, OH, crime data set. Furthermore, we combine the spatial lag model with the latent multiple-indicators, multiple-causes (MIMIC) model and discuss estimation of this latent spatial lag model. We present an illustration based on the Anselin crime data set again.

The Ideal Explanatory Text in History: A Plea for Ecumenism
HISTORY AND THEORY, Issue 3 2004. Tor Egil Førland.
This article presents Peter Railton's analysis of scientific explanation and discusses its application in historiography.
Although Railton thinks covering laws are basic in explanation, his account is far removed from Hempel's. The main feature of Railton's account is its ecumenism. The "ideal explanatory text," a central concept in Railton's analysis, has room for not only causal and intentional, but also structural and functional explanations. The essay shows this by analyzing a number of explanations in history. In Railton's terminology, all information that reduces our insecurity as to what the explanandum is due to is explanatory. In the "encyclopedic ideal explanatory text," different kinds of explanation converge on the explanandum from different starting points. By incorporating pragmatic aspects, Railton's account is well suited to show how explanations in historiography can be explanatory despite their lack of covering laws or tendency statements. Railton's account is also dynamic, showing how the explanatory quest is a never-ending search for better illumination of the ideal explanatory text. Railton's analysis is briefly compared to, and found compatible with, views on explanation presented by David Lewis, C. Behan McCullagh, and R. G. Collingwood. Confronted with Hans-Georg Gadamer's hermeneutics and Donald Davidson's insistence on the indeterminacy of interpretation, the essay suggests that the objectivity of the ideal explanatory text should be regarded as local, limited to the description under which the action is seen.

Executive functioning by 18-24-month-old children: effects of inhibition, working memory demands and narrative in a novel detour-reaching task
INFANT AND CHILD DEVELOPMENT, Issue 5 2006. Nicola McGuigan.
Infants can inhibit a prepotent but wrong action towards a goal in order to perform a causal means-action. It is not clear, however, whether infants can perform an arbitrary means-action while inhibiting a prepotent response. In four experiments, we explore this executive functioning in 18-24-month-old children.
The working memory and inhibition demands in a novel means-end problem were systematically varied in terms of the type and combination of means-action(s) (causal or arbitrary) contained within the task, the number of means-actions (1 or 2), the visual availability of the goal, and whether the task was accompanied by a narrative. Experiments 1 and 2 showed that children performed tasks that contained causal as opposed to arbitrary information more accurately; accuracy was also higher in tasks containing only one step. Experiment 2 also demonstrated that performance in the arbitrary task improved significantly when all sources of prepotency were removed. In Experiment 3, task performance improved when the two means-actions were intelligibly linked to the task goal. Experiment 4 demonstrated that the use of a narrative that provided a meaningful (non-causal) link between the two means-actions also improved children's performance by assisting their working memory in the generation of a rationale. The findings provide an initial account of executive functioning in the months that mark the end of infancy. Copyright © 2006 John Wiley & Sons, Ltd. [source] Sequential analysis of lines of evidence: an advanced weight-of-evidence approach for ecological risk assessment INTEGRATED ENVIRONMENTAL ASSESSMENT AND MANAGEMENT, Issue 4 2006 Ruth N Hull Abstract Weight-of-evidence (WOE) approaches have been used in ecological risk assessment (ERA) for many years. The approaches integrate various types of data (e.g., from chemistry, bioassay, and field studies) to make an overall conclusion of risk. However, the current practice of WOE has several important difficulties, including a lack of transparency related to how each line of evidence is weighted or integrated into the overall weight-of-evidence conclusion. Therefore, a sequential analysis of lines of evidence (SALE) approach has been developed that advances the practice of WOE.
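One quantitative line of evidence widely used for the risk rule-out step in screening-level ERA is the hazard quotient (HQ): the ratio of an exposure estimate to a conservative toxicity reference value, where HQ < 1 supports ruling risk out and HQ >= 1 only flags a chemical for further lines of evidence. The sketch below is a minimal illustration with hypothetical chemicals, concentrations, and reference values, not an implementation from the paper:

```python
# Hazard-quotient screening: HQ = exposure concentration / toxicity
# reference value (TRV). Under conservative assumptions, HQ < 1 supports
# ruling risk out; HQ >= 1 does not predict population-level risk by
# itself -- it only carries the chemical forward to other lines of evidence.
def hazard_quotient(exposure, trv):
    return exposure / trv

def screen(chemicals):
    """Partition chemicals into 'ruled out' and 'carry forward' lists."""
    ruled_out, carry_forward = [], []
    for name, exposure, trv in chemicals:
        if hazard_quotient(exposure, trv) < 1.0:
            ruled_out.append(name)
        else:
            carry_forward.append(name)
    return ruled_out, carry_forward

# Hypothetical soil concentrations (mg/kg) vs conservative guidelines.
chemicals = [
    ("cadmium", 0.4, 1.0),      # HQ = 0.40 -> ruled out
    ("zinc", 450.0, 200.0),     # HQ = 2.25 -> carry forward
    ("benzene", 0.02, 0.05),    # HQ = 0.40 -> ruled out
]
ruled_out, carry_forward = screen(chemicals)
```

This captures the asymmetry the SALE process emphasizes: a low conservative HQ can close a question, while a high one only opens the next line of evidence.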
It was developed for an ERA of chemical stressors but can also be used for nonchemical stressors and is equally applicable to aquatic and terrestrial environments. The sequential aspect of the SALE process is a significant advancement and is based on 2 primary ideas. First, risks can be ruled out with the use of certain lines of evidence, including modeled hazard quotients (HQs) and comparisons of soil, water, or sediment quality with conservative quality guidelines. Thus, the SALE process recognizes that HQs are most useful in ruling out risk rather than predicting risk to ecological populations or communities. Second, the SALE process provides several opportunities to exit the risk assessment process, not only when risks are ruled out, but also when the magnitude of effect is acceptable or when little or no evidence exists that associations between stressors and effects may be causal. Thus, the SALE approach explicitly includes interaction between assessors and managers. It illustrates to risk managers how risk management can go beyond the simple derivation of risk-based concentrations of chemicals of concern to risk management goals based on ecological metrics (e.g., species diversity). It can also be used to stimulate discussion of the limitations of ERA science and of how scientists deal with uncertainty. It should assist risk managers by allowing their decisions to be based on a sequential, flexible, and transparent process that includes direct toxicity risks, indirect risks (via changes in habitat suitability), and the spatial and temporal factors that can influence the risk assessment. [source] CYP1A1 variants and smoking-related lung cancer in San Francisco Bay Area Latinos and African Americans INTERNATIONAL JOURNAL OF CANCER, Issue 1 2005 Margaret R.
Wrensch Abstract We examined CYP1A1 T6235C (M1) and A4889G (M2) polymorphisms in San Francisco Bay Area African Americans and Latinos who were newly diagnosed with primary lung cancer from September 1998 to November 2002 and in age-gender-ethnicity frequency-matched controls. Owing mainly to rapid mortality of cases, overall percentages of cases genotyped were 26% and 32% for Latinos and African Americans, respectively. CYP1A1 variants were genotyped for Latinos (104 cases, 278 controls) and African Americans (226 cases, 551 controls). M1 and M2 frequencies in controls were 0.23 and 0.02 for African Americans and 0.38 and 0.29 for Latinos. In Latinos, the overall inverse odds ratio (OR) of 0.51 (95% CI = 0.32-0.81) for the M1 variant genotype resulted from an inverse interaction with smoking. Nonsmokers with the M1 genotype had a slightly elevated OR (1.5; 0.59-3.7), but those with less than 30 or with 30 or more pack-years of smoking history had 0.20 (0.06-0.70) and 0.21 (0.06-0.81) times (about one-fifth) the odds expected if smoking and genotype were independent lung cancer risk factors. African Americans had interactions of similar magnitude that were not statistically significant. Results for M2 were very similar. The inverse interactions of CYP1A1 variants and smoking-associated lung cancer risk in Latinos might be causal, might be due to undetected bias or confounding, or might represent a unique linkage disequilibrium between a new lung cancer locus and CYP1A1 in this highly admixed population. [source] From dynamic influence nets to dynamic Bayesian networks: A transformation algorithm INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, Issue 8 2009 Sajjad Haider This paper presents an algorithm to transform a dynamic influence net (DIN) into a dynamic Bayesian network (DBN). The transformation aims to bring the best of both probabilistic reasoning paradigms. The advantages of DINs lie in their ability to represent causal and time-varying information in a compact and easy-to-understand manner.
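The mid-execution evidence updating that motivates converting a DIN into a DBN is, at its core, Bayesian filtering over time slices: push the current belief through the transition model, then reweight it by the likelihood of the observed evidence. The sketch below is a minimal predict-then-correct step on a binary hidden state; the transition and observation probabilities are hypothetical, and this is not the paper's transformation algorithm:

```python
# Minimal DBN filtering step for a binary hidden state.
# T[i][j] = P(next state = j | current state = i); hypothetical numbers.
T = [[0.9, 0.1],
     [0.3, 0.7]]
# O[i][k] = P(observation = k | state = i); hypothetical numbers.
O = [[0.8, 0.2],
     [0.25, 0.75]]

def update_belief(belief, obs):
    """One predict-then-correct step: propagate the belief through the
    transition model, reweight by the evidence likelihood, renormalize."""
    predicted = [sum(belief[i] * T[i][j] for i in range(2)) for j in range(2)]
    weighted = [predicted[j] * O[j][obs] for j in range(2)]
    z = sum(weighted)
    return [w / z for w in weighted]

# Evidence arriving during COA execution shifts the belief toward state 1,
# the operation a DIN alone cannot perform.
belief = update_belief([0.5, 0.5], obs=1)
```

Each new piece of evidence is absorbed by calling `update_belief` again on the current belief, which is exactly what the DIN representation lacks and the equivalent DBN provides.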
They facilitate a system modeler in connecting a set of desired effects and a set of actionable events through a series of dynamically changing cause-and-effect relationships. The resultant probabilistic model is then used to analyze different courses of action in terms of their effectiveness in achieving the desired effect(s). The major drawback of DINs is their inability to incorporate evidence that arrives during the execution of a course of action (COA). Several belief-updating algorithms, on the other hand, have been developed for DBNs that enable a system modeler to insert evidence into dynamic probabilistic models. Dynamic Bayesian networks, however, suffer from the intractability of knowledge acquisition. The presented transformation algorithm combines the advantages of both DINs and DBNs. It enables a system analyst to capture a complex situation using a DIN and pick the best (or close-to-best) COA that maximizes the likelihood of achieving the desired effect. During execution, if evidence becomes available, the DIN is converted into an equivalent DBN and the beliefs of other nodes in the network are updated. If required, the selected COA can be revised on the basis of the recently received evidence. The presented methodology is applicable in domains requiring strategic-level decision making in highly complex situations, such as war games, real-time strategy video games, and business simulation games. © 2009 Wiley Periodicals, Inc. [source] Delay-dependent robust control for singular discrete-time Markovian jump systems with time-varying delay INTERNATIONAL JOURNAL OF ROBUST AND NONLINEAR CONTROL, Issue 10 2010 Wuneng Zhou Abstract The problem of delay-dependent robust stabilization for uncertain singular discrete-time systems with Markovian jumping parameters and time-varying delay is investigated.
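The system class named here is conventionally written in the following form (a standard formulation in this literature; the paper's exact matrices and notation may differ):

```latex
E\,x(k+1) = A(r_k)\,x(k) + A_d(r_k)\,x(k - d(k)) + B(r_k)\,u(k),
\qquad d_m \le d(k) \le d_M ,
```

where $E$ may be singular ($\operatorname{rank} E \le n$), $r_k$ is a finite-state Markov chain governing the jumps, and $d(k)$ is the bounded time-varying delay. For each mode $i$, the pair $(E, A(i))$ is called regular if $\det(zE - A(i))$ is not identically zero and causal if $\deg \det(zE - A(i)) = \operatorname{rank} E$; regularity, causality, and stochastic stability are precisely the properties the delay-dependent conditions are designed to guarantee.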
In terms of the free-weighting-matrix approach and linear matrix inequalities, a delay-dependent condition is presented that ensures a singular discrete-time system is regular, causal, and stochastically stable, based on which the stability analysis and robust stabilization problems are studied. An explicit expression for the desired state-feedback controller is also given. Some numerical examples are provided to demonstrate the effectiveness of the proposed approach. Copyright © 2009 John Wiley & Sons, Ltd. [source] Stability and stabilization of discrete-time singular Markov jump systems with time-varying delay INTERNATIONAL JOURNAL OF ROBUST AND NONLINEAR CONTROL, Issue 5 2010 Shuping Ma Abstract The stochastic stability and stochastic stabilization of discrete-time singular Markov jump systems with time-varying delay are discussed. For the cases of both full and partial knowledge of the transition probabilities, delay-dependent linear matrix inequality (LMI) conditions for the systems to be regular, causal, and stochastically stable are given. Sufficient conditions are proposed for the existence of a state feedback controller in terms of LMIs. Finally, two numerical examples are given to illustrate the effectiveness of the method. Copyright © 2009 John Wiley & Sons, Ltd. [source] Robust monotone gradient-based discrete-time iterative learning control INTERNATIONAL JOURNAL OF ROBUST AND NONLINEAR CONTROL, Issue 6 2009 D. H. Owens Abstract This paper considers the use of matrix models and the robustness of a gradient-based iterative learning control (ILC) algorithm using both fixed learning gains and nonlinear data-dependent gains derived from parameter optimization. The philosophy of the paper is to ensure monotonic convergence with respect to the mean-square value of the error time series. The paper provides a complete and rigorous analysis of the systematic use of the well-known matrix models in ILC. Matrix models provide necessary and sufficient conditions for robust monotonic convergence.
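On a finite time interval the plant can be lifted into a matrix model e = r - Gu, with G the lower-triangular Toeplitz matrix of Markov parameters; the fixed-gain gradient update u_{k+1} = u_k + beta * G' e_k then gives e_{k+1} = (I - beta * G G') e_k, so the mean-square error decreases monotonically whenever 0 < beta < 2/||G||^2. The sketch below illustrates this with a hypothetical first-order plant and a fixed gain (not the paper's parameter-optimized variant):

```python
# Gradient ILC on the lifted model e = r - G u. G is the lower-triangular
# Toeplitz matrix of Markov parameters of a hypothetical plant
# y(t+1) = 0.9 y(t) + u(t); update law: u <- u + beta * G^T e.
N = 10
G = [[0.9 ** (i - j) if i >= j else 0.0 for j in range(N)] for i in range(N)]

def matvec(M, v):
    """Compute M v."""
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def transpose_matvec(M, v):
    """Compute M^T v."""
    return [sum(M[i][j] * v[i] for i in range(len(M))) for j in range(len(M[0]))]

r = [1.0] * N       # reference trajectory over the finite interval
u = [0.0] * N       # initial input for trial 0
beta = 0.02         # small enough that I - beta*G*G^T is a contraction

norms = []          # squared error norm per trial
for _ in range(30):
    e = [ri - yi for ri, yi in zip(r, matvec(G, u))]
    norms.append(sum(ei * ei for ei in e))
    u = [ui + beta * gi for ui, gi in zip(u, transpose_matvec(G, e))]
```

With beta below the monotonicity bound, the squared error norm shrinks at every trial; this trial-to-trial contraction is the mean-square monotonic convergence property the matrix-model analysis formalizes.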
They also permit the construction of accurate sufficient frequency-domain conditions for robust monotonic convergence on finite time intervals for both causal and non-causal controller dynamics. The results are compared with recently published results for robust inverse-model-based ILC algorithms, and it is seen that the algorithm has the potential to improve the robustness to high-frequency modelling errors, provided that resonances within the plant bandwidth have been suppressed by feedback or series compensation. Copyright © 2008 John Wiley & Sons, Ltd. [source] Robust stabilization for uncertain discrete singular systems with state delay INTERNATIONAL JOURNAL OF ROBUST AND NONLINEAR CONTROL, Issue 16 2008 Zhengguang Wu Abstract The robust stabilization problem for uncertain discrete singular time-delay systems is addressed in this paper. In terms of a strict linear matrix inequality and a finite sum inequality, a delay-dependent criterion for the nominal system to be admissible is obtained. Based on this criterion, a state feedback controller, which guarantees that, for all admissible uncertainties, the resulting closed-loop system is regular, causal, and stable, is constructed. An explicit expression for the desired controller is also given. The obtained results cover both the delay-independent and delay-dependent cases. Some numerical examples are introduced to show the effectiveness of the given results. Copyright © 2008 John Wiley & Sons, Ltd. [source] Reviews: A review of hereditary and acquired coagulation disorders in the aetiology of ischaemic stroke INTERNATIONAL JOURNAL OF STROKE, Issue 5 2010 Lonneke M. L. De Lau The diagnostic workup in patients with ischaemic stroke often includes testing for prothrombotic conditions. However, the clinical relevance of coagulation abnormalities in ischaemic stroke is uncertain.
Therefore, we reviewed what is presently known about the association between inherited and acquired coagulation disorders and ischaemic stroke, with a special emphasis on the methodological aspects. Good-quality data in this field are scarce, and most studies fall short on epidemiological criteria for causal inference. While inherited coagulation disorders are recognised risk factors for venous thrombosis, there is no substantial evidence for an association with arterial ischaemic stroke. Possible exceptions are the prothrombin G20210A mutation in adults and protein C deficiency in children. There is proof of an association between the antiphospholipid syndrome and ischaemic stroke, but the clinical significance of isolated mildly elevated antiphospholipid antibody titres is unclear. Evidence also suggests significant associations of increased homocysteine and fibrinogen concentrations with ischaemic stroke, but whether these associations are causal is still debated. Data on other acquired coagulation abnormalities are insufficient to allow conclusions regarding causality. For most coagulation disorders, a causal relation with ischaemic stroke has not been definitely established. Hence, at present, there is no valid indication for testing all patients with ischaemic stroke for these conditions. Large prospective population-based studies allowing the evaluation of interactive and subgroup effects are required to appreciate the role of coagulation disorders in the pathophysiology of arterial ischaemic stroke and to guide the management of individual patients. [source] |