Random

Distribution by Scientific Domains
Distribution within Life Sciences

Terms modified by Random

  • random access memory
  • random allocation
  • random amplification
  • random amplified polymorphic dna
  • random amplified polymorphic dna marker
  • random arrangement
  • random assignment
  • random association
  • random assumption
  • random change
  • random coefficient
  • random coil
  • random coil conformation
  • random component
  • random copolymer
  • random copolymerization
  • random d
  • random distribution
  • random disturbance
  • random dynamical system
  • random effect
  • random effect model
  • random effects
  • random effects distribution
  • random effects meta-analysis
  • random effects model
  • random effects models
  • random element
  • random environment
  • random error
  • random expectation
  • random factor
  • random fashion
  • random field
  • random fluctuation
  • random forest
  • random genetic drift
  • random glucose
  • random graph
  • random laser
  • random locations
  • random mating
  • random matrix theory
  • random media
  • random model
  • random mutagenesis
  • random mutation
  • random nature
  • random network
  • random noise
  • random number
  • random order
  • random orientation
  • random packing
  • random parameter
  • random pattern
  • random perturbation
  • random phase approximation
  • random placement
  • random point
  • random poly
  • random population
  • random primer
  • random process
  • random sample
  • random sampling
  • random sampling design
  • random sampling method
  • random selection
  • random sequence
  • random site
  • random subsample
  • random subset
  • random system
  • random term
  • random time
  • random tree
  • random variable
  • random variation
  • random vector
  • random vibration
  • random walk
  • random walk model

  • Selected Abstracts


    The Random-Facet simplex algorithm on combinatorial cubes

    RANDOM STRUCTURES AND ALGORITHMS, Issue 3 2002
    Bernd Gärtner
    The RANDOM-FACET algorithm is a randomized variant of the simplex method which is known to solve any linear program with n variables and m constraints using an expected number of pivot steps which is subexponential in both n and m. This is the theoretically fastest simplex algorithm known to date if m ≈ n; it provably beats most of the classical deterministic variants, which require exp(Ω(n)) pivot steps in the worst case. RANDOM-FACET was independently discovered and analyzed ten years ago by Kalai as a variant of the primal simplex method, and by Matoušek, Sharir, and Welzl in a dual form. The essential ideas and results connected to RANDOM-FACET can be presented in a particularly simple and instructive way for the case of linear programs over combinatorial n-cubes. I derive an explicit upper bound on the expected number of pivot steps in this case, using a new technique of "fingerprinting" pivot steps. This bound also holds for generalized linear programs, similar flavors of which have been introduced and studied by several researchers. I then review an interesting class of generalized linear programs, due to Matoušek, showing that RANDOM-FACET may indeed require a superpolynomial expected number of pivot steps in the worst case. The main new result of the paper is a proof that all actual linear programs in Matoušek's class are solved by RANDOM-FACET with an expected polynomial number of pivot steps. This proof exploits a combinatorial property of linear programming which has only recently been discovered by Holt and Klee. The result establishes the first scenario in which an algorithm that works for generalized linear programs "recognizes" proper linear programs. Thus, despite Matoušek's worst-case result, the question remains open whether RANDOM-FACET (or any other simplex variant) is a polynomial-time algorithm for linear programming. Finally, I briefly discuss extensions of the combinatorial cube results to the general case. © 2002 Wiley Periodicals, Inc. Random Struct. Alg., 20:353–381, 2002 [source]
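    For readers who want the flavor of the recursion, here is a minimal sketch of RANDOM-FACET specialized to objective functions on the n-cube {0,1}^n. The linear objective, cube dimension, and starting vertex are illustrative assumptions, not details from the paper; the recursion itself (solve a random facet containing the current vertex, then pivot across it only if that improves the objective) is the algorithm the abstract describes.

```python
import random

def random_facet(f, x, free):
    """RANDOM-FACET on the n-cube: x is the current vertex (list of 0/1),
    free is the set of coordinates not yet fixed by the recursion."""
    if not free:
        return x
    i = random.choice(tuple(free))          # random facet {y : y_i = x_i} containing x
    w = random_facet(f, x, free - {i})      # solve that facet recursively
    w_opp = w.copy()
    w_opp[i] ^= 1                           # candidate pivot across the facet
    if f(w_opp) < f(w):                     # improving pivot: recurse on opposite facet
        return random_facet(f, w_opp, free - {i})
    return w

# demo on a random linear objective c . x over {0,1}^n
n = 12
c = [random.uniform(-1, 1) for _ in range(n)]
f = lambda y: sum(ci * yi for ci, yi in zip(c, y))
start = [random.randint(0, 1) for _ in range(n)]
opt = random_facet(f, start, set(range(n)))
assert opt == [0 if ci > 0 else 1 for ci in c]  # closed-form optimum for a linear f
```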


    Random controls on semi-rhythmic spacing of pools and riffles in constriction-dominated rivers

    EARTH SURFACE PROCESSES AND LANDFORMS, Issue 11 2001
    Douglas M. Thompson
    Abstract Average pool spacing between five and seven bankfull widths has been documented in environments throughout the world, but has limited theoretical justification in coarse-bedded and bedrock environments. Pool formation in coarse-bedded and bedrock channels has been attributed to bedrock and boulder constrictions. Because the spacing of these constrictions may be irregular in nature, it is difficult to reconcile pool-formation processes with the supposedly rhythmic spacing of pools and riffles. To address these issues, a simulation model for pool and riffle formation is used to demonstrate that semi-rhythmic spacing of pools with an approximate spacing of five to seven bankfull widths can be recreated from a random distribution of obstructions and minimum pool- and riffle-length criteria. It is assumed that a pool–riffle couplet will achieve a minimum length based on dominant-discharge conditions. Values for the minimum-length assumption are based on field data collected in New England and California, while the theoretical basis relies on the demonstrated hydraulic response of individual pools to elongation. Results from the simulations show that the location of pools can be primarily random in character, but still assume an average spacing between four and eight bankfull widths for a variety of conditions. Field verification data generally support the model but highlight a highly skewed distribution of pool-forming elements and pool spacing. The relation between pool spacing and bankfull widths is attributed to the common geometric response of these features to dominant-discharge conditions. Copyright © 2001 John Wiley & Sons, Ltd. [source]
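    The core mechanism can be sketched in a few lines: throw obstructions down a reach at random, but let a new pool form only if it is at least a minimum couplet length downstream of the previous one. The reach length, obstruction density, and minimum couplet length below are assumed placeholder values, not the paper's calibrated field values.

```python
import random

random.seed(1)
reach_len = 1000.0       # channel length, measured in bankfull widths
n_obstructions = 400     # randomly placed pool-forming obstructions
min_couplet = 4.0        # assumed minimum pool-riffle couplet length

obstructions = sorted(random.uniform(0, reach_len) for _ in range(n_obstructions))
pools = []
for x in obstructions:
    # an obstruction forms a pool only if the minimum couplet length is met
    if not pools or x - pools[-1] >= min_couplet:
        pools.append(x)

spacings = [b - a for a, b in zip(pools, pools[1:])]
print(sum(spacings) / len(spacings))  # mean spacing lands in the 5-7 width range
```

    With these (assumed) numbers the mean spacing is roughly the minimum couplet length plus the mean gap to the next random obstruction, which is how purely random pool-forming elements can still produce semi-rhythmic spacing.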


    The Effects of Random and Discrete Sampling when Estimating Continuous-Time Diffusions

    ECONOMETRICA, Issue 2 2003
    Yacine Aït-Sahalia
    High-frequency financial data are not only discretely sampled in time but the time separating successive observations is often random. We analyze the consequences of this dual feature of the data when estimating a continuous-time model. In particular, we measure the additional effects of the randomness of the sampling intervals over and beyond those due to the discreteness of the data. We also examine the effect of simply ignoring the sampling randomness. We find that in many situations the randomness of the sampling has a larger impact than the discreteness of the data. [source]
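    A toy simulation makes the dual feature concrete. The Ornstein–Uhlenbeck dynamics, parameter values, and naive squared-increment estimator below are illustrative assumptions, not the paper's model or estimators; the point is only that the same naive estimate comes out differently under fixed versus random (exponential) sampling intervals with the same mean spacing.

```python
import math, random

random.seed(0)
kappa, mu, sigma = 5.0, 0.0, 0.2   # assumed OU parameters (strong mean reversion)
T, mean_dt = 1_000.0, 0.1          # horizon and mean sampling interval

def simulate(random_times):
    """Simulate exact OU transitions and return sum(dX^2)/T, a naive sigma^2 estimate."""
    t, x, sum_sq = 0.0, mu, 0.0
    while t < T:
        dt = random.expovariate(1 / mean_dt) if random_times else mean_dt
        decay = math.exp(-kappa * dt)
        m = x * decay + mu * (1 - decay)                    # exact conditional mean
        v = sigma**2 * (1 - decay**2) / (2 * kappa)         # exact conditional variance
        x_new = random.gauss(m, math.sqrt(v))
        sum_sq += (x_new - x) ** 2
        x, t = x_new, t + dt
    return sum_sq / T

print("fixed  dt:", simulate(False))   # biased low by discreteness alone
print("random dt:", simulate(True))    # extra bias from sampling randomness
```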


    The rapid spread of invasive Eurasian Collared Doves Streptopelia decaocto in the continental USA follows human-altered habitats

    IBIS, Issue 3 2010
    IKUKO FUJISAKI
    Understanding factors related to the range expansion trajectory of a successful invasive species may provide insights into environmental variables that favour additional expansion or guide monitoring and survey efforts for this and other invasive species. We examined the relationship of presence and abundance of Eurasian Collared Doves Streptopelia decaocto to environmental factors using recent data from the North American Breeding Bird Survey to understand factors influencing its expansion into the continental USA. A zero-inflated Poisson (ZIP) model was used to account for excess zero observations because this species was not observed on the majority of survey routes, despite its large geographical range. Model fit was improved when we included environmental covariates as compared with the null model, which only included distance from the route where this species was first observed. Probability of zero count was positively related to the distance from the first route and road density and was inversely related to minimum temperature and distance to coast. Abundance of the species was positively related to road density and was inversely related to annual precipitation and distance to coast. Random intercept by land-cover type also improved model fit. Model fit was improved with the ZIP model over the standard Poisson model, suggesting that presence and abundance of this species are characterized by different environmental factors. However, overall low accuracy of model-predicted presence/absence and abundance with the independent validation dataset may indicate either that there are other explanatory factors or that there is great uncertainty in the species' colonization process. Our large-scale study provides additional evidence that the range expansion of this species tends to follow human-altered landscapes such as road and agricultural areas as well as responding to general geographical features such as coastlines or thermal clines. Such patterns may hold true for other invasive species and may provide guidelines for monitoring and assessment activities in other invasive taxa. [source]
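    For reference, a ZIP model like the one fitted here builds on the standard zero-inflated Poisson mixture; the covariate links shown are the conventional choices, not necessarily the paper's exact specification:

$$\Pr(Y=0)=\pi+(1-\pi)e^{-\lambda},\qquad \Pr(Y=k)=(1-\pi)\frac{e^{-\lambda}\lambda^{k}}{k!}\quad(k\ge 1),$$

    with $\operatorname{logit}(\pi)=\mathbf{z}^{\top}\boldsymbol{\gamma}$ and $\log(\lambda)=\mathbf{x}^{\top}\boldsymbol{\beta}$, so the zero-inflation (presence) part and the Poisson (abundance) part can depend on different environmental covariates, which is exactly why the two can be "characterized by different environmental factors."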


    PAQM: an adaptive and proactive queue management for end-to-end TCP congestion control

    INTERNATIONAL JOURNAL OF COMMUNICATION SYSTEMS, Issue 8 2004
    Seungwan Ryu
    Abstract Two functions, the congestion indicator (i.e. how to detect congestion) and the congestion control function (i.e. how to avoid and control congestion), are used at a router to support end-to-end congestion control in the Internet. Random early detection (RED) (IEEE/ACM Trans. Networking 1993; 1(4):397–413) enhanced the two functions by introducing queue length averaging and probabilistic early packet dropping. In particular, RED uses an exponentially weighted moving average (EWMA) queue length not only to detect incipient congestion but also to smooth the bursty incoming traffic and its resulting transient congestion. Following RED, many active queue management (AQM)-based extensions have been proposed. However, many AQM proposals have shown severe problems with detecting and controlling incipient congestion adaptively under dynamically changing network conditions. In this paper, we introduce and analyse a feedback control model of TCP/AQM dynamics. Then, we propose the Pro-active Queue Management (PAQM) mechanism, which is able to provide proactive congestion avoidance and control using an adaptive congestion indicator and a control function under a wide range of traffic environments. The PAQM stabilizes the queue length around the desired level while giving smooth and low packet loss rates and high network resource utilization. Copyright © 2004 John Wiley & Sons, Ltd. [source]
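    As background for the RED mechanism this abstract builds on, here is a minimal sketch of its two ingredients: the EWMA queue average and the linear drop-probability ramp between two thresholds. The parameter values are illustrative, and full RED's per-packet "count" correction between drops is omitted.

```python
import random

w_q, min_th, max_th, max_p = 0.002, 5, 15, 0.1   # illustrative RED parameters
avg = 0.0

def red_drop(queue_len):
    """Return True if the arriving packet should be dropped early."""
    global avg
    avg = (1 - w_q) * avg + w_q * queue_len          # EWMA queue length
    if avg < min_th:
        return False                                  # no congestion signal
    if avg >= max_th:
        return True                                   # forced drop
    p = max_p * (avg - min_th) / (max_th - min_th)    # linear probability ramp
    return random.random() < p

# feed the controller a bursty synthetic queue trace
queue = 0
for _ in range(10_000):
    queue = max(0, queue + random.choice((-1, 0, 1, 2)))
    if red_drop(queue):
        queue = max(0, queue - 1)
```

    Smoothing on avg rather than the instantaneous queue is what lets RED absorb transient bursts while still reacting to sustained (incipient) congestion; PAQM's contribution, per the abstract, is making the indicator and control function adapt to changing traffic.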


    Qualitative Analysis of Medicare Claims in the Last 3 Years of Life: A Pilot Study

    JOURNAL OF THE AMERICAN GERIATRICS SOCIETY, Issue 1 2005
    Amber E. Barnato MD
    Objectives: To study end-of-life care of a representative sample of older people using qualitative interpretation of administrative claims by clinicians and to explore whether this method yields insights into patient care, including continuity, errors, and cause of death. Design: Random, stratified sampling of decedents and all their Medicare-covered healthcare claims in the 3 years before death from a 5% sample of elderly fee-for-service beneficiaries, condensation of all claims into a chronological clinical summary, and abstraction by two independent clinicians using a standardized form. Setting: United States. Participants: One hundred Medicare fee-for-service older people without disability or end-stage renal disease entitlement who died in 1996 to 1999 and had at least 36 months of continuous Part A and Part B enrollment before death. Measurements: Qualitative narrative of the patient's medical course; clinician assessment of care continuity and apparent medical errors; cause, trajectory, and place of death. Results: The qualitative narratives developed by the independent abstracters were highly concordant. Clinicians felt that 75% of cases lacked continuity of care that could have improved the quality of life and the way the person died, and 13% of cases had a medical error identified by both abstracters. Abstracters disagreed about assignment of a single cause of death in 28% of cases, and abstracters and the computer algorithm disagreed in 43% of cases. Conclusion: Qualitative claims analysis illuminated many problems in the care of chronically ill older people at the end of life and suggested that traditional vital statistics assignation of a single cause of death may distort policy priorities. This novel approach to claims review is feasible and deserves further study. [source]


    Learning Styles of Interior Design Students as Assessed by the Gregorc Style Delineator

    JOURNAL OF INTERIOR DESIGN, Issue 1 2001
    Stephanie A. Watson Ed.D.
    The purpose of this study was to determine the preferred learning style of undergraduate students majoring in interior design. The Gregorc Style Delineator, a self-report instrument to determine learning style, was administered to 147 undergraduate interior design students enrolled in Foundation for Interior Design Education Research (FIDER) accredited programs located within the Southwest Region of the United States. To determine the dominant learning style of undergraduate interior design students, frequency distributions were compiled. Overall, the most important finding in this study was the diversity of learning styles among interior design students. Not only were all learning styles represented in the sample, but 49% of students exhibited dominance in more than one style, unlike the results of previous studies with non-interior-design students. The most common learning styles found among interior design students are a logical and hands-on learning style, known as Concrete Sequential, and a combination of experimental, imaginative, and people-oriented learning styles, known as Concrete Random/Abstract Random. Diversity in student learning styles supports the argument for the need for instructors to have a repertoire of teaching methods. Instructors should be knowledgeable in learning style theory, should know their own learning style, and should be able to teach using a variety of styles. [source]


    A population-based epidemiologic study of irritable bowel syndrome in South China: stratified randomized study by cluster sampling

    ALIMENTARY PHARMACOLOGY & THERAPEUTICS, Issue 11 2004
    L. S. Xiong
    Summary Background: The detailed population-based data on irritable bowel syndrome in South China are lacking. Aims: To assess the prevalence of irritable bowel syndrome in South China and its impact on health-related quality of life. Subjects and methods: A face-to-face interview was carried out in South China to assess the prevalence of irritable bowel syndrome. Random clustered sampling of permanent inhabitants aged 18–80 years was carried out under stratification of urban and suburban areas. The impact of irritable bowel syndrome on health-related quality of life was evaluated using the Chinese version of SF-36. Results: A total of 4178 subjects (1907 male and 2271 female) were interviewed. The adjusted prevalence of irritable bowel syndrome in South China is 11.50% according to the Manning criteria and 5.67% according to the Rome II criteria. Factors including history of analgesic use such as non-steroidal anti-inflammatory drugs (odds ratio 3.83), history of food allergies (odds ratio 2.68), psychological distress (odds ratio 2.18), life events (odds ratio 1.89), history of dysentery (odds ratio 1.63) and negative coping style (odds ratio 1.28) were significantly associated with the presence of irritable bowel syndrome (P < 0.05). Irritable bowel syndrome was significantly associated with a decrement in health-related quality of life score. Conclusion: Irritable bowel syndrome is a common disorder in South China and has a negative impact on health-related quality of life. [source]


    The EuroPrevall-INCO surveys on the prevalence of food allergies in children from China, India and Russia: the study methodology

    ALLERGY, Issue 3 2010
    G. W. K. Wong
    To cite this article: Wong GWK, Mahesh PA, Ogorodova L, Leung TF, Fedorova O, Holla AD, Fernandez-Rivas M, Clare Mills EN, Kummeling I, van Ree R, Yazdanbakhsh M, Burney P. The EuroPrevall-INCO surveys on the prevalence of food allergies in children from China, India and Russia: the study methodology. Allergy 2010; 65: 385–390. Abstract Background: Very little is known regarding the global variations in the prevalence of food allergies. The EuroPrevall-INCO project has been developed to evaluate the prevalence of food allergies in China, India and Russia using the standardized methodology of the EuroPrevall protocol used for studies in the European Union. The epidemiological surveys of the project were designed to estimate variations in the prevalence of food allergy and exposure to known or suspected risk factors for food allergy and to compare the data with different European countries. Methods: Random samples of primary schoolchildren were recruited from urban and rural regions of China, Russia and India for screening to ascertain possible adverse reactions to foods. Cases and controls were then selected to answer a detailed questionnaire designed to evaluate the possible risk factors of food allergies. Objective evidence of sensitisation, including skin-prick test and serum specific IgE measurement, was also collected. Results: More than 37 000 children from the three participating countries have been screened. The response rates for the screening phase ranged from 83% to 95%. More than 3000 cases and controls were studied in the second phase of the study. Further confirmation of food allergies by double-blind food challenge was conducted. Conclusions: This will be the first comparative study of the epidemiology of food allergies in China, India, and Russia using the same standardized methodology. The findings of these surveys will complement the data obtained from Europe and provide insights into the development of food allergy. [source]


    Common variants in FCER1A influence total serum IgE levels from cord blood up to six years of life

    ALLERGY, Issue 9 2009
    C.-M. Chen
    Background: In a recent genome-wide scan, a functional promoter variant (rs2251746) in the gene encoding the alpha chain of the high-affinity receptor for immunoglobulin E (IgE) (FCER1A) was identified as a major determinant of serum IgE levels. Objective: The aim of this study was to investigate the role of rs2251746 on total IgE levels measured at different stages of life from birth (cord blood) up to the age of 6 and to evaluate its interaction with environmental influences in two German birth cohorts. Method: Data from two German birth cohorts were analysed (n = 1043 for the LISA cohort and n = 1842 for the GINI cohort). In the studies, total serum IgE was measured from cord blood, and blood samples taken at the age of 2/3 and 6 years. In a subgroup of the LISA study, house dust samples were collected at the age of 3 months and the amount of endotoxin was determined. Random effect models were used to analyse the longitudinal health outcomes. Results: In the two cohorts, the heterozygote and the rare homozygote of rs2251746 were consistently associated with lower total IgE levels from birth up to the age of 6 years with an allele-dose effect (P < 0.02 for blood samples taken at each time point in both cohorts). No interaction between the FCER1A variant and environmental exposures, including endotoxin, worm infestation and day-care centre attendance during early childhood, was observed. Conclusion: Common variants in FCER1A strongly influence basal IgE production independently from environmental stimuli. These effects can be observed already in cord blood, pointing to altered gene expression in the foetus. [source]
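    The "random effect models" here refer to mixed models of the general random-intercept form (a generic specification, not the authors' exact covariate set):

$$\log \mathrm{IgE}_{it} = \mathbf{x}_{it}^{\top}\boldsymbol{\beta} + b_i + \varepsilon_{it},\qquad b_i \sim N(0,\tau^2),\ \varepsilon_{it} \sim N(0,\sigma^2),$$

    where the child-specific intercept $b_i$ absorbs the correlation among repeated IgE measurements (cord blood, age 2/3, age 6) on the same child, letting the genotype effect be estimated jointly across all time points.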


    Random and localized resistive switching observation in Pt/NiO/Pt

    PHYSICA STATUS SOLIDI - RAPID RESEARCH LETTERS, Issue 6 2007
    Jung-Bin Yun
    Abstract Resistive memory switching devices based on transition metal oxides are now emerging as a candidate for nonvolatile memories. To visualize nano-sized (10 nm to 30 nm in diameter) conducting filamentary paths in the surface of NiO thin films during repetitive switching, current-sensing atomic force microscopy and ultra-thin (<5 nm) Pt films as top electrodes were used. Some areas (or spots), which were assumed to be the beginning of the conducting filaments, appeared (formation) and disappeared (rupture) in a localized and random fashion during the switching and are thought to contribute to resistive memory switching. (© 2007 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source]


    Random, systematic, and common cause failure: How do you manage them?

    PROCESS SAFETY PROGRESS, Issue 4 2006
    Michela Gentile
    Abstract A safety instrumented system (SIS) may fail to operate as desired when one or more of its devices fail due to random, systematic, and common cause events. IEC 61511 (ANSI/ISA 84.00.01-2004) stresses the importance of minimizing the propagation of device failure into system failure through design, operating, inspection, and maintenance practices. To fully understand the lifecycle requirements, it is first necessary to understand the types of failures and their potential effects on the SIS. Although several technical standards and other specialized literature address the topic, it is still a "fuzzy" matter, subject to misunderstanding and discussion. IEC 61511 Clause 11.9 requires that the SIL be verified using quantitative analysis, such as reliability block diagrams, fault tree analysis, and Markov modeling. This analysis includes only those dangerous failures that are random in nature. Common cause failures may or may not be included in the verification calculation depending on whether they exhibit random or systematic behavior. Any personnel assigned responsibility for verifying the SIL should understand each failure type and the strategies that can be used against it. Consequently, this article provides an overview of random, systematic, and common cause failures and clarifies the differences in their management within IEC 61511. © 2006 American Institute of Chemical Engineers Process Saf Prog, 2006 [source]
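    As a concrete illustration of how random and common cause failures enter such a SIL verification, the widely used beta-factor model splits the dangerous undetected failure rate $\lambda_{DU}$ so that a fraction $\beta$ strikes all channels at once. For a 1oo2 architecture with proof-test interval $T_I$, a commonly quoted textbook simplification (not a formula from this article) is

$$\mathrm{PFD}_{avg} \approx \frac{\bigl((1-\beta)\,\lambda_{DU}\,T_I\bigr)^{2}}{3} + \frac{\beta\,\lambda_{DU}\,T_I}{2},$$

    where the first term is the independent double failure and the second, often dominant, term is the common cause contribution; this is why redundancy alone cannot compensate for a poorly controlled $\beta$.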


    The N-terminal region of Pseudomonas type III effector AvrPtoB elicits Pto-dependent immunity and has two distinct virulence determinants

    THE PLANT JOURNAL, Issue 4 2007
    Fangming Xiao
    Summary Resistance to bacterial speck disease in tomato is activated by the physical interaction of the host Pto kinase with either of the sequence-dissimilar type III effector proteins AvrPto or AvrPtoB (HopAB2) from Pseudomonas syringae pv. tomato. Pto-mediated immunity requires Prf, a protein with a nucleotide-binding site and leucine-rich repeats. The N-terminal 307 amino acids of AvrPtoB were previously reported to interact with the Pto kinase, and we show here that this region (AvrPtoB1-307) is sufficient for eliciting Pto/Prf-dependent immunity against P. s. pv. tomato. AvrPtoB1-307 was also found to be sufficient for a virulence activity that enhances ethylene production and increases growth of P. s. pv. tomato and severity of speck disease on susceptible tomato lines lacking either Pto or Prf. Moreover, we found that residues 308–387 of AvrPtoB are required for the previously reported ability of AvrPtoB to suppress pathogen-associated molecular patterns-induced basal defenses in Arabidopsis. Thus, the N-terminal region of AvrPtoB has two structurally distinct domains involved in different virulence-promoting mechanisms. Random and targeted mutagenesis identified five tightly clustered residues in AvrPtoB1-307 that are required for interaction with Pto and for elicitation of immunity to P. s. pv. tomato. Mutation of one of the five clustered residues abolished the ethylene-associated virulence activity of AvrPtoB1-307. However, individual mutations of the other four residues, despite abolishing interaction with Pto and avirulence activity, had no effect on AvrPtoB1-307 virulence activity. None of these mutations affected the basal defense-suppressing activity of AvrPtoB1-387. Based on sequence alignments, estimates of helical propensity, and the previously reported structure of AvrPto, we hypothesize that the Pto-interacting domains of AvrPto and AvrPtoB1-307 have structural similarity. Together, these data support a model in which AvrPtoB1-307 promotes ethylene-associated virulence by interaction not with Pto but with another unknown host protein. [source]


    Variable Selection in the Cox Regression Model with Covariates Missing at Random

    BIOMETRICS, Issue 1 2010
    Ramon I. Garcia
    Summary We consider variable selection in the Cox regression model (Cox, 1975, Biometrika, 62, 269–276) with covariates missing at random. We investigate the smoothly clipped absolute deviation (SCAD) penalty and adaptive least absolute shrinkage and selection operator (LASSO) penalty, and propose a unified model selection and estimation procedure. A computationally attractive algorithm is developed, which simultaneously optimizes the penalized likelihood function and penalty parameters. We also optimize a model selection criterion, called the ICQ statistic (Ibrahim, Zhu, and Tang, 2008, Journal of the American Statistical Association, 103, 1648–1658), to estimate the penalty parameters and show that it consistently selects all important covariates. Simulations are performed to evaluate the finite sample performance of the penalty estimates. Also, two lung cancer data sets are analyzed to demonstrate the proposed methodology. [source]
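    For orientation, penalized variable selection of this kind maximizes the log partial likelihood minus a penalty, written here in the standard Fan–Li form rather than the paper's missing-data extension:

$$\hat{\boldsymbol{\beta}} = \arg\max_{\boldsymbol{\beta}}\; \ell(\boldsymbol{\beta}) - n \sum_{j=1}^{p} p_{\lambda}(|\beta_j|),$$

    where the adaptive LASSO takes $p_{\lambda}(|\beta_j|) = \lambda\, w_j\, |\beta_j|$ with data-driven weights $w_j$, and SCAD is defined through its derivative $p'_{\lambda}(\theta) = \lambda\bigl\{ I(\theta \le \lambda) + \tfrac{(a\lambda - \theta)_{+}}{(a-1)\lambda}\, I(\theta > \lambda)\bigr\}$ for some $a > 2$ (often $a = 3.7$). Both penalties shrink small coefficients exactly to zero, which is what makes estimation and selection happen in one step.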


    A comparison of segmental vs subtotal/total colectomy for colonic Crohn's disease: a meta-analysis

    COLORECTAL DISEASE, Issue 2 2006
    P. P. Tekkis
    Abstract Objective: Using meta-analytical techniques, the present study evaluated differences in short-term and long-term outcomes of adult patients with colonic Crohn's disease who underwent either colectomy with ileorectal anastomosis (IRA) or segmental colectomy (SC). Methods: Comparative studies published between 1988 and 2002, of subtotal/total colectomy and ileorectal anastomosis vs segmental colectomy, were used. The study end points included were surgical and overall recurrence, time to recurrence, postoperative morbidity and incidence of permanent stoma. Random and fixed-effect meta-analytical models were used to evaluate the study outcomes. Sensitivity analysis, funnel plot and meta-regressive techniques were carried out to explain the heterogeneity and selection bias between the studies. Results: Six studies, consisting of a total of 488 patients (223 IRA and 265 SC), were included. Analysis of the data suggested that there was no significant difference between IRA and SC in recurrence of Crohn's disease. Time to recurrence was longer in the IRA group by 4.4 years (95% CI: 3.1–5.8), P < 0.001. There was no difference between the incidence of postoperative complications (OR = 1.4, 95% CI 0.16–12.74) or the need for a permanent stoma between the two groups (OR = 2.75, 95% CI 0.78–9.71). Patients with two or more colonic segments involved were associated with a lower re-operation rate in the IRA group, a difference which did not reach statistical significance (P = 0.177). Conclusions: Both procedures were equally effective as treatment options for colonic Crohn's disease; however, patients in the SC group exhibited recurrence earlier than those in the IRA group. The choice of operation is dependent on the extent of colonic disease, with a trend towards better outcomes with IRA for two or more colonic segments involved. Since no prospective randomised study has been undertaken, a clear view about which approach is more suitable for localised colonic Crohn's disease cannot be obtained. [source]
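    A generic sketch of the random- and fixed-effect pooling that underlies such a meta-analysis, using the DerSimonian–Laird estimate of the between-study variance tau^2. The study log-odds-ratios and variances below are hypothetical inputs, not the paper's data.

```python
import math

log_or = [0.10, 0.45, -0.20, 0.30]   # hypothetical study log odds ratios
var    = [0.05, 0.10,  0.08, 0.12]   # hypothetical within-study variances

w = [1 / v for v in var]                                   # fixed-effect weights
fixed = sum(wi * yi for wi, yi in zip(w, log_or)) / sum(w)

# Cochran's Q and the DerSimonian-Laird between-study variance
q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, log_or))
k = len(log_or)
tau2 = max(0.0, (q - (k - 1)) / (sum(w) - sum(wi**2 for wi in w) / sum(w)))

w_re = [1 / (v + tau2) for v in var]                       # random-effects weights
pooled = sum(wi * yi for wi, yi in zip(w_re, log_or)) / sum(w_re)
se = math.sqrt(1 / sum(w_re))
print("pooled OR:", math.exp(pooled),
      "95% CI:", math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se))
```

    When tau^2 = 0 the two models coincide; when heterogeneity is present the random-effects weights are flatter, which widens the confidence interval, as the sensitivity analyses in such studies typically show.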


    Reducing the bias of probing depth and attachment level estimates using random partial-mouth recording

    COMMUNITY DENTISTRY AND ORAL EPIDEMIOLOGY, Issue 1 2006
    James D. Beck
    Abstract Objectives: To evaluate the bias and precision of probing depth (PD) and clinical attachment level (CAL) estimates of random and fixed partial examination methods compared with full-mouth examinations. Methods: PD and CAL were calculated on six sites for up to 28 teeth (considered to be the gold standard with no bias) and three fixed-site selection methods (FSSMs) that resulted in a partial examination of sites: the Ramfjord method, and the NIDCR methods used in NHANES I and NHANES 2000. Finally, seven random-site selection methods (RSSMs) were created by sampling the following numbers of sites: 84, 42, 36, 28, 20, 15, 10 and 6. To compare bias and precision of the methods we calculated percent relative bias and relative error. Results: Estimates of means, standard deviations (SD), relative bias and relative error for RSSMs were almost identical to the full-mouth examination, but SDs increased slightly when fewer than 28 sites were sampled, and relative bias and error increased for methods sampling fewer than 20 sites. The FSSMs had very low relative error, but much higher relative bias, indicating underestimation. The FSSM with the smallest bias and error was the Ramfjord method, but the Random 36 method had less bias and less relative error. The NHANES 2000 method was the FSSM with the lowest bias and relative error for estimates of Extent Scores (percent of sites ≥3, ≥4, or ≥5 mm PD or CAL), but random methods sampling fewer sites performed just as well. Both FSSMs and RSSMs underestimated prevalence, especially prevalence of less frequently occurring conditions, but most RSSMs were less likely to underestimate prevalence than the FSSMs. Conclusion: The promise of reducing bias and increasing precision of the estimates supports the continued development and examination of RSSMs. [source]
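    A small simulation in the spirit of this comparison (all numbers are hypothetical, not the paper's data): 168 sites per mouth, deep pockets more likely at later (posterior) site indices, so a fixed anterior subset is systematically unrepresentative while a random subset of the same size is not.

```python
import random

random.seed(3)

def mouth():
    # pocket depths 2/3/6 mm; deep pockets get likelier toward posterior sites
    return [random.choices((2, 3, 6),
                           weights=(0.8 - 0.4 * i / 167, 0.2, 0.4 * i / 167))[0]
            for i in range(168)]

subjects = [mouth() for _ in range(500)]
truth = sum(sum(m) for m in subjects) / (500 * 168)          # full-mouth mean PD

rand36 = sum(sum(random.sample(m, 36)) for m in subjects) / (500 * 36)
fix36 = sum(sum(m[:36]) for m in subjects) / (500 * 36)      # fixed anterior sites

print(f"relative bias, random 36 sites: {(rand36 - truth) / truth:+.3f}")  # ~ 0
print(f"relative bias, fixed 36 sites : {(fix36 - truth) / truth:+.3f}")   # < 0
```

    This mirrors the paper's headline finding: random site selection is unbiased by construction, whereas a fixed panel inherits whatever spatial pattern the disease has.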


    Fuzzy Monte Carlo Simulation and Risk Assessment in Construction

    COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, Issue 4 2010
    N. Sadeghi
    However, subjective and linguistically expressed information results in added non-probabilistic uncertainty in construction management. Fuzzy logic has been used successfully for representing such uncertainties in construction projects. In practice, an approach that can handle both random and fuzzy uncertainties in a risk assessment model is necessary. This article discusses the deficiencies of the available methods and proposes a Fuzzy Monte Carlo Simulation (FMCS) framework for risk analysis of construction projects. In this framework, we construct a fuzzy cumulative distribution function as a novel way to represent uncertainty. To verify the feasibility of the FMCS framework and demonstrate its main features, the authors have developed a special purpose simulation template for cost range estimating. This template is employed to estimate the cost of a highway overpass project. [source]
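    A minimal sketch of the fuzzy Monte Carlo idea (an assumed toy cost model, not the authors' template): one cost driver is random, another is a triangular fuzzy number handled by alpha-cuts, and each alpha-cut yields a lower and upper Monte Carlo CDF, so the output is a band of CDFs rather than a single curve.

```python
import random, statistics

random.seed(7)
tri = (90.0, 100.0, 120.0)          # fuzzy unit price: (low, mode, high)

def alpha_cut(a):
    """Interval of the triangular fuzzy number at membership level a."""
    lo, m, hi = tri
    return lo + a * (m - lo), hi - a * (hi - m)

def p90_cost(price, runs=2000):
    """Monte Carlo P90 of total cost for a crisp price and random quantity."""
    costs = [price * random.gauss(50, 5) for _ in range(runs)]
    return statistics.quantiles(costs, n=100)[89]   # 90th percentile

for a in (0.0, 0.5, 1.0):
    lo, hi = alpha_cut(a)
    print(f"alpha={a}: P90 cost between {p90_cost(lo):,.0f} and {p90_cost(hi):,.0f}")
```

    At alpha = 1 the interval collapses to the mode and the band collapses to an ordinary Monte Carlo CDF; lower alpha levels widen the band, separating the linguistic (fuzzy) uncertainty from the sampling (random) uncertainty.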


    Initialization Strategies in Simulation-Based SFE Eigenvalue Analysis

    COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, Issue 5 2005
    Song Du
    Poor initializations often result in slow convergence, and in certain instances may lead to an incorrect or irrelevant answer. The problem of selecting an appropriate starting vector becomes even more complicated when the structure involved is characterized by properties that are random in nature. Here, a good initialization for one sample could be poor for another sample. Thus, proper eigenvector initialization for uncertainty analysis involving Monte Carlo simulations is essential for efficient random eigenvalue analysis. Most simulation procedures to date have been sequential in nature; that is, a random vector to describe the structural system is simulated, a FE analysis is conducted, the response quantities are identified by post-processing, and the process is repeated until the standard error in the response of interest is within desired limits. A different approach is to generate all the sample (random) structures prior to performing any FE analysis, sequentially rank-order them according to some appropriate measure of distance between the realizations, and perform the FE analyses in similar rank order, using the results from the previous analysis as the initialization for the current analysis. The sample structures may also be ordered into a tree-type data structure, where each node represents a random sample; the traversal starts from the root and visits every node in the tree exactly once. This approach differs from the sequential ordering approach in that it uses the solution of the "closest" node to initialize the iterative solver. The computational efficiencies that result from such orderings (at a modest expense of additional data storage) are demonstrated through a stability analysis of a system with closely spaced buckling loads and the modal analysis of a simply supported beam. [source]
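    A sketch of the reuse idea (not the authors' FE code): for a sequence of randomly perturbed stiffness-like matrices, inverse iteration for the lowest eigenpair converges in fewer steps when seeded with the eigenvector of the previously solved, "nearby" sample than with a fresh random start. The matrix sizes and perturbation level are assumptions chosen to make the effect visible.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
base = np.diag(np.arange(1.0, n + 1.0))    # assumed well-separated spectrum

def sym_noise():
    m = rng.standard_normal((n, n))
    return (m + m.T) / 2                    # symmetric random perturbation

def inverse_iteration(A, v0, tol=1e-10):
    v = v0 / np.linalg.norm(v0)
    lam, steps = v @ A @ v, 0
    while True:
        w = np.linalg.solve(A, v)           # one inverse-iteration step
        v = w / np.linalg.norm(w)
        lam_new, steps = v @ A @ v, steps + 1
        if abs(lam_new - lam) < tol:
            return lam_new, v, steps
        lam = lam_new

samples = [base + 0.01 * sym_noise() for _ in range(5)]
_, v_prev, _ = inverse_iteration(samples[0], rng.standard_normal(n))
for A in samples[1:]:
    _, _, cold = inverse_iteration(A, rng.standard_normal(n))
    _, v_prev, warm = inverse_iteration(A, v_prev)   # seeded with previous answer
    print(f"cold start: {cold} iterations, warm start: {warm} iterations")
```

    The rank-ordering and tree traversal in the abstract are, in this view, strategies for making sure each solve is warm-started from the closest already-solved realization.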


    Introducing dimensionless parameters into the correlation of NMR relaxation time to transport properties of porous media

    CONCEPTS IN MAGNETIC RESONANCE, Issue 3 2007
    Manolis M. Tomadakis
    Abstract Dimensionless parameters representing the viscous permeability (k) and NMR relaxation time (T1) of particle beds, while accounting also for the particle size, are shown to improve drastically the accuracy of k-T1 correlations in the slow diffusion regime, in the absence of bulk relaxation effects. The finding is based on a regression analysis of numerical results for k and T1 in both random and ordered isotropic and anisotropic beds of fibers. Use of the formation factor (F) improves further the accuracy of the correlations only for the strongly anisotropic unidirectional arrays of fibers. A survey of related literature reveals an extensive effort in recent years in upgrading k-T1 correlations, driven primarily by applications in petroleum and gas field exploration and recovery. © 2007 Wiley Periodicals, Inc. Concepts Magn Reson Part A 30A: 154–164, 2007. [source]


    Detecting the Effects of Fishing on Seabed Community Diversity: Importance of Scale and Sample Size

    CONSERVATION BIOLOGY, Issue 2 2003
    Michel J. Kaiser
    Anthropogenic disturbances of terrestrial and marine environments, such as logging and fishing, are generally identified with negative impacts on species diversity. However, empirical observations often do not support this assumption. I investigated the importance of the extent of area sampled to the observed outcome of comparisons of the diversity of seabed assemblages in different areas of the seabed that experience either low or high levels of fishing disturbance. Using a finite data set within each disturbance regime, I pooled samples of the benthic communities at random. Thus, although individual sample size increased with each additional level of pooled data, the number of samples decreased accordingly. Detecting the effects of disturbance on species diversity was strongly scale-dependent. Despite increased replication at smaller scales, disturbance effects were more apparent when larger but less numerous samples were collected. The detection of disturbance effects was also affected by the choice of sampling device. Disturbance effects were apparent with pooled anchor-dredge samples but were not apparent with pooled beam-trawl samples. A more detailed examination of the beam-trawl data emphasized that a whole-community approach to the investigation of changes in diversity can miss responses in particular components of the community (e.g., decapod crustacea). The latter may be more adversely affected by disturbance than the majority of the taxa found within the benthic assemblage. Further, the diversity of some groups (e.g., echinoderms) actually increased with disturbance. Experimental designs and sampling regimes that focus on diversity at only one scale may miss important disturbance effects that occur at larger or smaller scales. [source]


    Genetic Effects of Multiple Generations of Supportive Breeding

    CONSERVATION BIOLOGY, Issue 6 2001
    Jinliang Wang
    The practice of supporting weak wild populations by bringing a fraction of the wild individuals into captivity for reproduction and releasing their offspring into natural habitats, where they mix with wild individuals, is known as supportive breeding and has been widely applied in conservation biology and in fisheries and wildlife management. This procedure is intended to increase population size without introducing exogenous genes into the managed population. Previous work examining the genetic effects of a single generation of supportive breeding has shown that although a successful program increases the census population size, it may reduce the genetically effective population size and thereby induce excessive inbreeding and loss of genetic variation. We expand and generalize previous analyses of supportive breeding and consider the effects of multiple generations of supportive breeding on rates of inbreeding and genetic drift. We derived recurrence equations for the inbreeding coefficient and coancestry, and thereby equations for inbreeding and variance effective sizes, under three models for selecting captive breeders: at random, preferentially among those born in captivity, and preferentially among those born in the wild. Numerical examples indicate that supportive breeding, when carried out successfully over multiple generations, may increase not only the census but also the effective size of the supported population as a whole. If supportive breeding does not result in a substantial and continuous increase of the census size of the breeding population, however, it might be genetically harmful because of elevated rates of inbreeding and genetic drift. [source]
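    For context, the classical single-population recurrence that such derivations generalize (a textbook identity, not one of the paper's equations) links the inbreeding coefficient in successive generations to the effective size $N_e$:

$$F_{t+1} = \frac{1}{2N_e} + \left(1 - \frac{1}{2N_e}\right) F_t,$$

    so the inbreeding effective size can be read off from the per-generation increment $\Delta F = (F_{t+1}-F_t)/(1-F_t) = 1/(2N_e)$. The paper's contribution is deriving the analogous recurrences when the breeding population is split into captive and wild fractions with different breeder-selection rules.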


    An Ecological and Economic Assessment of the Nontimber Forest Product Gaharu Wood in Gunung Palung National Park, West Kalimantan, Indonesia

    CONSERVATION BIOLOGY, Issue 6 2001
    Gary D. Paoli
    Ecological and economic data are essential for identifying tropical non-timber forest products with potential for sustainable and profitable extraction in a managed system. We studied the demographic effect and economic returns of harvesting aromatic gaharu wood from fungus-infected trees of Aquilaria malaccensis Lam. at Gunung Palung National Park, Indonesia, to evaluate the management potential of gaharu wood. Aquilaria malaccensis trees >20 cm in diameter occurred at low preharvest densities (0.16–0.32 ha⁻¹) but were distributed across five of six forest types surveyed. During a recent harvest, 75% of trees were felled, with harvest intensities ranging from 50% to 100% among forest types. Overall, 50% of trees contained gaharu wood, but trees at higher elevations contained gaharu wood more frequently (73%) than trees at lower elevation (27%). The mean density of regeneration (juveniles >15 cm in height) near adult trees (3–7 m away) was 0.2/m², 200 times greater than at random in the forest (10/ha), but long-term data on growth and survivorship are needed to determine whether regeneration is sufficient for population recovery. Gaharu wood extraction from Gunung Palung was very profitable for collectors, generating an estimated gross financial return per day of US $8.80, triple the mean village wage. Yet the estimated sustainable harvest of gaharu wood at natural tree densities generates a mean net present value of only $10.83/ha, much lower than that of commercial timber harvesting, the dominant forest use in Kalimantan. Returns per unit area could be improved substantially, however, by implementing known silvicultural methods to increase tree densities, increase the proportion of trees that produce gaharu wood, and shorten the time interval between successive harvests. The economic potential of gaharu wood is unusual among nontimber forest products and justifies experimental trials to develop small-scale cultivation methods. [source]
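    The net-present-value comparison rests on the standard discounting formula (the paper's $10.83/ha figure of course depends on its own revenue, cost, and discount-rate assumptions):

$$\mathrm{NPV} = \sum_{t=0}^{T} \frac{R_t - C_t}{(1+r)^t},$$

    where $R_t$ and $C_t$ are per-hectare harvest revenues and costs in year $t$ and $r$ is the discount rate. Shortening the interval between successive harvests raises NPV by moving revenues into earlier, less heavily discounted years, which is why it appears alongside tree density among the suggested silvicultural levers.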


    Analysis of Particle Pumping Using SOLDOR/NEUT2D Code in the JT-60U Tokamak

    CONTRIBUTIONS TO PLASMA PHYSICS, Issue 1-3 2008
    H. Kawashima
    Abstract In order to understand the particle pumping on JT-60U, we analyze the roles of atomic and molecular processes using the SOLDOR/NEUT2D code. A case with a short strike-point distance shows that most of the neutrals produced on the targets go toward the exhaust slot directly, whereas for the long-distance case neutrals are scattered spherically at random by collision processes and only a few of them go toward the slot. It is clarified that the incident neutrals to the slot at the low-ne/high-Te divertor plasma condition are dominated by atoms, while those at the high-ne/low-Te condition are dominated by molecules due to elastic collision. (© 2008 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source]


    Branch-and-Price Methods for Prescribing Profitable Upgrades of High-Technology Products with Stochastic Demands*

    DECISION SCIENCES, Issue 1 2004
    Purushothaman Damodaran
    ABSTRACT This paper develops a model that can be used as a decision support aid, helping manufacturers make profitable decisions in upgrading the features of a family of high-technology products over its life cycle. The model integrates various organizations in the enterprise: product design, marketing, manufacturing, production planning, and supply chain management. Customer demand is assumed random and this uncertainty is addressed using scenario analysis. A branch-and-price (B&P) solution approach is devised to optimize the stochastic problem effectively. Sets of random instances are generated to evaluate the effectiveness of our solution approach in comparison with that of commercial software on the basis of run time. Computational results indicate that our approach outperforms commercial software on all of our test problems and is capable of solving practical problems in reasonable run time. We present several examples to demonstrate how managers can use our models to answer "what if" questions. [source]


    A mutation in the zebrafish Na,K-ATPase subunit atp1a1a.1 provides genetic evidence that the sodium potassium pump contributes to left-right asymmetry downstream or in parallel to nodal flow

    DEVELOPMENTAL DYNAMICS, Issue 7 2006
    Elin Ellertsdottir
    Abstract While there is a good conceptual framework of dorsoventral and anterioposterior axes formation in most vertebrate groups, understanding of left-right axis initiation is fragmentary. Diverse mechanisms have been implied to contribute to the earliest steps of left-right asymmetry, including small molecule signals, gap junctional communication, membrane potential, and directional flow of extracellular liquid generated by monocilia in the node region. Here we demonstrate that a mutation in the zebrafish Na,K-ATPase subunit atp1a1a causes left-right defects including isomerism of internal organs at the anatomical level. The normally left-sided Nodal signal spaw as well as its inhibitor lefty are expressed bilaterally, while pitx2 expression may appear random or bilateral. Monocilia movement and fluid circulation in Kupffer's vesicle are normal in atp1a1a m883 mutant embryos. Therefore, the Na,K-ATPase is required downstream or in parallel to monocilia function during initiation of left-right asymmetry in zebrafish. Developmental Dynamics 235:1794–1808, 2006. © 2006 Wiley-Liss, Inc. [source]


    Structural and biophysical simulation of angiogenesis and vascular remodeling

    DEVELOPMENTAL DYNAMICS, Issue 4 2001
    Ralf Gödde
    Abstract The purpose of this report is to introduce a new computer model for the simulation of microvascular growth and remodeling into arteries and veins that imitates angiogenesis and blood flow in real vascular plexuses. A C++ computer program was developed based on geometric and biophysical initial and boundary conditions. Geometry was defined on a two-dimensional isometric grid by using defined sources and drains and elementary bifurcations that were able to proliferate or to regress under the influence of random and deterministic processes. Biophysics was defined by pressure, flow, and velocity distributions in the network by using the nodal-admittance-matrix-method, and accounting for hemodynamic peculiarities like Fahraeus-Lindqvist effect and exchange with extravascular tissue. The proposed model is the first to simulate interdigitation between the terminal branches of arterial and venous trees. This was achieved by inclusion of vessel regression and anastomosis in the capillary plexus and by remodeling in dependence from hemodynamics. The choice of regulatory properties influences the resulting vascular patterns. The model predicts interdigitating arteriovenous patterning if shear stress-dependent but not pressure-dependent remodeling was applied. By approximating the variability of natural vascular patterns, we hope to better understand homogeneity of transport, spatial distribution of hemodynamic properties and biomass allocation to the vascular wall or blood during development, or during evolution of circulatory systems. © 2001 Wiley-Liss, Inc. [source]
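    A toy nodal-admittance solve of the kind such a simulation uses for network hemodynamics (the graph, conductances, and boundary pressures below are hypothetical): vessels are edges with conductances G, flow obeys Q = G (P_i − P_j), and mass balance at each node gives a linear system for the nodal pressures.

```python
import numpy as np

# edges: (node_i, node_j, conductance); a tiny source-to-drain plexus
edges = [(0, 1, 2.0), (1, 2, 1.0), (1, 3, 1.5), (2, 4, 1.0), (3, 4, 2.0)]
n_nodes = 5
p_source, p_drain = 100.0, 0.0      # boundary pressures at nodes 0 and 4

A = np.zeros((n_nodes, n_nodes))
for i, j, g in edges:               # assemble the (weighted Laplacian) admittance matrix
    A[i, i] += g; A[j, j] += g
    A[i, j] -= g; A[j, i] -= g

b = np.zeros(n_nodes)
for node, p in ((0, p_source), (4, p_drain)):   # impose Dirichlet boundary nodes
    A[node, :] = 0.0
    A[node, node] = 1.0
    b[node] = p

p = np.linalg.solve(A, b)           # nodal pressures
flows = {(i, j): g * (p[i] - p[j]) for i, j, g in edges}
print(p)
print(flows)
```

    In a remodeling simulation, per-edge quantities derived from this solve, such as shear stress proportional to flow over vessel radius cubed, would then feed back into vessel growth or regression at each time step.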


    Mechanical computation in neurons

    DEVELOPMENTAL NEUROBIOLOGY, Issue 11 2009
    Jummi Laishram
    Abstract Growth cones are the main motile structures located at the tip of neurites and are composed of a lamellipodium from which thin filopodia emerge. In this article, we analyzed the kinetics and dynamics of growth cones with the aim to understand two major issues: first, the strategy used by filopodia and lamellipodia during their exploration and navigation; second, what kind of mechanical problems neurons need to solve during their operation. In the developing nervous system and in the adult brain, neurons constantly need to solve mechanical problems. Growth cones must decide how to explore the environment and in which direction to grow; they also need to establish the appropriate contacts, to avoid obstacles and to determine how much force to exert. Here, we show that in sparse cultures, filopodia grow and retract following statistical patterns, nearly optimal for an efficient exploration of the environment. In a dense culture, filopodia exploration is still present although significantly reduced. Analysis of 1271, 6432, and 185 pairs of filopodia of DRG, PC12 and hippocampal neurons, respectively, showed that the correlation coefficient |ρ| of the growth of more than 50% of filopodia pairs was >0.15. From a computational point of view, filopodia and lamellipodia motion can be described by a random process in which errors are corrected by efficient feedback loops. This article argues that neurons not only process sensory signals, but also solve mechanical problems throughout their entire lifespan, from the early stages of embryogenesis to adulthood. © 2009 Wiley Periodicals, Inc. Develop Neurobiol, 2009 [source]


    Parental and perinatal factors influencing the development of handedness in captive chimpanzees

    DEVELOPMENTAL PSYCHOBIOLOGY, Issue 6 2006
    William D. Hopkins
    Abstract It has been proposed that human right handedness is determined by genetic factors associated with the emergence of language, whereas non-human primate handedness is determined by random, non-genetic factors. These different mechanisms account for differences in the distribution of handedness between human and non-human primates. Here we report evidence that genetic factors play a role in the determination of handedness in chimpanzees. We further report that differential rearing has no influence on the expression of handedness in related individuals. Contrary to many theories of the origin of handedness, these results indicate that genetic factors have a significant influence on handedness in chimpanzees. © 2006 Wiley Periodicals, Inc. Dev Psychobiol 48: 428,435, 2006. [source]


    Screening for gestational diabetes; past, present and future

    DIABETIC MEDICINE, Issue 5 2002
    F. W. F. Hanna
    Abstract Gestational diabetes is carbohydrate intolerance, with onset or first recognition of hyperglycaemia during pregnancy. Several studies have suggested that gestational hyperglycaemia is associated with adverse maternal and fetal outcomes, promoting the case for screening. Conversely, others argue that screening for gestational diabetes may colour the clinical judgement, influencing further management, e.g. more 'unjustified' caesarean sections. Additionally, the lack of definitive data either on a clear-cut glycaemic threshold for the development of adverse outcomes or on the impact of intervention is emphasized by opponents of screening. This review attempts to evaluate the available data on screening for gestational diabetes. The oral glucose tolerance test is promoted on the basis that the diabetogenic stress of pregnancy is encountered during late gestation and is best recognized in the fed state. There are different tests, including the 1 h/50-g, 2 h/75-g and 3 h/100-g tests, with practical limitations, including the time and cost involved and the unpleasant supra-physiological glucose load that is unrelated to body weight, and issues of reproducibility and sensitivity/specificity profiles. Despite its convenience, the poor sensitivity of random glucose has precluded its routine use for screening. Fasting glucose appears to be promising but further testing is required to ensure satisfactory sensitivity/specificity in different populations. Despite its limitations, the oral glucose tolerance test has become established as the 'most acceptable' diagnostic test for gestational diabetes. More convenient methods, e.g. fasting or random or post-load glucose, have to be validated therefore against the oral glucose tolerance test to gain acceptance for routine screening. Diabet. Med. 19, 351–358 (2002) [source]


    Pathological gambling: an increasing public health problem

    ACTA PSYCHIATRICA SCANDINAVICA, Issue 4 2001
    Gambling has always existed, but only recently has it taken on the endlessly variable and accessible forms we know today. Gambling takes place when something valuable, usually money, is staked on the outcome of an event that is entirely unpredictable. It was only two decades ago that pathological gambling was formally recognized as a mental disorder, when it was included in the DSM-III in 1980. For most people, gambling is a relaxing activity with no negative consequences. For others, however, gambling becomes excessive. Pathological gambling is a disorder that manifests itself through the irrepressible urge to wager money. This disorder ultimately dominates the gambler's life, and has a multitude of negative consequences for both the gambler and the people they interact with, i.e. friends, family members, employers. In many ways, gambling might seem a harmless activity. In fact, it is not the act of gambling itself that is harmful, but the vicious cycle that can begin when a gambler wagers money they cannot afford to lose, and then continues to gamble in order to recuperate their losses. The gambler's 'tragic flaw' of logic lies in their failure to understand that gambling is governed solely by random, chance events. Gamblers fail to recognize this and continue to gamble, attempting to control outcomes by concocting strategies to 'beat the game'. Most, if not all, gamblers try in some way to predict the outcome of a game when they are gambling. A detailed analysis of gamblers' self-verbalizations reveals that most of them behave as though the outcome of the game relied on their personal 'skills'. From the gambler's perspective, skill can influence chance, but in reality, the random nature of chance events is the only determinant of the outcome of the game. The gambler, however, either ignores or simply denies this fundamental rule (1). Experts agree that the social costs of pathological gambling are enormous. Changes in gaming legislation have led to a substantial expansion of gambling opportunities in most industrialized countries around the world, mainly in Europe, America and Australia. Figures for the United States' leisure economy in 1996 show gross gambling revenues of $47.6 billion, which was greater than the combined revenue of $40.8 billion from film box offices, recorded music, cruise ships, spectator sports and live entertainment (2). Several factors appear to be motivating this growth: the desire of governments to identify new sources of revenue without invoking new or higher taxes; tourism entrepreneurs developing new destinations for entertainment and leisure; and the rise of new technologies and forms of gambling (3). As a consequence, prevalence studies have shown increased gambling rates among adults. It is currently estimated that 1–2% of the adult population gambles excessively (4, 5). Given that the prevalence of gambling is related to the accessibility of gambling activities, and that new forms of gambling are constantly being legalized throughout most western countries, this figure is expected to rise. Consequently, physicians and mental health professionals will need to know more about the diagnosis and treatment of pathological gamblers. This disorder may be under-diagnosed because, clinically, pathological gamblers usually seek help for the problems associated with gambling such as depression, anxiety or substance abuse, rather than for the excessive gambling itself.
    This issue of Acta Psychiatrica Scandinavica includes the first national survey of problem gambling completed in Sweden, conducted by Volberg et al. (6). This paper is based on a large sample (N=9917) with an impressively high response rate (89%). Two instruments were used to assess gambling activities: the South Oaks Gambling Screen-Revised (SOGS-R) and an instrument derived from the DSM-IV criteria for pathological gambling. Current (1 year) and lifetime prevalence rates were collected. Results show that 0.6% of the respondents were classified as probable pathological gamblers, and 1.4% as problem gamblers. These data reveal that the prevalence of pathological gamblers in Sweden is significantly less than what has been observed in many western countries. The authors have pooled the rates of problem (1.4%) and probable pathological gamblers (0.6%), to provide a total of 2.0% for the current prevalence. This 2% should be interpreted with caution, however, as we do not have information on the long-term evolution of these subgroups of gamblers; for example, we do not know how many of each subgroup will become pathological gamblers, and how many will decrease their gambling or stop gambling altogether. Until this information is known, it would be preferable to keep in mind that only 0.6% of the Swedish population has been identified as pathological gamblers. In addition, recent studies show that the SOGS-R may be producing inflated estimates of pathological gambling (7). Thus, future research in this area might benefit from the use of an instrument based on DSM criteria for pathological gambling, rather than the SOGS-R only. Finally, the authors suggest in their discussion that the lower rate of pathological gamblers obtained in Sweden compared to many other jurisdictions may be explained by the greater availability of games based on chance rather than games based on skill or a mix of skill and luck. Before accepting this interpretation, researchers will need to demonstrate that the outcomes of all games are determined by other factors than chance and randomness. Many studies have shown that the notion of randomness is the only determinant of gambling (1). Inferring that skill is an important issue in gambling may be misleading. While these are important issues to consider, the Volberg et al. survey nevertheless provides crucial information about gambling in a Scandinavian country. Gambling will be an important issue over the next few years in Sweden, and the publication of the Volberg et al. study is a landmark for the Swedish community (scientists, industry, policy makers, etc.). This paper should stimulate interesting discussions and inspire new, much-needed scientific investigations of pathological gambling.
    
    Acta Psychiatrica Scandinavica
    Guido Bondolfi and Robert Ladouceur
    Invited Guest Editors
    
    References
    1. Ladouceur R & Walker M. The cognitive approach to understanding and treating pathological gambling. In: Bellack AS, Hersen M, eds. Comprehensive clinical psychology. New York: Pergamon, 1998:588–601.
    2. Christiansen EM. Gambling and the American economy. In: Frey JH, ed. Gambling: socioeconomic impacts and public policy. Thousand Oaks, CA: Sage, 1998;556:36–52.
    3. Korn DA & Shaffer HJ. Gambling and the health of the public: adopting a public health perspective. J Gambling Stud 2000;15:289–365.
    4. Volberg RA. Problem gambling in the United States. J Gambling Stud 1996;12:111–128.
    5. Bondolfi G, Osiek C, Ferrero F. Prevalence estimates of pathological gambling in Switzerland. Acta Psychiatr Scand 2000;101:473–475.
    6. Volberg RA, Abbott MW, Rönnberg S, Munck IM. Prevalence and risks of pathological gambling in Sweden. Acta Psychiatr Scand 2001;104:250–256.
    7. Ladouceur R, Bouchard C, Rhéaume N et al. Is the SOGS an accurate measure of pathological gambling among children, adolescents and adults? J Gambling Stud 2000;16:1–24. [source]