Random Nature
Selected Abstracts

Parallel bandwidth characteristics calculations for thin avalanche photodiodes on an SGI Origin 2000 supercomputer
CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 12 2004
Yi Pan

Abstract: An important factor for high-speed optical communication is the availability of ultrafast, low-noise photodetectors. Among the semiconductor photodetectors commonly used in today's long-haul and metro-area fiber-optic systems, avalanche photodiodes (APDs) are often preferred over p-i-n photodiodes because of their internal gain, which significantly improves receiver sensitivity and alleviates the need for optical pre-amplification. Unfortunately, the random nature of the carrier impact-ionization process that generates this gain is inherently noisy and results in fluctuations not only in the gain but also in the time response. We recently developed a theory characterizing the autocorrelation function of APDs that incorporates the dead-space effect, an effect that is very significant in thin, high-performance APDs. That work extends the time-domain analysis of the dead-space multiplication model to compute the autocorrelation function of the APD impulse response; however, the computation requires a large amount of memory and is very time consuming. In this research, we describe our experiences in parallelizing the code in MPI and OpenMP using CAPTools. Several array-partitioning schemes and scheduling policies are implemented and tested. Our results show that the code is scalable up to 64 processors on an SGI Origin 2000 machine and has small average errors. Copyright © 2004 John Wiley & Sons, Ltd. [source]

Pathological gambling: an increasing public health problem
ACTA PSYCHIATRICA SCANDINAVICA, Issue 4 2001
Article first published online: 7 JUL 200

Gambling has always existed, but only recently has it taken on the endlessly variable and accessible forms we know today.
Gambling takes place when something valuable, usually money, is staked on the outcome of an event that is entirely unpredictable. It was only two decades ago that pathological gambling was formally recognized as a mental disorder, when it was included in the DSM-III in 1980. For most people, gambling is a relaxing activity with no negative consequences. For others, however, gambling becomes excessive. Pathological gambling is a disorder that manifests itself through the irrepressible urge to wager money. It ultimately dominates the gambler's life and has a multitude of negative consequences both for the gambler and for the people they interact with: friends, family members and employers. In many ways, gambling might seem a harmless activity. In fact, it is not the act of gambling itself that is harmful, but the vicious cycle that can begin when a gambler wagers money they cannot afford to lose and then continues to gamble to recoup their losses. The gambler's 'tragic flaw' of logic lies in their failure to understand that gambling is governed solely by random, chance events. Gamblers fail to recognize this and continue to gamble, attempting to control outcomes by concocting strategies to 'beat the game'. Most, if not all, gamblers try in some way to predict the outcome of a game when they are gambling. A detailed analysis of gamblers' self-verbalizations reveals that most of them behave as though the outcome of the game relied on their personal 'skills'. From the gambler's perspective, skill can influence chance, but in reality the random nature of chance events is the only determinant of the outcome of the game. The gambler, however, either ignores or simply denies this fundamental rule (1). Experts agree that the social costs of pathological gambling are enormous.
Changes in gaming legislation have led to a substantial expansion of gambling opportunities in most industrialized countries, mainly in Europe, America and Australia. Figures for the United States' leisure economy in 1996 show gross gambling revenues of $47.6 billion, greater than the combined revenue of $40.8 billion from film box offices, recorded music, cruise ships, spectator sports and live entertainment (2). Several factors appear to be driving this growth: the desire of governments to identify new sources of revenue without invoking new or higher taxes; tourism entrepreneurs developing new destinations for entertainment and leisure; and the rise of new technologies and forms of gambling (3). As a consequence, prevalence studies have shown increased gambling rates among adults. It is currently estimated that 1–2% of the adult population gambles excessively (4, 5). Given that the prevalence of gambling is related to the accessibility of gambling activities, and that new forms of gambling are constantly being legalized throughout most western countries, this figure is expected to rise. Consequently, physicians and mental health professionals will need to know more about the diagnosis and treatment of pathological gambling. The disorder may be under-diagnosed because, clinically, pathological gamblers usually seek help for problems associated with gambling, such as depression, anxiety or substance abuse, rather than for the excessive gambling itself. This issue of Acta Psychiatrica Scandinavica includes the first national survey of problem gambling completed in Sweden, conducted by Volberg et al. (6). The paper is based on a large sample (N=9917) with an impressively high response rate (89%). Two instruments were used to assess gambling activities: the South Oaks Gambling Screen-Revised (SOGS-R) and an instrument derived from the DSM-IV criteria for pathological gambling.
Current (1-year) and lifetime prevalence rates were collected. The results show that 0.6% of respondents were classified as probable pathological gamblers and 1.4% as problem gamblers. These data indicate that the prevalence of pathological gambling in Sweden is significantly lower than has been observed in many other western countries. The authors pooled the rates of problem (1.4%) and probable pathological gamblers (0.6%) to give a total current prevalence of 2.0%. This 2% should be interpreted with caution, however, as we lack information on the long-term evolution of these subgroups of gamblers; for example, we do not know how many in each subgroup will become pathological gamblers, and how many will reduce or stop their gambling altogether. Until this is known, it is preferable to bear in mind that only 0.6% of the Swedish population has been identified as pathological gamblers. In addition, recent studies suggest that the SOGS-R may produce inflated estimates of pathological gambling (7). Future research in this area might therefore benefit from an instrument based on DSM criteria for pathological gambling rather than the SOGS-R alone. Finally, the authors suggest in their discussion that the lower rate of pathological gambling in Sweden compared with many other jurisdictions may be explained by the greater availability of games based on chance rather than games based on skill or a mix of skill and luck. Before accepting this interpretation, researchers would need to demonstrate that the outcomes of such games are determined by factors other than chance and randomness. Many studies have shown that randomness is the sole determinant of gambling outcomes (1), so inferring that skill plays an important role in gambling may be misleading. While these issues deserve consideration, the Volberg et al. survey nevertheless provides crucial information about gambling in a Scandinavian country.
Gambling will be an important issue in Sweden over the next few years, and the publication of the Volberg et al. study is a landmark for the Swedish community (scientists, industry, policy makers, etc.). The paper should stimulate interesting discussions and inspire new, much-needed scientific investigations of pathological gambling.

Acta Psychiatrica Scandinavica
Guido Bondolfi and Robert Ladouceur
Invited Guest Editors

References
1. Ladouceur R, Walker M. The cognitive approach to understanding and treating pathological gambling. In: Bellack AS, Hersen M, eds. Comprehensive clinical psychology. New York: Pergamon, 1998:588–601.
2. Christiansen EM. Gambling and the American economy. In: Frey JH, ed. Gambling: socioeconomic impacts and public policy. Thousand Oaks, CA: Sage, 1998;556:36–52.
3. Korn DA, Shaffer HJ. Gambling and the health of the public: adopting a public health perspective. J Gambling Stud 2000;15:289–365.
4. Volberg RA. Problem gambling in the United States. J Gambling Stud 1996;12:111–128.
5. Bondolfi G, Osiek C, Ferrero F. Prevalence estimates of pathological gambling in Switzerland. Acta Psychiatr Scand 2000;101:473–475.
6. Volberg RA, Abbott MW, Rönnberg S, Munck IM. Prevalence and risks of pathological gambling in Sweden. Acta Psychiatr Scand 2001;104:250–256.
7. Ladouceur R, Bouchard C, Rhéaume N, et al. Is the SOGS an accurate measure of pathological gambling among children, adolescents and adults? J Gambling Stud 2000;16:1–24. [source]

Semi-empirical model for site effects on acceleration time histories at soft-soil sites. Part 1: formulation, development
EARTHQUAKE ENGINEERING AND STRUCTURAL DYNAMICS, Issue 11 2004

Abstract: A criterion is developed for the simulation of realistic artificial ground motion histories at soft-soil sites, corresponding to a detailed ground motion record at a reference firm-ground site.
A complex transfer function is defined as the Fourier transform of the ground acceleration time history at the soft-soil site divided by the Fourier transform of the acceleration record at the firm-ground site. Working with both the real and imaginary components of the transfer function, rather than with its modulus alone, preserves the statistical information about the wave phases (and therefore about the time variation of amplitudes and frequencies) in the algorithm used to generate the artificial records. Samples of these transfer functions, associated with a given pair of soft-soil and firm-ground sites, are determined empirically from the corresponding pairs of simultaneous records. Each function in a sample is represented as the superposition of the transfer functions of the responses of a number of oscillators. This formulation is intended to account for the contributions of trains of waves following different patterns in the vicinity of both sites. The properties of the oscillators serve as the parameters of the transfer functions; they vary from one seismic event to another. Part of this variation is systematic and can be explained by the influence of ground motion intensity on the effective stiffness and damping of the artificial oscillators. Another part is random in nature, reflecting the random characteristics of the wave-propagation patterns associated with the different events. The proposed semi-empirical model recognizes both types of variation. The influence of intensity is estimated by means of a conventional one-dimensional shear-wave propagation model, which is used to derive an intensity-dependent modification of the empirically determined model parameters in cases where the firm-ground earthquake intensity used to determine those parameters differs from that of the seismic event for which the simulated records are to be obtained. Copyright © 2004 John Wiley & Sons, Ltd.
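The core idea of this abstract, an empirical complex transfer function obtained as the ratio of Fourier transforms, retaining phase, can be illustrated with a short numerical sketch. The signals below are synthetic stand-ins (not the paper's data), and the damped single-oscillator site response is purely illustrative:

```python
import numpy as np

# Hypothetical synthetic records standing in for simultaneous
# firm-ground and soft-soil accelerograms (illustrative only).
rng = np.random.default_rng(0)
dt = 0.01                                   # sampling interval, s
t = np.arange(0.0, 20.0, dt)
firm = rng.standard_normal(t.size) * np.exp(-0.2 * t)   # firm-ground record

# Pretend the soft soil responds like one damped oscillator
# (the paper uses a superposition of several such oscillators).
f0, zeta = 1.5, 0.1                         # site frequency (Hz), damping ratio
freq = np.fft.rfftfreq(t.size, dt)
sdof = 1.0 / (1.0 - (freq / f0) ** 2 + 2j * zeta * freq / f0)
soft = np.fft.irfft(np.fft.rfft(firm) * sdof, n=t.size)  # soft-soil record

# Empirical complex transfer function: ratio of Fourier transforms.
# Keeping the complex ratio (not just its modulus) preserves phase,
# i.e. the time variation of amplitudes and frequencies.
H = np.fft.rfft(soft) / np.fft.rfft(firm)

# Applying H to the firm-ground record recovers the soft-soil motion.
reconstructed = np.fft.irfft(np.fft.rfft(firm) * H, n=t.size)
assert np.allclose(reconstructed, soft, atol=1e-8)
```

In practice, samples of `H` would be estimated from recorded event pairs, and the scatter between events is exactly the random component of variation the abstract describes.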
[source]

GEOLOGICAL MODEL EVALUATION THROUGH WELL TEST SIMULATION: A CASE STUDY FROM THE WYTCH FARM OILFIELD, SOUTHERN ENGLAND
JOURNAL OF PETROLEUM GEOLOGY, Issue 1 2007
S.Y. Zheng

This paper presents an approach to the evaluation of reservoir models using transient pressure data. Braided fluvial sandstones exposed in cliffs in SW England were studied as the surface equivalent of the Triassic Sherwood Sandstone, a reservoir unit at the nearby Wytch Farm oilfield. Three reservoir models were built, each using a different modelling approach, ranging in complexity from stochastic pixel-based modelling with commercially available software to a spreadsheet random-number generator. To test these models, numerical well-test simulations were conducted using sector models extracted from the geological models. The simulation results were then evaluated against the actual well-test data to find the model that best represented the field geology. Two wells at the Wytch Farm field were studied. For one of the sampled wells, the model built with the spreadsheet random-number generator gave the best match to the well-test data. In this well, the permeability from the test interpretation matched the geometric average permeability. This average is the "correct" upscaled permeability for a random system, which was consistent with the random nature of the geological model. For the second well, a more complex "channel object" model fitted the dynamic data better. All the models were built with stationary properties; however, the well-test data suggested that some parts of the field have different statistical properties and hence show non-stationarity. These differences would have to be built into a model representing the local geology. This study presents a workflow that is not yet standard in the oil industry, and the use of dynamic data to evaluate geological models requires further development.
The study highlights the fact that comparing results from reservoir models and well-test analyses is not always straightforward, in that different models may match different wells, and it emphasises the need for integrated analysis of geological and engineering data. The methods and procedures presented are intended to form a feedback loop that can be used to evaluate the representativeness of a geological model. [source]

Increasing sales by introducing non-salable items
MANAGERIAL AND DECISION ECONOMICS, Issue 8 2006
Kobi Kriesler

Rationality implies that adding 'irrelevant' and, in particular, inferior alternatives to the opportunity set cannot increase the choice probability of some other alternative. In this study, we propose a novel approach that can rationalize the intentional addition of such alternatives because it strictly increases the choice probability of some existing alternative. The driving forces behind the existence and extent of such an increase are the random nature of individual preferences, which implies intransitivity, and the random nature of the applied choice procedures. We study the case of a firm interested in increasing the sales of some of its existing products by introducing a new, inferior (non-salable) product. Our main results concern the feasibility and potential advantage of such a strategy. We first establish necessary and sufficient conditions for an increase in the sale probability, and then derive the maximal possible absolute and relative increase in this probability when the firm has extremely limited information on the characteristics of the consumers. We then derive analogous results assuming that the existing line of products consists of just two items and that the firm has accurate information on the consumers' stochastic preferences over the existing products. These latter results are illustrated with some experimental evidence.
The applicability of the approach is finally discussed briefly in the context of branding policy. Copyright © 2006 John Wiley & Sons, Ltd. [source]

Optimal control of work-in-process inventory of a two-station production line
OPTIMAL CONTROL APPLICATIONS AND METHODS, Issue 3 2010
A. Kokangul

Abstract: Most production lines keep a minimal level of inventory stock to save storage costs and buffer space. However, the random nature of processing, breakdown and repair times can significantly affect the efficiency of a production line and force the stocking of work-in-process inventory. We are interested in the case in which starvation and blockage are preferentially avoided. In this study, a mathematical model has been developed, using asymptotic approximation and simulation, that provides asymptotic results for the expected value and the variance of the stock level in a buffer as a function of time. In addition, the functional relationship between buffer capacity and the first stopping time caused by starvation or blockage has been determined. Copyright © 2009 John Wiley & Sons, Ltd. [source]

Methylation of acidic moieties in poly(methyl methacrylate-co-methacrylic acid) copolymers for end-group characterization by tandem mass spectrometry
RAPID COMMUNICATIONS IN MASS SPECTROMETRY, Issue 14 2010
Rémi Giordanengo

The complete structural characterization of a copolymer composed of methacrylic acid (MAA) and methyl methacrylate (MMA) units was achieved using tandem mass spectrometry. In a first step, collision-induced dissociation (CID) of sodiated MAA-MMA co-oligomers allowed us to determine the co-monomeric composition, the random nature of the copolymer and the sum of the end-group masses. However, dissociation reactions of MAA-based molecules mainly involve the acidic pendant groups, precluding individual characterization of the end groups.
Therefore, methylation of all the methacrylic acid moieties was performed to transform the MAA-MMA copolymer into a PMMA homopolymer, for which CID proceeds mainly via backbone cleavages. Using trimethylsilyldiazomethane as the derivatization agent, this methylation reaction was shown to be complete without affecting the end groups. Using fragmentation rules established for PMMA polymers, together with accurate mass measurements of the product ions and knowledge of the reagents used in the copolymer's synthesis, a structure could be proposed for both end groups; it was found to be consistent with signals obtained in nuclear magnetic resonance spectra. Copyright © 2010 John Wiley & Sons, Ltd. [source]
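As a closing aside on the upscaling point in the Wytch Farm abstract above: the geometric average is the appropriate upscaled permeability for a spatially random system, sitting between the harmonic average (flow across layers) and the arithmetic average (flow along layers). A minimal numerical illustration with hypothetical permeability values (not data from the study):

```python
import math

# Hypothetical permeability sample (mD) from a randomly arranged system;
# the values are illustrative only, not from the Wytch Farm dataset.
perms = [10.0, 50.0, 200.0, 5.0, 120.0]
n = len(perms)

arithmetic = sum(perms) / n                                # layer-parallel flow bound
harmonic = n / sum(1.0 / k for k in perms)                 # layer-series flow bound
geometric = math.exp(sum(math.log(k) for k in perms) / n)  # random-system upscaling

# The geometric mean always lies between the harmonic and arithmetic means,
# which is why a well test on a random system reads an intermediate value.
assert harmonic <= geometric <= arithmetic
```

Matching a well-test permeability against these three averages, as the paper does for its first well, is one quick diagnostic of whether a random model is a plausible description of the interval.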