Computer Program (computer + program)
Kinds of Computer Program: Selected Abstracts

Making Exchange Entitlements Operational: The Food Economy Approach to Famine Prediction and the RiskMap Computer Program. DISASTERS, Issue 2 2000. John Seaman. The effect of production failure or some other shock on household income and food supply depends upon: (a) the pattern of household income, and (b) the household's ability to compensate for any deficit which may have occurred, for example, by the sale of assets or by finding additional paid employment. The corollary is that predicting the likely effect of some event on the future state of the economy or food supply of a population of households requires an understanding of the economy of the households concerned and of the economic context to which these relate. This paper describes an attempt to develop an approach to prediction using a dynamic model of economy based on quantitative descriptions of household economy obtained by systematic rapid fieldwork, and summarises the experience of the use of this approach to date. [source]

The Patentability of Computer Programs in Europe: An Improved Interpretation of Articles 52(2) and (3) of the European Patent Convention. THE JOURNAL OF WORLD INTELLECTUAL PROPERTY, Issue 3 2010. Sigrid Sterckx. After the introduction we discuss the European Patent Convention (EPC) provisions that are relevant to the exclusion from patentability of computer programs and the broader relevance of the fact that the European Patent Office's (EPO's) Enlarged Board of Appeal has recently been requested by the EPO President to interpret these provisions. Next, we comment on the relevance of the recent EPC revision, before addressing what a computer program must be taken to mean for the purposes of the exclusion from patentability. After drawing attention to the conflict in case law that has developed in relation to the patentability of computer programs, and briefly summarizing the different approaches the EPO has taken to Article 52 of the EPC, we explain the evolution of these approaches, with particular attention to the EPO's dominant "technical character" approach. Subsequently, we address the questions put to the Enlarged Board and how they might be answered. We set out our proposal for the approach we believe the Enlarged Board should adopt. Since this approach might have effects beyond the field of computer programs, we show that EPO case law outside computer programs would not be altered by our approach. Two alternative approaches are then critically addressed before we set out our conclusion. [source]

Copyright Protection for Computer Programs in South Africa: Aspects of Sui Generis Categorization. THE JOURNAL OF WORLD INTELLECTUAL PROPERTY, Issue 4 2009. Lee-Ann Tong. This article considers the protection of computer software in South Africa. It deals specifically with copyright in computer programs as provided for in the Copyright Act 98 of 1978, which makes provision for the categorization of computer programs as a sui generis category of works distinct from literary works. It explores the level of copyright protection under this regime with reference to aspects such as the subsistence of copyright, authorship, ownership, duration, moral rights and infringement. It also considers the effect of the sui generis categorization on compliance with the Agreement on Trade-Related Aspects of Intellectual Property Rights and the anomalies that arise in the protection of preparatory work and computer programs. The focus is primarily on South African law. [source]

Computer programs for estimating substrate flux into steady-state biofilms from pseudoanalytical solutions. COMPUTER APPLICATIONS IN ENGINEERING EDUCATION, Issue 1 2002. Chetan T. Goudar. Fixed-film processes employing microorganisms attached to an inert surface (biofilms) are widely used for biological treatment of municipal and industrial wastewater. For optimal design and analysis of these processes, mathematical models are necessary that describe the dynamics of contaminant transport within these biofilms and the associated contaminant utilization by the microorganisms. However, the governing equations, which typically involve Fickian diffusion for contaminant transport and Monod kinetics for contaminant utilization, are inherently nonlinear and have no closed-form solutions except under special conditions. This can restrict their use in the classroom, as cumbersome numerical techniques must be used for their solution. This problem is well documented in the literature, and several authors have presented pseudoanalytical solutions that replace numerical solutions with algebraic equations. In the present study, we present pseudoanalytical-solution-based computer programs for estimating substrate flux and biofilm thickness for a steady-state biofilm. Depending upon the intended end use, these programs can either partially or totally automate the solution process. In the partial automation mode, they can serve to enhance student understanding of important concepts related to steady-state biofilms, while complete automation can help bring more challenging and realistic problems associated with steady-state biofilms into the classroom. The programs have been tested on MATLAB version 5.0 and are available as freeware for educational purposes. © 2002 Wiley Periodicals, Inc. Comput Appl Eng Educ 10: 26–32, 2002; Published online in Wiley InterScience (www.interscience.wiley.com); DOI 10.1002/cae.10017 [source]
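The classroom problem this abstract describes, Fickian diffusion balanced against Monod-rate utilization in a steady-state biofilm, can be solved with the kind of numerical boundary-value computation that the pseudoanalytical algebraic solutions are designed to replace. A minimal sketch follows (Python rather than the paper's MATLAB; all parameter values are hypothetical), using Picard iteration on a finite-difference grid and reporting the substrate flux at the biofilm surface.

```python
import numpy as np

# Steady-state biofilm: D_f * d2S/dz2 = k * X_f * S / (K_s + S)
# BCs: dS/dz = 0 at the substratum (z = 0), S = S_s at the surface (z = L_f).
# All parameter values below are illustrative, not taken from the cited paper.
D_f, k, X_f, K_s = 1.0e-9, 8.0e-5, 20.0e3, 0.01   # hypothetical units
L_f, S_s, n = 100e-6, 0.02, 201                    # thickness, surface conc., nodes

z = np.linspace(0.0, L_f, n)
h = z[1] - z[0]
S = np.full(n, S_s)                                # initial guess

for _ in range(200):                               # Picard: freeze the Monod term
    r = k * X_f / (K_s + S)                        # linearized reaction coefficient
    A = np.zeros((n, n))
    b = np.zeros(n)
    A[0, 0], A[0, 1] = -1.0, 1.0                   # zero-flux: (S1 - S0)/h = 0
    for i in range(1, n - 1):
        A[i, i - 1] = A[i, i + 1] = D_f / h**2
        A[i, i] = -2.0 * D_f / h**2 - r[i]
    A[-1, -1] = 1.0
    b[-1] = S_s                                    # Dirichlet at the biofilm surface
    S_new = np.linalg.solve(A, b)
    if np.max(np.abs(S_new - S)) < 1e-12:
        S = S_new
        break
    S = S_new

flux = D_f * (S[-1] - S[-2]) / h                   # flux into the biofilm surface
print(f"flux into biofilm ~ {flux:.3e} (per unit area)")
```

In the pseudoanalytical approach of the abstract, this whole boundary-value solve is replaced by algebraic flux expressions, which is what makes the method practical for classroom use.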
Statistical issues on the determination of the no-observed-adverse-effect levels in toxicology. ENVIRONMETRICS, Issue 4 2001. Takashi Yanagawa. The determination of a safe exposure level for toxic agents, often defined as the highest dose level with no toxic effect and termed the no-observed-adverse-effect level (NOAEL), is reviewed. The conventional methods based on statistical tests are criticized, particularly when the sample size is small, and an alternative method, based on the Akaike information criterion (AIC), is discussed. The method is extended to the estimation of the NOAEL for continuous data. Computer programs for Windows 95/NT for determining the NOAEL by the AIC approach are developed, and their application to practical data is illustrated with examples. Copyright © 2001 John Wiley & Sons, Ltd. [source]
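One way to operationalize the AIC approach for continuous data, sketched here only as an illustration and not as the paper's actual program, is to compare normal models in which all dose groups up to a candidate NOAEL share a single "no effect" mean, and to pick the candidate minimizing AIC. The data below are made up.

```python
import numpy as np

# Hypothetical dose-response data: control plus four dose groups.
groups = [np.array([10.1, 9.8, 10.3, 10.0]),   # dose 0 (control)
          np.array([10.2, 9.9, 10.1, 10.4]),   # dose 1
          np.array([10.0, 10.3, 9.7, 10.2]),   # dose 2
          np.array([11.0, 11.4, 10.9, 11.2]),  # dose 3
          np.array([12.1, 12.5, 11.8, 12.3])]  # dose 4

def aic_for_candidate(k):
    """Model M_k: dose groups 0..k share the control mean (no adverse effect)."""
    pooled = np.concatenate(groups[:k + 1])
    fitted = [np.full(pooled.size, pooled.mean())]
    fitted += [np.full(g.size, g.mean()) for g in groups[k + 1:]]
    data = np.concatenate([pooled] + groups[k + 1:])
    resid = data - np.concatenate(fitted)
    sigma2 = (resid ** 2).mean()                      # MLE of the common variance
    loglik = -0.5 * resid.size * (np.log(2 * np.pi * sigma2) + 1)
    n_par = 1 + (len(groups) - 1 - k) + 1             # shared mean + free means + variance
    return 2 * n_par - 2 * loglik

aics = {k: aic_for_candidate(k) for k in range(len(groups))}
noael = min(aics, key=aics.get)
print("AIC by candidate NOAEL group:", {k: round(v, 2) for k, v in aics.items()})
print("selected NOAEL = dose group", noael)
```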
The World Mental Health (WMH) Survey Initiative version of the World Health Organization (WHO) Composite International Diagnostic Interview (CIDI). INTERNATIONAL JOURNAL OF METHODS IN PSYCHIATRIC RESEARCH, Issue 2 2004. Ronald C. Kessler. This paper presents an overview of the World Mental Health (WMH) Survey Initiative version of the World Health Organization (WHO) Composite International Diagnostic Interview (CIDI) and a discussion of the methodological research on which the development of the instrument was based. The WMH-CIDI includes a screening module and 40 sections that focus on diagnoses (22 sections), functioning (four sections), treatment (two sections), risk factors (four sections), socio-demographic correlates (seven sections), and methodological factors (two sections). Innovations compared to earlier versions of the CIDI include expansion of the diagnostic sections, a focus on 12-month as well as lifetime disorders in the same interview, detailed assessment of clinical severity, and inclusion of information on treatment, risk factors, and consequences. A computer-assisted version of the interview is available, along with a direct data entry software system that can be used to keypunch responses to the paper-and-pencil version of the interview. Computer programs that generate diagnoses based on both ICD-10 and DSM-IV criteria are also available. Elaborate CD-ROM-based training materials are available to teach interviewers how to administer the interview, as well as to teach supervisors how to monitor the quality of data collection. Copyright © 2004 Whurr Publishers Ltd. [source]

Optimal feeder bus routes on irregular street networks. JOURNAL OF ADVANCED TRANSPORTATION, Issue 2 2000. Steven Chien. The methodology presented here seeks to optimize bus routes feeding a major intermodal transit transfer station while considering intersection delays and realistic street networks. A model is developed for finding the optimal bus route location and its operating headway in a heterogeneous service area. The criterion for optimality is the minimum total cost, including supplier and user costs. Irregular and discrete demand distributions, which realistically represent geographic variations in demand, are considered in the proposed model. The optimal headway is derived analytically for an irregularly shaped service area without demand elasticity, with non-uniformly distributed demand density, and with a many-to-one travel pattern. Computer programs are designed to analyze numerical examples, which show that this combinatorial routing problem can be globally optimized. The improved computational efficiency of the near-optimal algorithm is demonstrated through numerical comparisons to an optimal solution obtained by an exhaustive search (ES) algorithm. The CPU time spent by each algorithm is also compared to demonstrate that the near-optimal algorithm converges to an acceptable solution significantly faster than the ES algorithm. [source]
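The analytic headway derivation mentioned above can be illustrated with the classic square-root trade-off between supplier cost and user waiting cost; the sketch below uses hypothetical cost coefficients and uniform demand, a deliberate simplification of the paper's irregular, discrete demand model.

```python
import numpy as np

# Total cost per hour as a function of headway h (hours):
#   supplier cost ~ c_o / h          (more frequent buses cost more)
#   user wait cost ~ c_w * q * h / 2 (average wait of h/2 per rider)
# All values are hypothetical, for illustration only.
c_o = 120.0   # operating cost per dispatched bus ($/bus)
c_w = 10.0    # value of waiting time ($/passenger-hour)
q = 60.0      # boarding demand (passengers/hour)

total = lambda h: c_o / h + 0.5 * c_w * q * h

# Analytic optimum: d(total)/dh = -c_o/h^2 + c_w*q/2 = 0 => h* = sqrt(2 c_o / (c_w q))
h_star = np.sqrt(2.0 * c_o / (c_w * q))

# Cross-check against a brute-force grid search.
grid = np.linspace(0.05, 2.0, 4000)
h_grid = grid[np.argmin(total(grid))]
print(f"analytic h* = {h_star:.3f} h, grid h* = {h_grid:.3f} h, "
      f"min cost = {total(h_star):.1f} $/h")
```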
Global Model for Optimizing Crossflow Microfiltration and Ultrafiltration Processes: A New Predictive and Design Tool. BIOTECHNOLOGY PROGRESS, Issue 4 2005. Gautam Lal Baruah. A global model and algorithm that predict the performance of crossflow MF and UF processes, individually or in combination, in the laminar flow regime are presented and successfully tested. The model accounts for solute polydispersity, ionic environment, electrostatics, membrane properties and operating conditions. Computer programs were written in Fortran 77 for different versions of the model algorithm that can rapidly optimize MF/UF processes in terms of yield, purity, selectivity, or processing time. The model is validated successfully with three test cases: separation of bovine serum albumin (BSA) from hemoglobin (Hb), capture of immunoglobulin (IgG) from transgenic goat milk by MF, and separation of BSA from IgG by UF. These comparisons demonstrate the capability of the global model to conduct realistic in silico simulations of MF and UF processes. This model and algorithm should prove an invaluable tool for rapidly designing new, or optimizing existing, MF and UF processes, separately or in combination, in both pressure-dependent and pressure-independent regimes. [source]

An elaborate education of basic genetic programming using C++. COMPUTER APPLICATIONS IN ENGINEERING EDUCATION, Issue 3 2010. Nirod C. Sahoo. Evolutionary search is a global search method based on natural selection. In the engineering curriculum, these techniques are taught in courses such as Evolutionary Computation and Engineering Optimization. The genetic algorithm (GA) is popular among these algorithms. Genetic programming (GP), developed by John Koza, is a powerful extension of GA in which a chromosome/computer program (CP) is coded as a rooted, point-labeled tree with ordered branches. The search space is the space of all possible CPs (trees) consisting of functions and terminals appropriate to the problem domain. Like GA, GP uses crossover and mutation for evolution. Due to the tree-structured coding of individuals, the generation of the initial population, the use of the genetic operators, and tree decoding for fitness evaluation demand careful computer programming. This article describes the programming steps of a GP implementation (using the C++ language) for students' easy understanding, with pseudocode for each step. Two application examples are also illustrated. © 2009 Wiley Periodicals, Inc. Comput Appl Eng Educ 18: 434–448, 2010; View this article online at wileyonlinelibrary.com; DOI 10.1002/cae.20165 [source]
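As a taste of the programming steps the article walks through (random generation of rooted point-labeled trees, tree decoding for fitness evaluation, and mutation over a function/terminal set), here is a minimal sketch. It follows Koza-style tree coding but is written in Python rather than the article's C++, and the function set and probabilities are arbitrary choices.

```python
import random
import operator

FUNCS = {'+': operator.add, '-': operator.sub, '*': operator.mul}
TERMS = ['x', 1.0, 2.0]

def grow(depth):
    """Randomly grow a rooted, point-labeled expression tree."""
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMS)
    op = random.choice(list(FUNCS))
    return (op, grow(depth - 1), grow(depth - 1))

def evaluate(tree, x):
    """Decode a tree: terminals are constants or the variable 'x'."""
    if tree == 'x':
        return x
    if not isinstance(tree, tuple):
        return tree
    op, left, right = tree
    return FUNCS[op](evaluate(left, x), evaluate(right, x))

def mutate(tree, depth=2):
    """Subtree mutation: replace a random subtree with a freshly grown one."""
    if not isinstance(tree, tuple) or random.random() < 0.2:
        return grow(depth)
    op, left, right = tree
    if random.random() < 0.5:
        return (op, mutate(left, depth), right)
    return (op, left, mutate(right, depth))

random.seed(1)
t = grow(3)
print("tree:", t)
print("f(2.0) =", evaluate(t, 2.0))
print("mutant:", mutate(t))
```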
A digital simulation of the vibration of a two-mass two-spring system. COMPUTER APPLICATIONS IN ENGINEERING EDUCATION, Issue 3 2010. Wei-Pin Lee. In this study, we developed a computer program to simulate the vibration of a two-mass two-spring system using Visual Basic. Users can enter data for the two-mass two-spring system. The software derives the eigenvalue problem from the input data, solves it, and illustrates the results numerically and graphically on the screen. In addition, the program uses animation to demonstrate the motions of the two masses. The displacements, velocities, and accelerations of the two bodies can be shown if the corresponding checkboxes are selected. This program can be used in teaching courses such as Linear Algebra, Advanced Engineering Mathematics, Vibrations, and Dynamics. Use of the software may help students to understand the applications of eigenvalue problems and related topics such as modes of vibration, natural frequencies, and systems of differential equations. © 2009 Wiley Periodicals, Inc. Comput Appl Eng Educ 18: 563–573, 2010; View this article online at wileyonlinelibrary.com; DOI 10.1002/cae.20241 [source]
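The eigenvalue problem such a program derives can be reproduced in a few lines. For a chain arrangement (wall, spring k1, mass m1, spring k2, mass m2; the program's exact configuration is an assumption here), free vibration M x'' + K x = 0 leads to the generalized eigenvalue problem K v = w^2 M v, whose eigenvalues are the squared natural frequencies.

```python
import numpy as np
from scipy.linalg import eigh

# Two-mass two-spring chain: wall -- k1 -- m1 -- k2 -- m2 (values hypothetical).
m1, m2 = 2.0, 1.0
k1, k2 = 100.0, 50.0

M = np.diag([m1, m2])
K = np.array([[k1 + k2, -k2],
              [-k2,      k2]])

# M x'' + K x = 0 with x = v e^{iwt}  =>  K v = w^2 M v
w2, modes = eigh(K, M)              # generalized symmetric eigenproblem
freqs = np.sqrt(w2)                 # natural frequencies (rad/s)

for i, (w, v) in enumerate(zip(freqs, modes.T), start=1):
    print(f"mode {i}: omega = {w:.3f} rad/s, shape = {v / np.abs(v).max()}")
```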
Digital simulation of the transformation of plane stress. COMPUTER APPLICATIONS IN ENGINEERING EDUCATION, Issue 1 2009. Wei-Pin Lee. In this study, we developed a computer program to simulate the transformation of plane stress using Visual Basic.NET. We applied the equations of stress transformation to plane stress problems to calculate the stresses with respect to the 1–2 axes, which are rotated counterclockwise through an angle θ about the x–y origin, and showed the results visually on the screen. In addition, we used animation to observe the change of plane stress. This program was then used in teaching courses such as Mechanics of Materials and Linear Algebra. Use of the software may help students to understand principal stresses, principal axes, Mohr's circle, eigenvalues, eigenvectors, similar matrices, and invariants. © 2008 Wiley Periodicals, Inc. Comput Appl Eng Educ 17: 25–33, 2009; Published online in Wiley InterScience (www.interscience.wiley.com); DOI 10.1002/cae.20180 [source]
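The transformation equations behind this simulation are the standard plane-stress relations; a short sketch with hypothetical stress values shows the rotated components, the principal stresses obtained as eigenvalues of the stress matrix, and the invariance of the trace, exactly the bridge between Mechanics of Materials and Linear Algebra that the article exploits.

```python
import numpy as np

def transform_plane_stress(sx, sy, txy, theta_deg):
    """Stress components on axes rotated counterclockwise by theta."""
    t = np.radians(theta_deg)
    cs, sn = np.cos(2.0 * t), np.sin(2.0 * t)
    sxp = 0.5 * (sx + sy) + 0.5 * (sx - sy) * cs + txy * sn
    syp = 0.5 * (sx + sy) - 0.5 * (sx - sy) * cs - txy * sn
    txyp = -0.5 * (sx - sy) * sn + txy * cs
    return sxp, syp, txyp

sx, sy, txy = 80.0, -20.0, 40.0          # MPa, hypothetical state of plane stress
for th in (0.0, 30.0, 45.0):
    a, b, c = transform_plane_stress(sx, sy, txy, th)
    print(f"theta={th:5.1f}: s1'={a:8.2f} s2'={b:8.2f} t12'={c:8.2f} trace={a+b:7.2f}")

# Principal stresses = eigenvalues of the stress matrix; directions = eigenvectors.
S = np.array([[sx, txy], [txy, sy]])
vals, vecs = np.linalg.eigh(S)
print("principal stresses:", vals, "(trace invariant:", vals.sum(), ")")
```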
Integrating Messy Genetic Algorithms and Simulation to Optimize Resource Utilization. COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, Issue 6 2009. Tao-ming Cheng. Various resource distribution modeling scenarios were tested in simulation to determine their system performances. Messy genetic algorithm (MGA) operations were then applied in the selection of the best resource utilization schemes based on those performances. A case study showed that this new modeling mechanism, along with the implemented computer program, could not only ease the process of developing optimal resource utilization, but could also improve the system performance of the simulation model. [source]

Numerical Methods to Simulate and Visualize Detailed Crane Activities. COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, Issue 3 2009. Shih-Chung Kang. One major consideration for virtual construction is the simulation of the operation of construction equipment for a construction project. This research focuses on developing a mathematical model to support the simulation and visualization of cranes, the most critical equipment in terms of project controls. The model is composed of two submodels: a kinematics model and a dynamics model. The kinematics model represents the crane components that are controlled directly by the operators. The dynamics model represents the dynamic behavior of the suspended system (including the cable and the rigged object), which cannot be controlled directly by the operators. To verify the feasibility of these methods, a computer program that simulates and visualizes detailed crane activities was developed. This program supports real-time visualization of crane activities with a high degree of realism and accuracy, and also enables the detailed simulation of long-term construction projects. [source]

Steady-State Analysis of Water Distribution Networks Including Pressure-Reducing Valves. COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, Issue 4 2001. L. Khezzar. Hydraulic networks that contain controlling elements such as pressure-reducing valves (PRVs) are difficult to simulate. Limited literature exists on the explicit modeling of PRVs in a general solution procedure for steady-state analysis of water distribution systems. It is also known that the inclusion of PRVs may lead to numerical difficulties. The objective of this article is to develop and present in sufficient detail the modeling of PRVs in combination with the linear theory method for steady-state analysis of water distribution networks. The presentation is explicit enough to lead to a robust algorithm that can be implemented directly in a computer program. The general methodology for simulating water distribution networks, which embodies graph-theoretic concepts, hydraulic theory, and numerical algorithms, is reviewed. [source]

Individual-based Computational Modeling of Smallpox Epidemic Control Strategies. ACADEMIC EMERGENCY MEDICINE, Issue 11 2006. Donald S. Burke, MD. In response to concerns about possible bioterrorism, the authors developed an individual-based (or "agent-based") computational model of smallpox epidemic transmission and control. The model explicitly represents an "artificial society" of individual human beings, each implemented as a distinct object, or data structure, in a computer program. These agents interact locally with one another in code-represented social units such as homes, workplaces, schools, and hospitals. Over many iterations, these microinteractions generate large-scale macroscopic phenomena of fundamental interest, such as the course of an epidemic in space and time. Model variables (incubation periods, clinical disease expression, contagiousness, and physical mobility) were assigned realistic values agreed on by an advisory group of experts on smallpox. Eight response scenarios were evaluated at two epidemic scales: an introduction of ten smallpox cases into a 6,000-person town, and an introduction of 500 smallpox cases into a 50,000-person town. The modeling exercise showed that contact tracing and vaccination of household, workplace, and school contacts, along with prompt reactive vaccination of hospital workers and isolation of diagnosed cases, could contain smallpox at both epidemic scales examined. [source]
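The "artificial society" idea above, distinct agent objects mixing only within shared social units, can be conveyed with a deliberately tiny sketch. The disease parameters, unit sizes, and SEIR-like state machine below are placeholders, not the published model's expert-calibrated values.

```python
import random

random.seed(7)

class Agent:
    """One individual, implemented as a distinct object with local mixing units."""
    def __init__(self, home, work):
        self.state = 'S'                      # S, E (incubating), I (contagious), R
        self.timer = 0
        self.units = (('home', home), ('work', work))

# A toy town: 200 agents spread over 50 homes and 10 workplaces (arbitrary sizes).
agents = [Agent(i % 50, i % 10) for i in range(200)]
for a in random.sample(agents, 5):            # seed introductions
    a.state, a.timer = 'E', 3

def step(p_transmit=0.08, incubation=3, contagious=5):
    """One day: transmission only inside units shared with a contagious agent."""
    hot = {u for a in agents if a.state == 'I' for u in a.units}
    for a in agents:
        if a.state == 'S':
            if any(u in hot for u in a.units) and random.random() < p_transmit:
                a.state, a.timer = 'E', incubation
        elif a.state == 'E':
            a.timer -= 1
            if a.timer == 0:
                a.state, a.timer = 'I', contagious
        elif a.state == 'I':
            a.timer -= 1
            if a.timer == 0:
                a.state = 'R'

for day in range(40):
    step()
print({s: sum(a.state == s for a in agents) for s in 'SEIR'})
```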
Structural and biophysical simulation of angiogenesis and vascular remodeling. DEVELOPMENTAL DYNAMICS, Issue 4 2001. Ralf Gödde. The purpose of this report is to introduce a new computer model for the simulation of microvascular growth and remodeling into arteries and veins that imitates angiogenesis and blood flow in real vascular plexuses. A C++ computer program was developed based on geometric and biophysical initial and boundary conditions. Geometry was defined on a two-dimensional isometric grid by using defined sources and drains and elementary bifurcations that were able to proliferate or regress under the influence of random and deterministic processes. Biophysics was defined by pressure, flow, and velocity distributions in the network, using the nodal-admittance-matrix method and accounting for hemodynamic peculiarities such as the Fahraeus-Lindqvist effect and exchange with extravascular tissue. The proposed model is the first to simulate interdigitation between the terminal branches of arterial and venous trees. This was achieved by including vessel regression and anastomosis in the capillary plexus and by remodeling in dependence on hemodynamics. The choice of regulatory properties influences the resulting vascular patterns. The model predicts interdigitating arteriovenous patterning if shear-stress-dependent, but not pressure-dependent, remodeling is applied. By approximating the variability of natural vascular patterns, we hope to better understand homogeneity of transport, spatial distribution of hemodynamic properties, and biomass allocation to the vascular wall or blood during development, or during the evolution of circulatory systems. © 2001 Wiley-Liss, Inc. [source]

The effect of thiazolidinediones on adiponectin serum level: a meta-analysis. DIABETES, OBESITY & METABOLISM, Issue 5 2008. N. Riera-Guardia. Background and aims: Adiponectin is a hormone mainly produced by white adipose tissue. Decreased levels of adiponectin are linked with visceral obesity, insulin resistance states, and cardiovascular diseases. Recently, several studies have pointed out an increase in adiponectin serum levels in subjects undergoing treatment with thiazolidinediones (TZDs). The aim of this study is to systematically review the current state of evidence on the effect of TZDs on adiponectin serum level, with special attention to avoiding publication bias. Materials and methods: An extensive literature search was performed. The Meta Analysis Version 2.0 computer program was used to calculate standardized differences in means and 95% confidence intervals (CIs). Publication bias was assessed using different statistical approaches. Results: In the meta-analysis, which included 19 studies, the overall standardized mean difference was 0.94 (95% CI 0.81–1.06), which means that subjects treated with TZDs on average had adiponectin concentrations about one standard deviation higher than the comparison groups, even after controlling for possible biases. Conclusions: The results agree with a moderate increase of serum adiponectin. They clearly reveal an increase of endogenous serum adiponectin levels with intake of TZDs and may point to a potential new option for managing obesity-related diseases. [source]
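The pooled standardized mean difference reported above is mechanically simple to reproduce; the sketch below uses made-up study summaries, Cohen's d, and fixed-effect inverse-variance pooling, whereas the paper's software may use different weighting or small-sample corrections.

```python
import numpy as np

# Hypothetical per-study summaries: (mean_t, sd_t, n_t, mean_c, sd_c, n_c)
studies = [(9.5, 3.0, 40, 6.4, 3.2, 38),
           (8.1, 2.5, 25, 5.9, 2.4, 27),
           (10.2, 3.8, 60, 7.0, 3.5, 61)]

d, var = [], []
for mt, st, nt, mc, sc, nc in studies:
    sp = np.sqrt(((nt - 1) * st**2 + (nc - 1) * sc**2) / (nt + nc - 2))  # pooled SD
    di = (mt - mc) / sp                                                  # Cohen's d
    vi = (nt + nc) / (nt * nc) + di**2 / (2 * (nt + nc))                 # approx. variance
    d.append(di)
    var.append(vi)

w = 1.0 / np.array(var)                          # inverse-variance weights
d = np.array(d)
pooled = np.sum(w * d) / np.sum(w)
se = np.sqrt(1.0 / np.sum(w))
lo, hi = pooled - 1.96 * se, pooled + 1.96 * se
print(f"pooled SMD = {pooled:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```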
A computer program (WDTSRP) designed for computation of sand drift potential (DP) and plotting sand roses. EARTH SURFACE PROCESSES AND LANDFORMS, Issue 6 2007. W. A. Saqqa. Wind Data Tabulator and Sand Rose Plotter (WDTSRP) is an interactive computer program for estimating sand transport potential by winds in barren sandy deserts. The Fryberger (1979) formula for determining sand drift potential (DP) was adopted to create and develop the program. WDTSRP is capable of working out weighting factors (WFs), frequency of wind speed occurrence (t), drift potential (DP), resultant drift potential (RDP) and directional variability of winds (DV), and of plotting sand roses. The program is built on a simplified system driven by a group of options and dialogue boxes that allow users to input and handle data easily and systematically. Copyright © 2006 John Wiley & Sons, Ltd. [source]
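Fryberger's weighting assigns sand-moving power to each wind class as V^2 (V - Vt) times the fraction of time the wind blows in that class; summing magnitudes gives DP, summing vectors gives RDP, and the ratio RDP/DP measures directional variability. The sketch below uses a hypothetical wind summary in knots and an assumed threshold velocity; class bounds and unit scaling vary between implementations.

```python
import numpy as np

# Hypothetical wind summary: (direction_deg, mean_speed_knots, frequency_percent)
wind = [(0, 15, 6.0), (45, 18, 3.5), (90, 14, 2.0), (180, 20, 5.0),
        (225, 16, 8.0), (270, 13, 1.5)]
V_t = 12.0   # threshold wind velocity for sand movement (knots), assumed

dp_total, vx, vy = 0.0, 0.0, 0.0
for direc, V, t in wind:
    if V <= V_t:
        continue
    dp = V**2 * (V - V_t) * (t / 100.0)   # Fryberger weighting, vector units
    dp_total += dp
    rad = np.radians(direc)               # wind blows FROM direc; transport is opposite
    vx += dp * -np.sin(rad)
    vy += dp * -np.cos(rad)

rdp = np.hypot(vx, vy)
rdd = (np.degrees(np.arctan2(vx, vy)) + 360.0) % 360.0   # resultant drift direction
print(f"DP = {dp_total:.0f} VU, RDP = {rdp:.0f} VU, "
      f"RDP/DP = {rdp / dp_total:.2f}, RDD = {rdd:.0f} deg")
```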
Linear analysis of concrete arch dams including dam–water–foundation rock interaction considering spatially varying ground motions. EARTHQUAKE ENGINEERING AND STRUCTURAL DYNAMICS, Issue 7 2010. Jin-Ting Wang. The available substructure method and computer program for earthquake response analysis of arch dams, including the effects of dam–water–foundation rock interaction and recognizing the semi-unbounded size of the foundation rock and fluid domains, are extended to consider spatial variations in ground motions around the canyon. The response of Mauvoisin Dam in Switzerland to spatially varying ground motion recorded during a small earthquake is analyzed to illustrate the results from this analysis procedure. Copyright © 2009 John Wiley & Sons, Ltd. [source]

Maximum likelihood estimators of population parameters from doubly left-censored samples. ENVIRONMETRICS, Issue 8 2006. Abou El-Makarim A. Aboueissa. Left-censored data often arise in environmental contexts with one or more detection limits (DLs). Estimators of the parameters are derived for left-censored data having two detection limits, DL1 and DL2, assuming an underlying normal distribution. Two different approaches for calculating the maximum likelihood estimates (MLEs) are given and examined. These methods also apply to lognormally distributed environmental data with two distinct detection limits. The performance of the new estimators is compared utilizing many simulated data sets. Examples are given illustrating the use of these methods, utilizing a computer program given in the Appendix. Copyright © 2006 John Wiley & Sons, Ltd. [source]
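For a normal sample censored at two detection limits, the observed-data log-likelihood combines density terms for detected values with a Φ term for each nondetect; direct numerical maximization, shown below on simulated data, is one standard route to the MLEs and is not necessarily either of the paper's two approaches.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

rng = np.random.default_rng(0)
x = rng.normal(1.0, 0.8, size=200)     # simulated data (logs would be used for lognormal)
DL1, DL2 = 0.3, 0.7                    # two detection limits (hypothetical)
half = len(x) // 2
cens1 = x[:half] < DL1                 # first half measured by the DL1 method
cens2 = x[half:] < DL2                 # second half measured by the DL2 method
obs = np.concatenate([x[:half][~cens1], x[half:][~cens2]])
n_c1, n_c2 = int(cens1.sum()), int(cens2.sum())

def negloglik(p):
    mu, log_sigma = p
    s = np.exp(log_sigma)                       # keep sigma positive
    ll = norm.logpdf(obs, mu, s).sum()          # detected observations
    ll += n_c1 * norm.logcdf((DL1 - mu) / s)    # nondetects below DL1
    ll += n_c2 * norm.logcdf((DL2 - mu) / s)    # nondetects below DL2
    return -ll

res = minimize(negloglik, x0=[obs.mean(), np.log(obs.std())], method='Nelder-Mead')
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
print(f"MLE: mu = {mu_hat:.3f}, sigma = {sigma_hat:.3f} "
      f"({n_c1} nondetects below {DL1}, {n_c2} below {DL2})")
```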
Rapid Cooling Aborts Seizure-Like Activity in Rodent Hippocampal-Entorhinal Slices. EPILEPSIA, Issue 10 2000. Matthew W. Hill. Purpose: As a preliminary step in the development of an implantable Peltier device to abort focal neocortical seizures in vivo, we have examined the effect of rapid cooling on seizures in rodent hippocampal-entorhinal slices. Methods: Seizure-like discharges were induced by exposing the slices to extracellular saline containing 4-aminopyridine (50 µmol/L). Results: When we manually activated a Peltier device that was in direct contact with the slice, seizures terminated within seconds of the onset of cooling, sometimes preceding a detectable decrease in temperature measured near the top of the slice. However, activation of the Peltier device did not stop seizures when slices were no longer in direct physical contact with the device, indicating that this was not a field effect. When cooling was shut off and the temperature returned to 33 °C, bursting sometimes returned, but a longer-term suppressive effect on seizure activity could be observed. In two of our experiments, a custom computer program automatically detected seizure discharges and triggered a transistor-transistor logic pulse to activate the Peltier device. In these experiments, the Peltier device automatically terminated the slice bursting in less than 4 seconds. When the Peltier device was placed in contact with the normal, exposed cortex of a newborn pig, we found that the cortical temperature decreased rapidly from 36 °C to as low as 26 °C at a depth of 1.7 mm below the cooling unit. Conclusions: These experiments show that local cooling may rapidly terminate focal paroxysmal discharges and might be adapted for clinical practice. [source]

Successive expansion method of network planning applying symbolic analysis method. EUROPEAN TRANSACTIONS ON ELECTRICAL POWER, Issue 4 2002. Škokljev. The conventional successive expansion method for power system network planning is discussed in the context of the new paradigm of a competitive market for electric power, energy, and services. The paper then presents an application of a conceptually new computer program based on symbolic analysis of load flows in power system networks. The network parameters and variables are defined as symbols. The symbolic analyzer, which models power system DC load flows analytically, enables sensitivity analysis of the power system with respect to variations in parameters and variables (costs, transfers, injections), a valuable tool for expansion planning analysis. That capability is not available in the conventional approach, which relies on compensation methods, precalculated distribution factors, and so on. This novel application sheds some light on the traditional power system network expansion planning method, as well as on its possible application within system network expansion planning in the new environment of the competitive electric power market. [source]

The analysis of thermoelastic isopachic data from crack tip stress fields. FATIGUE & FRACTURE OF ENGINEERING MATERIALS AND STRUCTURES, Issue 4 2000. Dulieu-Barton. A computer program, FACTUS (fracture analysis of crack tips using SPATE), has been developed for the efficient analysis of thermoelastic data obtained from around a crack tip. The program is based on earlier work for the determination of stress intensity factors (SIFs), and also includes a novel solution procedure for the derivation of the non-singular stress term σ0x. The program has been used in the analysis of a series of large plate specimens with central or edge slots/cracks. The derived SIFs are compared with independent values. Issues such as crack closure and the extent and effect of the plastic zone are discussed. [source]

Identification of protein-coding genes in the genome of Vibrio cholerae with more than 98% accuracy using occurrence frequencies of single nucleotides. FEBS JOURNAL, Issue 15 2001. Ju Wang. The published sequence of the Vibrio cholerae genome indicates that, in addition to the genes that encode proteins of known and unknown function, there are 1577 ORFs identified as conserved hypothetical or hypothetical gene candidates. Because the annotation is not 100% accurate, it is not known which of the 1577 ORFs are true protein-coding genes. In this paper, an algorithm based on the Z curve method, with sensitivity, specificity and accuracy greater than 98%, is used to solve this problem. Twenty-fold cross-validation tests show that the accuracy of the algorithm is 98.8%. A detailed discussion of the mechanism of the algorithm is also presented. It was found that 172 of the 1577 ORFs are unlikely to be protein-coding genes. The number of protein-coding genes in the V. cholerae genome was re-estimated and found to be approximately 3716. This result should be of use in microarray analysis of gene expression in the genome, because the cost of preparing chips may be somewhat decreased. A computer program was written to calculate a coding score, called VCZ, for gene identification in the genome. Coding/noncoding is determined simply by VCZ > 0/VCZ < 0. The program is freely available on request for academic use. [source]

Semiparametric variance-component models for linkage and association analyses of censored trait data. GENETIC EPIDEMIOLOGY, Issue 7 2006. G. Diao. Variance-component (VC) models are widely used for linkage and association mapping of quantitative trait loci in general human pedigrees. Traditional VC methods assume that the trait values within a family follow a multivariate normal distribution and are fully observed. These assumptions are violated if the trait data contain censored observations. When the trait pertains to age at onset of disease, censoring is inevitable because of loss to follow-up and limited study duration. Censoring also arises when the trait assay cannot detect values below (or above) certain thresholds. The latent trait values tend to have a complex distribution. Applying traditional VC methods to censored trait data would inflate type I error and reduce power. We present valid and powerful methods for the linkage and association analyses of censored trait data. Our methods are based on a novel class of semiparametric VC models, which allows an arbitrary distribution for the latent trait values. We construct an appropriate likelihood for the observed data, which may contain left- or right-censored observations. The maximum likelihood estimators are approximately unbiased, normally distributed, and statistically efficient. We develop stable and efficient numerical algorithms to implement the corresponding inference procedures. Extensive simulation studies demonstrate that the proposed methods outperform the existing ones in practical situations. We provide an application to the age at onset of alcohol dependence data from the Collaborative Study on the Genetics of Alcoholism. A computer program is freely available. Genet. Epidemiol. 2006. © 2006 Wiley-Liss, Inc. [source]

Estimating the power of variance component linkage analysis in large pedigrees. GENETIC EPIDEMIOLOGY, Issue 6 2006. Wei-Min Chen. Variance component linkage analysis is commonly used to map quantitative trait loci (QTLs) in general pedigrees. Large pedigrees are especially attractive for these studies because they provide greater power per genotyped individual than small pedigrees. We propose accurate and computationally efficient methods to calculate the analytical power of variance component linkage analysis that can accommodate large pedigrees. Our analytical power computation involves the approximation of the noncentrality parameter for the likelihood-ratio test by its Taylor expansions. We develop efficient algorithms to compute the second and third moments of the identical-by-descent (IBD) sharing distribution and enable rapid computation of the Taylor expansions. Our algorithms take advantage of natural symmetries in pedigrees and can accurately analyze many large pedigrees in a few seconds. We verify the accuracy of our power calculation via simulation in pedigrees with 2–5 generations and 2–8 siblings per sibship. We apply this proposed analytical power calculation to 98 quantitative traits in a cohort study of 6,148 Sardinians in which the largest pedigree includes 625 phenotyped individuals. Simulations based on eight representative traits show that the difference between our analytical estimation of the expected LOD score and the average of simulated LOD scores is less than 0.05 (1.5%). Although our analytical calculations are for a fully informative marker locus, in the settings we examined power was similar to what could be attained with a single nucleotide polymorphism (SNP) mapping panel (with >1 SNP/cM). Our algorithms for power analysis, together with polygenic analysis, are implemented in a freely available computer program, POLY. Genet. Epidemiol. 2006. © 2006 Wiley-Liss, Inc. [source]

Estimating Aquifer Transmissivity from Specific Capacity Using MATLAB. GROUND WATER, Issue 4 2005. Stephen G. McLin. Historically, specific capacity information has been used to calculate aquifer transmissivity when pumping test data are unavailable. This paper presents a simple computer program written in the MATLAB programming language that estimates transmissivity from specific capacity data while correcting for aquifer partial penetration and well efficiency. The program graphically plots transmissivity as a function of these factors so that the user can visually estimate their relative importance in a particular application. The program is compatible with any computer operating system running MATLAB, including Windows, Macintosh OS, Linux, and Unix. Two simple examples illustrate program usage. [source]
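The usual specific-capacity route to transmissivity is an iterative solve of the Cooper-Jacob drawdown relation, since T appears on both sides of the equation. The sketch below shows that fixed-point iteration with hypothetical well parameters, without the partial-penetration and well-efficiency corrections that the MATLAB program adds.

```python
import numpy as np

# Cooper-Jacob: s = (Q / (4 pi T)) * ln(2.25 T t / (r^2 S))
# => T = (Q / (4 pi s)) * ln(2.25 T t / (r^2 S)), solved by fixed-point iteration.
Q = 0.01        # pumping rate (m^3/s), hypothetical
s = 2.0         # drawdown (m), so specific capacity Q/s = 0.005 m^2/s
t = 86400.0     # pumping duration (s)
r = 0.1         # effective well radius (m)
S = 1e-4        # storativity (dimensionless), assumed

T = Q / (4.0 * np.pi * s)                  # crude starting guess
for i in range(100):
    T_new = (Q / (4.0 * np.pi * s)) * np.log(2.25 * T * t / (r**2 * S))
    if abs(T_new - T) < 1e-12:
        T = T_new
        break
    T = T_new

print(f"converged in {i + 1} iterations: T ~ {T:.4e} m^2/s")
```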
Removal of gutta-percha from root canals using an Nd:YAG laser. INTERNATIONAL ENDODONTIC JOURNAL, Issue 10 2003. D. Vidu. Aim: To examine the use of an Nd:YAG laser in removing gutta-percha fillings from root canals when used in conjunction with eucalyptol, dimethylformamide (DMF) or no solvent. Methodology: Root-canal fillings (sealer and gutta-percha) were removed with laser irradiation at 20 Hz/1.5 W from 30 roots randomly divided into three groups. In group 1, the solvent was eucalyptol; in group 2, the solvent was DMF; and in group 3, no solvent was used. Laser irradiation was performed until the temperature measured on the root surface increased by 4 °C over room temperature. The treatment was deemed complete when the apical foramen was reached with the optical fibre and a reamer. The samples were split longitudinally, and the area of remaining gutta-percha on the root-canal walls was determined with the aid of a computer program. The total number of laser pulses needed to achieve length and the highest temperature recorded were determined for each tooth. The results were statistically analysed using Student's t-test (P < 0.05) for independent samples. Results: The average temperature increase in group 1 was 9.17 ± 0.56 °C; in group 2, 9.56 ± 0.28 °C; and in group 3, 8.29 ± 0.41 °C. The shortest time to achieve length was in group 3 (6.4 ± 0.49 min), followed by group 1 (6.7 ± 0.85 min) and group 2 (7.05 ± 0.79 min). The area of remaining gutta-percha was largest in group 2 (6.13 ± 5.76%) and smallest in group 3 (4.69 ± 4.03%), but the difference was not statistically significant. The number of pulses did not differ significantly between the groups. Conclusions: Use of an Nd:YAG laser alone is capable of softening gutta-percha. The addition of solvents did not improve the retreatment, either in terms of the time required for the procedure or in terms of the area of gutta-percha remaining on the root-canal walls. [source]

A new mixed finite element method for poro-elasticity. INTERNATIONAL JOURNAL FOR NUMERICAL AND ANALYTICAL METHODS IN GEOMECHANICS, Issue 6 2008. Maria Tchonkova. Development of robust numerical solutions for poro-elasticity is an important and timely issue in modern computational geomechanics. Recently, research in this area has seen a surge in activity, not only because of increased interest in coupled problems relevant to the petroleum industry, but also due to emerging applications of poro-elasticity for modelling problems in biomedical engineering and materials science. In this paper, an original mixed least-squares method for solving Biot consolidation problems is developed. The solution is obtained via minimization of a least-squares functional, based upon the equations of equilibrium, the equations of continuity and weak forms of the constitutive relationships for elasticity and Darcy flow. The formulation involves four separate categories of unknowns: displacements, stresses, fluid pressures and velocities. Each of these unknowns is approximated by linear continuous functions. The mathematical formulation is implemented in an original computer program, written from scratch and using object-oriented logic. The performance of the method is tested on one- and two-dimensional classical problems in poro-elasticity. The numerical experiments suggest the same rates of convergence for all four types of variables, when the same interpolation spaces are used. The continuous linear triangles show the same rates of convergence for both compressible and entirely incompressible elastic solids. This mixed formulation results in non-oscillating fluid pressures over the entire domain for different moments of time. The method appears to be naturally stable, without any need for additional stabilization terms with mesh-dependent parameters. Copyright © 2007 John Wiley & Sons, Ltd. [source]

Numerical solution for consolidation and desiccation of soft soils. INTERNATIONAL JOURNAL FOR NUMERICAL AND ANALYTICAL METHODS IN GEOMECHANICS, Issue 2 2002. Daniel T. C. Yao. The consolidation and desiccation behaviour of soft soils can be described by two time-dependent nonlinear partial differential equations using finite strain theory. Analytical solutions do not exist for these governing equations. In this paper, we develop efficient numerical methods and software for finding the numerical solutions. We introduce a semi-implicit time integration scheme, and show numerically that our method converges. In addition, the numerical solution matches well with the experimental result. A boundary refinement method is also developed to improve the convergence and stability for the case of Neumann-type boundary conditions. Interface governing equations are derived to maintain the continuity of consolidation and desiccation processes. This is useful because the soil column can undergo desiccation on top and consolidation at the bottom simultaneously. The numerical algorithms have been implemented in a computer program and the results have been verified with centrifuge test results conducted in our laboratory. Copyright © 2001 John Wiley & Sons, Ltd. [source]
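The semi-implicit time integration mentioned in the last abstract, freezing the nonlinear coefficient at the old time level so that each step reduces to a linear solve, can be shown on a generic one-dimensional nonlinear diffusion equation of consolidation type. The coefficient function and parameters below are placeholders, not the paper's finite strain constitutive relations.

```python
import numpy as np

# Generic nonlinear diffusion u_t = (D(u) u_z)_z on 0 < z < 1, u(0)=1, u(1)=0.
D = lambda u: 0.1 + u**2          # placeholder nonlinear coefficient
n, dt, steps = 51, 1e-3, 500
z = np.linspace(0.0, 1.0, n)
h = z[1] - z[0]
u = 1.0 - z                       # initial condition

for _ in range(steps):
    Dm = D(0.5 * (u[:-1] + u[1:]))        # face-centered D at the OLD time level
    A = np.zeros((n, n))
    b = u.copy()
    A[0, 0] = A[-1, -1] = 1.0
    b[0], b[-1] = 1.0, 0.0                # Dirichlet boundary conditions
    for i in range(1, n - 1):
        w, e = Dm[i - 1] * dt / h**2, Dm[i] * dt / h**2
        A[i, i - 1], A[i, i + 1] = -w, -e
        A[i, i] = 1.0 + w + e             # implicit in u^{n+1}, D frozen at u^n
    u = np.linalg.solve(A, b)

print("u at every tenth node:", np.round(u[::10], 4))
```

Each step solves only a tridiagonal linear system, which is what makes the scheme attractive compared with a fully implicit treatment of the nonlinearity.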