Programming


Kinds of Programming

  • compromise programming
  • computer programming
  • developmental programming
  • dynamic programming
  • fetal programming
  • foetal programming
  • genetic programming
  • integer linear programming
  • integer programming
  • linear programming
  • mathematical programming
  • mixed integer linear programming
  • mixed integer programming
  • non-linear programming
  • nonlinear programming
  • prenatal programming
  • quadratic programming
  • sequential quadratic programming
  • stochastic dynamic programming
  • stochastic programming

Terms modified by Programming

  • programming algorithm
  • programming algorithms
  • programming approach
  • programming effort
  • programming environment
  • programming formulation
  • programming formulations
  • programming interface
  • programming language
  • programming languages
  • programming method
  • programming methods
  • programming model
  • programming models
  • programming paradigm
  • programming problem
  • programming relaxation
  • programming solver
  • programming technique
  • programming techniques

Selected Abstracts


    DECISION SUPPORT FOR ALLOCATION OF WATERSHED POLLUTION LOAD USING GREY FUZZY MULTIOBJECTIVE PROGRAMMING

    JOURNAL OF THE AMERICAN WATER RESOURCES ASSOCIATION, Issue 3 2006
    Ho-Wen Chen
    ABSTRACT: This paper uses grey fuzzy multiobjective programming to aid decision making for the allocation of waste load in a river system under diverse uncertainties and risks. It differs from previous studies by considering a multicriteria objective function with combined grey and fuzzy messages under a cost-benefit analysis framework. The analysis technically integrates prior information on water quality models, water quality standards, wastewater treatment costs, and the potential benefits gained via in-stream water quality improvement. While fuzzy sets characterize semantic and cognitive vagueness in decision making, grey numbers can delineate measurement errors in data collection. By employing three distinct set-theoretic fuzzy operators, the synergy of grey and fuzzy implications can smoothly characterize the prescribed management complexity. With the aid of a genetic algorithm in the solution procedure, the modeling outputs contribute to the development of an effective waste load allocation and reduction scheme for tributaries in a subwatershed of the lower Tseng-Wen River Basin, South Taiwan. Research findings indicate that the inclusion of the three fuzzy set-theoretic operators in decision analysis can delineate different tradeoffs in decision making due to varying changes, transformations, and movements of waste load in association with land use patterns within the watershed. [source]
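
    The max-min aggregation at the heart of fuzzy multiobjective formulations like this one is compact enough to sketch. The snippet below is an illustrative toy, not the paper's model: the load-removal variables, cost and quality functions, and interval bounds are hypothetical, and a crude random search stands in for the genetic algorithm.

```python
import numpy as np

def membership(x, worst, best):
    """Linear fuzzy membership: 0 at the worst value, 1 at the best."""
    return float(np.clip((x - worst) / (best - worst), 0.0, 1.0))

def grey_membership(lo, hi, worst, best):
    """Grey (interval) input: evaluate both endpoints, keep the interval."""
    return tuple(sorted((membership(lo, worst, best),
                         membership(hi, worst, best))))

def cost(x):       # treatment cost to minimize (hypothetical)
    return 40 * x[0] ** 1.2 + 55 * x[1] ** 1.2

def quality(x):    # in-stream quality index to maximize (hypothetical)
    return 30 * x[0] + 45 * x[1]

def min_operator(x):
    """Zimmermann's min operator: satisfaction = worst membership."""
    mu_cost = membership(-cost(x), worst=-95.0, best=0.0)
    mu_qual = membership(quality(x), worst=0.0, best=75.0)
    return min(mu_cost, mu_qual)

rng = np.random.default_rng(0)            # random search stands in for the GA
best_x, best_s = None, -1.0
for _ in range(5000):
    x = rng.uniform(0.0, 1.0, size=2)     # load fraction removed at 2 outfalls
    s = min_operator(x)
    if s > best_s:
        best_x, best_s = x, s
print(best_x.round(3), round(best_s, 3))
print(grey_membership(-70, -60, worst=-95.0, best=0.0))  # cost known only as [60, 70]
```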


    GENETIC PROGRAMMING AND ITS APPLICATION IN REAL-TIME RUNOFF FORECASTING

    JOURNAL OF THE AMERICAN WATER RESOURCES ASSOCIATION, Issue 2 2001
    Soon Thiam Khu
    ABSTRACT: Genetic programming (GP), a relatively new evolutionary technique, is demonstrated in this study as a means of evolving code to solve problems. First, a simple example in the area of symbolic regression is considered. GP is then applied to real-time runoff forecasting for the Orgeval catchment in France. In this study, GP functions as an error-updating scheme to complement a rainfall-runoff model, MIKE11/NAM. Hourly runoff forecasts at different updating intervals are performed for forecast horizons of up to nine hours. The results show that the proposed updating scheme predicts the runoff quite accurately for all updating intervals considered, particularly for updating intervals not exceeding the time of concentration of the catchment. The results are also compared with those of an earlier study by the World Meteorological Organization, in which autoregression and Kalman filtering were used as the updating methods. The comparisons show that GP is a better updating tool for real-time flow forecasting. Another important finding of this study is that nondimensionalizing the variables significantly enhances the symbolic regression process. [source]
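
    The error-updating idea can be illustrated independently of GP: forecast with the base model, then add a correction predicted from the most recent forecast errors. In this sketch the data are synthetic, and an ordinary least-squares map over lagged errors stands in for the GP-evolved updating function.

```python
import numpy as np

rng = np.random.default_rng(1)
t_axis = np.linspace(0, 12, 200)
observed = 5 + np.sin(t_axis) + 0.1 * rng.standard_normal(200)  # synthetic flow
simulated = 5 + np.sin(t_axis + 0.3)                            # biased base model

def fit_error_updater(obs, sim, lags=3):
    """Least-squares map from the last `lags` errors to the next error;
    a placeholder for the GP-evolved updating function in the paper."""
    err = obs - sim
    X = np.column_stack([err[i:len(err) - lags + i] for i in range(lags)])
    y = err[lags:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

coef = fit_error_updater(observed[:150], simulated[:150])

t = 150                                    # one-step-ahead updated forecast
recent_err = (observed - simulated)[t - 3:t]
updated = simulated[t] + recent_err @ coef
print(f"raw={simulated[t]:.3f}  updated={updated:.3f}  obs={observed[t]:.3f}")
```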


    Detecting New Forms of Network Intrusion Using Genetic Programming

    COMPUTATIONAL INTELLIGENCE, Issue 3 2004
    Wei Lu
    Finding and detecting novel or unknown network attacks is one of the most important objectives of current intrusion detection systems. In this paper, a rule evolution approach based on Genetic Programming (GP) for detecting novel attacks on networks is presented; four genetic operators, namely reproduction, mutation, crossover, and dropping-condition, are used to evolve new rules, which are then used to detect novel or known network attacks. A training and testing dataset proposed by DARPA is used to evolve and evaluate these new rules. The proof-of-concept implementation shows that a rule generated by GP has a low false positive rate (FPR), a low false negative rate, and a high rate of detecting unknown attacks. Moreover, the rule base composed of new rules has a high detection rate with a low FPR. An alternative to the DARPA evaluation approach is also investigated. [source]
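
    A minimal sketch of the four operators over a toy rule encoding (a rule as a conjunction of feature/threshold conditions) follows; the features, thresholds, and encoding are hypothetical, not those of the paper.

```python
import random
random.seed(42)

# A rule is a list of (feature, op, threshold) conditions, all of which must hold.
FEATURES = ["duration", "src_bytes", "dst_bytes", "count"]

def random_condition():
    return (random.choice(FEATURES), random.choice(["<", ">"]), random.uniform(0, 1))

def random_rule(n=3):
    return [random_condition() for _ in range(n)]

# The four operators named in the paper, sketched over this toy encoding:
def reproduction(rule):
    return list(rule)                          # copy unchanged

def mutation(rule):
    r = list(rule)
    r[random.randrange(len(r))] = random_condition()
    return r

def crossover(a, b):
    cut_a, cut_b = random.randrange(1, len(a)), random.randrange(1, len(b))
    return a[:cut_a] + b[cut_b:]

def dropping_condition(rule):
    r = list(rule)
    if len(r) > 1:
        r.pop(random.randrange(len(r)))        # generalize by removing a condition
    return r

parent1, parent2 = random_rule(), random_rule()
child = dropping_condition(mutation(crossover(parent1, parent2)))
print(child)
```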


    Out-of-Core and Dynamic Programming for Data Distribution on a Volume Visualization Cluster

    COMPUTER GRAPHICS FORUM, Issue 1 2009
    S. Frank
    Abstract Ray-directed volume-rendering algorithms are well suited to parallel implementation in a distributed cluster environment. For distributed ray casting, the scene must be partitioned between nodes for good load balancing, and a strict view-dependent priority order is required for image composition. In this paper, we define the load balanced network distribution (LBND) problem and map it to the NP-complete precedence-constrained job-shop scheduling problem. We introduce a kd-tree solution and a dynamic programming solution. To process a massive data set, either a parallel or an out-of-core approach is required. Parallel preprocessing is performed by render nodes on data allocated using a static data structure. Volumetric data sets often contain a large portion of voxels that will never be rendered, or empty space, which parallel preprocessing fails to exploit. Our slab-projection slice, introduced in this paper, tracks empty space across consecutive slices of data to reduce the amount of data distributed and rendered, and is used to facilitate out-of-core bricking and kd-tree partitioning. Load balancing using each of our approaches is compared with traditional methods on several segmented regions of the Visible Korean data set. [source]
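
    The dynamic-programming flavor of the distribution problem can be illustrated with the classic contiguous-partition recurrence: split a sequence of per-slab costs among k render nodes so that the heaviest assignment is as light as possible. This is a stand-in for, not a reproduction of, the paper's LBND solution, and the costs are invented.

```python
from itertools import accumulate

def partition_min_max(load, k):
    """Split per-slab costs into k contiguous chunks minimizing the
    heaviest chunk; a textbook DP standing in for the paper's LBND setup."""
    n = len(load)
    prefix = [0] + list(accumulate(load))
    INF = float("inf")
    # dp[j][i]: best achievable max-chunk cost using j chunks for load[:i]
    dp = [[INF] * (n + 1) for _ in range(k + 1)]
    dp[0][0] = 0
    for j in range(1, k + 1):
        for i in range(1, n + 1):
            for split in range(j - 1, i):
                chunk = prefix[i] - prefix[split]
                dp[j][i] = min(dp[j][i], max(dp[j - 1][split], chunk))
    return dp[k][n]

# Hypothetical per-slab render costs distributed over 4 nodes:
print(partition_min_max([9, 2, 6, 3, 8, 5, 1, 7], 4))   # -> 13
```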


    Grids challenged by a Web 2.0 and multicore sandwich

    CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 3 2009
    Geoffrey Fox
    Abstract We discuss the application of Web 2.0 to support scientific research (e-Science) and related 'e-more or less anything' applications. Web 2.0 offers interesting technical approaches (protocols, message formats, and programming tools) for building core e-infrastructure (cyberinfrastructure), as well as many interesting services (Facebook, YouTube, Amazon S3/EC2, and Google Maps) that can add value to e-infrastructure projects. We discuss why some of the original Grid goals of linking the world's computer systems may not be so relevant today, and why interoperability is needed at the data level but not always at the infrastructure level. Web 2.0 may also support Parallel Programming 2.0, a better parallel computing software environment motivated by the need to run commodity applications on multicore chips. A 'Grid on the chip' will be a common use of future chips with tens or hundreds of cores. Copyright © 2008 John Wiley & Sons, Ltd. [source]


    Programming and coordinating Grid environments and applications

    CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 5 2004
    Cristina Ururahy
    Abstract The heterogeneous and dynamic nature of Grid environments places new demands on models and paradigms for parallel programming. In this work we discuss how ALua, a programming system based on a dual programming-language model, can help the programmer develop applications for this environment, monitoring the state of resources and controlling the application so that it adapts to changes in that state. Copyright © 2004 John Wiley & Sons, Ltd. [source]


    Incorporating multiple criteria into the design of conservation area networks: a minireview with recommendations

    DIVERSITY AND DISTRIBUTIONS, Issue 2 2006
    Alexander Moffett
    ABSTRACT We provide a review of multicriteria decision-making (MCDM) methods that may potentially be used during systematic conservation planning for the design of conservation area networks (CANs). We review 26 methods and present the core ideas of 19 of them. We suggest that the computation of the non-dominated set (NDS) be the first stage of any such analysis. This process requires only that alternatives be qualitatively ordered by each criterion. If the criteria can also be similarly ordered, at the next stage, Regime is the most appropriate method to refine the NDS. If the alternatives can also be given quantitative values by the criteria, Goal Programming will prove useful in many contexts. If both the alternatives and the criteria can be quantitatively evaluated, and the criteria are independent of each other but may be compounded, then multi-attribute value theory (MAVT) should be used (with preferences conveniently elicited by a modified Analytic Hierarchy Process (mAHP) provided that the number of criteria is not large). [source]
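
    The first stage recommended above, computing the non-dominated set, requires only pairwise comparisons and is easy to sketch; the candidate-site scores below are hypothetical.

```python
def non_dominated(alternatives):
    """Alternatives not dominated by any other; scores are higher-is-better."""
    def dominates(a, b):
        return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))
    return [a for a in alternatives
            if not any(dominates(b, a) for b in alternatives if b != a)]

# Hypothetical candidate sites scored on (species coverage, area, connectivity):
sites = [(3, 5, 2), (4, 4, 4), (2, 6, 1), (4, 4, 3), (1, 2, 1)]
print(non_dominated(sites))   # drops (4, 4, 3) and (1, 2, 1)
```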


    Virus-evolutionary linear genetic programming

    ELECTRONICS & COMMUNICATIONS IN JAPAN, Issue 1 2008
    Kenji Tamura
    Abstract Many kinds of evolutionary methods have been proposed; GA and GP in particular have demonstrated their effectiveness on various problems, and many systems have been built around them. Two such systems are the Virus-Evolutionary Genetic Algorithm (VE-GA) and Linear Genetic Programming in C (LGPC), and the performance of each has been reported. VE-GA is a coevolution system of host individuals and virus individuals that can spread schemata effectively among the host individuals through virus infection and virus incorporation. LGPC implements GP by representing individuals in one dimension, as in a GA; this reduces the cost of pointer handling, saves machine memory, and shortens the time needed to implement GP programs. We propose introducing virus individuals into LGPC and analyze the performance of the resulting system on two problems. Our system can spread schemata among the population and search for solutions effectively. The results of computer simulation show that, depending on the character of the problem, the system searches for solutions more effectively than LGPC alone. © 2008 Wiley Periodicals, Inc. Electron Comm Jpn, 91(1): 32–39, 2008; Published online in Wiley InterScience (www.interscience.wiley.com). DOI 10.1002/eej.10030 [source]


    Incorporating power system security into market-clearing of day-ahead joint energy and reserves auctions

    EUROPEAN TRANSACTIONS ON ELECTRICAL POWER, Issue 2 2010
    J. Aghaei
    Abstract This paper introduces a technique for incorporating system security into the clearing of day-ahead joint electricity markets, with particular emphasis on voltage stability. A Multiobjective Mathematical Programming (MMP) formulation is implemented for the provision of ancillary services (Automatic Generation Control (AGC), spinning, non-spinning, and operating reserves) as well as energy in simultaneous auctions under a pool-based aggregated market scheme. In the proposed market-clearing structure, the security problem, an important responsibility of the ISO, is addressed: a nonlinear model is formulated and used as extra objective functions of the optimization problem. Thus, in the MMP formulation of the market-clearing process, the objective functions (including augmented generation offer cost, an overload index, a voltage drop index, and the loading margin) are optimized while meeting AC power flow constraints, system reserve requirements, and lost opportunity cost (LOC) considerations. The IEEE 24-bus Reliability Test System (RTS 24-bus) is used to demonstrate the performance of the proposed method. Copyright © 2008 John Wiley & Sons, Ltd. [source]


    Incorporating Penalty Function to Reduce Spill in Stochastic Dynamic Programming Based Reservoir Operation of Hydropower Plants

    IEEJ TRANSACTIONS ON ELECTRICAL AND ELECTRONIC ENGINEERING, Issue 5 2010
    Deependra Kumar Jha
    Abstract This paper proposes a framework in which a penalty function is incorporated into a stochastic dynamic programming (SDP) model to derive the operation policy of a hydropower plant's reservoir, with the aim of reducing the amount of spill during reservoir operation. SDP models with various inflow process assumptions (independent and Markov-I) are developed and executed to derive reservoir operation policies for the case study of a storage-type hydropower plant located in Japan. The policy thus determined consists of target storage levels (end-of-period storage levels) for each combination of beginning-of-period storage level and inflow state of the current period. The penalty function is incorporated into a classical SDP model whose objective function maximizes annual energy generation through operation of the reservoir. Owing to the penalty function, the operation policy changes in a way that ensures reduced spill. Simulations are carried out to identify reservoir storage guide curves based on the derived operation policies. Reservoir storage guide curves for different values of the penalty coefficient are plotted for a study horizon of 64 years, and the corresponding average annual spill values are compared. It is observed that, as the penalty coefficient increases, the average annual spill decreases; however, the simulated average annual energy value is marginally reduced. The average annual energy generation can be weighed against the average annual spill reduction, and the optimal penalty coefficient can be identified from the cost functions associated with energy and spill. © 2010 Institute of Electrical Engineers of Japan. Published by John Wiley & Sons, Inc. [source]
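
    A toy backward recursion shows how a spill penalty enters a classical SDP objective. Everything here is illustrative: the storage and inflow discretizations, the head-independent energy proxy, and the penalty coefficient are made-up stand-ins for the paper's model, and only the independent-inflow case is sketched.

```python
import numpy as np

# Hypothetical discretization: 4 storage states, 3 inflow states, 12 periods.
S = np.linspace(0.0, 100.0, 4)           # storage levels
Q = np.array([10.0, 25.0, 40.0])         # inflow states
P_Q = np.array([0.3, 0.5, 0.2])          # independent inflow probabilities
TURBINE, ALPHA = 30.0, 0.5               # turbine capacity, penalty coefficient

def reward(s, q, s_next):
    """Energy minus spill penalty for one period (made-up physics)."""
    outflow = s + q - s_next             # water that must leave the reservoir
    if outflow < 0:                      # target storage not reachable
        return -np.inf
    release = min(outflow, TURBINE)      # water through the turbines
    spill = outflow - release            # the rest is spilled
    return 0.9 * release - ALPHA * spill # penalty term added to classic SDP

V = np.zeros(len(S))                     # terminal value function
policy = []
for t in range(12):                      # backward recursion over periods
    V_new = np.zeros(len(S))
    pol = np.zeros(len(S), dtype=int)
    for i, s in enumerate(S):
        best = -np.inf
        for j, s_next in enumerate(S):   # decision: end-of-period storage
            expected = sum(p * (reward(s, q, s_next) + V[j])
                           for q, p in zip(Q, P_Q))
            if expected > best:
                best, pol[i] = expected, j
        V_new[i] = best
    V, policy = V_new, [pol] + policy
print(policy[0])  # target storage index for each beginning-of-period storage
```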


    A practical approach for estimating illumination distribution from shadows using a single image

    INTERNATIONAL JOURNAL OF IMAGING SYSTEMS AND TECHNOLOGY, Issue 2 2005
    Taeone Kim
    Abstract This article presents a practical method that estimates illumination distribution from shadows using only a single image. The shadows are assumed to be cast on a textured, Lambertian surface by an object of known shape. Previous methods for illumination estimation from shadows usually require that the reflectance of the surface on which shadows are cast be constant or uniform, or need an additional image to cancel out the effects of the textured surface's varying albedo. Our method, in contrast, deals with an estimation problem for which surface albedo information is not available; in this case the problem is underdetermined. We show that the combination of regularization by correlation and some user-specified information can be a practical way of solving the underdetermined problem. In addition, as an optimization tool, we develop a constrained Non-Negative Quadratic Programming (NNQP) technique into which not only regularization but also multiple linear constraints induced by the user-specified information are easily incorporated. We test and validate our method on both synthetic and real images and present experimental results. © 2005 Wiley Periodicals, Inc. Int J Imaging Syst Technol, 15, 143–154, 2005; Published online in Wiley InterScience (www.interscience.wiley.com). DOI 10.1002/ima.20047 [source]
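
    A small stand-in for the NNQP step: minimize a regularized least-squares objective under non-negativity bounds. The visibility matrix and sparse light vector are synthetic, and a plain Tikhonov term replaces the paper's regularization by correlation and user-specified constraints.

```python
import numpy as np
from scipy.optimize import minimize

# Underdetermined sketch: recover nonnegative light intensities x from
# shadow observations b = A @ x, with fewer observations than unknowns.
rng = np.random.default_rng(3)
A = rng.uniform(0.0, 1.0, size=(20, 40))     # hypothetical visibility matrix
x_true = np.zeros(40)
x_true[[5, 17, 30]] = [1.0, 0.5, 0.8]        # three hidden light sources
b = A @ x_true

lam = 1e-2   # Tikhonov weight (stand-in for regularization by correlation)

def objective(x):
    r = A @ x - b
    return 0.5 * r @ r + 0.5 * lam * x @ x

def gradient(x):
    return A.T @ (A @ x - b) + lam * x

res = minimize(objective, np.zeros(40), jac=gradient,
               method="L-BFGS-B", bounds=[(0, None)] * 40)
big = res.x > 0.05
print(np.flatnonzero(big), res.x[big].round(2))  # should roughly find 5, 17, 30
```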


    Hyperbolic Penalty: A New Method for Nonlinear Programming with Inequalities

    INTERNATIONAL TRANSACTIONS IN OPERATIONAL RESEARCH, Issue 6 2001
    Adilson Elias Xavier
    This work presents and analyzes a new penalty method for solving the general nonlinear programming problem subject to inequality constraints. The proposed method has the important feature of being completely differentiable, and it combines features of both exterior and interior penalty methods. Numerical results for several problems are discussed. [source]
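
    One published form of the hyperbolic penalty for a constraint g(x) >= 0 is P(y) = -lambda*y + sqrt(lambda^2*y^2 + tau^2), evaluated at y = g(x): roughly linear (exterior-like) for violated constraints and decaying toward zero (interior-like) inside the feasible region, while remaining differentiable everywhere. A one-dimensional sketch, assuming this form and an invented parameter schedule:

```python
import numpy as np
from scipy.optimize import minimize

def hyperbolic_penalty(y, lam, tau):
    """Smooth penalty for a constraint g(x) >= 0, evaluated at y = g(x):
    ~ -2*lam*y (exterior-like) for y << 0, -> 0 (interior-like) for y >> 0."""
    return -lam * y + np.sqrt((lam * y) ** 2 + tau ** 2)

# Example: minimize (x - 2)^2 subject to x <= 1, i.e. g(x) = 1 - x >= 0.
def penalized(x, lam, tau):
    return (x[0] - 2.0) ** 2 + hyperbolic_penalty(1.0 - x[0], lam, tau)

x = np.array([0.0])
for lam, tau in [(1.0, 0.1), (10.0, 0.01), (100.0, 0.001)]:  # tightening schedule
    x = minimize(penalized, x, args=(lam, tau)).x
print(x)   # approaches the constrained optimum x = 1
```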


    Perspective: PTH/PTHrP Activity and the Programming of Skeletal Development In Utero

    JOURNAL OF BONE AND MINERAL RESEARCH, Issue 2 2004
    Jonathan H Tobias
    First page of article [source]


    Sequential Quadratic Programming for Development of a New Probiotic Dairy Tofu with Glucono-δ-Lactone

    JOURNAL OF FOOD SCIENCE, Issue 7 2004
    M.-J. Chen
    ABSTRACT: The purpose of this research was to evaluate the effects of various concentrations of glucono-δ-lactone (GDL) and skim milk powder, as well as the addition of prebiotics, on the rheology and probiotic viability of dairy tofu. Additionally, modern optimization techniques were applied to determine the optimal processing conditions and growth rate for the selected probiotics (Lactobacillus acidophilus, L. casei, Bifidobacterium bifidum, and B. longum). The research proceeded in two stages: the first derived surface models using response surface methodology (RSM); the second optimized the models using sequential quadratic programming (SQP) techniques. The approach proved effective. The most favorable production conditions for dairy tofu were 1% GDL, 0% peptides, 3% isomaltooligosaccharides (IMO), and 18% milk, as confirmed by subsequent verification experiments. Analysis of the sensory evaluation results revealed no significant difference between the probiotic dairy tofu and the GDL analog in terms of texture and appearance at the 0.05 level. The viable counts of probiotics remained well above the recommended limit of 10^6 CFU/g throughout the tested storage period. [source]
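
    The second-stage optimization can be reproduced in miniature with an off-the-shelf SQP-type solver. The response surface below is invented for illustration, since the paper's fitted RSM models are not given in the abstract, and SciPy's SLSQP routine plays the role of SQP.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical fitted response surface for probiotic viability (log CFU/g)
# in (GDL %, IMO %, milk %); all coefficients are made up for illustration.
def viability(z):
    gdl, imo, milk = z
    return (6.0 + 0.8 * gdl + 0.4 * imo + 0.15 * milk
            - 0.5 * gdl ** 2 - 0.06 * imo ** 2 - 0.004 * milk ** 2)

res = minimize(lambda z: -viability(z),            # SLSQP minimizes, so negate
               x0=np.array([1.0, 2.0, 15.0]),
               method="SLSQP",
               bounds=[(0.5, 1.5), (0.0, 4.0), (10.0, 20.0)],
               constraints=[{"type": "ineq",       # total solids cap (hypothetical)
                             "fun": lambda z: 22.0 - (z[0] + z[1] + z[2])}])
print(res.x.round(3), -res.fun)
```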


    Ideal Design Programming with Photoethnographic Data and Systems Analysis

    JOURNAL OF INTERIOR DESIGN, Issue 3 2006
    Barbara McFall, Ph.D.
    First page of article [source]


    Efficient optimization strategies with constraint programming

    AICHE JOURNAL, Issue 2 2010
    Prakash R. Kotecha
    Abstract In this article, we propose novel strategies for the efficient determination of multiple solutions for single-objective optimization problems, as well as globally optimal Pareto fronts for multiobjective optimization problems, using Constraint Programming (CP). In particular, we propose strategies to determine (i) all the multiple (globally) optimal solutions of a single-objective optimization problem, (ii) the K-best feasible solutions of a single-objective optimization problem, and (iii) globally optimal Pareto fronts (including nonconvex Pareto fronts) along with their multiple realizations for multiobjective optimization problems. We show that the proposed strategy for determining the K-best feasible solutions can be tuned, as the user requires, to determine either the K-best distinct or nondistinct solutions. Similarly, the strategy for determining globally optimal Pareto fronts can be modified, as the user requires, to determine either only the distinct set of Pareto points or the Pareto points together with all their multiple realizations. All the proposed techniques involve appropriately modifying the search and are computationally efficient in that they do not require successively re-solving the problem to obtain the required solutions. This work therefore convincingly addresses the issue of efficiently determining globally optimal Pareto fronts and, in addition, guarantees the determination of all the possible realizations associated with each Pareto point. Uncovering such solutions can greatly aid the designer in making informed decisions. The proposed approaches are demonstrated via two case studies, nonlinear combinatorial optimization problems taken from the area of sensor network design. © 2009 American Institute of Chemical Engineers AIChE J, 2010 [source]
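
    The three strategies can be mimicked on a toy problem by exhaustive enumeration: collect every global optimum, the K best solutions, and the best distinct objective values. The sensor-coverage instance below is hypothetical, and brute force stands in for the CP search.

```python
from itertools import combinations

# Toy sensor-network design: pick 3 of 6 candidate locations to maximize
# coverage. Exhaustive enumeration stands in for the paper's CP search.
covers = {0: {1, 2}, 1: {0, 3}, 2: {3, 4}, 3: {1, 5}, 4: {0, 5}, 5: {2, 4}}

def score(sel):
    covered = set(sel)
    for s in sel:
        covered |= covers[s]
    return len(covered)

ranked = sorted(((score(c), c) for c in combinations(range(6), 3)),
                key=lambda t: -t[0])

best = ranked[0][0]
all_optima = [c for s, c in ranked if s == best]  # (i) every global optimum
k_best = ranked[:5]                               # (ii) 5 best, non-distinct
distinct, seen = [], set()                        # 3 best distinct objective values
for s, c in ranked:
    if s not in seen:
        seen.add(s)
        distinct.append((s, c))
    if len(distinct) == 3:
        break
print(all_optima)
print(k_best)
print(distinct)
```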


    Decision-maker's preferences modelling within the goal-programming model: a new typology

    JOURNAL OF MULTI CRITERIA DECISION ANALYSIS, Issue 5-6 2009
    Belaid Aouni
    Abstract Several classifications of Multiple Objectives Programming (MOP) models have been proposed in the literature. In general, these classifications are based on the timing of introducing the decision-maker's (DM) preferences and on the type of information required about the parameters of the decision-making situation. The DM's preference information can take different forms, such as weights, priority levels, thresholds, or trade-offs among the objectives. Goal Programming (GP) is one of the best-known MOP models, and its different formulations deal differently with the DM's preferences. The aim of this paper is to propose a new typology of GP variants based on the way the DM's preferences are considered. Copyright © 2010 John Wiley & Sons, Ltd. [source]
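
    A weighted goal programme reduces to a linear programme once under- and over-achievement deviation variables are attached to each goal. A minimal sketch with two invented goals:

```python
from scipy.optimize import linprog

# Weighted goal programme on (x1, x2) >= 0 with two invented goals:
#   profit goal:  2*x1 +   x2 + n1 - p1 = 8   (penalize shortfall n1)
#   labour goal:    x1 + 3*x2 + n2 - p2 = 9   (penalize overuse  p2)
# Decision vector: [x1, x2, n1, p1, n2, p2].
c = [0, 0, 2, 0, 0, 1]                 # objective: 2*n1 + 1*p2
A_eq = [[2, 1, 1, -1, 0, 0],
        [1, 3, 0, 0, 1, -1]]
res = linprog(c, A_eq=A_eq, b_eq=[8, 9], bounds=[(0, None)] * 6)
x1, x2, n1, p1, n2, p2 = res.x
print(f"x = ({x1:.2f}, {x2:.2f}), shortfall = {n1:.2f}, overuse = {p2:.2f}")
```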


    Goal Programming: realistic targets for the near future

    JOURNAL OF MULTI CRITERIA DECISION ANALYSIS, Issue 3-4 2009
    Rafael Caballero
    Abstract Goal Programming (GP) can be regarded as one of the most widely used multicriteria decision-making techniques. In this paper, two surveys are carried out. The first traces the evolution of GP from its birth to the present in terms of numbers of publications, references, journals, and so on. The second, more in-depth survey covers publications from the year 2000 to the present. All the references are listed, and some conclusions and future research lines regarding the recent trends of GP are drawn. Copyright © 2010 John Wiley & Sons, Ltd. [source]


    INTEGRATING HUMANS IN ECOSYSTEM MANAGEMENT USING MULTI-CRITERIA DECISION MAKING

    JOURNAL OF THE AMERICAN WATER RESOURCES ASSOCIATION, Issue 2 2003
    Georgios E. Pavlikakis
    ABSTRACT: The Ecosystem Management (EM) process belongs to the category of Multi-Criteria Decision Making (MCDM) problems. It requires appropriate decision support systems (DSS) where "all interested people" would be involved in the decision making process. Environmental values critical to EM, such as the biological diversity, health, productivity and sustainability, have to be studied, and play an important role in modeling the ecosystem functions; human values and preferences also influence decision making. Public participation in decision and policy making is one of the elements that differentiate EM from the traditional methods of management. Here, a methodology is presented on how to quantify human preferences in EM decision making. The case study of the National Park of River Nestos Delta and Lakes Vistonida and Ismarida in Greece, presented as an application of this methodology, shows that the direct involvement of the public, the quantification of its preferences and the decision maker's attitude provide a strong tool to the EM decision making process. Public preferences have been given certain weights and three MCDM methods, namely, the Expected Utility Method, Compromise Programming and the Analytic Hierarchy Process, have been used to select alternative management solutions that lead to the best configuration of the ecosystem and are also socially acceptable. [source]
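
    Of the three MCDM methods named, Compromise Programming is the most compact to sketch: rank alternatives by their weighted L_p distance to the ideal point. The scores and weights below are hypothetical.

```python
import numpy as np

def compromise_rank(scores, weights, p=2):
    """Rank alternatives by weighted L_p distance to the ideal point.
    scores[i, j]: alternative i on criterion j, higher is better."""
    ideal, anti = scores.max(axis=0), scores.min(axis=0)
    norm = (ideal - scores) / (ideal - anti)    # 0 at the ideal, 1 at the worst
    d = ((weights * norm ** p).sum(axis=1)) ** (1.0 / p)
    return np.argsort(d), d

# Hypothetical management alternatives scored on (ecology, economy, social):
scores = np.array([[0.9, 0.3, 0.6],
                   [0.6, 0.7, 0.5],
                   [0.4, 0.9, 0.8]])
order, d = compromise_rank(scores, np.array([0.5, 0.3, 0.2]))
print(order, d.round(3))                        # best alternative first
```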


    RESERVOIR OPERATION AND EVALUATION OF DOWNSTREAM FLOW AUGMENTATION

    JOURNAL OF THE AMERICAN WATER RESOURCES ASSOCIATION, Issue 3 2001
    Mahesh Kumar Sahu
    ABSTRACT: Operation of a storage-based reservoir modifies the downstream flow, usually to a value higher than the natural flow in the dry season. This can be important for irrigation, water supply, or power production, since it provides an additional downstream benefit without any additional investment. This study addresses the operation of two proposed reservoirs and the downstream flow augmentation at an irrigation project located at the outlet of the Gandaki River basin in Nepal. The optimal operating policies of the reservoirs were determined using a Stochastic Dynamic Programming (SDP) model with the objective of maximizing power production. The modified flows downstream of the reservoirs were simulated with a simulation model using the optimal operating policy (for power maximization) and a synthetic long-term inflow series. Comparing the existing flow (flow in the river without reservoir operation) and the modified flow (flow after reservoir operation) at the irrigation project, the additional amount of flow was calculated. The reliability analysis indicated that the irrigation supply could be increased by 25 to 100 percent of the existing supply over the dry season (January to April) with a reliability of more than 80 percent. [source]


    Linear Programming and the von Neumann Model

    METROECONOMICA, Issue 1 2000
    Christian Bidard
    The formal similarity between von Neumann's theorem on maximal growth and (a weak form of) the fundamental theorem of linear programming is striking. The parallelism is explained by considering a simple economy for which the two problems are identical. [source]
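
    The parallel can be made concrete. With A the input matrix and B the output matrix (rows as processes), the maximal uniform growth factor is the largest g for which some intensity vector x >= 0 satisfies xB >= g(xA); each feasibility check is a linear programme, so g can be found by bisection. The two-process technology below is invented.

```python
import numpy as np
from scipy.optimize import linprog

# Invented two-process, two-good technology (rows = processes):
A = np.array([[1.0, 2.0],     # inputs consumed
              [2.0, 1.0]])
B = np.array([[2.0, 3.0],     # outputs produced
              [3.0, 2.0]])

def feasible(g):
    """Is there an intensity vector x >= 0, sum(x) = 1, with x@B >= g * x@A?"""
    M = (B - g * A).T                      # rows: goods, columns: processes
    res = linprog(np.zeros(2), A_ub=-M, b_ub=np.zeros(2),
                  A_eq=[[1.0, 1.0]], b_eq=[1.0], bounds=[(0, None)] * 2)
    return res.status == 0                 # status 0: a feasible optimum exists

lo, hi = 0.0, 10.0                         # bisection on the growth factor
for _ in range(60):
    mid = (lo + hi) / 2.0
    lo, hi = (mid, hi) if feasible(mid) else (lo, mid)
print(round(lo, 4))                        # ~1.6667 for this technology
```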


    QSAR Models for the Dermal Penetration of Polycyclic Aromatic Hydrocarbons Based on Gene Expression Programming

    MOLECULAR INFORMATICS, Issue 7 2008
    Tao Wang
    Abstract Gene Expression Programming (GEP) is a novel machine learning technique. Here, GEP is used to build a nonlinear quantitative structure-activity relationship model for predicting the Percent of Applied Dose Dermally Absorbed (PADA) over 24 h for polycyclic aromatic hydrocarbons. The model is based on descriptors calculated from the molecular structure. Three descriptors are selected from the descriptor pool by a Heuristic Method (HM) to build a multivariable linear model. The GEP method produced a nonlinear quantitative model with a correlation coefficient and a mean error of 0.92 and 4.70 for the training set, and 0.91 and 7.65 for the test set, respectively. The GEP predictions are shown to be in good agreement with the experimental results. [source]


    Programming: Nuts and Bolts

    NEW DIRECTIONS FOR STUDENT SERVICES, Issue 90 2000
    M. Celine Hartwig
    Program implementation is a task-oriented process. This chapter highlights details to consider for effective program delivery and offers suggestions for dealing with unexpected circumstances that may arise throughout the course of a program. [source]


    Use of an Intracardiac Electrogram Eliminates the Need for a Surface ECG during Implantable Cardioverter-Defibrillator Follow-Up

    PACING AND CLINICAL ELECTROPHYSIOLOGY, Issue 12 2007
    KEVIN A. MICHAEL M.B.Ch.B.
    Background: A surface electrocardiogram (SECG) for pacing threshold measurements during routine implantable cardioverter-defibrillator (ICD) follow-up can be cumbersome. This study evaluated the use of an intrathoracic far-field electrogram (EGM) derived between the Can and the superior vena cava (SVC) electrode, the leadless electrocardiogram (LLECG), for performing pacing threshold tests in dual chamber ICDs. Methods: The LLECG was evaluated prospectively during atrial and ventricular pacing threshold testing as a substudy of the Comparison of Empiric to Physician-Tailored Programming of Implantable Cardioverter-Defibrillators (EMPIRIC) trial, in which dual chamber ICDs were implanted in 888 patients. Threshold tests were conducted at 1 volt by decrementing the pulse width. Follow-up at three months compared pacing thresholds measured using the LLECG with those using lead I of the SECG. The time saving afforded by the LLECG was assessed by a questionnaire. Results: The median threshold differences between LLECG and SECG measurements for both atrial (0.00 ms, P = 0.90) and ventricular (0.00 ms, P = 0.34) threshold tests were not significant. Ninety percent of atrial and ventricular threshold differences were bounded by ±0.10 ms and −0.10 to +0.04 ms, respectively. Of the atrial and ventricular threshold tests attempted using the LLECG at six and 12 months, 99% were successfully completed. The questionnaire indicated that 65% of healthcare professionals found the LLECG to afford at least some time saving during device follow-ups. Conclusion: Routine follow-up can be performed reliably and expeditiously in dual chamber Medtronic (Minneapolis, MN, USA) ICDs using the LLECG alone, resulting in overall time saving. [source]


    Automatic Mode Switching of Implantable Pacemakers: II. Clinical Performance of Current Algorithms and Their Programming

    PACING AND CLINICAL ELECTROPHYSIOLOGY, Issue 7 2002
    C.-P. Lau, et al.
    While the hemodynamic and clinical significance of automatic mode switching (AMS) in patients with pacemakers has been demonstrated, the clinical behavior of AMS algorithms differs widely across manufacturers and pacemaker models. In general, a "rate-cutoff" detection method for atrial tachyarrhythmias provides rapid AMS onset and resynchronization to sinus rhythm at the termination of atrial tachyarrhythmias, but may cause intermittent oscillations between the atrial tracking and AMS modes. This can be minimized with a "counter" of the total number of high-rate events required before AMS occurs. The use of a "running average" algorithm results in more stable rate control during AMS by reducing the incidence of oscillations, but at the expense of delayed AMS onset and delayed resynchronization to sinus rhythm. Algorithms may be combined to fine-tune the AMS response and avoid rapid fluctuations in pacing rate. Appropriate programming of atrial sensitivity and the avoidance of ventriculoatrial cross-talk are essential for optimal AMS performance. [source]
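
    The onset and exit behaviors described above can be caricatured in a few lines. The parameters and the way the two detection styles are combined below are illustrative only, not any manufacturer's algorithm.

```python
def mode_switch_trace(intervals_ms, cutoff_ms=400, onset_count=4, beta=0.125):
    """Toy AMS detector combining two styles described above:
    a rate-cutoff counter for fast onset, a running average for stable exit.
    All parameters are illustrative, not any manufacturer's values."""
    counter, avg, switched = 0, float(intervals_ms[0]), False
    for iv in intervals_ms:
        counter = counter + 1 if iv < cutoff_ms else 0
        avg += beta * (iv - avg)       # exponentially weighted atrial interval
        if not switched and counter >= onset_count:
            switched = True            # AMS onset: stop tracking the atrium
        elif switched and avg > cutoff_ms:
            switched = False           # resynchronize to sinus rhythm
        yield iv, round(avg), switched

beats = [600] * 5 + [250] * 12 + [600] * 10   # sinus, tachycardia, sinus
for iv, avg, sw in mode_switch_trace(beats):
    print(iv, avg, "AMS" if sw else "track")
```

    In this trace the counter switches modes within a few short atrial intervals, while the smoothed average delays resynchronization for a few beats after the tachyarrhythmia ends, mirroring the tradeoff noted above.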


    Beyond Point and Level Systems: Moving Toward Child-Centered Programming

    AMERICAN JOURNAL OF ORTHOPSYCHIATRY, Issue 1 2009
    Wanda K. Mohr, PhD, FAAN
    Many residential treatment facilities and child inpatient units in the United States are structured around motivational programming such as point and/or level systems. On the surface, these appear to be a straightforward contingency-management tool based on social learning theory and operant principles. In this article, the authors argue that the assumptions on which point and level systems rest do not hold up to close empirical or theoretical scrutiny, and that point and level system programming is actually counterproductive with some children and at times can precipitate dangerous clinical situations, such as seclusion and restraint. The authors critique point and level system programming, assert that continuing such programming is antithetical to individualized, culturally, and developmentally appropriate treatment, and explore the resistance and barriers to changing traditional ways of "doing things." Finally, they describe a different approach to providing treatment, one based on collaborative problem solving, on which other successful models of treatment have also been built. [source]


    "You Can't Air That:" Four Cases of Controversy and Censorship in American Television Programming

    THE JOURNAL OF POPULAR CULTURE, Issue 4 2008
    Jay R. Clarkson
    No abstract is available for this article. [source]


    Improved controllability test for dependent siphons in S3PR based on elementary siphons

    ASIAN JOURNAL OF CONTROL, Issue 3 2010
    Daniel Y. Chao
    Abstract When siphons in a flexible manufacturing system (FMS) modeled by an ordinary Petri net (OPN) become unmarked, the net is deadlocked. To prevent deadlocks, control places and related arcs are added to strict minimal siphons (SMS) so that no siphon can be emptied. For large systems, it is infeasible to add a monitor to every SMS, since the number of SMS or control elements grows exponentially with the size of the Petri net. To tackle this problem, Li and Zhou propose adding control nodes and arcs for elementary siphons only. The remaining siphons, called dependent ones, may be controlled by adjusting the control depth variables of the elementary siphons associated with a dependent siphon after the failure of two tests. First, a Marking Linear Inequality (MLI) test is applied; if it fails, a Linear Integer Programming (LIP) test, an NP-hard problem, is performed. This implies that the MLI test is only sufficient, not necessary. We propose a sufficient and necessary test for adjusting control depth variables in an S3PR that avoids the sufficient-only, time-consuming LIP test (an NP-complete problem) previously required in some cases. We theoretically prove the following: i) no LIP test is needed for Type II siphons; and ii) Type I strongly n-dependent (n>2) siphons are always marked. As a result, the total time complexity of checking the controllability of all strongly dependent siphons is no longer exponential: it is linear if all siphons are of Type I, and of the order of the product of the total number of elementary siphons and the total number of dependent siphons if all siphons are of Type II. A well-known S3PR example illustrates the advantages. Copyright © 2010 John Wiley and Sons Asia Pte Ltd and Chinese Automatic Control Society [source]


    An Economic Analysis of the Returns to Canadian Swine Research: 1974–97

    CANADIAN JOURNAL OF AGRICULTURAL ECONOMICS, Issue 2 2001
    Greg Thomas
    This paper reports a new set of estimates of the returns to swine research in Canada, obtained using Agriculture and Agri-Food Canada's Canadian Regional Agricultural Model (CRAM). Positive Mathematical Programming is incorporated into the model for use in this study. The CRAM allows the effects of supply shifts from technological change in the hog industry to interact with product and factor market conditions in the rest of Canadian agriculture. Extensive sensitivity analysis is conducted to examine the robustness of the return estimates under variations in some of the key assumptions employed in the analysis. The costs of public and private sector swine research are estimated, with public sector research costs inclusive of the marginal excess burden of taxation. Overall, the estimated benefits from Canadian swine research are high relative to the estimated costs for the period considered. Previous estimates of the returns to Canadian swine research were obtained by Huot et al. (1989) with a partial equilibrium model that did not allow for intra-sectoral resource use adjustments. The estimated returns obtained in the present study are generally higher than those obtained by Huot et al. For example, direct application of the econometrically estimated supply function in this study gave an internal rate of return of about 124% and a benefit-cost ratio of 22.4 to 1, whereas Huot et al. reported comparable estimates of about 43% for the internal rate of return and 6 to 7 to 1 for the benefit-cost ratio. The differences in returns are not solely attributable to the use of a multi-market rather than a single-market partial equilibrium approach; there are also differences between the two studies in the estimates of the marginal excess burden of taxation. [source]
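
    The two headline statistics, the internal rate of return and the benefit-cost ratio, are straightforward to compute from a cash-flow series. The flows and discount rate below are invented and do not reproduce the study's numbers.

```python
def npv(rate, flows):
    """Net present value of yearly cash flows, year 0 first."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(flows))

def irr(flows, lo=0.0, hi=10.0, iters=100):
    """Internal rate of return by bisection (assumes one sign change)."""
    for _ in range(iters):
        mid = (lo + hi) / 2.0
        lo, hi = (mid, hi) if npv(mid, flows) > 0 else (lo, mid)
    return lo

# Invented research programme: costs up front, benefits later (millions).
flows = [-10, -10, -5, 8, 20, 30, 35, 30]
r = 0.05                                        # assumed discount rate
benefits = npv(r, [max(cf, 0) for cf in flows])
costs = -npv(r, [min(cf, 0) for cf in flows])
print(f"IRR = {irr(flows):.1%}, B/C = {benefits / costs:.1f} to 1")
```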


    Time Series Modeling of Two- and Three-Phase Flow Boiling Systems with Genetic Programming

    CHEMICAL ENGINEERING & TECHNOLOGY (CET), Issue 11 2007
    M.-Y. Liu
    Abstract The time series of the physical parameters in boiling evaporators with vapor-liquid (V-L) two-phase and vapor-liquid-solid (V-L-S) three-phase external natural circulating flows exhibit nonlinear features. Hence, proper system evolution models may be built from the point of view of nonlinear dynamics. In this work, genetic programming (GP) was utilized to find the nonlinear modeling functions necessary to develop global explicit two-variable iteration models, using wall temperature signals measured from the heated tube in ordinary two-phase and three-phase fluidized bed evaporators. The model predictions agree well with the experimental data of the time series, which means that the models established with GP can adequately describe the dynamic evolution behavior of multi-phase flow boiling systems. [source]
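
    A global explicit two-variable iteration model is a fitted map x[t] = f(x[t-1], x[t-2]). The sketch below fits such a map with a quadratic basis by least squares on synthetic data; the paper evolves f with genetic programming instead.

```python
import numpy as np

rng = np.random.default_rng(7)                 # synthetic "wall temperature" map
x = np.empty(300)
x[:2] = 0.4
for t in range(2, 300):
    x[t] = (3.6 * x[t-1] * (1 - x[t-1]) + 0.02 * x[t-2]
            + 0.003 * rng.standard_normal())

def basis(a, b):
    """Quadratic two-variable basis for x[t] = f(x[t-1], x[t-2])."""
    return np.column_stack([np.ones_like(a), a, b, a * a, a * b, b * b])

X = basis(x[1:-1], x[:-2])                     # predictors: x[t-1], x[t-2]
coef, *_ = np.linalg.lstsq(X, x[2:], rcond=None)
pred = X @ coef
rmse = np.sqrt(np.mean((pred - x[2:]) ** 2))
print("one-step RMSE:", rmse.round(4))
```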