Heuristic


Kinds of Heuristic

  • Lagrangian heuristic
  • search heuristic
  • simple heuristic

Terms modified by Heuristic

  • heuristic algorithm
  • heuristic algorithms
  • heuristic approach
  • heuristic argument
  • heuristic framework
  • heuristic method
  • heuristic methods
  • heuristic model
  • heuristic procedure
  • heuristic processing
  • heuristic purpose
  • heuristic rule
  • heuristic solution
  • heuristic value

Selected Abstracts


    OTOLARYNGOLOGIC HEURISTICS: A RHINOLOGIC PERSPECTIVE

    ANZ JOURNAL OF SURGERY, Issue 12 2008
    Erik Kent Weitzel
    Rhinological heuristics are adapted from common principles within the field of otolaryngology. The most important principle in achieving quality endoscopic sinus surgery is good haemostatic control of the surgical field. Once this is achieved, the surgeon can then begin advancing to other heuristic principles. Thinking one to two moves in advance allows the surgeon to take advantage of the many dually purposed instruments available. Learning to visualize buried structures by their subtle projections quickly follows. Finally, an ergonomically positioned surgeon with intricate anatomical knowledge of the sinonasal cavities permits a second surgeon to assist and greatly expand the limit of what is possible. [source]


    A performance-oriented adaptive scheduler for dependent tasks on grids

    CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 9 2008
    Luiz F. Bittencourt
    Abstract A scheduler must consider the heterogeneity and communication delays when scheduling dependent tasks on a grid. The task-scheduling problem is NP-Complete in general, which led us to the development of a heuristic for the associated optimization problem. In this work we present a dynamic adaptive approach to schedule dependent tasks onto a grid based on the Xavantes grid middleware. The developed dynamic approach is applied to the Path Clustering Heuristic, and introduces the concept of rounds, which take turns sending tasks to execution and evaluating the performance of the resources. The adaptive extension changes the size of rounds during the process execution, taking task attributes and resources performance as parameters, and it can be adopted in other task schedulers. The experiments show that the dynamic round-based and adaptive schedule can minimize the effects of performance losses while executing processes on the grid. Copyright © 2007 John Wiley & Sons, Ltd. [source]


    Coordinated Capacitated Lot-Sizing Problem with Dynamic Demand: A Lagrangian Heuristic

    DECISION SCIENCES, Issue 1 2004
    E. Powell Robinson Jr.
    ABSTRACT Coordinated replenishment problems are common in manufacturing and distribution when a family of items shares a common production line, supplier, or a mode of transportation. In these situations the coordination of shared, and often limited, resources across items is economically attractive. This paper describes a mixed-integer programming formulation and Lagrangian relaxation solution procedure for the single-family coordinated capacitated lot-sizing problem with dynamic demand. The problem extends both the multi-item capacitated dynamic demand lot-sizing problem and the uncapacitated coordinated dynamic demand lot-sizing problem. We provide the results of computational experiments investigating the mathematical properties of the formulation and the performance of the Lagrangian procedures. The results indicate the superiority of the dual-based heuristic over linear programming-based approaches to the problem. The quality of the Lagrangian heuristic solution improved in most instances with increases in problem size. Heuristic solutions averaged 2.52% above optimal. The procedures were applied to an industry test problem yielding a 22.5% reduction in total costs. [source]
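
As a hedged illustration of the Lagrangian relaxation machinery behind dual-based heuristics like the one above, the sketch below dualizes the capacity constraint of a toy 0/1 knapsack and tightens an upper bound by subgradient updates on the multiplier. The function name and toy data are illustrative assumptions; this is not the authors' lot-sizing formulation.

```python
def lagrangian_knapsack_bound(profits, weights, capacity, steps=100, step0=1.0):
    """Upper-bound a 0/1 knapsack via Lagrangian relaxation of the
    capacity constraint, tightened by subgradient updates on the
    multiplier lam (a generic illustration, not the paper's model)."""
    lam, best = 0.0, float("inf")
    for t in range(1, steps + 1):
        # The relaxed problem decomposes per item: take i iff its
        # profit beats the Lagrangian penalty lam * weight.
        x = [1 if p - lam * w > 0 else 0 for p, w in zip(profits, weights)]
        bound = sum(p * xi for p, xi in zip(profits, x)) + lam * (
            capacity - sum(w * xi for w, xi in zip(weights, x)))
        best = min(best, bound)  # keep the tightest dual bound seen
        # Subgradient = capacity violation; raise lam when overloaded.
        g = sum(w * xi for w, xi in zip(weights, x)) - capacity
        lam = max(0.0, lam + (step0 / t) * g)
    return best
```

By weak duality every iterate is a valid upper bound on the integer optimum, so the minimum over iterations can only tighten it.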


    Behavioral Adaptation, Confidence, and Heuristic-Based Explanations of the Probing Effect

    HUMAN COMMUNICATION RESEARCH, Issue 4 2001
    Timothy R. Levine
    Researchers have found that asking probing questions of message sources does not enhance deception detection accuracy. Probing does, however, increase recipient and observer perceptions of source honesty, a finding we label the probing effect. This project examined 3 potential explanations for the probing effect: behavioral adaptation, confidence bias, and a probing heuristic. In Study 1, respondents (N = 337) viewed videotaped interviews in which probes were present or not present, and in which message source behaviors were controlled. Inconsistent with the behavioral adaptation explanation, respondents perceived probed sources as more honest than nonprobed sources, despite the fact that source behaviors were constant across conditions. The data also were inconsistent with the confidence bias explanation. Studies 2 and 3 investigated the probing heuristic explanation. The data from Study 2 (N = 136) were ambiguous, but the results of the third study (N = 143) were consistent with the heuristic processing explanation of the probing effect. [source]


    Heuristic and simulated annealing algorithms for solving extended cell assignment problem in wireless ATM networks

    INTERNATIONAL JOURNAL OF COMMUNICATION SYSTEMS, Issue 1 2002
    Der-Rong Din
    Abstract In this paper, we investigate the extended cell assignment problem, which optimally assigns newly added and split cells in a Personal Communication Service (PCS) network to switches in a wireless Asynchronous Transfer Mode (ATM) network. Given cells in a PCS network and switches on an ATM network (whose locations are fixed and known), we seek an assignment that minimizes a cost criterion. The cost has two components: the cost of handoffs that involve two switches and the cost of cabling. This problem is modeled as a complex integer programming problem, and finding an optimal solution to it is NP-hard. A heuristic algorithm and a simulated annealing algorithm are proposed to solve the problem. The heuristic algorithm, the Extended Assignment Algorithm (EAA), consists of two phases: an initial assigning phase, which finds initial assignments of cells to switches, and a cell exchanging phase, in which pairs of cells on different switches are repeatedly exchanged to greatly reduce the total cost. The simulated annealing algorithm, ESA (Enhanced Simulated Annealing), generates constraint-satisfying configurations and uses three configuration perturbation schemes to change the current configuration into a new one. Experimental results indicate that both the EAA and ESA algorithms perform well. Copyright © 2002 John Wiley & Sons, Ltd. [source]
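
A minimal simulated-annealing skeleton for a cell-to-switch assignment conveys the accept/reject mechanics that ESA-style algorithms build on. The single-cell move, geometric cooling schedule, and all names here are illustrative assumptions, not the paper's three perturbation schemes or its constraint handling.

```python
import math
import random

def anneal_assignment(n_cells, n_switches, cost, t0=10.0, cooling=0.95, iters=2000):
    """Generic simulated-annealing sketch for assigning cells to switches.
    `cost` maps an assignment list (cell index -> switch index) to a number."""
    rng = random.Random(0)
    assign = [rng.randrange(n_switches) for _ in range(n_cells)]
    cur_cost = cost(assign)
    best, best_cost, t = list(assign), cur_cost, t0
    for _ in range(iters):
        cell = rng.randrange(n_cells)
        old = assign[cell]
        assign[cell] = rng.randrange(n_switches)   # perturb one cell
        delta = cost(assign) - cur_cost
        # Accept improvements always; accept worsenings with
        # Boltzmann probability exp(-delta / t).
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            cur_cost += delta
            if cur_cost < best_cost:
                best, best_cost = list(assign), cur_cost
        else:
            assign[cell] = old                     # reject: undo the move
        t *= cooling                               # geometric cooling
    return best, best_cost
```

A real cell-assignment cost would combine inter-switch handoff and cabling terms; any function of the assignment vector slots in unchanged.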


    Theory, Stylized Heuristic or Self-Fulfilling Prophecy?

    PUBLIC ADMINISTRATION, Issue 1 2004
    The Status of Rational Choice Theory in Public Administration
    Rational choice is intimately associated with positivism and naturalism, its appeal to scholars of public administration lying in its ability to offer a predictive science of politics that is parsimonious in its analytical assumptions, rigorous in its deductive reasoning and overarching in its apparent applicability. In this paper I re-examine the ontology and epistemology which underpins this distinctive approach to public administration, challenging the necessity of the generally unquestioned association between rational choice and both positivism and naturalism. Rational choice, I contend, can only defend its claim to offer a predictive science of politics on the basis of an ingenious, paradoxical, and seldom acknowledged structuralism and a series of analytical assumptions incapable of capturing the complexity and contingency of political systems. I argue that analytical parsimony, though itself a condition of naturalism, is in fact incompatible with the deduction of genuinely explanatory/causal inferences. This suggests that the status of rational choice as an explanatory/predictive theory needs to be reassessed. Yet this is no reason to reject rational choice out of hand. For, deployed not as a theory in its own right, but as a heuristic analytical strategy for exploring hypothetical scenarios, it is a potent and powerful resource in post-positivist public administration. [source]


    Gutsy Heuristics: the Art of Ignoring Information

    ETHOLOGY, Issue 4 2009
    Article first published online: 17 MAR 200
    No abstract is available for this article. [source]


    Fuzzy based fast dynamic programming solution of unit commitment with ramp constraints

    EXPERT SYSTEMS, Issue 4 2009
    S. Patra
    Abstract: A fast dynamic programming technique based on a fuzzy unit selection procedure is proposed in this paper for solving the unit commitment problem with ramp constraints. The curse of dimensionality of the dynamic programming technique is eliminated by minimizing the number of prospective solution paths stored at each stage of the search procedure. Heuristics such as priority ordering of the units, unit grouping, fast economic dispatch based on priority ordering, and avoidance of repeated economic dispatch through memory action are employed to make the algorithm fast. The proposed method produced results comparable to those of the best-performing methods in the literature. [source]
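
The priority-ordering idea listed among the heuristics can be sketched simply: commit units in order of increasing full-load average cost until committed capacity covers demand. The unit data and function name are hypothetical; the paper's fuzzy selection and dynamic programming stages are not shown.

```python
def priority_commitment(units, demand):
    """Priority-list sketch: `units` is a list of
    (name, capacity_mw, full_load_cost_per_mwh) tuples."""
    order = sorted(units, key=lambda u: u[2])  # cheapest energy first
    committed, cap = [], 0.0
    for name, capacity, _ in order:
        if cap >= demand:
            break
        committed.append(name)
        cap += capacity
    if cap < demand:
        raise ValueError("insufficient total capacity for demand")
    return committed
```

A full unit-commitment solver would layer ramp limits, minimum up/down times, and economic dispatch on top of this ordering.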


    The Limitations of Heuristics for Political Elites

    POLITICAL PSYCHOLOGY, Issue 6 2009
    Kristina C. Miler
    Despite the extensive literature on citizens' use of cognitive heuristics in political settings, far less is known about how political elites use these shortcuts. Legislative elites benefit from the efficiency of the accessibility heuristic, but their judgments can also be flawed if accessible information is incomplete or unrepresentative. Using personal interviews and a quasi-experimental design, this paper examines the use of the accessibility heuristic by professional legislative staff when assessing the importance of natural resources issues to their constituents. Staff members recall only a small subset of the relevant constituents in the district, and this subset is biased in favor of active and resource-rich constituents over other, equally relevant constituents. This paper provides a new application of cognitive psychology to political elites and addresses important normative questions about the importance of information processing for political representation. By drawing on the psychology literature on heuristics, this paper identifies the cognitive mechanisms of congressional representation and provides new evidence of old biases. [source]


    Opening the Black Box: Beyond Frameworks and Heuristics in Administrative Practice

    PUBLIC ADMINISTRATION REVIEW, Issue 1 2010
    Kalu N. Kalu
    First page of article [source]


    Tracing Foreign Policy Decisions: A Study of Citizens' Use of Heuristics

    BRITISH JOURNAL OF POLITICS & INTERNATIONAL RELATIONS, Issue 4 2009
    Robert Johns
    Public opinion researchers agree that citizens use simplifying heuristics to reach real, stable preferences. In domestic policy, the focus has been on citizens delegating judgement to opinion leaders, notably political parties. By contrast, citizens have been held to deduce foreign policy opinions from their own values or principles. Yet there is ample scope for delegation in the foreign policy sphere. In this exploratory study I use a 'process-tracing' method to test directly for delegation heuristic processing in university students' judgements on the Iranian nuclear issue. A substantial minority sought guidance on foreign policy decisions, whether from parties, international actors, or newspapers. This was not always simple delegation; some used such heuristics within more complex decision-making processes. However, others relied on simple delegation, raising questions about the 'effectiveness' of their processing. [source]


    Adaptive Non-Interventional Heuristics for Covariation Detection in Causal Induction: Model Comparison and Rational Analysis

    COGNITIVE SCIENCE - A MULTIDISCIPLINARY JOURNAL, Issue 5 2007
    Masasi Hattori
    Abstract In this article, 41 models of covariation detection from 2 × 2 contingency tables were evaluated against past data in the literature and against data from new experiments. A new model was also included based on a limiting case of the normative phi-coefficient under an extreme rarity assumption, which has been shown to be an important factor in covariation detection (McKenzie & Mikkelsen, 2007) and data selection (Hattori, 2002; Oaksford & Chater, 1994, 2003). The results were supportive of the new model. To investigate its explanatory adequacy, a rational analysis using two computer simulations was conducted. These simulations revealed the environmental conditions and the memory restrictions under which the new model best approximates the normative model of covariation detection in these tasks. They thus demonstrated the adaptive rationality of the new model. [source]
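
For reference, the normative phi coefficient for a 2 x 2 contingency table [[a, b], [c, d]] is (ad - bc) / sqrt((a+b)(c+d)(a+c)(b+d)); the article's new model is a limiting case of this statistic under an extreme-rarity assumption. A direct transcription of the standard formula (the function name is ours):

```python
from math import sqrt

def phi_coefficient(a, b, c, d):
    """Phi coefficient for the 2x2 table [[a, b], [c, d]], where a counts
    cause-present/effect-present cases and d cause-absent/effect-absent.
    Returns 0.0 for a degenerate (all-zero margin) table."""
    denom = sqrt((a + b) * (c + d) * (a + c) * (b + d))
    return (a * d - b * c) / denom if denom else 0.0
```

Values range from -1 (perfect negative contingency) through 0 (independence) to +1 (perfect positive contingency).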


    Interactive Global Photon Mapping

    COMPUTER GRAPHICS FORUM, Issue 4 2009
    B. Fabianowski
    Abstract We present a photon mapping technique capable of computing high quality global illumination at interactive frame rates. By extending the concept of photon differentials to efficiently handle diffuse reflections, we generate footprints at all photon hit points. These enable illumination reconstruction by density estimation with variable kernel bandwidths without having to locate the k nearest photon hits first. Adapting an efficient BVH construction process for ray tracing acceleration, we build photon maps that enable the fast retrieval of all hits relevant to a shading point. We present a heuristic that automatically tunes the BVH build's termination criterion to the scene and illumination conditions. As all stages of the algorithm are highly parallelizable, we demonstrate an implementation using NVidia's CUDA manycore architecture running at interactive rates on a single GPU. Both light source and camera may be freely moved with global illumination fully recalculated in each frame. [source]


    Fast BVH Construction on GPUs

    COMPUTER GRAPHICS FORUM, Issue 2 2009
    C. Lauterbach
    We present two novel parallel algorithms for rapidly constructing bounding volume hierarchies on manycore GPUs. The first uses a linear ordering derived from spatial Morton codes to build hierarchies extremely quickly and with high parallel scalability. The second is a top-down approach that uses the surface area heuristic (SAH) to build hierarchies optimized for fast ray tracing. Both algorithms are combined into a hybrid algorithm that removes existing bottlenecks in GPU construction performance and scalability, leading to significantly decreased build times. The resulting hierarchies are close in quality to optimized SAH hierarchies, but the construction process is substantially faster, leading to a significant net benefit when both construction and traversal cost are accounted for. Our preliminary results show that current GPU architectures can compete with CPU implementations of hierarchy construction running on multicore systems. In practice, we can construct hierarchies of models with up to several million triangles and use them for fast ray tracing or other applications. [source]
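
The Morton-code ordering behind the first algorithm interleaves the bits of quantized x, y, z coordinates so that spatially nearby points receive numerically nearby codes, which a parallel sort can then exploit. Below is the common 30-bit construction (10 bits per axis) using the standard magic-constant bit-spreading trick; the paper's exact variant may differ.

```python
def expand_bits(v):
    """Spread the low 10 bits of v so consecutive bits land 3 positions
    apart (the standard trick for 30-bit 3D Morton codes)."""
    v = (v * 0x00010001) & 0xFF0000FF
    v = (v * 0x00000101) & 0x0F00F00F
    v = (v * 0x00000011) & 0xC30C30C3
    v = (v * 0x00000005) & 0x49249249
    return v

def morton3d(x, y, z):
    """Morton code for a point with coordinates in [0, 1): quantize each
    axis to 10 bits, then interleave the bits as ...xyzxyz."""
    xi = min(max(int(x * 1024), 0), 1023)
    yi = min(max(int(y * 1024), 0), 1023)
    zi = min(max(int(z * 1024), 0), 1023)
    return (expand_bits(xi) << 2) | (expand_bits(yi) << 1) | expand_bits(zi)
```

Sorting primitives by `morton3d` of their centroid yields the linear ordering from which the hierarchy can be emitted level by level.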


    Locating a Surveillance Infrastructure in and Near Ports or on Other Planar Surfaces to Monitor Flows

    COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, Issue 2 2010
    Pitu B. Mirchandani
    This article addresses the problem of locating surveillance radars to cover a given target surface that may have barriers through which radar signals cannot penetrate. The area of coverage of a radar is assumed to be a disc, or a partial disc when there are barriers, with a known radius. The article shows that the corresponding location problems relate to two well-studied problems: the set-covering model and the maximal covering problem. In the first problem, the minimum number of radars is to be located to completely cover the target area; in the second, a given number M of radars are to be located to cover as much of the target area as possible. Based on a discrete representation of the target area, a Lagrangian heuristic and a two-stage procedure with a conquer-and-divide scaling are developed to solve the above two models. The reported computational experience demonstrates that the developed method effectively solves the radar location problems formulated here. [source]
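
The set-covering model above admits a textbook greedy heuristic: repeatedly choose the candidate radar site that covers the most still-uncovered grid points of the discretized target area. This is offered only as a simple baseline for the same formulation; the article itself develops a Lagrangian heuristic and a two-stage procedure, which are not shown here.

```python
def greedy_set_cover(universe, candidate_covers):
    """Greedy heuristic for set covering. `universe` is the set of grid
    points to cover; `candidate_covers` maps each candidate site to the
    set of points its (possibly barrier-clipped) disc covers."""
    uncovered = set(universe)
    chosen = []
    while uncovered:
        # Pick the site covering the most still-uncovered points.
        site, covers = max(candidate_covers.items(),
                           key=lambda kv: len(uncovered & kv[1]))
        if not uncovered & covers:
            raise ValueError("remaining points cannot be covered")
        chosen.append(site)
        uncovered -= covers
    return chosen
```

In the radar setting, each candidate's cover set would be precomputed from the disc radius and line-of-sight tests against the barriers.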


    Social Infrastructure Planning: A Location Model and Solution Methods

    COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, Issue 8 2007
    João F. Bigotte
    Authorities want to determine where to locate the facilities of a social infrastructure network and what should be the capacity of these facilities. Each user must be assigned to its closest facility and, to be economically viable, each facility must serve at least a pre-specified level of demand. The objective is to maximize the accessibility to facilities (i.e., to minimize the distance traveled by users to reach the facilities). A location model that captures the above features is formulated and different solution methods are tested. Among the methods tested, tabu search and a specialized local search heuristic provided the best solutions. The application of the model is illustrated through a case study involving the location of preschools in the municipality of Miranda do Corvo, Portugal. [source]


    Tunneling enhanced by web page content block partition for focused crawling

    CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 1 2008
    Tao Peng
    Abstract The complexity of web information environments and multiple-topic web pages are negative factors that significantly affect the performance of focused crawling. A highly relevant region in a web page may be obscured by the low overall relevance of that page; segmenting web pages into smaller units can therefore significantly improve performance. Traversing irrelevant pages to reach relevant ones (tunneling) can also improve the effectiveness of focused crawling by expanding its reach. This paper presents a heuristic-based method to enhance focused crawling performance. The method uses a Document Object Model (DOM)-based page partition algorithm to segment a web page into content blocks with a hierarchical structure and investigates how to take advantage of block-level evidence to enhance focused crawling by tunneling. Page segmentation can transform an uninteresting multi-topic web page into several single-topic content blocks, some of which may be interesting. Accordingly, the focused crawler can pursue the interesting content blocks to retrieve the relevant pages. Experimental results indicate that this approach outperforms the Breadth-First, Best-First, and Link-context algorithms in harvest rate, target recall, and target length. Copyright © 2007 John Wiley & Sons, Ltd. [source]


    Optimal integrated code generation for VLIW architectures

    CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 11 2006
    Christoph Kessler
    Abstract We present a dynamic programming method for optimal integrated code generation for basic blocks that minimizes execution time. It can be applied to single-issue pipelined processors, in-order-issue superscalar processors, VLIW architectures with a single homogeneous register set, and clustered VLIW architectures with multiple register sets. For the case of a single register set, our method simultaneously copes with instruction selection, instruction scheduling, and register allocation. For clustered VLIW architectures, we also integrate the optimal partitioning of instructions, allocation of registers for temporary variables, and scheduling of data transfer operations between clusters. Our method is implemented in the prototype of a retargetable code generation framework for digital signal processors (DSPs), called OPTIMIST. We present results for the processors ARM9E, TI C62x, and a single-cluster variant of C62x. Our results show that the method can produce optimal solutions for small and (in the case of a single register set) medium-sized problem instances with a reasonable amount of time and space. For larger problem instances, our method can be seamlessly changed into a heuristic. Copyright © 2006 John Wiley & Sons, Ltd. [source]


    Confronting Uncertainty and Missing Values in Environmental Value Transfer as Applied to Species Conservation

    CONSERVATION BIOLOGY, Issue 5 2010
    SONIA AKTER
    Abstract: The nonuse (or passive) value of nature is important but time-consuming and costly to quantify with direct surveys. In the absence of estimates of these values, there will likely be less investment in conservation actions that generate substantial nonuse benefits, such as conservation of native species. To help inform decisions about the allocation of conservation dollars when direct estimates of nonuse values are lacking, these values can be estimated indirectly by environmental value transfer (EVT). EVT uses existing data or information from a study site such that the estimated monetary value of an environmental good is transferred to another location or policy site. A major challenge in the use of EVT is the uncertainty about the sign and size of the transfer error (i.e., the percentage by which the transferred value exceeds the actual value) that results from transferring direct estimates of nonuse values from a study site to a policy site, the site where the value is transferred. An EVT is most useful if the decision-making framework does not require highly accurate information and when the conservation decision is constrained by time and financial resources. To account for uncertainty in the decision-making process, a decision heuristic, which guides the decision process and illustrates the possible decision branches, can be followed. To account for the uncertainty associated with the transfer of values from one site to another, we developed a risk and simulation approach that uses Monte Carlo simulations to evaluate the net benefits of conservation investments and takes into account different possible distributions of transfer error. This method does not reduce transfer error, but it provides a way to account for the effect of transfer error in conservation decision making. Our risk and simulation approach and decision-based framework on when to use EVT offer better-informed decision making in conservation. [source]
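
A minimal sketch of the Monte Carlo idea described above: sample a transfer error, deflate the transferred value to a hypothetical actual value, and summarize the net benefit over many draws. The error distribution, the percentage convention, and all names are assumptions for illustration, not the authors' parameterization.

```python
import random

def simulate_net_benefit(transferred_value, cost, error_dist, n=10000, seed=1):
    """Monte Carlo check of a conservation investment under value-transfer
    error. `error_dist(rng)` returns a sampled transfer error in percent
    (the amount by which the transferred value overstates the actual one).
    Returns (mean net benefit, probability the net benefit is positive)."""
    rng = random.Random(seed)
    net = []
    for _ in range(n):
        err = error_dist(rng)
        actual = transferred_value / (1 + err / 100.0)  # deflate estimate
        net.append(actual - cost)
    prob_positive = sum(1 for b in net if b > 0) / n
    return sum(net) / n, prob_positive
```

A plausible usage would pass, say, `lambda rng: rng.uniform(-20, 80)` to spread the transfer error over an assumed range and read off how robust the investment decision is.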


    Motivations for the Restoration of Ecosystems

    CONSERVATION BIOLOGY, Issue 2 2006
    ANDRE F. CLEWELL
    Abstract: The reasons ecosystems should be restored are numerous, disparate, generally understated, and commonly underappreciated. We offer a typology in which these reasons, or motivations, are ordered among five rationales: technocratic, biotic, heuristic, idealistic, and pragmatic. The technocratic rationale encompasses restoration that is conducted by government agencies or other large organizations to satisfy specific institutional missions and mandates. The biotic rationale for restoration is to recover lost aspects of local biodiversity. The heuristic rationale attempts to elicit or demonstrate ecological principles and biotic expressions. The idealistic rationale consists of personal and cultural expressions of concern or atonement for environmental degradation, reengagement with nature, and/or spiritual fulfillment. The pragmatic rationale seeks to recover or repair ecosystems for their capacity to provide a broad array of natural services and products upon which human economies depend and to counteract extremes in climate caused by ecosystem loss. We propose that technocratic restoration, as currently conceived and practiced, is too narrow in scope and should be broadened to include the pragmatic rationale, whose overarching importance is just beginning to be recognized. We suggest that technocratic restoration is too authoritarian, that idealistic restoration is overly restricted by lack of administrative strengths, and that a melding of the two approaches would benefit both. Three recent examples are given of restoration that blends the technocratic, idealistic, and pragmatic rationales and demonstrates the potential for a more unified approach. The biotic and heuristic rationales can be satisfied within the contexts of the other rationales. [source]


    Quantitative Comparison of Approximate Solution Sets for Bi-criteria Optimization Problems

    DECISION SCIENCES, Issue 1 2003
    W. Matthew Carlyle
    ABSTRACT We present the Integrated Preference Functional (IPF) for comparing the quality of proposed sets of near-Pareto-optimal solutions to bi-criteria optimization problems. Evaluating the quality of such solution sets is one of the key issues in developing and comparing heuristics for multiple objective combinatorial optimization problems. The IPF is a set functional that, given a weight density function provided by a decision maker and a discrete set of solutions for a particular problem, assigns a numerical value to that solution set. This value can be used to compare the quality of different sets of solutions, and therefore provides a robust, quantitative approach for comparing different heuristic, a posteriori solution procedures for difficult multiple objective optimization problems. We provide specific examples of decision maker preference functions and illustrate the calculation of the resulting IPF for specific solution sets and a simple family of combined objectives. [source]


    Optimizing Service Attributes: The Seller's Utility Problem,

    DECISION SCIENCES, Issue 2 2001
    Fred F. Easton
    Abstract Service designers predict market share and sales for their new designs by estimating consumer utilities. The service's technical features (for example, overnight parcel delivery), its price, and the nature of consumer interactions with the service delivery system influence those utilities. Price and the service's technical features are usually quite objective and readily ascertained by the consumer. However, consumer perceptions about their interactions with the service delivery system are usually far more subjective. Furthermore, service designers can only hope to influence those perceptions indirectly through their decisions about nonlinear processes such as employee recruiting, training, and scheduling policies. Like the service's technical features, these process choices affect quality perceptions, market share, revenues, costs, and profits. We propose a heuristic for the NP-hard service design problem that integrates realistic service delivery cost models with conjoint analysis. The resulting seller's utility function links expected profits to the intensity of a service's influential attributes and also reveals an ideal setting or level for each service attribute. In tests with simulated service design problems, our proposed configurations compare quite favorably with the designs suggested by other normative service design heuristics. [source]
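The seller's-utility idea can be sketched as follows. All names and numbers here are hypothetical: conjoint-style part-worths per attribute level, a per-level delivery-cost model, and a simple logit share against a single rival; the actual paper couples conjoint analysis with realistic service delivery cost models and a heuristic for the NP-hard design problem, whereas this toy instance is small enough to enumerate.

```python
import math
from itertools import product

# Hypothetical part-worths and unit delivery costs per attribute level.
PART_WORTHS = {"speed": [0.0, 0.8, 1.3], "support": [0.0, 0.5, 0.9]}
COSTS = {"speed": [1.0, 2.0, 4.0], "support": [0.5, 1.5, 3.0]}
PRICE, MARKET, RIVAL_UTIL = 10.0, 1000.0, 1.0

def seller_profit(levels):
    """Expected profit of a design: logit market share times margin."""
    u = sum(PART_WORTHS[a][l] for a, l in levels.items())
    share = math.exp(u) / (math.exp(u) + math.exp(RIVAL_UTIL))
    unit_cost = sum(COSTS[a][l] for a, l in levels.items())
    return share * MARKET * (PRICE - unit_cost)

def best_design():
    """Enumerate all level combinations and keep the most profitable."""
    attrs = list(PART_WORTHS)
    combos = (dict(zip(attrs, c))
              for c in product(*(range(len(PART_WORTHS[a])) for a in attrs)))
    best = max(combos, key=seller_profit)
    return best, seller_profit(best)
```

The maximizer reveals an "ideal level" per attribute in the abstract's sense: the point where richer service stops paying for its delivery cost.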


    Monolingual, bilingual, trilingual: infants' language experience influences the development of a word-learning heuristic

    DEVELOPMENTAL SCIENCE, Issue 5 2009
    Krista Byers-Heinlein
    How infants learn new words is a fundamental puzzle in language acquisition. To guide their word learning, infants exploit systematic word-learning heuristics that allow them to link new words to likely referents. By 17 months, infants show a tendency to associate a novel noun with a novel object rather than a familiar one, a heuristic known as disambiguation. Yet, the developmental origins of this heuristic remain unknown. We compared disambiguation in 17- to 18-month-old infants from different language backgrounds to determine whether language experience influences its development, or whether disambiguation instead emerges as a result of maturation or social experience. Monolinguals showed strong use of disambiguation, bilinguals showed marginal use, and trilinguals showed no disambiguation. The number of languages being learned, but not vocabulary size, predicted performance. The results point to a key role for language experience in the development of disambiguation, and help to distinguish among theoretical accounts of its emergence. [source]


    Multiple causality in developmental disorders: methodological implications from computational modelling

    DEVELOPMENTAL SCIENCE, Issue 5 2003
    Michael S.C. Thomas
    When developmental disorders are defined on the basis of behavioural impairments alone, there is a risk that individuals with different underlying cognitive deficits will be grouped together on the basis that they happen to share a certain impairment. This phenomenon is labelled multiple causality. In contrast, a developmental disorder generated by a single underlying cognitive deficit may nevertheless show variable patterns of impairments due to individual differences. Connectionist computational models of development are used to investigate whether there may be ways to distinguish disorder groups with a single underlying cause (homogeneous disorder groups) from disorder groups with multiple underlying causes (heterogeneous disorder groups) on the basis of behavioural measures alone. A heuristic is proposed to assess the underlying causal homogeneity of the disorder group based on the variability of different behavioural measures from the target domain. Heterogeneous disorder groups are likely to show smaller variability on the measure used to define the disorder than on subsequent behavioural measures, while homogeneous groups should show approximately equivalent variability. Homogeneous disorder groups should show reductions in the variability of behavioural measures over time, while heterogeneous groups may not. It is demonstrated how these predictions arise from computational assumptions, and their use is illustrated with reference to behavioural data on naming skills from two developmental disorder groups, Williams syndrome and children with Word Finding Difficulties. [source]
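The proposed variability heuristic lends itself to a direct numerical check. The sketch below is a minimal reading of the abstract's prediction, using the coefficient of variation as the variability measure (one reasonable choice, not necessarily the authors'); the sample data in the test are invented.

```python
import statistics

def variability_profile(defining, other_measures):
    """Compare variability on the measure used to define a disorder
    group with variability on subsequent behavioural measures.
    Per the abstract, a heterogeneous (multiple-cause) group tends to
    show markedly lower variability on the defining measure, while a
    homogeneous group shows roughly equivalent variability throughout."""
    def cv(xs):  # coefficient of variation: stdev relative to the mean
        return statistics.stdev(xs) / statistics.mean(xs)
    cv_def = cv(defining)
    cv_others = [cv(m) for m in other_measures]
    return {
        "cv_defining": cv_def,
        "cv_other": cv_others,
        "suggests_heterogeneous": all(cv_def < c for c in cv_others),
    }
```

A tightly clustered defining measure alongside widely spread follow-up measures flags the group as potentially heterogeneous; comparable spreads do not.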


    Paranoid thinking as a heuristic

    EARLY INTERVENTION IN PSYCHIATRY, Issue 3 2010
    Antonio Preti
    Abstract Paranoid thinking can be viewed as a human heuristic used by individuals to deal with uncertainty during stressful situations. Under stress, individuals are likely to emphasize the threatening value of neutral stimuli and to rely more heavily on a paranoia-based heuristic to interpret events and guide their decisions. Paranoid thinking can also be activated by stress arising from the possibility of losing a good opportunity; this may result in an abnormal allocation of attentional resources to social agents. A better understanding of the interplay between cognitive heuristics and emotional processes may help to detect situations in which paranoid thinking is likely to be exacerbated, and thereby improve intervention for individuals with delusional disorders. [source]


    Multiobjective heuristic approaches to seismic design of steel frames with standard sections

    EARTHQUAKE ENGINEERING AND STRUCTURAL DYNAMICS, Issue 11 2007
    M. Ohsaki
    Abstract The seismic design problem of a steel moment-resisting frame is formulated as a multiobjective programming problem. The total structural (material) volume and the plastic dissipated energy at the collapse state against severe seismic motions are considered as performance measures. Geometrically nonlinear inelastic time-history analysis is carried out against recorded ground motions that are incrementally scaled to reach the predefined collapse state. The frame members are chosen from lists of available standard sections. Simulated annealing (SA) and tabu search (TS), which are categorized as single-point-search heuristics, are applied to the multiobjective optimization problem. It is shown in the numerical examples that the frames that collapse with uniform interstorey drift ratios against various levels of ground motions can be obtained as a set of Pareto optimal solutions. Copyright © 2007 John Wiley & Sons, Ltd. [source]
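The SA side of the abstract can be sketched generically: a single-point search over discrete section choices that accepts moves by an occasionally redrawn weighted scalarization while archiving every nondominated point visited. The section list, the member count, and the "flexibility" surrogate for drift are all invented stand-ins; the real method evaluates designs by nonlinear inelastic time-history analysis, which this toy objective replaces entirely.

```python
import math
import random

SECTIONS = [1.0, 1.5, 2.2, 3.0, 4.5]   # hypothetical section areas
N_MEMBERS = 6

def objectives(design):
    volume = sum(SECTIONS[s] for s in design)             # material volume
    flexibility = sum(1.0 / SECTIONS[s] for s in design)  # toy drift proxy
    return volume, flexibility

def dominates(a, b):
    """a dominates b when it is at least as good in both objectives
    and not identical (both objectives minimized)."""
    return a[0] <= b[0] and a[1] <= b[1] and a != b

def sa_pareto(iters=2000, t0=1.0, seed=0):
    rng = random.Random(seed)
    design = [rng.randrange(len(SECTIONS)) for _ in range(N_MEMBERS)]
    cur = objectives(design)
    archive = {tuple(design): cur}
    w = rng.random()                     # scalarization weight
    for k in range(iters):
        t = t0 * (1.0 - k / iters) + 1e-9
        cand = design[:]
        cand[rng.randrange(N_MEMBERS)] = rng.randrange(len(SECTIONS))
        obj = objectives(cand)
        delta = (w * obj[0] + (1 - w) * obj[1]) - (w * cur[0] + (1 - w) * cur[1])
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            design, cur = cand, obj
            archive[tuple(design)] = obj
        if k % 200 == 0:                 # redraw weight to sweep the front
            w = rng.random()
    pts = list(archive.values())
    return [p for p in pts if not any(dominates(q, p) for q in pts)]
```

The returned list approximates a Pareto front trading structural volume against the flexibility proxy, mirroring the volume-versus-dissipated-energy trade-off in the paper.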


    HOW DO POLICY-MAKERS ACTUALLY SOLVE PROBLEMS?

    ECONOMICS & POLITICS, Issue 2 2009
    EVIDENCE FROM THE FRENCH LOCAL PUBLIC SECTOR
    This article examines how policy-makers solve problems within local representative democracies. It will be argued that politicians cannot undertake an exhaustive search of all possible policy choices; instead, they might use an incremental strategy such as the hill-climbing heuristic. These possibilities will be formalized using the median voter model as an analytical framework. The corresponding models will then be estimated over a set of French jurisdictions (the départements). The empirical results lend support to the hill-climbing model, given that: (1) for social welfare and secondary school expenditures, the influence of the past is significant; (2) a pure model of incrementalism, without any exogenous variables, is not appropriate for explaining the behavior of departmental council members; and (3) the impact of the past is more significant and stronger when expenditure levels are higher. [source]
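The hill-climbing heuristic the article attributes to policy-makers can be sketched in a few lines: start from last period's policy, try small adjustments in each direction, and keep a change only if it scores better. The scalar policy variable and quadratic score below are purely illustrative.

```python
def hill_climb(score, x0, step=1.0, iters=100):
    """Incremental search: from the current policy x, evaluate x +/- step
    and move to the best-scoring neighbour; stop when no neighbour
    improves on the status quo."""
    x = x0
    for _ in range(iters):
        best = max((x, x + step, x - step), key=score)
        if best == x:
            break
        x = best
    return x
```

The status-quo anchoring is the point of contact with the empirical findings: the search never strays far from last period's choice, so the past strongly predicts the present.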


    Depression and reliance on ease-of-retrieval experiences

    EUROPEAN JOURNAL OF SOCIAL PSYCHOLOGY, Issue 2 2008
    Rainer Greifeneder
    The relationship between level of depressive symptomatology and reliance on the ease-of-retrieval heuristic was investigated. In two studies, differences in ease-of-retrieval were instigated by means of the paradigm introduced by Schwarz and co-workers. Subsequently, participants were screened for depressive symptoms with the Allgemeine Depressionsskala (ADS, Experiments 1 and 2) and the Beck Depression Inventory (BDI, Experiment 2). In both experiments, participants were randomly selected from a non-clinical population. Results indicate that participants with low levels of depressive symptomatology relied on experienced ease or difficulty, whereas individuals with high levels of depressive symptomatology based their judgment on the accessible content information. Theoretical and practical implications of these findings are discussed. Copyright © 2007 John Wiley & Sons, Ltd. [source]