Combinatorial Optimization

Terms modified by Combinatorial Optimization

  • combinatorial optimization problem

  • Selected Abstracts


    Population Synthesis: Comparing the Major Techniques Using a Small, Complete Population of Firms

    GEOGRAPHICAL ANALYSIS, Issue 2 2009
    Justin Ryan
    Recently, disaggregate modeling efforts that rely on microdata have received wide attention by scholars and practitioners. Synthetic population techniques have been devised and are used as a viable alternative to the collection of microdata that normally are inaccessible because of confidentiality concerns or incomplete because of high acquisition costs. The two most widely discussed synthetic techniques are the synthetic reconstruction method (IPFSR), which makes use of iterative proportional fitting (IPF) techniques, and the combinatorial optimization (CO) method. Both methods are described in this article and then evaluated in terms of their ability to recreate a known population of firms, using limited data extracted from the parent population of the firms. Testing a synthetic population against a known population is seldom done, because obtaining an entire population usually is too difficult. The case presented here uses a small, complete population of firms for the City of Hamilton, Ontario, for the year 1990; firm attributes compiled are number of employees, 3-digit standard industrial classification, and geographic location. Results are summarized for experiments based upon various combinations of sample size and tabulation detail designed to maximize the accuracy of resulting synthetic populations while holding input data costs to a minimum. The output from both methods indicates that increases in sample size and tabulation detail result in higher quality synthetic populations, although the quality of the generated population is more sensitive to increases in tabular detail. Finally, most tests conducted with the created synthetic populations suggest that the CO method is superior to the IPFSR method. [source]
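The IPF step underlying the IPFSR method alternately rescales a seed contingency table so that its row and column sums match known control totals. A minimal two-dimensional sketch with illustrative numbers follows; the function name, tolerance, and convergence test are assumptions for illustration, not the authors' implementation:

```python
import numpy as np

def ipf_2d(seed, row_targets, col_targets, tol=1e-8, max_iter=1000):
    """Iteratively rescale a seed table until its margins match the targets.

    Assumes row_targets and col_targets sum to the same population total.
    """
    table = seed.astype(float).copy()
    for _ in range(max_iter):
        # Scale each row so row sums match the row targets.
        table *= (row_targets / table.sum(axis=1))[:, None]
        # Scale each column so column sums match the column targets.
        table *= col_targets / table.sum(axis=0)
        # Stop once the row margins are (re)satisfied after the column step.
        if np.allclose(table.sum(axis=1), row_targets, atol=tol):
            break
    return table

seed = np.array([[1.0, 2.0], [3.0, 4.0]])
fitted = ipf_2d(seed,
                row_targets=np.array([40.0, 60.0]),
                col_targets=np.array([50.0, 50.0]))
```

The seed table supplies the interaction structure (here, an arbitrary small example), while the targets play the role of the published marginal tabulations that IPFSR fits against.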


    Shortest paths in fuzzy weighted graphs

    INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, Issue 11 2004
    Chris Cornelis
    The task of finding shortest paths in weighted graphs is one of the archetypical problems encountered in the domain of combinatorial optimization and has been studied intensively over the past five decades. More recently, fuzzy weighted graphs, along with generalizations of algorithms for finding optimal paths within them, have emerged as an adequate modeling tool for prohibitively complex and/or inherently imprecise systems. We review and formalize these algorithms, paying special attention to the ranking methods used for path comparison. We show which criteria must be met for algorithm correctness and present an efficient method, based on defuzzification of fuzzy weights, for finding optimal paths. © 2004 Wiley Periodicals, Inc. Int J Int Syst 19: 1051–1068, 2004. [source]
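The defuzzification-based strategy mentioned above can be sketched by collapsing each fuzzy edge weight to a crisp value and then running ordinary Dijkstra. Centroid defuzzification of triangular fuzzy numbers is one common choice and is used here purely for illustration; the article treats ranking and defuzzification methods more generally, and the graph encoding is an assumption:

```python
import heapq

def defuzzify(tri):
    # Centroid of a triangular fuzzy number (a, b, c).
    a, b, c = tri
    return (a + b + c) / 3.0

def fuzzy_shortest_path(graph, source, target):
    """Dijkstra over crisp weights obtained by defuzzifying fuzzy edge weights.

    graph: dict mapping node -> list of (neighbor, (a, b, c)) edges.
    Returns (path, crisp cost of that path).
    """
    dist, prev = {source: 0.0}, {}
    heap, visited = [(0.0, source)], set()
    while heap:
        d, u = heapq.heappop(heap)
        if u in visited:
            continue
        visited.add(u)
        if u == target:
            break
        for v, w in graph.get(u, []):
            nd = d + defuzzify(w)
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    # Reconstruct the path by walking predecessors back to the source.
    path, node = [], target
    while node != source:
        path.append(node)
        node = prev[node]
    path.append(source)
    return list(reversed(path)), dist[target]

graph = {"A": [("B", (1, 2, 3)), ("C", (2, 4, 6))],
         "B": [("C", (0.5, 1, 1.5))]}
path, cost = fuzzy_shortest_path(graph, "A", "C")
```

In this toy instance the two-edge route defuzzifies to 2 + 1 = 3, beating the direct edge's 4, so the algorithm returns A→B→C.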


    An annotated bibliography of GRASP – Part II: Applications

    INTERNATIONAL TRANSACTIONS IN OPERATIONAL RESEARCH, Issue 2 2009
    Paola Festa
    Abstract A greedy randomized adaptive search procedure (GRASP) is a metaheuristic for combinatorial optimization. It is a multi-start or iterative process, in which each GRASP iteration consists of two phases, a construction phase, in which a feasible solution is produced, and a local search phase, in which a local optimum in the neighborhood of the constructed solution is sought. Since 1989, numerous papers on the basic aspects of GRASP, as well as enhancements to the basic metaheuristic, have appeared in the literature. GRASP has been applied to a wide range of combinatorial optimization problems, ranging from scheduling and routing to drawing and turbine balancing. This is the second of two papers with an annotated bibliography of the GRASP literature from 1989 to 2008. In the companion paper, algorithmic aspects of GRASP are surveyed. In this paper, we cover the literature where GRASP is applied to scheduling, routing, logic, partitioning, location, graph theory, assignment, manufacturing, transportation, telecommunications, biology and related fields, automatic drawing, power systems, and VLSI design. [source]
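The two GRASP phases described above can be sketched on a toy 0/1 knapsack instance. The restricted candidate list (RCL), the greediness parameter `alpha`, and a one-for-one swap neighborhood are standard GRASP ingredients, but the specific instance, parameter values, and function name below are illustrative assumptions, not taken from the surveyed papers:

```python
import random

def grasp_knapsack(values, weights, capacity, iters=50, alpha=0.3, seed=0):
    """GRASP sketch for 0/1 knapsack: randomized greedy construction
    followed by a one-in-one-out swap local search."""
    rng = random.Random(seed)
    n = len(values)
    best_sol, best_val = set(), 0

    def total(sol):
        return sum(values[i] for i in sol), sum(weights[i] for i in sol)

    for _ in range(iters):
        # Construction phase: pick randomly from a restricted candidate
        # list (RCL) of items whose value/weight ratio is near the best.
        sol, cap = set(), capacity
        cand = [i for i in range(n) if weights[i] <= cap]
        while cand:
            ratios = {i: values[i] / weights[i] for i in cand}
            hi, lo = max(ratios.values()), min(ratios.values())
            rcl = [i for i in cand if ratios[i] >= hi - alpha * (hi - lo)]
            pick = rng.choice(rcl)
            sol.add(pick)
            cap -= weights[pick]
            cand = [i for i in range(n) if i not in sol and weights[i] <= cap]
        # Local search phase: swap one item out for a more valuable one in.
        improved = True
        while improved:
            improved = False
            val, wt = total(sol)
            for i in list(sol):
                for j in range(n):
                    if j not in sol and wt - weights[i] + weights[j] <= capacity \
                            and values[j] > values[i]:
                        sol.remove(i)
                        sol.add(j)
                        improved = True
                        break
                if improved:
                    break
        val, _ = total(sol)
        if val > best_val:
            best_val, best_sol = val, set(sol)
    return best_sol, best_val

best_sol, best_val = grasp_knapsack([60, 100, 120], [10, 20, 30], 50)
```

On this instance the greedy construction favors the best value/weight ratio, and the local search then swaps its way to the optimal pair of items; each iteration restarts construction, which is what makes GRASP a multi-start method.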


    An annotated bibliography of GRASP – Part I: Algorithms

    INTERNATIONAL TRANSACTIONS IN OPERATIONAL RESEARCH, Issue 1 2009
    Paola Festa
    Abstract A greedy randomized adaptive search procedure (GRASP) is a metaheuristic for combinatorial optimization. It is a multi-start or iterative process, in which each GRASP iteration consists of two phases, a construction phase, in which a feasible solution is produced, and a local search phase, in which a local optimum in the neighborhood of the constructed solution is sought. Since 1989, numerous papers on the basic aspects of GRASP, as well as enhancements to the basic metaheuristic have appeared in the literature. GRASP has been applied to a wide range of combinatorial optimization problems, ranging from scheduling and routing to drawing and turbine balancing. This is the first of two papers with an annotated bibliography of the GRASP literature from 1989 to 2008. This paper covers algorithmic aspects of GRASP. [source]


    Multiobjective combinatorial optimization: some approaches

    JOURNAL OF MULTI CRITERIA DECISION ANALYSIS, Issue 3-4 2008
    Murat Köksalan
    Article first published online: 9 FEB 200
    Abstract There have been many developments in multiple criteria decision-making (MCDM) during the last 50 years. Researchers from different areas have also recognized the multiple-criteria nature of problems in their application areas and tried to address these issues. Unfortunately, there has not always been sufficient information flow between researchers in the MCDM area and researchers applying MCDM to their problems. More recently, the multiobjective combinatorial optimization (MOCO) and multiobjective metaheuristic areas have been enjoying substantial developments. These problems are hard to solve. Many researchers have addressed the problem of finding all nondominated solutions, which is a difficult task for MOCO problems; this difficulty leads many studies to concentrate on bicriteria problems. In this paper, I review some MCDM approaches that aim to find only the preferred solutions of the decision maker (DM). I argue that this is especially important for MOCO problems. I discuss several of our approaches that incorporate the DM's preferences into the solution process of MOCO problems and argue that there is a need for more work in this area. Copyright © 2009 John Wiley & Sons, Ltd. [source]
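A minimal sketch of the nondominated-filtering step that the MOCO approaches above build on (maximization is assumed, and the function names are illustrative):

```python
def dominates(a, b):
    """True if objective vector a dominates b under maximization:
    a is at least as good in every objective and strictly better in one."""
    return all(x >= y for x, y in zip(a, b)) and \
           any(x > y for x, y in zip(a, b))

def nondominated(points):
    """Filter a list of objective vectors down to the Pareto set.

    Duplicate vectors are kept, since neither strictly dominates the other.
    """
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

front = nondominated([(1, 5), (2, 4), (3, 3), (2, 2), (1, 1)])
```

This brute-force filter is quadratic in the number of points; the difficulty the abstract refers to is that for MOCO problems the set of candidate solutions itself grows combinatorially, which is why approaches that target only the DM's preferred solutions are attractive.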


    A personal account of the role of peptide research in drug discovery: the case of hepatitis C

    JOURNAL OF PEPTIDE SCIENCE, Issue 1 2001
    Antonello Pessi
    Abstract Although peptides themselves are not usually the end products of a drug discovery effort, peptide research often plays a key role in many aspects of this process. This will be illustrated by reviewing the experience of peptide research carried out at IRBM in the course of our study of hepatitis C virus (HCV). The target of our work is the NS3/4A protease, which is essential for maturation of the viral polyprotein. After a thorough examination of its substrate specificity we fine-tuned several substrate-derived peptides for enzymology studies, high-throughput screening and as fluorescent probes for secondary binding assays. In the course of these studies we made a key observation: the protease is inhibited by its own cleavage products. Single-analog and combinatorial optimization then yielded potent peptide inhibitors. The crucial role of the NS4A cofactor was also addressed. NS4A is a small transmembrane protein whose central domain is the minimal region sufficient for enzyme activation. Structural studies were performed with a peptide corresponding to the minimal activation domain, with a series of product inhibitors and with both. We found that NS3/4A is an induced-fit enzyme, requiring both the cofactor and the substrate to acquire its bioactive conformation; this explained some puzzling results of 'serine-trap'-type inhibitors. A more complete study of NS3 activation, however, requires the availability of the full-length NS4A protein. This was prepared by native chemical ligation, after sequence engineering to enhance its solubility; structural studies are in progress. Current work is focused on the P′ region of the substrate, which, at variance with the P region, is not used for ground-state binding to the enzyme and might give rise to inhibitors showing novel interactions with the enzyme. Copyright © 2001 European Peptide Society and John Wiley & Sons, Ltd. [source]


    A branch-and-price approach for the maximum weight independent set problem

    NETWORKS: AN INTERNATIONAL JOURNAL, Issue 4 2005
    Deepak Warrier
    Abstract The maximum weight independent set problem (MWISP) is one of the best-known and most-studied problems in combinatorial optimization. This article presents a novel approach to solving MWISP exactly by decomposing the original graph into vertex-induced subgraphs. The approach solves MWISP for the original graph by solving MWISP on the subgraphs to generate columns for a branch-and-price framework. The authors investigate different implementation techniques that can be associated with the approach, and offer computational results to identify the strengths and weaknesses of each implementation technique. © 2005 Wiley Periodicals, Inc. NETWORKS, Vol. 46(4), 198–209, 2005 [source]
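For contrast with the article's column-generation scheme, the textbook exact recursion for MWISP branches on a vertex: either exclude it, or include it and discard its closed neighborhood. The sketch below is plain branching, not the branch-and-price decomposition the authors propose, and the adjacency encoding is an assumption:

```python
def mwis(adj, weights):
    """Exact maximum weight independent set by simple branching.

    adj: dict node -> set of neighbors; weights: dict node -> weight.
    Returns (best weight, best independent set). Exponential time in
    general; fine for small instances.
    """
    def solve(free):
        if not free:
            return 0, set()
        # Branch on a highest-degree free vertex to shrink the graph fastest.
        v = max(free, key=lambda u: len(adj[u] & free))
        # Case 1: exclude v.
        w_out, s_out = solve(free - {v})
        # Case 2: include v and remove its neighbors from consideration.
        w_in, s_in = solve(free - {v} - adj[v])
        w_in += weights[v]
        if w_in >= w_out:
            return w_in, s_in | {v}
        return w_out, s_out

    return solve(set(adj))

adj = {"a": {"b"}, "b": {"a", "c"}, "c": {"b"}}
weights = {"a": 2, "b": 3, "c": 2}
best_w, best_set = mwis(adj, weights)
```

On the three-vertex path a–b–c, taking the two endpoints (weight 4) beats taking the heavier middle vertex alone (weight 3), which the recursion finds by exploring both branches.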