Search Procedure (search + procedure)


Selected Abstracts


Scheduling activities at oil wells with resource displacement

INTERNATIONAL TRANSACTIONS IN OPERATIONAL RESEARCH, Issue 6 2008
Arnaldo V. Moura
Abstract Before promising locations at petroliferous basins become productive oil wells, it is necessary to complete development activities at these locations. The scheduling of such activities must satisfy several conflicting constraints and attain a number of goals. Moreover, resource displacements between wells are important and must also be taken into account. The problem is NP-hard, as can be seen by a simple poly-time reduction from the Job Shop problem. This paper describes Greedy Randomized Adaptive Search Procedures (GRASPs) for the scheduling of oil well development activities with resource displacement. The heuristics were tested on real instances from a major oil company, recognized for its expertise in offshore oil exploration. Computational experiments over real instances revealed that the GRASP implementations are competitive, outperforming the software currently being used by the company. [source]
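
To make the two GRASP phases concrete, here is a minimal sketch on a stand-in problem (sequencing jobs on one machine to minimize total weighted completion time). The objective, the RCL parameter alpha, and all function names are illustrative assumptions, not the authors' implementation.

```python
# A minimal GRASP sketch: greedy randomized construction + local search.
import random

def cost(seq):
    t = total = 0
    for dur, w in seq:
        t += dur
        total += w * t
    return total

def construct(jobs, alpha, rng):
    # greedy randomized construction: pick from a restricted candidate list
    remaining, seq = list(jobs), []
    while remaining:
        ranked = sorted(remaining, key=lambda j: j[0] / j[1])  # WSPT ratio
        rcl = ranked[: max(1, int(alpha * len(ranked)))]
        pick = rng.choice(rcl)
        seq.append(pick)
        remaining.remove(pick)
    return seq

def local_search(seq):
    # first-improvement adjacent swaps until no move helps
    improved = True
    while improved:
        improved = False
        for i in range(len(seq) - 1):
            cand = seq[:i] + [seq[i + 1], seq[i]] + seq[i + 2:]
            if cost(cand) < cost(seq):
                seq, improved = cand, True
    return seq

def grasp(jobs, iters=100, alpha=0.3, seed=0):
    rng = random.Random(seed)
    best, best_cost = None, float("inf")
    for _ in range(iters):
        s = local_search(construct(jobs, alpha, rng))
        if cost(s) < best_cost:
            best, best_cost = s, cost(s)
    return best, best_cost

print(grasp([(3, 1), (2, 5), (7, 2), (4, 4), (1, 1)]))  # (duration, weight)
```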


An Automated System for Argument Invention in Law Using Argumentation and Heuristic Search Procedures

RATIO JURIS, Issue 4 2005
DOUGLAS WALTON
Argumentation schemes are forms of argument representing premise-conclusion and inference structures of common types of arguments. Schemes especially useful in law represent defeasible arguments, like argument from expert opinion. Argument diagramming is a visualization tool used to display a chain of connected arguments. One such tool, Araucaria, available free at http://araucaria.computing.dundee.ac.uk/, helps a user display an argument on the computer screen as an inverted tree structure with an ultimate conclusion as the root of the tree. These argumentation tools are applicable to analyzing a mass of evidence in a case at trial, in a manner already known in law using heuristic methods (Schum 1994) and Wigmore diagrams (Wigmore 1931). In this paper it is shown how they can be automated and applied to the task of inventing legal arguments. One important application is to proof construction in trial preparation (Palmer 2003). [source]


Statistical optimization of octree searches

COMPUTER GRAPHICS FORUM, Issue 6 2008
Rener Castro
Abstract This work emerged from the following observation: usual search procedures for octrees start from the root to retrieve the data stored at the leaves. But as the leaves are the farthest nodes from the root, why start from the root? With usual octree representations, there is no other way to access a leaf. However, hashed octrees allow direct access to any node, given its position in space and its depth in the octree. Search procedures take the position as an input, but the depth remains unknown. This work proposes to estimate the depth of an arbitrary node through a statistical optimization of the average cost of search procedures. As the highest costs of these algorithms are obtained when starting from the root, this method improves both the memory footprint, through the use of hashed octrees, and the execution time, through the proposed optimization. [source]
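
The following sketch illustrates, under simplifying assumptions, how a hashed octree permits a lookup that starts from a guessed depth rather than the root. The key scheme, probe order, and names (key, locate) are invented for illustration; the paper derives the starting depth from a statistical estimator.

```python
# Hedged sketch of a hashed-octree lookup starting at an estimated depth.
def key(pos, depth):
    """Integer key of the octree cell containing pos (in [0,1)^3) at depth."""
    n = 1 << depth  # cells per axis at this depth
    i, j, k = (min(int(c * n), n - 1) for c in pos)
    return (depth, i, j, k)

def locate(table, pos, depth_guess, max_depth):
    # probe the guessed depth first, then walk toward the root, then deeper
    order = ([depth_guess]
             + list(range(depth_guess - 1, -1, -1))
             + list(range(depth_guess + 1, max_depth + 1)))
    for d in order:
        node = table.get(key(pos, d))
        if node is not None and node.get("leaf"):
            return d, node
    return None

# toy hashed octree: one leaf at depth 3 holding the data for (0.1, 0.2, 0.3)
table = {key((0.1, 0.2, 0.3), 3): {"leaf": True, "data": "payload"}}
print(locate(table, (0.1, 0.2, 0.3), depth_guess=2, max_depth=5))
```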


A grasp-based motion planning algorithm for character animation

COMPUTER ANIMATION AND VIRTUAL WORLDS (PREV: JNL OF VISUALISATION & COMPUTER ANIMATION), Issue 3 2001
Maciej Kalisiak
The design of autonomous characters capable of planning their own motions continues to be a challenge for computer animation. We present a novel kinematic motion-planning algorithm for character animation which addresses some of the outstanding problems. The problem domain for our algorithm is as follows: given a constrained environment with designated handholds and footholds, plan a motion through this space towards some desired goal. Our algorithm is based on a stochastic search procedure which is guided by a combination of geometric constraints, posture heuristics, and distance-to-goal metrics. The method provides a single framework for the use of multiple modes of locomotion in planning motions through these constrained, unstructured environments. We illustrate our results with demonstrations of a human character using walking, swinging, climbing, and crawling in order to navigate through various obstacle courses. Copyright © 2001 John Wiley & Sons, Ltd. [source]
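
The guidance idea, a stochastic search biased by a distance-to-goal metric and restricted by feasibility constraints, can be sketched on a toy grid world; the grid, the move set, and the bias weights below are invented stand-ins for the paper's kinematic character domain.

```python
# Hedged stand-in for the guidance idea only, not the paper's planner.
import random

def plan(start, goal, blocked, steps=4000, seed=1):
    rng = random.Random(seed)
    pos, path = start, [start]
    for _ in range(steps):
        if pos == goal:
            return path
        moves = [(pos[0] + dx, pos[1] + dy)
                 for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))]
        feasible = [m for m in moves if m not in blocked]  # constraint check
        if not feasible:
            return None
        # stochastic choice biased toward a smaller distance to the goal
        weights = [1.0 / (1 + abs(m[0] - goal[0]) + abs(m[1] - goal[1]))
                   for m in feasible]
        pos = rng.choices(feasible, weights=weights)[0]
        path.append(pos)
    return None

wall = {(2, y) for y in range(4)}   # obstacle forcing a detour
print(plan((0, 0), (4, 0), wall))
```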


Performance Measures for Selection of Metamodels to be Used in Simulation Optimization

DECISION SCIENCES, Issue 1 2002
Anthony C. Keys
ABSTRACT This paper points out the need for performance measures in the context of simulation optimization and suggests six such measures. Two of the measures are indications of absolute performance, whereas the other four are useful in assessing the relative performance of various candidate metamodels. The measures assess performance on three fronts: accuracy of placing optima in the correct location, fit to the response, and fit to the character of the surface (expressed in terms of the number of optima). Examples are given providing evidence of the measures' utility: one in a limited scenario deciding which of two competing metamodels to use as simulation optimization response surfaces vary, and the other in a scenario of a researcher developing a new, sequential optimization search procedure. [source]
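
As a hedged sketch of two of the kinds of measures described above (error in the predicted optimum's location and fit to the response), consider the following; the toy response and metamodel are assumptions, and the paper's six measures are defined differently in detail.

```python
# Illustrative metamodel performance measures on a 1D grid.
import numpy as np

def optimum_location_error(metamodel, response, grid):
    """Distance between predicted and true optimal grid points (minimization)."""
    pred = np.array([metamodel(x) for x in grid])
    true = np.array([response(x) for x in grid])
    return np.linalg.norm(grid[pred.argmin()] - grid[true.argmin()])

def fit_rmse(metamodel, response, grid):
    pred = np.array([metamodel(x) for x in grid])
    true = np.array([response(x) for x in grid])
    return np.sqrt(np.mean((pred - true) ** 2))

grid = np.linspace(-2, 2, 101).reshape(-1, 1)
response = lambda x: (x[0] - 0.5) ** 2          # "true" simulation response
candidate = lambda x: (x[0] - 0.4) ** 2 + 0.1   # candidate metamodel
print(optimum_location_error(candidate, response, grid),
      fit_rmse(candidate, response, grid))
```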


Fuzzy based fast dynamic programming solution of unit commitment with ramp constraints

EXPERT SYSTEMS, Issue 4 2009
S. Patra
Abstract: A fast dynamic programming technique based on a fuzzy based unit selection procedure is proposed in this paper for the solution of the unit commitment problem with ramp constraints. The curse of dimensionality of the dynamic programming technique is eliminated by minimizing the number of prospective solution paths to be stored at each stage of the search procedure. Heuristics like priority ordering of the units, unit grouping, fast economic dispatch based on priority ordering, and avoidance of repeated economic dispatch through memory action have been employed to make the algorithm fast. The proposed method produced comparable results with the best performing methods found in the literature. [source]
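
The path-pruning idea can be sketched as a stage-by-stage dynamic program that stores only the k cheapest partial paths at each stage; the toy states and ramping cost below are illustrative, and the fuzzy unit-selection logic is omitted entirely.

```python
# Hedged sketch of stage-wise DP with a bounded number of stored paths.
import heapq

def beam_dp(stages, transition_cost, k=3):
    """stages: list of lists of candidate states, one list per time stage."""
    beam = [(0.0, [s]) for s in stages[0]]
    for stage in stages[1:]:
        candidates = [(c + transition_cost(path[-1], s), path + [s])
                      for c, path in beam for s in stage]
        beam = heapq.nsmallest(k, candidates, key=lambda t: t[0])  # prune
    return min(beam, key=lambda t: t[0])

# toy: states = number of committed units; cost penalizes ramping + running
stages = [[1, 2], [1, 2, 3], [2, 3], [1, 2]]
cost = lambda a, b: abs(a - b) + 0.5 * b
print(beam_dp(stages, cost))
```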


An Empirically Based Implementation and Evaluation of a Hierarchical Model for Commuting Flows

GEOGRAPHICAL ANALYSIS, Issue 3 2010
Jens Petter Gitlesen
This article provides an empirical evaluation of a hierarchical approach to modeling commuting flows. As the gravity family of spatial interaction models represents a benchmark for empirical evaluation, we begin by reviewing basic aspects of these models. The hierarchical modeling framework is the same as that used by Thorsen, Ubøe, and Nævdal (1999). However, because some modifications are required to construct a more workable model, we undertake a relatively detailed presentation of the model, rather than merely referring to the presentation in Thorsen, Ubøe, and Nævdal (1999). The model uses a hierarchical specification of a transportation network and the individual search procedure. Journeys to work are determined by the effects of distance deterrence and of intervening opportunities, and by the location of potential destinations relative to alternatives at subsequent levels in a transportation network. The model calibration uses commuting data from a region in western Norway. The estimated parameter values are reasonable, and the explanatory power is very satisfactory when compared with the results of a competing destinations approach. [source]
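
For reference, the benchmark gravity family reviewed above can be written, in its origin-constrained form, as T_ij = O_i D_j f(d_ij) / Σ_k D_k f(d_ik). A minimal sketch with an assumed exponential deterrence function follows; all numbers are illustrative.

```python
# Hedged sketch of an origin-constrained gravity model.
import numpy as np

def origin_constrained_gravity(O, D, dist, beta=0.1):
    f = np.exp(-beta * dist)                    # distance deterrence
    w = D[None, :] * f                          # destination attractiveness
    return O[:, None] * w / w.sum(axis=1, keepdims=True)

O = np.array([100.0, 50.0])                     # commuters at each origin
D = np.array([80.0, 40.0, 30.0])                # jobs at each destination
dist = np.array([[2.0, 5.0, 9.0],
                 [6.0, 1.0, 4.0]])              # travel distances
T = origin_constrained_gravity(O, D, dist)
print(T, T.sum(axis=1))                         # row sums reproduce O
```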


Adaptive pattern nulling design of linear array antenna by phase-only perturbations using memetic algorithms

INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN BIOMEDICAL ENGINEERING, Issue 11 2008
Chao-Hsing Hsu
Abstract In this paper, the pattern nulling of a linear array for interference cancellation is derived by phase-only perturbations using memetic algorithms (MAs). MAs incorporate a local search procedure into genetic algorithms to improve their search ability, and can be viewed as an improved form of the traditional genetic algorithm. The local search procedure also avoids a shortcoming of traditional genetic algorithms, whose termination criteria must be set up by trial and error. The MA is applied to find the pattern nulling of the proposed adaptive antenna. This design for radiation pattern nulling of an adaptive antenna can suppress interference by placing a null at the direction of the interfering source, i.e. to increase the signal-to-interference ratio. The proposed adaptive antenna optimization technique is also able to address the multipath problem that exists in practical wireless communication systems. Two examples are provided to justify the proposed phase-only perturbations approach based on MAs. Computer simulation results are given to demonstrate the effectiveness of the proposed method. Copyright © 2007 John Wiley & Sons, Ltd. [source]
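
A minimal sketch of the memetic loop, a genetic-style population whose offspring are refined by a simple local search, applied to phase-only nulling of a toy 8-element array; the array-factor model, step sizes, and population settings are assumptions, not the paper's design.

```python
# Hedged memetic (GA + local search) sketch for phase-only nulling.
import numpy as np

N, d = 8, 0.5                  # elements, spacing in wavelengths (assumed)
theta_null = np.deg2rad(30.0)  # direction of the interferer (assumed)

def null_power(phases):
    # power of the array factor in the interferer's direction
    k = 2 * np.pi * d * np.arange(N) * np.sin(theta_null)
    return abs(np.sum(np.exp(1j * (k + phases)))) ** 2

def local_search(p, step=0.05, sweeps=20):
    # simple coordinate descent on the element phases
    for _ in range(sweeps):
        for i in range(N):
            for s in (step, -step):
                q = p.copy()
                q[i] += s
                if null_power(q) < null_power(p):
                    p = q
    return p

rng = np.random.default_rng(0)
pop = [rng.uniform(-0.3, 0.3, N) for _ in range(20)]
for _ in range(30):
    pop.sort(key=null_power)
    parents = pop[:10]
    # crossover by averaging + mutation, then memetic refinement
    children = [local_search((a + b) / 2 + rng.normal(0, 0.05, N))
                for a, b in zip(parents, parents[1:] + parents[:1])]
    pop = parents + children
print(min(null_power(p) for p in pop))  # residual power at the null
```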


Defining and optimizing algorithms for neighbouring particle identification in SPH fluid simulations

INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN FLUIDS, Issue 6 2008
G. Viccione
Abstract Lagrangian particle methods such as smoothed particle hydrodynamics (SPH) are very demanding in terms of computing time for large domains. Since the numerical integration of the governing equations is only carried out for each particle on a restricted number of neighbouring ones located inside a cut-off radius rc, a substantial part of the computational burden depends on the actual search procedure; it is therefore vital that efficient methods are adopted for such a search. The cut-off radius is indeed much lower than the typical size of the domain; hence, the number of neighbouring particles is only a small fraction of the total number. Straightforward determination of which particles are inside the interaction range requires the computation of all pair-wise distances, a procedure whose computational time would be impractical or totally impossible for large problems. Two main strategies have been developed in the past in order to reduce the unnecessary computation of distances: the first based on dynamically storing each particle's neighbourhood list (Verlet list) and the second based on a framework of fixed cells. The paper presents the results of a numerical sensitivity study on the efficiency of the two procedures as a function of such parameters as the Verlet size and the cell dimensions. An insight is given into the relative computational burden; a discussion of the relative merits of the different approaches is also given and some suggestions are provided on the computational and data structure of the neighbourhood search part of SPH codes. Copyright © 2008 John Wiley & Sons, Ltd. [source]
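
A minimal sketch of the fixed-cell strategy: particles are binned into cubic cells of side r_c so that only the 27 surrounding cells need be scanned for interaction partners. The data layout below is illustrative, not an excerpt from an SPH code.

```python
# Hedged cell-list neighbour search sketch.
import numpy as np
from collections import defaultdict

def cell_list_pairs(pos, rc):
    cells = defaultdict(list)
    for idx, p in enumerate(pos):
        cells[tuple((p // rc).astype(int))].append(idx)
    pairs = []
    for (cx, cy, cz), members in cells.items():
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                for dz in (-1, 0, 1):
                    for j in cells.get((cx + dx, cy + dy, cz + dz), ()):
                        for i in members:
                            # each unordered pair passes the i < j test once
                            if i < j and np.linalg.norm(pos[i] - pos[j]) <= rc:
                                pairs.append((i, j))
    return pairs

pos = np.random.default_rng(0).uniform(0.0, 1.0, size=(200, 3))
print(len(cell_list_pairs(pos, rc=0.1)))
```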


Fast template matching using correlation-based adaptive predictive search

INTERNATIONAL JOURNAL OF IMAGING SYSTEMS AND TECHNOLOGY, Issue 3 2003
Shijun Sun
Abstract We have developed the Correlation-based Adaptive Predictive Search (CAPS) as a fast search strategy for multidimensional template matching. A 2D template is analyzed, and certain characteristics are computed from its autocorrelation. The extracted information is then used to speed up the search procedure. This method provides a significant improvement in computation time while retaining the accuracy of traditional full-search matching. We have extended CAPS to three and higher dimensions. An example of the third dimension is rotation where rotated targets can be located while again substantially reducing the computational requirements. CAPS can also be applied in multiple steps to further speed up the template matching process. Experiments were conducted to evaluate the performance of 2D, 3D, and multiple-step CAPS algorithms. Compared to the conventional full-search method, we achieved speedup ratios of up to 66.5 and 145 with 2D and 3D CAPS, respectively. © 2003 Wiley Periodicals, Inc. Int J Imaging Syst Technol 13, 169–178, 2003; Published online in Wiley InterScience (www.interscience.wiley.com). DOI 10.1002/ima.10055 [source]
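
The following hedged two-pass sketch captures the flavour of the approach, deriving a coarse search step from the width of the template's autocorrelation peak and then refining locally; it is not the published CAPS algorithm, and the width heuristic is invented for illustration.

```python
# Hedged coarse-to-fine template matching guided by autocorrelation width.
import numpy as np

def best_match(image, tmpl, step):
    H, W = image.shape
    h, w = tmpl.shape
    best, pos = -np.inf, (0, 0)
    for y in range(0, H - h + 1, step):
        for x in range(0, W - w + 1, step):
            c = np.sum(image[y:y + h, x:x + w] * tmpl)  # correlation score
            if c > best:
                best, pos = c, (y, x)
    return pos

def caps_like(image, tmpl):
    ac = np.correlate(tmpl.ravel(), tmpl.ravel(), mode="full")
    width = max(1, int(np.sum(ac > 0.5 * ac.max()) ** 0.5))  # crude peak width
    y, x = best_match(image, tmpl, step=width)                # coarse pass
    y0, x0 = max(0, y - width), max(0, x - width)
    patch = image[y0:y + width + tmpl.shape[0], x0:x + width + tmpl.shape[1]]
    dy, dx = best_match(patch, tmpl, step=1)                  # fine pass
    return y0 + dy, x0 + dx

rng = np.random.default_rng(0)
img = rng.normal(size=(64, 64))
tmpl = img[20:28, 30:38].copy()
print(caps_like(img, tmpl))   # expected near (20, 30)
```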


A tabu search procedure for coordinating production, inventory and distribution routing problems

INTERNATIONAL TRANSACTIONS IN OPERATIONAL RESEARCH, Issue 2 2010
André Luís Shiguemoto
Abstract This paper addresses the problem of optimally coordinating a production-distribution system over a multi-period finite horizon, where a production facility produces several items that are distributed to a set of customers by a fleet of homogeneous vehicles. The demand for each item at each customer is known over the horizon. The production planning determines how much to produce of each item in every period, while the distribution planning defines when customers should be visited, the amount of each item that should be delivered to customers and the vehicle routes. The objective is to minimize the sum of production and inventory costs at the facility, inventory costs at the customers and distribution costs. We also consider a related problem of inventory routing, where a supplier receives or produces known quantities of items in each period and has to solve the distribution problem. We propose a tabu search procedure for solving such problems, and this approach is compared with vendor managed policies proposed in the literature, in which the facility knows the inventory levels of the customers and determines the replenishment policies. [source]
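
A generic tabu search skeleton on a toy routing objective illustrates the mechanics referred to above: the best non-tabu move is taken even if it worsens the solution, recent moves stay tabu for a fixed tenure, and an aspiration rule overrides the list. All settings are illustrative; the paper's production-distribution model is far richer.

```python
# Hedged tabu search skeleton on a toy open-route length objective.
import itertools
import math
import random

def route_len(route, pts):
    return sum(math.dist(pts[a], pts[b]) for a, b in zip(route, route[1:]))

def tabu_search(pts, iters=200, tenure=7, seed=0):
    route = list(range(len(pts)))
    random.Random(seed).shuffle(route)
    best, best_len = route[:], route_len(route, pts)
    tabu = {}  # move -> iteration until which it is tabu
    for it in range(iters):
        moves = []
        for i, j in itertools.combinations(range(len(route)), 2):
            cand = route[:]
            cand[i], cand[j] = cand[j], cand[i]
            moves.append((route_len(cand, pts), (i, j), cand))
        for length, move, cand in sorted(moves):
            if tabu.get(move, -1) < it or length < best_len:  # aspiration
                route = cand
                tabu[move] = it + tenure
                break
        if route_len(route, pts) < best_len:
            best, best_len = route[:], route_len(route, pts)
    return best, best_len

pts = [(0, 0), (3, 1), (1, 4), (5, 2), (2, 2), (4, 4)]
print(tabu_search(pts))
```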


An annotated bibliography of GRASP,Part II: Applications

INTERNATIONAL TRANSACTIONS IN OPERATIONAL RESEARCH, Issue 2 2009
Paola Festa
Abstract A greedy randomized adaptive search procedure (GRASP) is a metaheuristic for combinatorial optimization. It is a multi-start or iterative process, in which each GRASP iteration consists of two phases, a construction phase, in which a feasible solution is produced, and a local search phase, in which a local optimum in the neighborhood of the constructed solution is sought. Since 1989, numerous papers on the basic aspects of GRASP, as well as enhancements to the basic metaheuristic, have appeared in the literature. GRASP has been applied to a wide range of combinatorial optimization problems, ranging from scheduling and routing to drawing and turbine balancing. This is the second of two papers with an annotated bibliography of the GRASP literature from 1989 to 2008. In the companion paper, algorithmic aspects of GRASP are surveyed. In this paper, we cover the literature where GRASP is applied to scheduling, routing, logic, partitioning, location, graph theory, assignment, manufacturing, transportation, telecommunications, biology and related fields, automatic drawing, power systems, and VLSI design. [source]


An annotated bibliography of GRASP , Part I: Algorithms

INTERNATIONAL TRANSACTIONS IN OPERATIONAL RESEARCH, Issue 1 2009
Paola Festa
Abstract A greedy randomized adaptive search procedure (GRASP) is a metaheuristic for combinatorial optimization. It is a multi-start or iterative process, in which each GRASP iteration consists of two phases, a construction phase, in which a feasible solution is produced, and a local search phase, in which a local optimum in the neighborhood of the constructed solution is sought. Since 1989, numerous papers on the basic aspects of GRASP, as well as enhancements to the basic metaheuristic, have appeared in the literature. GRASP has been applied to a wide range of combinatorial optimization problems, ranging from scheduling and routing to drawing and turbine balancing. This is the first of two papers with an annotated bibliography of the GRASP literature from 1989 to 2008. This paper covers algorithmic aspects of GRASP. [source]


Two-stage computing budget allocation approach for the response surface method

INTERNATIONAL TRANSACTIONS IN OPERATIONAL RESEARCH, Issue 6 2007
J. Peng
Abstract Response surface methodology (RSM) is one of the main statistical approaches to search for an input combination that optimizes the simulation output. In the early stages of RSM, an iterative steepest ascent search procedure is frequently used. In this paper, we attempt to improve this procedure by considering a more realistic case where there are computing budget constraints, and formulate a new computing budget allocation problem to look into the important issue of allocating computing budget to the design points in the local region of experimentation. We propose a two-stage computing budget allocation approach, which uses a limited budget to estimate the response surface in the first stage and then uses the rest of the budget to improve the lower bound of the estimated response at the center of the next design region in the second stage. Several numerical experiments are carried out to compare the two-stage approach with the regular factorial design, which allocates budget equally to each design point. The results show that our two-stage allocation outperforms the equal allocation, especially when the system noise is large. [source]
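
A minimal sketch of the two-stage idea under invented assumptions: part of the budget fits a first-order surface on a 2^2 factorial design, and the remainder replicates the predicted next centre point to tighten its estimate. The simulator, budget split, and step rule are all illustrative, not the paper's allocation scheme.

```python
# Hedged two-stage budget allocation sketch for a steepest-ascent RSM step.
import numpy as np

rng = np.random.default_rng(1)
noisy = lambda x: -(x[0] - 2) ** 2 - (x[1] - 1) ** 2 + rng.normal(0, 0.5)

def two_stage_step(center, half_width=0.5, budget=40, stage1_frac=0.5):
    # stage 1: equal replications at the factorial corners
    corners = center + half_width * np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]])
    reps1 = int(budget * stage1_frac) // 4
    y = [np.mean([noisy(c) for _ in range(reps1)]) for c in corners]
    X = np.column_stack([np.ones(4), corners])
    beta = np.linalg.lstsq(X, y, rcond=None)[0]        # first-order fit
    step = beta[1:] / np.linalg.norm(beta[1:])         # steepest ascent
    new_center = center + half_width * step
    # stage 2: spend the remaining budget at the new centre point
    reps2 = budget - 4 * reps1
    estimate = np.mean([noisy(new_center) for _ in range(reps2)])
    return new_center, estimate

c = np.array([0.0, 0.0])
for _ in range(5):
    c, est = two_stage_step(c)
print(c, est)   # drifts toward the optimum near (2, 1)
```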


New techniques for indexing: N-TREOR in EXPO

JOURNAL OF APPLIED CRYSTALLOGRAPHY, Issue 4 2000
Angela Altomare
Indexing of a powder diffraction pattern is still a critical point in procedures aiming at solving crystal structures from powder data. New code has been associated with the program TREOR90 in order to define an efficient peak search procedure, to modify the crystallographic decisions coded into TREOR90 to make it more exhaustive, to refine the selected unit cell automatically, and to make the entire procedure user friendly, via a graphical interface. The new program, called N-TREOR, has been integrated into the package EXPO to create a suite of programs able to provide a structural model from the analysis of the experimental pattern. N-TREOR is also available as a stand-alone program. [source]
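
A basic peak-search step of the kind such an indexing pipeline needs can be sketched as follows: flag samples that are local maxima and exceed a smoothed background by a multiple of the residual noise. This is an illustrative stand-in, not N-TREOR's actual peak-search algorithm.

```python
# Hedged peak-search sketch on a synthetic powder pattern.
import numpy as np

def find_peaks(two_theta, intensity, k=5.0, window=25):
    kernel = np.ones(window) / window
    background = np.convolve(intensity, kernel, mode="same")  # smooth baseline
    noise = np.std(intensity - background)
    peaks = []
    for i in range(1, len(intensity) - 1):
        local_max = intensity[i] > intensity[i - 1] and intensity[i] >= intensity[i + 1]
        if local_max and intensity[i] - background[i] > k * noise:
            peaks.append(two_theta[i])
    return peaks

tt = np.linspace(5, 60, 2000)
pattern = sum(100 * np.exp(-0.5 * ((tt - p) / 0.05) ** 2)
              for p in (12.3, 21.7, 30.1))                 # synthetic peaks
pattern = pattern + 5 + np.random.default_rng(0).normal(0, 1, tt.size)
print(find_peaks(tt, pattern))    # expected near 12.3, 21.7, 30.1
```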


Stochastic search for isomers on a quantum mechanical surface

JOURNAL OF COMPUTATIONAL CHEMISTRY, Issue 5 2004
Martin Saunders
Abstract In studying molecules with unusual bonding and structures, it is desirable to be able to find all the isomers that are minima on the energy surface. A stochastic search procedure is described for seeking all the isomers on a surface defined by quantum mechanical calculations involving random kicks followed by optimization. It has been applied to searching for singlet structures for C6 using the restricted Hartree–Fock/6-311G basis set. In addition to the linear chain and ring previously investigated, 11 additional structures (A–K) were located at this level. These provide a basis for discussing qualitative bonding motifs for this carbon cluster. The application of a similar idea to searching for transition states is discussed. © 2004 Wiley Periodicals, Inc. J Comput Chem 25: 621–626, 2004 [source]
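
The kick-and-optimize loop is easy to sketch if a cheap stand-in potential replaces the quantum mechanical surface (the paper uses restricted Hartree–Fock/6-311G); the pairwise potential, kick size, and deduplication tolerance below are all illustrative assumptions.

```python
# Hedged stochastic "kick then optimize" search over a toy potential surface.
import numpy as np
from scipy.optimize import minimize

def pair_energy(flat, n):
    x = flat.reshape(n, 3)
    d = np.linalg.norm(x[:, None] - x[None, :], axis=-1)
    r = d[np.triu_indices(n, 1)]
    return np.sum(r ** -12 - 2 * r ** -6)   # toy Lennard-Jones-like surface

def stochastic_search(n=6, kicks=30, kick_size=0.6, seed=0):
    rng = np.random.default_rng(seed)
    x = 3.0 * rng.uniform(-1, 1, size=3 * n)
    found = []
    for _ in range(kicks):
        kicked = x + rng.uniform(-kick_size, kick_size, size=3 * n)   # kick
        res = minimize(pair_energy, kicked, args=(n,), method="BFGS") # relax
        energy = round(res.fun, 3)
        if all(abs(energy - e) > 1e-3 for e in found):
            found.append(energy)   # record a (possibly) new minimum
        x = res.x
    return sorted(found)

print(stochastic_search())   # distinct minimum energies found
```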


Organic dyes as small molecule protein–protein interaction inhibitors for the CD40–CD154 costimulatory interaction

JOURNAL OF MOLECULAR RECOGNITION, Issue 1 2010
Peter Buchwald
Abstract It is becoming increasingly clear that small molecules can often act as effective protein–protein interaction (PPI) inhibitors, an area of increasing interest for its many possible therapeutic applications. We have identified several organic dyes and related small molecules that (i) concentration-dependently inhibit the important CD40–CD154 costimulatory interaction with activities in the low micromolar (µM) range, (ii) show selectivity toward this particular PPI, (iii) seem to bind on the surface of CD154, and (iv) concentration-dependently inhibit the CD154-induced B cell proliferation. They were identified through an iterative activity screening/structural similarity search procedure starting with suramin as lead, and the best smaller compounds, the main focus of the present work, achieved an almost 3-fold increase in ligand efficiency (ΔG0/nonhydrogen atom = 0.8 kJ/NnHa), approaching the average of known promising small-molecule PPI inhibitors (~1.0 kJ/NnHa). Since CD154 is a member of the tumor necrosis factor (TNF) superfamily of cell surface interaction molecules, inhibitory activities on the TNF-R1–TNF-α interactions were also determined to test for specificity, and the compounds selected here all showed more than 30-fold selectivity toward the CD40–CD154 interaction. Because of their easy availability in various structural scaffolds and because of their good protein-binding ability, often explored for tissue-specific staining and other purposes, such organic dyes can provide a valuable addition to the chemical space searched to identify small molecule PPI inhibitors in general. Copyright © 2009 John Wiley & Sons, Ltd. [source]
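
As a hedged arithmetic aside, a ligand-efficiency value on the scale quoted above (ΔG0 per non-hydrogen atom) can be reproduced from an assumed potency via ΔG0 = RT ln(IC50); the inhibitor and atom count below are hypothetical.

```python
# Hedged ligand-efficiency arithmetic: LE = -deltaG0 / N_heavy.
import math

R, T = 8.314e-3, 298.0   # gas constant in kJ/(mol K), temperature in K

def ligand_efficiency(ic50_molar, n_heavy_atoms):
    dg0 = R * T * math.log(ic50_molar)   # kJ/mol (negative for IC50 < 1 M)
    return -dg0 / n_heavy_atoms          # kJ/mol per non-hydrogen atom

# a hypothetical 1 uM inhibitor with 42 heavy atoms gives ~0.81 kJ/NnHa,
# the same scale as the ~0.8 kJ/NnHa figure quoted in the abstract
print(round(ligand_efficiency(1e-6, 42), 2))
```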


Neural Network Modeling of Constrained Spatial Interaction Flows: Design, Estimation, and Performance Issues

JOURNAL OF REGIONAL SCIENCE, Issue 1 2003
Manfred M Fischer
In this paper a novel modular product unit neural network architecture is presented to model singly constrained spatial interaction flows. The efficacy of the model approach is demonstrated for the origin constrained case of spatial interaction using Austrian interregional telecommunication traffic data. The model requires a global search procedure for parameter estimation, such as the Alopex procedure. A benchmark comparison against the standard origin constrained gravity model and the two-stage neural network approach, suggested by Openshaw (1998), illustrates the superiority of the proposed model in terms of the generalization performance measured by ARV and SRMSE. [source]
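
The Alopex procedure mentioned above is a correlation-based stochastic optimizer; one common variant updates each parameter by a fixed-size step whose direction flips with a probability driven by the correlation between that parameter's last change and the last change in the loss. A hedged sketch of that variant (the paper's exact implementation may differ):

```python
# Hedged Alopex-style correlation-driven stochastic optimization sketch.
import numpy as np

def alopex(loss, w, step=0.01, T=0.1, iters=3000, seed=0):
    rng = np.random.default_rng(seed)
    dw = rng.choice([-step, step], size=w.size)
    prev = loss(w)
    for _ in range(iters):
        w = w + dw
        cur = loss(w)
        corr = dw * (cur - prev)                  # per-parameter correlation
        p_flip = 1.0 / (1.0 + np.exp(-corr / T))  # flip moves that hurt
        flips = rng.random(w.size) < p_flip
        dw = np.where(flips, -dw, dw)
        prev = cur
    return w, prev

target = np.array([1.0, -2.0, 0.5])
quadratic = lambda w: float(np.sum((w - target) ** 2))
print(alopex(quadratic, np.zeros(3)))   # ends near the target, up to step noise
```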


Attribution of tumour lethality and estimation of the time to onset of occult tumours in the absence of cause-of-death information

JOURNAL OF THE ROYAL STATISTICAL SOCIETY: SERIES C (APPLIED STATISTICS), Issue 2 2000
H. Ahn
A new statistical approach is developed for estimating the carcinogenic potential of drugs and other chemical substances used by humans. Improved statistical methods are developed for rodent tumorigenicity assays that have interval sacrifices but not cause-of-death data. For such experiments, this paper proposes a nonparametric maximum likelihood estimation method for estimating the distributions of the time to onset of and the time to death from the tumour. The log-likelihood function is optimized using a constrained direct search procedure. Using the maximum likelihood estimators, the number of fatal tumours in an experiment can be imputed. By applying the procedure proposed to a real data set, the effect of calorie restriction is investigated. In this study, we found that calorie restriction delays the tumour onset time significantly for pituitary tumours. The present method can result in substantial economic savings by relieving the need for a case-by-case assignment of the cause of death or context of observation by pathologists. The ultimate goal of the method proposed is to use the imputed number of fatal tumours to modify Peto's International Agency for Research on Cancer test for application to tumorigenicity assays that lack cause-of-death data. [source]


A branch-and-price algorithm for parallel machine scheduling with time windows and job priorities

NAVAL RESEARCH LOGISTICS: AN INTERNATIONAL JOURNAL, Issue 1 2006
Jonathan F. Bard
Abstract This paper presents a branch-and-price algorithm for scheduling n jobs on m nonhomogeneous parallel machines with multiple time windows. An additional feature of the problem is that each job falls into one of a number of priority classes and may require two operations. The objective is to maximize the weighted number of jobs scheduled, where a job in a higher priority class has "infinitely" more weight or value than a job in a lower priority class. The methodology makes use of a greedy randomized adaptive search procedure (GRASP) to find feasible solutions during implicit enumeration and a two-cycle elimination heuristic when solving the pricing subproblems. Extensive computational results are presented based on data from an application involving the use of communications relay satellites. Many 100-job instances that were believed to be beyond the capability of exact methods were solved within minutes. © 2005 Wiley Periodicals, Inc. Naval Research Logistics, 2006 [source]


Models and heuristics for a minimum arborescence problem

NETWORKS: AN INTERNATIONAL JOURNAL, Issue 1 2008
Christophe Duhamel
Abstract The Minimum Arborescence problem (MAP) consists of finding a minimum cost arborescence in a directed graph. This problem is NP-Hard and is a generalization of two well-known problems: the Minimum Spanning Arborescence Problem (MSAP) and the Directed Node Weighted Steiner Tree Problem (DNWSTP). We start the model presentation in this paper by describing four models for the MSAP (including two new ones, using so-called "connectivity" constraints which forbid disconnected components) and we then describe the changes induced on the polyhedral structure of the problem by the removal of the spanning property. Only two (the two new ones) of the four models for the MSAP remain valid when the spanning property is removed. We also describe a multicommodity flow reformulation for the MAP that differs from well-known multicommodity flow reformulations in the sense that the flow conservation constraints at source and destination are replaced by inequalities. We show that the linear programming relaxation of this formulation is equivalent to the linear programming relaxation of the best of the two previous valid formulations and we also propose two Lagrangean relaxations based on the multicommodity flow reformulation. From the upper bound perspective, we describe a constructive heuristic as well as a local search procedure involving the concept of key path developed earlier for the Steiner Tree Problem. Numerical experiments on instances with up to 400 nodes are used to evaluate the proposed methods. © 2007 Wiley Periodicals, Inc. NETWORKS, 2008 [source]


On the impact of the solution representation for the Internet Protocol Network Design Problem with max-hop constraints

NETWORKS: AN INTERNATIONAL JOURNAL, Issue 2 2004
L. De Giovanni
Abstract The IP (Internet Protocol) Network Design Problem can be shortly stated as follows. Given a set of nodes and a set of traffic demands, we want to determine the minimum cost capacity installation such that all the traffic is routed. Capacity is provided by means of links of a given capacity and traffic must be loaded on the network according to the OSPF-ECM (Open Shortest Path First–Equal Commodity Multiflow) protocol, with additional constraints on the maximum number of hops. The problem is strongly NP-Hard, and the literature proposes local search-based heuristics that either do not take max-hop constraints into account or assume a simplified OSPF routing. The core of a local search approach is the network loading algorithm used to evaluate the costs of neighboring solutions. It presents critical aspects concerning both computational efficiency and memory requirements. Starting from a tabu search prototype, we show how these aspects deeply impact the design of a local search procedure, even at the logical level. We present several properties of the related network loading problem that allow us to overcome the critical issues and lead to an efficient solution evaluation. © 2004 Wiley Periodicals, Inc. NETWORKS, Vol. 44(2), 73–83, 2004 [source]


Economic contribution of French serradella (Ornithopus sativus Brot.) pasture to integrated weed management in Western Australian mixed-farming systems: an application of compressed annealing

AUSTRALIAN JOURNAL OF AGRICULTURAL & RESOURCE ECONOMICS, Issue 2 2009
Graeme J. Doole
Sowing phases of French serradella (Ornithopus sativus Brot.) pasture between extended cropping sequences in the Western Australian wheatbelt can sustain grain production through restoring soil fertility and reducing selective herbicide use. The objective of this article is to investigate the profitability of rotations involving this pasture under a variety of weed management scenarios to obtain greater insight into its value for mixed farming systems in this region. A stochastic search procedure, compressed annealing, is used to identify profitable sets of weed management strategies in a simulation model representing a large number of potential combinations of chemical and non-chemical forms of weed control. In contrast to a continuous-cropping sequence, the inclusion of a serradella phase in a rotation is profitable at high weed densities and with increasing levels of herbicide resistance. A single year of pasture in the rotation is optimal if resistance to Group A selective herbicides is present at the beginning of the planning horizon, but a three-year phase is required if resistance to multiple herbicide groups is observed. Sowing a serradella pasture twice over a two-year phase is also shown to be economically attractive given benefits of successive high weed kills. [source]
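
Compressed annealing itself can be sketched as simulated annealing in which the temperature cools while the "pressure" (the penalty weight on constraint violation) rises, so infeasible solutions are explored early and squeezed out late. The toy weed-control selection problem and all parameter values below are invented for illustration, not the article's model.

```python
# Hedged compressed annealing sketch: cooling temperature, rising pressure.
import math
import random

def compressed_annealing(neighbor, cost, violation, x0, T0=10.0, p0=0.1,
                         cool=0.995, squeeze=1.005, iters=4000, seed=0):
    rng = random.Random(seed)
    value = lambda x, p: cost(x) + p * violation(x)   # penalized objective
    x, T, p = x0, T0, p0
    best = x0
    for _ in range(iters):
        y = neighbor(x, rng)
        delta = value(y, p) - value(x, p)
        if delta <= 0 or rng.random() < math.exp(-delta / T):
            x = y
        if violation(x) == 0 and (violation(best) > 0 or cost(x) < cost(best)):
            best = x
        T *= cool      # cool the temperature
        p *= squeeze   # raise the pressure on infeasibility
    return best

costs = [30, 45, 25, 60, 20]              # cost of each control option
kills = [0.30, 0.40, 0.20, 0.50, 0.15]    # weed-kill contribution of each
cost = lambda x: sum(c for c, b in zip(costs, x) if b)
violation = lambda x: max(0.0, 0.9 - sum(k for k, b in zip(kills, x) if b))

def neighbor(x, rng):
    y = list(x)
    i = rng.randrange(len(y))
    y[i] = 1 - y[i]   # toggle one option
    return y

print(compressed_annealing(neighbor, cost, violation, [0] * 5))
```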


Strategy hubs: Domain portals to help find comprehensive information

JOURNAL OF THE AMERICAN SOCIETY FOR INFORMATION SCIENCE AND TECHNOLOGY, Issue 1 2006
Suresh K. Bhavnani
Recent studies suggest that the wide variability in type, detail, and reliability of online information motivates expert searchers to develop procedural search knowledge. In contrast to prior research that has focused on finding relevant sources, procedural search knowledge focuses on how to order multiple relevant sources with the goal of retrieving comprehensive information. Because such procedural search knowledge is neither spontaneously inferred from the results of search engines, nor from the categories provided by domain-specific portals, the lack of such knowledge leads most novice searchers to retrieve incomplete information. In domains like healthcare, such incomplete information can lead to dangerous consequences. To address the above problem, a new kind of domain portal called a Strategy Hub was developed and tested. Strategy Hubs provide critical search procedures and associated high-quality links to enable users to find comprehensive and accurate information. We begin by describing how we collaborated with physicians to systematically identify generalizable search procedures to find comprehensive information about a disease, and how these search procedures were made available through the Strategy Hub. A controlled experiment suggests that this approach can improve the ability of novice searchers to find comprehensive and accurate information, when compared to general-purpose search engines and domain-specific portals. We conclude with insights on how to refine and automate the Strategy Hub design, with the ultimate goal of helping users find more comprehensive information when searching in unfamiliar domains. [source]