Distance Function
Selected Abstracts

The impact of the 1999 CAP reforms on the efficiency of the COP sector in Spain
AGRICULTURAL ECONOMICS, Issue 3 2009, Fatima Lambarraa
Keywords: Agenda 2000; Distance function; Efficiency; Spanish COP sector
Abstract The cereal, oilseeds, and protein crop sector (COP) occupies a prominent position within the European Union's agricultural sector. Within Spain, the COP sector accounts for almost a third of total Agricultural Guidance and Guarantee Fund expenses, and half of the utilized agricultural area (UAA). The COP sector is relevant not only because of its physical and economic magnitude, but also because of the political attention it receives. The Common Agricultural Policy (CAP) reforms of the 1990s paid special attention to this sector. This article aims to determine the impacts of Agenda 2000 on a sample of Spanish COP farmers' production decisions by using an output-oriented stochastic distance function. The distance function allows for an assessment of the reform-motivated changes in total output, input use, input composition, and crop mix. It also permits an assessment of the impacts of the reform on farms' technical efficiency. Results show that the reform has shifted the production frontier inward and changed output composition in favor of voluntary set-aside land. With respect to input composition, Agenda 2000 induced a decrease in land, fertilizers, pesticides, and other inputs in favor of labor. In addition, Agenda 2000 has had a negative impact on technical efficiency. [source]

Perceptual 3D pose distance estimation by boosting relational geometric features
COMPUTER ANIMATION AND VIRTUAL WORLDS (PREV: JNL OF VISUALISATION & COMPUTER ANIMATION), Issue 2-3 2009, Cheng Chen
Abstract Traditional pose similarity functions based on joint coordinates or rotations often do not conform to human perception.
We propose a new perceptual pose distance, the Relational Geometric Distance, which accumulates differences over a set of features reflecting the geometric relations between different body parts. An extensive relational geometric feature pool containing a large number of potential features is defined, and the features effective for pose similarity estimation are selected from a set of labeled data by AdaBoost. The extensive feature pool guarantees that a wide diversity of features is considered, and the boosting ensures that the selected features are optimized when used jointly. Finally, the selected features form a pose distance function that can be used for novel poses. Experiments show that our method outperforms others in emulating human perception in pose similarity. Our method can also adapt to specific motion types and capture the features that are important for pose similarity of a certain motion type. Copyright © 2009 John Wiley & Sons, Ltd. [source]

Manifold Homotopy via the Flow Complex
COMPUTER GRAPHICS FORUM, Issue 5 2009, Bardia Sadri
Abstract It is known that the critical points of the distance function induced by a dense sample P of a submanifold Σ of ℝ^n are distributed into two groups: one lying close to Σ itself, called the shallow critical points, and the other close to the medial axis of Σ, called the deep critical points. We prove that under a (uniform) sampling assumption, the union of the stable manifolds of the shallow critical points has the same homotopy type as Σ itself, and the union of the stable manifolds of the deep critical points has the homotopy type of the complement of Σ. The separation of critical points under uniform sampling entails a separation in terms of the distance of critical points to the sample.
This means that if a given sample is dense enough with respect to two or more submanifolds of ℝ^n, the homotopy types of all such submanifolds, together with those of their complements, are captured as unions of stable manifolds of shallow versus deep critical points, in a filtration of the flow complex based on the distance of critical points to the sample. This results in an algorithm for homotopic manifold reconstruction when the target dimension is unknown. [source]

Accounting for quality in the measurement of hospital performance: evidence from Costa Rica
HEALTH ECONOMICS, Issue 7 2007, Pablo Arocena
Abstract This paper provides insights into how Costa Rican public hospitals responded to the pressure for increased efficiency and quality introduced by the reforms carried out over the period 1997–2001. To that purpose we compute a generalized output distance function by means of non-parametric mathematical programming to construct a productivity index, which accounts for productivity changes while controlling for quality of care. Our results show an improvement in hospital performance mainly driven by quality increases. The adoption of management contracts seems to have contributed to such enhancement, more notably for small hospitals. Further, productivity growth is primarily due to technical and scale efficiency change rather than technological change. A number of policy implications are drawn from these results. Copyright © 2006 John Wiley & Sons, Ltd. [source]

Piecewise constant level set method for structural topology optimization
INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN ENGINEERING, Issue 4 2009, Peng Wei
Abstract In this paper, a piecewise constant level set (PCLS) method is implemented to solve a structural shape and topology optimization problem. In the classical level set method, the geometrical boundary of the structure under optimization is represented by the zero level set of a continuous level set function, e.g. the signed distance function.
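To make the signed distance function mentioned above concrete, here is a minimal sketch (not from the paper itself): the signed distance function of a circle, negative inside the shape, zero on the boundary (the zero level set), positive outside. The circle parameters are illustrative assumptions.

```python
import math

def circle_sdf(x, y, cx=0.0, cy=0.0, r=1.0):
    """Signed distance to a circle of radius r centered at (cx, cy):
    negative inside, zero on the boundary (the zero level set),
    positive outside."""
    return math.hypot(x - cx, y - cy) - r

# The zero level set of the function traces the geometric boundary.
print(circle_sdf(0.0, 0.0))  # center: -1.0 (inside)
print(circle_sdf(1.0, 0.0))  # boundary: 0.0
print(circle_sdf(2.0, 0.0))  # outside: 1.0
```

In a level set method, this function would be sampled on a grid and evolved; here it only illustrates the sign convention.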
Instead, in the PCLS approach the boundary is described by discontinuities of PCLS functions. The PCLS method is related to phase-field methods, and the topology optimization problem is defined as a minimization problem with piecewise constant constraints, without the need of solving the Hamilton–Jacobi equation. As a result, the boundaries are not moved during the iterative procedure. Thus, it offers some advantages in treating geometries, eliminating the reinitialization and naturally nucleating holes when needed. In the paper, the PCLS method is implemented with the additive operator splitting numerical scheme, and several numerical and procedural issues of the implementation are discussed. Examples of 2D structural topology optimization problems of minimum compliance design are presented, illustrating the effectiveness of the proposed method. Copyright © 2008 John Wiley & Sons, Ltd. [source]

Subsessions: A granular approach to click path analysis
INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, Issue 7 2004, Ernestina Menasalvas
The fiercely competitive web-based electronic commerce (e-commerce) environment has made necessary the application of intelligent methods to gather and analyze information collected from consumer web sessions. Knowledge about user behavior and session goals can be discovered from the information gathered about user activities, as tracked by web clicks. Most current approaches to customer behavior analysis study the user session by examining each web page access. However, the abstraction of subsessions provides a more granular view of user activity. Here, we propose a method of increasing the granularity of user session analysis by isolating useful subsessions within sessions. Each subsession represents a high-level user activity such as performing a purchase or searching for a particular type of information.
Given a set of previously identified subsessions, we can determine at which point the user begins a preidentified subsession by tracking user clicks. With this information we can (1) optimize the user experience by precaching pages or (2) provide an adaptive user experience by presenting pages according to our estimation of the user's ultimate goal. To identify subsessions, we present an algorithm that computes frequent click paths, from which subsessions can then be isolated. The algorithm functions by scanning all user sessions and extracting all frequent subpaths, using a distance function to determine subpath similarity. Each frequent subpath represents a subsession. An analysis of the pages represented by the subsession provides additional information about semantically related activities commonly performed by users. © 2004 Wiley Periodicals, Inc. [source]

A fuzzy goal programming procedure for solving quadratic bilevel programming problems
INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, Issue 5 2003, Bijay Baran Pal
This article presents a fuzzy goal programming (FGP) procedure for solving quadratic bilevel programming problems (QBLPP). In the proposed approach, the membership functions for the defined fuzzy objective goals of the decision makers (DM) at both levels are developed first. Then, a quadratic programming model is formulated by using the notion of a distance function minimizing the degree of regret to the satisfaction of both DMs. At the first phase of the solution process, the quadratic programming model is transformed into an equivalent nonlinear goal programming (NLGP) model to maximize the membership value of each of the fuzzy objective goals to the extent possible on the basis of their priorities in the decision context.
Then, at the second phase, the concept of a linear approximation technique in goal programming is introduced for measuring the degree of satisfaction of the DMs at both levels by arriving at a compromise decision regarding the optimality of the two different sets of decision variables controlled separately by each of them. A numerical example is provided to illustrate the proposed approach. © 2003 Wiley Periodicals, Inc. [source]

Explaining agricultural productivity growth: an international perspective
AGRICULTURAL ECONOMICS, Issue 1 2010, Derek Headey
Keywords: Labor productivity; Multi-output distance function; Total factor productivity
Abstract This article presents multi-output, multi-input total factor productivity (TFP) growth rates in agriculture for 88 countries over the 1970–2001 period, estimated with both stochastic frontier analysis (SFA) and the more commonly employed data envelopment analysis (DEA). We find the results with SFA to be more plausible than with DEA, and use them to analyze trends across countries and the determinants of TFP growth in developing countries. The central finding is that policy and institutional variables, including public agricultural expenditure and proagricultural price policy reforms, are significant correlates of TFP growth. The most significant geographic correlate of TFP growth is distance to the nearest OECD country. [source]

Between ends and fibers
JOURNAL OF GRAPH THEORY, Issue 2 2007, C. Paul Bonnington
Abstract Let Γ be an infinite, locally finite, connected graph with a distance function. Given a ray P in Γ and a constant C ≥ 1, a vertex-sequence is said to be regulated by C if, for all n, the (n+1)st vertex never precedes the nth on P and each vertex of P appears at most C times in the sequence. R. Halin (Math. Ann., 157, 1964, 125–137) defined two rays to be end-equivalent if they are joined by infinitely many pairwise-disjoint paths; the resulting equivalence classes are called ends. More recently H. A. Jung (Graph Structure Theory, Contemporary Mathematics, 147, 1993, 477–484) defined rays P and Q to be b-equivalent if there exist vertex-sequences along them regulated by some constant C ≥ 1; he named the resulting equivalence classes b-fibers. Let ℱ denote the set of nondecreasing functions from ℕ into the set of positive real numbers.
The relation of f-equivalence generalizes Jung's condition. As f runs through ℱ, uncountably many equivalence relations are produced on the set of rays that are no finer than b-equivalence while, under specified conditions, no coarser than end-equivalence. Indeed, for every Γ there exists an "end-defining function" that is unbounded and sublinear and whose f-equivalence implies that P and Q are end-equivalent. Two rays are called bundle-equivalent if a sublinear function witnesses their equivalence; the resulting equivalence classes are called bundles. We pursue the notion of "initially metric" rays in relation to bundles, and show that in any bundle either all or none of its rays are initially metric. Furthermore, initially metric rays in the same bundle are end-equivalent. In the case that Γ contains translatable rays, we give some sufficient conditions for every f-equivalence class to contain uncountably many g-equivalence classes. We conclude with a variety of applications to infinite planar graphs. Among these, it is shown that two rays whose union is the boundary of an infinite face of an almost-transitive planar map are never bundle-equivalent. © 2006 Wiley Periodicals, Inc. J Graph Theory 54: 125–153, 2007 [source]

Complex-distance potential theory, wave equations, and physical wavelets
MATHEMATICAL METHODS IN THE APPLIED SCIENCES, Issue 16-18 2002, Gerald Kaiser
Abstract Potential theory in ℝ^n is extended to ℂ^n by analytically continuing the Euclidean distance function. The extended Newtonian potential φ(z) is generated by a (non-holomorphic) source distribution δ̃(z) extending the usual point source δ(x). With Minkowski space ℝ^{n,1} embedded in ℂ^{n+1}, the Laplacian Δ_{n+1} restricts to the wave operator □_{n,1} in ℝ^{n,1}. We show that δ̃(z) acts as a propagator generating solutions of the wave equation from their initial values, where the Cauchy data need not be assumed analytic.
This generalizes an old result by Garabedian, who established a connection between solutions of the boundary-value problem for Δ_{n+1} and the initial-value problem for □_{n,1}, provided the boundary data extends holomorphically to the initial data. We relate these results to the physical wavelets introduced previously. In the context of Clifford analysis, our methods can be used to extend the Borel–Pompeiu formula from ℝ^{n+1} to ℂ^{n+1}, where its restriction to Minkowski space ℝ^{n,1} provides solutions for the time-dependent Maxwell and Dirac equations. Copyright © 2002 John Wiley & Sons, Ltd. [source]

Genetic data in population viability analysis: case studies with ambystomatid salamanders
ANIMAL CONSERVATION, Issue 2 2010, K. R. Greenwald
Abstract Parameterization of population viability models is a complicated task for most types of animals, as knowledge of population demography, abundance and connectivity can be incomplete or unattainable. Here I illustrate several ways in which genetic data can be used to inform population viability analysis, via the parameterization of both initial abundance and dispersal matrices. As case studies, I use three ambystomatid salamander datasets to address the following question: how do population viability predictions change when dispersal estimates are based on genetic assignment test data versus a general dispersal–distance function? Model results showed that no local population was large enough to ensure long-term persistence in the absence of immigration, suggesting a metapopulation structure. Models parameterized with a dispersal–distance function resulted in much more optimistic predictions than those incorporating genetic data in the dispersal estimates. Under the dispersal–distance function scenario all local populations persisted; however, using genetic assignments to infer dispersal revealed local populations at risk of extinction.
Viability estimates based on dispersal–distance functions should be interpreted with caution, especially in heterogeneous landscapes. In these situations I promote the idea of model parameterization using genetic assignment tests for a more accurate portrayal of real-world dispersal patterns. [source]

A Bayesian Spatial Multimarker Genetic Random-Effect Model for Fine-Scale Mapping
ANNALS OF HUMAN GENETICS, Issue 5 2008, M.-Y. Tsai
Summary Multiple markers in linkage disequilibrium (LD) are usually used to localize the disease gene location. These markers may contribute to the disease etiology simultaneously. In contrast to single-locus tests, we propose a genetic random-effect model that accounts for the dependence between loci via their spatial structures. In this model, the locus-specific random effects measure not only the genetic disease risk, but also the correlations between markers. In other words, the model incorporates this relation in both the mean and covariance structures, and the variance components play important roles. We consider two different settings for the spatial relations. The first is our proposal, a relative distance function (RDF), which is intuitive in the sense that markers nearby are likely to correlate with each other. The second setting is a common exponential decay function (EDF). Under each setting, the inference of the genetic parameters is fully Bayesian with Markov chain Monte Carlo (MCMC) sampling. We demonstrate the validity and the utility of the proposed approach with two real datasets and simulation studies. The analyses show that the proposed model with either of the two spatial correlations performs better than the single-locus analysis. In addition, under the RDF model, a more precise estimate for the disease locus can be obtained even when the candidate markers are fairly dense.
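The exponential decay function (EDF) setting mentioned above can be sketched in a few lines; this is a generic illustration of exponentially decaying correlation with inter-marker distance, not the paper's actual parameterization, and the decay parameter rho is a hypothetical value.

```python
import math

def edf_correlation(d, rho=0.5):
    """Exponential decay function (EDF) for between-marker correlation:
    correlation falls off exponentially with inter-marker distance d.
    rho is a hypothetical decay-rate parameter."""
    return math.exp(-rho * d)

print(edf_correlation(0.0))             # a marker correlates perfectly with itself: 1.0
print(round(edf_correlation(2.0), 4))   # correlation decays with distance
```

Markers close together thus receive correlations near 1, while distant markers become nearly independent, which is the intuition shared by the relative distance function.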
In all simulations, the inference under the true model provides unbiased estimates of the genetic parameters, and the model with the spatial correlation structure does lead to greater confidence interval coverage probabilities. [source]

THE RELATIVE EFFICIENCY OF CHARTER SCHOOLS
ANNALS OF PUBLIC AND COOPERATIVE ECONOMICS, Issue 1 2009, Shawna Grosskopf
ABSTRACT: This analysis compares the technical efficiency of charter school primary and secondary campuses with that of comparable campuses in traditional Texas school districts. Charter schools are hybrids: publicly funded, but not required to meet all the state regulations relevant for traditional schools. Student performance is measured using value added on standardized tests in reading and mathematics, and efficiency is measured using the input distance function. The analysis suggests that, at least in Texas, charter schools are substantially more efficient than traditional public schools. [source]

Input usage, output mix and industry deregulation: an analysis of the Australian dairy manufacturing industry
AUSTRALIAN JOURNAL OF AGRICULTURAL & RESOURCE ECONOMICS, Issue 2 2007, Kelvin Balcombe
In this paper we estimate a translog output distance function for a balanced panel of state-level data for the Australian dairy processing sector. We estimate a fixed-effects specification employing Bayesian methods, with and without the imposition of monotonicity and curvature restrictions. Our results indicate that Tasmania and Victoria are the most technically efficient states, with New South Wales being the least efficient. The imposition of theoretical restrictions marginally affects the results, especially with respect to estimates of technical change and industry deregulation. Importantly, our bias estimates show changes in both input use and output mix that result from deregulation. Specifically, we find that deregulation has positively biased the production of butter, cheese and powders.
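Several abstracts above rely on output distance functions to score efficiency. As a toy illustration (not the translog or stochastic specifications estimated in these papers), the one-input/one-output case under constant returns to scale collapses to a ratio of productivities: a unit's output distance function value is its output/input ratio divided by the best ratio in the sample, with 1.0 marking the frontier. The data below are made up.

```python
def output_efficiency(units):
    """Output-oriented efficiency under constant returns to scale for
    the one-input/one-output case, where the DEA linear program
    collapses to a ratio: each unit's output distance function value is
    its productivity divided by the best productivity in the sample."""
    best = max(y / x for x, y in units)
    return [round((y / x) / best, 4) for x, y in units]

# Hypothetical data: (input, output) per production unit.
units = [(1.0, 2.0), (2.0, 3.0), (3.0, 3.0)]
print(output_efficiency(units))  # [1.0, 0.75, 0.5]
```

With multiple inputs and outputs the same idea requires solving a linear program per unit rather than a ratio.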
[source]

A Hybrid Approach to Multiple Fluid Simulation using Volume Fractions
COMPUTER GRAPHICS FORUM, Issue 2 2010, Nahyup Kang
Abstract This paper presents a hybrid approach to multiple fluid simulation that can handle miscible and immiscible fluids simultaneously. We combine distance functions and volume fractions to capture not only the discontinuous interface between immiscible fluids but also the smooth transition between miscible fluids. Our approach consists of four steps: velocity field computation, volume fraction advection, miscible fluid diffusion, and visualization. By providing a scheme for combining volume fractions and level set functions, we are able to take advantage of both representation schemes of fluids. From the system point of view, our work is the first approach to Eulerian grid-based multiple fluid simulation including both miscible and immiscible fluids. From the technical point of view, our approach addresses the issues arising from variable density and viscosity together with material diffusion. We show the effectiveness of our approach in handling multiple miscible and immiscible fluids through experiments. [source]

DiFi: Fast 3D Distance Field Computation Using Graphics Hardware
COMPUTER GRAPHICS FORUM, Issue 3 2004, Avneesh Sud
We present an algorithm for fast computation of discretized 3D distance fields using graphics hardware. Given a set of primitives and a distance metric, our algorithm computes the distance field for each slice of a uniform spatial grid by rasterizing the distance functions of the primitives. We compute bounds on the spatial extent of the Voronoi region of each primitive. These bounds are used to cull and clamp the distance functions rendered for each slice. Our algorithm is applicable to all geometric models and does not make any assumptions about connectivity or a manifold representation.
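The per-slice evaluation that DiFi accelerates on the GPU can be sketched as a brute-force CPU loop: for every grid cell, evaluate the distance function of each primitive and keep the minimum (here in 2D with point primitives, a simplification of the paper's setting; the culling and clamping steps are omitted).

```python
import math

def distance_field(primitives, nx, ny, spacing=1.0):
    """Brute-force analogue of slice-based distance field computation:
    for every cell of a uniform grid, evaluate the distance function of
    each primitive (here, point sites) and keep the minimum."""
    field = [[0.0] * nx for _ in range(ny)]
    for j in range(ny):
        for i in range(nx):
            x, y = i * spacing, j * spacing
            field[j][i] = min(math.hypot(x - px, y - py)
                              for px, py in primitives)
    return field

# Hypothetical primitives: two point sites on a 4x1 grid row.
f = distance_field([(0.0, 0.0), (3.0, 0.0)], nx=4, ny=1)
print(f[0])  # [0.0, 1.0, 1.0, 0.0]
```

The cell where the minimum switches from one site to the other marks the Voronoi boundary, which is what the paper's Voronoi-region bounds exploit to avoid rasterizing every primitive in every slice.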
We have used our algorithm to compute distance fields of large models composed of tens of thousands of primitives on high-resolution grids. Moreover, we demonstrate its application to medial axis evaluation and proximity computations. Compared to earlier approaches, we achieve an order of magnitude improvement in running time. Categories and Subject Descriptors (according to ACM CCS): I.3.3 [Computer Graphics]: Distance fields, Voronoi regions, graphics hardware, proximity computations [source]

Spatial point-process statistics: concepts and application to the analysis of lead contamination in urban soil
ENVIRONMETRICS, Issue 4 2005, Christian Walter
Abstract This article explores the use of spatial point-process analysis as an aid to describing topsoil lead distribution in urban environments. The data used were collected in Glebe, an inner suburb of Sydney. The approach focuses on the locations of punctual events defining a point pattern, which can be statistically described through local intensity estimates and between-point distance functions. F-, G- and K-surfaces of a marked spatial point pattern were described and used to estimate nearest-distance functions over a sliding band of quantiles belonging to the marking variable. This provided a continuous view of the point pattern properties as a function of the marking variable. Several random fields were simulated by selecting points from random, clustered or regular point processes and diffusing them. Recognition of the underlying point process using variograms derived from dense sampling was difficult because, structurally, the variograms were very similar. Point-event distance functions were useful complementary tools that, in most cases, enabled clear recognition of the clustered processes. Spatial sampling quantile point pattern analysis was defined and applied to the Glebe dataset. The analysis showed that the highest lead concentrations were strongly clustered.
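Of the between-point distance functions named above, the nearest-neighbour G function is the simplest to sketch: the empirical share of points whose nearest neighbour lies within distance r. This is a generic illustration with made-up coordinates, not the Glebe analysis (and it ignores edge correction, which real estimators apply).

```python
import math

def g_function(points, r):
    """Empirical nearest-neighbour distance function G(r): the fraction
    of points whose nearest neighbour lies within distance r. Clustered
    patterns push G(r) up at short distances relative to a Poisson
    process."""
    count = 0
    for i, p in enumerate(points):
        nnd = min(math.dist(p, q) for j, q in enumerate(points) if j != i)
        if nnd <= r:
            count += 1
    return count / len(points)

pts = [(0, 0), (0, 1), (5, 5), (5, 6)]  # two tight pairs: a clustered pattern
print(g_function(pts, 1.0))  # 1.0 -- every point has a neighbour within 1
print(g_function(pts, 0.5))  # 0.0 -- no neighbour that close
```

Comparing such an empirical G curve against simulation envelopes from a Poisson process is exactly the kind of test the article applies quantile by quantile.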
The comparison of this dataset with the simulation confidence limits of a Poisson process, a short-radius clustered point process and a geostatistical simulation showed a random process for the third quartile of lead concentrations but strong clustering for the data in the upper quartile. Thus the distribution of topsoil lead concentrations over Glebe may have resulted from several contamination processes, mainly regular or random processes with large diffusion ranges and short-range clustered processes for the hot spots. Point patterns with the same characteristics as the Glebe experimental pattern could be generated by separate additive geostatistical simulation. Spatial sampling quantile point pattern statistics can, in an easy and accurate way, be used to complement geostatistical methods. Copyright © 2005 John Wiley & Sons, Ltd. [source]

Aggregation of ordinal and cardinal preferences: a framework based on distance functions
JOURNAL OF MULTI CRITERIA DECISION ANALYSIS, Issue 3-4 2008, Jacinto González-Pachón
Abstract In this paper a collective choice function (CCF), formulated within a p-metric distance function framework, is proposed as a generator of several compromise consensuses. Although the proposed CCF is not smooth, it is demonstrated that it can be straightforwardly transformed into easily computable goal programming models. Finally, several cases of individual preference aggregation are obtained by providing different interpretations of the CCF parameters: ordinal and complete information, ordinal and partial information, and a cardinal case through 'pairwise' comparison matrices. Copyright © 2009 John Wiley & Sons, Ltd.
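The distance-based aggregation idea can be illustrated with a tiny brute-force consensus: pick the ranking minimizing the total L1 (footrule) distance to all individual rankings, i.e. the p = 1 case of a p-metric compromise. This is a generic sketch with hypothetical rankings, not the paper's goal programming formulation, and it only scales to a handful of items.

```python
from itertools import permutations

def consensus_ranking(rankings):
    """Brute-force distance-based consensus: among all orderings of the
    items, return the one minimizing the total L1 (footrule) distance
    to the individual rankings, plus that minimum distance."""
    items = sorted(rankings[0])
    best, best_cost = None, float("inf")
    for perm in permutations(items):
        cand = {item: pos for pos, item in enumerate(perm)}
        cost = sum(abs(cand[i] - r[i]) for r in rankings for i in items)
        if cost < best_cost:
            best, best_cost = perm, cost
    return best, best_cost

# Hypothetical rankings: item -> position (0 = best) for three judges.
rankings = [{"a": 0, "b": 1, "c": 2},
            {"a": 0, "b": 2, "c": 1},
            {"a": 1, "b": 0, "c": 2}]
print(consensus_ranking(rankings))  # (('a', 'b', 'c'), 4)
```

Changing the exponent p in the distance changes the compromise: p = 1 favours majority agreement, while large p favours minimizing the worst individual disagreement.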
[source]

Indicators and indexes of directional output loss and input allocative inefficiency
MANAGERIAL AND DECISION ECONOMICS, Issue 7 2008, Hirofumi Fukuyama
We extend Grosskopf et al.'s method and create an output loss indicator using directional output distance functions that allows non-radial efficiency gains in output. We compare our new output loss indicator with an indicator of input allocative inefficiency and derive the necessary and sufficient condition for equivalence between the two indicators. We also present an output loss index and a corresponding input allocative inefficiency index, and consider how the indexes are related. Then we extend our analysis to productivity change and derive the necessary and sufficient condition for an output loss change indicator to be equivalent to an indicator of input allocative inefficiency change. Copyright © 2008 John Wiley & Sons, Ltd. [source]

On the complexity of finding paths in a two-dimensional domain I: Shortest paths
MLQ - MATHEMATICAL LOGIC QUARTERLY, Issue 6 2004, Arthur W. Chou
Abstract The computational complexity of finding a shortest path in a two-dimensional domain is studied in the Turing machine-based computational model and in discrete complexity theory. This problem is studied with respect to two formulations of polynomial-time computable two-dimensional domains: (A) domains with polynomial-time computable boundaries, and (B) polynomial-time recognizable domains with polynomial-time computable distance functions. It is proved that the shortest path problem has a polynomial-space upper bound for domains of both type (A) and type (B); it has a polynomial-space lower bound for domains of type (B), and a #P lower bound for domains of type (A). (© 2004 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source]

Environmental Performance Assessment of China's Manufacturing
ASIAN ECONOMIC JOURNAL, Issue 1 2010, Tao Zhang
JEL codes: O47; P28; R15
This paper applies the data envelopment analysis approach to contribute to the debate on the environmental performance of China's manufacturing sector. Regional and periodic differences in environmental efficiency, environmental quantity and environmental change indexes in China's manufacturing sector are examined for the period between 1998 and 2002.
Within the framework of data envelopment analysis and distance functions, environmental quantity and environmental change indexes are measured as variants of the Malmquist quantity index. The overall environmental efficiency of China's manufacturing sector is very low, indicating substantial potential to reduce pollution emissions in China's manufacturing industries. The results and implications of this study can provide helpful information to improve the environmental performance of China's manufacturing sector. [source] |
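A Malmquist-style quantity index compares two periods through ratios of quantities or distance function values. As a deliberately simplified sketch (not the index computed in the paper, which is built from DEA distance functions), one can contrast growth in desirable output with growth in an undesirable output such as emissions; all figures below are hypothetical.

```python
def environmental_quantity_index(good, bad):
    """Toy Malmquist-style quantity index: growth in desirable output
    divided by growth in undesirable output between two periods.
    Values above 1 indicate production became 'greener'."""
    (g0, g1), (b0, b1) = good, bad
    return (g1 / g0) / (b1 / b0)

# Hypothetical data: output doubles while emissions rise only 25%.
print(environmental_quantity_index(good=(100, 200), bad=(40, 50)))  # 1.6
```

The paper's variants replace these raw quantity ratios with distance function values estimated from the DEA frontier, which additionally controls for inputs.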