Filtering


Kinds of Filtering

  • habitat filtering
  • information filtering
  • Kalman filtering
  • low-pass filtering
  • optimal filtering
  • spatial filtering
  • Wiener filtering

Terms modified by Filtering

  • filtering algorithm
  • filtering approach
  • filtering bleb
  • filtering error system
  • filtering method
  • filtering methods
  • filtering problem
  • filtering property
  • filtering technique
  • filtering techniques

Selected Abstracts


    HIGH-DIMENSIONAL LEARNING FRAMEWORK FOR ADAPTIVE DOCUMENT FILTERING,

    COMPUTATIONAL INTELLIGENCE, Issue 1 2003
    Wai Lam
    We investigate the unique requirements of the adaptive textual document filtering problem and propose a new high-dimensional on-line learning framework, known as the REPGER (relevant feature pool with good training example retrieval rule) algorithm to tackle this problem. Our algorithm possesses three characteristics. First, it maintains a pool of selective features with potentially high predictive power to predict document relevance. Second, besides retrieving documents according to their predicted relevance, it also retrieves incoming documents that are considered good training examples. Third, it can dynamically adjust the dissemination threshold throughout the filtering process so as to maintain a good filtering performance in a fully interactive environment. We have conducted experiments on three document corpora, namely, Associated Press, Foreign Broadcast Information Service, and Wall Street Journal to compare the performance of our REPGER algorithm with two existing on-line learning algorithms. The results demonstrate that our REPGER algorithm gives better performance most of the time. Comparison with the TREC (Text Retrieval Conference) adaptive text filtering track participants was also made. The result shows that our REPGER algorithm is comparable to them. [source]
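
As a rough illustration of the on-line filtering loop described above (not the REPGER algorithm itself), the sketch below scores incoming documents against a pool of learned term weights and nudges a dissemination threshold using relevance feedback; the class name, weights and update rule are invented for the example.

```python
# Toy sketch of adaptive document filtering with a dynamic dissemination
# threshold (illustrative only; not the REPGER algorithm from the abstract).
from collections import Counter

class AdaptiveFilter:
    def __init__(self, threshold=0.5, step=0.05):
        self.weights = Counter()   # pool of term weights learned so far
        self.threshold = threshold # dissemination threshold
        self.step = step           # how fast the threshold adapts

    def score(self, tokens):
        # Mean weight of known terms; unseen terms contribute 0.
        return sum(self.weights[t] for t in tokens) / max(len(tokens), 1)

    def filter(self, tokens):
        # Disseminate the document only if its score clears the threshold.
        return self.score(tokens) >= self.threshold

    def feedback(self, tokens, relevant):
        # On-line update: reinforce terms of relevant documents, penalize
        # terms of non-relevant ones, and nudge the threshold to keep the
        # precision/recall balance (a crude stand-in for REPGER's rule).
        delta = 1.0 if relevant else -1.0
        for t in set(tokens):
            self.weights[t] += delta / len(set(tokens))
        disseminated = self.filter(tokens)
        if disseminated and not relevant:
            self.threshold += self.step   # too permissive -> raise the bar
        elif not disseminated and relevant:
            self.threshold -= self.step   # too strict -> lower the bar

f = AdaptiveFilter()
f.feedback("stock market crash analysis".split(), relevant=True)
print(f.filter("market analysis report".split()))
```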


    LMI APPROACH TO ROBUST FILTERING FOR DISCRETE TIME-DELAY SYSTEMS WITH NONLINEAR DISTURBANCES

    ASIAN JOURNAL OF CONTROL, Issue 2 2005
    Huijun Gao
    ABSTRACT This paper investigates the problem of robust filtering for a class of uncertain nonlinear discrete-time systems with multiple state delays. It is assumed that the parameter uncertainties appearing in all the system matrices reside in a polytope, and that the nonlinearities entering into both the state and measurement equations satisfy global Lipschitz conditions. Attention is focused on the design of robust full-order and reduced-order filters guaranteeing a prescribed noise attenuation level in an H∞ or l2 - l∞ sense with respect to all energy-bounded noise disturbances for all admissible uncertainties and time delays. Both delay-dependent and delay-independent approaches are developed by using linear matrix inequality (LMI) techniques, which are applicable to systems either with or without a priori information on the size of delays. [source]


    Fast High-Dimensional Filtering Using the Permutohedral Lattice

    COMPUTER GRAPHICS FORUM, Issue 2 2010
    Andrew Adams
    Abstract Many useful algorithms for processing images and geometry fall under the general framework of high-dimensional Gaussian filtering. This family of algorithms includes bilateral filtering and non-local means. We propose a new way to perform such filters using the permutohedral lattice, which tessellates high-dimensional space with uniform simplices. Our algorithm is the first implementation of a high-dimensional Gaussian filter that is both linear in input size and polynomial in dimensionality. Furthermore it is parameter-free, apart from the filter size, and achieves a consistently high accuracy relative to ground truth (> 45 dB). We use this to demonstrate a number of interactive-rate applications of filters in as high as eight dimensions. [source]
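
The permutohedral lattice is an involved data structure; as a point of reference, the sketch below is a plain brute-force bilateral filter in NumPy, an instance of the high-dimensional Gaussian filtering the lattice accelerates, with arbitrary window size and kernel widths.

```python
# Brute-force bilateral filter for a grayscale image (reference
# implementation of the Gaussian filtering the permutohedral lattice
# accelerates; O(window^2) per pixel, so only suitable for small images).
import numpy as np

def bilateral(img, sigma_s=3.0, sigma_r=0.1, radius=6):
    h, w = img.shape
    out = np.empty_like(img)
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(xs**2 + ys**2) / (2 * sigma_s**2))  # spatial kernel
    pad = np.pad(img, radius, mode="edge")
    for i in range(h):
        for j in range(w):
            patch = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            # Range kernel: penalize intensity differences from the center.
            rng = np.exp(-((patch - img[i, j])**2) / (2 * sigma_r**2))
            wgt = spatial * rng
            out[i, j] = np.sum(wgt * patch) / np.sum(wgt)
    return out

noisy = np.clip(np.ones((32, 32)) * 0.5 +
                0.05 * np.random.randn(32, 32), 0, 1)
print(bilateral(noisy).shape)  # (32, 32)
```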


    Volumetric Filtering, Modeling and Visualization for Nano-Medicine

    COMPUTER GRAPHICS FORUM, Issue 3 2003
    Chandrajit Bajaj
    The 3D structures of individual proteins or small complexes, such as most of the Protein Data Bank entries, are still unable to yield the "full picture" of a functional biological complex. The study of large macromolecular complexes, such as viruses, ion channels, the ribosome and other macromolecular machines of various types, offers a more complete structural and functional description of the nano-machinery of life. In addition to x-ray crystallography and NMR spectroscopy, electron cryomicroscopy (cryoEM) imaging of single particles and in-vivo molecular tomographic imaging have become indispensable in revealing the structures of large macromolecular complexes at subnanometer resolutions. In this talk, I shall describe some of the recent computational advances in filtering, modeling, analysis and visualization that have propelled structure determination by cryoEM and tomographic imaging to steadily increasing accuracy. [source]


    Filtering Out the Noise

    ACADEMIC EMERGENCY MEDICINE, Issue 5 2003
    Michelle H. Biros MD
    No abstract is available for this article. [source]


    Constrained Kalman Filtering: Additional Results

    INTERNATIONAL STATISTICAL REVIEW, Issue 2 2010
    Adrian Pizzinga
    Summary This paper deals with linear state space modelling subject to general linear constraints on the state vector. The discussion concentrates on four topics: the constrained Kalman filtering versus the recursive restricted least squares estimator; a new proof of the constrained Kalman filtering under a conditional expectation framework; linear constraints under a reduced state space modelling; and state vector prediction under linear constraints. The techniques proposed are illustrated in two real problems. The first problem is related to investment analysis under a dynamic factor model, whereas the second is about making constrained predictions within a GDP benchmarking estimation. [source]
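
One common way to impose a linear equality constraint Dx = d on a Kalman filter, discussed in this literature, is to project the unconstrained estimate after each measurement update. The sketch below shows that projection step on a made-up two-state example; it is only a minimal illustration, not the paper's estimators.

```python
# Kalman filter with a linear equality constraint D x = d imposed by
# projecting the updated state estimate (illustrative 2-state example;
# the paper treats the general theory and alternatives).
import numpy as np

def kalman_step(x, P, y, F, H, Q, R, D=None, d=None):
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (y - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    if D is not None:
        # Project onto {x : D x = d} using the P-weighted (minimum-variance)
        # projection: x <- x - P D' (D P D')^{-1} (D x - d).
        W = P @ D.T @ np.linalg.inv(D @ P @ D.T)
        x = x - W @ (D @ x - d)
        P = P - W @ D @ P
    return x, P

F = np.eye(2); H = np.eye(2)
Q = 0.01 * np.eye(2); R = 0.1 * np.eye(2)
D = np.array([[1.0, 1.0]]); d = np.array([1.0])   # states must sum to 1
x, P = np.array([0.5, 0.5]), np.eye(2)
x, P = kalman_step(x, P, np.array([0.7, 0.4]), F, H, Q, R, D, d)
print(x, x.sum())   # constrained estimate sums to 1
```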


    Deblurring Images: Matrices, Spectra and Filtering by Per Christian Hansen, James G. Nagy, Dianne P. O'Leary

    INTERNATIONAL STATISTICAL REVIEW, Issue 2 2007
    Paul Marriott
    No abstract is available for this article. [source]


    Short-term travel speed prediction models in car navigation systems

    JOURNAL OF ADVANCED TRANSPORTATION, Issue 2 2006
    Seungjae Lee
    The objective of this study is the development of short-term prediction models to predict average spot speeds at the subject location over horizons of 5, 10 and 15 minutes, respectively. In this study, field data were used to compare the predictive performance of regression analysis, ARIMA, Kalman filtering and neural network models. These field data were collected from image-processing detectors on an urban expressway over 17 hours, including both peak and non-peak periods. Most of the results were reliable, but the results of the models using Kalman filtering and neural networks were more accurate and realistic than those of the others. [source]
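
As a minimal stand-in for the Kalman filtering model compared in the study, the sketch below runs a scalar random-walk Kalman filter to produce one-step-ahead spot-speed forecasts on synthetic detector data; the noise variances and data are made up.

```python
# One-step-ahead spot-speed prediction with a scalar random-walk Kalman
# filter (a minimal stand-in for the Kalman filtering model compared in
# the study; the synthetic data and noise variances are made up).
import numpy as np

def predict_speeds(speeds, q=1.0, r=4.0):
    x, p = speeds[0], 1.0          # state estimate and its variance
    preds = []
    for z in speeds:
        # Predict (random walk): speed carries over, uncertainty grows.
        p += q
        preds.append(x)            # forecast for this interval
        # Update with the measured spot speed for the interval.
        k = p / (p + r)
        x += k * (z - x)
        p *= (1 - k)
    return np.array(preds)

rng = np.random.default_rng(0)
true = 60 + 5 * np.sin(np.linspace(0, 3, 36))          # 5-min averages
obs = true + rng.normal(0, 2, true.size)               # detector noise
pred = predict_speeds(obs)
print(f"RMSE: {np.sqrt(np.mean((pred[1:] - true[1:])**2)):.2f} km/h")
```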


    Evaluating predictive performance of value-at-risk models in emerging markets: a reality check

    JOURNAL OF FORECASTING, Issue 2 2006
    Yong Bao
    Abstract We investigate the predictive performance of various classes of value-at-risk (VaR) models in several dimensions: unfiltered versus filtered VaR models, parametric versus nonparametric distributions, conventional versus extreme value distributions, and quantile regression versus inverting the conditional distribution function. By using the reality check test of White (2000), we compare the predictive power of alternative VaR models in terms of the empirical coverage probability and the predictive quantile loss for the stock markets of five Asian economies that suffered from the 1997-1998 financial crisis. The results based on these two criteria are largely compatible and indicate some empirical regularities of risk forecasts. The Riskmetrics model behaves reasonably well in tranquil periods, while some extreme value theory (EVT)-based models do better in the crisis period. Filtering often appears to be useful for some models, particularly for the EVT models, though it could be harmful for some other models. The CaViaR quantile regression models of Engle and Manganelli (2004) have shown some success in predicting the VaR risk measure for various periods, generally more stable than those that invert a distribution function. Overall, the forecasting performance of the VaR models considered varies over the three periods before, during and after the crisis. Copyright © 2006 John Wiley & Sons, Ltd. [source]
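
Of the model classes compared, the Riskmetrics approach is the simplest to state: an EWMA-filtered variance combined with a normal quantile. The sketch below shows that baseline on synthetic returns; the decay factor 0.94 and the 99% level are conventional choices, not values taken from the paper.

```python
# RiskMetrics-style one-day-ahead VaR: EWMA-filtered volatility combined
# with a normal quantile (one of the simpler model classes compared in
# the paper; lambda=0.94 and the 99% level are conventional choices).
import numpy as np

def ewma_var(returns, lam=0.94):
    z = 2.326  # ~99% standard normal quantile
    var_t = np.var(returns[:20])        # seed the variance recursion
    for r in returns:
        var_t = lam * var_t + (1 - lam) * r**2   # EWMA variance update
    return z * np.sqrt(var_t)           # VaR as a positive loss quantile

rng = np.random.default_rng(1)
rets = rng.normal(0, 0.015, 500)        # synthetic daily returns
print(f"99% 1-day VaR: {ewma_var(rets):.4f}")
```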


    Wiener-Kolmogorov Filtering and Smoothing for Multivariate Series With State-Space Structure

    JOURNAL OF TIME SERIES ANALYSIS, Issue 3 2007
    Víctor Gómez
    Abstract. Wiener-Kolmogorov filtering and smoothing usually deal with projection problems for stochastic processes that are observed over semi-infinite and doubly infinite intervals. For multivariate stationary series, there exist closed formulae based on covariance generating functions that were first given independently by N. Wiener and A.N. Kolmogorov around 1940. In this article, we consider multivariate series with a state-space structure and, using a new purely algebraic approach to the problem, we prove the equivalence between Wiener-Kolmogorov filtering and Kalman filtering. Up to now, this equivalence has only been partially shown. In addition, we get some new recursions for smoothing and some new recursions to compute the filter weights and the covariance generating functions of the errors. The results are extended to nonstationary series. [source]


    Fast Filtering and Smoothing for Multivariate State Space Models

    JOURNAL OF TIME SERIES ANALYSIS, Issue 3 2000
    S. J. Koopman
    This paper investigates a new approach to diffuse filtering and smoothing for multivariate state space models. The standard approach treats the observations as vectors, while our approach treats each element of the observational vector individually. This strategy leads to computationally efficient methods for multivariate filtering and smoothing. Also, the treatment of the diffuse initial state vector in multivariate models is much simpler than in existing methods. The paper presents details of relevant algorithms for filtering, prediction and smoothing. Proofs are provided. Three examples of multivariate models in statistics and economics are presented for which the new approach is particularly relevant. [source]
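
The computational gain comes from treating each element of the observation vector as a separate scalar observation, which removes the matrix inversion from the update step when the measurement noise covariance is diagonal. The sketch below shows that univariate update on a made-up model; the diffuse initialization the paper also handles is omitted.

```python
# Element-by-element (univariate) Kalman update for a multivariate
# observation with diagonal noise covariance: each component is processed
# as a scalar observation, so no matrix inversion is needed.  (Sketch of
# the core idea only; the paper's diffuse initialization is omitted.)
import numpy as np

def univariate_update(x, P, y, Z, sigma2):
    # y: observation vector, Z: observation matrix, sigma2: noise variances.
    for i in range(len(y)):
        z_i = Z[i]                       # loading row for element i
        f = z_i @ P @ z_i + sigma2[i]    # scalar innovation variance
        k = P @ z_i / f                  # Kalman gain (a vector)
        x = x + k * (y[i] - z_i @ x)
        P = P - np.outer(k, z_i @ P)
    return x, P

x = np.zeros(2); P = np.eye(2)
Z = np.array([[1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
y = np.array([0.9, 2.1, 1.2]); sigma2 = np.array([0.1, 0.1, 0.1])
x, P = univariate_update(x, P, y, Z, sigma2)
print(x)
```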


    A Partially Observed Model for Micromovement of Asset Prices with Bayes Estimation via Filtering

    MATHEMATICAL FINANCE, Issue 3 2003
    Yong Zeng
    A general micromovement model that describes transactional price behavior is proposed. The model ties the sample characteristics of micromovement and macromovement in a consistent manner. An important feature of the model is that it can be transformed to a filtering problem with counting process observations. Consequently, the complete information of price and trading time is captured and then utilized in Bayes estimation via filtering for the parameters. The filtering equations are derived. A theorem on the convergence of conditional expectation of the model is proved. A consistent recursive algorithm is constructed via the Markov chain approximation method to compute the approximate posterior and then the Bayes estimates. A simplified model and its recursive algorithm are presented in detail. Simulations show that the computed Bayes estimates converge to their true values. The algorithm is applied to one month of intraday transaction prices for Microsoft and the Bayes estimates are obtained. [source]


    Common Fluorescent Sunlamps are an Inappropriate Substitute for Sunlight

    PHOTOCHEMISTRY & PHOTOBIOLOGY, Issue 3 2000
    Douglas B. Brown
    ABSTRACT Fluorescent sunlamps are commonly employed as convenient sources in photobiology experiments. The ability of Kodacel to filter photobiologically irrelevant UVC wavelengths has been described. Yet there still remains a major unaddressed issue: the over-representation of UVB in the output. The shortest terrestrial solar wavelengths reaching the surface are ~295 nm, with the 295-320 nm range comprising ~4% of the solar UV irradiance. In Kodacel-filtered sunlamps, 47% of the UV output falls in this range. Consequently, in studies designed to understand skin photobiology after solar exposure, the use of these unfiltered sunlamps may result in misleading or even incorrect conclusions. To demonstrate the importance of using an accurate representation of the UV portion of sunlight, the ability of different ultraviolet radiation (UVR) sources to induce the expression of a reporter gene was assayed. Unfiltered fluorescent sunlamps (FS lamps) induce optimal chloramphenicol acetyltransferase (CAT) activity at apparently low doses (10-20 J/cm2). Filtering the FS lamps with Kodacel raised the delivered dose for optimal CAT activity to 50-60 mJ/cm2. With the more solar-like UVA-340 lamps, somewhat lower levels of CAT activity were induced even though the apparent delivered doses were significantly greater than for either the FS or Kodacel-filtered sunlamps (KFS lamps). When DNA from parallel-treated cells was analyzed for photoproduct formation by a radioimmunoassay, it was shown that the induction of CAT activity correlated with the level of induced photoproduct formation regardless of the source employed. [source]


    File-sharing, Filtering and the Spectre of the Automated Censor

    THE POLITICAL QUARTERLY, Issue 4 2008
    MONICA HORTEN
    The European Parliament's Bono report is an example of how politicians can speak up for the interests of citizens against those of multi-national corporations. The report concerned the economic status of the cultural industries in Europe, but it has become known for one amendment, protecting citizens' rights on the Internet. The issue at stake is open access to the Internet, versus alleged copyright infringement through online file sharing. As the UK sets out its own policy proposals for copyright and the Internet, the Bono amendment invites us to consider the wider agenda for copyright enforcement, content filtering and the potential for industrial censorship. [source]


    A singular vector perspective of 4D-Var: Filtering and interpolation

    THE QUARTERLY JOURNAL OF THE ROYAL METEOROLOGICAL SOCIETY, Issue 605 2005
    Christine Johnson
    Abstract Four-dimensional variational data assimilation (4D-Var) combines the information from a time sequence of observations with the model dynamics and a background state to produce an analysis. In this paper, a new mathematical insight into the behaviour of 4D-Var is gained from an extension of concepts that are used to assess the qualitative information content of observations in satellite retrievals. It is shown that the 4D-Var analysis increments can be written as a linear combination of the singular vectors of a matrix which is a function of both the observational and the forecast model systems. This formulation is used to consider the filtering and interpolating aspects of 4D-Var using idealized case-studies based on a simple model of baroclinic instability. The results of the 4D-Var case-studies exhibit the reconstruction of the state in unobserved regions as a consequence of the interpolation of observations through time. The results also exhibit the filtering of components with small spatial scales that correspond to noise, and the filtering of structures in unobserved regions. The singular vector perspective gives a very clear view of this filtering and interpolating by the 4D-Var algorithm and shows that the appropriate specification of the a priori statistics is vital to extract the largest possible amount of useful information from the observations. Copyright © 2005 Royal Meteorological Society [source]
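
The structure described here can be illustrated with a static (3D-Var-like) example: the analysis increment can be written as a filtered sum over the singular vectors of the scaled observation operator S = R^{-1/2} H B^{1/2}, with each component damped by λ/(1+λ²); in full 4D-Var, H additionally stacks the tangent-linear model over the assimilation window. The sketch below checks this expansion against the direct normal-equations solution on small made-up matrices.

```python
# Analysis increment of a (static, 3D-Var-like) variational problem written
# as a combination of the singular vectors of S = R^{-1/2} H B^{1/2}.
# All matrices below are small made-up examples, checked against the direct
# normal-equations solution.
import numpy as np

rng = np.random.default_rng(0)
n, m = 6, 4                                  # state and observation sizes
B = np.eye(n) * 0.5                          # background error covariance
R = np.eye(m) * 0.2                          # observation error covariance
H = rng.normal(size=(m, n))                  # observation operator
d = rng.normal(size=m)                       # innovation (obs minus background)

B_half = np.linalg.cholesky(B)
R_inv_half = np.linalg.inv(np.linalg.cholesky(R))
S = R_inv_half @ H @ B_half
U, lam, Vt = np.linalg.svd(S, full_matrices=False)

# Increment as a filtered sum over singular vectors: small singular values
# (poorly observed directions) are damped by lam / (1 + lam**2).
coeffs = lam / (1 + lam**2) * (U.T @ (R_inv_half @ d))
dx_svd = B_half @ (Vt.T @ coeffs)

# Direct solution of the normal equations for comparison.
A = np.linalg.inv(B) + H.T @ np.linalg.inv(R) @ H
dx_direct = np.linalg.solve(A, H.T @ np.linalg.inv(R) @ d)
print(np.allclose(dx_svd, dx_direct))        # True
```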


    Filtering and Counting of Extended Connectivity Fingerprint Features Maximizes Compound Recall and the Structural Diversity of Hits

    CHEMICAL BIOLOGY & DRUG DESIGN, Issue 1 2009
    Ye Hu
    Extended connectivity fingerprints produce variable numbers of structural features for molecules and quantitative comparison of feature ensembles is typically carried out as a measure of molecular similarity. As an alternative way to utilize the information content of extended connectivity fingerprint features, we have introduced a compound class-directed feature filtering technique. In combination with a simple feature counting protocol, feature filtering significantly improves the performance of extended connectivity fingerprint similarity searching compared with state-of-the-art fingerprint search methods. Subsets of extended connectivity fingerprint features that are unique to active compounds are found to be responsible for high compound recall. Moreover, feature filtering and counting is shown to result in significantly higher scaffold hopping potential than data fusion or fingerprint averaging methods. Extended connectivity fingerprint feature filtering and counting represents one of the simplest similarity search methods introduced to date, yet it produces top compound recall and maximizes the scaffold diversity of hits, which is a longstanding goal of similarity searching. [source]
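
The filtering-and-counting idea can be sketched with plain sets of integer feature IDs standing in for extended connectivity fingerprint features (in practice these would come from a fingerprint generator such as RDKit's Morgan fingerprints); the compounds and features below are invented.

```python
# Schematic of class-directed feature filtering plus feature counting for
# similarity ranking.  Integer feature IDs stand in for extended
# connectivity fingerprint features; the tiny sets below are made up.
def filtered_feature_pool(active_feats, background_feats):
    """Keep features that occur in known actives but not in background
    (database) compounds - the class-directed filtering step."""
    actives = set().union(*active_feats)
    background = set().union(*background_feats)
    return actives - background

def rank_by_feature_count(pool, candidates):
    """Score each candidate by counting how many filtered features it
    contains, and rank from best to worst."""
    scores = {name: len(pool & feats) for name, feats in candidates.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

actives = [{1, 5, 9, 12}, {1, 5, 7, 12}]          # known active compounds
database = [{2, 3, 5}, {3, 7, 8}]                 # background compounds
pool = filtered_feature_pool(actives, database)   # {1, 9, 12}
candidates = {"cand_A": {1, 12, 20}, "cand_B": {5, 7, 8}, "cand_C": {9}}
print(rank_by_feature_count(pool, candidates))
```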


    Web Discovery and Filtering Based on Textual Relevance Feedback Learning

    COMPUTATIONAL INTELLIGENCE, Issue 2 2003
    Wai Lam
    We develop a new approach for Web information discovery and filtering. Our system, called WID, allows the user to specify long-term information needs by means of various topic profile specifications. An entire example page or an index page can be accepted as input for the discovery. It makes use of a simulated annealing algorithm to automatically explore new Web pages. Simulated annealing algorithms possess some favorable properties to fulfill the discovery objectives. Information retrieval techniques are adopted to evaluate the content-based relevance of each page being explored. The hyperlink information, in addition to the textual context, is considered in the relevance score evaluation of a Web page. WID allows users to provide three forms of the relevance feedback model, namely, the positive page feedback, the negative page feedback, and the positive keyword feedback. The system is domain independent and does not rely on any prior knowledge or information about the Web content. Extensive experiments have been conducted to demonstrate the effectiveness of the discovery performance achieved by WID. [source]
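
As a toy version of the exploration strategy, the sketch below runs simulated annealing over a small hyperlink graph, with a relevance score playing the role of negative energy so that less relevant pages are still visited with a temperature-dependent probability; the graph, scores and cooling schedule are all invented.

```python
# Toy simulated-annealing walk over a hyperlink graph: moves follow links,
# relevance acts as negative energy, and worse pages are still accepted
# with a temperature-dependent probability so the crawl can escape local
# optima.  Graph, scores and schedule are invented for illustration.
import math, random

links = {"start": ["a", "b"], "a": ["c", "start"], "b": ["d"],
         "c": ["d"], "d": ["a"]}
relevance = {"start": 0.2, "a": 0.5, "b": 0.3, "c": 0.9, "d": 0.6}

def discover(seed, steps=200, t0=1.0, cooling=0.98, rng_seed=0):
    random.seed(rng_seed)
    current, temp = seed, t0
    visited = {seed: relevance[seed]}
    for _ in range(steps):
        candidate = random.choice(links[current])
        delta = relevance[candidate] - relevance[current]
        # Always accept better pages; accept worse ones with prob e^(delta/T).
        if delta >= 0 or random.random() < math.exp(delta / temp):
            current = candidate
        visited[current] = relevance[current]
        temp *= cooling                      # cooling schedule
    return sorted(visited.items(), key=lambda kv: kv[1], reverse=True)

print(discover("start")[:3])   # most relevant pages found
```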


    A Windows-based interface for teaching image processing

    COMPUTER APPLICATIONS IN ENGINEERING EDUCATION, Issue 2 2010
    Melvin Ayala
    Abstract The use of image processing in research presents a challenge to scientists who are interested in its various applications but are not familiar with this area of expertise. In academia as well as in industry, fundamental concepts such as image transformations, filtering, noise removal, morphology, and convolution/deconvolution require extra effort to understand. Additionally, algorithms for reading and visualizing images on computers are not always easy for inexperienced researchers to develop. This environment has led to an adverse situation in which most students and researchers write their own image processing code for operations that are already standard in image processing, a redundant process that only exacerbates the situation. To resolve this dilemma, this article proposes a user-friendly computer interface with a dual objective: to free students and researchers from the learning time needed to understand and apply diverse imaging techniques, and to provide them with the option to enhance or reprogram such algorithms through direct access to the software code. The interface was developed to assist in understanding and performing common image processing operations through simple commands that can be executed mostly by mouse clicks. The visualization of pseudo code after each command execution makes the interface attractive, while saving time and facilitating users' learning of these practical concepts. © 2009 Wiley Periodicals, Inc. Comput Appl Eng Educ 18: 213-224, 2010; Published online in Wiley InterScience (www.interscience.wiley.com); DOI 10.1002/cae.20171 [source]


    The Readability of Path-Preserving Clusterings of Graphs

    COMPUTER GRAPHICS FORUM, Issue 3 2010
    Daniel Archambault
    Abstract Graph visualization systems often exploit opaque metanodes to reduce visual clutter and improve the readability of large graphs. This filtering can be done in a path-preserving way based on attribute values associated with the nodes of the graph. Despite extensive use of these representations, as far as we know, no formal experimentation exists to evaluate if they improve the readability of graphs. In this paper, we present the results of a user study that formally evaluates how such representations affect the readability of graphs. We also explore the effect of graph size and connectivity in terms of this primary research question. Overall, for our tasks, we did not find a significant difference when this clustering is used. However, if the graph is highly connected, these clusterings can improve performance. Also, if the graph is large enough and can be simplified into a few metanodes, benefits in performance on global tasks are realized. Under these same conditions, however, performance of local attribute tasks may be reduced. [source]


    TouchTone: Interactive Local Image Adjustment Using Point-and-Swipe

    COMPUTER GRAPHICS FORUM, Issue 2 2010
    Chia-Kai Liang
    Recent proliferation of camera phones, photo sharing and social network services has significantly changed how we process our photos. Instead of going through the traditional download-edit-share cycle using desktop editors, an increasing number of photos are taken with camera phones and published through cellular networks. The immediacy of the sharing process means that on-device image editing, if needed, should be quick and intuitive. However, due to the limited computational resources and vastly different user interaction model on small screens, most traditional local selection methods can not be directly adapted to mobile devices. To address this issue, we present TouchTone, a new method for edge-aware image adjustment using simple finger gestures. Our method enables users to select regions within the image and adjust their corresponding photographic attributes simultaneously through a simple point-and-swipe interaction. To enable fast interaction, we develop a memory- and computation-efficient algorithm which samples a collection of 1D paths from the image, computes the adjustment solution along these paths, and interpolates the solutions to entire image through bilateral filtering. Our system is intuitive to use, and can support several local editing tasks, such as brightness, contrast, and color balance adjustments, within a minute on a mobile device. [source]


    Hierarchical Vortex Regions in Swirling Flow

    COMPUTER GRAPHICS FORUM, Issue 3 2009
    Christoph Petz
    Abstract We propose a new criterion to characterize hierarchical two-dimensional vortex regions induced by swirling motion. Central to the definition are closed loops that intersect the flow field at a constant angle. The union of loops belonging to the same area of swirling motion defines a vortex region. These regions are disjunct but may be nested, thus introducing a spatial hierarchy of vortex regions. We present a parameter free algorithm for the identification of these regions. Since they are not restricted to star- or convex-shaped geometries, we are able to identify also intricate regions, e.g., of elongated vortices. Computing an integrated value for each loop and mapping these values to a vortex region, introduces new ways for visualizing or filtering the vortex regions. Exemplary, an application based on the Rankine vortex model is presented. We apply our method to several CFD datasets and compare our results to existing approaches. [source]


    Ptex: Per-Face Texture Mapping for Production Rendering

    COMPUTER GRAPHICS FORUM, Issue 4 2008
    Brent Burley
    Explicit parameterization of subdivision surfaces for texture mapping adds significant cost and complexity to film production. Most parameterization methods currently in use require setup effort, and none are completely general. We propose a new texture mapping method for Catmull-Clark subdivision surfaces that requires no explicit parameterization. Our method, Ptex, stores a separate texture per quad face of the subdivision control mesh, along with a novel per-face adjacency map, in a single texture file per surface. Ptex uses the adjacency data to perform seamless anisotropic filtering of multi-resolution textures across surfaces of arbitrary topology. Just as importantly, Ptex requires no manual setup and scales to models of arbitrary mesh complexity and texture detail. Ptex has been successfully used to texture all of the models in an animated theatrical short and is currently being applied to an entire animated feature. Ptex has eliminated UV assignment from our studio and significantly increased the efficiency of our pipeline. [source]


    Interactive Visualization with Programmable Graphics Hardware

    COMPUTER GRAPHICS FORUM, Issue 3 2002
    Thomas Ertl
    One of the main scientific goals of visualization is the development of algorithms and appropriate data models which facilitate interactive visual analysis and direct manipulation of the increasingly large data sets which result from simulations running on massive parallel computer systems, from measurements employing fast high-resolution sensors, or from large databases and hierarchical information spaces. This task can only be achieved with the optimization of all stages of the visualization pipeline: filtering, compression, and feature extraction of the raw data sets, adaptive visualization mappings which allow the users to choose between speed and accuracy, and exploiting new graphics hardware features for fast and high-quality rendering. The recent introduction of advanced programmability in widely available graphics hardware has already led to impressive progress in the area of volume visualization. However, besides the acceleration of the final rendering, flexible graphics hardware is increasingly being used also for the mapping and filtering stages of the visualization pipeline, thus giving rise to new levels of interactivity in visualization applications. The talk will present recent results of applying programmable graphics hardware in various visualization algorithms covering volume data, flow data, terrains, NPR rendering, and distributed and remote applications. [source]


    Are Points the Better Graphics Primitives?

    COMPUTER GRAPHICS FORUM, Issue 3 2001
    Markus Gross
    Since the early days of graphics the computer based representation of three-dimensional geometry has been one of the core research fields. Today, various sophisticated geometric modelling techniques including NURBS or implicit surfaces allow the creation of 3D graphics models with increasingly complex shape. In spite of these methods the triangle has survived over decades as the king of graphics primitives meeting the right balance between descriptive power and computational burden. As a consequence, today's consumer graphics hardware is heavily tailored for high performance triangle processing. In addition, a new generation of geometry processing methods including hierarchical representations, geometric filtering, or feature detection fosters the concept of triangle meshes for graphics modelling. Unlike triangles, points have amazingly been neglected as a graphics primitive. Although being included in APIs since many years, it is only recently that point samples experience a renaissance in computer graphics. Conceptually, points provide a mere discretization of geometry without explicit storage of topology. Thus, point samples reduce the representation to the essentials needed for rendering and enable us to generate highly optimized object representations. Although the loss of topology poses great challenges for graphics processing, the latest generation of algorithms features high performance rendering, point/pixel shading, anisotropic texture mapping, and advanced signal processing of point sampled geometry. This talk will give an overview of how recent research results in the processing of triangles and points are changing our traditional way of thinking of surface representations in computer graphics - and will discuss the question: Are Points the Better Graphics Primitives? [source]


    Real-Time OD Estimation Using Automatic Vehicle Identification and Traffic Count Data

    COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, Issue 1 2002
    Michael P. Dixon
    A key input to many advanced traffic management operations strategies is the origin-destination (OD) matrix. To examine the possibility of estimating OD matrices in real time, two constrained OD estimators, based on generalized least squares and Kalman filtering, were developed and tested. A one-at-a-time processing method was introduced to provide an efficient organized framework for incorporating observations from multiple data sources in real time. The estimators were tested under different conditions based on the type of prior OD information available, the type of assignment available, and the type of link volume model used. The performance of the Kalman filter estimators was also compared to that of the generalized least squares estimator to provide insight regarding their performance characteristics relative to one another for given scenarios. Automatic vehicle identification (AVI) tag counts were used so that observed and estimated OD parameters could be compared. While the approach was motivated using AVI data, the methodology can be generalized to any situation where traffic counts are available and origin volumes can be estimated reliably. The primary means by which AVI data was utilized was through the incorporation of prior observed OD information as measurements, the inclusion of a deterministic link volume component that makes use of OD data extracted from the latest time interval from which all trips have been completed, and through the use of link choice proportions estimated based on link travel time data. It was found that utilizing prior observed OD data along with link counts improves estimator accuracy relative to OD estimation based exclusively on link counts. [source]
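
A minimal version of the least-squares side of the problem: estimate nonnegative OD flows x from link counts y ≈ A x, where A holds assumed link-use proportions. The sketch below uses SciPy's nonnegative least squares; a real implementation would weight by count variances (GLS), fold in the prior observed OD data and use the Kalman-filter form described above, and all numbers are made up.

```python
# Minimal OD estimation from link counts: solve y ~= A x for nonnegative
# OD flows x, where A holds the (assumed known) proportion of each OD
# flow using each counted link.  Numbers are made up for illustration.
import numpy as np
from scipy.optimize import nnls

# 3 OD pairs, 4 counted links: A[i, j] = share of OD pair j on link i.
A = np.array([[1.0, 0.0, 0.5],
              [0.0, 1.0, 0.5],
              [1.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])
x_true = np.array([120.0, 80.0, 60.0])        # "true" OD flows (veh/h)
y = A @ x_true + np.random.default_rng(2).normal(0, 5, 4)  # noisy counts

x_hat, residual = nnls(A, y)                  # nonnegative least squares
print(np.round(x_hat, 1))
```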


    A workflow portal supporting multi-language interoperation and optimization

    CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 12 2007
    Lican Huang
    Abstract In this paper we present a workflow portal for Grid applications, which supports different workflow languages and workflow optimization. We present an XSLT converter that converts from one workflow language to another and enables the interoperation between different workflow languages. We discuss strategies for choosing the optimal service from several semantically equivalent Web services in a Grid application. The dynamic selection of Web services involves discovering a set of semantically equivalent services by filtering the available services based on metadata, and selecting an optimal service based on real-time data and/or historical data recorded during prior executions. Finally, we describe the framework and implementation of the workflow portal which aggregates different components of the project using Java portlets. Copyright © 2007 John Wiley & Sons, Ltd. [source]


    Urban disaster recovery: a measurement framework and its application to the 1995 Kobe earthquake

    DISASTERS, Issue 2 2010
    Stephanie E. Chang
    This paper provides a framework for assessing empirical patterns of urban disaster recovery through the use of statistical indicators. Such a framework is needed to develop systematic knowledge on how cities recover from disasters. The proposed framework addresses such issues as defining recovery, filtering out exogenous influences unrelated to the disaster, and making comparisons across disparate areas or events. It is applied to document how Kobe City, Japan, recovered from the catastrophic 1995 earthquake. Findings indicate that while aggregate population regained pre-disaster levels in ten years, population had shifted away from the older urban core. Economic recovery was characterised by a three to four year temporary boost in reconstruction activities, followed by settlement at a level some ten per cent below pre-disaster levels. Other long-term effects included substantial losses of port activity and sectoral shifts toward services and large businesses. These patterns of change and disparity generally accelerated pre-disaster trends. [source]


    On the reliability of long-period response spectral ordinates from digital accelerograms

    EARTHQUAKE ENGINEERING AND STRUCTURAL DYNAMICS, Issue 5 2008
    Roberto Paolucci
    Abstract Using records from co-located broadband and digital strong motion (SM) instruments, it is first shown that the displacement waveforms obtained by double integration of the accelerogram need not be free of unrealistic baseline drift to yield reliable spectral ordinates up to at least 10 s. Secondly, to provide objective criteria for selecting reliable digital SM records for ground motion predictions at long periods, a set of synthetic accelerograms contaminated by random long-period noise has been used, and the difference between the original accelerograms and the spurious ones in terms of response spectra has been quantified by introducing a noise index that can be easily calculated based on the velocity waveform of the record. The results of this study suggest that high-pass filtering the digital acceleration record from a cutoff period selected to suppress baseline drifts on the displacement waveform appears to be in most cases too conservative and unduly depletes reliable information on long-period spectral ordinates. Copyright © 2007 John Wiley & Sons, Ltd. [source]
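
The processing step under discussion, high-pass filtering a digital acceleration record before double integration to displacement, can be sketched as below with a zero-phase Butterworth filter on a synthetic record; the sampling rate, drift and 20 s cutoff period are arbitrary choices, not the paper's recommendations.

```python
# Zero-phase Butterworth high-pass filtering of an acceleration record
# followed by double integration to displacement.  The synthetic record,
# sampling rate and 20 s cutoff period below are arbitrary.
import numpy as np
from scipy.signal import butter, sosfiltfilt

fs = 200.0                                   # samples per second
t = np.arange(0, 60, 1 / fs)
acc = np.sin(2 * np.pi * 1.0 * t) * np.exp(-t / 10)   # synthetic accelerogram
acc += 1e-3 * t                              # spurious baseline drift

def highpass(x, fs, cutoff_period=20.0, order=4):
    sos = butter(order, (1.0 / cutoff_period) / (fs / 2),
                 btype="highpass", output="sos")
    return sosfiltfilt(sos, x)               # zero-phase filtering

def integrate(x, fs):
    return np.cumsum(x) / fs                 # simple cumulative integration

acc_f = highpass(acc, fs)
disp_raw = integrate(integrate(acc, fs), fs)       # drifts badly
disp_filt = integrate(integrate(acc_f, fs), fs)    # drift suppressed
print(f"final displacement, raw: {disp_raw[-1]:.2f}, "
      f"filtered: {disp_filt[-1]:.4f}")
```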