Processing Tools (processing + tool)

Distribution by Scientific Domains


Selected Abstracts


Surface wavelets: a multiresolution signal processing tool for 3D computational modelling

INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN ENGINEERING, Issue 3 2001
Kevin Amaratunga
Abstract In this paper, we provide an introduction to wavelet representations for complex surfaces (surface wavelets), with the goal of demonstrating their potential for 3D scientific and engineering computing applications. Surface wavelets were originally developed for representing geometric objects in a multiresolution format in computer graphics. These wavelets share all of the major advantages of conventional wavelets, in that they provide an analysis tool for studying data, functions and operators at different scales. However, unlike conventional wavelets, which are restricted to uniform grids, surface wavelets have the power to perform signal processing operations on complex meshes, such as those encountered in finite element modelling. This motivates the study of surface wavelets as an efficient representation for the modelling and simulation of physical processes. We show how surface wavelets can be applied to partial differential equations, stated either in integral form or in differential form. We analyse and implement the wavelet approach for a model 3D potential problem using a surface wavelet basis with linear interpolating properties. We show both theoretically and experimentally that an O(h) convergence rate, h being the mesh size, can be obtained by retaining only O(N(log N)^{7/2}) entries in the discrete operator matrix, where N is the number of unknowns. The principles described here may also be extended to volumetric discretizations. Copyright © 2001 John Wiley & Sons, Ltd. [source]
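
As a rough illustration of the multiresolution machinery behind this approach, the sketch below applies one level of a linear interpolating wavelet transform (lifting scheme) to a 1D signal on a uniform grid. This is only the uniform-grid analogue of the surface wavelets described in the abstract, not the authors' mesh-based construction; the periodic boundary handling and the `lifting_forward` helper are illustrative assumptions.

```python
import numpy as np

def lifting_forward(signal):
    """One level of a linear interpolating wavelet transform (lifting scheme).

    1D uniform-grid analogue of the linear interpolating surface wavelet basis
    mentioned in the abstract; boundary handling is simplified to periodic.
    """
    s = np.asarray(signal, dtype=float)
    even, odd = s[0::2], s[1::2]
    # Predict: each odd sample is predicted as the average of its even
    # neighbours; the residual becomes the detail (wavelet) coefficient.
    pred = 0.5 * (even + np.roll(even, -1))
    detail = odd - pred
    # Update: adjust the coarse samples so the signal mean is preserved.
    coarse = even + 0.25 * (detail + np.roll(detail, 1))
    return coarse, detail

if __name__ == "__main__":
    x = np.sin(np.linspace(0, 2 * np.pi, 64))
    c, d = lifting_forward(x)
    # Smooth signals yield small detail coefficients, which is what makes
    # wavelet-compressed operator matrices sparse.
    print(len(c), len(d), float(np.max(np.abs(d))))
```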


A review of catalytic approaches to waste minimization: case study,liquid-phase catalytic treatment of chlorophenols

JOURNAL OF CHEMICAL TECHNOLOGY & BIOTECHNOLOGY, Issue 11 2005
Mark A Keane
Abstract Effective waste management must address waste reduction, reuse, recovery/recycle and, as the least progressive option, waste treatment. Catalysis can serve as an integral green processing tool, ensuring lower operating pressures/temperatures with a reduction in energy requirements while providing alternative cleaner synthesis routes and facilitating waste conversion to reusable material. The case study chosen to illustrate the role that catalysis has to play in waste minimization deals with the conversion of toxic chlorophenols in wastewater. The presence of chloro-organic emissions is of increasing concern with mounting evidence of adverse ecological and public health impacts. A critical overview of the existing treatment technologies is provided with an analysis of the available literature on catalytic dechlorination. The efficacy of Pd/Al2O3 to promote the hydrogen-mediated dechlorination of mono- and dichlorophenols is demonstrated, taking account of both the physical and chemical contributions in this three-phase (solid catalyst and liquid/gaseous reactants) system. Hydrodechlorination activity and selectivity trends are discussed in terms of chloro-isomer structure, the influence of temperature and of base (NaOH) addition is examined, and the feasibility of catalyst reuse is addressed. Copyright © 2005 Society of Chemical Industry [source]
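
The selectivity trends mentioned in the abstract can be pictured with a toy kinetic model. The sketch below evaluates the standard analytic solution for consecutive first-order reactions (A -> B -> C), standing in for stepwise dichlorophenol -> monochlorophenol -> phenol hydrodechlorination; the rate constants and time scale are invented placeholders, not values from the study.

```python
import numpy as np

def consecutive_first_order(a0, k1, k2, t):
    """Concentrations for an idealised stepwise scheme A -> B -> C,
    e.g. dichlorophenol -> monochlorophenol -> phenol.
    Requires k1 != k2; k1 and k2 here are illustrative, not fitted values.
    """
    a = a0 * np.exp(-k1 * t)
    b = a0 * k1 / (k2 - k1) * (np.exp(-k1 * t) - np.exp(-k2 * t))
    c = a0 - a - b          # mass balance gives the fully dechlorinated product
    return a, b, c

t = np.linspace(0, 60, 7)   # arbitrary reaction times
a, b, c = consecutive_first_order(1.0, 0.10, 0.05, t)
# The intermediate (partially dechlorinated) species rises and then decays,
# which is the kind of selectivity behaviour the abstract discusses.
print(np.round(b, 3))
```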


Automatic mining of the literature to generate new hypotheses for the possible link between periodontitis and atherosclerosis: lipopolysaccharide as a case study

JOURNAL OF CLINICAL PERIODONTOLOGY, Issue 12 2007
Kristina M. Hettne
Abstract Aim: The aim of the current report was to generate and explore new hypotheses about how atherosclerosis and periodontitis could be linked in a pathophysiological sense. Material and Methods: Two different biomedical informatics techniques were used: an association-based technique that generated a ranked list of genes associated with the diseases, and a natural language processing tool that extracted the relationships between the retrieved genes and lipopolysaccharide (LPS). Results: This combined approach of association-based and natural language processing-based literature mining identified a hit list of 16 candidate genes, with PON1 as the primary candidate. Conclusions: Further study of the literature prompted the hypothesis that PON1 might connect periodontitis with atherosclerosis in both an LPS-dependent and a non-LPS-dependent manner. Furthermore, the resulting genes not only confirmed already known associations between the two diseases, but also included genes or gene products that have so far been investigated separately in the two disease states, as well as genes or gene products previously reported to be involved in atherosclerosis. These findings remain to be investigated through clinical studies. This example of multidisciplinary research illustrates how collaborative efforts of investigators from different fields of expertise can result in the discovery of new hypotheses. [source]
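
A hedged sketch of the association-based half of such a pipeline: rank candidate genes by how often they co-occur with both disease terms across a set of abstracts. The toy abstracts, gene list, and scoring rule below are invented for illustration and do not reproduce the tools used in the study.

```python
from collections import Counter

# Invented placeholder corpus; a real run would use retrieved MEDLINE abstracts.
abstracts = [
    {"text": "PON1 activity is reduced in periodontitis and atherosclerosis patients"},
    {"text": "LPS induces TLR4 signalling in atherosclerosis models"},
    {"text": "PON1 protects LDL from oxidation; periodontitis raises systemic LPS"},
]
genes = {"PON1", "TLR4", "CRP"}          # hypothetical candidate gene symbols
diseases = ("periodontitis", "atherosclerosis")

scores = Counter()
for rec in abstracts:
    tokens = set(rec["text"].replace(";", " ").replace(",", " ").split())
    weight = sum(d in rec["text"].lower() for d in diseases)
    for gene in genes & tokens:
        # A gene scores more when the abstract mentions both diseases.
        scores[gene] += weight

print(scores.most_common())   # ranked list of candidate genes
```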


Electronic Tongues Employing Electrochemical Sensors

ELECTROANALYSIS, Issue 14 2010
Manel del Valle
Abstract This review presents recent advances concerning work with electronic tongues employing electroanalytical sensors. This new concept in the electroanalytical sensor field entails the use of chemical sensor arrays coupled with chemometric processing tools as a means to improve sensor performance. The review is organized according to the electroanalytical technique used for transduction, namely potentiometry, voltammetry/amperometry or electrochemical impedance. The significant use of biosensors, mainly enzyme-based ones, is also presented. Salient applications of electrochemical electronic tongues to real problem solving are discussed. [source]
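
A minimal chemometric sketch of the kind of processing such sensor arrays rely on: principal component analysis (via SVD) of a sensor-array response matrix. The synthetic data and the single hidden analyte level are illustrative assumptions; a real electronic tongue would supply measured voltammetric or potentiometric responses.

```python
import numpy as np

# One row per sample, one column per (bio)sensor channel. Synthetic data:
# each channel responds with a different gain to a hidden analyte level.
rng = np.random.default_rng(0)
n_samples, n_sensors = 30, 8
concentration = rng.uniform(0, 1, n_samples)      # hidden analyte level
sensitivity = rng.uniform(0.5, 2.0, n_sensors)    # per-channel gain
X = np.outer(concentration, sensitivity) + 0.05 * rng.standard_normal((n_samples, n_sensors))

Xc = X - X.mean(axis=0)                 # mean-centre each channel
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * S                          # sample scores in PC space
explained = S**2 / np.sum(S**2)
print("variance explained by PC1:", round(float(explained[0]), 3))
```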


Feature removal and isolation in potential field data

GEOPHYSICAL JOURNAL INTERNATIONAL, Issue 3 2004
F. Boschetti
SUMMARY With the aim of designing signal processing tools that act locally in space upon specific features of a signal, we compare two algorithms to remove or isolate individual anomalies in potential field profiles. The first method, based on multiscale edge analysis, leaves other features in the signal relatively untouched. A second method, based on iterative lateral continuation and subtraction of anomalies, accounts for the influence of adjacent anomalies on one another. This allows a potential field profile to be transformed into a number of single anomaly signals. Each single anomaly can then be individually processed, which considerably simplifies applications such as inversion and signal processing. [source]
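
A hedged sketch in the spirit of the second approach: iteratively fit and subtract the currently dominant anomaly from the profile. A simple Gaussian anomaly model and SciPy's curve_fit stand in for the lateral continuation step described in the abstract, so this is an illustrative simplification rather than the authors' algorithm.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, amp, x0, width):
    return amp * np.exp(-((x - x0) / width) ** 2)

def isolate_anomalies(x, profile, n_anomalies=2):
    """Greedy fit-and-subtract of the dominant anomaly, repeated.
    Adjacent anomalies still influence each fit, so the separation is only
    approximate, which is exactly the issue the abstract's method addresses."""
    residual = profile.copy()
    isolated = []
    for _ in range(n_anomalies):
        i = int(np.argmax(np.abs(residual)))
        p0 = (residual[i], x[i], (x[-1] - x[0]) / 10.0)
        params, _ = curve_fit(gaussian, x, residual, p0=p0)
        anomaly = gaussian(x, *params)
        isolated.append(anomaly)
        residual = residual - anomaly
    return isolated, residual

x = np.linspace(0, 100, 400)
profile = gaussian(x, 5.0, 30.0, 6.0) + gaussian(x, 3.0, 70.0, 10.0)
parts, rest = isolate_anomalies(x, profile)
print([round(float(np.max(p)), 2) for p in parts])   # one peak per isolated anomaly
```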


Automatic CAD model topology generation

INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN FLUIDS, Issue 8 2006
Paresh S. Patel
Abstract Computer aided design (CAD) models often need to be processed, owing to data translation issues and the requirements of downstream applications such as computational field simulation, rapid prototyping, computer graphics, computational manufacturing, and real-time rendering, before they can be used. Automatic CAD model processing tools can significantly reduce the time and cost associated with manual processing. A topology generation algorithm, commonly known as CAD repairing/healing, is presented that detects commonly found geometrical and topological issues such as cracks, gaps, overlaps, intersections, T-connections, and missing or invalid topology in the model, processes them, and builds correct topological information. The algorithm is based on iterative vertex pair contraction and expansion operations, called stitching and filling respectively, to process the model accurately. Moreover, the topology generation algorithm can process manifold as well as non-manifold models, which makes the procedure more general and flexible. In addition, a spatial data structure is used for searching and neighbour finding so that large models can be processed efficiently. In this way, the combination of generality, accuracy, and efficiency of this algorithm seems to be a significant improvement over existing techniques. Results are presented showing the effectiveness of the algorithm in processing two- and three-dimensional configurations. Copyright © 2006 John Wiley & Sons, Ltd. [source]
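
A minimal sketch of the stitching idea, assuming a uniform spatial hash grid for neighbour finding: vertices closer than a tolerance are contracted to a single representative, which is one way small cracks and gaps get closed. It omits the filling step and the non-manifold bookkeeping of the actual algorithm.

```python
def stitch_vertices(vertices, tolerance=1e-6):
    """Contract vertex pairs closer than `tolerance` using a uniform spatial
    hash grid, avoiding an O(n^2) all-pairs search on large models.
    Returns the kept vertices and a remap from old index to kept index."""
    cell = tolerance if tolerance > 0 else 1e-12
    grid = {}                       # hash cell -> indices of kept vertices
    kept, remap = [], {}
    for i, (x, y, z) in enumerate(vertices):
        key = (int(x // cell), int(y // cell), int(z // cell))
        match = None
        # Scan the 27 neighbouring cells for an existing vertex within tolerance.
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                for dz in (-1, 0, 1):
                    for j in grid.get((key[0] + dx, key[1] + dy, key[2] + dz), ()):
                        px, py, pz = kept[j]
                        if (x - px) ** 2 + (y - py) ** 2 + (z - pz) ** 2 <= tolerance ** 2:
                            match = j
        if match is None:
            match = len(kept)
            kept.append((x, y, z))
            grid.setdefault(key, []).append(match)
        remap[i] = match
    return kept, remap

verts = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (1.0 + 1e-7, 0.0, 0.0)]
kept, remap = stitch_vertices(verts, tolerance=1e-6)
print(len(kept), remap)   # the two nearly coincident vertices collapse to one
```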


Analyzing Spatial Drivers in Quantitative Conflict Studies: The Potential and Challenges of Geographic Information Systems

INTERNATIONAL STUDIES REVIEW, Issue 3 2009
Nathalie Stephenne
The objective of this literature review is to understand where Geographic Information Systems (GIS) can be useful for addressing security issues and how they have been used to date. While the geographic drivers of territorial conflicts have been extensively described in a number of political studies, quantitative analysis of these drivers is quite new. This study traces an evolution from conceptual research to quantitative development. It then discusses the advantages and challenges of applying new geographic techniques to analyze spatial drivers of conflict. We identify the main spatial components in conflict and security, the existing types of information/data, and the quantitative methods used. We describe the spatial component of security by looking at: (i) the main sociopolitical concepts linked to territory, (ii) the kinds of geographic concepts linked to territory, (iii) the measures used to describe such geographic concepts; and (iv) the issues raised in any attempt to integrate geographic concepts into a GIS. We conclude that GIS tools can be useful in the analysis of civil disputes, particularly where subnational-level data exist. This paper shows that spatial processing tools in GIS allow us to represent some spatial components and to address new issues such as the fuzzy complexity of border permeability. [source]
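
One concrete example of the spatial processing mentioned in the abstract, sketched with the Shapely library: buffer a border line, count conflict events inside the corridor, and weight events by distance as a crude nod to fuzzy border permeability. The coordinates, buffer width, and weighting rule are invented for illustration; a real analysis would use projected units and subnational event data.

```python
from shapely.geometry import LineString, Point

# Hypothetical border and event locations (arbitrary planar coordinates).
border = LineString([(0.0, 0.0), (2.0, 1.0), (4.0, 1.5)])
events = [Point(1.0, 0.9), Point(3.0, 3.0), Point(2.5, 1.1)]

zone = border.buffer(0.5)                         # crisp 0.5-unit corridor
near_border = [p for p in events if zone.contains(p)]
print(len(near_border), "of", len(events), "events fall within the border zone")

# A rough fuzzy alternative: weight each event by its distance to the border
# instead of applying a hard in/out cut.
weights = [max(0.0, 1.0 - p.distance(border) / 0.5) for p in events]
print([round(w, 2) for w in weights])
```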


Advances in proteomics data analysis and display using an accurate mass and time tag approach

MASS SPECTROMETRY REVIEWS, Issue 3 2006
Jennifer S.D. Zimmer
Abstract Proteomics has recently demonstrated utility for increasing the understanding of cellular processes on the molecular level as a component of systems biology approaches and for identifying potential biomarkers of various disease states. The large amount of data generated by utilizing high efficiency (e.g., chromatographic) separations coupled with high mass accuracy mass spectrometry for high-throughput proteomics analyses presents challenges related to data processing, analysis, and display. This review focuses on recent advances in nanoLC-FTICR-MS-based proteomics approaches and the accompanying data processing tools that have been developed to display and interpret the large volumes of data being produced. © 2006 Wiley Periodicals, Inc., Mass Spec Rev 25:450-482, 2006 [source]
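
A minimal sketch of the accurate mass and time (AMT) tag matching step that underlies this approach: a detected LC-MS feature is assigned to a database peptide when its mass agrees within a ppm tolerance and its normalised elution time (NET) within an NET tolerance. The tag database, feature values, and tolerances below are invented placeholders, not data from the review.

```python
# Hypothetical AMT tag database: peptide identity, monoisotopic mass, NET.
amt_tags = [
    {"peptide": "PEPTIDEA", "mass": 1234.5678, "net": 0.42},
    {"peptide": "PEPTIDEB", "mass": 1500.7321, "net": 0.63},
]

# Hypothetical detected LC-MS features (mass, normalised elution time).
features = [
    {"mass": 1234.5702, "net": 0.425},   # small mass error, should match
    {"mass": 1500.9000, "net": 0.630},   # mass error too large, no match
]

PPM_TOL, NET_TOL = 5.0, 0.02             # illustrative tolerances

def match_feature(feature):
    """Return the first tag whose mass and NET both fall within tolerance."""
    for tag in amt_tags:
        ppm_error = abs(feature["mass"] - tag["mass"]) / tag["mass"] * 1e6
        if ppm_error <= PPM_TOL and abs(feature["net"] - tag["net"]) <= NET_TOL:
            return tag["peptide"], round(ppm_error, 2)
    return None

for f in features:
    print(f["mass"], "->", match_feature(f))
```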