Interpolation
Selected Abstracts

Gradient-based Interpolation and Sampling for Real-time Rendering of Inhomogeneous, Single-scattering Media
COMPUTER GRAPHICS FORUM, Issue 7 2008. Zhong Ren
Abstract: We present a real-time rendering algorithm for inhomogeneous, single scattering media, where all-frequency shading effects such as glows, light shafts, and volumetric shadows can all be captured. The algorithm first computes source radiance at a small number of sample points in the medium, then interpolates these values at other points in the volume using a gradient-based scheme that is efficiently applied by sample splatting. The sample points are dynamically determined based on a recursive sample splitting procedure that adapts the number and locations of sample points for accurate and efficient reproduction of shading variations in the medium. The entire pipeline can be easily implemented on the GPU to achieve real-time performance for dynamic lighting and scenes. Rendering results of our method are shown to be comparable to those from ray tracing. [source]

Generating Consistent Motion Transition via Decoupled Framespace Interpolation
COMPUTER GRAPHICS FORUM, Issue 3 2000. G. Ashraf
Abstract: The framespace interpolation algorithm abstracts motion sequences as 1D signals, and interpolates between them to create higher dimension signals, with weights drawn from a user-specified curve in a bounded region. We reformulate the algorithm to achieve motion-state based transition via dynamic warping of framespaces, and automatic transition timing via framespace frequency interpolation. Basis motions displaying diverse coordination configurations between upper and lower body halves cannot be consistently corresponded at a macro level. We address this problem here through decoupled blending of these halves to achieve true consistency, and eliminate accumulated phase differences via cosine phase warp functions. This generalization enables interpolation of motions with diverse coordinations between the upper and lower bodies. [source]

Interpolation processes using multivariate geostatistics for mapping of climatological precipitation mean in the Sannio Mountains (southern Italy)
EARTH SURFACE PROCESSES AND LANDFORMS, Issue 3 2005. Nazzareno Diodato
Abstract: The spatial variability of precipitation has often been a topic of research, since accurate modelling of precipitation is a crucial condition for obtaining reliable results in hydrology and geomorphology. In mountainous areas, the sparsity of the measurement networks makes an accurate and reliable spatialization of rainfall amounts at the local scale difficult. The purpose of this paper is to show how the use of a digital elevation model can improve interpolation processes at the subregional scale for mapping the mean annual and monthly precipitation from rainfall observations (40 years) recorded in a region of 1400 km² in southern Italy. Besides linear regression of precipitation against elevation, two methods of interpolation are applied: inverse squared distance and ordinary cokriging. Cross-validation indicates that the inverse distance interpolation, which ignores the information on elevation, yields the largest prediction errors. Smaller prediction errors are produced by linear regression and ordinary cokriging. However, the results seem to favour the multivariate geostatistical method including auxiliary information (related to elevation). We conclude that ordinary cokriging is a very flexible and robust interpolation method because it can take into account several properties of the landscape; it should therefore be applicable in other mountainous regions, especially where precipitation is an important geomorphological factor. Copyright © 2005 John Wiley & Sons, Ltd. [source]
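To make the baseline concrete: a minimal sketch of inverse-squared-distance interpolation, the elevation-blind method the abstract compares against cokriging. The gauge coordinates and precipitation values are invented; this is not the authors' implementation.

```python
# Inverse-distance weighting with power p = 2 (the "inverse squared
# distance" baseline). Stations, values, and targets are hypothetical.
import numpy as np

def idw(stations, values, targets, power=2.0, eps=1e-12):
    """Interpolate values at target points from scattered station data."""
    d = np.linalg.norm(targets[:, None, :] - stations[None, :, :], axis=2)
    w = 1.0 / (d**power + eps)           # inverse-distance weights
    return (w @ values) / w.sum(axis=1)  # weighted average per target

# Example: annual precipitation (mm) at four rain gauges (x, y in km)
stations = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
precip = np.array([800.0, 950.0, 1100.0, 1020.0])
grid = np.array([[5.0, 5.0], [2.0, 8.0]])
print(idw(stations, precip, grid))
```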
Mechanisms of Visual Object Recognition in Infancy: Five-Month-Olds Generalize Beyond the Interpolation of Familiar Views
INFANCY, Issue 1 2007. Clay Mash
Abstract: This work examined predictions of the interpolation of familiar views (IFV) account of object recognition performance in 5-month-olds. Infants were familiarized to an object either from a single viewpoint or from multiple viewpoints varying in rotation around a single axis. Object recognition was then tested in both conditions with the same object rotated around a novel axis. Infants in the multiple-views condition recognized the object, whereas infants in the single-view condition provided no evidence for recognition. Under the same 2 familiarization conditions, infants in a 2nd experiment treated as novel an object that differed in only 1 component from the familiar object. Infants' object recognition is enhanced by experience with multiple views, even when that experience is around an orthogonal axis of rotation, and infants are sensitive to even subtle shape differences between components of similar objects. In general, infants' performance does not accord with the predictions of the IFV model of object recognition. These findings motivate the extension of future research and theory beyond the limits of strictly interpolative mechanisms. [source]

Importance of interpolation when constructing double-bootstrap confidence intervals
JOURNAL OF THE ROYAL STATISTICAL SOCIETY: SERIES B (STATISTICAL METHODOLOGY), Issue 2 2000. Peter Hall
Abstract: We show that, in the context of double-bootstrap confidence intervals, linear interpolation at the second level of the double bootstrap can reduce the simulation error component of coverage error by an order of magnitude. Intervals that are indistinguishable in terms of coverage error from theoretical, infinite-simulation double-bootstrap confidence intervals may be obtained at substantially less computational expense than by using the standard Monte Carlo approximation method. The intervals retain the simplicity of uniform bootstrap sampling and require no special analysis or computational techniques. Interpolation at the first level of the double bootstrap is shown to have a relatively minor effect on the simulation error. [source]
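A minimal sketch of the device the abstract analyses: calibrating a one-sided percentile interval by double bootstrap, with linear interpolation between order statistics at the second level in place of the raw empirical step function. The Gaussian toy data and resample counts are invented; the paper's theory, not this code, establishes the error reduction.

```python
# Double-bootstrap calibration of a one-sided percentile interval for a
# mean, interpolating the second-level empirical distribution.
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(loc=5.0, scale=2.0, size=30)
B1, B2, target = 200, 200, 0.95

def percentile_upper(sample, level, B, rng):
    means = np.sort([rng.choice(sample, sample.size, replace=True).mean()
                     for _ in range(B)])
    # Interpolate between adjacent order statistics instead of taking the
    # nearest one -- the step the paper shows cuts simulation error.
    return np.interp(level, (np.arange(B) + 0.5) / B, means)

# Second level: for each first-level resample, find the nominal level at
# which the original estimate mean(x) is just covered.
u = np.empty(B1)
for b in range(B1):
    xb = rng.choice(x, x.size, replace=True)
    means2 = np.sort([rng.choice(xb, xb.size, replace=True).mean()
                      for _ in range(B2)])
    # Interpolated empirical d.f. of second-level means at mean(x)
    u[b] = np.interp(x.mean(), means2, (np.arange(B2) + 0.5) / B2)

lam = np.quantile(u, target)     # calibrated nominal level
print("calibrated level:", lam)
print("upper limit:", percentile_upper(x, lam, 1000, rng))
```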
Pooling-Based Data Interpolation and Backdating
JOURNAL OF TIME SERIES ANALYSIS, Issue 1 2007. Massimiliano Marcellino
JEL classification: C32; C43; C82
Abstract: Pooling forecasts obtained from different procedures typically reduces the mean square forecast error and more generally improves the quality of the forecast. In this paper, we evaluate whether pooling interpolated or backdated time series obtained from different procedures can also improve the quality of the generated data. Both simulation results and empirical analyses with macroeconomic time series indicate that pooling plays a positive and important role in this context also. [source]

Enhancing molecular discovery using descriptor-free rearrangement clustering techniques for sparse data sets
AICHE JOURNAL, Issue 2 2010. Peter A. DiMaggio Jr.
Abstract: This article presents a descriptor-free method for identifying library compounds with desired properties while synthesizing and assaying only a minimal portion of the library space. The method works by identifying the optimal substituent ordering (i.e., the optimal encoding integer assignment to each functional group on every substituent site of the molecular scaffold) based on a global pairwise difference metric intended to capture the smoothness of the compound library. The reordering can be accomplished via (i) a mixed-integer linear programming (MILP) model, (ii) a genetic-algorithm-based approach, or (iii) a heuristic approach. We present performance comparisons between these techniques as well as an independent analysis of the characteristics of the MILP model. Two sparsely sampled data matrices provided by Pfizer are analyzed to validate the proposed approach, and we show that the rearrangement of these matrices leads to regular property landscapes which enable reliable property estimation/interpolation over the full library space. An iterative strategy for compound synthesis is also introduced that utilizes the results of the reordered data to direct the synthesis toward desirable compounds. We demonstrate in a simulated experiment using held-out subsets of the data that the proposed iterative technique is effective in identifying compounds with desired physical properties. © 2009 American Institute of Chemical Engineers AIChE J, 2010 [source]
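A minimal sketch of the kind of global pairwise-difference metric the reordering methods optimise: the summed absolute difference between measured neighbours in the library matrix, with missing assays skipped. The 3x3 property matrix is invented, and the real method searches over orderings (via MILP, genetic algorithm, or heuristic) rather than scoring a single swap.

```python
# Score a 2-substituent library matrix: adjacent measured entries that
# differ little indicate a "smooth" (well-ordered) property landscape.
import numpy as np

def smoothness(M):
    """Sum |differences| over adjacent measured pairs (lower = smoother)."""
    rows = np.abs(np.diff(M, axis=0))   # vertical neighbour differences
    cols = np.abs(np.diff(M, axis=1))   # horizontal neighbour differences
    return np.nansum(rows) + np.nansum(cols)  # NaN pairs are skipped

# Sparsely measured property matrix: rows/columns index functional groups
M = np.array([[1.2, np.nan, 1.9],
              [1.4, 1.6,    np.nan],
              [np.nan, 2.1, 2.8]])
print(smoothness(M))                # current ordering
print(smoothness(M[[0, 2, 1]]))     # score after swapping two rows
```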
Analytical inverse kinematics with body posture control
COMPUTER ANIMATION AND VIRTUAL WORLDS (PREV: JNL OF VISUALISATION & COMPUTER ANIMATION), Issue 2 2008. Marcelo Kallmann
Abstract: This paper presents a novel whole-body analytical inverse kinematics (IK) method integrating collision avoidance and customizable body control for animating reaching tasks in real time. Whole-body control is achieved with the interpolation of pre-designed key body postures, which are organized as a function of the direction to the goal to be reached. Arm postures are computed by the analytical IK solution for human-like arms and legs, extended with a new simple search method for achieving postures avoiding joint limits and collisions. In addition, a new IK resolution is presented that directly solves for joints parameterized in the swing-and-twist decomposition. The overall method is simple to implement, fast, and accurate, and therefore suitable for interactive applications controlling the hands of characters. The source code of the IK implementation is provided. Copyright © 2007 John Wiley & Sons, Ltd. [source]

Barycentric Coordinates on Surfaces
COMPUTER GRAPHICS FORUM, Issue 5 2010. Raif M. Rustamov
Abstract: This paper introduces a method for defining and efficiently computing barycentric coordinates with respect to polygons on general surfaces. Our construction is geared towards injective polygons (polygons that can be enclosed in a metric ball of an appropriate size) and is based on replacing the linear precision property of planar coordinates by a requirement in terms of center of mass, and generalizing this requirement to the surface setting. We show that the resulting surface barycentric coordinates can be computed using planar barycentric coordinates with respect to a polygon in the tangent plane. We prove theoretically that the surface coordinates properly generalize the planar coordinates and carry some of their useful properties such as unique reconstruction of a point given its coordinates, uniqueness for triangles, edge linearity, similarity invariance, and smoothness; in addition, these coordinates are insensitive to isometric deformations and can be used to reconstruct isometries. We show empirically that surface coordinates are shape-aware with consistent gross behavior across different surfaces, are well behaved for different polygon types and locations on a variety of surface forms, and are fast to compute. Finally, we demonstrate the effectiveness of surface coordinates for interpolation, decal mapping, and correspondence refinement. [source]

BetweenIT: An Interactive Tool for Tight Inbetweening
COMPUTER GRAPHICS FORUM, Issue 2 2010. Brian Whited
Abstract: The generation of inbetween frames that interpolate a given set of key frames is a major component in the production of a 2D feature animation. Our objective is to considerably reduce the cost of the inbetweening phase by offering an intuitive and effective interactive environment that automates inbetweening when possible while allowing the artist to guide, complement, or override the results. Tight inbetweens, which interpolate similar key frames, are particularly time-consuming and tedious to draw. Therefore, we focus on automating these high-precision and expensive portions of the process. We have designed a set of user-guided semi-automatic techniques that fit well with current practice and minimize the number of required artist gestures. We present a novel technique for stroke interpolation from only two keys which combines a stroke motion constructed from logarithmic spiral vertex trajectories with a stroke deformation based on curvature averaging and twisting warps. We discuss our system in the context of a feature animation production environment and evaluate our approach with real production data. [source]
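A minimal sketch of a logarithmic-spiral vertex trajectory of the kind the stroke-motion construction uses: a point travels from its position in one key to its position in the next by uniform rotation plus exponential scaling about a centre. The centre and sample points are invented; the full system also applies the curvature-averaging deformation.

```python
# Log-spiral interpolation between corresponding stroke vertices,
# expressed with complex numbers for compact rotation + scaling.
import cmath

def spiral(p0, p1, c, t):
    """Position at parameter t in [0, 1] on the log spiral from p0 to p1."""
    z0, z1 = p0 - c, p1 - c
    w = z1 / z0                        # encodes total rotation and scaling
    r, theta = abs(w), cmath.phase(w)
    # Interpolate the rotation angle and radial growth uniformly in t
    return c + z0 * (r ** t) * cmath.exp(1j * theta * t)

p0, p1, c = complex(1, 0), complex(0, 2), complex(0, 0)
for t in (0.0, 0.25, 0.5, 0.75, 1.0):
    z = spiral(p0, p1, c, t)
    print(f"t={t:.2f}: ({z.real:+.3f}, {z.imag:+.3f})")
```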
Implicit Surface Modelling with a Globally Regularised Basis of Compact Support
COMPUTER GRAPHICS FORUM, Issue 3 2006. C. Walder
Abstract: We consider the problem of constructing a globally smooth analytic function that represents a surface implicitly by way of its zero set, given sample points with surface normal vectors. The contributions of the paper include a novel means of regularising multi-scale compactly supported basis functions that leads to the desirable interpolation properties previously only associated with fully supported bases. We also provide a regularisation framework for simpler and more direct treatment of surface normals, along with a corresponding generalisation of the representer theorem lying at the core of kernel-based machine learning methods. We demonstrate the techniques on 3D problems of up to 14 million data points, as well as 4D time series data and four-dimensional interpolation between three-dimensional shapes. Categories and Subject Descriptors (according to ACM CCS): I.3.5 [Computer Graphics]: Curve, surface, solid, and object representations [source]

Resampling Feature and Blend Regions in Polygonal Meshes for Surface Anti-Aliasing
COMPUTER GRAPHICS FORUM, Issue 3 2001. Mario Botsch
Abstract: Efficient surface reconstruction and reverse engineering techniques are usually based on a polygonal mesh representation of the geometry: the resulting models emerge from piecewise linear interpolation of a set of sample points. The quality of the reconstruction not only depends on the number and density of the sample points but also on their alignment to sharp and rounded features of the original geometry. Bad alignment can lead to severe alias artifacts. In this paper we present a sampling pattern for feature and blend regions which minimizes these alias errors. We show how to improve the quality of a given polygonal mesh model by resampling its feature and blend regions within an interactive framework. We further demonstrate sophisticated modeling operations that can be implemented based on this resampling technique. [source]

Computational Aspects of Risk-Based Inspection Planning
COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, Issue 3 2006. Daniel Straub
Abstract: In this article, a computationally efficient method for the calculation of risk-based inspection (RBI) plans is presented, which overcomes the computational burden of RBI planning through the use of a generic approach. After an introduction to RBI planning, focus is set on the computational aspects of the methodology. The derivation of inspection plans through interpolation in databases with predefined generic inspection plans is demonstrated, and the accuracy of the methodology is investigated. Finally, an overview is given of some recent applications of the generic approach in practice, including the implementation in efficient software tools. [source]
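A minimal sketch of deriving a plan by interpolation in a database of predefined generic plans, as described above. The two governing parameters, the grid, and the tabulated times to first inspection are all invented stand-ins for the paper's generic database.

```python
# Look up an inspection plan between precomputed generic cases by
# bilinear interpolation over a parameter grid.
import numpy as np
from scipy.interpolate import RegularGridInterpolator

growth_rate = np.array([0.1, 0.2, 0.4])        # mm/year (grid axis 1)
p_accept = np.array([1e-4, 1e-3, 1e-2])        # acceptable annual Pf (axis 2)
first_inspection = np.array([[4.0,  7.0, 12.0],  # tabulated generic plans:
                             [3.0,  5.0,  9.0],  # years to first inspection
                             [2.0,  3.5,  6.0]])

plan = RegularGridInterpolator(
    (growth_rate, np.log10(p_accept)),   # interpolate Pf on a log scale
    first_inspection)

# Query a component that falls between the precomputed generic cases
print(plan([[0.15, np.log10(5e-4)]]))    # years until first inspection
```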
Predictive identification of human skin sensitization thresholds
CONTACT DERMATITIS, Issue 5 2005. David A. Basketter
Abstract: For years, methods have been available for the predictive identification of chemicals that possess the intrinsic potential to cause skin sensitization. However, many have proven less suitable for the determination of relative sensitizing potency. In this respect, the local lymph node assay (LLNA) has been shown to have a number of important advantages. Through interpolation of LLNA dose-response data, the concentration of a chemical required to produce a threshold positive response (a 3-fold increase in activity compared with concurrent vehicle controls, the EC3 value) can be measured. The robustness of this parameter has been demonstrated rigorously in terms of inter- and intralaboratory reproducibility. Additionally, the relationship between potency estimates from the LLNA and an appreciation of human potency based on clinical experience has been reported previously. In the present investigations, we have sought to consolidate further our understanding of the association between EC3 values and human skin-sensitization potency by undertaking a thorough and extensive analysis of existing human predictive assays, particularly where dose-response information is available, from historical human repeated insult patch tests (HRIPTs). From these human data, information on the approximate threshold for the induction of skin sensitization in the HRIPT was determined for 26 skin-sensitizing chemicals. These data were then compared with LLNA-derived EC3 values. The results from each assay, expressed as dose per unit area (µg/cm²), revealed a clear linear relationship between the 2 values, thereby substantiating further the utility of LLNA EC3 values for prediction of the relative human sensitizing potency of newly identified skin sensitizers. [source]
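A minimal sketch of the EC3 derivation described above: the concentration at which the measured stimulation index crosses the threefold threshold, found by linear interpolation between the two bracketing doses. The dose-response values are invented.

```python
# EC3 from LLNA dose-response data by linear interpolation between the
# tested concentrations that bracket a stimulation index of 3.
def ec3(conc, si, threshold=3.0):
    """Concentration (%) at which the stimulation index crosses 3."""
    for (c0, s0), (c1, s1) in zip(zip(conc, si), zip(conc[1:], si[1:])):
        if s0 < threshold <= s1:           # bracketing pair found
            f = (threshold - s0) / (s1 - s0)
            return c0 + f * (c1 - c0)      # linear interpolation
    return None                             # no crossing observed

conc = [0.5, 1.0, 2.5, 5.0]                 # tested concentrations (%)
si   = [1.2, 1.9, 2.7, 4.4]                 # measured stimulation indices
print(f"EC3 = {ec3(conc, si):.2f}%")        # falls between 2.5% and 5.0%
```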
Generalized Spitzer Function with Finite Collisionality in Toroidal Plasmas
CONTRIBUTIONS TO PLASMA PHYSICS, Issue 8 2010. W. Kernbichler
Abstract: The drift kinetic equation solver NEO-2 [1], which is based on the field line integration technique, has been applied to compute the generalized Spitzer function in a tokamak with finite plasma collisionality. The resulting generalized Spitzer function has specific features which are pertinent to the finite plasma collisionality. They are absent in asymptotic (collisionless or highly collisional) regimes and in results drawn from interpolation between asymptotic limits. These features have the potential to improve the overall ECCD efficiency if one optimizes the microwave beam launch scenarios accordingly. (© 2010 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source]

SCALES: a large-scale assessment model of soil erosion hazard in Basse-Normandie (northern-western France)
EARTH SURFACE PROCESSES AND LANDFORMS, Issue 8 2010. P. Le Gouée
Abstract: The cartography of erosion risk is mainly based on the development of models which evaluate, in a qualitative and quantitative manner, the physical reproduction of the erosion processes (CORINE, EHU, INRA). These models are mainly semi-quantitative but can be physically based and spatially distributed (the Pan-European Soil Erosion Risk Assessment, PESERA). They are characterized by their simplicity and their applicability potential at large temporal and spatial scales. In developing our model SCALES (Spatialisation d'éChelle fine de l'ALéa Erosion des Sols / large-scale assessment and mapping model of soil erosion hazard), we had several objectives in mind: (1) to map soil erosion at a regional scale with the guarantee of high accuracy at the local level, (2) to envisage an applicability of the model in European oceanic areas, (3) to focus the erosion hazard estimation on the level of source areas (on-site erosion), which are the agricultural parcels, and (4) to take into account the weight of the temporality of agricultural practices (land-use concept). Because of these objectives, the nature of the variables which characterize the erosion factors, and its structure, SCALES differs from other models. Tested in Basse-Normandie (Calvados, 5500 km²), SCALES reveals a strong predisposition of the study area to soil erosion, which would be expected to express itself in a wet year. Apart from an internal validation, we tried an intermediate one by comparing our results with those from INRA and PESERA. It appeared that these models underestimate medium erosion levels and differ in the spatial localization of areas with the highest erosion risks. SCALES underlines here the limitations in the use of pedo-transfer functions and the interpolation of input data with a low resolution. One must not forget, however, that these models are mainly focused on an interregional comparative approach. Therefore the comparison of SCALES data with those of the INRA and PESERA models cannot provide a convincing validation of our model. For the moment the validation is based on the opinion of local experts, who agree with the qualitative indications delivered by our cartography. An external validation of SCALES is foreseen, which will be based on a thorough inventory of erosion signals in areas with different hazard levels. Copyright © 2010 John Wiley & Sons, Ltd. [source]

The effect of bidirectional flow on tidal channel planforms
EARTH SURFACE PROCESSES AND LANDFORMS, Issue 3 2004. Sergio Fagherazzi
Abstract: Salt marsh tidal channels are highly sinuous. For this project, field surveys and aerial photographs were used to characterize the planform of tidal channels at China Camp Marsh in the San Francisco Bay, California. To model the planform evolution, we assume that the topographic curvature of the channel centreline is a key element driving meander migration. Extraction of curvature data from a planimetric survey, however, presents certain problems because simple calculations based on equally distanced points on the channel axis produce numerical noise that pollutes the final curvature data. We found that a spline interpolation and a polynomial fit to the survey data provided us with a robust means of calculating channel curvature. The curvature calculations, combined with data from numerous cross-sections along the tidal channel, were used to parameterize a computer model. With this model, based on recent theoretical work, the relationship between planform shape and meander migration, as well as the consequences of bidirectional flow on planform evolution, have been investigated. Bank failure in vegetated salt marsh channels is characterized by slump blocks that persist in the channel for several years. It is therefore possible to identify reaches of active bank erosion and test model predictions. Our results suggest that the geometry and evolution of meanders at China Camp Marsh, California, reflect the ebb-dominated regime. Copyright © 2004 John Wiley & Sons, Ltd. [source]
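A minimal sketch of the centreline treatment described in the last abstract: fit a parametric smoothing spline to surveyed channel-axis points and evaluate curvature analytically from the spline's derivatives, rather than differencing equally spaced raw points. The synthetic "survey" is invented.

```python
# Smoothing-spline fit of a noisy channel centreline and analytic
# signed curvature from the spline derivatives.
import numpy as np
from scipy.interpolate import splprep, splev

# Noisy survey of a meandering channel axis (hypothetical data)
t = np.linspace(0, 4 * np.pi, 60)
x = t + 0.02 * np.random.default_rng(0).normal(size=t.size)
y = np.sin(t) + 0.02 * np.random.default_rng(1).normal(size=t.size)

tck, u = splprep([x, y], s=0.05)             # parametric smoothing spline
uu = np.linspace(0, 1, 400)
dx, dy = splev(uu, tck, der=1)
ddx, ddy = splev(uu, tck, der=2)
kappa = (dx * ddy - dy * ddx) / (dx**2 + dy**2) ** 1.5  # signed curvature

print("max |curvature|:", np.abs(kappa).max())
```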
Comparison of fluorescent stains: Relative photostability and differential staining of proteins in two-dimensional gels
ELECTROPHORESIS, Issue 15 2004. Gary B. Smejkal
Abstract: The fluorescence of proteins stained with Deep Purple and SYPRO Ruby was measured over a time course of UV transillumination to determine the relative photostability of each stain. Mean spot fluorescence (n = 200 matched spots) in gels stained with Deep Purple decreased 27% following 2 min of UV transillumination, compared to SYPRO Ruby, which decreased 17%. After 19 min, an 83% decrease in Deep Purple fluorescence was observed, compared to 44% for SYPRO Ruby. By interpolation, the half-life of Deep Purple fluorescence was estimated to be approximately 6 min. The half-life of SYPRO Ruby fluorescence was not reached during the 19 min time course. Further, differential staining of proteins was observed in gels stained with Deep Purple and SYPRO Ruby as compared to colloidal Coomassie Brilliant Blue and silver staining. [source]

Temporal analysis of spatial covariance of SO2 in Europe
ENVIRONMETRICS, Issue 4 2007. Marco Giannitrapani
Abstract: In recent years, the number of applications of spatial statistics has enormously increased in environmental and ecological sciences. A typical problem is the sampling of a pollution field, with the common objective of spatial interpolation. In this paper, we present a spatial analysis across time, focusing on sulphur dioxide (SO2) concentrations monitored from 1990 to 2001 at 125 sites across Europe. Four different methods of trend estimation have been used, and comparisons among them are shown. Spherical, exponential and Gaussian variograms have been fitted to the residuals and compared. Time series analyses of the range, sill and nugget have been undertaken, and an approach for defining a unique spatial correlation matrix for the overall time period of analysis is proposed. Copyright © 2006 John Wiley & Sons, Ltd. [source]
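A minimal sketch of the empirical semivariogram that the spherical, exponential and Gaussian models above are fitted to: half the mean squared difference between station values, binned by separation distance. The station layout and SO2-like values are invented.

```python
# Empirical semivariogram: gamma(h) = mean of (z_i - z_j)^2 / 2 over
# station pairs whose separation falls in distance bin h.
import numpy as np

def semivariogram(xy, z, bins):
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    g = 0.5 * (z[:, None] - z[None, :]) ** 2
    iu = np.triu_indices(len(z), k=1)            # count each pair once
    d, g = d[iu], g[iu]
    idx = np.digitize(d, bins)
    return np.array([g[idx == k].mean() if np.any(idx == k) else np.nan
                     for k in range(1, len(bins))])

rng = np.random.default_rng(2)
xy = rng.uniform(0, 100, size=(60, 2))           # station coordinates (km)
z = np.sin(xy[:, 0] / 30) + 0.1 * rng.normal(size=60)  # SO2-like field
bins = np.linspace(0, 80, 9)
print(semivariogram(xy, z, bins))
```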
Mapping sea bird densities over the North Sea: spatially aggregated estimates and temporal changes
ENVIRONMETRICS, Issue 6 2005. Edzer J. Pebesma
Abstract: In the Dutch sector of the North Sea, sea bird densities are recorded bi-monthly by using airborne strip-transect monitoring. From these data we try to estimate: (i) high-resolution spatial patterns of sea bird densities; (ii) low-resolution spatial-average bird densities for large areas; and (iii) temporal changes in (i) and (ii), using data on Fulmaris glacialis as an example. For spatial estimation, we combined Poisson regression for modelling the trend as a function of water depth and distance to coast with kriging interpolation of the residual variability, assuming spatial (co)variances to be proportional to the trend value. Spatial averages were estimated by block kriging. For estimating temporal differences we used residual cokriging for two consecutive years, and show how this can be extended to analyse trends over multiple years. Approximate standard errors are obtained for all estimates. A comparison with a residual simple kriging approach reveals that ignoring temporal cross-correlations leads to a severe loss of statistical accuracy when assessing the significance of temporal changes. This article shows results for Fulmaris glacialis monitored during August/September in 1998 and 1999. Copyright © 2005 John Wiley & Sons, Ltd. [source]

Space-time modeling of rainfall data
ENVIRONMETRICS, Issue 6 2004. Luis Guillermo Coca Velarde
Abstract: Climate variables assume non-negative values and are often measured as zero. This is just the case when the rainfall level, in the dry season, is measured in a specified place. The stochastic modeling then demands the inclusion of a probability mass point at the zero level, and the resulting model is a mixture of a continuous and a Bernoulli distribution. In this article, spatial conditional autoregressive effects, reflecting the idea that neighbors present similar responses, are considered, and the response level is modeled in two stages. The aim is to consider spatial interpolation and prediction of levels in a Bayesian context. Data on weekly rainfall levels measured at different stations in the central region of Brazil, an area with two well-marked seasons, are used as an example. A method for comparing models, based on the deviance function, is also implemented. The main conclusion is that the use of space-time models improves the modeling of hydrological and climatological variables, allowing the inclusion of real-life considerations such as the influence of other covariates, space dependence and time effects such as seasonality. Copyright © 2004 John Wiley & Sons, Ltd. [source]
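A minimal sketch of the mixture just described: a Bernoulli rain/no-rain component supplying the probability mass at zero, and a continuous distribution for positive amounts (a gamma is used here purely for illustration). All parameters are invented; the paper estimates such components in a Bayesian space-time framework.

```python
# Zero-inflated rainfall model: exact zeros with probability 1 - p_wet,
# a continuous (gamma) draw otherwise.
import numpy as np

rng = np.random.default_rng(3)

def simulate_weekly_rain(n, p_wet=0.35, shape=2.0, scale=8.0):
    """Weekly rainfall (mm): zero with prob. 1 - p_wet, gamma if wet."""
    wet = rng.random(n) < p_wet
    amounts = rng.gamma(shape, scale, size=n)
    return np.where(wet, amounts, 0.0)

r = simulate_weekly_rain(52)
print("weeks with rain:", int((r > 0).sum()), "of 52")
print("mean over wet weeks:", r[r > 0].mean().round(1), "mm")
```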
A high frequency kriging approach for non-stationary environmental processes
ENVIRONMETRICS, Issue 5 2001. Montserrat Fuentes
Abstract: Emission reductions were mandated in the Clean Air Act Amendments of 1990 with the expectation that they would result in major reductions in the concentrations of atmospherically transported pollutants. The emission reductions are intended to reduce public health risks and to protect sensitive ecosystems. To determine whether the emission reductions are having the intended effect on atmospheric concentrations, monitoring data must be analyzed taking into consideration the spatial structure shown by the data. Maps of pollutant concentrations and fluxes are useful over different geopolitical boundaries, to discover when, where, and to what extent the nation's air quality is improving or declining. Since the spatial covariance structure shown by the data changes with location, the standard kriging methodology for spatial interpolation cannot be used, because it assumes stationarity of the process. We present a new methodology for spatial interpolation of non-stationary processes. In this method the field is represented locally as a stationary isotropic random field, but the parameters of the stationary random field are allowed to vary across space. A procedure for interpolation is presented that uses an expression for the spectral density at high frequencies. New fitting algorithms are developed using spectral approaches. In cases where the data are distributed exactly or approximately on a lattice, it is argued that spectral approaches have potentially enormous computational benefits compared with maximum likelihood. The methods are extended to interpolation questions using approximate Bayesian approaches to account for parameter uncertainty. We develop applications to obtain the total loading of pollutant concentrations and fluxes over different geopolitical boundaries. Copyright © 2001 John Wiley & Sons, Ltd. [source]

Array antenna assisted doppler spread compensator for OFDM
EUROPEAN TRANSACTIONS ON TELECOMMUNICATIONS, Issue 5 2002. Minoru Okada
Abstract: This paper proposes a novel array-antenna-assisted Doppler spread compensator for orthogonal frequency division multiplexing (OFDM), which is sensitive to fast time variation of the radio propagation channel. In the proposed compensator, a linear array antenna is installed on top of the vehicle. The compensator estimates the received signal at a certain point on the linear array antenna by using space-domain interpolation. Because the relative position of the estimated receiving point with respect to the ground does not change during the effective symbol duration of an OFDM signal, the time variation due to the movement of the vehicle can be compensated for. Computer simulation shows that the compensator can compensate for the bit error rate performance degradation due to time variation of the channel when the velocity of the vehicle is up to 180 km/h and a two-element array antenna is used at a carrier frequency of 600 MHz. The bit error rate performance can be further improved by using a four-element array antenna. [source]

Rapid Exponential Convergence of Finite Element Estimates of the Effective Properties of Heterogeneous Materials
ADVANCED ENGINEERING MATERIALS, Issue 11 2007. A. Gusev
Abstract: We develop and validate a general-purpose error estimator for finite element solutions for the effective properties of heterogeneous materials. We show that the error should decrease exponentially upon increasing the order of the polynomial interpolation. We use this finding to demonstrate the practical feasibility of reliable property predictions for a majority of particulate-morphology heterogeneous materials. [source]

Strain-life approach in thermo-mechanical fatigue evaluation of complex structures
FATIGUE & FRACTURE OF ENGINEERING MATERIALS AND STRUCTURES, Issue 9 2007
Abstract: This paper is a contribution to the strain-life approach for evaluating thermo-mechanically loaded structures. It takes into consideration the uncoupling of stress and damage evaluation, and has the option of importing non-linear or linear stress results from finite element analysis (FEA). The multiaxiality is considered with the signed von Mises method. In the developed Damage Calculation Program (DCP), local temperature-stress-strain behaviour is modelled with an operator of the Prandtl type, and damage is estimated by use of the strain-life approach and Skelton's energy criterion. Material data were obtained from standard isothermal strain-controlled low cycle fatigue (LCF) tests, with linear parameter interpolation or piecewise cubic Hermite interpolation being used to estimate values at unmeasured temperature points. The model is demonstrated with examples of constant temperature loading and random force-temperature histories. Additional research was done regarding the temperature dependency of the Kp used in the Neuber approximate formula for stress-strain estimation from linear FEA results. The proposed model enables computationally fast thermo-mechanical fatigue (TMF) damage estimations for random load and temperature histories. [source]
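A minimal sketch of the material-data step described in the last abstract: isothermal LCF coefficients measured at a few temperatures, interpolated at intermediate temperatures with a shape-preserving piecewise cubic Hermite interpolant. The coefficient values are invented.

```python
# PCHIP interpolation of a temperature-dependent material parameter:
# monotone between data points, so no spline overshoot.
import numpy as np
from scipy.interpolate import PchipInterpolator

T = np.array([20.0, 200.0, 400.0, 600.0])       # test temperatures (C)
eps_f = np.array([0.45, 0.41, 0.30, 0.12])      # fatigue ductility coeff.

pchip = PchipInterpolator(T, eps_f)
for temp in (100.0, 300.0, 500.0):
    print(f"{temp:5.0f} C -> eps_f' = {pchip(temp):.3f}")
```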
Biaxial testing and analysis of bicycle-welded components for the definition of a safety standard
FATIGUE & FRACTURE OF ENGINEERING MATERIALS AND STRUCTURES, Issue 6 2003. N. Petrone
Abstract: This paper presents the experimental evaluation of the fatigue behaviour of welded components under non-proportional variable amplitude biaxial loads. The study was undertaken on welded mountain bike handlebar stems, which differed in terms of geometry and technology and were tested with load histories that were reconstructed and accelerated from recorded field data. Loads measured in the field were decomposed into bending and torsional components; a synchronous Peak-Valley counting, a spectrum inflation technique, a spline interpolation and a final amplification were applied to the measured signals in order to obtain test drive signals with the correct content of biaxial non-proportional loadings. After evaluation of the bending and torsion load-life curves of components under constant amplitude fatigue, the resulting data from biaxial variable amplitude fatigue tests were analysed in order to evaluate the damage contribution of the two load components, and an equivalent simplified two-stage constant amplitude fatigue test was proposed to the working group ISO/SC1/TC149/WG4. [source]

GIS visualisation and analysis of mobile hydroacoustic fisheries data: a practical example
FISHERIES MANAGEMENT & ECOLOGY, Issue 6 2005. A. R. Coley
Abstract: Hydroacoustic remote sensing of fish populations residing in large freshwater bodies has become a widely used and effective monitoring tool. However, easy visualisation of the data and effective analysis is more problematic. The use of GIS-based interpolations enables easy visualisation of survey data and provides an analysis tool for investigating fish populations. Three years of hydroacoustic surveys of Cardiff Bay in South Wales presented an opportunity to develop analysis and visualisation techniques. Inverse distance weighted (IDW) interpolation was used to show the potential of such techniques in analysing survey data both spatially (1-year survey) and temporally (by looking at the spatial changes between years). IDW was fairly successful in visualising the hydroacoustic data for Cardiff Bay. However, other techniques may improve on this initial work and provide improved analysis, total density estimates and statistically derived estimations of prediction error. [source]
Geostatistical and multi-elemental analysis of soils to interpret land-use history in the Hebrides, Scotland
GEOARCHAEOLOGY: AN INTERNATIONAL JOURNAL, Issue 4 2007. J.A. Entwistle
Abstract: In the absence of documentary evidence about settlement form and agricultural practice in northwest Scotland before the mid-18th century, a geoarchaeological approach to reconstructing medieval land use and settlement form is presented here. This study applies multi-elemental analysis to soils previously collected from a settlement site in the Hebrides and highlights the importance of a detailed knowledge of the local soil environment and the cultural context. Geostatistical methods were used to analyze the spatial variability and distribution of a range of soil properties typically associated with geoarchaeological investigations. Semivariograms were produced to determine the spatial dependence of soil properties, and ordinary kriging was undertaken to produce prediction maps of the spatial distribution of these soil properties and enable interpolation over nonsampled locations, in an attempt to more fully elucidate former land-use activity and settlement patterns. The importance of identifying the spatial covariance of elements and the need for several lines of physical and chemical evidence is highlighted. For many townships in the Hebrides, whose precise location and layout prior to extensive land reorganization in the late 18th-early 19th century is not recoverable through plans, multi-elemental analysis of soils can offer a valuable prospective and diagnostic tool. © 2007 Wiley Periodicals, Inc. [source]
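A minimal sketch of ordinary kriging as used for the prediction maps above: with a fitted semivariogram (an exponential model with invented parameters), solve the kriging system for the weights at one unsampled location. Real use would first estimate and fit the semivariogram from the soil data.

```python
# Ordinary kriging at a single location from scattered soil samples.
import numpy as np

def gamma_exp(h, nugget=0.1, sill=1.0, rng_=30.0):
    """Exponential semivariogram model (parameters are illustrative)."""
    return nugget + (sill - nugget) * (1.0 - np.exp(-h / rng_))

def ordinary_krige(xy, z, x0):
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = gamma_exp(d)
    np.fill_diagonal(A[:n, :n], 0.0)   # gamma(0) = 0 by definition
    A[-1, -1] = 0.0                    # unbiasedness constraint row/column
    b = np.ones(n + 1)
    b[:n] = gamma_exp(np.linalg.norm(xy - x0, axis=1))
    w = np.linalg.solve(A, b)          # n weights + Lagrange multiplier
    return w[:n] @ z, w[:n] @ b[:n] + w[-1]   # prediction, kriging variance

rng = np.random.default_rng(4)
xy = rng.uniform(0, 100, size=(25, 2))              # sample locations (m)
z = 10 + 0.05 * xy[:, 0] + rng.normal(0, 0.5, 25)   # e.g. element level
pred, var = ordinary_krige(xy, z, np.array([50.0, 50.0]))
print(f"prediction: {pred:.2f}, kriging variance: {var:.2f}")
```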