Gaussian Process
Terms Modified by "Gaussian Process": Selected Abstracts

Simulation of Real-Valued Discrete-Time Periodically Correlated Gaussian Processes with Prescribed Spectral Density Matrices
JOURNAL OF TIME SERIES ANALYSIS, Issue 2 2007, A. R. Soltani
Abstract: In this article, we provide a spectral characterization of a real-valued discrete-time periodically correlated process and then establish a procedure for simulating such a Gaussian process with a given spectral density. We also prove that the simulated process converges, at each time index, to the actual process in mean square. [source]

Identification of continuous-time nonlinear systems by using a Gaussian process model
IEEJ TRANSACTIONS ON ELECTRICAL AND ELECTRONIC ENGINEERING, Issue 6 2008, Tomohiro Hachino, Member
Abstract: This paper deals with nonparametric identification of continuous-time nonlinear systems using a Gaussian process model. A genetic algorithm is applied to train the Gaussian process prior model by minimizing the negative log marginal likelihood of the identification data. The nonlinear term of the objective system is estimated as the predictive mean function of the Gaussian process, and a confidence measure for the estimated nonlinear function is given by the predictive covariance function of the Gaussian process. Copyright © 2008 Institute of Electrical Engineers of Japan. Published by John Wiley & Sons, Inc. [source]

Parameter identifiability with Kullback–Leibler information divergence criterion
INTERNATIONAL JOURNAL OF ADAPTIVE CONTROL AND SIGNAL PROCESSING, Issue 10 2009, Badong Chen
Abstract: We study the problem of parameter identifiability under the Kullback–Leibler information divergence (KLID) criterion. KLID-identifiability is defined and related to many other concepts of identifiability, such as identifiability under Fisher's information matrix criterion, the least-squares criterion, and the spectral density criterion.
We also establish a simple check criterion for the Gaussian process and derive an upper bound on the minimal identifiable horizon of a Markov process. Furthermore, we define asymptotic KLID-identifiability and prove that, under certain constraints, KLID-identifiability is a sufficient or necessary condition for asymptotic KLID-identifiability. The consistency of several parameter estimation methods is also discussed. Copyright © 2008 John Wiley & Sons, Ltd. [source]

Adaptive multiobjective optimization of process conditions for injection molding using a Gaussian process approach
ADVANCES IN POLYMER TECHNOLOGY, Issue 2 2007, Jian Zhou
Abstract: Selecting proper process conditions for injection molding is treated as a multiobjective optimization problem in which different objectives, such as minimizing injection pressure, volumetric shrinkage/warpage, or cycle time, exhibit trade-off behavior, so multiple optima may exist in the objective space. This paper presents an integrated, simulation-based optimization system that combines design of computer experiments, Gaussian process (GP) regression, a multiobjective genetic algorithm (MOGA), and levels of adjacency to search adaptively and automatically for the Pareto-optimal solutions for the different objectives. Because the GP approach provides both predictions and estimates of their uncertainty simultaneously, a nondominated sorting procedure on the predicted variances at each iteration step intelligently selects extra samples to serve as additional training data for improving the GP surrogate models. At the same time, user-defined adjacency-constraint percentages are employed to evaluate convergence of the iteration. The illustrative applications in this paper show that the proposed optimization system helps mold designers identify optimal process conditions efficiently and effectively.
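The variance-guided sampling described in the injection-molding abstract above has a simple core: a GP surrogate returns a predictive mean and variance at every candidate point, and the candidate with the largest predictive variance is the most informative one to simulate next. The sketch below is only illustrative (a zero-mean GP with a squared-exponential kernel on made-up 1-D data), not the authors' implementation.

```python
import numpy as np

def rbf_kernel(a, b, length_scale=1.0, variance=1.0):
    """Squared-exponential covariance k(x, x') = s^2 exp(-(x - x')^2 / (2 l^2))."""
    d = a[:, None] - b[None, :]
    return variance * np.exp(-0.5 * (d / length_scale) ** 2)

def gp_posterior(x_train, y_train, x_test, noise=1e-6):
    """Predictive mean and pointwise variance of a zero-mean GP at x_test."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf_kernel(x_train, x_test)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.diag(rbf_kernel(x_test, x_test)) - np.sum(v**2, axis=0)
    return mean, var

# One adaptive step: propose the candidate with the largest predictive variance.
x_train = np.array([0.0, 1.0, 3.0])
y_train = np.sin(x_train)
candidates = np.linspace(0.0, 3.0, 61)
mean, var = gp_posterior(x_train, y_train, candidates)
next_x = candidates[np.argmax(var)]  # falls in the sparsely sampled region (1, 3)
```

The full method in the paper replaces the single max-variance pick with nondominated sorting over the predicted variances of several objectives, but the mean/variance machinery is the same.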
© 2007 Wiley Periodicals, Inc. Adv Polym Techn 26:71–85, 2007; published online in Wiley InterScience (www.interscience.wiley.com). DOI 10.1002/adv.20092 [source]

Impact of the Sampling Rate on the Estimation of the Parameters of Fractional Brownian Motion
JOURNAL OF TIME SERIES ANALYSIS, Issue 3 2006, Zhengyuan Zhu
MSC: Primary 60G18; secondary 62D05, 62F12
Abstract: Fractional Brownian motion is a mean-zero, self-similar Gaussian process with stationary increments. Its covariance depends on two parameters: the self-similarity parameter H and the variance C. Suppose that one wants to estimate these parameters optimally by using n equally spaced observations. How should these observations be distributed? We show that the spacing of the observations does not affect the estimation of H (a consequence of the self-similarity of the process), but it does affect the estimation of the variance C. For example, when the observations are equally spaced on [0, n] (unit spacing), the maximum likelihood estimator (MLE) of the variance C converges at a faster rate than when they are equally spaced on [0, 1] (1/n-spacing) or on [0, n²] (n-spacing). We also determine the optimal choice of the spacing when it is constant, independent of the sample size n.
While the rate of convergence of the MLE of C in this case does not depend on the value of the constant spacing, the value of the optimal spacing itself depends on H: it is 1 (unit spacing) if H = 1/2, but becomes very large as H approaches 1. [source]

Nonparametric Tests of Change-Points with Tapered Data
JOURNAL OF TIME SERIES ANALYSIS, Issue 1 2001, Yves Rozenholc
Abstract: In this work we build two families of nonparametric tests using tapered data for the off-line detection of change-points in the spectral characteristics of a stationary Gaussian process. This is done using Kolmogorov–Smirnov statistics based on integrated tapered periodograms. Convergence under the null hypothesis is obtained by means of a doubly indexed (frequency, time) process together with some extensions of the Dirichlet and Fejér kernels. Consistency under the alternative is proved using the same statistics. Numerical simulations then show that the use of tapered data significantly improves the properties of the test, especially for small samples. [source]

A MULTINOMIAL APPROXIMATION FOR AMERICAN OPTION PRICES IN LÉVY PROCESS MODELS
MATHEMATICAL FINANCE, Issue 4 2006, Ross A. Maller
Abstract: This paper gives a tree-based method for pricing American options in models where the stock price follows a general exponential Lévy process. A multinomial model for approximating the stock price process is developed; it can be viewed as generalizing the binomial model of Cox, Ross, and Rubinstein (1979) for geometric Brownian motion. Under mild conditions, it is proved that the stock price process and the prices of American-type options on the stock, calculated from the multinomial model, converge to the corresponding prices under the continuous-time Lévy process model. Explicit illustrations are given for the variance gamma model and the normal inverse Gaussian process when the option is an American put, but the procedure applies to a much wider class of derivatives, including some path-dependent options.
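The binomial special case that the multinomial tree above generalizes, the Cox–Ross–Rubinstein model, is easy to sketch: build the lattice of terminal payoffs, then step backward, at each node taking the larger of the continuation value and the early-exercise payoff. The parameter values below are illustrative only.

```python
import math

def american_put_crr(s0, strike, rate, sigma, maturity, steps):
    """Price an American put on a CRR binomial tree by backward induction."""
    dt = maturity / steps
    u = math.exp(sigma * math.sqrt(dt))   # up factor
    d = 1.0 / u                           # down factor
    disc = math.exp(-rate * dt)
    p = (math.exp(rate * dt) - d) / (u - d)  # risk-neutral up-probability
    # Terminal payoffs over the steps+1 leaves of the tree.
    values = [max(strike - s0 * u**j * d**(steps - j), 0.0)
              for j in range(steps + 1)]
    # Step back through the tree, allowing early exercise at every node.
    for i in range(steps - 1, -1, -1):
        for j in range(i + 1):
            cont = disc * (p * values[j + 1] + (1 - p) * values[j])
            exercise = max(strike - s0 * u**j * d**(i - j), 0.0)
            values[j] = max(cont, exercise)
    return values[0]

price = american_put_crr(s0=100, strike=100, rate=0.05,
                         sigma=0.2, maturity=1.0, steps=200)
```

The American price sits a little above the corresponding European put, reflecting the early-exercise premium; the paper's multinomial construction keeps this backward-induction template but allows more than two branches per step, as required for general Lévy dynamics.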
Our approach overcomes some practical difficulties that have previously been encountered when the Lévy process has infinite activity. [source]

Process optimization of injection molding using an adaptive surrogate model with Gaussian process approach
POLYMER ENGINEERING & SCIENCE, Issue 5 2007, Jian Zhou
Abstract: This article presents an integrated, simulation-based optimization procedure that determines the optimal process conditions for injection molding without user intervention. The idea is to use a nonlinear statistical regression technique and design of computer experiments to establish an adaptive surrogate model with short turnaround time and adequate accuracy, substituting for time-consuming computer simulations during system-level optimization. A surrogate model based on the Gaussian process (GP) approach, which has not previously been employed for injection molding optimization, is introduced. GP gives both a prediction and an estimate of the confidence (variance) in that prediction simultaneously, thus indicating where additional training samples could be added to improve the surrogate model. While the surrogate model is being established, a hybrid genetic algorithm concurrently evaluates the model to search for globally optimal solutions. The examples presented in this article show that the proposed adaptive optimization procedure helps engineers determine the optimal process conditions more efficiently and effectively. POLYM. ENG. SCI., 47:684–694, 2007. © 2007 Society of Plastics Engineers. [source]

The likelihood ratio test for homogeneity in finite mixture models
THE CANADIAN JOURNAL OF STATISTICS, Issue 2 2001, Hanfeng Chen
Abstract: The authors study the asymptotic behaviour of the likelihood ratio statistic for testing homogeneity in finite mixture models of a general parametric distribution family.
They prove that the limiting distribution of this statistic is the squared supremum of a truncated standard Gaussian process, whose autocorrelation function is presented explicitly. A resampling procedure is recommended for obtaining the asymptotic p-value. Three kernel functions, normal, binomial and Poisson, are used in a simulation study illustrating the procedure. [source]

Model-Checking Techniques Based on Cumulative Residuals
BIOMETRICS, Issue 1 2002, D. Y. Lin
Summary: Residuals have long been used for graphical and numerical examination of the adequacy of regression models. Conventional residual analysis based on plots of raw residuals or their smoothed curves is highly subjective, whereas most numerical goodness-of-fit tests provide little information about the nature of model misspecification. In this paper, we develop objective and informative model-checking techniques by taking cumulative sums of residuals over certain coordinates (e.g., covariates or fitted values) or by considering related aggregates of residuals, such as moving sums and moving averages. For a variety of statistical models and data structures, including generalized linear models with independent or dependent observations, the distributions of these stochastic processes under the assumed model can be approximated by the distributions of certain zero-mean Gaussian processes whose realizations can easily be generated by computer simulation. Each observed process can then be compared, both graphically and numerically, with a number of realizations from the Gaussian process. Such comparisons enable one to assess objectively whether a trend seen in a residual plot reflects model misspecification or natural variation. The proposed techniques are particularly useful for checking the functional form of a covariate and the link function. Illustrations with several medical studies are provided.
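The cumulative-residual idea just described can be illustrated in a few lines: fit a model, cumulate the residuals over the ordered covariate, and compare the observed supremum to realizations generated by perturbing the residuals with standard normal multipliers. This toy sketch (made-up linear-regression data, and a simplified multiplier scheme rather than the paper's exact construction) is only meant to convey the mechanics.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data generated from the fitted model form, so no misspecification.
n = 200
x = np.sort(rng.uniform(0.0, 1.0, n))
y = 2.0 + 3.0 * x + rng.normal(0.0, 0.5, n)

# Fit by least squares and cumulate residuals over the ordered covariate.
X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
observed = np.max(np.abs(np.cumsum(resid))) / np.sqrt(n)

# Null realizations: perturb the residuals with standard normal multipliers
# and recompute the supremum statistic each time.
sims = np.array([
    np.max(np.abs(np.cumsum(resid * rng.standard_normal(n)))) / np.sqrt(n)
    for _ in range(500)
])
p_value = np.mean(sims >= observed)
```

A large observed supremum relative to the simulated realizations signals a trend in the residuals, i.e. misspecification of the covariate's functional form, rather than natural variation.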
[source]

Efficient sampling for spatial uncertainty quantification in multibody system dynamics applications
INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN ENGINEERING, Issue 5 2009, Kyle P. Schmitt
Abstract: We present two methods for efficiently sampling the response (trajectory space) of multibody systems operating under spatial uncertainty, when the uncertainty is assumed to be representable with Gaussian processes. In this case, the dynamics (time evolution) of the multibody system depends on spatially indexed uncertain parameters that span infinite-dimensional spaces. This places a heavy computational burden on existing methodologies, an issue addressed here with two new conditional sampling approaches. When a single instance of the uncertainty is needed over the entire domain, we use a fast Fourier transform technique. When the initial conditions are fixed and the path distribution of the dynamical system is relatively narrow, we use an incremental sampling approach that is fast and has a small memory footprint. Both methods produce the same distributions as the widely used Cholesky-based approaches; we illustrate this convergence, at smaller computational effort and memory cost, for a simple nonlinear vehicle model. Copyright © 2009 John Wiley & Sons, Ltd. [source]

A Bayesian regression approach to terrain mapping and an application to legged robot locomotion
JOURNAL OF FIELD ROBOTICS (FORMERLY JOURNAL OF ROBOTIC SYSTEMS), Issue 10 2009, Christian Plagemann
Abstract: We deal with the problem of learning probabilistic models of terrain surfaces from sparse and noisy elevation measurements. The key idea is to formalize this as a regression problem and to derive a solution based on nonstationary Gaussian processes. We describe how to achieve a sparse approximation of the model, which makes it applicable to real-world data sets.
The main benefits of our model are that (1) it does not require a discretization of space, (2) it provides the uncertainty of its predictions, and (3) it adapts its covariance function to the observed data, allowing more accurate inference of terrain elevation at points that have not been observed directly. As a second contribution, we describe how a legged robot equipped with a laser range finder can utilize the developed terrain model to plan and execute a path over rough terrain. We show how a motion planner can use the learned terrain model to plan a path to a goal location, using a terrain-specific cost model to accept or reject candidate footholds. To the best of our knowledge, this was the first legged robotics system to autonomously sense, plan, and traverse a terrain surface of this complexity. © 2009 Wiley Periodicals, Inc. [source]

Embedding a Gaussian discrete-time autoregressive moving average process in a Gaussian continuous-time autoregressive moving average process
JOURNAL OF TIME SERIES ANALYSIS, Issue 4 2007, Mituaki Huzii
Abstract: Embedding a discrete-time autoregressive moving average (DARMA) process in a continuous-time ARMA (CARMA) process has been discussed by many authors, who have considered the relationship between the autocovariance structures of continuous-time and related discrete-time processes. In this article, we treat the problem from a slightly different point of view, defining embedding more rigidly by taking account of the probability structure. We consider Gaussian processes. First, we summarize the necessary and sufficient condition for a DARMA process to be embeddable in a CARMA process. Secondly, we give a concrete condition under which a DARMA process is embeddable in a CARMA process; this condition is new and general. Thirdly, we present some special cases, including new examples, and show how embeddability can be examined for them.
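The fast Fourier transform technique mentioned in the multibody-sampling abstract above, like the Davies–Harte algorithm covered elsewhere on this page, rests on circulant embedding of the stationary covariance: embed the Toeplitz covariance matrix in a circulant one, diagonalize it with the FFT, and color white noise by the square roots of the eigenvalues. The following is a generic unconditional sketch with an illustrative AR(1)-type autocovariance, not the authors' code.

```python
import numpy as np

def sample_stationary_gaussian(acov, rng):
    """One sample path of a zero-mean stationary Gaussian process with
    autocovariance sequence acov[0..n-1], via circulant embedding + FFT."""
    n = len(acov)
    # Embed the Toeplitz covariance in a circulant of size 2n - 2.
    circ = np.concatenate([acov, acov[-2:0:-1]])
    lam = np.fft.fft(circ).real          # circulant eigenvalues
    if lam.min() < -1e-8 * lam.max():
        raise ValueError("embedding is not nonnegative definite")
    lam = np.maximum(lam, 0.0)
    m = len(circ)
    # Color complex white noise; the real part of y has the target covariance.
    z = rng.standard_normal(m) + 1j * rng.standard_normal(m)
    y = np.fft.fft(np.sqrt(lam) * z) / np.sqrt(m)
    return y.real[:n]

rng = np.random.default_rng(1)
acov = 0.8 ** np.arange(64)              # illustrative geometric autocovariance
paths = np.array([sample_stationary_gaussian(acov, rng) for _ in range(4000)])
```

Each draw costs O(n log n) instead of the O(n³) of a Cholesky factorization, which is what makes the FFT route attractive for long paths; the imaginary part of y supplies a second independent sample for free.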
[source]

High Moment Partial Sum Processes of Residuals in ARMA Models and Their Applications
JOURNAL OF TIME SERIES ANALYSIS, Issue 1 2007, Hao Yu
Abstract: In this article, we study high moment partial sum processes based on the residuals of a stationary autoregressive moving average (ARMA) model with known or unknown mean parameter. We show that they can be approximated in probability by the analogous processes obtained from the i.i.d. errors of the ARMA model. If an unknown mean parameter is used, there is an additional term that depends on the model parameters and a mean estimator; when properly normalized, this additional term vanishes, so the processes converge weakly to the same Gaussian processes as if the residuals were i.i.d. Applications to change-point problems and goodness-of-fit are considered, in particular cumulative sum statistics for testing changes in ARMA model structure and the Jarque–Bera omnibus statistic for testing normality of the unobservable error distribution of an ARMA model. [source]

Simulating a class of stationary Gaussian processes using the Davies–Harte algorithm, with application to long memory processes
JOURNAL OF TIME SERIES ANALYSIS, Issue 5 2003, Peter F. Craigmile
Abstract: We demonstrate that the fast and exact Davies–Harte algorithm is valid for simulating a certain class of stationary Gaussian processes: those with a negative autocovariance sequence at all non-zero lags. The result applies to well-known classes of long memory processes: Gaussian fractionally differenced (FD) processes, fractional Gaussian noise (fGn) and the nonstationary fractional Brownian motion (fBm). [source]

Prediction Variance and Information Worth of Observations in Time Series
JOURNAL OF TIME SERIES ANALYSIS, Issue 4 2000, Mohsen Pourahmadi
Abstract: The problem of developing measures of the worth of observations in time series has not received much attention in the literature.
Any meaningful measure of worth should naturally depend on the position of the observation as well as the objective of the analysis, namely parameter estimation or prediction of future values. We introduce a measure that quantifies the worth of a set of observations for the purpose of predicting outcomes of stationary processes. Worth is measured as the change in the information content of the entire past due to exclusion or inclusion of a set of observations, where information content is quantified by mutual information, the information-theoretic measure of dependency. For Gaussian processes, the measure of worth turns out to be the relative change in the prediction error variance due to exclusion or inclusion of a set of observations. We provide formulae for computing the predictive worth of a set of observations for Gaussian autoregressive moving-average processes. For non-Gaussian processes, however, a simple function of the entropy provides a lower bound for the variance of the prediction error, in the same manner that Fisher information provides a lower bound for the variance of an unbiased estimator via the Cramér–Rao inequality. Statistical estimation of this lower bound requires estimation of the entropy of a stationary time series. [source]
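For Gaussian processes, the worth measure in the Pourahmadi abstract above reduces to the relative change in prediction error variance, which can be computed directly from the autocovariance matrix. A small sketch for an AR(1) process (the AR(1) example and the helper name are illustrative choices, not the paper's): dropping the most recent observation inflates the one-step prediction error variance by exactly the factor 1 + φ².

```python
import numpy as np

def pred_error_variance(cov, past_idx, target_idx):
    """Error variance of the best linear predictor of X[target] from X[past]:
    the Schur complement cov[t,t] - c' S^{-1} c."""
    S = cov[np.ix_(past_idx, past_idx)]
    c = cov[past_idx, target_idx]
    return cov[target_idx, target_idx] - c @ np.linalg.solve(S, c)

phi, sigma2, n = 0.7, 1.0, 8
# Stationary AR(1) autocovariance: gamma(h) = sigma^2 phi^|h| / (1 - phi^2).
h = np.abs(np.subtract.outer(np.arange(n + 1), np.arange(n + 1)))
cov = sigma2 * phi**h / (1 - phi**2)

full = list(range(n))                            # predict X[n] from X[0..n-1]
v_full = pred_error_variance(cov, full, n)       # = sigma^2 (Markov property)
v_drop = pred_error_variance(cov, full[:-1], n)  # exclude the most recent value
# Relative worth of X[n-1] for prediction: v_drop / v_full = 1 + phi^2.
```

The same Schur-complement computation handles any set of excluded observations, which is how the relative-change measure extends beyond this toy case.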