Marginal Likelihood (marginal + likelihood)
Selected Abstracts

Identification of continuous-time nonlinear systems by using a Gaussian process model
IEEJ TRANSACTIONS ON ELECTRICAL AND ELECTRONIC ENGINEERING, Issue 6 2008
Tomohiro Hachino, Member
Abstract This paper deals with nonparametric identification of continuous-time nonlinear systems by using a Gaussian process model. A genetic algorithm is applied to train the Gaussian process prior model by minimizing the negative log marginal likelihood of the identification data. The nonlinear term of the objective system is estimated as the predictive mean function of the Gaussian process, and the confidence measure of the estimated nonlinear function is given by the predictive covariance function of the Gaussian process. Copyright © 2008 Institute of Electrical Engineers of Japan. Published by John Wiley & Sons, Inc. [source]
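As a rough illustration of the training criterion in the abstract above, the sketch below evaluates the negative log marginal likelihood of a zero-mean Gaussian process with an RBF kernel and minimizes it over the hyperparameters. The kernel choice, the toy data, and the use of a Nelder-Mead search in place of the paper's genetic algorithm are assumptions made for this example; the function and variable names are illustrative and not taken from the paper.

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_marginal_likelihood(theta, X, y):
    """Negative log marginal likelihood of a zero-mean GP with an RBF kernel.

    theta = log([signal_variance, length_scale, noise_variance]); working on the
    log scale keeps all three parameters positive.
    """
    sig2, ell, noise2 = np.exp(theta)
    sqdist = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = sig2 * np.exp(-0.5 * sqdist / ell**2) + noise2 * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))   # K^{-1} y via the Cholesky factor
    return (0.5 * y @ alpha
            + np.sum(np.log(np.diag(L)))                  # 0.5 * log|K|
            + 0.5 * len(y) * np.log(2 * np.pi))

# Toy "identification data" (illustrative only)
rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(40)

# The paper trains the GP prior with a genetic algorithm; a gradient-free
# Nelder-Mead search stands in for it in this sketch.
res = minimize(neg_log_marginal_likelihood, x0=np.zeros(3), args=(X, y), method="Nelder-Mead")
print("estimated [signal variance, length scale, noise variance]:", np.exp(res.x))
```

Using the Cholesky factor gives both the quadratic form and the log-determinant in one decomposition, which is the usual numerically stable way to evaluate this quantity.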
A Bayesian Kepler periodogram detects a second planet in HD 208487
MONTHLY NOTICES OF THE ROYAL ASTRONOMICAL SOCIETY, Issue 4 2007
P. C. Gregory
ABSTRACT An automatic Bayesian Kepler periodogram has been developed for identifying and characterizing multiple planetary orbits in precision radial velocity data. The periodogram is powered by a parallel tempering Markov chain Monte Carlo (MCMC) algorithm which is capable of efficiently exploring a multiplanet model parameter space. The periodogram employs an alternative method for converting the time of an observation to true anomaly that enables it to handle much larger data sets without a significant increase in computation time. Improvements in the periodogram and further tests using data from HD 208487 have resulted in the detection of a second planet with a period of 909 (+82/−92) d, an eccentricity of 0.37 (+0.26/−0.20), a semimajor axis of 1.87 (+0.13/−0.14) au and an M sin i = 0.45 (+0.11/−0.13) MJ. The revised parameters of the first planet are period = 129.8 ± 0.4 d, eccentricity = 0.20 ± 0.09, semimajor axis = 0.51 ± 0.02 au and M sin i = 0.41 ± 0.05 MJ. Particular attention is paid to several methods for calculating the model marginal likelihood, which is used to compare the probabilities of models with different numbers of planets. [source]

ESTIMATION, PREDICTION AND INFERENCE FOR THE LASSO RANDOM EFFECTS MODEL
AUSTRALIAN & NEW ZEALAND JOURNAL OF STATISTICS, Issue 1 2009
Scott D. Foster
Summary The least absolute shrinkage and selection operator (LASSO) can be formulated as a random effects model with an associated variance parameter that can be estimated with other components of variance. In this paper, estimation of the variance parameters is performed by means of an approximation to the marginal likelihood of the observed outcomes. The approximation is based on an alternative but equivalent formulation of the LASSO random effects model. Predictions can be made using point summaries of the predictive distribution of the random effects given the data, with the parameters set to their estimated values. The standard LASSO method uses the mode of this distribution as the predictor. It is not the only choice, and a number of other possibilities are defined and empirically assessed in this article. The predictive mode is competitive with the predictive mean (best predictor), but no single predictor performs best in all situations. Inference for the LASSO random effects is performed using predictive probability statements, which are more appropriate under the random effects formulation than tests of hypothesis. [source]

Smooth Random Effects Distribution in a Linear Mixed Model
BIOMETRICS, Issue 4 2004
Wendimagegn Ghidey
Summary A linear mixed model with a smooth random effects density is proposed. An approach similar to the P-spline smoothing of Eilers and Marx (1996, Statistical Science 11, 89–121) is applied to yield a more flexible estimate of the random effects density. Our approach differs from theirs in that the B-spline basis functions are replaced by approximating Gaussian densities. Fitting the model involves maximizing a penalized marginal likelihood. The best penalty parameters minimize Akaike's Information Criterion, employing the results of Gray (1992, Journal of the American Statistical Association 87, 942–951). Although the method is applicable to any dimension of the random effects structure, in this article the two-dimensional case is explored. Our methodology is conceptually simple and relatively easy to fit in practice; it is applied to the cholesterol data first analyzed by Zhang and Davidian (2001, Biometrics 57, 795–802). A simulation study shows that our approach yields almost unbiased estimates of the regression and the smoothing parameters in small sample settings. Consistency of the estimates is shown in a particular case. [source]
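To make the basis-density idea in the last abstract concrete, the sketch below fits a density as a weighted mixture of equally spaced Gaussian basis densities by maximizing a penalized log-likelihood with a second-order difference penalty on the coefficients. It is a deliberate simplification, not the authors' method: it estimates a density directly from observed values rather than for random effects inside a linear mixed model, and the penalty parameter is fixed by hand instead of being chosen by AIC as in the paper; the toy data and all names are illustrative.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)
# Toy sample standing in for the (unobserved) random effects
b = np.concatenate([rng.normal(-2.0, 0.7, 150), rng.normal(1.5, 1.0, 150)])

# Equally spaced Gaussian basis densities (stand-ins for the B-spline basis)
mu = np.linspace(b.min() - 1, b.max() + 1, 25)
sd = 1.5 * (mu[1] - mu[0])
basis = norm.pdf(b[:, None], loc=mu[None, :], scale=sd)   # n x K matrix of phi(b_i; mu_k, sd)

D2 = np.diff(np.eye(len(mu)), n=2, axis=0)                # second-order difference matrix

def neg_pen_loglik(a, lam):
    w = np.exp(a - a.max()); w /= w.sum()                 # softmax -> nonnegative weights summing to 1
    dens = basis @ w                                      # f(b_i) = sum_k w_k * phi_k(b_i)
    penalty = lam * np.sum((D2 @ a) ** 2)                 # roughness penalty on the coefficients
    return -np.sum(np.log(dens)) + penalty

lam = 10.0                                                # fixed here; the paper selects it via AIC
res = minimize(neg_pen_loglik, np.zeros(len(mu)), args=(lam,), method="L-BFGS-B")
w_hat = np.exp(res.x - res.x.max()); w_hat /= w_hat.sum()

grid = np.linspace(b.min() - 1, b.max() + 1, 200)
f_hat = norm.pdf(grid[:, None], loc=mu[None, :], scale=sd) @ w_hat
print("estimated density on a grid (first 5 values):", f_hat[:5])
```

The softmax parametrization is just a convenient way to keep the mixture weights positive and summing to one in this sketch; larger values of the penalty parameter pull neighbouring coefficients together and hence smooth the estimated density.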