Block Bootstrap (block + bootstrap)

Selected Abstracts


Bootstrap Methods for Markov Processes

ECONOMETRICA, Issue 4 2003
Joel L. Horowitz
The block bootstrap is the best-known bootstrap method for time-series data when the analyst does not have a parametric model that reduces the data generation process to simple random sampling. However, the errors made by the block bootstrap converge to zero only slightly faster than those made by first-order asymptotic approximations. This paper describes a bootstrap procedure for data that are generated by a Markov process, or by a process that can be approximated by a Markov process with sufficient accuracy. The procedure is based on estimating the Markov transition density nonparametrically. Bootstrap samples are obtained by sampling the process implied by the estimated transition density. Conditions are given under which the errors made by the Markov bootstrap converge to zero more rapidly than those made by the block bootstrap. [source]
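The sampling step lends itself to a short illustration. Below is a minimal sketch of a smoothed Markov bootstrap in Python, assuming a first-order Markov series, a Gaussian kernel, and a rule-of-thumb bandwidth; the function name and these particular choices are illustrative stand-ins, not the paper's exact estimator.

    import numpy as np

    def markov_bootstrap(x, n_boot=None, bandwidth=None, rng=None):
        """Draw one bootstrap path from a kernel estimate of the lag-one
        transition distribution of the observed series x."""
        rng = np.random.default_rng() if rng is None else rng
        x = np.asarray(x, dtype=float)
        n = len(x)
        n_boot = n if n_boot is None else n_boot
        # Rule-of-thumb bandwidth (illustrative choice, not from the paper)
        h = 1.06 * x.std() * n ** (-0.2) if bandwidth is None else bandwidth
        path = np.empty(n_boot)
        path[0] = rng.choice(x)  # initialize from the marginal distribution
        for t in range(1, n_boot):
            # Weight each observed transition x[i] -> x[i+1] by how close
            # x[i] is to the current bootstrap state, then draw a successor.
            w = np.exp(-0.5 * ((x[:-1] - path[t - 1]) / h) ** 2)
            w /= w.sum()
            j = rng.choice(n - 1, p=w)
            path[t] = x[j + 1] + h * rng.standard_normal()  # smoothed draw
        return path

The smoothed redraw in the last step keeps the bootstrap process from merely revisiting observed values, which is in the spirit of sampling from an estimated transition density rather than from the empirical transitions alone.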


A neural network versus Black–Scholes: a comparison of pricing and hedging performances

JOURNAL OF FORECASTING, Issue 4 2003
Henrik Amilon
Abstract. (An Erratum has been published for this article in Journal of Forecasting 22(6-7) 2003, 551.) The Black–Scholes formula is a well-known model for pricing and hedging derivative securities. It relies, however, on several highly questionable assumptions. This paper examines whether a neural network (MLP) can be used to find a call option pricing formula that corresponds better to market prices and the properties of the underlying asset than the Black–Scholes formula. The neural network method is applied to the out-of-sample pricing and delta-hedging of daily Swedish stock index call options from 1997 to 1999. The paper further stresses the relevance of hedging analysis. As benchmarks, the Black–Scholes model with historical and implied volatility estimates is used. Comparisons reveal that the neural network models outperform the benchmarks in both pricing and hedging performance. A moving block bootstrap is used to test the statistical significance of the results. Although the neural networks are superior, the results are sometimes insignificant at the 5% level. Copyright © 2003 John Wiley & Sons, Ltd. [source]
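As a concrete reference point for the significance test mentioned above, here is a minimal moving block bootstrap sketch in Python (NumPy assumed). The block length and the per-day hedging-loss difference in the usage comment are hypothetical, not taken from the paper.

    import numpy as np

    def moving_block_bootstrap(series, block_len, n_boot, rng=None):
        """Resample a series by concatenating randomly chosen overlapping
        blocks of fixed length, preserving short-range dependence."""
        rng = np.random.default_rng() if rng is None else rng
        series = np.asarray(series, dtype=float)
        n = len(series)
        n_blocks = -(-n // block_len)  # ceiling division
        starts = rng.integers(0, n - block_len + 1, size=(n_boot, n_blocks))
        idx = starts[:, :, None] + np.arange(block_len)  # expand into blocks
        return series[idx.reshape(n_boot, -1)[:, :n]]    # trim to length n

    # Hypothetical usage: diff = per-day hedging loss of Black-Scholes minus
    # that of the network; bootstrap the centered series to get a p-value
    # for the null of no mean difference.
    # boot = moving_block_bootstrap(diff - diff.mean(), block_len=20, n_boot=9999)
    # p_value = (boot.mean(axis=1) >= diff.mean()).mean()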


Kernel matching scheme for block bootstrap of time series data

JOURNAL OF TIME SERIES ANALYSIS, Issue 2 2004
Tae Yoon Kim
Abstract. The block bootstrap for time series consists of randomly resampling blocks of the original data with replacement and aligning these blocks into a bootstrap sample. Recently, several matching schemes for the block bootstrap have been suggested to improve its performance by reducing bias [Bernoulli 4 (1998), 305]. The matching schemes typically align, with higher likelihood, those blocks which match at their ends. The kernel matching scheme considered here takes some of the dependence structure of the data into account and is based on a kernel estimate of the conditional lag-one distribution. In this article the transition probabilities of the kernel matching scheme are investigated in detail by concentrating on a simple case. The results establish theoretical properties of the transition probability matrix, including ergodicity, which shows the potential of the matching scheme for bias reduction. [source]
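A rough sketch of a matched block bootstrap follows, assuming a Gaussian kernel that compares the endpoint of the sample built so far with the value immediately preceding each candidate block; the kernel choice and rule-of-thumb bandwidth are illustrative stand-ins for the paper's kernel estimate of the conditional lag-one distribution.

    import numpy as np

    def matched_block_bootstrap(x, block_len, out_len, h=None, rng=None):
        """Block bootstrap in which the next block is drawn with probability
        proportional to a kernel weight between the current endpoint and the
        value just before each candidate block (a kernel matching scheme)."""
        rng = np.random.default_rng() if rng is None else rng
        x = np.asarray(x, dtype=float)
        n = len(x)
        h = 1.06 * x.std() * n ** (-0.2) if h is None else h
        starts = np.arange(1, n - block_len + 1)  # candidates with a predecessor
        out = list(x[rng.integers(0, n - block_len + 1) + np.arange(block_len)])
        while len(out) < out_len:
            # Blocks whose predecessor x[s-1] is close to the current endpoint
            # are aligned with higher likelihood, as described above.
            w = np.exp(-0.5 * ((x[starts - 1] - out[-1]) / h) ** 2)
            w /= w.sum()
            s = rng.choice(starts, p=w)
            out.extend(x[s:s + block_len])
        return np.asarray(out[:out_len])

Roughly speaking, the transition probabilities studied in the article correspond to the normalized weights w that govern which block follows which.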


A Three-step Method for Choosing the Number of Bootstrap Repetitions

ECONOMETRICA, Issue 1 2000
Donald W. K. Andrews
This paper considers the problem of choosing the number of bootstrap repetitions B for bootstrap standard errors, confidence intervals, confidence regions, hypothesis tests, p-values, and bias correction. For each of these problems, the paper provides a three-step method for choosing B to achieve a desired level of accuracy. Accuracy is measured by the percentage deviation of the bootstrap standard error estimate, confidence interval length, test's critical value, test's p-value, or bias-corrected estimate based on B bootstrap simulations from the corresponding ideal bootstrap quantities for which B = ∞. The results apply quite generally to parametric, semiparametric, and nonparametric models with independent and dependent data. The results apply to the standard nonparametric iid bootstrap, moving block bootstraps for time-series data, parametric and semiparametric bootstraps, and bootstraps for regression models based on bootstrapping residuals. Monte Carlo simulations show that the proposed methods work very well. [source]
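For the standard-error case, the flavor of the three-step logic can be sketched as follows: run a preliminary batch of repetitions, estimate a variability factor omega from those draws, and solve a normal-approximation bound on the percentage deviation for B. The factor (2 + excess kurtosis)/4 and the 10^4 scaling come from a delta-method calculation for a standard error measured in percent; treat this as a reconstruction of the idea, not a transcription of the paper's exact constants.

    import numpy as np
    from scipy.stats import kurtosis, norm

    def choose_B_for_se(prelim_stats, pdb=10.0, beta=0.05):
        """Pick B so that the bootstrap standard error based on B draws is
        within pdb percent of the ideal (B = infinity) value with
        probability about 1 - beta, using a normal approximation."""
        # Step 2: estimate omega from the preliminary bootstrap statistics;
        # for a standard error, omega = (2 + excess kurtosis) / 4.
        omega = (2.0 + kurtosis(prelim_stats, fisher=True, bias=False)) / 4.0
        # Step 3: invert z * sqrt(1e4 * omega / B) <= pdb for B.
        z = norm.ppf(1.0 - beta / 2.0)
        return int(np.ceil(1e4 * omega * z**2 / pdb**2))

    # Step 1 (illustrative): run, say, 500 preliminary bootstrap repetitions
    # to obtain prelim_stats, then rerun with the B returned above.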

