Empirical Application
Selected Abstracts

Benefit-Cost Analysis of Addiction Treatment: Methodological Guidelines and Empirical Application Using the DATCAP and ASI
HEALTH SERVICES RESEARCH, Issue 2 2002
Michael T. French

Objective. To provide detailed methodological guidelines for using the Drug Abuse Treatment Cost Analysis Program (DATCAP) and Addiction Severity Index (ASI) in a benefit-cost analysis of addiction treatment.

Data Sources/Study Setting. A representative benefit-cost analysis of three outpatient programs was conducted to demonstrate the feasibility and value of the methodological guidelines.

Study Design. Procedures are outlined for using resource use and cost data collected with the DATCAP. Techniques are described for converting outcome measures from the ASI to economic (dollar) benefits of treatment. Finally, principles are advanced for conducting a benefit-cost analysis and a sensitivity analysis of the estimates.

Data Collection/Extraction Methods. The DATCAP was administered at three outpatient drug-free programs in Philadelphia, PA, for two consecutive fiscal years (1996 and 1997). The ASI was administered to a sample of 178 treatment clients at treatment entry and at 7 months postadmission.

Principal Findings. The DATCAP and ASI appear to have significant potential for contributing to an economic evaluation of addiction treatment. The benefit-cost analysis and subsequent sensitivity analysis all showed that total economic benefit was greater than total economic cost at the three outpatient programs, but this representative application is meant to stimulate future economic research rather than to justify treatment per se.

Conclusions. This study used previously validated, research-proven instruments and methods to perform a practical benefit-cost analysis of real-world treatment programs. The study demonstrates one way to combine economic and clinical data and offers a methodological foundation for future economic evaluations of addiction treatment.
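Once per-client costs and monetized benefits have been estimated, the benefit-cost and sensitivity calculations described in the abstract above reduce to simple arithmetic. A minimal sketch (the dollar figures are hypothetical, not the DATCAP/ASI estimates):

```python
def benefit_cost_summary(total_benefit, total_cost):
    """Net economic benefit and benefit-cost ratio for one program."""
    return total_benefit - total_cost, total_benefit / total_cost

def sensitivity(total_benefit, total_cost, scales=(0.75, 1.0, 1.25)):
    """Rescale the benefit estimate to see how robust the ratio is."""
    return {s: round(total_benefit * s / total_cost, 2) for s in scales}

# Hypothetical per-client figures in dollars.
net, ratio = benefit_cost_summary(total_benefit=11000.0, total_cost=2500.0)
table = sensitivity(11000.0, 2500.0)  # ratio under pessimistic/base/optimistic benefits
```

A ratio that stays above 1 across the whole sensitivity grid is the pattern the abstract reports for all three programs.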
[source]

Food trade balances and unit values: What can they reveal about price competition?
AGRIBUSINESS: AN INTERNATIONAL JOURNAL, Issue 1 2002
Mark J. Gehlhar

Price competition is a fundamental assumption in modeling trade. Empirical applications often use unit values as proxies for price. This is a problem if unit values cannot explain trade flows in a manner consistent with the price competition assumption. The paper determines whether this condition exists in food product trade. Trade balances by product are used to indicate successful competition in trade. Export and import unit values are used to determine if competition is dominated by price or nonprice competition. Trade flows are then categorized in four ways: successful price competition, unsuccessful price competition, successful nonprice competition, and unsuccessful nonprice competition. This categorization is applied to 372 food products using the Standard International Trade Classification. Nearly 40% of U.S. food exports could be characterized as dominated by nonprice competition. In those instances, we contend that unit values are not valid proxies for price, thereby limiting their usefulness in traditional import demand estimation and trade policy simulation models. © 2002 Wiley Periodicals, Inc. [source]

Forecasting real-time data allowing for data revisions
JOURNAL OF FORECASTING, Issue 6 2007
Kosei Fukuda

Abstract A modeling approach to real-time forecasting that allows for data revisions is shown. In this approach, an observed time series is decomposed into stochastic trend, data revision, and observation noise in real time. It is assumed that the stochastic trend is defined such that its first difference is specified as an AR model, and that the data revision, obtained only for the latest part of the time series, is also specified as an AR model. The proposed method is applicable to a data set with one vintage.
Empirical applications to real-time forecasting of quarterly time series of US real GDP and its eight components are shown to illustrate the usefulness of the proposed approach. Copyright © 2007 John Wiley & Sons, Ltd. [source]

A Model for Evaluating Organizational Competencies: An Application in the Context of a Quality Management Initiative
DECISION SCIENCES, Issue 2 2005
Ana Belén Escrig-Tena

ABSTRACT Despite the important contributions made by the Competency-Based Perspective (CBP) to strategic thought, certain issues in the operational definition of the theoretical concepts that characterize this approach remain unresolved, thus limiting its empirical application. In addressing this issue, the present study puts forward a procedure for measuring the competencies that can be developed in association with a Quality Management (QM) initiative and analyzes the reliability and validity of the resulting scale. This procedure could be transferred to studies that aim to carry out an empirical analysis based on the theoretical position of the CBP. [source]

A Parametric Approach to Flexible Nonlinear Inference
ECONOMETRICA, Issue 3 2001
James D. Hamilton

This paper proposes a new framework for determining whether a given relationship is nonlinear, what the nonlinearity looks like, and whether it is adequately described by a particular parametric model. The paper studies a regression or forecasting model of the form y_t = μ(x_t) + ε_t, where the functional form of μ(·) is unknown. We propose viewing μ(·) itself as the outcome of a random process. The paper introduces a new stationary random field m(·) that generalizes finite-differenced Brownian motion to a vector field and whose realizations could represent a broad class of possible forms for μ(·). We view the parameters that characterize the relation between a given realization of m(·) and the particular value of μ(·) for a given sample as population parameters to be estimated by maximum likelihood or Bayesian methods.
We show that the resulting inference about the functional relation also yields consistent estimates for a broad class of deterministic functions μ(·). The paper further develops a new test of the null hypothesis of linearity based on the Lagrange multiplier principle and small-sample confidence intervals based on numerical Bayesian methods. An empirical application suggests that properly accounting for the nonlinearity of the inflation-unemployment trade-off may explain the previously reported uneven empirical success of the Phillips Curve. [source]

Competition Tests with a Non-Structural Model: the Panzar–Rosse Method Applied to Germany's Savings Banks
GERMAN ECONOMIC REVIEW, Issue 1 2009
Horst Gischer

Banking; competition; market behaviour

Abstract. In this paper we adopt the Panzar–Rosse approach to assess the competitive conditions in the German banking market for the period from 1993 to 2002. We suggest several improvements to the empirical application of the approach and show that frequently used empirical models that apply price rather than revenue functions lead to biased results. Using disaggregated annual data from more than 400 savings banks (Sparkassen), the empirical findings indicate monopolistic competition; the cases of monopoly and perfect competition are strongly rejected. Furthermore, small banks seem to enjoy even more market power than larger institutions. [source]

OPTIMAL FORECAST COMBINATION UNDER REGIME SWITCHING*
INTERNATIONAL ECONOMIC REVIEW, Issue 4 2005
Graham Elliott

This article proposes a new forecast combination method that lets the combination weights be driven by regime switching in a latent state variable. An empirical application that combines forecasts from survey data and time series models finds that the proposed regime switching combination scheme performs well for a variety of macroeconomic variables.
Monte Carlo simulations shed light on the type of data-generating processes for which the proposed combination method can be expected to perform better than a range of alternative combination schemes. Finally, we show how time variations in the combination weights arise when the target variable and the predictors share a common factor structure driven by a hidden Markov process. [source]

Decomposing the Value of Agricultural Multifunctionality: Combining Contingent Valuation and the Analytical Hierarchy Process
JOURNAL OF AGRICULTURAL ECONOMICS, Issue 2 2007
Zein Kallas

Q18; Q11; Q25

Abstract Agricultural multifunctionality is the recognition of the joint exercise of economic, environmental and social functions by this sector. Nevertheless, not all these contributions to society are valued in markets; moreover, a large share of them are public goods. For this reason, in order to make this concept of multifunctionality operative for the design of public policies, it is necessary to estimate the social demand for such functions. The objective of this article was to implement an empirical application along these lines. For this purpose, the agricultural system of cereal steppes in Tierra de Campos in Spain is taken as a case study. The economic valuation technique used relies on a combined implementation of contingent valuation and the analytical hierarchy process. The results obtained demonstrate the existence of a significant demand for the different attributes included in the multifunctionality concept, although this demand is heterogeneous and depends on the socioeconomic characteristics of individual persons. [source]

A semiparametric model for binary response and continuous outcomes under index heteroscedasticity
JOURNAL OF APPLIED ECONOMETRICS, Issue 5 2009
Roger Klein

This paper formulates a likelihood-based estimator for a double-index, semiparametric binary response equation.
A novel feature of this estimator is that it is based on density estimation under local smoothing. While the proofs differ from those based on alternative density estimators, the finite sample performance of the estimator is significantly improved. As binary responses often appear as endogenous regressors in continuous outcome equations, we also develop an optimal instrumental variables estimator in this context. For this purpose, we specialize the double-index model for binary response to one with heteroscedasticity that depends on an index different from that underlying the 'mean response'. We show that such (multiplicative) heteroscedasticity, whose form is not parametrically specified, effectively induces exclusion restrictions on the outcomes equation. The estimator developed exploits such identifying information. We provide simulation evidence on the favorable performance of the estimators and illustrate their use through an empirical application on the determinants, and effect, of attendance at a government-financed school. Copyright © 2009 John Wiley & Sons, Ltd. [source]

Asymmetric power distribution: Theory and applications to risk measurement
JOURNAL OF APPLIED ECONOMETRICS, Issue 5 2007
Ivana Komunjer

Theoretical literature in finance has shown that the risk of financial time series can be well quantified by their expected shortfall, also known as the tail value-at-risk. In this paper, I construct a parametric estimator for the expected shortfall based on a flexible family of densities, called the asymmetric power distribution (APD). The APD family extends the generalized power distribution to cases where the data exhibit asymmetry. The first contribution of the paper is to provide a detailed description of the properties of an APD random variable, such as its quantiles and expected shortfall.
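As a side note, expected shortfall also has a model-free historical counterpart, often used as a benchmark for parametric estimators such as the APD-based one: average the losses beyond the empirical tail quantile. A sketch on toy data (this is not the paper's estimator):

```python
def expected_shortfall(returns, alpha=0.05):
    """Historical expected shortfall: mean of the worst alpha-share of losses."""
    losses = sorted((-r for r in returns), reverse=True)  # biggest losses first
    k = max(1, int(len(losses) * alpha))                  # size of the tail
    return sum(losses[:k]) / k

# Toy sample: two bad days among twenty.
sample = [-0.10, -0.05] + [0.01] * 18
es = expected_shortfall(sample, alpha=0.10)  # mean of the two largest losses
```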
The second contribution of the paper is to derive the asymptotic distribution of the APD maximum likelihood estimator (MLE) and construct a consistent estimator for its asymptotic covariance matrix. The latter is based on the APD score, whose analytic expression is also provided. A small Monte Carlo experiment examines the small sample properties of the MLE and the empirical coverage of its confidence intervals. An empirical application to four daily financial market series reveals that returns tend to be asymmetric, with innovations which cannot be modeled by either the Laplace (double-exponential) or the Gaussian distribution, even if we allow the latter to be asymmetric. In an out-of-sample exercise, I compare the performances of the expected shortfall forecasts based on the APD-GARCH, Skew-t-GARCH and GPD-EGARCH models. While the GPD-EGARCH 1% expected shortfall forecasts seem to outperform the competitors, all three models perform equally well at forecasting the 5% and 10% expected shortfall. Copyright © 2007 John Wiley & Sons, Ltd. [source]

On detrending and cyclical asymmetry
JOURNAL OF APPLIED ECONOMETRICS, Issue 3 2003
Zacharias Psaradakis

This paper considers the issue of testing for symmetry of the business cycle. It is demonstrated that findings of symmetry should be interpreted with caution, since tests tend to have low power to detect asymmetries when applied to data that have been filtered to isolate their stationary business-cycle component. This implies that asymmetries are likely to be detected in practice only when they are particularly prominent. An empirical application examines the properties of the cyclical component of real GDP for the G7 countries. Copyright © 2002 John Wiley & Sons, Ltd. [source]

Parametric and semiparametric estimation of sample selection models: an empirical application to the female labour force in Portugal
JOURNAL OF APPLIED ECONOMETRICS, Issue 1 2001
Maria Fraga O. Martins

This paper applies both parametric and semiparametric methods to the estimation of wage and participation equations for married women in Portugal. The semiparametric estimators considered are the two-stage estimators proposed by Newey (1991) and Andrews and Schafgans (1998). The selection equation results are compared using the specification tests proposed by Horowitz (1993) and Horowitz and Härdle (1994), and the wage equation results are compared using a Hausman test. Significant differences between the two approaches indicate that the standard parametric methods are inappropriate for the estimation of the model and for the purpose of policy simulations. The greatest departure seems to occur in the range of low values of the index, corresponding to a specific group of women. Copyright © 2001 John Wiley & Sons, Ltd. [source]

Vertical price leadership: A cointegration analysis
AGRIBUSINESS: AN INTERNATIONAL JOURNAL, Issue 3 2002
W. Erno Kuiper

Here we detail a method to test whether or not retailers allow suppliers to set the wholesale price not only on the basis of the costs faced by the suppliers but also on the basis of consumer demand. Using standard theory, long-run price relationships between the stages in the channel are derived. Next, these static price relationships are imposed on a dynamic model to be tested for cointegration and long-run noncausality, embedding the hypotheses on vertical price leadership. To derive the testable implications of these hypotheses, we show that the common stochastic trend and long-run equilibrium error must explicitly be assigned to variables in the channel model. The model is particularly relevant for industries characterized by a low degree of product differentiation. An empirical application to two Dutch marketing channels for food products gives comprehensible results. [EconLit citations: C32, L12, Q11] © 2002 Wiley Periodicals, Inc.
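The cointegration logic in the channel-pricing abstract above can be illustrated with a self-contained Engle–Granger-style two-step sketch on simulated data. The data-generating process and all numbers are invented for illustration; the paper's actual tests additionally involve long-run noncausality restrictions.

```python
import random

def ols(x, y):
    """Simple bivariate OLS: returns (intercept, slope)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((a - mx) * (b - my) for a, b in zip(x, y)) / sum((a - mx) ** 2 for a in x)
    return my - slope * mx, slope

random.seed(42)
wholesale = [10.0]
for _ in range(499):                       # wholesale price follows a stochastic trend
    wholesale.append(wholesale[-1] + random.gauss(0.0, 0.2))
retail = [1.5 + 1.2 * w + random.gauss(0.0, 0.3) for w in wholesale]  # long-run markup rule

a, b = ols(wholesale, retail)              # step 1: estimate the long-run price relation
resid = [r - a - b * w for r, w in zip(retail, wholesale)]
_, rho = ols(resid[:-1], resid[1:])        # step 2: AR(1) coefficient of the residual
# rho far below 1 suggests a stationary equilibrium error, i.e. cointegration
```

In practice one would use a proper residual-based test with its nonstandard critical values rather than eyeballing the AR(1) coefficient.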
[source]

Forecast accuracy and economic gains from Bayesian model averaging using time-varying weights
JOURNAL OF FORECASTING, Issue 1-2 2010
Lennart Hoogerheide

Abstract Several Bayesian model combination schemes, including some novel approaches that simultaneously allow for parameter uncertainty, model uncertainty and robust time-varying model weights, are compared in terms of forecast accuracy and economic gains using financial and macroeconomic time series. The results indicate that the proposed time-varying model weight schemes outperform other combination schemes in terms of predictive and economic gains. In an empirical application using returns on the S&P 500 index, time-varying model weights provide improved forecasts with substantial economic gains in an investment strategy including transaction costs. Another empirical example refers to forecasting US economic growth over the business cycle. It suggests that time-varying combination schemes may be very useful in business cycle analysis and forecasting, as these may provide an early indicator for recessions. Copyright © 2009 John Wiley & Sons, Ltd. [source]

Combining forecasts using optimal combination weight and generalized autoregression
JOURNAL OF FORECASTING, Issue 5 2008
Jeong-Ryeol Kurz-Kim

Abstract In this paper, we consider a combined forecast using an optimal combination weight in a generalized autoregression framework. The generalized autoregression provides not only a combined forecast but also an optimal combination weight for combining forecasts. By simulation, we find that short- and medium-horizon (as well as partly long-horizon) forecasts from the generalized autoregression using the optimal combination weight are more efficient than those from the usual autoregression in terms of the mean-squared forecast error. An empirical application with US gross domestic product confirms the simulation result. Copyright © 2008 John Wiley & Sons, Ltd.
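For two competing forecasts, the classical variance-minimizing (Bates–Granger) combination weight, the starting point that regime-switching and generalized-autoregression schemes like those above extend, can be sketched as follows (the error series are toy numbers):

```python
def optimal_weight(e1, e2):
    """Weight on forecast 1 that minimizes the variance of w*e1 + (1-w)*e2."""
    n = len(e1)
    m1, m2 = sum(e1) / n, sum(e2) / n
    v1 = sum((a - m1) ** 2 for a in e1) / n          # error variance of model 1
    v2 = sum((b - m2) ** 2 for b in e2) / n          # error variance of model 2
    c = sum((a - m1) * (b - m2) for a, b in zip(e1, e2)) / n  # error covariance
    return (v2 - c) / (v1 + v2 - 2 * c)

# Toy forecast errors: model 2 is noisier and uncorrelated with model 1.
w = optimal_weight([1.0, -1.0, 1.0, -1.0], [2.0, -2.0, -2.0, 2.0])

def combined(f1, f2):
    """Combined point forecast using the estimated weight."""
    return w * f1 + (1 - w) * f2
```

The more accurate forecast receives the larger weight; with correlated errors the weight can even fall outside [0, 1].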
[source]

Can panel data really improve the predictability of the monetary exchange rate model?
JOURNAL OF FORECASTING, Issue 5 2007
Joakim Westerlund

Abstract A common explanation for the inability of the monetary model to beat the random walk in forecasting future exchange rates is that conventional time series tests may have low power, and that panel data should generate more powerful tests. This paper provides an extensive evaluation of this power argument for the use of panel data in the forecasting context. In particular, by using simulations it is shown that although pooling of the individual prediction tests can lead to substantial power gains, pooling only the parameters of the forecasting equation, as has been suggested in the previous literature, does not seem to generate more powerful tests. The simulation results are illustrated through an empirical application. Copyright © 2007 John Wiley & Sons, Ltd. [source]

Robustness of alternative non-linearity tests for SETAR models
JOURNAL OF FORECASTING, Issue 3 2004
Wai-Sum Chan

Abstract In recent years there has been a growing interest in exploiting potential forecast gains from the non-linear structure of self-exciting threshold autoregressive (SETAR) models. Statistical tests have been proposed in the literature to help analysts check for the presence of SETAR-type non-linearities in an observed time series. It is important to study the power and robustness properties of these tests, since erroneous test results might lead to misspecified prediction problems. In this paper we investigate the robustness properties of several commonly used non-linearity tests. Both the robustness with respect to outlying observations and the robustness with respect to model specification are considered. The power comparison of these testing procedures is carried out using Monte Carlo simulation. The results indicate that all of the existing tests are not robust to outliers and model misspecification.
Finally, in an empirical application the statistical tests are applied to stock market returns of the four little dragons (Hong Kong, South Korea, Singapore and Taiwan) in East Asia. The non-linearity tests fail to provide consistent conclusions most of the time. The results in this article stress the need for a more robust test for SETAR-type non-linearity in time series analysis and forecasting. Copyright © 2004 John Wiley & Sons, Ltd. [source]

An outlier robust GARCH model and forecasting volatility of exchange rate returns
JOURNAL OF FORECASTING, Issue 5 2002
Beum-Jo Park

Abstract Since volatility is perceived as an explicit measure of risk, financial economists have long been concerned with accurate measures and forecasts of future volatility and, undoubtedly, the Generalized Autoregressive Conditional Heteroscedasticity (GARCH) model has been widely used for doing so. It appears, however, from some empirical studies that the GARCH model tends to provide poor volatility forecasts in the presence of additive outliers. To overcome this forecasting limitation, this paper proposes a robust GARCH model (RGARCH) using least absolute deviation estimation and introduces a valuable estimation method from a practical point of view. Extensive Monte Carlo experiments substantiate our conjectures. As the magnitude of the outliers increases, the one-step-ahead forecasting performance of the RGARCH model shows a more significant improvement in two forecast evaluation criteria over both the standard GARCH and random walk models. Strong evidence in favour of the RGARCH model over other competitive models is based on empirical application. By using a sample of two daily exchange rate series, we find that the out-of-sample volatility forecasts of the RGARCH model are apparently superior to those of other competitive models. Copyright © 2002 John Wiley & Sons, Ltd. [source]

Testing Stochastic Cycles in Macroeconomic Time Series
JOURNAL OF TIME SERIES ANALYSIS, Issue 4 2001
L. A. Gil-Alana

A particular version of the tests of Robinson (1994) for testing stochastic cycles in macroeconomic time series is proposed in this article. The tests have a standard limit distribution and are easy to implement on raw time series. A Monte Carlo experiment is conducted, studying the size and the power of the tests against different alternatives, and the results are compared with those based on other tests. An empirical application using historical US annual data is also carried out at the end of the article. [source]

Testing for Error Correction in Panel Data
OXFORD BULLETIN OF ECONOMICS & STATISTICS, Issue 6 2007
Joakim Westerlund

Abstract This paper proposes new error correction-based cointegration tests for panel data. The limiting distributions of the tests are derived and critical values provided. Our simulation results suggest that the tests have good small-sample properties, with small size distortions and high power relative to other popular residual-based panel cointegration tests. In our empirical application, we present evidence suggesting that international healthcare expenditures and GDP are cointegrated once the possibility of an invalid common factor restriction has been accounted for. [source]

STRATEGY, STRUCTURE AND PROCESS IN THE PUBLIC SECTOR: A TEST OF THE MILES AND SNOW MODEL
PUBLIC ADMINISTRATION, Issue 4 2009
RHYS ANDREWS

We present a comprehensive empirical application of the Miles and Snow (1978) model of organizational strategy, structure and process to the public sector. We refine the model by distinguishing between strategy formulation and implementation, and apply it to 90 public service organizations. Although the empirical evidence shows that organizational strategies fit the Miles and Snow categories of prospector, defender and reactor, the relationship between these strategies and organizational structures (for example, centralization) and processes (for example, planning) is less consistent with their model.
Conclusions are drawn for public management theory and practice. [source]

Selection Bias and Continuous-Time Duration Models: Consequences and a Proposed Solution
AMERICAN JOURNAL OF POLITICAL SCIENCE, Issue 1 2006
Frederick J. Boehmke

This article analyzes the consequences of nonrandom sample selection for continuous-time duration analyses and develops a new estimator to correct for it when necessary. We conduct a series of Monte Carlo analyses that estimate common duration models as well as our proposed duration model with selection. These simulations show that ignoring sample selection issues can lead to biased parameter estimates, including the appearance of (nonexistent) duration dependence. In addition, our proposed estimator is found to be superior in root mean-square error terms when nontrivial amounts of selection are present. Finally, we provide an empirical application of our method by studying whether self-selectivity is a problem for studies of leaders' survival during and following militarized conflicts. [source]

Improving robust model selection tests for dynamic models
THE ECONOMETRICS JOURNAL, Issue 2 2010
Hwan-sik Choi

Summary We propose an improved model selection test for dynamic models using a new asymptotic approximation to the sampling distribution of a new test statistic. The model selection test is applicable to dynamic models with very general selection criteria and estimation methods. Since our test statistic does not assume the exact form of a true model, the test is essentially non-parametric once competing models are estimated. For the unknown serial correlation in the data, we use a heteroscedasticity/autocorrelation-consistent (HAC) variance estimator, and the sampling distribution of the test statistic is approximated by the fixed-b asymptotic approximation. The asymptotic approximation depends on the kernel functions and bandwidth parameters used in HAC estimators.
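The kernel and bandwidth choices referred to above can be made concrete with a minimal Bartlett-kernel (Newey–West-type) HAC estimate of the long-run variance of a series. This is a simplified sketch of the standard estimator, not the paper's fixed-b procedure:

```python
def hac_variance(x, bandwidth):
    """Bartlett-kernel HAC estimate of the long-run variance of x."""
    n = len(x)
    m = sum(x) / n

    def gamma(j):
        # Sample autocovariance at lag j.
        return sum((x[t] - m) * (x[t - j] - m) for t in range(j, n)) / n

    v = gamma(0)
    for j in range(1, bandwidth + 1):
        # Bartlett weights decline linearly to zero at bandwidth + 1.
        v += 2.0 * (1.0 - j / (bandwidth + 1)) * gamma(j)
    return v

v = hac_variance([1.0, 2.0, 3.0, 4.0], bandwidth=1)
```

With bandwidth 0 the estimator collapses to the ordinary sample variance; larger bandwidths admit more autocovariance terms, which is exactly where the fixed-b asymptotics of the paper depart from the standard normal approximation.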
We compare the finite sample performance of the new test with the bootstrap methods as well as with the standard normal approximation, and show that the fixed-b asymptotics and the bootstrap methods are markedly superior to the standard normal approximation for a moderate sample size for time series data. An empirical application to foreign exchange rate forecasting models is presented, and the results show that the normal approximation to the distribution of the test statistic appears to overstate the data's ability to distinguish between two competing models. [source]

Consistent estimation of binary-choice panel data models with heterogeneous linear trends
THE ECONOMETRICS JOURNAL, Issue 2 2006
Alban Thomas

Summary This paper presents an extension of fixed effects binary choice models for panel data to the case of heterogeneous linear trends. Two estimators are proposed: a logit estimator based on double conditioning and a semiparametric, smoothed maximum score estimator based on double differences. We investigate the small-sample properties of these estimators with a Monte Carlo simulation experiment, and compare their statistical properties with standard fixed effects procedures. An empirical application to land renting decisions of Russian households between 1996 and 2002 is proposed. [source]

Discrete choice and stochastic utility maximization
THE ECONOMETRICS JOURNAL, Issue 1 2003
Ruud H. Koning

Discrete choice models are usually derived from the assumption of random utility maximization. We consider the reverse problem: whether choice probabilities are consistent with maximization of random utilities. This leads to tests that consider the variation of these choice probabilities with the average utilities of the alternatives. By restricting the range of the average utilities, we obtain a sequence of tests with fewer maintained assumptions.
In an empirical application, even the test with the fewest maintained assumptions rejects the hypothesis of random utility maximization. [source]

A Gaussian approach for continuous time models of the short-term interest rate
THE ECONOMETRICS JOURNAL, Issue 2 2001
Jun Yu

This paper proposes a Gaussian estimator for nonlinear continuous time models of the short-term interest rate. The approach is based on a stopping time argument that produces a normalizing transformation facilitating the use of a Gaussian likelihood. A Monte Carlo study shows that the finite-sample performance of the proposed procedure offers an improvement over the discrete approximation method proposed by Nowman (1997). An empirical application to US and British interest rates is given. [source]

Non-monotonic hazard functions and the autoregressive conditional duration model
THE ECONOMETRICS JOURNAL, Issue 1 2000
Joachim Grammig

This paper shows that the monotonicity of the conditional hazard in traditional ACD models is both econometrically important and empirically invalid. To counter this problem we introduce a more flexible parametric model which is easy to fit and performs well both in simulation studies and in practice. In an empirical application to NYSE price duration processes, we show that non-monotonic conditional hazard functions are indicated for all stocks. Recently proposed specification tests for financial duration models clearly reject the standard ACD models, whereas the results for the new model are quite favorable. [source]

Exchange rate uncertainty and employment: an algorithm describing 'play'
APPLIED STOCHASTIC MODELS IN BUSINESS AND INDUSTRY, Issue 2 2001
Ansgar Belke

Abstract The paper deals with the impacts of exchange rate uncertainty on the relationship between macroeconomic labour market variables. Under uncertainty, areas of weak reactions, so-called 'play' areas, have to be considered at the macrolevel.
The width of the play area is a positive function of the degree of uncertainty. When changes go beyond the play area, suddenly strong reactions ('spurts') occur. These non-linear dynamics are captured in a simplified linearized way. An algorithm describing linear play hysteresis is developed and implemented in a regression framework. As an empirical application, the exchange rate impacts on German employment are analysed considering play effects. Copyright © 2001 John Wiley & Sons, Ltd. [source]

Demand Diversification Under Uncertainty and Market Power
ASIAN ECONOMIC JOURNAL, Issue 4 2001
John J. Y. Seo

This paper justifies theoretically and empirically the diversification behaviour of an importing firm when it chooses the mixture of potentially differentiated products of its major input under price uncertainty. The paper investigates an equilibrium relationship among three key explanatory variables: the expected price, the systematic risk of price, and the monopolistic market power of the suppliers in the market. The theoretical section shows that there exists a conflict between the risk-diversification effect and the agent's preference over certain products when the importer chooses the vector of optimal quantity shares. The latter effect may disturb or even dominate the former, which can be represented in an equilibrium relationship similar to the framework of the CAPM. As an empirical application, the Chinese wheat import market is examined and analysed to answer the questions raised by the basic statistics. JEL classification: F12; F14; L22 [source]

Economic drought management index to evaluate water institutions' performance under uncertainty
AUSTRALIAN JOURNAL OF AGRICULTURAL & RESOURCE ECONOMICS, Issue 1 2007
Eva Iglesias

Reservoir management and intertemporal water allocation are critical issues in semiarid regions where agriculture has to confront highly variable rainfall patterns.
In this paper, we derive and propose an economic drought management index (EDMI) to evaluate water institutions' performance in coping with drought risk. The EDMI is based on the optimality conditions of a stochastic dynamic optimisation problem that characterises reservoir management. The index's main advantages are its ease of interpretation and breadth of scope, as it incorporates information on hydrological processes, structural constraints, water institutions' rules, and the economic benefits of water use. An empirical application is developed to assess the institutional rules governing water allocation in two different supply systems in Andalusia (southern Spain). [source]
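The kind of stochastic dynamic optimisation problem that underlies an index like the EDMI can be sketched with a toy reservoir model solved by value iteration. All states, inflow probabilities and the benefit function below are invented for illustration; they stand in for the hydrological and institutional detail of the actual application.

```python
import math

STOCKS = range(11)                          # reservoir storage levels, toy units
INFLOWS = [(0, 0.3), (2, 0.5), (4, 0.2)]    # (inflow, probability): stylized drought risk
BETA = 0.95                                 # discount factor

def solve(iterations=200):
    """Value iteration for the optimal water release policy."""
    value = {s: 0.0 for s in STOCKS}
    policy = {s: 0 for s in STOCKS}
    for _ in range(iterations):
        new_value = {}
        for s in STOCKS:
            best_val, best_release = -1.0, 0
            for release in range(s + 1):    # cannot release more water than is stored
                expected = sum(p * value[min(10, s - release + q)] for q, p in INFLOWS)
                val = math.sqrt(release) + BETA * expected  # concave benefit of water use
                if val > best_val:
                    best_val, best_release = val, release
            new_value[s] = best_val
            policy[s] = best_release
        value = new_value
    return value, policy

value, policy = solve()
```

The optimality conditions of this Bellman problem are the ingredients an EDMI-style index would compare against the release rules that water institutions actually follow.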