Parametric Approach

Selected Abstracts


A Parametric Approach to Flexible Nonlinear Inference

ECONOMETRICA, Issue 3 2001
James D. Hamilton
This paper proposes a new framework for determining whether a given relationship is nonlinear, what the nonlinearity looks like, and whether it is adequately described by a particular parametric model. The paper studies a regression or forecasting model of the form y_t = μ(x_t) + ε_t, where the functional form of μ(·) is unknown. We propose viewing μ(·) itself as the outcome of a random process. The paper introduces a new stationary random field m(·) that generalizes finite-differenced Brownian motion to a vector field and whose realizations could represent a broad class of possible forms for μ(·). We view the parameters that characterize the relation between a given realization of m(·) and the particular value of μ(·) for a given sample as population parameters to be estimated by maximum likelihood or Bayesian methods. We show that the resulting inference about the functional relation also yields consistent estimates for a broad class of deterministic functions μ(·). The paper further develops a new test of the null hypothesis of linearity based on the Lagrange multiplier principle and small-sample confidence intervals based on numerical Bayesian methods. An empirical application suggests that properly accounting for the nonlinearity of the inflation-unemployment trade-off may explain the previously reported uneven empirical success of the Phillips Curve. [source]
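
The inference Hamilton describes, treating the unknown conditional mean μ(·) as the realization of a random field, is close in spirit to Gaussian-process regression. Below is a minimal sketch of that idea; the squared-exponential kernel, its hyperparameters, and the simulated data are illustrative assumptions standing in for the paper's specific random field m(·).

```python
# Sketch: infer mu(.) in y_t = mu(x_t) + eps_t by placing a random-field
# (Gaussian-process) prior on mu and computing its posterior mean.
import numpy as np

def kernel(a, b, scale=1.0, length=1.0):
    """Squared-exponential covariance between point sets a and b (assumed)."""
    d = a[:, None] - b[None, :]
    return scale**2 * np.exp(-0.5 * (d / length) ** 2)

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(-3, 3, 60))            # regressors x_t
y = np.sin(x) + 0.3 * rng.standard_normal(60)  # y_t = mu(x_t) + eps_t

sigma2 = 0.3**2                                # noise variance of eps_t
K = kernel(x, x) + sigma2 * np.eye(len(x))     # prior covariance plus noise
xs = np.linspace(-3, 3, 200)                   # evaluation grid
Ks = kernel(xs, x)

# Posterior mean of mu(.) given the sample: E[mu(xs) | y].
mu_post = Ks @ np.linalg.solve(K, y)
print(mu_post[:5])
```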


Nominal Wage Rigidity in Contract Data: A Parametric Approach

ECONOMICA, Issue 280 2003
Louis N. Christofides
Using wage agreements reached in the Canadian unionized sector during 1976–99, a period of high as well as exceptionally low inflation, we consider how histograms of wage adjustment changed as inflation reached the low levels of the 1990s. The histograms and parametric tests suggest that wage adjustment is characterized by downward nominal rigidity and significant spikes at zero. There is some evidence of modest menu-cost effects. We examine whether the rigidity features of wage adjustment are sensitive to indexation provisions, and investigate whether the distinction between short and long contracts is useful. [source]


Downward Wage Rigidity in Europe: A New Flexible Parametric Approach and Empirical Results

GERMAN ECONOMIC REVIEW, Issue 2 2010
Andreas Behr
Keywords: wage rigidity; ECHP; sticky prices. Abstract We suggest a new parametric approach to estimate the extent of downward nominal wage rigidity in ten European countries between 1995 and 2001. The database used throughout is the User Data Base of the European Community Household Panel (ECHP). The proposed approach is based on the generalized hyperbolic distribution, which makes it possible to model wage change distributions characterized by thick tails, skewness and leptokurtosis. Significant downward nominal wage rigidity is found in all countries under analysis, but the extent varies considerably across countries. Yearly estimates reveal increasing rigidity in Italy, Greece and Portugal, while rigidity is declining in Denmark and Belgium. The results imply that the costs of price stability differ substantially across Europe. [source]
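
As a rough illustration of the distributional core of this approach, the sketch below fits a generalized hyperbolic distribution to simulated wage-change data using scipy.stats.genhyperbolic (SciPy >= 1.7). The data-generating process and the crude below-zero comparison are assumptions for illustration; the paper's rigidity estimator is not reproduced.

```python
# Sketch: fit a flexible notional (rigidity-free) wage-change distribution.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Fake wage-change data: skewed, heavy-tailed, with a spike at zero.
changes = np.concatenate([
    stats.skewnorm.rvs(4, loc=0.02, scale=0.03, size=900, random_state=rng),
    np.zeros(100),               # downward-rigid observations piled at zero
])

# Maximum-likelihood fit of the generalized hyperbolic distribution.
p, a, b, loc, scale = stats.genhyperbolic.fit(changes)
print(f"p={p:.3f} a={a:.3f} b={b:.3f} loc={loc:.4f} scale={scale:.4f}")

# Mass the fitted notional distribution places below zero, versus the
# empirical share: one crude indicator of downward rigidity (assumption).
print("fitted P(dw<0):", stats.genhyperbolic.cdf(0, p, a, b, loc, scale))
print("empirical share below zero:", np.mean(changes < 0))
```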


Non-parametric–parametric model for random uncertainties in non-linear structural dynamics: application to earthquake engineering

EARTHQUAKE ENGINEERING AND STRUCTURAL DYNAMICS, Issue 3 2004
Christophe Desceliers
Abstract This paper deals with the transient response of a non-linear dynamical system with random uncertainties. The non-parametric probabilistic model of random uncertainties recently published and extended to non-linear dynamical system analysis is used in order to model random uncertainties related to the linear part of the finite element model. The non-linearities are due to restoring forces whose parameters are uncertain and are modeled by the parametric approach. Jaynes' maximum entropy principle with the constraints defined by the available information allows the probabilistic model of such random variables to be constructed. Therefore, a non-parametric–parametric formulation is developed in order to model all the sources of uncertainties in such a non-linear dynamical system. Finally, a numerical application for earthquake engineering analysis is proposed concerning a reactor cooling system under seismic loads. Copyright © 2003 John Wiley & Sons, Ltd. [source]
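
For the parametric side described above, the maximum entropy principle gives a closed-form answer in a common special case: a positive parameter with prescribed mean and dispersion (and finite E[ln X]) has a Gamma maximum entropy density. The sketch below uses that fact; the stiffness value and coefficient of variation are illustrative assumptions, not values from the paper.

```python
# Sketch: maximum-entropy model of an uncertain restoring-force parameter.
import numpy as np

mean_k = 2.0e6     # mean nonlinear stiffness (N/m^3), assumed
delta = 0.2        # coefficient of variation, assumed

shape = 1.0 / delta**2        # Gamma shape fixed by the dispersion
scale = mean_k * delta**2     # Gamma scale so that E[K] = mean_k

rng = np.random.default_rng(2)
k_samples = rng.gamma(shape, scale, size=10_000)  # Monte Carlo realizations

print("sample mean:", k_samples.mean())
print("sample c.o.v.:", k_samples.std() / k_samples.mean())
```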


Testing the group polarization hypothesis by using logit models

EUROPEAN JOURNAL OF SOCIAL PSYCHOLOGY, Issue 1 2002
María F. Rodrigo
This paper focuses on methodological aspects of group polarization research and has two well-defined parts. The first part presents a methodological overview of group polarization research together with an examination of the inadequacy, under certain circumstances, of the traditional parametric approach usually used to test this phenomenon, based on pre-test/post-test means comparison across groups. It is shown that this approach will mask effects when groups are heterogeneous with regard to the observed change from pre-test to post-test. The second part suggests an alternative methodological approach based on logit models for the analysis of contingency tables from a categorization of the variable 'kind of shift'. This approach is illustrated and compared with the parametric approach with a simulated data set. Copyright © 2002 John Wiley & Sons, Ltd. [source]
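
A minimal sketch of the suggested analysis: categorize each group's pre-test/post-test shift, cross-tabulate it against experimental condition, and test the condition-by-shift interaction with a likelihood-ratio comparison of Poisson log-linear models. The toy counts and category labels are assumptions for illustration.

```python
# Sketch: log-linear (logit) analysis of a categorized-shift contingency table.
import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf

# Counts of groups by condition and direction of shift (fake data).
table = pd.DataFrame({
    "condition": ["discussion"] * 3 + ["control"] * 3,
    "shift": ["risky", "none", "cautious"] * 2,
    "count": [18, 6, 4, 9, 12, 8],
})

indep = smf.poisson("count ~ condition + shift", data=table).fit(disp=0)
satur = smf.poisson("count ~ condition * shift", data=table).fit(disp=0)

# The condition x shift interaction is the polarization effect of interest.
lr = 2 * (satur.llf - indep.llf)
df = satur.df_model - indep.df_model
print("LR =", round(lr, 2), " p =", stats.chi2.sf(lr, df))
```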


Assessment and compensation of inconsistent coupling conditions in point-receiver land seismic data

GEOPHYSICAL PROSPECTING, Issue 1 2007
Claudio Bagaini
Abstract We introduce a method to detect and compensate for inconsistent coupling conditions that arise during onshore seismic data acquisition. The reflected seismic signals, the surface waves, or the ambient-noise records can be used for the evaluation of the different coupling conditions of closely spaced geophones. We derive frequency-dependent correction operators using a parametric approach based upon a simple model of the interaction between geophone and soil. The redundancy of the available measurements permits verification of the assumptions made on the input signals in order to derive the method and to assess the validity of the model used. The method requires point-receiver data in which the signals recorded by the individual geophones are digitized. We have verified the accuracy of the method by applying it to multicomponent ambient-noise records acquired during a field experiment in which the coupling conditions were controlled and modified during different phases of the experiment. We also applied the method to field data, which were acquired without the coupling conditions being controlled, and found that only a few geophones showed an anomalous behaviour. It was also found that the length of the noise records routinely acquired during commercial surveys is too short to provide enough statistics for the application of our method. [source]
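
A minimal sketch of how a frequency-dependent coupling correction can be built from a simple parametric model of the geophone-soil interaction, here a single-degree-of-freedom transfer function. The resonance frequencies, damping values and the random stand-in trace are illustrative assumptions, not the paper's calibrated operators.

```python
# Sketch: spectral correction mapping a poorly coupled geophone response
# onto a reference (well-coupled) response.
import numpy as np

def coupling_response(f, f0, zeta):
    """SDOF coupling transfer function at frequencies f in Hz (assumed model)."""
    r = f / f0
    return 1.0 / (1.0 - r**2 + 2j * zeta * r)

f = np.fft.rfftfreq(1024, d=0.002)                 # 2 ms sampling
good = coupling_response(f, f0=150.0, zeta=0.6)    # well-planted geophone
bad = coupling_response(f, f0=60.0, zeta=0.3)      # poorly coupled geophone

# Correction operator: ratio of reference response to anomalous response.
correction = good / bad

rng = np.random.default_rng(3)
trace = rng.standard_normal(1024)                  # stand-in recorded trace
corrected = np.fft.irfft(np.fft.rfft(trace) * correction, n=1024)
print(corrected[:5])
```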


Inference for two-stage adaptive treatment strategies using mixture distributions

JOURNAL OF THE ROYAL STATISTICAL SOCIETY: SERIES C (APPLIED STATISTICS), Issue 1 2010
Abdus S. Wahed
Summary. Treatment of complex diseases such as cancer, leukaemia, acquired immune deficiency syndrome and depression usually follows complex treatment regimes consisting of time-varying multiple courses of the same or different treatments. The goal is to achieve the largest overall benefit defined by a common end point such as survival. An adaptive treatment strategy refers to a sequence of treatments that are applied at different stages of therapy based on the individual's history of covariates and intermediate responses to the earlier treatments. However, in many cases treatment assignment depends only on intermediate response and prior treatments. Clinical trials are often designed to compare two or more adaptive treatment strategies. A common approach that is used in these trials is sequential randomization. Patients are randomized on entry into available first-stage treatments and then, on the basis of the response to the initial treatments, are randomized to second-stage treatments, and so on. The analysis often ignores this feature of randomization and frequently conducts separate analyses for each stage. Recent literature has suggested several semiparametric and Bayesian methods for inference related to adaptive treatment strategies from sequentially randomized trials. We develop a parametric approach using mixture distributions to model the survival times under different adaptive treatment strategies. We show that the estimators proposed are asymptotically unbiased and can be easily implemented by using existing routines in statistical software packages. [source]
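
The mixture idea can be sketched directly: under a two-stage strategy, survival time is drawn from a responder component with probability pi and from a non-responder component otherwise, and the mixture likelihood is maximized over all parameters jointly. The exponential components and simulated data below are illustrative assumptions, not the paper's models.

```python
# Sketch: maximum-likelihood fit of a two-component survival mixture.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(4)
n = 2000
resp = rng.random(n) < 0.4                   # true responder indicator
t = np.where(resp,
             rng.exponential(12.0, n),       # responders, second-stage B1
             rng.exponential(5.0, n))        # non-responders

def neg_loglik(theta):
    """Negative log-likelihood of an exponential mixture (assumed components)."""
    logit_pi, log_m1, log_m0 = theta
    pi = 1.0 / (1.0 + np.exp(-logit_pi))
    m1, m0 = np.exp(log_m1), np.exp(log_m0)
    dens = pi * np.exp(-t / m1) / m1 + (1 - pi) * np.exp(-t / m0) / m0
    return -np.sum(np.log(dens))

fit = minimize(neg_loglik, x0=np.zeros(3), method="Nelder-Mead")
logit_pi, log_m1, log_m0 = fit.x
print("pi-hat:", 1 / (1 + np.exp(-logit_pi)),
      "component means:", np.exp(log_m1), np.exp(log_m0))
```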


Toward faster algorithms for dynamic traffic assignment

NETWORKS: AN INTERNATIONAL JOURNAL, Issue 1 2003

Abstract Being the first in a three-part series promising a practical solution to the user-equilibrium dynamic traffic assignment problem, this paper devises a parametric quickest-path tree algorithm, whose model makes three practical assumptions: (i) the traversal time of an arc i → j is a piecewise linear function of the arrival time at its i-node; (ii) the traversal time of a path is the sum of its arcs' traversal times; and (iii) the FIFO constraint holds, that is, later departure implies later arrival. The algorithm finds a quickest path, and its associated earliest arrival time, to every node for every desired departure time from the origin. Its parametric approach transforms a min-path tree for one departure-time interval into another for the next adjacent interval, whose shared boundary the algorithm determines on the fly. By building relatively few trees, it provides the topology explicitly and the arrival times implicitly of all min-path trees. Tests show the algorithm running upward of 10 times faster than the conventional brute-force approach, which explicitly builds a min-path tree for every departure time. Besides dynamic traffic assignment, other applications for which these findings have utility include traffic control planning, vehicle routing and scheduling, real-time highway route guidance, etc. © 2002 Wiley Periodicals, Inc. [source]
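
For the time-dependent shortest-path core of this problem, the FIFO assumption is exactly what makes a Dijkstra-style label-setting search valid: each arc is evaluated at the arrival time of its tail node. The sketch below computes earliest arrivals for one departure time on a toy network; the paper's parametric tree-to-tree transformation across departure-time intervals is not reproduced, and the piecewise linear traversal functions are assumptions chosen to satisfy FIFO.

```python
# Sketch: quickest paths with arrival-time-dependent arc traversal times.
import heapq

def traversal(base, peak):
    """Piecewise linear traversal time, congested between t=10 and t=20.
    Slope magnitude peak/5 <= 1 keeps t + tau(t) nondecreasing (FIFO)."""
    def tau(t):
        if t < 10 or t > 20:
            return base
        return base + peak * (1 - abs(t - 15) / 5)
    return tau

# arcs[u] = list of (v, tau), tau a function of departure time at u.
arcs = {
    "o": [("a", traversal(4, 4)), ("b", traversal(7, 0))],
    "a": [("d", traversal(5, 4))],
    "b": [("d", traversal(3, 0))],
    "d": [],
}

def earliest_arrivals(origin, t0):
    """Earliest arrival time at every node for departure t0 from origin."""
    best = {origin: t0}
    heap = [(t0, origin)]
    while heap:
        t, u = heapq.heappop(heap)
        if t > best.get(u, float("inf")):
            continue                        # stale label
        for v, tau in arcs[u]:
            tv = t + tau(t)                 # arrival at v when leaving u at t
            if tv < best.get(v, float("inf")):
                best[v] = tv
                heapq.heappush(heap, (tv, v))
    return best

print(earliest_arrivals("o", 8.0))
```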


Refined Rank Regression Method with Censors

QUALITY AND RELIABILITY ENGINEERING INTERNATIONAL, Issue 7 2004
Wendai Wang
Abstract Reliability engineers often face failure data with suspensions. The rank regression method with an approach introduced by Johnson has been commonly used to handle data with suspensions in engineering practice and commercial software. However, the Johnson method makes only partial use of suspension information: the positions of suspensions, not the exact times to suspensions. A new approach for rank regression with censored data is proposed in this paper, which makes full use of suspension information. Taking advantage of the parametric approach, the refined rank regression obtains the 'exact' mean order number for each failure point in the sample. With the 'exact' mean order number, the proposed method gives the 'best' fit to sample data for an assumed times-to-failure distribution. This refined rank regression is simple to implement and appears to have good statistical and convergence properties. An example is provided to illustrate the proposed method. Copyright © 2004 John Wiley & Sons, Ltd. [source]
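
For reference, the classical Johnson procedure that the refined method improves on can be sketched in a few lines: compute mean order numbers from the positions of suspensions, convert them to median ranks with Benard's approximation, and fit a Weibull line by least squares. The failure and suspension times below are illustrative assumptions; the refinement that also uses the exact suspension times is not reproduced.

```python
# Sketch: Johnson mean order numbers and a rank-regression Weibull fit.
import numpy as np

# (time, is_failure) sorted by time; False marks a suspension.
data = [(55, True), (70, False), (95, True), (120, True),
        (130, False), (180, True)]

n = len(data)
order, prev, times = [], 0.0, []
for i, (t, failed) in enumerate(data):
    if failed:
        # Johnson's increment uses only the positions of prior suspensions.
        inc = (n + 1 - prev) / (n + 2 - (i + 1))
        prev += inc
        order.append(prev)
        times.append(t)

order, times = np.array(order), np.array(times)
median_rank = (order - 0.3) / (n + 0.4)        # Benard's approximation

# Weibull probability plot: ln(-ln(1-F)) is linear in ln(t).
x, y = np.log(times), np.log(-np.log(1 - median_rank))
beta, intercept = np.polyfit(x, y, 1)
eta = np.exp(-intercept / beta)
print(f"shape beta = {beta:.2f}, scale eta = {eta:.1f}")
```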


Experiments in Associative Urbanism

ARCHITECTURAL DESIGN, Issue 4 2009
Tom Verebes
Abstract 'There has never been a more crucial time to challenge, reassess and propose alternatives to conventional urban masterplanning and its associated conventions, types and standards.' Tom Verebes describes how the Design Research Laboratory (DRL) at the Architectural Association in London has employed a parametric approach to urbanism that investigates how associative design systems can control local dynamic information flows through interactive systems, spaces and interfaces. Copyright © 2009 John Wiley & Sons, Ltd. [source]


Adjustment for Missingness Using Auxiliary Information in Semiparametric Regression

BIOMETRICS, Issue 1 2010
Donglin Zeng
Summary In this article, we study the estimation of the mean response and regression coefficient in semiparametric regression problems when the response variable is subject to nonrandom missingness. When the missingness is independent of the response conditional on high-dimensional auxiliary information, the parametric approach may misspecify the relationship between covariates and response, while the nonparametric approach is infeasible because of the curse of dimensionality. To overcome this, we study a model-based approach to condense the auxiliary information and estimate the parameters of interest nonparametrically on the condensed covariate space. Our estimators possess the double robustness property, i.e., they are consistent whenever the model for the response given auxiliary covariates or the model for the missingness given auxiliary covariates is correct. We conduct a number of simulations to compare the numerical performance between our estimators and other existing estimators in the current missing data literature, including the propensity score approach and the inverse probability weighted estimating equation. A set of real data is used to illustrate our approach. [source]
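
A minimal sketch of the double robustness property mentioned above, using an augmented inverse-probability-weighted (AIPW) estimator of a mean response: the estimate is consistent if either the propensity model or the outcome model is correct. The one-dimensional auxiliary variable and both working models are illustrative assumptions, not the paper's condensation of high-dimensional auxiliary information.

```python
# Sketch: doubly robust (AIPW) estimation of a mean under missingness.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 5000
z = rng.standard_normal(n)                     # auxiliary covariate
y = 2.0 + 1.5 * z + rng.standard_normal(n)     # response, true mean 2.0
p_obs = 1 / (1 + np.exp(-(0.5 + z)))           # missingness depends on z
r = rng.random(n) < p_obs                      # r=1: y observed

X = sm.add_constant(z)
pi = sm.Logit(r.astype(float), X).fit(disp=0).predict(X)  # propensity model
m = sm.OLS(y[r], X[r]).fit().predict(X)                   # outcome model

# AIPW estimator: IPW term plus an augmentation from the outcome model.
mu_dr = np.mean(r * y / pi - (r - pi) / pi * m)
print("doubly robust mean:", mu_dr, " (true mean 2.0)")
```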


Genome-wide association analyses of expression phenotypes

GENETIC EPIDEMIOLOGY, Issue S1 2007
Gary K. Chen
Abstract A number of issues arise when analyzing the large amount of data from high-throughput genotype and expression microarray experiments, including design and interpretation of genome-wide association studies of expression phenotypes. These issues were considered by contributions submitted to Group 1 of the Genetic Analysis Workshop 15 (GAW15), which focused on the association of quantitative expression data. These contributions evaluated diverse hypotheses, including those relevant to cancer and obesity research, and used various analytic techniques, many of which were derived from information theory. Several observations from these reports stand out. First, one needs to consider the genetic model of the trait of interest and carefully select which single nucleotide polymorphisms and individuals are included early in the design stage of a study. Second, by targeting specific pathways when analyzing genome-wide data, one can generate more interpretable results than agnostic approaches. Finally, for datasets with small sample sizes but a large number of features, like the Genetic Analysis Workshop 15 dataset, machine learning approaches may be more practical than traditional parametric approaches. Genet Epidemiol 31 (Suppl. 1): S7–S11, 2007. © 2007 Wiley-Liss, Inc. [source]


A novel method to identify gene–gene effects in nuclear families: the MDR-PDT

GENETIC EPIDEMIOLOGY, Issue 2 2006
E.R. Martin
Abstract It is now well recognized that gene–gene and gene–environment interactions are important in complex diseases, and statistical methods to detect interactions are becoming widespread. Traditional parametric approaches are limited in their ability to detect high-order interactions and handle sparse data, and standard stepwise procedures may miss interactions that occur in the absence of detectable main effects. To address these limitations, the multifactor dimensionality reduction (MDR) method [Ritchie et al., 2001: Am J Hum Genet 69:138–147] was developed. The MDR is well suited for examining high-order interactions and detecting interactions without main effects. The MDR was originally designed to analyze balanced case-control data. The analysis can use family data, but requires that a single matched pair be selected from each family. This may be a discordant sib pair, or may be constructed from triad data when parents are available. Taking advantage of additional affected and unaffected siblings requires a test statistic that measures the association of genotype with disease in general nuclear families. We have developed a novel test, the MDR-PDT, by merging the MDR method with the genotype-Pedigree Disequilibrium Test (geno-PDT) [Martin et al., 2003: Genet Epidemiol 25:203–213]. The MDR-PDT allows identification of single-locus effects or joint effects of multiple loci in families of diverse structure. We present simulations to demonstrate the validity of the test and evaluate its power. To examine its applicability to real data, we applied the MDR-PDT to data from candidate genes for Alzheimer disease (AD) in a large family dataset. These results show the utility of the MDR-PDT for understanding the genetics of complex diseases. Genet. Epidemiol. 2006. © 2005 Wiley-Liss, Inc. [source]
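
The MDR reduction step at the heart of the MDR-PDT can be sketched for balanced case-control data: pool the two-locus genotype cells, label each cell high- or low-risk by comparing its case/control ratio with the overall ratio, and collapse the locus pair into a single binary attribute. The toy penetrance model below is an illustrative assumption; the PDT statistic for general pedigrees is not reproduced.

```python
# Sketch: the core MDR cell-labelling and dimensionality-reduction step.
import numpy as np

rng = np.random.default_rng(6)
n = 400
g1 = rng.integers(0, 3, n)          # genotypes at locus 1 (0/1/2 copies)
g2 = rng.integers(0, 3, n)          # genotypes at locus 2
# Interaction without main effects: risk only in the (2,0) and (0,2) cells.
risk = ((g1 == 2) & (g2 == 0)) | ((g1 == 0) & (g2 == 2))
case = rng.random(n) < np.where(risk, 0.8, 0.4)

cases = np.zeros((3, 3)); controls = np.zeros((3, 3))
np.add.at(cases, (g1[case], g2[case]), 1)
np.add.at(controls, (g1[~case], g2[~case]), 1)

threshold = case.sum() / (~case).sum()     # overall case/control ratio
high_risk = cases > threshold * controls   # MDR cell labelling

predicted = high_risk[g1, g2]              # collapsed one-dimensional attribute
print("high-risk cells:\n", high_risk.astype(int))
print("classification accuracy:", np.mean(predicted == case))
```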


Productivity Growth, Efficiency and Outsourcing in Manufacturing and Service Industries

JOURNAL OF ECONOMIC SURVEYS, Issue 1 2003
Almas Heshmati
This paper is a survey of recent contributions to, and developments of, the relationship between outsourcing, efficiency and productivity growth in manufacturing and services. The objective is to provide a thorough and up-to-date survey that offers a significant discussion of data, as well as of the core methods of measuring efficiency and productivity. First, the readers are introduced to the measurement of partial and total factor productivity growth. Different parametric and non-parametric approaches to productivity measurement in the context of static, dynamic and firm-specific modelling are discussed. Second, we survey the econometric approach to efficiency analysis. The issues of modelling, distributional assumptions and estimation methods are discussed, assuming that cross-sectional or panel data are available. Third, the relationship between outsourcing and productivity growth in manufacturing and services is discussed. The correspondence between a number of hypotheses and empirical findings is examined. Examples of a variety of relevant empirical applications, their findings and implications are presented. Fourth, measurement of inputs and outputs in manufacturing and services is discussed. Finally, to promote useful research, a number of factors important to the analysis of outsourcing, efficiency and productivity growth in the service sector are summarised. [source]


Calculation of off-resonance Raman scattering intensities with parametric models

JOURNAL OF RAMAN SPECTROSCOPY, Issue 12 2009
Daniel Bougeard
Abstract The paper reviews applications of parametric models for calculations of the Raman scattering intensity of materials with the main emphasis on the performance of the bond polarizability model. The use of the model in studies of polymers, aluminosilicates, and nanostructures is discussed and the existing sets of electro-optical parameters as well as their transferability are analyzed. The paper highlights the interplay between the first-principles and parametric approaches to the Raman intensity calculations and suggests further developments in this field. Copyright © 2009 John Wiley & Sons, Ltd. [source]
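
The bond polarizability model lends itself to a compact numerical sketch: each bond contributes a tensor α_p I + (α_l − α_p) u uᵀ with length-dependent longitudinal and perpendicular parameters, and the Raman activity of a mode follows from the derivative of the summed tensor along the normal coordinate. All electro-optical parameter values and the linear X-Y-X geometry below are illustrative assumptions.

```python
# Sketch: off-resonance Raman activity of a symmetric stretch from the
# bond polarizability model, via a numerical derivative d(alpha)/dQ.
import numpy as np

def bond_alpha(r_vec, a_l, a_p, da_l, da_p, r0):
    """Polarizability tensor of one bond, parameters linear in bond length."""
    r = np.linalg.norm(r_vec)
    u = r_vec / r
    al = a_l + da_l * (r - r0)      # electro-optical parameters (assumed)
    ap = a_p + da_p * (r - r0)
    return ap * np.eye(3) + (al - ap) * np.outer(u, u)

def total_alpha(stretch):
    """Polarizability of a linear X-Y-X molecule under a symmetric stretch."""
    left = np.array([-(1.16 + stretch), 0.0, 0.0])   # bond vectors from center
    right = np.array([1.16 + stretch, 0.0, 0.0])
    params = dict(a_l=2.0, a_p=1.0, da_l=3.0, da_p=0.5, r0=1.16)
    return bond_alpha(left, **params) + bond_alpha(right, **params)

# Central-difference derivative along the symmetric stretch coordinate.
h = 1e-4
d_alpha = (total_alpha(h) - total_alpha(-h)) / (2 * h)

# Raman activity 45*a'^2 + 7*gamma'^2 from the tensor invariants.
a_bar = np.trace(d_alpha) / 3
gamma2 = 0.5 * ((d_alpha[0, 0] - d_alpha[1, 1])**2 +
                (d_alpha[1, 1] - d_alpha[2, 2])**2 +
                (d_alpha[2, 2] - d_alpha[0, 0])**2 +
                6 * (d_alpha[0, 1]**2 + d_alpha[1, 2]**2 + d_alpha[0, 2]**2))
print("Raman activity (arb. units):", 45 * a_bar**2 + 7 * gamma2)
```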