Competing Approaches
Selected Abstracts

Optimal use of high-resolution topographic data in flood inundation models
HYDROLOGICAL PROCESSES, Issue 3 2003
P. D. Bates
Abstract: In this paper we explore the optimum assimilation of high-resolution data into numerical models, using the example of topographic data provision for flood inundation simulation. First, we explore problems with current assimilation methods in which numerical grids are generated independently of topography. These include possible loss of significant length scales of topographic information, poor representation of the original surface, and data redundancy. These are resolved through the development of a processing chain consisting of: (i) assessment of significant length scales of variation in the input data sets; (ii) determination of significant points within the data set; (iii) translation of these into a conforming model discretization that preserves solution quality for a given numerical solver; and (iv) incorporation of otherwise redundant sub-grid data into the model in a computationally efficient manner. This processing chain is used to develop an optimal finite element discretization for a 12 km reach of the River Stour in Dorset, UK, for which a high-resolution topographic data set derived from airborne laser altimetry (LiDAR) was available. For this reach, three simulations of a 1-in-4-year flood event were conducted: a control simulation with a mesh developed independently of topography, a simulation with a topographically optimum mesh, and a further simulation with the topographically optimum mesh incorporating the sub-grid topographic data within a correction algorithm for dynamic wetting and drying in fixed-grid models. The topographically optimum model is shown to represent the 'raw' topographic data set better, and the differences between this surface and the control are shown to be hydraulically significant. Incorporation of sub-grid topographic data has a less marked impact than getting the explicit hydraulic calculation correct, but still leads to important differences in model behaviour. The paper highlights the need for better validation data capable of discriminating between these competing approaches and begins to indicate what the characteristics of such a data set should be. More generally, the techniques developed here should prove useful for any data set whose resolution exceeds that of the model in which it is to be used. Copyright © 2002 John Wiley & Sons, Ltd. [source]

Fast implementations and rigorous models: Can both be accommodated in NMPC?
INTERNATIONAL JOURNAL OF ROBUST AND NONLINEAR CONTROL, Issue 8 2008
Victor M. Zavala
Abstract: In less than two decades, nonlinear model predictive control has evolved from a conceptual framework into an attractive, general approach for the control of constrained nonlinear processes. These advances were realized both through a better understanding of stability and robustness properties and through improved algorithms for dynamic optimization. This study focuses on recent advances in optimization formulations and algorithms, particularly for the simultaneous collocation-based approach. Here, we contrast this approach with competing approaches for online application and discuss further advances to deal with applications of increasing size and complexity. To address these challenges, we adapt the real-time iteration concept, developed in the context of multiple shooting (Real-Time PDE-Constrained Optimization. SIAM: Philadelphia, PA, 2007; 25-52, 3-24), to a collocation-based approach with a full-space nonlinear programming solver. We show that straightforward sensitivity calculations from the Karush-Kuhn-Tucker system also lead to a real-time iteration strategy, with both direct and shifted variants. This approach is demonstrated on a large-scale polymer process, where online calculation effort is reduced by over two orders of magnitude. Copyright © 2007 John Wiley & Sons, Ltd. [source]
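The Bates abstract above outlines a four-step processing chain for turning high-resolution topographic data into a model discretization. The Python sketch below loosely illustrates steps (i) and (ii) only: it estimates a length scale from the empirical semivariogram of a 1-D LiDAR-like transect and then flags "significant" points that depart from a local mean. The variogram definition, window size, tolerance and toy transect are all assumptions made for illustration; this is not the authors' algorithm.

```python
import numpy as np

def correlation_length(z, dx, max_lag=200):
    """Lag at which the empirical semivariogram first reaches ~95% of a crude
    sill estimate (an illustrative length-scale definition, not the paper's)."""
    lags = np.arange(1, max_lag)
    gamma = np.array([0.5 * np.mean((z[h:] - z[:-h]) ** 2) for h in lags])
    sill = gamma[-20:].mean()
    idx = np.argmax(gamma >= 0.95 * sill)
    return lags[idx] * dx

def significant_points(z, dx, length_scale, tol=0.05):
    """Keep points whose elevation departs from a moving-window mean by more
    than `tol` metres; the window width is tied to the estimated length scale."""
    w = max(3, int(length_scale / dx) | 1)          # odd window size
    smooth = np.convolve(z, np.ones(w) / w, mode="same")
    keep = np.abs(z - smooth) > tol
    keep[0] = keep[-1] = True                       # always retain the endpoints
    return np.flatnonzero(keep)

# Toy transect: a gentle valley plus 3 cm noise, sampled every metre.
rng = np.random.default_rng(0)
x = np.arange(0.0, 1000.0, 1.0)
z = 2e-5 * (x - 500.0) ** 2 + 0.03 * rng.standard_normal(x.size)

L = correlation_length(z, dx=1.0)
pts = significant_points(z, dx=1.0, length_scale=L)
print(f"length scale ~ {L:.0f} m; kept {pts.size} of {x.size} points")
```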
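The real-time iteration idea in the NMPC abstract above rests on reusing the Karush-Kuhn-Tucker (KKT) system of the previously converged problem to update the solution cheaply when a new state measurement arrives, instead of re-solving the full problem online. The sketch below illustrates this on a small equality-constrained quadratic problem standing in for the NLP; the problem data, dimensions and parameterization are invented. For a quadratic problem the KKT-based update is exact, whereas for a genuine NLP it is only a first-order predictor.

```python
import numpy as np

# Stand-in problem (hypothetical, not the paper's polymer process):
#   min_z  0.5 z'Qz + q'z   subject to   A z = b0 + B p,
# where the parameter p plays the role of the measured initial state.

rng = np.random.default_rng(0)
n, m, np_ = 6, 3, 2                       # variables, constraints, parameters
M = rng.standard_normal((n, n))
Q = M @ M.T + n * np.eye(n)               # convex quadratic objective
q = rng.standard_normal(n)
A = rng.standard_normal((m, n))
b0 = rng.standard_normal(m)
B = rng.standard_normal((m, np_))

def kkt_solve(p):
    """Solve the stationarity/feasibility (KKT) system exactly for parameter p."""
    K = np.block([[Q, A.T], [A, np.zeros((m, m))]])
    rhs = np.concatenate([-q, b0 + B @ p])
    sol = np.linalg.solve(K, rhs)
    return sol[:n], sol[n:], K            # primal z, multipliers, KKT matrix

p_old = np.zeros(np_)
z_old, lam_old, K = kkt_solve(p_old)      # "background" solve between samples

# New measurement arrives: cheap sensitivity step with the same KKT matrix.
p_new = np.array([0.3, -0.1])
rhs = np.concatenate([np.zeros(n), B @ (p_new - p_old)])
dz_dlam = np.linalg.solve(K, rhs)         # in practice: reuse the factorization
z_fast = z_old + dz_dlam[:n]

z_exact, _, _ = kkt_solve(p_new)
print("max |fast - exact| =", np.abs(z_fast - z_exact).max())  # ~1e-15 for a QP
```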
Setting the agenda of attributes in the 1996 Spanish general election
JOURNAL OF COMMUNICATION, Issue 2 2000
M. McCombs
Abstract: We advance the central proposition of agenda-setting theory - that elements prominent in the mass media's picture of the world influence the salience of those elements in the audience's picture - through the explication of a second level of agenda setting: attribute agenda setting. This preliminary research on candidate images during the 1996 Spanish general election simultaneously examined 2 attribute dimensions - substantive and affective descriptions - to test the hypothesis that media attribute agendas influence the voters' attribute agenda. Empirically, a high degree of correspondence was found between the attribute agendas of 7 different mass media and the voters' attribute agenda for each of the 3 candidates. The median correlation from these 21 tests of the hypothesis is +.72. Sixth-order partial correlations, in which the influence of the other 6 mass media is removed from the correlation between a medium's agenda and the voters' agenda for a particular candidate, have a median value of +.73. Additional analyses of the attribute agendas of each medium's primary audience in comparison with its principal competitor also yielded evidence of second-level agenda setting. Future research should pursue complex longitudinal designs tracing the impact of media content on voters' images at both the aggregate and individual levels as part of the continuing scholarly dialogue on competing approaches to framing research and attribute agenda setting. [source]

Combining inflation density forecasts
JOURNAL OF FORECASTING, Issue 1-2 2010
Christian Kascha
Abstract: In this paper, we empirically evaluate competing approaches for combining inflation density forecasts in terms of Kullback-Leibler divergence. In particular, we apply a similar suite of models to four different datasets and aim at identifying combination methods that perform well throughout different series and variations of the model suite. We pool individual densities using linear and logarithmic combination methods. The suite consists of linear forecasting models with moving estimation windows to account for structural change. We find that combining densities is a much better strategy than selecting a particular model ex ante. While combinations do not always perform better than the best individual model, combinations always yield accurate forecasts and, as we show analytically, provide insurance against selecting inappropriate models. Logarithmic combinations can be advantageous, in particular if symmetric densities are preferred. Copyright © 2010 John Wiley & Sons, Ltd. [source]
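The linear and logarithmic pooling schemes compared in the density-combination abstract above have simple closed forms: the linear pool is a weighted average of the component densities, while the logarithmic pool is a renormalized weighted geometric mean. The Python sketch below uses invented component forecasts, weights and a pretend realized inflation series; differences in the average log score it reports estimate Kullback-Leibler divergence gaps up to a constant.

```python
import numpy as np
from scipy import stats

grid = np.linspace(-5, 10, 4001)                 # inflation grid (percent)
components = [stats.norm(loc=1.8, scale=0.6),    # model 1 predictive density
              stats.norm(loc=2.6, scale=1.2)]    # model 2 predictive density
w = np.array([0.5, 0.5])                         # equal combination weights

dens = np.vstack([c.pdf(grid) for c in components])
dx = grid[1] - grid[0]

# Linear pool: convex combination of the component densities.
lin_pool = w @ dens

# Logarithmic pool: weighted geometric mean, renormalized on the grid.
log_pool = np.exp(w @ np.log(dens + 1e-300))
log_pool /= log_pool.sum() * dx

def avg_log_score(pool, realizations):
    """Average log predictive density at the realized values (higher is better)."""
    return np.mean(np.log(np.interp(realizations, grid, pool)))

rng = np.random.default_rng(1)
y = rng.normal(2.0, 0.8, size=200)               # pretend realized inflation
print("linear pool :", round(avg_log_score(lin_pool, y), 3))
print("log pool    :", round(avg_log_score(log_pool, y), 3))
```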
Modelling the daily banknotes in circulation in the context of the liquidity management of the European Central Bank
JOURNAL OF FORECASTING, Issue 3 2009
Alberto Cabrero
Abstract: The main focus of this paper is to model the daily series of banknotes in circulation, which displays very marked seasonal patterns. To the best of our knowledge, the empirical performance of two competing approaches to modelling seasonality in daily time series, namely the ARIMA-based approach and the structural time series approach, has never been put to the test. The application presented in this paper provides valid intuition on the merits of each approach. The forecasting performance of the models is also assessed in the context of their impact on the liquidity management of the Eurosystem. Copyright © 2008 John Wiley & Sons, Ltd. [source]

New Institutional Economics' contribution to strategic groups analysis
MANAGERIAL AND DECISION ECONOMICS, Issue 3 2007
Stephane Tywoniak
Abstract: Rather than consider the two broad strands of strategic group research (performance-based and behavior-based studies) as competing approaches, we argue that they relate to complementary levels of analysis. We present a four-level framework for analyzing structures within industries, drawn from New Institutional Economics (NIE), which covers different approaches to strategic group formation, from institutional isomorphism and embeddedness through to the firm-level effects of certain resource deployments. We apply an institutional approach to a case study of the Australian banking industry and supplement this with a quantitative approach based around key strategic variables. This analysis suggests that distinct groups have emerged due to the institutional environment and the different regulatory environments experienced by various banks in the industry. Copyright © 2007 John Wiley & Sons, Ltd. [source]

L1 Penalized Estimation in the Cox Proportional Hazards Model
BIOMETRICAL JOURNAL, Issue 1 2010
Jelle J. Goeman
Abstract: This article presents a novel algorithm that efficiently computes L1 penalized (lasso) estimates of parameters in high-dimensional models. The lasso has the property that it simultaneously performs variable selection and shrinkage, which makes it very useful for finding interpretable prediction rules in high-dimensional data. The new algorithm is based on a combination of gradient ascent optimization with the Newton-Raphson algorithm. It is described for a general likelihood function and can be applied in generalized linear models and other models with an L1 penalty. The algorithm is demonstrated in the Cox proportional hazards model, predicting survival of breast cancer patients using gene expression data, and its performance is compared with competing approaches. An R package, penalized, which implements the method, is available on CRAN. [source]
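The lasso-penalized Cox estimation described in the abstract above can be sketched with a plain proximal-gradient loop: a gradient step on the negative log partial likelihood followed by soft-thresholding. This is not the gradient-ascent/Newton-Raphson hybrid of the article or its penalized R package; the simulated data, step size and penalty level below are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 200, 50
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:3] = [1.0, -1.0, 0.5]

t_event = rng.exponential(scale=np.exp(-X @ beta_true))   # latent event times
t_cens = rng.exponential(scale=2.0, size=n)               # random censoring
time = np.minimum(t_event, t_cens)
event = t_event <= t_cens

order = np.argsort(-time)                # sort descending: risk sets = prefixes
X, event = X[order], event[order]

def nll_and_grad(beta):
    """Negative Cox log partial likelihood and gradient (assumes no tied times)."""
    eta = X @ beta
    m = eta.max()                                   # numerical stabilization
    w = np.exp(eta - m)
    cum_w = np.cumsum(w)                            # risk-set sums
    cum_wx = np.cumsum(w[:, None] * X, axis=0)      # weighted covariate sums
    nll = -np.sum(eta[event] - (m + np.log(cum_w[event])))
    grad = -(X[event] - cum_wx[event] / cum_w[event, None]).sum(axis=0)
    return nll, grad

def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

lam, step = 5.0, 1e-3    # small fixed step for stability; Newton steps or a
beta = np.zeros(p)       # line search (as in the article) would converge faster
for _ in range(2000):
    _, grad = nll_and_grad(beta)
    beta = soft_threshold(beta - step * grad, step * lam)

nll, _ = nll_and_grad(beta)
print("non-zero coefficients:", np.flatnonzero(beta))
print("penalized objective  :", round(nll + lam * np.abs(beta).sum(), 3))
```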
Complementary Log-Log Regression for the Estimation of Covariate-Adjusted Prevalence Ratios in the Analysis of Data from Cross-Sectional Studies
BIOMETRICAL JOURNAL, Issue 3 2009
Alan D. Penman
Abstract: We assessed complementary log-log (CLL) regression as an alternative statistical model for estimating multivariable-adjusted prevalence ratios (PR) and their confidence intervals. Using the delta method, we derived an expression for approximating the variance of the PR estimated using CLL regression. Then, using simulated data, we examined the performance of CLL regression in terms of the accuracy of the PR estimates, the width of the confidence intervals, and the empirical coverage probability, and compared it with results obtained from log-binomial regression and stratified Mantel-Haenszel analysis. Within the range of values of our simulated data, CLL regression performed well, with only slight bias of the point estimates of the PR and good confidence interval coverage. In addition, and importantly, the computational algorithm did not have the convergence problems occasionally exhibited by log-binomial regression. The technique is easy to implement in SAS (SAS Institute, Cary, NC), and it does not have the theoretical and practical issues associated with competing approaches. CLL regression is an alternative method of binomial regression that warrants further assessment. [source]

Evolution Is Not a Necessary Assumption of Cladistics
CLADISTICS, Issue 1 2000
Andrew V. Z. Brower
Abstract: Although the point has already been emphasized by various authors that the assumption of descent with modification is not required to justify cladistics, recent debate suggests that there is still confusion surrounding the necessary and sufficient background knowledge underlying the method. Three general axioms necessary to justify cladistics - the discoverability of characters, hierarchy, and parsimony - are reviewed. Although the assumption of evolution is sufficient to justify cladistics, it is also sufficient to justify competing approaches like maximum likelihood, which suggests that the philosophical support for the cladistic approach is strengthened by purging reference to descent with modification altogether. [source]
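For the complementary log-log regression abstract above (Penman), a covariate-adjusted prevalence ratio can be obtained by fitting a binomial GLM with a CLL link and taking the ratio of predicted prevalences for exposed and unexposed covariate patterns, with a delta-method interval on the log scale. The Python sketch below uses simulated data and an invented reference covariate pattern; it is not the paper's SAS implementation or simulation design, and the statsmodels link class name may differ across library versions.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 2000
exposed = rng.integers(0, 2, n)
age_std = (rng.normal(50, 10, n) - 50) / 10
X = sm.add_constant(np.column_stack([exposed, age_std]))

# True model on the CLL scale: log(-log(1 - p)) = eta  (illustrative values)
eta_true = -1.0 + 0.5 * exposed + 0.2 * age_std
p_true = 1 - np.exp(-np.exp(eta_true))
y = (rng.random(n) < p_true).astype(float)

# Older statsmodels versions spell this link class `links.cloglog` instead.
link = sm.families.links.CLogLog()
fit = sm.GLM(y, X, family=sm.families.Binomial(link=link)).fit()
beta, V = fit.params, fit.cov_params()

def prevalence(x, b):
    return 1 - np.exp(-np.exp(x @ b))

x1 = np.array([1.0, 1.0, 0.0])     # exposed, age 50 (reference pattern)
x0 = np.array([1.0, 0.0, 0.0])     # unexposed, age 50
pr = prevalence(x1, beta) / prevalence(x0, beta)

# Delta method on log(PR): d log p / d beta = exp(eta - exp(eta)) / p * x
def grad_log_p(x, b):
    eta = x @ b
    return np.exp(eta - np.exp(eta)) / prevalence(x, b) * x

g = grad_log_p(x1, beta) - grad_log_p(x0, beta)
se_log_pr = np.sqrt(g @ V @ g)
lo, hi = pr * np.exp(-1.96 * se_log_pr), pr * np.exp(1.96 * se_log_pr)
print(f"adjusted PR = {pr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```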