Additional Observations

Selected Abstracts


Additional observations of age-dependent migration behaviour in western honey buzzards Pernis apivorus

JOURNAL OF AVIAN BIOLOGY, Issue 6 2004
Nicolantonio Agostini
Article first published online: 2 NOV 2004 [source]


Fetal extrasystole may predict poor neonatal outcome

AUSTRALIAN AND NEW ZEALAND JOURNAL OF OBSTETRICS AND GYNAECOLOGY, Issue 4 2009
Jake A. BROWN
Extrasystoles, particularly premature atrial contractions, noted during labour on the fetal heart rate monitoring strip are usually thought to be benign. In pregnancies complicated by fetal infection and/or the fetal inflammatory response syndrome, there are some data suggesting that extrasystoles noted during the intrapartum period may be related to neonatal sepsis and eventual poor neonatal outcome, including death or neonatal encephalopathy. Additional observations are needed to substantiate this hypothesis. [source]


Multi-variable parameter estimation to increase confidence in hydrological modelling

HYDROLOGICAL PROCESSES, Issue 2 2002
Sten Bergström
Abstract The expanding use and increasing complexity of hydrological runoff models have given rise to concern about overparameterization and the risk of compensating errors. One proposed way out is calibration and validation against additional observations, such as snow, soil moisture, groundwater or water quality. A general problem, however, when calibrating the model against more than one variable is the strategy for parameter estimation. The most straightforward method is to calibrate the model components sequentially. Recent results show that in this way the model may be locked into a parameter setting that is good enough for one variable but excludes proper simulation of other variables. This is particularly the case for water quality modelling, where a small compromise in terms of runoff simulation may lead to dramatically better simulations of water quality. This calls for an integrated model calibration procedure with a criterion that covers more aspects of model performance than just river runoff. The use of multi-variable parameter estimation and internal control of the HBV hydrological model is discussed and highlighted by two case studies. The first example is from a forested basin in northern Sweden and the second from an agricultural basin in the south of the country. A new calibration strategy, which is integrated rather than sequential, is proposed and tested. It is concluded that comparing model results with more measurements than runoff alone can increase confidence in the physical relevance of the model, and that the new calibration strategy can be useful for further model development. Copyright © 2002 John Wiley & Sons, Ltd. [source]
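To make the integrated (rather than sequential) calibration idea concrete, here is a minimal sketch of a combined criterion that weights an efficiency score for runoff together with scores for other observed variables such as snow or groundwater. The function names, variable names and weights are illustrative assumptions, not the HBV model's actual calibration code.

```python
import numpy as np

def nash_sutcliffe(sim, obs):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, <= 0 is no better than the observed mean."""
    sim = np.asarray(sim, dtype=float)
    obs = np.asarray(obs, dtype=float)
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

def integrated_criterion(params, model, observed, weights):
    """Combine efficiencies for several observed variables into one score to maximise.

    `model(params)` is assumed to return a dict of simulated series
    (e.g. {'runoff': ..., 'snow': ..., 'groundwater': ...}) matching `observed`.
    """
    simulated = model(params)
    return sum(w * nash_sutcliffe(simulated[name], observed[name])
               for name, w in weights.items())

# Example usage (hypothetical): weight runoff most heavily, but let snow and
# groundwater constrain the parameters too, instead of calibrating sequentially.
# best = max(candidate_parameter_sets,
#            key=lambda p: integrated_criterion(
#                p, hbv_like_model, observed,
#                {'runoff': 0.6, 'snow': 0.2, 'groundwater': 0.2}))
```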


How to exploit external model of data for parameter estimation?

INTERNATIONAL JOURNAL OF ADAPTIVE CONTROL AND SIGNAL PROCESSING, Issue 1 2006
Miroslav Kárný
Abstract Any cooperation in multiple-participant decision making (DM) relies on an exchange of individual knowledge pieces and aims. A general methodology for their rational exploitation without calling for an objective mediator is still missing. The desired methodology is proposed for an important particular case, in which a participant performing Bayesian parameter estimation is offered a model relating the observable data to their past history. The designed solution is based on the so-called fully probabilistic design (FPD) of DM strategies. The result reduces to 'ordinary' Bayesian estimation if the offered model is the sample probability density function (pdf), i.e. if it provides additional observations. Copyright © 2005 John Wiley & Sons, Ltd. [source]
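As an illustration of the limiting case the abstract mentions, where the offered external model amounts to nothing more than additional observations, the sketch below performs an ordinary conjugate Bayesian update in which offered data enter the posterior exactly like the participant's own data. It is a toy Beta-Bernoulli example of that limiting case, not the paper's fully probabilistic design machinery.

```python
from dataclasses import dataclass

@dataclass
class BetaPosterior:
    """Conjugate Beta posterior for a Bernoulli success probability."""
    alpha: float = 1.0  # prior pseudo-count of successes
    beta: float = 1.0   # prior pseudo-count of failures

    def update(self, observations):
        """Ordinary Bayesian update: each observation (0 or 1) adds to the counts."""
        for y in observations:
            self.alpha += y
            self.beta += 1 - y
        return self

    def mean(self):
        return self.alpha / (self.alpha + self.beta)

# A participant's own data and the data "offered" by another participant
# enter the posterior identically -- the case where the external model
# carries nothing more than additional observations.
own = [1, 0, 1, 1]
offered = [0, 1, 1]
posterior = BetaPosterior().update(own).update(offered)
print(round(posterior.mean(), 3))
```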


ICOADS release 2.1 data and products

INTERNATIONAL JOURNAL OF CLIMATOLOGY, Issue 7 2005
Steven J. Worley
Abstract The International Comprehensive Ocean-Atmosphere Data Set (ICOADS), release 2.1 (1784-2002), is the largest available set of in situ marine observations. Observations from ships include instrument measurements and visual estimates, and data from moored and drifting buoys are exclusively instrumental. The ICOADS collection is constructed from many diverse data sources, and is made inhomogeneous by the changes in observing systems and recording practices used throughout the period of record, which spans over two centuries. Nevertheless, it is a key reference data set that documents the long-term environmental state, provides input to a variety of critical climate and other research applications, and serves as a basis for many associated products and analyses. The observational database is augmented with higher-level ICOADS data products. The observed data are synthesized into products by computing statistical summaries, on a monthly basis, for samples within 2° latitude × 2° longitude and 1° × 1° boxes beginning in 1800 and 1960, respectively. For each resolution the summaries are computed using two different data mixtures and quality-control criteria. This partially controls and contrasts the effects of changing observing systems and accounts for periods with greater climate variability. The ICOADS observations and products are freely distributed worldwide. The standard ICOADS release is supplemented in several ways: additional summaries are produced using experimental quality control, additional observations are made available in advance of their formal blending into a release, and metadata that define recent ships' physical characteristics and instruments are available. Copyright © 2005 Royal Meteorological Society [source]
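A minimal sketch of the kind of gridded monthly summary the abstract describes: marine reports are binned into fixed latitude-longitude boxes by month and reduced to simple statistics. The column names, box logic and synthetic reports are illustrative assumptions, not the ICOADS schema or processing code.

```python
import numpy as np
import pandas as pd

def box_summaries(df, box_size=2.0):
    """Monthly statistical summaries on a box_size-degree latitude/longitude grid.

    `df` is assumed to have columns 'time', 'lat', 'lon' and 'sst'
    (any observed variable would do).
    """
    df = df.copy()
    df['month'] = pd.to_datetime(df['time']).dt.to_period('M')
    df['lat_box'] = np.floor(df['lat'] / box_size) * box_size
    df['lon_box'] = np.floor(df['lon'] / box_size) * box_size
    return (df.groupby(['month', 'lat_box', 'lon_box'])['sst']
              .agg(['count', 'mean', 'median', 'std'])
              .reset_index())

# Example with a handful of synthetic ship reports:
reports = pd.DataFrame({
    'time': ['1960-01-03', '1960-01-17', '1960-01-20', '1960-02-02'],
    'lat':  [50.3, 51.1, 50.8, 50.3],
    'lon':  [-20.4, -20.9, -21.6, -20.4],
    'sst':  [10.2, 10.6, 9.9, 9.5],
})
print(box_summaries(reports))
```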


Using the Lee-Carter Method to Forecast Mortality for Populations with Limited Data

INTERNATIONAL STATISTICAL REVIEW, Issue 1 2004
Nan Li
Summary The Lee-Carter method for modeling and forecasting mortality has been shown to work quite well given long time series of data. Here we consider how it can be used when there are few observations at uneven intervals. Assuming that the underlying model is correct and that the mortality index follows a random walk with drift, we find the method can be used with sparse data. The central forecast depends mainly on the first and last observations, and so can be generated with just two observations, preferably not too close together in time. With three data points, uncertainty can also be estimated, although such estimates of uncertainty are themselves highly uncertain and improve with additional observations. We apply the methods to China and South Korea, which have 3 and 20 data points, respectively, at uneven intervals. [source]
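The sparse-data behaviour described in the summary can be sketched directly: under a random walk with drift, the drift estimate for the mortality index k_t uses only the first and last observations, while an innovation-variance estimate (and hence forecast uncertainty) needs at least three points. The code below is a simplified illustration under those assumptions, with made-up index values, not the paper's estimates for China or South Korea.

```python
import numpy as np

def forecast_mortality_index(times, k_values, horizon):
    """Forecast the Lee-Carter mortality index k_t from sparse, unevenly spaced data.

    Under a random walk with drift, the drift estimate depends only on the
    first and last observations; estimating the innovation variance, and hence
    a forecast standard error, needs at least three points.
    """
    times = np.asarray(times, dtype=float)
    k = np.asarray(k_values, dtype=float)

    drift = (k[-1] - k[0]) / (times[-1] - times[0])
    central = k[-1] + drift * horizon

    std_error = None
    if len(k) >= 3:
        # Innovation variance per unit time, from increments between observations.
        residuals = np.diff(k) - drift * np.diff(times)
        sigma2 = np.sum(residuals**2 / np.diff(times)) / (len(k) - 2)
        std_error = np.sqrt(sigma2 * horizon)

    return central, std_error

# Hypothetical index values at three uneven time points (not the paper's data):
central, se = forecast_mortality_index([1990, 1995, 2000], [0.0, -4.5, -9.5], horizon=10)
print(central, se)
```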


Impact study of the 2003 North Atlantic THORPEX Regional Campaign

THE QUARTERLY JOURNAL OF THE ROYAL METEOROLOGICAL SOCIETY, Issue 615 2006
Nadia Fourrié
Abstract An experiment took place during autumn 2003 with the aim of testing the feasibility of operational targeting of observations over the North Atlantic Ocean in the context of the international programme THORPEX. The purpose of this paper is to evaluate the impact of these additional observations in the French operational model ARPEGE during the last three weeks of the campaign. Results are shown for large regions over and around the North Atlantic Ocean and for specific verification areas. Over Europe, the addition of observations is slightly beneficial for the forecast, mostly in the lower troposphere over wide areas and above 100 hPa. However, the impact of extra data is more significant but also more mixed for the dedicated verification areas: it is case-, forecast-range- and level-dependent. In addition, the information content is studied with the Degrees of Freedom for Signal (DFS) to evaluate the observation impact on the analysis for one case from December 2003. First, the variations of the DFS are illustrated in a simplified data assimilation system. It was found for that case that satellite data make the most important global contribution to the overall analysis, especially the humidity-sensitive infrared radiances. Among the conventional data, the wind measurements from aircraft and from the geostationary satellites are the most informative. For the targeted area, the data from aircraft and the dropsondes have the largest DFS. The DFS of the dropsondes located in the sensitivity maximum is larger than that of the other dropsondes, even though the DFS has no direct link to the forecast. However, the impact of the dropsondes grows with forecast range and leads to an improvement of the forecast for this case. Copyright © 2006 Royal Meteorological Society [source]
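For readers unfamiliar with the diagnostic, the Degrees of Freedom for Signal can be computed in a linear analysis as the trace of H K, where K is the gain matrix and H the observation operator; the diagonal of H K attributes information content to individual observations, so contributions can be summed per observing system. The toy example below illustrates that definition with a three-variable state; it is a simplified sketch, not the ARPEGE assimilation system.

```python
import numpy as np

def degrees_of_freedom_for_signal(H, B, R):
    """DFS = trace(H K) for a linear analysis x_a = x_b + K (y - H x_b),
    with Kalman gain K = B H^T (H B H^T + R)^-1.

    Returns the total DFS and the per-observation contributions (diag of H K).
    """
    S = H @ B @ H.T + R                  # innovation covariance
    HK = H @ B @ H.T @ np.linalg.inv(S)  # sensitivity of analysed obs equivalents to the obs
    return np.trace(HK), np.diag(HK)

# Toy 3-variable state observed by one accurate and two noisier instruments.
H = np.eye(3)
B = 1.0 * np.eye(3)            # background-error covariance
R = np.diag([0.1, 1.0, 1.0])   # observation-error covariance
total, per_obs = degrees_of_freedom_for_signal(H, B, R)
print(total, per_obs)          # the accurate observation contributes the most information
```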


An adaptive buddy check for observational quality control

THE QUARTERLY JOURNAL OF THE ROYAL METEOROLOGICAL SOCIETY, Issue 577 2001
Dick P. Dee
Abstract An adaptive buddy-check algorithm is presented that adjusts tolerances for suspect observations based on the variability of surrounding data. The algorithm derives from a statistical hypothesis test combined with maximum-likelihood covariance estimation. Its stability is shown to depend on the initial identification of outliers by a simple background check. The adaptive feature ensures that the final quality-control decisions are not very sensitive to prescribed statistics of first-guess and observation errors, nor to other approximations introduced into the algorithm. The implementation of the algorithm in a global atmospheric data assimilation system is described. Its performance is contrasted with that of a non-adaptive buddy check for the surface analysis of an extreme storm that took place over Europe on 27 December 1999. The adaptive algorithm allowed the inclusion of many important observations that differed greatly from the first guess and that would have been excluded on the basis of prescribed statistics. The analysis of the storm development was much improved as a result of these additional observations. [source]
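A minimal sketch of the adaptive idea described in the abstract: the rejection tolerance for a suspect observation is scaled by a maximum-likelihood estimate of the local departure variance obtained from surrounding (buddy) data, so observations from a genuinely extreme event are not discarded merely because they depart strongly from the first guess. The function, data layout and threshold are illustrative assumptions, not the operational algorithm.

```python
import numpy as np

def adaptive_buddy_check(departures, neighbours, prescribed_var, threshold=3.0):
    """Toy adaptive buddy check on observation-minus-background departures.

    For each suspect observation, the tolerance is inflated when the surrounding
    (buddy) departures are more variable than the prescribed error statistics,
    so the final decision is not overly sensitive to those prescribed values.
    """
    decisions = []
    for d, buddies in zip(departures, neighbours):
        buddies = np.asarray(buddies, dtype=float)
        # Maximum-likelihood estimate of the local departure variance from the buddies,
        # floored at the prescribed value so the check never becomes stricter than before.
        local_var = max(prescribed_var, np.mean(buddies**2))
        decisions.append(abs(d) <= threshold * np.sqrt(local_var))
    return decisions

# A large departure surrounded by similarly large buddy departures is kept;
# the same departure surrounded by small ones is rejected.
print(adaptive_buddy_check(
    departures=[8.0, 8.0],
    neighbours=[[7.5, 9.0, 8.5], [0.5, -0.3, 0.8]],
    prescribed_var=1.0))
```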