Real-time Data

Selected Abstracts


Dealing with Benchmark Revisions in Real-Time Data: The Case of German Production and Orders Statistics

OXFORD BULLETIN OF ECONOMICS & STATISTICS, Issue 2 2009
Thomas A. Knetsch
Abstract Benchmark revisions in non-stationary real-time data may adversely affect the results of regular revision analysis and the estimates of long-run economic relationships. Cointegration analysis can reveal the nature of vintage heterogeneity and guide the adjustment of real-time data for benchmark revisions. Affine vintage transformation functions estimated by cointegration regressions are a flexible tool, whereas differencing and rebasing work well only under certain circumstances. Inappropriate vintage transformation may cause observed revision statistics to be affected by nuisance parameters. Using real-time data of German industrial production and orders, the econometric techniques are exemplified and the theoretical claims are examined empirically. [source]
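
The vintage-transformation idea lends itself to a compact illustration. Below is a minimal sketch, not the author's code: two synthetic vintages share a stochastic trend, and the affine map between them is estimated by an OLS (cointegrating) regression of the new vintage on the old one. All series and parameter values are invented for illustration.

```python
import numpy as np
import statsmodels.api as sm

# Synthetic (log-level) vintages sharing one stochastic trend.
rng = np.random.default_rng(0)
trend = np.cumsum(rng.normal(0.3, 1.0, 200))                # common I(1) trend
old_vintage = trend + rng.normal(0, 0.2, 200)               # pre-benchmark vintage
new_vintage = 1.5 + 0.98 * trend + rng.normal(0, 0.2, 200)  # post-benchmark vintage

# Cointegrating regression new_t = a + b * old_t + u_t; the fitted
# affine function maps the old vintage onto the new benchmark.
fit = sm.OLS(new_vintage, sm.add_constant(old_vintage)).fit()
a, b = fit.params
adjusted_old = a + b * old_vintage                          # benchmark-adjusted series
print(f"estimated affine transformation: a = {a:.3f}, b = {b:.3f}")
```

Loosely, rebasing corresponds to a pure rescaling (a = 0) and differencing tolerates only a level shift (b = 1), which is consistent with the abstract's point that these simpler adjustments work only under certain circumstances.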


Real-time data-based risk assessment for hazard installations storing flammable gas

PROCESS SAFETY PROGRESS, Issue 3 2008
Zhou Jianfeng
Abstract Large quantities of dangerous substances, especially explosive or flammable gases, are processed or stored in hazard installations. A risk-based warning/early-warning system for major hazard installations is very important for the prevention of major accidents. The real-time data-based risk and the risk factors are analyzed in this article, and a fuzzy-logic-based real-time risk assessment method is proposed. On the basis of fuzzy logic theory, the likelihood of an accident occurrence and the consequence of the accident can be assessed, and the risk value or risk level can be evaluated by utilizing a risk matrix. The method takes advantage of the real-time data acquired from the safety monitoring system so that the change in the risk can be determined as the accident develops. A risk assessment simulation of a vapor cloud explosion (VCE) accident caused by gas leaking from a liquefied petroleum gas tank is performed. It is shown that the risk of a VCE accident varies with the changes in the monitored data. © 2008 American Institute of Chemical Engineers Process Saf Prog, 2008 [source]
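
As a rough sketch of the risk-matrix step described above (not the article's implementation), the fragment below maps a monitored variable to a likelihood level and combines it with a consequence level through a lookup matrix; the matrix entries, the leak-rate thresholds, and all function names are hypothetical.

```python
# Hypothetical 5x5 risk matrix: rows = likelihood level, columns = consequence level.
RISK_MATRIX = [
    ["low",    "low",       "medium",    "medium",    "high"],
    ["low",    "medium",    "medium",    "high",      "high"],
    ["medium", "medium",    "high",      "high",      "very high"],
    ["medium", "high",      "high",      "very high", "very high"],
    ["high",   "high",      "very high", "very high", "very high"],
]

def likelihood_level(leak_rate: float) -> int:
    """Map a monitored leak rate (kg/s, illustrative thresholds) to a 0-4 level."""
    thresholds = [0.01, 0.1, 1.0, 10.0]
    return sum(leak_rate > t for t in thresholds)

def risk_level(leak_rate: float, consequence: int) -> str:
    """Combine the real-time likelihood level with an assessed consequence level (0-4)."""
    return RISK_MATRIX[likelihood_level(leak_rate)][consequence]

# As the monitored data change, the assessed risk changes with them.
for rate in (0.005, 0.5, 20.0):
    print(f"leak rate {rate:>6} kg/s -> risk: {risk_level(rate, consequence=3)}")
```

A fuzzy implementation would replace the crisp thresholds with membership functions and aggregate overlapping rules, but the matrix lookup is the same final step.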


On-Line Control Architecture for Enabling Real-Time Traffic System Operations

COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, Issue 5 2004
Srinivas Peeta
Critical to the effectiveness of real-time traffic systems are the control architectures that provide a blueprint for the efficient transmission and processing of large amounts of real-time data, and the consistency-checking and fault-tolerance mechanisms that ensure seamless automated functioning. However, the lack of low-cost, high-performance, easy-to-build computing environments is a key impediment to the widespread deployment of such architectures in the real-time traffic operations domain. This article proposes an Internet-based on-line control architecture that uses a Beowulf cluster as its computational backbone and provides an automated mechanism for real-time route guidance to drivers. To investigate this concept, the computationally intensive optimization modules are implemented on a low-cost 16-processor Beowulf cluster and a commercially available supercomputer, and the performance of these systems on representative computations is measured. The results highlight the effectiveness of the cluster in generating substantial computational performance scalability, and suggest that its performance is comparable to that of the more expensive supercomputer. [source]
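
To make the division of labour concrete, here is a minimal sketch (purely illustrative and unrelated to the article's actual modules) of farming computationally intensive evaluations out to a pool of workers, much as a cluster scheduler would farm them out to Beowulf nodes; route_cost is a hypothetical stand-in for one optimization task.

```python
from multiprocessing import Pool

def route_cost(od_pair):
    """Hypothetical stand-in for one computationally intensive
    route-guidance evaluation for an origin-destination pair."""
    origin, destination = od_pair
    return origin, destination, abs(origin - destination) * 1.7

if __name__ == "__main__":
    od_pairs = [(o, d) for o in range(100) for d in range(100) if o != d]
    # Distribute the evaluations across 16 workers, mirroring the
    # 16-processor cluster configuration mentioned in the abstract.
    with Pool(processes=16) as pool:
        results = pool.map(route_cost, od_pairs)
    print(len(results), "route evaluations completed")
```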


A workflow portal supporting multi-language interoperation and optimization

CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 12 2007
Lican Huang
Abstract In this paper we present a workflow portal for Grid applications which supports different workflow languages and workflow optimization. We present an XSLT converter that translates one workflow language into another, enabling interoperation between different workflow languages. We discuss strategies for choosing the optimal service from several semantically equivalent Web services in a Grid application. The dynamic selection of Web services involves discovering a set of semantically equivalent services by filtering the available services based on metadata, and selecting an optimal service based on real-time data and/or historical data recorded during prior executions. Finally, we describe the framework and implementation of the workflow portal, which aggregates the different components of the project using Java portlets. Copyright © 2007 John Wiley & Sons, Ltd. [source]
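
The conversion step is easy to picture with a toy example. The sketch below, a guess at the general mechanism rather than the portal's actual stylesheets, applies an XSLT transform with lxml to rewrite a made-up source workflow vocabulary (workflow/task) into a made-up target vocabulary (process/activity); both vocabularies are invented for illustration.

```python
from lxml import etree

# Toy stylesheet mapping <task> elements of a hypothetical source
# workflow language onto <activity> elements of a hypothetical target one.
XSLT_SRC = b"""\
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:template match="workflow">
    <process><xsl:apply-templates/></process>
  </xsl:template>
  <xsl:template match="task">
    <activity name="{@name}"/>
  </xsl:template>
</xsl:stylesheet>
"""

WORKFLOW = b'<workflow><task name="fetch"/><task name="analyse"/></workflow>'

transform = etree.XSLT(etree.XML(XSLT_SRC))
converted = transform(etree.XML(WORKFLOW))
print(etree.tostring(converted, pretty_print=True).decode())
```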


Metrics in the Science of Surge

ACADEMIC EMERGENCY MEDICINE, Issue 11 2006
Jonathan A. Handler MD
Metrics are the driver of positive change toward better patient care. However, the research into the metrics of the science of surge is incomplete, research funding is inadequate, and we lack a criterion standard metric for identifying and quantifying surge capacity. Therefore, a consensus working group was formed through a "viral invitation" process. With a combination of online discussion through a group e-mail list and in-person discussion at a breakout session of the Academic Emergency Medicine 2006 Consensus Conference, "The Science of Surge," seven consensus statements were generated. These statements emphasize the importance of funded research in the area of surge capacity metrics; the utility of an emergency medicine research registry; the need to make the data available to clinicians, administrators, public health officials, and internal and external systems; the importance of real-time data, data standards, and electronic transmission; seamless integration of data capture into the care process; the value of having data available from a single point of access through which data mining, forecasting, and modeling can be performed; and the basic necessity of a criterion standard metric for quantifying surge capacity. Further consensus work is needed to select a criterion standard metric for quantifying surge capacity. These consensus statements cover the future research needs, the infrastructure needs, and the data that are needed for a state-of-the-art approach to surge and surge capacity. [source]


Dielectrophoresis microsystem with integrated flow cytometers for on-line monitoring of sorting efficiency

ELECTROPHORESIS, Issue 24 2006
Zhenyu Wang
Abstract Dielectrophoresis (DEP) and flow cytometry are powerful technologies widely applied in microfluidic systems for handling and measuring cells and particles. Here, we present a novel microchip with a DEP selective filter integrated with two microchip flow cytometers (FCs) for on-line monitoring of cell sorting processes. On the microchip, the DEP filter is integrated in a microfluidic channel network to sort yeast cells by positive DEP. The detection windows of the two FCs are set upstream and downstream of the DEP filter. When a cell passes through the detection windows, the light scattered by the cell is measured by integrated polymer optical elements (waveguide, lens, and fiber coupler). By comparing the cell counting rates measured by the two FCs, the collection efficiency of the DEP filter can be determined. The chips were used for quantitative determination of the effect of flow rate, applied voltage, conductivity of the sample, and frequency of the electric field on the sorting efficiency. A theoretical model for the capture efficiency was developed, and reasonable agreement with the experimental results was observed. Viable and non-viable yeast cells showed different frequency dependencies and were sorted with high efficiency. At 2 MHz, more than 90% of the viable and less than 10% of the non-viable cells were captured on the DEP filter. The presented approach provides quantitative real-time data for sorting a large number of cells and will allow optimization of the conditions for, e.g., collecting cancer cells on a DEP filter while normal cells pass through the system. Furthermore, the microstructure is simple to fabricate and can easily be integrated with other microstructures for lab-on-a-chip applications. [source]
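
The on-line monitoring principle reduces to a simple calculation: whatever fraction of the upstream counting rate does not reappear downstream was retained on the filter. A minimal sketch (illustrative numbers only, chosen to echo the reported 2 MHz result):

```python
def capture_efficiency(upstream_count: int, downstream_count: int) -> float:
    """Fraction of cells retained on the DEP filter, inferred from the
    counting rates of the upstream and downstream flow cytometers."""
    if upstream_count == 0:
        raise ValueError("no cells detected upstream")
    return 1.0 - downstream_count / upstream_count

# Illustrative counts over the same interval (not measured data):
print(capture_efficiency(1000, 85))   # viable cells: ~0.92 captured
print(capture_efficiency(1000, 930))  # non-viable cells: ~0.07 captured
```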


Using Taylor Rules to Understand European Central Bank Monetary Policy

GERMAN ECONOMIC REVIEW, Issue 3 2007
Stephan Sauer
Keywords: Taylor rule; European Central Bank; real-time data
Abstract Over the last decade, the simple instrument policy rule developed by Taylor has become a popular tool for evaluating the monetary policy of central banks. As extensive empirical analysis of the European Central Bank's (ECB) past behaviour still seems to be in its infancy, we estimate several instrument policy reaction functions for the ECB to shed some light on actual monetary policy in the euro area under the presidency of Wim Duisenberg, and to answer questions such as whether the ECB has so far followed a stabilizing or a destabilizing rule. Looking at contemporaneous Taylor rules, the evidence presented suggests that the ECB is accommodating changes in inflation and hence follows a destabilizing policy. However, this impression seems to be largely due to the lack of a forward-looking perspective in such specifications. Either assuming rational expectations and using a forward-looking specification, or using expectations as derived from surveys, results in Taylor rules that do imply a stabilizing role of the ECB. The use of real-time industrial production data does not seem to play as significant a role as in the case of the United States. [source]
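
A contemporaneous Taylor-rule regression of the kind estimated in the paper can be sketched in a few lines. The sketch below uses synthetic data (not the paper's euro-area dataset) and checks the Taylor principle: a stabilizing rule requires an estimated inflation response greater than one.

```python
import numpy as np
import statsmodels.api as sm

# Synthetic stand-ins for euro-area inflation, output gap and policy rate.
rng = np.random.default_rng(1)
inflation = 2.0 + rng.normal(0, 0.5, 80)
output_gap = rng.normal(0, 1.0, 80)
rate = 2.0 + 1.5 * (inflation - 2.0) + 0.5 * output_gap + rng.normal(0, 0.2, 80)

# Contemporaneous rule i_t = c + a*(pi_t - pi*) + b*gap_t, with pi* = 2.
X = sm.add_constant(np.column_stack([inflation - 2.0, output_gap]))
fit = sm.OLS(rate, X).fit()
c, a, b = fit.params
print(f"inflation response a = {a:.2f} -> "
      + ("stabilizing rule" if a > 1 else "destabilizing (accommodating) rule"))
```

A forward-looking variant would replace contemporaneous inflation with a survey or rational-expectations forecast, which is exactly the change that overturns the destabilizing verdict in the paper.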


Forecast-Based Monetary Policy: The Case of Sweden

INTERNATIONAL FINANCE, Issue 3 2003
Per Jansson
Central banks are dominant players in financial markets and economic policy. For both democratic and efficiency reasons, it is important that central banks' actions can be understood, predicted, and evaluated. Inflation-targeting central banks that publish their forecasts provide unique opportunities for detailed studies of monetary policy based on real-time data. This paper demonstrates how a central bank's forecasts can be used to identify two different forms of discretionary monetary policy: 'policy shocks' (deviations from systematic policy) and 'judgements' in forecasting. [source]
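
The 'policy shock' notion has a direct mechanical reading: it is the gap between the observed rate and the rate implied by the systematic, forecast-based rule. A minimal sketch with an invented rule and invented numbers (nothing here is actual central-bank data):

```python
import numpy as np

def rule_rate(pi_forecast, gap_forecast):
    """Hypothetical systematic, forecast-based rule:
    i_t = 4 + 1.5*(E[pi] - 2) + 0.5*E[gap]."""
    return 4.0 + 1.5 * (pi_forecast - 2.0) + 0.5 * gap_forecast

actual_rate = np.array([4.25, 4.00, 3.75, 3.75])
pi_fc = np.array([2.1, 2.0, 1.8, 1.9])        # published inflation forecasts
gap_fc = np.array([0.2, -0.1, -0.4, -0.3])    # published output-gap forecasts

# Policy shocks: deviations of the actual rate from the systematic rule.
print(actual_rate - rule_rate(pi_fc, gap_fc))
```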


A monetary real-time conditional forecast of euro area inflation

JOURNAL OF FORECASTING, Issue 4 2010
Sylvia Kaufmann
Abstract Based on a vector error correction model, we produce conditional euro area inflation forecasts. We use real-time data on M3 and HICP, and include real GDP, the 3-month EURIBOR and the 10-year government bond yield as control variables. Real money growth and the term spread enter the system as stationary linear combinations. Missing and outlying values are substituted by model-based estimates using all available data information. In general, the conditional inflation forecasts are consistent with the European Central Bank's assessment of liquidity conditions for future inflation prospects. The evaluation of inflation forecasts under different monetary scenarios reveals the importance of keeping track of the money growth rate, in particular at the end of 2005. Copyright © 2009 John Wiley & Sons, Ltd. [source]
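
A scaled-down version of such a system can be put together with statsmodels. The sketch below fits a VECM with one cointegrating relation to synthetic stand-ins for three trending series and produces point forecasts; it is unconditional, whereas the paper's scenario forecasts additionally fix the future M3 path, a step statsmodels does not provide out of the box. All data are simulated.

```python
import numpy as np
from statsmodels.tsa.vector_ar.vecm import VECM

rng = np.random.default_rng(2)
n = 160
common = np.cumsum(rng.normal(0, 1, n))   # shared stochastic trend

# Synthetic stand-ins for log HICP, log M3 and log real GDP, built so
# that stationary linear combinations (cointegration) exist.
data = np.column_stack([
    1.0 * common + rng.normal(0, 0.3, n),
    1.2 * common + rng.normal(0, 0.3, n),
    0.8 * common + rng.normal(0, 0.3, n),
])

model = VECM(data, k_ar_diff=2, coint_rank=1, deterministic="co")
res = model.fit()
print(res.predict(steps=8))  # 8-step-ahead point forecasts
```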


A new production function estimate of the euro area output gap

JOURNAL OF FORECASTING, Issue 1-2 2010
Matthieu Lemoine
Abstract We develop a new version of the production function (PF) approach for estimating the output gap of the euro area. Assuming a CES (constant elasticity of substitution) technology, our model does not call for any (often imprecise) measure of the capital stock and improves the estimation of trend total factor productivity using a multivariate unobserved components model. With real-time data, we assess this approach by comparing it with the Hodrick–Prescott (HP) filter and with a Cobb–Douglas PF approach with a common cycle, implemented with a multivariate unobserved components model. Our new PF estimate appears highly concordant with the reference chronology of turning points and has better real-time properties than the univariate HP filter for sufficiently long time horizons. Its inflation-forecasting power, like that of the other multivariate approach, appears less favourable than that of the univariate statistical method. Copyright © 2009 John Wiley & Sons, Ltd. [source]
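
For the univariate benchmark in that comparison, the HP filter, statsmodels offers a one-liner; the sketch below applies it to a synthetic quarterly log-GDP series (lamb=1600 is the conventional quarterly smoothing parameter; the series itself is invented).

```python
import numpy as np
from statsmodels.tsa.filters.hp_filter import hpfilter

# Synthetic quarterly log GDP: drifting trend plus a slow cycle.
rng = np.random.default_rng(3)
log_gdp = np.cumsum(rng.normal(0.005, 0.008, 120)) + 0.02 * np.sin(np.arange(120) / 6)

cycle, trend = hpfilter(log_gdp, lamb=1600)
output_gap_pct = 100 * cycle        # gap in per cent of trend (log approximation)
print(output_gap_pct[-4:])
```

The well-known end-point sensitivity of this filter is what the real-time comparison exploits: late-sample HP estimates get revised as new data arrive, while the PF approach is reported to be more stable over sufficiently long horizons.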


Forecasting real-time data allowing for data revisions

JOURNAL OF FORECASTING, Issue 6 2007
Kosei Fukuda
Abstract A modeling approach to real-time forecasting that allows for data revisions is presented. In this approach, an observed time series is decomposed into stochastic trend, data revision, and observation noise in real time. The stochastic trend is defined such that its first difference follows an AR model, and the data revision, obtained only for the latest part of the time series, is also specified as an AR model. The proposed method is applicable to data sets with only one vintage. Empirical applications to real-time forecasting of the quarterly time series of US real GDP and its eight components illustrate the usefulness of the proposed approach. Copyright © 2007 John Wiley & Sons, Ltd. [source]
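
The assumed decomposition is easy to simulate, which also clarifies what "one vintage" means here: the revision component is observable only over the still-preliminary latest stretch of the series. A minimal sketch with invented parameters (not the paper's estimates):

```python
import numpy as np

def ar1(n, phi, sigma, rng):
    """Simulate a zero-mean AR(1) path."""
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.normal(0, sigma)
    return x

rng = np.random.default_rng(4)
n, latest = 200, 12

trend = np.cumsum(ar1(n, 0.6, 0.10, rng))   # stochastic trend: AR(1) first difference
revision = ar1(n, 0.8, 0.05, rng)           # AR(1) data-revision component
observed = trend + rng.normal(0, 0.05, n)   # plus observation noise

# Revisions affect only the latest, not-yet-final part of the vintage.
observed[-latest:] += revision[-latest:]
```

Real-time forecasting then amounts to filtering the trend out of `observed` while discounting the revision-contaminated tail.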


Data Revisions Are Not Well Behaved

JOURNAL OF MONEY, CREDIT AND BANKING, Issue 2-3 2008
S. BORAĞAN ARUOBA
Keywords: forecasting; news and noise; real-time data; NIPA variables
We document the empirical properties of revisions to major macroeconomic variables in the United States. Our findings suggest that they do not satisfy simple desirable statistical properties. In particular, we find that these revisions do not have a zero mean, which indicates that the initial announcements by statistical agencies are biased. We also find that the revisions are quite large compared to the original variables and they are predictable using the information set at the time of the initial announcement, which means that the initial announcements of statistical agencies are not rational forecasts. [source]
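
Both findings map onto two standard checks that are simple to reproduce. The sketch below runs them on simulated announcements (the bias and predictability are built into the simulation; no NIPA data are used): a t-test of the mean revision against zero, and a regression of the revision on the information available at the initial announcement.

```python
import numpy as np
from scipy import stats
import statsmodels.api as sm

# Simulated initial announcements and final values, with built-in
# bias (+0.3) and predictability (-0.2 * initial announcement).
rng = np.random.default_rng(5)
initial = rng.normal(3.0, 1.0, 120)
final = initial + 0.3 - 0.2 * initial + rng.normal(0, 0.4, 120)
revision = final - initial

# 1) Zero-mean test: a significant mean revision implies biased announcements.
print(stats.ttest_1samp(revision, 0.0))

# 2) Rationality test: if the initial announcement predicts its own revision,
#    the announcement was not a rational forecast.
fit = sm.OLS(revision, sm.add_constant(initial)).fit()
print(fit.params, fit.pvalues)
```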


Individualized and time-variant model for the functional link between thermoregulation and sleep onset

JOURNAL OF SLEEP RESEARCH, Issue 2 2006
STIJN QUANTEN
Summary This study makes use of control system model identification techniques to examine the relationship between thermoregulation and sleep regulation. Specifically, data-based mechanistic (DBM) modelling is used to formulate and experimentally test the hypothesis, put forth by Gilbert et al. [Sleep Med. Rev. 8 (2004) 81], that there exists a connection between distal heat loss and sleepiness. Six healthy sleepers each spent three nights and the following day in the sleep laboratory: an adaptation, a cognitive arousal and a neutral testing day. In the cognitive arousal condition, a television camera crew visited the laboratory and subjects were asked to give an interview. During each of the three 25-min driving simulator tasks per day, the distal-to-proximal gradient and the electroencephalogram were recorded. These experimental data show that there exists a feedback connection between thermoregulation and sleep. In addition to providing experimental evidence in support of the Gilbert et al. (2004) hypothesis, the authors propose that the nature of the feedback connection is determined by the nature of the sleep/wake state (i.e. NREM sleep versus unwanted sleepiness in active subjects). In addition, an individualized and time-variant model for the linkage between thermoregulation and sleep onset is presented. This compact model feeds on real-time data regarding distal heat loss and sleepiness and contains a physically meaningful parameter that delivers an individual- and time-dependent quantification of a well-known biological feature in the field of thermoregulation: the thermoregulatory error signal Thypo(t) − Tset(t). A validation of these physical/biological features emphasizes the reliability and power of DBM in describing individual differences related to the sleep process. [source]


Central limit theorems for nonparametric estimators with real-time random variables

JOURNAL OF TIME SERIES ANALYSIS, Issue 5 2010
Tae Yoon Kim
MSC: Primary 62G07, 62F12; Secondary 62M05. JEL: C13, C14.
In this article, asymptotic theories for nonparametric methods are studied when they are applied to real-time data. In particular, we derive central limit theorems for nonparametric density and regression estimators. For this we formally introduce a sequence of real-time random variables indexed by a parameter related to fine gridding of the time domain (or fine discretization). Our results show that the impact of fine gridding is greater in the density estimation case, in the sense that strong dependence due to fine gridding severely affects the major strength of the nonparametric density estimator (its data-adaptive property). In addition, we discuss some issues concerning nonparametric regression models with fine gridding of the time domain. [source]
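
The fine-gridding effect can be mimicked by estimating a density from ever finer observations of a single sample path, which are strongly dependent by construction. A small self-contained sketch (the bandwidth, grid and process are all illustrative choices, not the article's setting):

```python
import numpy as np

def kde(x_grid, sample, h):
    """Gaussian kernel density estimator evaluated on a grid."""
    u = (x_grid[:, None] - sample[None, :]) / h
    return np.exp(-0.5 * u**2).sum(axis=1) / (len(sample) * h * np.sqrt(2 * np.pi))

rng = np.random.default_rng(6)
# One discretized diffusion path observed on a fine time grid: refining
# the grid adds observations that are strongly dependent on their neighbours.
n, delta = 5000, 0.01
path = np.cumsum(rng.normal(0, np.sqrt(delta), n))

grid = np.linspace(-3, 3, 61)
print(kde(grid, path, h=0.3)[:5])
```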


Super-LOTIS (Livermore Optical Transient Imaging System)

ASTRONOMISCHE NACHRICHTEN, Issue 6-8 2004
D. Pérez-Ramírez
Abstract The 0.6-m Super-LOTIS telescope is a fully robotic system dedicated to the search for prompt optical emission from gamma-ray bursts (GRBs). The telescope began routine operations from its Steward Observatory site atop Kitt Peak (KPNO) in April 2000. This system is capable of responding to Gamma-ray burst Coordinates Network (GCN) triggers within seconds. Together with LOTIS, these systems have been monitoring the GCN real-time data for automatic HETE-2 GRB triggers. We summarize the current capabilities of the system and present recent scientific results. (© 2004 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source]