Computational Requirements

Selected Abstracts


Use of GPS/MET refraction angles in three-dimensional variational analysis

THE QUARTERLY JOURNAL OF THE ROYAL METEOROLOGICAL SOCIETY, Issue 570 2000
X. Zou
Abstract The Spectral Statistical Interpolation (SSI) analysis system of the National Centers for Environmental Prediction (NCEP) is modified to include GPS/MET data (meteorological data from the Global Positioning System) using a GPS ray-tracing operator. The new system is tested by incorporating 30 actual GPS/MET observations of refraction angles obtained during the GPS/MET experiment. This is the first time that real radio occultation refraction angles and refractivities have been incorporated into a three-dimensional variational analysis system. We examine the magnitude and the vertical distribution of the analysis adjustments that result from using refraction-angle observations in the NCEP SSI analysis system. The average magnitudes of the adjustments in the temperature and specific-humidity fields are approximately 0.4 °C and 0.6 g kg⁻¹, respectively; individual changes can be as large as 4 °C and 4 g kg⁻¹. The greatest adjustments to the temperature occur in the middle and upper troposphere and stratosphere, while the major changes in specific humidity occur in the lower troposphere. An assessment of the impact of the GPS/MET observations on the analysis, verified by conventional (mostly radiosonde) data, is difficult because of the small number of GPS/MET data used. Nevertheless, it is found that, even over data-rich regions (regions containing many radiosonde observations), and even when the verification data were the radiosonde data themselves, the use of GPS/MET refraction angles makes a slight overall improvement to the analysed temperatures and winds. The impact on the water-vapour analyses, again as measured against radiosonde data, is mixed, with improvements in some layers and degradation in others. Compared with the background field, the use of refraction angles from one occultation results in an analysis whose simulated refraction angles are much closer to the withheld GPS/MET refraction angles at the two nearby occultation locations, and whose temperature and moisture profiles are also closer to those resulting from the direct assimilation of the two withheld occultations. Although the forward model used in this study, with the ray tracing carried out in a two-dimensional plane, is much cheaper than a more accurate three-dimensional forward model, it is still quite expensive. To further reduce the computational requirement for assimilating GPS/MET data, we test a scheme in which the GPS/MET-retrieved refractivities (instead of refraction angles) are used above a selected height for each occultation. These heights are determined objectively from the departures of the model field from spherical symmetry. It is shown that the mixed use of GPS/MET refraction angles and refractivities produces an analysis similar to the one using refraction angles alone, while reducing the computational cost by more than 30%. [source]
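
To make the mixed-data scheme concrete, the following is a minimal Python sketch of how a per-occultation transition height might be chosen. The function name, the asymmetry proxy (horizontal spread of model refractivity), and the threshold are illustrative assumptions, not the paper's actual criterion:

```python
# Illustrative sketch, not the paper's code: pick, for one occultation, the
# height above which cheap retrieved refractivities can replace expensive
# ray-traced refraction angles. The asymmetry measure below (relative
# horizontal spread of model refractivity) is a hypothetical stand-in for
# the objective "departure from spherical symmetry" criterion.
import numpy as np

def transition_height(heights_km, refractivity_2d, threshold=0.01):
    """Lowest height above which the model refractivity field is nearly
    spherically symmetric; refractivity_2d has shape (n_heights, n_points),
    sampled along the occultation plane."""
    spread = refractivity_2d.std(axis=1) / refractivity_2d.mean(axis=1)
    symmetric = spread < threshold
    for i in range(len(heights_km)):
        if symmetric[i:].all():       # symmetric from here upward
            return heights_km[i]
    return heights_km[-1]             # fall back: ray-trace everywhere

# Hypothetical model field: exponential refractivity with horizontal noise
# that decays with height (asymmetry concentrated in the lower troposphere).
rng = np.random.default_rng(0)
heights = np.linspace(0.0, 60.0, 121)                      # km
field = (300.0 * np.exp(-heights / 7.0))[:, None] * \
        (1.0 + 0.1 * rng.random((121, 50)) * np.exp(-heights / 5.0)[:, None])
z_t = transition_height(heights, field)
print(f"assimilate refraction angles below {z_t:.1f} km, refractivities above")
```

Below the chosen height the expensive ray-tracing operator is applied; above it, the much cheaper refractivity operator is used, which is where the reported cost reduction of more than 30% comes from.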


MINIMAL VALID AUTOMATA OF SAMPLE SEQUENCES FOR DISCRETE EVENT SYSTEMS

ASIAN JOURNAL OF CONTROL, Issue 2 2004
Sheng-Luen Chung
ABSTRACT Minimal valid automata (MVA) refer to valid automaton models that fit a given input-output sequence sample from a Mealy machine model. They are minimal in the sense that the number of states in these automata is minimal. Critical to system-identification problems of discrete event systems, MVA can be considered a special case of the minimization problem for incompletely specified sequential machines (ISSM). While the minimization of ISSM in general is an NP-complete problem, various approaches have been proposed to alleviate the computational requirements by exploiting special structural properties of the ISSM at hand. In essence, the MVA problem is to find the minimal realization of an ISSM in which each state has only one subsequent state transition defined. This paper presents an algorithm that divides the minimization process into two phases: first, reduce the equivalent sequential machine to a reduced machine; then, minimize the reduced machine into minimal-realization solutions. An example comprehensively illustrating how the associated minimal valid automata are derived is also included. [source]
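
As a taste of what fitting an automaton to an input-output sample involves, the Python sketch below builds the prefix-tree Mealy machine for one sample and greedily merges pairwise-compatible states. It is a simplified illustration, not the paper's two-phase algorithm, and the greedy merge carries no minimality guarantee:

```python
# Illustrative sketch only: prefix-tree construction plus greedy merging of
# compatible states. NOT the two-phase minimization algorithm of the paper.

def prefix_tree(inputs, outputs):
    """One state per input prefix; trans[state][symbol] = (next_state, out)."""
    trans, state, next_id = {0: {}}, 0, 1
    for a, o in zip(inputs, outputs):
        trans[state][a] = (next_id, o)
        trans[next_id] = {}
        state, next_id = next_id, next_id + 1
    return trans

def compatible(trans, s, t, seen=None):
    """States are compatible if their defined behaviours never conflict
    (same outputs on shared inputs, successors compatible in turn)."""
    seen = seen if seen is not None else set()
    if (s, t) in seen:
        return True
    seen.add((s, t))
    for a in set(trans[s]) & set(trans[t]):
        (s2, out_s), (t2, out_t) = trans[s][a], trans[t][a]
        if out_s != out_t or not compatible(trans, s2, t2, seen):
            return False
    return True

def greedy_merge(trans):
    """Fold every state into the first compatible representative; returns
    the number of states remaining (an upper bound on the true minimum)."""
    states = sorted(trans)
    rep = {s: s for s in states}
    for t in states:
        for s in states:
            if s >= t:
                break
            if rep[s] == s and compatible(trans, s, t):
                rep[t] = s
                break
    return len(set(rep.values()))

io_sample = (list("ababb"), [0, 1, 0, 1, 1])   # toy Mealy machine sample
m = prefix_tree(*io_sample)
print("states before merging:", len(m), "after:", greedy_merge(m))
```

On this toy sample the six prefix-tree states collapse to one, since a single-state machine mapping a to 0 and b to 1 reproduces the sample exactly.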


Meeting Real-Time Traffic Flow Forecasting Requirements with Imprecise Computations

COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, Issue 3 2003
Brian L. Smith
This article explores the ability of imprecise computations to address real-time computational requirements in infrastructure control and management systems. The research focuses on the development of nonparametric regression as a means to forecast traffic flow rates for transportation management systems. Nonparametric regression is a forecasting technique based on nearest-neighbor searching, in which forecasts are derived from past observations that are similar to current conditions. A key concern regarding nonparametric regression is the significant time required to search for nearest neighbors in large databases. The results presented in this article indicate that approximate nearest neighbors, which are imprecise computations as applied to nonparametric regression, can sufficiently reduce the execution time of nonparametric regression, with acceptable degradation in forecast accuracy. The article concludes with a demonstration of the use of genetic algorithms as a design aid for real-time algorithms employing imprecise computations. [source]
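
A minimal sketch of the idea in Python, using SciPy's approximate k-nearest-neighbour search (the eps argument of cKDTree.query) as a stand-in for the approximate-nearest-neighbour computation the article evaluates; the data, state definition, and parameters are hypothetical:

```python
# Sketch, not the authors' system: k-NN ("nonparametric regression") traffic
# flow forecasting. eps > 0 permits approximate neighbours -- each returned
# neighbour is within (1 + eps) of the true k-th neighbour distance -- which
# speeds the search at a small cost in accuracy.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)

# Hypothetical history: each row is a "state" (flows over the last three
# 15-minute intervals, vehicles/hour); the target is the next interval's flow.
history = rng.uniform(200.0, 2000.0, size=(50_000, 3))
next_flow = 0.9 * history[:, -1] + rng.normal(0.0, 50.0, size=50_000)

tree = cKDTree(history)

def forecast(state, k=10, eps=0.0):
    """Average the next-interval flow over the k most similar past states."""
    _, idx = tree.query(state, k=k, eps=eps)
    return next_flow[idx].mean()

state_now = np.array([900.0, 950.0, 1000.0])
print("exact k-NN forecast      :", round(forecast(state_now), 1))
print("approximate k-NN forecast:", round(forecast(state_now, eps=0.5), 1))
```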


The embedded ion method: A new approach to the electrostatic description of crystal lattice effects in chemical shielding calculations

CONCEPTS IN MAGNETIC RESONANCE, Issue 5 2006
Dirk Stueber
Abstract The nuclear magnetic shielding anisotropy of NMR-active nuclei is highly sensitive to the nuclear electronic environment. Hence, measurements of the nuclear magnetic shielding anisotropy represent a powerful tool in the elucidation of molecular structure for a wide variety of materials. Quantum mechanical ab initio nuclear magnetic shielding calculations effectively complement the experimental NMR data by revealing additional structural information. The accuracy and capacity of these calculations have improved considerably in recent years. However, the inherent limitation on the size of the systems that may be studied, imposed by the relatively demanding computational requirements, largely remains. Accordingly, ab initio shielding calculations have been performed predominantly on isolated molecules, neglecting the molecular environment. This approach is sufficient for neutral nonpolar systems, but leads to serious errors in shielding calculations on polar and ionic systems. Conducting ab initio shielding calculations on clusters of molecules (i.e., including the nearest-neighbor interactions) has improved the accuracy of the calculations in many cases. Other methods of simulating crystal lattice effects in shielding calculations include the electrostatic representation of the crystal lattice using point-charge arrays, full ab initio methods, ab initio methods under periodic boundary conditions, and hybrid ab initio/molecular dynamics methods. The embedded ion method (EIM) discussed here follows the electrostatic approach. In quantum mechanical shielding calculations, the method mimics the intermolecular and interionic interactions experienced by a subject molecule or cluster in a given crystal with a large but finite, periodic, self-consistent array of point charges. The point-charge arrays in the EIM are generated using the Ewald summation method and embed the molecule or ion of interest for which the ab initio shielding calculations are performed. The accuracy with which the EIM reproduces experimental nuclear magnetic shift tensor principal values, the sensitivity of the EIM to the parameters defining the point-charge arrays, and the strengths and limitations of the EIM in comparison with other methods that include crystal lattice effects in chemical shielding calculations are presented. © 2006 Wiley Periodicals, Inc. Concepts Magn Reson Part A 28A: 347–368, 2006 [source]
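
The toy Python sketch below illustrates only the embedding idea: unit-cell point charges are replicated over a finite block of cells, and the electrostatic potential they produce is evaluated at the atoms of the central molecule or ion. A real EIM calculation would instead generate the array with the Ewald summation method so that the finite array reproduces the infinite-lattice potential, and would pass the charges to the ab initio shielding code; the CsCl-like cell and charges below are hypothetical:

```python
# Toy illustration of electrostatic embedding, NOT the EIM implementation:
# direct summation over a finite block of replicated unit-cell point charges.
import numpy as np

def lattice_potential(sites, charges, cell, targets, n_shells=5):
    """Coulomb potential (atomic units) at `targets` from point charges
    replicated over (2*n_shells + 1)**3 cells, skipping the central cell
    (the central molecule/ion is treated ab initio, not as point charges)."""
    pot = np.zeros(len(targets))
    shells = range(-n_shells, n_shells + 1)
    for i in shells:
        for j in shells:
            for k in shells:
                if i == j == k == 0:
                    continue
                offset = i * cell[0] + j * cell[1] + k * cell[2]
                for q, r in zip(charges, sites):
                    d = np.linalg.norm(targets - (r + offset), axis=1)
                    pot += q / d
    return pot

cell = np.eye(3) * 5.6                                 # cubic cell, bohr
sites = np.array([[0.0, 0.0, 0.0], [2.8, 2.8, 2.8]])   # CsCl-like motif
charges = np.array([+1.0, -1.0])
print(lattice_potential(sites, charges, cell, sites))  # potential at each ion
```

For real ionic crystals this direct sum converges slowly and only conditionally, which is precisely why the EIM relies on Ewald summation to build its point-charge arrays.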


Fast template matching using correlation-based adaptive predictive search

INTERNATIONAL JOURNAL OF IMAGING SYSTEMS AND TECHNOLOGY, Issue 3 2003
Shijun Sun
Abstract We have developed Correlation-based Adaptive Predictive Search (CAPS) as a fast search strategy for multidimensional template matching. A 2D template is analyzed, and certain characteristics are computed from its autocorrelation. The extracted information is then used to speed up the search procedure. This method provides a significant improvement in computation time while retaining the accuracy of traditional full-search matching. We have extended CAPS to three and higher dimensions. An example of a third dimension is rotation, where rotated targets can be located while again substantially reducing the computational requirements. CAPS can also be applied in multiple steps to further speed up the template-matching process. Experiments were conducted to evaluate the performance of the 2D, 3D, and multiple-step CAPS algorithms. Compared to the conventional full-search method, we achieved speedup ratios of up to 66.5 and 145 with 2D and 3D CAPS, respectively. © 2003 Wiley Periodicals, Inc. Int J Imaging Syst Technol 13, 169–178, 2003; Published online in Wiley InterScience (www.interscience.wiley.com). DOI 10.1002/ima.10055 [source]
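
The following Python sketch conveys the gist under simplifying assumptions: the width of the template's autocorrelation peak sets a coarse-search stride, and a fine search then refines around the best coarse candidate. The names and details are illustrative, not the published CAPS algorithm:

```python
# Sketch of a CAPS-like coarse-to-fine search, not the authors' algorithm.
import numpy as np
from scipy.ndimage import gaussian_filter

def autocorr_stride(template, drop=0.5):
    """Largest diagonal shift at which the normalized autocorrelation stays
    above `drop` -- used as a coarse-search step that should not skip over
    the correlation peak."""
    t = template - template.mean()
    total = (t * t).sum()
    for s in range(1, min(template.shape)):
        r = (t[s:, s:] * t[:-s, :-s]).sum() / total
        if r < drop:
            return max(s - 1, 1)
    return 1

def match(image, template):
    th, tw = template.shape
    t = (template - template.mean()) / template.std()

    def score(y, x):  # normalized cross-correlation at one position
        w = image[y:y + th, x:x + tw]
        return ((w - w.mean()) / (w.std() + 1e-12) * t).mean()

    step = autocorr_stride(template)
    ys = range(0, image.shape[0] - th + 1, step)
    xs = range(0, image.shape[1] - tw + 1, step)
    y0, x0 = max(((y, x) for y in ys for x in xs), key=lambda p: score(*p))

    # Fine search in the neighbourhood of the best coarse candidate.
    fine = [(y, x)
            for y in range(max(y0 - step, 0),
                           min(y0 + step + 1, image.shape[0] - th + 1))
            for x in range(max(x0 - step, 0),
                           min(x0 + step + 1, image.shape[1] - tw + 1))]
    return max(fine, key=lambda p: score(*p))

rng = np.random.default_rng(1)
img = gaussian_filter(rng.normal(size=(256, 256)), sigma=3.0)  # smooth scene
tpl = img[100:132, 50:82].copy()
print(match(img, tpl))   # expected: (100, 50)
```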


PDB_REDO: automated re-refinement of X-ray structure models in the PDB

JOURNAL OF APPLIED CRYSTALLOGRAPHY, Issue 3 2009
Robbie P. Joosten
Structural biology, homology modelling and rational drug design require accurate three-dimensional macromolecular coordinates. However, the coordinates in the Protein Data Bank (PDB) have not all been obtained using the latest experimental and computational methods. In this study a method is presented for the automated re-refinement of existing structure models in the PDB. A large-scale benchmark with 16,807 PDB entries showed that these models can be improved both in their fit to the deposited experimental X-ray data and in their geometric quality. The re-refinement protocol uses TLS models to describe concerted atom movement. The resulting structure models are made available through the PDB_REDO databank (http://www.cmbi.ru.nl/pdb_redo/). Grid computing techniques were used to meet the computational requirements of this endeavour. [source]


Analysis of algorithms for two-stage flowshops with multi-processor task flexibility

NAVAL RESEARCH LOGISTICS: AN INTERNATIONAL JOURNAL, Issue 1 2004
George L. Vairaktarakis
Abstract In this article we introduce a 2-machine flowshop with processing flexibility. Two processing modes are available for each task: processing by the designated processor, or processing simultaneously by both processors. The objective studied is makespan minimization. This production environment is encountered in repetitive manufacturing shops equipped with processors that have the flexibility to execute orders either individually or in coordination. In the latter case, the product designer exploits processing synergies between the two processors so as to execute a particular task much faster than a dedicated processor could. This type of flowshop environment is also encountered in labor-intensive assembly lines where products moving downstream can be processed either in the designated assembly stations or by pooling the work teams of adjacent stations. The scheduling problem requires determining the processing mode of each task and a schedule that preserves the flowshop constraints. We show that the problem is NP-complete in the ordinary sense and obtain an optimal solution using a dynamic programming algorithm whose computational requirements are considerable for medium and large problems. We then present a number of dynamic programming relaxations and analyze their worst-case error performance. Finally, we present a polynomial-time heuristic whose worst-case error performance is comparable to that of the dynamic programming relaxations. © 2003 Wiley Periodicals, Inc. Naval Research Logistics, 2004. [source]
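
For tiny instances, the combined mode-assignment and sequencing decision can be made concrete with a brute-force Python sketch. The timing data and the model simplifications (permutation schedules, no preemption, a coordinated task blocking both machines) are illustrative assumptions, not the paper's dynamic program:

```python
# Toy exact solver by exhaustive enumeration -- only for illustrating the
# problem structure; the paper's dynamic programming algorithm is far more
# efficient and is not reproduced here.
from itertools import permutations, product

# Each job: ((d1, c1), (d2, c2)) -- dedicated vs. coordinated times per stage.
jobs = [((4, 2), (3, 2)), ((5, 3), (2, 1)), ((3, 2), (4, 2))]  # hypothetical

def makespan(order, modes):
    m1 = m2 = 0.0                      # times at which machines become free
    for j in order:
        (d1, c1), (d2, c2) = jobs[j]
        mode1, mode2 = modes[j]
        if mode1 == "ded":             # stage 1 on machine 1 alone
            end1 = m1 + d1
            m1 = end1
        else:                          # stage 1 on both machines together
            end1 = max(m1, m2) + c1
            m1 = m2 = end1
        if mode2 == "ded":             # stage 2 on machine 2 alone
            m2 = max(end1, m2) + d2
        else:                          # stage 2 on both machines together
            m1 = m2 = max(end1, m1, m2) + c2
    return max(m1, m2)

best = min(
    (makespan(order, modes), order, modes)
    for order in permutations(range(len(jobs)))
    for modes in product(product(("ded", "co"), repeat=2), repeat=len(jobs))
)
print("optimal makespan:", best[0], "order:", best[1], "modes:", best[2])
```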