Function Evaluation (function + evaluation)

Selected Abstracts


Approximation methods for reliability-based design optimization problems

GAMM - MITTEILUNGEN, Issue 2 2007
Irfan Kaymaz
Abstract Deterministic optimum designs are obtained without considering uncertainties in problem parameters such as material properties (yield stress, allowable stresses, moment capacities, etc.), external loadings, manufacturing errors, tolerances and cost functions, which can lead to unreliable designs. Several methods have therefore been developed to treat uncertainties in engineering analysis and, more recently, to carry out design optimization with the additional requirement of reliability, referred to as reliability-based design optimization. In this paper, the two most common approaches to reliability-based design optimization are reviewed: the reliability-index approach and the performance-measure approach. Although both can be used to evaluate the probabilistic constraint, their use can be prohibitive when the function evaluations required by the probabilistic constraint are expensive, as in many real engineering problems. Therefore, an adaptive response surface method is proposed in which the probabilistic constraint is replaced with a simple polynomial function, so that the computational time can be reduced significantly, as demonstrated in the example given in this paper. (© 2007 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source]
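A minimal sketch of the response-surface idea described above: an expensive limit-state function is sampled at a few design points, a quadratic polynomial surrogate is fitted by least squares, and the failure probability is then estimated by Monte Carlo on the cheap surrogate. The limit-state function, sample counts and distributions below are illustrative stand-ins, not the paper's example.

```python
import numpy as np

# Stand-in for an expensive limit-state function g(x); failure when g < 0.
def g(x):
    return 3.0 - x[0]**2 - 0.5 * x[1]

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(30, 2))          # sampling points
y = np.array([g(x) for x in X])               # 30 "expensive" evaluations

# Quadratic response surface fitted by least squares:
# features [1, x0, x1, x0^2, x1^2, x0*x1]
A = np.column_stack([np.ones(len(X)), X, X**2, X[:, 0] * X[:, 1]])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def g_hat(x):
    feats = np.array([1.0, x[0], x[1], x[0]**2, x[1]**2, x[0] * x[1]])
    return feats @ coef

# Failure probability estimated on the surrogate instead of g itself.
samples = rng.normal(0.0, 1.0, size=(100_000, 2))
feats = np.column_stack([np.ones(len(samples)), samples, samples**2,
                         samples[:, 0] * samples[:, 1]])
pf = np.mean(feats @ coef < 0.0)
print(f"estimated P(failure) = {pf:.4f}")
```

Because the stand-in g is itself quadratic, the surrogate reproduces it exactly here; in practice the paper's adaptive scheme would refit the surface as new sampling points are added near the limit state.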


Executive function assessment of patients with schizophrenic disorder residual type in olanzapine treatment: an open study

HUMAN PSYCHOPHARMACOLOGY: CLINICAL AND EXPERIMENTAL, Issue 6 2005
Paolo Stratta
Abstract Cognitive deficits are a fundamental feature of the schizophrenic disorder, but the effect of antipsychotic treatment is still debated. The study assesses the effect of olanzapine on neurocognitive functioning and symptomatology of patients with schizophrenic disorder residual type. Executive function evaluation by the Wisconsin card sorting test (WCST) was performed on 39 patients treated with olanzapine (5–20 mg/day); the efficacy of the drug in improving symptomatology, safety and quality of life was also evaluated. After 7 months of treatment, the mean number of WCST categories tended to increase. Correct responses increased with a statistically significant change from the baseline. The total and unique errors decreased significantly. At all post-baseline visits a decrease from baseline in the PANSS total, positive and negative scores was seen. The proportion of patients with less severe illness (CGI) increased over the course of the study, with a corresponding decrease in patients with more severe illness. The quality of life scores also tended to improve during treatment. The Simpson Angus scale, Barnes Akathisia scale and abnormal involuntary movement scale scores decreased consistently. The most common treatment-emergent drug-related adverse events were weight gain, insomnia, agitation and anxiety. Neurocognitive functioning in terms of executive performance and symptomatology improved in people with schizophrenia residual type. Copyright © 2005 John Wiley & Sons, Ltd. [source]


Accuracy of Spirometry in Diagnosing Pulmonary Restriction in Elderly People

JOURNAL OF AMERICAN GERIATRICS SOCIETY, Issue 11 2009
Simone Scarlata MD
OBJECTIVES: To compare the accuracy of a diagnosis of pulmonary restriction made using forced vital capacity (FVC) less than the lower limit of normal (LLN) with the criterion standard diagnosis made using total lung capacity (TLC) less than the LLN in an elderly population. DESIGN: Retrospective analysis. SETTING: A teaching hospital. PARTICIPANTS: Five hundred sixty-four ambulatory and acute care hospital patients aged 65 to 96 underwent complete pulmonary function evaluation. MEASUREMENTS: Sensitivity, specificity, positive and negative predictive values (PPV, NPV) of diagnosis of pulmonary restriction defined as FVC less than the LLN were calculated in the overall sample and after stratification according to bronchial obstruction. Expected PPV and NPV at different background prevalences of true pulmonary restriction (5% and 15%) were calculated using Bayes' theorem. RESULTS: Low sensitivity (0.32) and high specificity (0.95) were found, with an area under the receiver operating characteristic curve (AUC) of 0.89. In participants without bronchial obstruction, specificity was even higher, although sensitivity decreased to 0.28 (AUC=0.92). The PPV was good (0.81), whereas with a low to moderate a priori probability (prevalence from 5% to 15%) the NPV was fair (~0.89). CONCLUSION: A reduction in FVC below LLN cannot reliably identify true pulmonary restriction in elderly people, confirming previous findings in the adult population. Normal FVC, instead, can effectively exclude pulmonary restriction regardless of the presence of bronchial obstruction when the a priori probability is low or moderately high. [source]
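The Bayes-theorem step used above is simple to reproduce: given a test's sensitivity and specificity and a background prevalence, PPV and NPV follow directly. A sketch using the sensitivity (0.32) and specificity (0.95) reported in the abstract:

```python
def predictive_values(sensitivity, specificity, prevalence):
    """PPV and NPV via Bayes' theorem for a given background prevalence."""
    tp = sensitivity * prevalence              # true positives
    fp = (1 - specificity) * (1 - prevalence)  # false positives
    tn = specificity * (1 - prevalence)        # true negatives
    fn = (1 - sensitivity) * prevalence        # false negatives
    return tp / (tp + fp), tn / (tn + fn)

# Prevalences considered in the abstract: 5% and 15%
for prev in (0.05, 0.15):
    ppv, npv = predictive_values(0.32, 0.95, prev)
    print(f"prevalence {prev:.0%}: PPV={ppv:.2f}, NPV={npv:.2f}")
```

At 15% prevalence this gives an NPV of about 0.89, consistent with the "fair" NPV the authors report for low-to-moderate a priori probability.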


A multivariate likelihood SIRAS function for phasing and model refinement

ACTA CRYSTALLOGRAPHICA SECTION D, Issue 10 2009
Pavol Skubák
A likelihood function based on the multivariate probability distribution of all observed structure-factor amplitudes from a single isomorphous replacement with anomalous scattering experiment has been derived and implemented for use in substructure refinement and phasing as well as macromolecular model refinement. Efficient calculation of a multidimensional integration required for function evaluation has been achieved by approximations based on the function's properties. The use of the function in both phasing and protein model building with iterative refinement was essential for successful automated model building in the test cases presented. [source]


Point spread function analysis in a child with ectopia lentis: objective optical function evaluation and correction of refractive errors

ACTA OPHTHALMOLOGICA, Issue 5 2009
Mari Goto
No abstract is available for this article. [source]


How should a nonfunctioning pituitary macroadenoma be monitored after debulking surgery?

CLINICAL ENDOCRINOLOGY, Issue 6 2009
Yona Greenman
Summary Transsphenoidal surgery is the treatment of choice for nonfunctioning pituitary macroadenomas but is seldom curative. Tumour progression rates are high in patients with postoperative remnants. Therefore, long-term monitoring is necessary to detect tumour growth, which may be asymptomatic or manifest with visual field defects and/or pituitary dysfunction. In view of the generally slow-growing nature of these tumours, yearly magnetic resonance imaging, neuro-ophthalmologic and pituitary function evaluation are appropriate during the first 3–5 years after surgery. If there is no evidence of tumour progression during this period, testing intervals may be extended thereafter. [source]


Freeform Shape Representations for Efficient Geometry Processing

COMPUTER GRAPHICS FORUM, Issue 3 2003
Leif Kobbelt
The most important concepts for the handling and storage of freeform shapes in geometry processing applications are parametric representations and volumetric representations. Both have their specific advantages and drawbacks. While the algebraic complexity of volumetric representations is independent of the shape complexity, the domain of a parametric representation usually has to have the same structure as the surface itself (which sometimes makes it necessary to update the domain when the surface is modified). On the other hand, the topology of a parametrically defined surface can be controlled explicitly, while in a volumetric representation the surface topology can change accidentally during deformation. A volumetric representation reduces distance queries or inside/outside tests to mere function evaluations, but the geodesic neighborhood relation between surface points is difficult to resolve. As a consequence, it seems promising to combine parametric and volumetric representations to effectively exploit both advantages. In this talk, a number of projects are presented and discussed in which such a combination leads to efficient and numerically stable algorithms for the solution of various geometry processing tasks. Applications include global error control for mesh decimation and smoothing, topology control for level-set surfaces, and shape modeling with unstructured point clouds. [source]
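The point that a volumetric representation turns distance queries and inside/outside tests into single function evaluations can be illustrated with a signed distance function, the standard volumetric form; the sphere example below is my own minimal stand-in, not from the talk:

```python
import numpy as np

def sphere_sdf(p, center=np.zeros(3), radius=1.0):
    """Signed distance to a sphere: negative inside, positive outside,
    zero on the surface. One evaluation answers both the inside/outside
    test and the distance query."""
    return np.linalg.norm(p - center) - radius

p_in = np.array([0.5, 0.0, 0.0])
p_out = np.array([2.0, 0.0, 0.0])
print(sphere_sdf(p_in))    # negative: inside, 0.5 units from the surface
print(sphere_sdf(p_out))   # positive: outside, 1.0 unit from the surface
```

The drawback noted in the abstract is equally visible here: the function gives distance through space, not geodesic distance along the surface, so neighborhood relations between surface points are not directly recoverable.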


Integrative optimization by RBF network and particle swarm optimization

ELECTRONICS & COMMUNICATIONS IN JAPAN, Issue 12 2009
Satoshi Kitayama
Abstract This paper presents a method for an integrative optimization system. Recently, many methods for global optimization have been proposed. The objective of these methods is to find a global minimum of a nonconvex function; however, large numbers of function evaluations are generally required. We utilize the response surface method to approximate the function space and thereby reduce the number of function evaluations. The response surface is constructed from sampling points. The RBF network, a type of neural network, is utilized to approximate the function space. Then particle swarm optimization (PSO) is applied to the response surface. The proposed system consists of three parts: (Part 1) generation of the sampling points, (Part 2) construction of the response surface by the RBF network, and (Part 3) optimization by PSO. By iterating these three parts, it is expected that an approximate global minimum of a nonconvex function can be obtained with a small number of function evaluations. Through numerical examples, the effectiveness and validity of the method are examined. © 2009 Wiley Periodicals, Inc. Electron Comm Jpn, 92(12): 31–42, 2009; Published online in Wiley InterScience (www.interscience.wiley.com). DOI 10.1002/ecj.10187 [source]
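A minimal sketch of the three-part pipeline (sampling, RBF response surface, PSO on the surface). The objective function, kernel width, swarm parameters and the single non-iterated pass are all illustrative assumptions; the paper iterates the three parts and resamples near promising regions.

```python
import numpy as np

rng = np.random.default_rng(1)

def expensive(x):                      # stand-in nonconvex objective
    return np.sum(x**2) + 2.0 * np.sin(3.0 * x[0])

# Part 1: generate sampling points and pay for their evaluations.
X = rng.uniform(-3, 3, size=(40, 2))
y = np.array([expensive(x) for x in X])

# Part 2: Gaussian RBF response surface (one RBF centred on each sample).
width = 1.0
def phi(a, b):
    d2 = np.sum((a[:, None, :] - b[None, :, :])**2, axis=-1)
    return np.exp(-d2 / (2.0 * width**2))
w = np.linalg.solve(phi(X, X) + 1e-8 * np.eye(len(X)), y)
def surrogate(P):                      # cheap to evaluate anywhere
    return phi(P, X) @ w

# Part 3: PSO on the cheap surrogate instead of the expensive function.
n, dim = 30, 2
pos = rng.uniform(-3, 3, (n, dim)); vel = np.zeros((n, dim))
pbest, pbest_f = pos.copy(), surrogate(pos)
gbest = pbest[np.argmin(pbest_f)]
for _ in range(100):
    r1, r2 = rng.random((2, n, dim))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, -3, 3)
    f = surrogate(pos)
    better = f < pbest_f
    pbest[better], pbest_f[better] = pos[better], f[better]
    gbest = pbest[np.argmin(pbest_f)]
print("approximate minimizer on surrogate:", gbest)
```

Only the 40 sampling evaluations touch the expensive function; the 3000 PSO evaluations all hit the surrogate, which is the source of the method's savings.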


A reduced-order simulated annealing approach for four-dimensional variational data assimilation in meteorology and oceanography

INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN FLUIDS, Issue 11 2008
I. Hoteit
Abstract Four-dimensional variational data assimilation in meteorology and oceanography suffers from the presence of local minima in the cost function. These local minima arise when the system under study is strongly nonlinear. The number of local minima further increases dramatically with the length of the assimilation period and often renders the solution to the problem intractable. Global optimization methods are therefore needed to resolve this problem. However, the huge computational burden makes the application of these sophisticated techniques infeasible for large variational data assimilation systems. In this study, a simulated annealing (SA) algorithm, complemented with an order reduction of the control vector, is used to tackle this problem. SA is a very powerful tool for combinatorial minimization in the presence of several local minima, at the cost of increased execution time. Order reduction is then used to reduce the dimension of the search space in order to speed up the convergence rate of the SA algorithm. This is achieved through a proper orthogonal decomposition. The new approach was implemented with a realistic eddy-permitting configuration of the Massachusetts Institute of Technology general circulation model (MITgcm) of the tropical Pacific Ocean. Numerical results indicate that the reduced-order SA approach was able to efficiently reduce the cost function with a reasonable number of function evaluations. Copyright © 2008 John Wiley & Sons, Ltd. [source]
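The combination of order reduction and simulated annealing can be sketched in a few lines: restrict a high-dimensional control vector to a small basis, then anneal over the basis coefficients only. The cost function, the random orthonormal basis (standing in for the paper's proper orthogonal decomposition modes) and the cooling schedule below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def cost(x):                       # stand-in multimodal cost function
    return np.sum(x**2) + 3.0 * np.sum(np.cos(2.0 * x))

# Order reduction: restrict the 50-dim control vector to 3 leading modes
# (a random orthonormal basis here; POD modes in the actual method).
dim, k = 50, 3
basis, _ = np.linalg.qr(rng.normal(size=(dim, k)))

def reduced_cost(a):               # each evaluation maps back to full space
    return cost(basis @ a)

# Simulated annealing over the 3 reduced coefficients only.
a = np.zeros(k); f = reduced_cost(a); T = 5.0
best_a, best_f = a.copy(), f
for step in range(2000):
    cand = a + rng.normal(scale=0.3, size=k)
    fc = reduced_cost(cand)
    if fc < f or rng.random() < np.exp((f - fc) / T):   # Metropolis rule
        a, f = cand, fc
        if f < best_f:
            best_a, best_f = a.copy(), f
    T *= 0.998                     # geometric cooling schedule
print("best reduced-space cost:", best_f)
```

Annealing in 3 dimensions instead of 50 is what keeps the number of cost-function evaluations "reasonable", at the price of only searching the subspace spanned by the retained modes.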


Derivative Free Optimization in Higher Dimension

INTERNATIONAL TRANSACTIONS IN OPERATIONAL RESEARCH, Issue 3 2001
Shamsuddin Ahmed
Non-linear optimization methods that do not require explicit or implicit derivative information about an objective function are an alternative search strategy when the derivative of the objective function is not available. In factorial design, the number of trials for the experimental identification method in Em is about (m + 1). These (m + 1) equally spaced points form a geometry known as a regular simplex. The simplex method is attributed to Spendley, Hext and Himsworth. The method is improved by maintaining a set of (m + 1) points in m-dimensional space to generate a non-regular simplex. This study suggests re-scaling the simplex in higher dimensions for a restart phase. The direction of search is also changed when the simplex degenerates. The performance of this derivative-free search method is measured by the number of function evaluations, the number of restart attempts and the improvement in function value. An algorithm describing the improved method is presented and compared with the Nelder and Mead simplex method. The performance of this algorithm is also tested on an artificial neural network (ANN) training problem. The number of function evaluations is about 40 times smaller with the improved method than with the Nelder and Mead (1965) method when training an ANN problem with 36 variables. [source]
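The restart-with-rescaling idea can be illustrated with SciPy's Nelder-Mead: after each phase, re-seed a fresh, smaller simplex around the best point found so far. This is only a sketch of the restart principle using a stalling heuristic of my own, not the paper's degeneracy test or its algorithm.

```python
import numpy as np
from scipy.optimize import minimize

def rosenbrock(x):                 # illustrative test objective
    return np.sum(100.0 * (x[1:] - x[:-1]**2)**2 + (1.0 - x[:-1])**2)

def make_simplex(x, scale):
    """A fresh (m+1)-point simplex of side `scale` anchored at x."""
    m = len(x)
    S = np.tile(x, (m + 1, 1))
    S[1:] += scale * np.eye(m)
    return S

def restarted_nelder_mead(fun, x0, phases=5, scale=1.0):
    x = np.asarray(x0, dtype=float)
    fx = fun(x)
    for _ in range(phases):
        res = minimize(fun, x, method="Nelder-Mead",
                       options={"initial_simplex": make_simplex(x, scale),
                                "xatol": 1e-10, "fatol": 1e-10,
                                "maxfev": 4000})
        if res.fun < fx:
            x, fx = res.x, res.fun
        scale *= 0.5               # re-scale the simplex for the restart
    return x, fx

x, fx = restarted_nelder_mead(rosenbrock, np.full(6, -1.0))
print("best value:", fx)
```

A single Nelder-Mead run in moderately high dimension often stalls on a collapsed simplex; the restart rebuilds a well-shaped simplex around the incumbent, which is the motivation the abstract gives for re-scaling.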


Maximum Likelihood Estimation of VARMA Models Using a State-Space EM Algorithm

JOURNAL OF TIME SERIES ANALYSIS, Issue 5 2007
Konstantinos Metaxoglou
Abstract. We introduce a state-space representation for vector autoregressive moving-average models that enables maximum likelihood estimation using the EM algorithm. We obtain closed-form expressions for both the E- and M-steps; the former requires the Kalman filter and a fixed-interval smoother, and the latter requires least squares-type regression. We show via simulations that our algorithm converges reliably to the maximum, whereas gradient-based methods often fail because of the highly nonlinear nature of the likelihood function. Moreover, our algorithm converges in a smaller number of function evaluations than commonly used direct-search routines. Overall, our approach achieves its largest performance gains when applied to models of high dimension. We illustrate our technique by estimating a high-dimensional vector moving-average model for an efficiency test of California's wholesale electricity market. [source]


A new investigation of the extended Krylov subspace method for matrix function evaluations

NUMERICAL LINEAR ALGEBRA WITH APPLICATIONS, Issue 4 2010
L. Knizhnerman
Abstract For large square matrices A and functions f, the numerical approximation of the action of f(A) on a vector v has received considerable attention in the last two decades. In this paper we investigate the extended Krylov subspace method, a technique that was recently proposed to approximate f(A)v for A symmetric. We provide a new theoretical analysis of the method, which improves the original result for A symmetric, and gives a new estimate for A nonsymmetric. Numerical experiments confirm that the new error estimates correctly capture the linear asymptotic convergence rate of the approximation. By using recent algorithmic improvements, we also show that the method is computationally competitive with respect to other enhancement techniques. Copyright © 2009 John Wiley & Sons, Ltd. [source]
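For context, the baseline that the extended method builds on is the standard Arnoldi approximation f(A)v ≈ β V_m f(H_m) e_1, where V_m spans the polynomial Krylov subspace and H_m is the projected matrix. The sketch below shows only this standard variant (the extended method additionally enriches the subspace with A^{-1} directions); the test matrix and the choice f = exp are my own illustrative assumptions.

```python
import numpy as np
from scipy.linalg import expm

def krylov_fAv(A, v, m, f=expm):
    """Arnoldi approximation of f(A) v from the order-m Krylov subspace
    span{v, Av, ..., A^{m-1} v}: returns beta * V_m f(H_m) e_1."""
    n = len(v)
    V = np.zeros((n, m + 1)); H = np.zeros((m + 1, m))
    beta = np.linalg.norm(v)
    V[:, 0] = v / beta
    for j in range(m):
        w = A @ V[:, j]
        for i in range(j + 1):                 # modified Gram-Schmidt
            H[i, j] = V[:, i] @ w
            w -= H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        if H[j + 1, j] < 1e-12:                # lucky breakdown
            m = j + 1
            break
        V[:, j + 1] = w / H[j + 1, j]
    Hm = H[:m, :m]
    return beta * V[:, :m] @ f(Hm)[:, 0]

rng = np.random.default_rng(3)
A = -np.diag(np.linspace(0.1, 10.0, 100))      # symmetric test matrix
v = rng.normal(size=100)
approx = krylov_fAv(A, v, 20)
exact = expm(A) @ v
err = np.linalg.norm(approx - exact) / np.linalg.norm(exact)
print("relative error:", err)
```

Each step of this variant costs one multiplication by A; the extended subspace adds solves with A per step, which is what buys its faster convergence for functions like the exponential.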