Validation Procedure


Selected Abstracts


Measurements on electrical power systems under bi-tone conditions by using the virtual time-domain approach

EUROPEAN TRANSACTIONS ON ELECTRICAL POWER, Issue 1 2002
L. Peretto
The virtual time-domain approach is very convenient for the characterization of multi-tone signals, both quasiperiodic and periodic with very low fundamental frequency. Measurements on a three-phase load controlled by an inverter were performed by means of a digital method implementing this approach. The results are reported and discussed, and compared with those provided by a validation procedure.


On fault isolation by neural-networks-based parameter estimation techniques

EXPERT SYSTEMS, Issue 1 2007
Ramon Ferreiro Garcia
Abstract: The aim of this work is to exploit aspects of functional approximation techniques in parameter estimation procedures applied to fault detection and isolation tasks, using backpropagation neural networks as functional approximation devices. The major focus of this paper is the strategy used in the data selection task as applied to the determination of non-conventional process parameters, such as performance or process-efficiency indexes, which are difficult to acquire by direct measurement. The implementation and validation procedure on a real case study is carried out with the aid of the facilities supplied by commercial neural network toolboxes, which manage databases, neural network structures and highly efficient training algorithms.
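The fault isolation step described in this abstract, deciding which fault is present from deviations of estimated parameters, can be sketched as follows. This is a minimal illustration only: the backpropagation estimator is replaced by given values, and the parameter names, tolerance and fault signatures are hypothetical assumptions, not the paper's.

```python
def isolate_fault(estimated, nominal, signatures, tol=0.05):
    """Compare estimated process parameters (e.g. efficiency indexes
    approximated by a neural network) with nominal values and match the
    pattern of significant deviations against known fault signatures."""
    # Residual pattern: 1 where the relative deviation exceeds tol.
    pattern = tuple(
        1 if abs(est - nom) / abs(nom) > tol else 0
        for est, nom in zip(estimated, nominal)
    )
    return signatures.get(pattern, "unknown fault"), pattern

# Hypothetical signatures: which parameters deviate under which fault.
signatures = {
    (0, 0): "no fault",
    (1, 0): "heat-exchanger fouling",
    (0, 1): "sensor drift",
}
fault, pattern = isolate_fault(
    estimated=[0.82, 0.91], nominal=[0.90, 0.90], signatures=signatures
)
```

Here the first parameter deviates by about 9%, exceeding the 5% tolerance, so the (1, 0) signature is matched.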


Use of caesium-137 data to evaluate SHETRAN simulated long-term erosion patterns in arable lands

HYDROLOGICAL PROCESSES, Issue 10 2004
Y. Norouzi Banis
Abstract The caesium-137 method of quantifying soil erosion is used to provide field data for validating the capability of the SHETRAN modelling system for predicting long-term (30-year) erosion rates and their spatial variability. Simulations were carried out for two arable farm sites (area 3–5 ha) in central England for which average annual erosion rates of 6·5 and 10·4 t ha−1 year−1 had already been determined using caesium-137 measurements. These rates were compared with a range of simulated values representing the uncertainty in model output derived from uncertainty in the evaluation of model parameters. A successful validation was achieved in that the simulation range contained the measured rate at both sites, while the spatial variability was reproduced excellently at one site and partially at the other. The results indicate that, as the caesium-137 technique measures the erosion caused by all the processes acting at a site, it is relevant to hydrologically based models such as SHETRAN only if erosion by wind, agricultural activities and other processes not represented in the model is insignificant. The results also indicate a need to reduce the uncertainty in model parameter evaluation. More generally, the caesium-137 technique is shown to provide field data that increase the severity of the validation procedure (accounting for internal as well as outlet conditions) and that add spatial variability to magnitude as a condition for identifying unrealistic parameter sets when seeking to reduce simulation uncertainty. Copyright © 2004 John Wiley & Sons, Ltd.
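The validation criterion used in this abstract, that the range of simulated rates (spanning parameter uncertainty) must bracket the measured rate, can be sketched as follows. The simulated values are illustrative; only the measured rate of 6·5 t ha−1 year−1 comes from the abstract.

```python
def range_contains(simulated_rates, measured_rate):
    """Successful validation if the simulation range brackets the measurement."""
    return min(simulated_rates) <= measured_rate <= max(simulated_rates)

# Hypothetical erosion rates from different sampled parameter sets.
simulated = [4.2, 5.8, 7.1, 9.0]
ok = range_contains(simulated, 6.5)        # measured rate is bracketed
fail = range_contains(simulated, 12.0)     # such a rate would fail validation
```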


Process-oriented catchment modelling and multiple-response validation

HYDROLOGICAL PROCESSES, Issue 2 2002
S. Uhlenbrook
Abstract The conceptual rainfall-runoff model TAC (tracer-aided catchment model) has been developed based on the experimental results of tracer-hydrological investigations at the mountainous Brugga and Zastler basins (40 and 18·4 km2). The model contains a physically realistic description of runoff generation, which distinguishes seven unit types, each with characteristic dominant runoff-generation processes. These processes are conceptualized by different linear and non-linear reservoir concepts. The model was applied to a period of 3·2 years on a daily time step with good success. In addition, an extensive model validation procedure was executed, in which additional information (i.e. runoff in subbasins and a neighbouring basin, tracer concentrations and calculated runoff components) was used besides the simulated discharge of the basin investigated. This study shows the potential of tracer data for hydrological modelling. On the one hand, they are good tools for investigating runoff-generation processes, which is the basis for developing more realistic conceptualizations of the runoff-generation routine. On the other hand, tracer data can serve as multi-response data to assess and validate a model. Copyright © 2002 John Wiley & Sons, Ltd.
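The simplest of the reservoir concepts mentioned in this abstract, a linear reservoir whose outflow is proportional to storage, can be sketched with a daily explicit time step. The rainfall series, rate constant and initial storage below are illustrative assumptions, not TAC parameters.

```python
def linear_reservoir(rain, k=0.2, s0=10.0):
    """Daily explicit step of dS/dt = P - k*S; returns simulated discharge."""
    s, q = s0, []
    for p in rain:
        s += p          # add the day's rainfall to storage
        out = k * s     # outflow proportional to storage
        s -= out
        q.append(out)
    return q

q = linear_reservoir([5.0, 0.0, 0.0, 12.0])
# With no rain the discharge recedes geometrically, by a factor (1 - k) per day.
```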


Development and Validation of a Geriatric Knowledge Test for Medical Students

JOURNAL OF AMERICAN GERIATRICS SOCIETY, Issue 6 2004
Ming Lee PhD
Objectives: To assess the reliability and validity of a geriatrics knowledge test designed for medical students. Design: Cross-sectional studies. Setting: An academic medical center. Participants: A total of 343 (86% of those sampled) medical students participated in the initial study, including 137 (76%) first-year, 163 (96%) third-year, and 43 (86% of those sampled) fourth-year students in the 2000–2001 academic year. To cross-validate the instrument, another 165 (92%) third-year and 137 (76%) first-year students participated in the study in the 2001–2002 academic year. Measurements: An 18-item geriatrics knowledge test was developed. The items were selected from a pool of 23 items. An established instrument assessing the clinical skills of medical students was included in the validation procedure. Results: The instrument demonstrated good reliability (Cronbach α=0.80) and known-groups and concurrent validity. Geriatrics knowledge scores increased progressively with the level of medical training (mean percentage correct=31.3, 65.3, and 66.5 for the first-year, third-year, and fourth-year classes, respectively, P<.001). A significant (P<.01) relationship was found between the third-year students' geriatrics knowledge and their clinical skills. Similar results, except for the relationship between knowledge and clinical skills, were found in the cross-validation study, supporting the reliability and known-groups validity of the test. Conclusion: The 18-item geriatrics knowledge test demonstrated sound reliability and validity. The average scores of the student groups indicated substantial room for growth. The relationship between geriatrics knowledge and overall clinical skills needs further investigation.
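The reliability statistic reported in this abstract, Cronbach's alpha, is computed for k items as α = k/(k−1) · (1 − Σ item variances / variance of total scores). A minimal sketch, with a made-up 0/1 score matrix rather than the study's data:

```python
from statistics import pvariance

def cronbach_alpha(scores):
    """scores: one row per examinee, each a list of k item scores (0/1)."""
    k = len(scores[0])
    item_vars = [pvariance([row[i] for row in scores]) for i in range(k)]
    total_var = pvariance([sum(row) for row in scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

scores = [
    [1, 1, 1, 0],
    [1, 0, 1, 1],
    [0, 0, 1, 0],
    [1, 1, 0, 1],
    [0, 0, 0, 0],
]
alpha = cronbach_alpha(scores)
```

An 18-item test with α = 0.80, as reported here, indicates good internal consistency for group-level comparisons.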


Nuclear microscopy: A tool for imaging elemental distribution and percutaneous absorption in vivo

MICROSCOPY RESEARCH AND TECHNIQUE, Issue 4 2007
Ana Veríssimo
Abstract Nuclear microscopy is a technique based on a focused beam of accelerated particles that can image tissue morphology in vivo and produce the corresponding elemental maps, whether at major, minor, or trace concentrations. These characteristics are a strong advantage in studying the morphology of human skin, its elemental distributions and the permeation mechanisms of chemical compounds. In this study, nuclear microscopy techniques such as scanning transmission ion microscopy and particle-induced X-ray emission were applied simultaneously to cryopreserved human skin samples with the purpose of obtaining high-resolution images of cells and tissue morphology. In addition, quantitative elemental profiling and mapping of phosphorus, calcium, chlorine, and potassium in skin cross-sections were obtained. This procedure accurately distinguishes the epidermal strata and dermis by overlapping in real time the elemental information with density images obtained from the transmitted beam. A validation procedure for elemental distributions in human skin based on the differential density of epidermal strata and dermis was established. As demonstrated, this procedure can be used in future studies as a tool for the in vivo examination of trans-epidermal and -dermal delivery of products. Microsc. Res. Tech., 2007. © 2007 Wiley-Liss, Inc.


Cleavage of cystatin C is not associated with multiple sclerosis

ANNALS OF NEUROLOGY, Issue 2 2007
Piero Del Boccio PhD
Recently, Irani and colleagues proposed a C-terminally cleaved isoform of cystatin C (12.5 kDa) in cerebrospinal fluid as a marker of multiple sclerosis. In this study, we demonstrate that the 12.5 kDa product of cystatin C is formed by degradation of the first eight N-terminal residues. Moreover, such degradation is not specific to the cerebrospinal fluid of multiple sclerosis patients, but rather is caused by inappropriate sample storage at −20°C. We conclude that the use of the 12.5 kDa product of cystatin C in cerebrospinal fluid might lead to a fallacious diagnosis of multiple sclerosis. A preanalytical validation procedure is mandatory for proteomics investigations. Ann Neurol 2007


Sequential design in quality control and validation of land cover databases

APPLIED STOCHASTIC MODELS IN BUSINESS AND INDUSTRY, Issue 2 2009
Elisabetta Carfagna
Abstract We have faced the problem of evaluating the quality of land cover databases produced through photo-interpretation of remote-sensing data according to a legend of land cover types. First, we considered quality control, that is, the comparison of a land cover database with the result of the photo-interpretation made by a more expert photo-interpreter on a sample of the polygons. We then analysed the problem of validation, that is, the check of the photo-interpretation through a ground survey. We used the percentage of area correctly photo-interpreted as the quality measure. Since the kind of land cover type and the size of the polygons affect the probability of making mistakes in the photo-interpretation, we stratify the polygons according to two variables: the land cover type of the photo-interpretation and the size of the polygons. We propose an adaptive sequential procedure with permanent random numbers in which the sample size per stratum depends on the previously selected units but the sample selection does not, and the stopping rule is not based on the estimates of the quality parameter. We prove that this quality control and validation procedure allows unbiased and efficient estimates of the quality parameters and allows high precision of the estimates to be reached with the smallest sample size. Copyright © 2009 John Wiley & Sons, Ltd.
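The key device in this abstract, selection with permanent random numbers (PRNs), can be sketched as follows: each polygon in a stratum keeps a fixed uniform random number, and a sample of size n is simply the n units with the smallest PRNs. Enlarging the sample after inspecting quality results reuses the same ordering, so the selection itself does not depend on the previously selected units. Polygon names and the seed are illustrative.

```python
import random

rng = random.Random(42)
polygons = [f"poly{i}" for i in range(10)]
prn = {p: rng.random() for p in polygons}   # assigned once, kept permanently

def prn_sample(n):
    """First n polygons of the stratum in permanent-random-number order."""
    return sorted(polygons, key=prn.get)[:n]

first = prn_sample(3)         # initial sample in the stratum
extended = prn_sample(5)      # adaptive extension after quality results
# Earlier selections are always retained when the sample grows.
```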


Analysis of endogenous glutathione-adducts and their metabolites

BIOMEDICAL CHROMATOGRAPHY, Issue 1 2010
Ian A. Blair
Abstract The ability to conduct validated analyses of glutathione (GSH) adducts and their metabolites is critically important in order to establish whether they play a role in cellular biochemical or pathophysiological processes. The use of stable isotope dilution (SID) methodology in combination with liquid chromatography–tandem mass spectrometry (LC-MS/MS) provides the highest bioanalytical specificity possible for such analyses. Quantitative studies normally require the high sensitivity that can be obtained by the use of multiple reaction monitoring (MRM)/MS rather than the much less sensitive but more specific full-scanning methodology. The method employs a parent ion corresponding to the intact molecule together with a prominent product ion obtained by collision-induced dissociation. Using SID LC-MRM/MS, analytes must have the relative LC retention time to the heavy-isotope internal standard established during the validation procedure, the correct parent ion and the correct product ion. This level of specificity cannot be attained with any other bioanalytical technique employed for biomarker analysis. This review describes the application of SID LC-MRM/MS methodology for the analysis of GSH adducts and their metabolites. It also discusses potential future directions for the use of this methodology for rigorous determination of their utility as disease and exposure biomarkers. Copyright © 2010 John Wiley & Sons, Ltd.
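The three acceptance criteria stated in this abstract, correct parent ion, correct product ion, and a relative retention time (analyte vs. heavy-isotope internal standard) matching the value fixed during validation, can be sketched as a simple check. The m/z values and tolerances below are illustrative assumptions, not data from the review.

```python
def identity_ok(peak, ref, mz_tol=0.5, rrt_tol=0.02):
    """peak/ref: dicts with 'parent' and 'product' m/z and relative
    retention time 'rrt' (analyte / internal standard)."""
    return (abs(peak["parent"] - ref["parent"]) <= mz_tol
            and abs(peak["product"] - ref["product"]) <= mz_tol
            and abs(peak["rrt"] - ref["rrt"]) <= rrt_tol)

ref = {"parent": 308.1, "product": 179.0, "rrt": 1.00}    # fixed at validation
good = {"parent": 308.2, "product": 179.1, "rrt": 1.01}
bad = {"parent": 308.2, "product": 162.0, "rrt": 1.01}    # wrong product ion
```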


Identification and determination of major flavonoids in rat urine by HPLC-UV and HPLC-MS methods following oral administration of Dalbergia odorifera extract

BIOMEDICAL CHROMATOGRAPHY, Issue 1 2006
Rongxia Liu
Abstract Flavonoids are the main active constituents of Dalbergia odorifera. The excretion of the major flavonoids in rat urine after oral administration of D. odorifera extract was investigated by HPLC-UV and HPLC-MS methods. Utilizing the HPLC-MS technique, 18 flavonoids, including five isoflavones, four isoflavanones, four neoflavones, two flavanones, two chalcones and one isoflavanonol, were identified in free form in a urine sample based on direct comparison of the corresponding tR, UV absorbance maximum (λmax) values and MS data with authentic standards. The amounts of the prominent flavonoids, (3R)-4′-methoxy-2′,3,7-trihydroxyisoflavanone and vestitone, were determined by HPLC-UV with the internal standard method, and the validation procedure confirmed that it afforded reliable analysis of these two analytes in urine after oral administration of D. odorifera extract. Copyright © 2005 John Wiley & Sons, Ltd.
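The internal standard method used here for HPLC-UV quantification converts the analyte/IS peak-area ratio to a concentration through a calibration line fitted during validation. A minimal sketch; the calibration slope, intercept and peak areas are hypothetical values, not the paper's.

```python
def conc_from_ratio(area_analyte, area_is, slope, intercept=0.0):
    """Invert the calibration line ratio = slope * conc + intercept."""
    ratio = area_analyte / area_is
    return (ratio - intercept) / slope

# Hypothetical calibration: area ratio = 0.04 * concentration (ug/mL).
c = conc_from_ratio(area_analyte=1500.0, area_is=2500.0, slope=0.04)
```

The IS corrects for injection-volume and recovery variation, since both analyte and IS areas shift together.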


Optical remote mapping of rivers at sub-meter resolutions and watershed extents

EARTH SURFACE PROCESSES AND LANDFORMS, Issue 1 2008
W. Andrew Marcus
Abstract At watershed extents, our understanding of river form, process and function is largely based on locally intensive mapping of river reaches, or on spatially extensive but low-density data scattered throughout a watershed (e.g. cross sections). The net effect has been to characterize streams as discontinuous systems. Recent advances in optical remote sensing of rivers indicate that it should now be possible to generate accurate and continuous maps of in-stream habitats, depths, algae, wood, stream power and other features at sub-meter resolutions across entire watersheds, so long as the water is clear and the aerial view is unobstructed. Such maps would transform river science and management by providing improved data, better models and explanation, and enhanced applications. Obstacles to achieving this vision include variations in optics associated with shadows, water clarity, variable substrates and target–sun angle geometry. Logistical obstacles are primarily due to the reliance of existing ground validation procedures on time-of-flight field measurements, which are impossible to accomplish at watershed extents, particularly in large and difficult-to-access river basins. Philosophical issues must also be addressed that relate to expectations around accuracy assessment, the need for and utility of physically based models to evaluate remote sensing results, and the ethics of revealing information about river resources at fine spatial resolutions. Despite these obstacles and issues, catchment-extent remote river mapping is now feasible, as is demonstrated by a proof-of-concept example for the Nueces River, Texas, and examples of how different image types (radar, lidar, thermal) could be merged with optical imagery.
The greatest obstacle to the development and implementation of remote sensing, catchment-scale 'river observatories' is the absence of broadly based funding initiatives to support collaborative research by multiple investigators in different river settings. Copyright © 2007 John Wiley & Sons, Ltd.


Non-linearity and error in modelling soil processes

EUROPEAN JOURNAL OF SOIL SCIENCE, Issue 1 2001
T. M. Addiscott
Summary Error in models and their inputs can be propagated to outputs. This is important for modelling soil processes because soil properties used as parameters commonly contain error in the statistical sense, that is, variation. Model error can be assessed by validation procedures, but tests are needed for the propagation of (statistical) error from input to output. Input error interacts with non-linearity in the model such that it contributes to the mean of the output as well as its error. This can lead to seriously incorrect results if input error is ignored when a non-linear model is used, as is demonstrated for the Arrhenius equation. Tests for non-linearity and error propagation are suggested. The simplest test for non-linearity is a graph of the output against the input. This can be supplemented if necessary by testing whether the mean of the output changes as the standard deviation of the input increases. The tests for error propagation examine whether error is suppressed or exaggerated as it is propagated through the model and whether changes in the error in one input influence the propagation of another. Applying these tests to a leaching model with rate and capacity parameters showed differences between the parameters, which emphasized that statements about non-linearity must be for specific inputs and outputs. In particular, simulations of mean annual concentrations of solute in drainage and concentrations on individual days differed greatly in the amount of non-linearity revealed and in the way error was propagated. This result is interpreted in terms of decoherence.
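The interaction of input error with non-linearity can be illustrated with the Arrhenius equation mentioned in this abstract, k = A·exp(−Ea/(R·T)). Because k is convex in T over typical soil temperatures, random error in the temperature input raises the mean of the output, not just its spread, which is exactly the test proposed above (does the output mean change as the input standard deviation grows?). The values of A, Ea and the temperature distribution are illustrative assumptions.

```python
import math
import random
from statistics import mean

A, Ea, R = 1e7, 60_000.0, 8.314     # pre-exponential, J/mol, J/(mol K)

def arrhenius(temp_k):
    return A * math.exp(-Ea / (R * temp_k))

rng = random.Random(0)
t_mean = 288.0
for sd in (0.0, 2.0, 5.0):
    rates = [arrhenius(rng.gauss(t_mean, sd)) for _ in range(20_000)]
    # The mean output drifts upward as input error grows; the error-free
    # value is arrhenius(288.0).
    print(f"input sd = {sd:4.1f} K   mean k = {mean(rates):.3e}")
```

Ignoring the input error would therefore bias a non-linear model's mean prediction, which is the abstract's central warning.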


Single-crystal structure validation with the program PLATON

JOURNAL OF APPLIED CRYSTALLOGRAPHY, Issue 1 2003
A. L. Spek
The results of a single-crystal structure determination, when in CIF format, can now be validated routinely by automatic procedures. In this way, many errors in published papers can be avoided. The validation software generates a set of ALERTS detailing issues to be addressed by the experimenter, author, referee and publication journal. Validation was pioneered by the IUCr journal Acta Crystallographica Section C and is currently standard procedure for structures submitted for publication in all IUCr journals. The implementation of validation procedures by other journals is in progress. This paper describes the concepts of validation and the classes of checks that are carried out by the program PLATON as part of the IUCr checkCIF facility. PLATON validation can be run at any stage of the structure refinement, independent of the structure determination package used, and is recommended as a routine tool during, or at least at the completion of, every structure determination. Two examples are discussed where proper validation procedures could have avoided the publication of incorrect structures that had serious consequences for the chemistry involved.


QSAR model for alignment-free prediction of human breast cancer biomarkers based on electrostatic potentials of protein pseudofolding HP-lattice networks

JOURNAL OF COMPUTATIONAL CHEMISTRY, Issue 16 2008
Santiago Vilar
Abstract Network theory allows relationships to be established between numerical parameters that describe the molecular structure of genes and proteins and their biological properties. These models can be considered as quantitative structure–activity relationships (QSAR) for biopolymers. The work described here concerns the first QSAR model for 122 proteins that are associated with human breast cancer (HBC), as identified experimentally by Sjöblom et al. (Science 2006, 314, 268) from over 10,000 human proteins. In this study, the 122 proteins related to HBC (HBCp) and a control group of 200 proteins that are not related to HBC (non-HBCp) were forced to fold in an HP-lattice network. From these networks a series of electrostatic potential parameters was calculated to describe each protein numerically. The use of these parameters as an entry point to linear discriminant analysis led to a QSAR model to discriminate between HBCp and non-HBCp, and this model could help to predict the involvement of a given gene and/or protein in HBC. In addition, validation procedures were carried out on the model, including an external prediction series and evaluation of an additional series of 1000 non-HBCp. In all cases good levels of classification were obtained, with values above 80%. This study represents the first example of a QSAR model for the computational-chemistry-inspired search for potential HBC protein biomarkers. © 2008 Wiley Periodicals, Inc. J Comput Chem 2008
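The discrimination step described in this abstract can be sketched as a Fisher linear discriminant separating two protein classes on two numerical descriptors. The descriptor values below are synthetic stand-ins, not the paper's electrostatic potential parameters, and the two-descriptor case is chosen only so the matrix inverse stays readable.

```python
from statistics import mean

def fisher_lda(pos, neg):
    """Return (w, threshold) for the direction w = Sw^-1 (m_pos - m_neg)."""
    mp = [mean(x[i] for x in pos) for i in range(2)]
    mn = [mean(x[i] for x in neg) for i in range(2)]
    # Pooled within-class scatter matrix (2x2), inverted by hand.
    s = [[0.0, 0.0], [0.0, 0.0]]
    for group, m in ((pos, mp), (neg, mn)):
        for x in group:
            d = [x[0] - m[0], x[1] - m[1]]
            for i in range(2):
                for j in range(2):
                    s[i][j] += d[i] * d[j]
    det = s[0][0] * s[1][1] - s[0][1] * s[1][0]
    inv = [[s[1][1] / det, -s[0][1] / det], [-s[1][0] / det, s[0][0] / det]]
    dm = [mp[0] - mn[0], mp[1] - mn[1]]
    w = [inv[0][0] * dm[0] + inv[0][1] * dm[1],
         inv[1][0] * dm[0] + inv[1][1] * dm[1]]
    # Classify against the discriminant score of the class-mean midpoint.
    thr = sum(w[i] * (mp[i] + mn[i]) / 2 for i in range(2))
    return w, thr

def classify(x, w, thr):
    return w[0] * x[0] + w[1] * x[1] > thr    # True -> predicted HBCp

hbcp = [(2.1, 3.0), (2.4, 3.3), (1.9, 2.8), (2.6, 3.1)]     # synthetic
nonhbc = [(0.9, 1.2), (1.1, 1.0), (0.8, 1.5), (1.2, 1.1)]   # synthetic
w, thr = fisher_lda(hbcp, nonhbc)
acc = mean(classify(x, w, thr) for x in hbcp)   # fraction of HBCp recovered
```

In the paper the same idea is applied to 122 HBCp vs. 200 non-HBCp, with classification then re-checked on an external prediction series.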


Identification of Novel CDK2 Inhibitors by QSAR and Virtual Screening Procedures

MOLECULAR INFORMATICS, Issue 11-12 2008
Ajay Babu, Padavala
Abstract Quantitative Structure–Activity Relationship (QSAR) studies were carried out on a set of 46 imidazo[1,2-a]pyridines, imidazo[1,2-b]pyridazines and 2,4-bis anilino pyrimidines, and nitroso-6-aminopyrimidine and 2,6-diaminopyrimidine inhibitors of CDK2 (cyclin-dependent kinase 2) using a multiple regression procedure. The activity contributions of these compounds were determined from the regression equation, and validation procedures such as external-set cross-validation r2 (R2cv,ext) and the regression of observed activities against predicted activities (and vice versa) for the validation set were used to analyze the predictive ability of the QSAR model. An accurate and reliable QSAR model involving five descriptors was chosen based on the Kubinyi FIT function, which defines the statistical quality of the model. Owing to its high predictive ability, the proposed model was used to screen a similar repertoire of compounds reported in the literature, and their biological activities were estimated. The screening study clearly demonstrated that the strategy presented can be used as an alternative to time-consuming experiments, as the model tolerated a variety of structural modifications, signifying its potential for drug design studies.
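One common form of the external-set validation statistic referred to in this abstract is a predictive R² computed on compounds held out of the regression: R²ext = 1 − SS(prediction errors on the external set) / SS(deviations of the external observations from the training mean). A minimal sketch with illustrative activity values (the exact formula used in the paper may differ):

```python
def r2_external(y_ext, y_pred, y_train_mean):
    """Predictive R^2 on an external validation set."""
    ss_err = sum((o - p) ** 2 for o, p in zip(y_ext, y_pred))
    ss_tot = sum((o - y_train_mean) ** 2 for o in y_ext)
    return 1 - ss_err / ss_tot

y_train_mean = 6.0                # mean activity of the training set
y_ext = [5.2, 6.8, 7.1, 5.9]      # observed activities, external set
y_pred = [5.0, 6.5, 7.4, 6.1]     # activities predicted by the model
r2 = r2_external(y_ext, y_pred, y_train_mean)
```

Values near 1 indicate that the model predicts held-out compounds much better than simply quoting the training mean.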


Self-Stigma Among Concealable Minorities in Hong Kong: Conceptualization and Unified Measurement

AMERICAN JOURNAL OF ORTHOPSYCHIATRY, Issue 2 2010
Winnie W. S. Mak
Self-stigma refers to the internalized stigma that individuals may hold toward themselves as a result of their minority status. Not only can self-stigma dampen the mental health of individuals, it can deter them from seeking professional help lest disclosing their minority status lead to being shunned by service providers. No unified instrument has been developed to measure self-stigma consistently across different concealable minority groups. The present study presents findings from 4 studies on the development and validation of the Self-Stigma Scale, conducted in Hong Kong with community samples of mental health consumers, recent immigrants from Mainland China, and sexual minorities. Following a series of validation procedures, a 9-item Self-Stigma Scale–Short Form was developed. Initial support for its reliability and construct validity (convergent and criterion validities) was found among 3 stigmatized groups. The utility of this unified measure is to establish an empirical basis upon which the self-stigma of different concealable minority groups can be assessed along the same dimensions. Health-care professionals could use this short scale to assess potential self-stigmatization among concealable minorities, which may hamper their treatment process as well as their overall well-being.