Satisfactory Performance

Selected Abstracts


Segregation and scheduling for P2P applications with the interceptor middleware system

CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 6 2008
Cosimo Anglano
Abstract Very large peer-to-peer (P2P) systems are often required to implement efficient and scalable services, but usually they can be built only by assembling resources contributed by many independent users. Among the guarantees that must be provided to convince these users to join the P2P system, particularly important is the ability to ensure that P2P applications and services run on their nodes will not unacceptably degrade the performance of their own applications because of excessive resource consumption. In this paper we present the Interceptor, a middleware-level application segregation and scheduling system that is able to strictly enforce quantitative limitations on node resource usage and, at the same time, to make P2P applications achieve satisfactory performance even in the face of these limitations. A proof-of-concept implementation has been carried out for the Linux operating system and has been used to perform extensive experimentation aimed at quantitatively evaluating the Interceptor. The results we obtained clearly demonstrate that the Interceptor is able to strictly enforce quantitative limitations on node resource usage, and at the same time to effectively schedule P2P applications. Copyright © 2007 John Wiley & Sons, Ltd. [source]
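
The abstract does not detail how the Interceptor enforces its limits. As a hedged illustration only, one classic user-level way to cap the CPU share of a running process is a stop/continue duty cycle; the sketch below shows that generic mechanism, not the Interceptor's actual implementation (the PID and the 100 ms control period are assumptions).

```python
import os
import signal
import time

def throttle_cpu(pid: int, cpu_share: float, period: float = 0.1) -> None:
    """Cap a process's CPU share by alternating SIGSTOP/SIGCONT.

    cpu_share is the allowed fraction of wall-clock time (0 < share <= 1).
    Simple duty-cycle control: run for share*period seconds, then suspend
    the process for the remainder of the period.
    """
    run_slice = cpu_share * period
    stop_slice = period - run_slice
    while True:
        os.kill(pid, signal.SIGCONT)      # let the process run
        time.sleep(run_slice)
        if stop_slice > 0:
            os.kill(pid, signal.SIGSTOP)  # suspend it for the rest of the period
            time.sleep(stop_slice)

# Example (hypothetical PID): limit process 4321 to ~25% of one CPU.
# throttle_cpu(4321, cpu_share=0.25)
```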


Ratio estimators in adaptive cluster sampling

ENVIRONMETRICS, Issue 6 2007
Arthur L. Dryver
Abstract In most surveys, data are collected on many items rather than just the one variable of primary interest. Making the best use of the information collected is an issue of both practical and theoretical interest. Ratio estimators of the population mean or total, which exploit an auxiliary variable, are often more efficient than estimators based on the variable of interest alone. Ratio estimation is straightforward under simple random sampling, but this is often not the case when more complicated sampling designs are used, such as adaptive cluster sampling. A serious concern with ratio estimators under many complicated designs is lack of independence, a necessary assumption. In this article, we propose two new ratio estimators under adaptive cluster sampling, one of which is unbiased for adaptive cluster sampling designs. The efficiencies of the new estimators relative to existing unbiased estimators, which do not utilize the auxiliary information, under adaptive cluster sampling, and relative to conventional ratio estimation under simple random sampling without replacement, are compared in this article. The results show that the proposed estimators can be considered a robust alternative to the conventional ratio estimator, especially when the correlation between the variable of interest and the auxiliary variable is not high enough for the conventional ratio estimator to have satisfactory performance. Copyright © 2007 John Wiley & Sons, Ltd. [source]
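
For reference, the conventional ratio estimator under simple random sampling that the new estimators are benchmarked against takes the form sketched below. This is the textbook estimator with made-up data, not the paper's adaptive-cluster proposals.

```python
import numpy as np

def ratio_estimate_mean(y: np.ndarray, x: np.ndarray, mu_x: float) -> float:
    """Conventional ratio estimator of the population mean of y under SRS.

    y, x  -- sample values of the study and auxiliary variables
    mu_x  -- known population mean of the auxiliary variable
    Estimate: (ybar / xbar) * mu_x.
    """
    r_hat = y.mean() / x.mean()
    return r_hat * mu_x

# Toy example with a strongly correlated auxiliary variable.
rng = np.random.default_rng(0)
x = rng.gamma(shape=2.0, scale=5.0, size=50)
y = 3.0 * x + rng.normal(scale=2.0, size=50)
print(ratio_estimate_mean(y, x, mu_x=10.0))
```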


Generalized window advertising for TCP congestion control

EUROPEAN TRANSACTIONS ON TELECOMMUNICATIONS, Issue 6 2002
Mario Gerla
Congestion in the Internet is a major cause of network performance degradation. The Generalized Window Advertising (GWA) scheme proposed in this paper is a new approach for enhancing the congestion control properties of TCP. GWA requires only minor modifications to the existing protocol stack and is completely backward compatible, allowing GWA hosts to interact with non-GWA hosts without modifications. GWA exploits the notion of end-host–network cooperation, with the congestion level notified from the network to end hosts. It is based on solid control-theory results that guarantee performance and stable network operation. GWA is able to avoid window oscillations and the related fluctuations in offered load and network performance. This makes it more robust to sustained network overload due to a large number of connections competing for the same bottleneck, a situation where traditional TCP implementations fail to provide satisfactory performance. GWA-TCP is compared with traditional TCP, TCP with RED and also ECN using the ns-2 simulator. Results show that in most cases GWA-TCP outperforms the traditional schemes. In particular, when compared with ECN, it provides smoother network operation and increased fairness. [source]
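
TCP already bounds a sender's in-flight data by the minimum of its congestion window and the receiver-advertised window; GWA's idea, as described above, is to fold a network-supplied congestion level into that advertisement. The sketch below illustrates the generic mechanism only; the linear mapping is an invented placeholder, not the paper's control law.

```python
def effective_window(cwnd: int, recv_buffer: int, congestion_level: float,
                     mss: int = 1460) -> int:
    """Sender's usable window when the advertised window carries congestion info.

    congestion_level in [0, 1]: 0 = idle network, 1 = fully congested.
    The advertised window is scaled down as congestion rises (illustrative
    linear mapping; GWA derives its actual mapping from control theory).
    """
    advertised = int(recv_buffer * (1.0 - congestion_level))
    advertised = max(advertised, mss)   # never advertise below one segment
    return min(cwnd, advertised)

print(effective_window(cwnd=64_000, recv_buffer=128_000, congestion_level=0.75))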


Advances and Applications of Biodegradable Elastomers in Regenerative Medicine

ADVANCED FUNCTIONAL MATERIALS, Issue 2 2010
Maria Concepcion Serrano
Abstract When elastomers were first proposed as useful materials for regenerative medicine a few decades ago, their high versatility and suitability for a diverse and wide range of in vivo applications could not have been predicted. Due to their ability to recover after deformation, these materials were first introduced in tissue engineering in an attempt to mimic the mechanical properties of the extracellular matrix. Furthermore, elastomeric characteristics have been described as important criteria for cell interaction by modulating cellular behavior. From soft to hard tissues, elastomers have met degradation, mechanical, and biocompatibility requirements appropriate to the target tissue. In this feature article, biodegradable synthetic polyester elastomers that have been reported in the literature are discussed, with special focus on those that show promise for in vivo tissue replacement. Their satisfactory performance in vivo shows the promise of elastomers for use in regenerative medicine. However, further investigation is required to demonstrate the prospect of elastomer-based therapies in clinical trials. [source]


Neither fish nor fowl?

INDUSTRIAL RELATIONS JOURNAL, Issue 2 2004
An assessment of teacher capability procedures
ABSTRACT There has been an increasing focus on the performance of workers through appraisal, performance-related pay and performance management, and this emphasis on measuring performance has extended to the public sector, more specifically to the teaching profession. This paper uses research commissioned by the DfES to investigate the operation of capability procedures introduced to deal with the perceived problem of incompetent teachers. It revealed that the procedures suffered from a number of defects both in modus operandi and style, and there was little evidence that their application resulted in either improved performance or dismissal when satisfactory performance was not achieved. [source]


Retransmission strategies for an acceptable quality of HDTV in the wireless and wired home scenario

INTERNATIONAL JOURNAL OF COMMUNICATION SYSTEMS, Issue 3 2007
Frederik Vanhaverbeke
Abstract We evaluate the performance of various retransmission schemes to achieve the goal of less than one visible distortion in twelve hours for high definition television. The focus is on an indoor wireless link and a DSL link, and we consider systems where the packets are protected by means of retransmissions with and without forward error correcting (FEC) codes. In order to achieve a satisfactory performance with restricted latency, we propose an unconventional retransmission procedure. The overall conclusion is that retransmissions without FEC achieve the best performance with the lowest latency, lowest overhead and lowest complexity, both in the wireless and the wired home scenario. Copyright © 2006 John Wiley & Sons, Ltd. [source]
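
The "less than one visible distortion in twelve hours" target can be translated into a residual packet-loss budget. The sketch below works through that arithmetic under assumed numbers (packet rate, raw loss probability, independent losses); none of these figures come from the paper.

```python
# Residual-loss budget for HDTV, assuming independent packet losses.
packet_rate = 3_000          # packets per second (assumed for ~20 Mbit/s HDTV)
window_hours = 12
budget = 1 / (packet_rate * window_hours * 3600)   # tolerated residual loss rate

p_loss = 1e-2                # raw loss probability per transmission (assumed)
# With k retransmissions a packet is lost only if all k+1 copies are lost:
#   residual = p_loss ** (k + 1)
k = 0
while p_loss ** (k + 1) > budget:
    k += 1
print(f"loss budget = {budget:.2e}, retransmissions needed = {k}")
```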


Carrier-sense-assisted adaptive learning MAC protocols for distributed wireless LANs

INTERNATIONAL JOURNAL OF COMMUNICATION SYSTEMS, Issue 7 2005
P. Nicopolitidis
Abstract A Carrier-sense-assisted adaptive learning MAC protocol for wireless LANs, capable of operating efficiently in bursty traffic wireless networks with unreliable channel feedback, is introduced. According to the proposed protocol, the mobile station that is granted permission to transmit is selected by means of learning automata. At each station, the learning automaton takes into account the network feedback information in order to update the choice probability of each mobile station. The proposed protocol utilizes carrier sensing in order to reduce the collisions that are caused by different decisions at the various mobile stations due to the unreliable channel feedback. Simulation results show satisfactory performance of the proposed protocol compared to similar MAC protocols. Copyright © 2005 John Wiley & Sons, Ltd. [source]
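
The abstract says each station's learning automaton updates the choice probabilities of the mobile stations from network feedback. A standard scheme for this is linear reward-inaction (LR-I); the sketch below shows that textbook update, not necessarily the exact rule used in the paper.

```python
import numpy as np

def lri_update(p: np.ndarray, chosen: int, rewarded: bool,
               a: float = 0.1) -> np.ndarray:
    """Linear reward-inaction update of action probabilities.

    On a reward, the chosen action's probability moves toward 1 and the
    others shrink proportionally; on a penalty, nothing changes.
    """
    p = p.copy()
    if rewarded:
        p[chosen] += a * (1.0 - p[chosen])
        for i in range(len(p)):
            if i != chosen:
                p[i] *= (1.0 - a)
    return p

p = np.full(4, 0.25)                      # four stations, uniform start
p = lri_update(p, chosen=2, rewarded=True)
print(p, p.sum())                         # probabilities still sum to 1
```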


Assessment of anaerobic wastewater treatment failure using terminal restriction fragment length polymorphism analysis

JOURNAL OF APPLIED MICROBIOLOGY, Issue 6 2005
C. Scully
Abstract Aims: The suitability of genetic fingerprinting to study the microbiological basis of anaerobic bioreactor failure is investigated. Methods and Results: Two laboratory-scale anaerobic expanded granular sludge bed bioreactors, R1 and R2, were used for the mesophilic (37°C) treatment of high-strength [10 g chemical oxygen demand (COD) l−1] synthetic industrial-like wastewater over a 100-day trial period. A successful start-up was achieved by both bioreactors, with COD removal over 90%. Both reactors were operated under identical parameters; however, increased organic loading during the trial induced a reduction in the COD removal of R1, while R2 maintained satisfactory performance (COD removal >90%) throughout the experiment. Specific methanogenic activity measurements of biomass from both reactors indicated that the main route of methane production was hydrogenotrophic methanogenesis. Terminal restriction fragment length polymorphism (TRFLP) analysis was applied to the characterization of microbial community dynamics within the system during the trial. The principal differences between the two consortia analysed included an increased abundance of Thiovulum- and Methanococcus-like organisms and uncultured Crenarchaeota in R1. Conclusions: The results indicated that there was a microbiological basis for the deviation, in terms of operational performance, of R1 and R2. Significance and Impact of the Study: High-throughput fingerprinting techniques, such as TRFLP, have been demonstrated as practically relevant for biomonitoring of anaerobic reactor communities. [source]


Development of a mathematical model for studying bioethanol–water separation using hydrophilic polyetherimide membrane

JOURNAL OF APPLIED POLYMER SCIENCE, Issue 4 2008
M. E. T. Alvarez
Abstract An essentially predictive mathematical model was developed to simulate the pervaporation process. The group contribution method UNIFAC was used for calculating the upstream activity coefficients. The diffusion coefficient in the membrane was predicted using free-volume theory. Free-volume parameters were determined from viscosity and temperature data, and the binary solvent–polymer interaction parameter was calculated by a group-contribution lattice-fluid equation of state (GCLF-EOS). A simulator named PERVAP was developed applying the mathematical model. The pervaporation process was simulated for separating bioethanol and water through a polyetherimide membrane, and the simulated results were validated using experimental data for bioethanol/water separation through that membrane. The model showed satisfactory performance compared with the experimental data. In the simulation of the studied separation, a 99% molar enriched bioethanol stream was obtained with a recovery of 94%. © 2007 Wiley Periodicals, Inc. J Appl Polym Sci, 2008 [source]
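
Pervaporation models of this kind are usually built on the solution-diffusion picture, in which the partial flux of each component is driven by the difference between its feed-side and permeate-side partial fugacities. The sketch below shows that generic driving-force calculation; all numbers are placeholders, and the permeance value is assumed, not taken from the paper.

```python
def pervaporation_flux(permeance: float, x_feed: float, gamma: float,
                       p_sat: float, y_perm: float, p_perm: float) -> float:
    """Partial flux of one component in the solution-diffusion model.

    J_i = Q_i * (x_i * gamma_i * p_i_sat - y_i * p_permeate)
    permeance Q_i in mol/(m^2 s Pa); pressures in Pa.
    """
    return permeance * (x_feed * gamma * p_sat - y_perm * p_perm)

# Toy numbers for water through a hydrophilic membrane at ~60 degC:
J_water = pervaporation_flux(permeance=2e-7, x_feed=0.10, gamma=2.5,
                             p_sat=19_900, y_perm=0.95, p_perm=1_000)
print(f"water flux ~ {J_water:.3e} mol/(m^2 s)")
```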


A New Navigation Method for an Automatic Guided Vehicle

JOURNAL OF FIELD ROBOTICS (FORMERLY JOURNAL OF ROBOTIC SYSTEMS), Issue 3 2004
Chen Wuwei
This paper presents a new navigation method for an automatic guided vehicle (AGV). The method uses a new navigation and control scheme based on searching points on an arc. Safety measure indices, generated from the output of a fuzzy neural network, define the actions the AGV is to take in the presence of obstacles. The proposed algorithm integrates several functions required for AGV navigation and tracking control, and it exhibits satisfactory performance when maneuvering in complex environments. An AGV with this navigation control system can not only quickly process environmental information, but also efficiently avoid dynamic or static obstacles and reach targets safely and reliably. Extensive simulation and experimental results demonstrate the effectiveness and correct behavior of this scheme. © 2004 Wiley Periodicals, Inc. [source]


SEGMENTATION OF BEEF JOINT IMAGES USING HISTOGRAM THRESHOLDING

JOURNAL OF FOOD PROCESS ENGINEERING, Issue 6 2006
CHAOXIN ZHENG
ABSTRACT Four histogram-based thresholding methods, i.e., one-dimensional (1-D) histogram variance, 1-D histogram entropy, two-dimensional (2-D) histogram variance and 2-D histogram entropy, were proposed to segment images of beef joints (raw, cooked and cooled) automatically from the background. The 2-D histogram-based methods incorporate a fast algorithm that greatly reduces the calculation time. All four methods were applied to 15 beef joint images captured from a video camera, and evaluation methods including pixel classification, object overlap and object contrast were then employed to compare the segmentation performance of the four methods. Results indicate that the 2-D histogram variance thresholding method accomplishes the segmentation task with the most satisfactory performance. [source]
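
The 1-D histogram-variance method referred to here is, in essence, Otsu's criterion: pick the threshold that maximizes the between-class variance of the gray-level histogram. A minimal sketch on synthetic data:

```python
import numpy as np

def variance_threshold(image: np.ndarray) -> int:
    """1-D histogram-variance (Otsu-style) threshold for an 8-bit image."""
    hist = np.bincount(image.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    levels = np.arange(256)

    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0, w1 = prob[:t].sum(), prob[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (levels[:t] * prob[:t]).sum() / w0
        mu1 = (levels[t:] * prob[t:]).sum() / w1
        between_var = w0 * w1 * (mu0 - mu1) ** 2   # between-class variance
        if between_var > best_var:
            best_t, best_var = t, between_var
    return best_t

img = np.random.default_rng(1).integers(0, 256, size=(64, 64), dtype=np.uint8)
t = variance_threshold(img)
mask = img >= t          # foreground/background split
print(t, mask.mean())
```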


Online estimation and control of polymer quality in a copolymerization reactor

AICHE JOURNAL, Issue 5 2002
Myung-June Park
The validity of an online state estimator for a semi-batch MMA/MA solution copolymerization reactor was established using an online densitometer and viscometer. Using the conventional extended Kalman filter (EKF) as the state estimator, experiments were conducted under both isothermal and nonisothermal conditions for application to the control of copolymer properties. Further analysis was made using offline measurement data for the mole fraction of MMA in the remaining monomers and the solid content. The EKF was found to provide a good estimate of the state of the copolymerization system. A model predictive controller was designed and implemented to obtain copolymers with uniform copolymer composition and the desired weight-average molecular weight, adopting the feed flow rate of MMA and the reaction temperature as control inputs. The controller proved effective, with satisfactory performance for the control of polymer properties in the semi-batch copolymerization reactor. [source]
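
For readers unfamiliar with the estimator, the EKF alternates a model-based prediction with a measurement update using local Jacobians. A generic sketch follows; the reactor model itself is the paper's and is not reproduced here, so the tiny linear usage example is purely illustrative.

```python
import numpy as np

def ekf_step(x, P, u, z, f, h, F, H, Q, R):
    """One extended Kalman filter cycle.

    f(x, u): nonlinear state transition    F(x, u): its Jacobian
    h(x):    nonlinear measurement model   H(x):    its Jacobian
    Q, R:    process / measurement noise covariances
    """
    # Predict
    x_pred = f(x, u)
    F_k = F(x, u)
    P_pred = F_k @ P @ F_k.T + Q
    # Update
    H_k = H(x_pred)
    S = H_k @ P_pred @ H_k.T + R
    K = P_pred @ H_k.T @ np.linalg.inv(S)          # Kalman gain
    x_new = x_pred + K @ (z - h(x_pred))
    P_new = (np.eye(len(x)) - K @ H_k) @ P_pred
    return x_new, P_new

# Tiny usage on a linear system (the EKF reduces to the ordinary KF here):
A = np.array([[1.0, 0.1], [0.0, 1.0]])
Hm = np.array([[1.0, 0.0]])
x, P = np.zeros(2), np.eye(2)
x, P = ekf_step(x, P, None, np.array([0.5]),
                f=lambda x, u: A @ x, h=lambda x: Hm @ x,
                F=lambda x, u: A, H=lambda x: Hm,
                Q=0.01 * np.eye(2), R=np.array([[0.1]]))
print(x)
```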


A feasibility study of daytime fog and low stratus detection with TERRA/AQUA-MODIS over land

METEOROLOGICAL APPLICATIONS, Issue 2 2006
Jörg Bendix
Abstract A scheme for the detection of fog and low stratus over land during daytime based on data of the MODIS (Moderate Resolution Imaging Spectroradiometer) instrument is presented. The method is based on an initial threshold test procedure in the MODIS solar bands 1–7 (0.62–2.155 µm). Fog and low stratus detection generally relies on the definition of minimum and maximum fog and low stratus properties, which are converted to spectral thresholds by means of radiative transfer calculations (RTC). Extended sensitivity studies reveal that thresholds mainly depend on the solar zenith angle and, hence, illumination-dependent threshold functions are developed. Areas covered by snow, ice and mid-/high-level clouds as well as bright/hazy land surfaces are omitted from the initial classification result by means of a subsequent cloud-top height test based on MODIS IR band 31 (at 12 µm) and a NIR/VIS ratio test. The validation of the final fog and low stratus mask generally shows a satisfactory performance of the scheme. Validation problems occur due to the late overpass time of the TERRA platform and the time lag between SYNOP and satellite observations. Apparent misclassifications are mainly found at the edge of the fog layers, probably due to over- or underestimation of fog and low stratus cover in the transition zone from fog to haze. Copyright © 2006 Royal Meteorological Society. [source]
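
The core mechanism, an illumination-dependent threshold test per solar band, can be pictured as below. The threshold function and reflectance values are invented placeholders; the paper derives its actual thresholds from radiative transfer calculations.

```python
import math

def fog_threshold(sza_deg: float, r0: float = 0.35, k: float = 0.15) -> float:
    """Illustrative minimum-reflectance threshold vs. solar zenith angle.

    A fog/low-stratus top looks darker as illumination weakens, so the
    threshold is relaxed with growing solar zenith angle (sza).
    """
    return r0 * (math.cos(math.radians(sza_deg)) + k)

def looks_like_fog(band1_reflectance: float, sza_deg: float) -> bool:
    # First test of a longer cascade (snow, cirrus and haze tests would follow).
    return band1_reflectance >= fog_threshold(sza_deg)

# The same pixel fails at high sun but passes at low sun:
print(looks_like_fog(0.20, sza_deg=30.0))   # False with these toy numbers
print(looks_like_fog(0.20, sza_deg=75.0))   # True
```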


An octave-band smooth-wall pyramidal horn

MICROWAVE AND OPTICAL TECHNOLOGY LETTERS, Issue 4 2006
Krishnasamy T. Selvan
Abstract A smooth-wall pyramidal horn offering satisfactory performance over an octave bandwidth (in the frequency range 4–8 GHz) is reported. While the VSWR is 1.5 or better, the pattern and gain characteristics are also acceptable. The experimental results are presented and compared with those obtained by simulation. © 2006 Wiley Periodicals, Inc. Microwave Opt Technol Lett 48: 691–693, 2006; Published online in Wiley InterScience (www.interscience.wiley.com). DOI 10.1002/mop.21444 [source]
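
A VSWR of 1.5 corresponds to a return loss of about 14 dB, a common benchmark for acceptable matching; the conversion is a one-liner:

```python
import math

def vswr_to_return_loss_db(vswr: float) -> float:
    """Return loss (dB) from VSWR: RL = -20*log10((VSWR-1)/(VSWR+1))."""
    reflection = (vswr - 1.0) / (vswr + 1.0)
    return -20.0 * math.log10(reflection)

print(f"{vswr_to_return_loss_db(1.5):.1f} dB")   # ~14.0 dB
```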


Estimation of Aqueous-Phase Reaction Rate Constants of Hydroxyl Radical with Phenols, Alkanes and Alcohols

MOLECULAR INFORMATICS, Issue 11-12 2009
Ya-nan Wang
Abstract A quantitative structure-activity relationship (QSAR) model was developed for the aqueous-phase hydroxyl radical reaction rate constants (kOH) employing quantum chemical descriptors and multiple linear regression (MLR). The QSAR development followed the OECD guidelines, with special attention to validation, applicability domain (AD) and mechanistic interpretation. The established model yielded satisfactory performance: the squared correlation coefficient (R²) was 0.905, the root mean squared error (RMSE) was 0.139 log units, the leave-many-out cross-validated Q²LMO was 0.806, and the externally validated Q²EXT was 0.922. The AD of the model, covering phenols, alkanes and alcohols, was analyzed by Williams plot. The main molecular structural factors governing kOH are the energy of the highest occupied molecular orbital (EHOMO), the average net atomic charge on hydrogen atoms, molecular surface area (MSA) and dipole moment (μ). It was concluded that kOH increases with increasing EHOMO and MSA, and decreases with increasing hydrogen charge and μ. [source]
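
As a hedged illustration of the workflow (MLR on quantum chemical descriptors, then internal validation), the sketch below fits a small MLR model and reports R², RMSE, and a leave-one-out Q² on synthetic data; the descriptor values and sample size are made up, and leave-one-out stands in for the paper's leave-many-out scheme.

```python
import numpy as np

def fit_mlr(X: np.ndarray, y: np.ndarray) -> np.ndarray:
    """Least-squares coefficients with an intercept column prepended."""
    A = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return beta

def predict(beta, X):
    return beta[0] + X @ beta[1:]

rng = np.random.default_rng(7)
X = rng.normal(size=(40, 4))               # stand-ins for EHOMO, q_H, MSA, mu
y = 1.0 + X @ np.array([0.8, -0.5, 0.3, -0.2]) + rng.normal(scale=0.1, size=40)

beta = fit_mlr(X, y)
resid = y - predict(beta, X)
ss_tot = ((y - y.mean()) ** 2).sum()
r2 = 1 - (resid ** 2).sum() / ss_tot
rmse = np.sqrt((resid ** 2).mean())

# Leave-one-out cross-validated Q2
press = sum((y[i] - predict(fit_mlr(np.delete(X, i, 0), np.delete(y, i)),
                            X[i:i + 1]))[0] ** 2 for i in range(len(y)))
q2 = 1 - press / ss_tot
print(f"R2={r2:.3f}  RMSE={rmse:.3f}  Q2_LOO={q2:.3f}")
```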


Quantitative Structure–Activity Relationship Studies for the Binding Affinities of Imidazobenzodiazepines for the α6 Benzodiazepine Receptor Isoform Utilizing Optimized Blockwise Variable Combination by Particle Swarm Optimization for Partial Least Squares Modeling

MOLECULAR INFORMATICS, Issue 1 2007
Leqian Hu
Abstract Binding affinities of a series of substituted imidazobenzodiazepines for the α6 benzodiazepine receptor (BzR) isoform are investigated by Optimized Blockwise Variable Combination (OBVC) by Particle Swarm Optimization (PSO) based on Partial Least Squares (PLS) modeling. The QSAR analysis showed that MolRef, AlogP, MRCM**-3, rotatable bonds (Rotlbonds), hydrogen bond acceptors (Hbond acceptor), five Jurs descriptors, two Shadow indices descriptors and the principal moment of inertia are the most important descriptors among all those investigated. One can change the molar refractivity, the polar interactions between molecules, the shape of the molecules, the principal moments of inertia about the principal axes of a molecule, the hydrophobic character of the molecule, and the number of Rotlbonds and Hbond acceptors of the compounds to adjust the binding affinities of imidazobenzodiazepines for the α6 BzR isoform. The Quantitative Structure–Activity Relationship (QSAR) analysis result was also compared with MLR, PLS, and hierarchical PLS algorithms. It has been demonstrated that OBVC by PSO for PLS modeling shows satisfactory performance in the QSAR analysis. [source]
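
The blockwise PSO-PLS machinery is involved; as a hedged sketch of the optimization engine alone, here is a plain continuous PSO. The blockwise variable encoding and the PLS cross-validation objective from the paper are replaced by stand-ins (a dummy quadratic objective and a 0.5 selection cutoff).

```python
import numpy as np

def pso_minimize(f, dim, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5,
                 seed=0):
    """Plain particle swarm optimization over [0, 1]^dim."""
    rng = np.random.default_rng(seed)
    x = rng.random((n_particles, dim))
    v = np.zeros_like(x)
    pbest, pbest_val = x.copy(), np.array([f(p) for p in x])
    g = pbest[pbest_val.argmin()].copy()

    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, 0.0, 1.0)
        vals = np.array([f(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        g = pbest[pbest_val.argmin()].copy()
    return g, pbest_val.min()

# Toy use: weights above 0.5 would "select" a descriptor block; the objective
# in the real workflow would be cross-validated PLS error.
best, val = pso_minimize(lambda p: ((p - 0.3) ** 2).sum(), dim=5)
print(best.round(2), f"{val:.4f}")
```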


Intraoperative Study of Polarization and Evoked Response Signals in Different Endocardial Electrode Designs

PACING AND CLINICAL ELECTROPHYSIOLOGY, Issue 7 2001
CHING LAU
LAU, C., et al.: Intraoperative Study of Polarization and Evoked Response Signals in Different Endocardial Electrode Designs. Some new-generation pacemakers use an algorithm based on evoked response (ER) detection to verify beat-to-beat capture and to enable automatic adjustment of output. This is a prospective acute study of the polarization signal (PS) and ER in nine currently available electrodes. Intraoperative testing of ventricular bipolar electrodes used the Autocapture (AC) algorithm. The intrinsic R wave, PS, ER, acceptance of AC function, and stimulation thresholds (STs) were obtained. Ventricular electrodes were categorized as follows: titanium nitride (TiN)-coated passive and active fixation; high impedance (HI), passive fixation (VP); iridium oxide-coated titanium (IROX) (VI); and platinum helix (PH) active fixation. Acute testing was performed in 217 patients (average age 74.26 years; 59.6% men); the primary pacing indication was sick sinus syndrome (SSS, 46.3%). There were no significant differences found with respect to R wave and threshold between the various electrodes. PH active-fixation electrodes had significantly higher ER and PS than the other groups, including the TiN-coated active-fixation electrodes. TiN-coated electrodes (active and passive fixation) had significantly lower PS than other electrodes. As a result, TiN electrodes had a significantly higher functional rate of AC (91.7%), whereas PH had the lowest rate (0%). In conclusion, (1) polarization characteristics are significantly different for commercially available ventricular electrodes, (2) certain physical features at the tissue-to-electrode interface, like TiN coating, appear to be more important in determining PS than electrode tip size and fixation method, and (3) the current algorithm for AC requires electrodes that provide low polarization for satisfactory performance. [source]


Micro-electrospray with stainless steel emitters

RAPID COMMUNICATIONS IN MASS SPECTROMETRY, Issue 14 2003
Wenqing Shui
The physical processes underlying micro-electrospray (micro-ES) performance were investigated using a stainless steel (SS) emitter with a blunt tip. Sheathless micro-ES could be generated at a blunt SS tip without any tapering or sanding if ESI conditions were optimized. The Taylor cone was found to shrink around the inner diameter of the SS tubing, which permitted a low flow rate of 150 nL/min for sheathless microspray on the blunt tip (100 µm i.d. × 400 µm o.d.). It is believed that the wettability and/or hydrophobicity of SS tips are responsible for their micro-ES performance. The outlet orifice was further nipped to reduce the size of the spray cone and limit the flow rate to 50–150 nL/min, resulting in peptide detection down to attomole quantities consumed per spectrum. The SS emitter was also integrated into a polymethylmethacrylate microchip and demonstrated satisfactory performance in the analysis and identification of a myoglobin digest. Copyright © 2003 John Wiley & Sons, Ltd. [source]


Minimally Invasive Thyroidectomy: Basic and Advanced Techniques

THE LARYNGOSCOPE, Issue 3 2006
David J. Terris MD
Abstract Objective: Minimal access surgery in the thyroid compartment has evolved considerably over the past 10 years and now takes many forms. We advocate at least two distinct approaches, depending on the disease process and multiple patient factors. The technical aspects are explored in depth with liberal use of videographic demonstration. Methods: The authors conducted a comparison of two distinct surgical techniques with photographic and videographic documentation of two distinct minimal access approaches to the thyroid compartment termed minimally invasive thyroidectomy (MITh) and minimally invasive video-assisted thyroidectomy (MIVAT). Both historic and previously unpublished data (age, gender, pathology, incision length, and complications) are systematically analyzed. Results: Patients who underwent minimally invasive thyroidectomy (n = 31) had a mean age of 39.4 ± 10.7 years; seven were male and 24 were female. The most common diagnosis was follicular or Hürthle cell adenoma (29%), followed by papillary or follicular cancer (26%). The mean incision length was 4.9 ± 1.0 cm. One patient developed a hypertrophic scar and one patient developed thrombophlebitis of the anterior jugular vein. There were 14 patients in the MIVAT group with a mean age of 43.7 ± 11.4 years; one was male and 13 were female. The majority of patients had follicular adenoma (42.9%) or papillary carcinoma (21.4%) as their primary diagnosis. The mean incision length was 25 ± 4.3 mm (range, 20–30 mm), and there were no complications. Conclusions: Two distinct approaches to minimal access thyroid surgery are now available. The choice of approach depends on a number of patient and disease factors. Careful patient selection will result in continued safe and satisfactory performance of minimally invasive thyroid surgery. [source]


Wavelet transform in denoising magnetic archaeological prospecting data

ARCHAEOLOGICAL PROSPECTION, Issue 2 2007
B. Tsivouraki
Abstract Magnetic measurements in archaeological prospecting are often affected by cultural noise having the same high-frequency content as anomalies arising from buried antiquities. Also, in many cases the microrelief of the ground surface causes a noise that is coherent, pseudorandom and periodic. The main cause of this kind of noise is the ploughing of the earth. In this paper the efficiency of a wavelet denoising scheme is tested with respect to these types of unwanted disturbance. The proposed scheme combines the cycle-spinning algorithm with a variable threshold calculated in each cycle of the algorithm. Tests on synthetic and real data show satisfactory performance of the technique in suppressing both the white noise and the coherent noise caused by the systematic undulations of the ground surface. Copyright © 2007 John Wiley & Sons, Ltd. [source]
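
The scheme combines cycle-spinning with a threshold recomputed on each shift. A minimal sketch with PyWavelets is below; it uses a fixed universal threshold per shift as a stand-in for the paper's variable threshold, so treat it as illustrative only.

```python
import numpy as np
import pywt  # PyWavelets

def cycle_spin_denoise(signal: np.ndarray, wavelet: str = "db4",
                       level: int = 4, n_shifts: int = 8) -> np.ndarray:
    """Wavelet shrinkage averaged over circular shifts (cycle spinning)."""
    n = len(signal)
    out = np.zeros(n)
    for s in range(n_shifts):
        shifted = np.roll(signal, s)
        coeffs = pywt.wavedec(shifted, wavelet, level=level)
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745   # noise scale estimate
        thr = sigma * np.sqrt(2 * np.log(n))             # universal threshold
        coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft")
                                for c in coeffs[1:]]
        out += np.roll(pywt.waverec(coeffs, wavelet)[:n], -s)
    return out / n_shifts

# Toy anomaly (a Gaussian bump) buried in white noise:
t = np.linspace(0, 1, 512)
truth = np.exp(-((t - 0.5) / 0.05) ** 2)
noisy = truth + 0.1 * np.random.default_rng(3).normal(size=512)
clean = cycle_spin_denoise(noisy)
print(np.abs(clean - truth).mean())
```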


The PediPump: A Versatile, Implantable Pediatric Ventricular Assist Device – Update IV

ARTIFICIAL ORGANS, Issue 11 2009
Brian W. Duncan
Abstract Cleveland Clinic's PediPump (Cleveland, OH, USA) is a ventricular assist device designed for the support of pediatric patients. The PediPump is a mixed-flow ventricular assist device with a magnetically suspended impeller measuring 10.5 mm in diameter by 64.5 mm in length. Progress and achievements for the PediPump program are considered according to the development project's three primary objectives. (i) Basic engineering: along with size reductions, substantial design improvements have been incorporated in each design iteration, including the motor, magnetic bearings, axial touch points, and heat transfer path. (ii) Anatomic modeling and device-fitting studies: techniques based on computed tomography and magnetic resonance imaging have been developed to create three-dimensional anatomic-modeling and device-fitting tools to facilitate device implantation and to assist in preoperative planning. (iii) In vivo testing: to date, six acute (6-h duration) and nine chronic (30-day target duration) implantations have been performed in sheep; implantation of the PediPump appears to be relatively easy, with excellent hemodynamic performance and minimal hemolysis during support. Cleveland Clinic's PediPump program, supported by the National Heart, Lung and Blood Institute's Pediatric Circulatory Support Program, has led to the development of a pediatric ventricular assist device that has satisfactory performance in preclinical evaluation and appears ready to support a program of clinical testing. [source]


USING LEAST SQUARE SVM FOR NONLINEAR SYSTEMS MODELING AND CONTROL

ASIAN JOURNAL OF CONTROL, Issue 2 2007
Haoran Zhang
ABSTRACT The support vector machine is a learning technique based on the structural risk minimization principle, as well as a class of regression methods with good generalization ability. This paper first introduces the mathematical model of the regression least squares support vector machine (LSSVM) and designs incremental learning algorithms based on the block-matrix calculation formula; it then uses the LSSVM to model nonlinear systems and, on that basis, controls them by the model predictive method. Simulation experiments indicate that the proposed method provides satisfactory performance: it achieves modeling performance superior to the conventional method based on neural networks, and it also achieves good control performance. [source]
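
For context, fitting a regression LSSVM reduces to a single linear solve in the dual variables, which is also what makes incremental block-matrix updates natural. A minimal sketch, with an RBF kernel and made-up hyperparameters:

```python
import numpy as np

def rbf_kernel(A: np.ndarray, B: np.ndarray, sigma: float = 1.0) -> np.ndarray:
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def lssvm_fit(X: np.ndarray, y: np.ndarray, gamma: float = 10.0,
              sigma: float = 1.0):
    """Solve the LSSVM regression dual: one linear system.

    [[0, 1^T], [1, K + I/gamma]] @ [b, alpha] = [0, y]
    """
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate([[0.0], y])
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]          # bias b, dual weights alpha

def lssvm_predict(Xq, X, b, alpha, sigma=1.0):
    return rbf_kernel(Xq, X, sigma) @ alpha + b

X = np.linspace(-3, 3, 40)[:, None]
y = np.sin(X[:, 0]) + 0.05 * np.random.default_rng(5).normal(size=40)
b, alpha = lssvm_fit(X, y)
print(np.abs(lssvm_predict(X, X, b, alpha) - np.sin(X[:, 0])).max())
```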


High-performance liquid chromatography assays for desmethoxyyangonin, methysticin, kavain and their microsomal metabolites

BIOMEDICAL CHROMATOGRAPHY, Issue 1 2009
Shuang Fu
Abstract Three novel, simple and reproducible high-performance liquid chromatography quantitative assays with UV detection were developed and validated for three major kavalactones (desmethoxyyangonin, methysticin and kavain) in rat liver microsomes, using diazepam as an internal standard. Liquid–liquid extraction was used for sample preparation, and analysis was performed on a Shimadzu® 10A high-performance liquid chromatography system. The analysis was carried out in reversed-phase mode with a Luna® C18 column (150 × 2.00 mm, 3 µm) at 40°C. The limit of quantitation was 0.1 µg/mL using 0.25 mL of microsomal solution. The assays were linear over the range 0.1–10 µg/mL for desmethoxyyangonin, methysticin and kavain. Quality control samples exhibited good accuracy and precision, with relative standard deviations lower than 15% and recoveries between 85 and 105%. The assays exhibited satisfactory performance with high sensitivity for quantifying desmethoxyyangonin, methysticin and kavain in rat liver microsomes and were successfully used to determine the three kavalactones and their microsomal metabolites. Copyright © 2008 John Wiley & Sons, Ltd. [source]
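
The validation figures quoted (linearity over 0.1–10 µg/mL, recoveries between 85 and 105%, RSD below 15%) follow from standard internal-standard calibration arithmetic. A toy sketch with invented numbers, not the paper's data:

```python
import numpy as np

# Hypothetical calibration of one kavalactone over 0.1-10 ug/mL (toy data).
conc = np.array([0.1, 0.5, 1.0, 2.5, 5.0, 10.0])                # ug/mL
peak_ratio = np.array([0.021, 0.103, 0.198, 0.507, 1.01, 2.02]) # analyte/IS area

slope, intercept = np.polyfit(conc, peak_ratio, 1)

def back_calc(ratio: float) -> float:
    """Concentration from a measured analyte/internal-standard peak ratio."""
    return (ratio - intercept) / slope

# QC sample measured in triplicate at a nominal 2.5 ug/mL:
qc = np.array([back_calc(r) for r in (0.49, 0.52, 0.50)])
recovery = 100 * qc.mean() / 2.5
rsd = 100 * qc.std(ddof=1) / qc.mean()
print(f"recovery {recovery:.1f}%  RSD {rsd:.1f}%")
```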


Statistical Inference For Risk Difference in an Incomplete Correlated 2 × 2 Table

BIOMETRICAL JOURNAL, Issue 1 2003
Nian-Sheng Tang
Abstract In some infectious disease studies and two-step treatment studies, a 2 × 2 table with a structural zero can arise in situations where it is theoretically impossible for a particular cell to contain observations, or where a structural void is introduced by design. In this article, we propose a score test of hypotheses pertaining to the marginal and conditional probabilities in a 2 × 2 table with a structural zero via the risk/rate difference measure. A score-test-based confidence interval is also outlined. We evaluate the performance of the score test and the existing likelihood ratio test. Our empirical results show similar and satisfactory performance of the two tests (with appropriate adjustments) in terms of coverage probability and expected interval width. Both tests consistently perform well from small- to moderate-sample designs. The score test, however, has the advantage that it is undefined in only one scenario, whereas the likelihood ratio test can be undefined in many scenarios. We illustrate our method with a real example from a two-step tuberculosis skin test study. [source]
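
The evaluation criteria named here, coverage probability and expected width, are estimated by simulation. The sketch below shows that generic loop for a risk-difference interval, using a plain Wald interval on a complete 2 × 2 table as a stand-in for the paper's structural-zero score method.

```python
import numpy as np

def wald_ci(x1, n1, x2, n2, z=1.96):
    """Wald interval for the risk difference p1 - p2 (stand-in method)."""
    p1, p2 = x1 / n1, x2 / n2
    se = np.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    d = p1 - p2
    return d - z * se, d + z * se

rng = np.random.default_rng(11)
p1, p2, n1, n2, reps = 0.30, 0.15, 50, 50, 10_000
covered, width = 0, 0.0
for _ in range(reps):
    lo, hi = wald_ci(rng.binomial(n1, p1), n1, rng.binomial(n2, p2), n2)
    covered += lo <= p1 - p2 <= hi
    width += hi - lo
print(f"coverage {covered / reps:.3f}, expected width {width / reps:.3f}")
```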


Exact Log-Rank Tests for Unequal Follow-Up

BIOMETRICS, Issue 4 2003
Georg Heinze
Summary. The asymptotic log-rank and generalized Wilcoxon tests are the standard procedures for comparing samples of possibly censored survival times. For comparison of samples of very different sizes, an exact test is available that is based on a complete permutation of log-rank or Wilcoxon scores. While the asymptotic tests do not keep their nominal sizes if sample sizes differ substantially, the exact complete permutation test requires equal follow-up of the samples. Therefore, we have developed and present two new exact tests also suitable for unequal follow-up. The first of these is an exact analogue of the asymptotic log-rank test that conditions on observed risk sets, whereas the second approach permutes survival times while conditioning on the realized follow-up in each group. In an empirical study, we compare the new procedures with the asymptotic log-rank test, the exact complete permutation test, and an earlier proposed approach that equalizes the follow-up distributions using artificial censoring. Results confirm highly satisfactory performance of the exact procedure conditioning on realized follow-up, particularly in the case of unequal follow-up. The advantage of this test over other options of analysis is finally exemplified in the analysis of a breast cancer study. [source]
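
To make the distinction concrete, a naive permutation log-rank test (permute group labels, recompute the statistic) is sketched below. As the abstract notes, this simple label permutation is only valid under equal follow-up, which is exactly the limitation the paper's two new tests address.

```python
import numpy as np

def logrank_stat(time, event, group):
    """Observed-minus-expected events in group 1, summed over event times."""
    o_minus_e = 0.0
    for t in np.unique(time[event == 1]):
        at_risk = time >= t
        d = ((time == t) & (event == 1)).sum()          # events at t
        n = at_risk.sum()                               # total at risk at t
        n1 = (at_risk & (group == 1)).sum()             # group-1 at risk at t
        d1 = ((time == t) & (event == 1) & (group == 1)).sum()
        o_minus_e += d1 - d * n1 / n
    return o_minus_e

def permutation_logrank(time, event, group, reps=5000, seed=0):
    """Two-sided permutation p-value via random label permutations."""
    rng = np.random.default_rng(seed)
    obs = abs(logrank_stat(time, event, group))
    hits = sum(abs(logrank_stat(time, event, rng.permutation(group))) >= obs
               for _ in range(reps))
    return (hits + 1) / (reps + 1)

time = np.array([5, 8, 12, 3, 9, 14, 6, 11, 2, 7], float)
event = np.array([1, 0, 1, 1, 1, 0, 1, 0, 1, 1])
group = np.array([0, 0, 0, 0, 0, 1, 1, 1, 1, 1])
print(permutation_logrank(time, event, group))
```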


A Solution to the Problem of Monotone Likelihood in Cox Regression

BIOMETRICS, Issue 1 2001
Georg Heinze
Summary. The phenomenon of monotone likelihood is observed in the fitting process of a Cox model if the likelihood converges to a finite value while at least one parameter estimate diverges to ±∞. Monotone likelihood primarily occurs in small samples with substantial censoring of survival times and several highly predictive covariates. Previous options to deal with monotone likelihood have been unsatisfactory. The solution we suggest is an adaptation of a procedure by Firth (1993, Biometrika 80, 27–38) originally developed to reduce the bias of maximum likelihood estimates. This procedure produces finite parameter estimates by means of penalized maximum likelihood estimation. Corresponding Wald-type tests and confidence intervals are available, but it is shown that penalized likelihood ratio tests and profile penalized likelihood confidence intervals are often preferable. An empirical study of the suggested procedures confirms satisfactory performance of both estimation and inference. The advantage of the procedure over previous options of analysis is finally exemplified in the analysis of a breast cancer study. [source]
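
Firth's correction, as adapted here, maximizes a penalized log likelihood in which the penalty is half the log determinant of the Fisher information (in the Cox setting, the partial log likelihood takes the place of the ordinary one):

```latex
% Firth-type penalized (partial) log likelihood used to prevent
% monotone likelihood; I(\beta) is the information matrix.
\ell^{*}(\beta) = \ell(\beta) + \tfrac{1}{2}\,\log\left|I(\beta)\right|
```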


Validation of hydrological models for climate scenario simulation: the case of Saguenay watershed in Quebec

HYDROLOGICAL PROCESSES, Issue 23 2007
Yonas B. Dibike
Abstract This paper presents the results of an investigation into the problems associated with using downscaled meteorological data for hydrological simulations of climate scenarios. The influence of both the hydrological models and the meteorological inputs driving these models on climate scenario simulation studies is investigated. A regression-based statistical tool (SDSM) is used to downscale the daily precipitation and temperature data based on climate predictors derived from the Canadian global climate model (CGCM1), and two types of hydrological model, namely the physically based watershed model WatFlood and the lumped-conceptual modelling system HBV-96, are used to simulate the flow regimes in the major rivers of the Saguenay watershed in Quebec. The models are validated with meteorological inputs from both the historical records and the statistically downscaled outputs. Although the two hydrological models demonstrated satisfactory performance in simulating stream flows in most of the rivers when provided with historical precipitation and temperature records, both performed less well and responded differently when provided with downscaled precipitation and temperature data. By demonstrating the problems in accurately simulating river flows based on downscaled data for the current climate, we discuss the difficulties associated with downscaling and with the hydrological models used in estimating the possible hydrological impact of climate change scenarios. Copyright © 2007 John Wiley & Sons, Ltd. [source]
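
Model performance in studies like this is commonly summarized with the Nash-Sutcliffe efficiency (NSE), which compares simulated flows against predicting the observed mean; the abstract does not name its criterion, so this is offered only as the standard choice, with invented flow values.

```python
import numpy as np

def nash_sutcliffe(obs: np.ndarray, sim: np.ndarray) -> float:
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2).

    1.0 is a perfect fit; values <= 0 mean the model is no better than
    predicting the observed mean flow.
    """
    return 1.0 - ((obs - sim) ** 2).sum() / ((obs - obs.mean()) ** 2).sum()

obs = np.array([120.0, 95.0, 210.0, 180.0, 140.0])   # observed flows, m^3/s
sim = np.array([110.0, 100.0, 190.0, 185.0, 150.0])  # simulated flows
print(f"NSE = {nash_sutcliffe(obs, sim):.3f}")
```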