Accurate Measure (accurate + measure)



Selected Abstracts


Quantitative evaluation of the lengths of homobifunctional protein cross-linking reagents used as molecular rulers

PROTEIN SCIENCE, Issue 7 2001
Nora S. Green
Abstract Homobifunctional chemical cross-linking reagents are important tools for functional and structural characterization of proteins. Accurate measures of the lengths of these molecules currently are not available, despite their widespread use. Stochastic dynamics calculations now provide quantitative measures of the lengths, and length dispersions, of 32 widely used molecular rulers. Significant differences from published data have been found. [source]


The eyes have it: visual pop-out in infants and adults

DEVELOPMENTAL SCIENCE, Issue 2 2006
Scott A. Adler
Visual search studies with adults have shown that stimuli that contain a unique perceptual feature pop out from dissimilar distractors and are unaffected by the number of distractors. Studies with very young infants have suggested that they too might exhibit pop-out. However, infant studies have used paradigms in which pop-out is measured in seconds or minutes, whereas in adults pop-out occurs in milliseconds. In addition, with the previous infant paradigms the effects from higher cognitive processes such as memory cannot be separated from pop-out and selective attention. Consequently, whether infants exhibit the phenomenon of pop-out and have selective attention mechanisms as found in adults is not clear. This study was an initial attempt to design a paradigm that would provide a comparable measure between infants and adults, thereby allowing a more accurate determination of the developmental course of pop-out and selective attention mechanisms. To this end, we measured 3-month-olds' and adults' saccade latencies to visual arrays that contained either a + among Ls (target-present) or all Ls (target-absent) with set sizes of 1, 3, 5 or 8 items. In Experiment 1, infants' saccade latencies remained unchanged in the target-present conditions as set size increased, whereas their saccade latencies increased linearly in the target-absent conditions as set size increased. In Experiment 2, adults' saccade latencies in the target-present and target-absent conditions showed the same pattern as the infants. The only difference between the infants and adults was that the infants' saccade latencies were slower in every condition. These results indicate that infants do exhibit pop-out on a millisecond scale, that pop-out is unaffected by the number of distractors, and that infants likely have similarly functioning selective attention mechanisms. Moreover, the results indicate that eye movement latencies are a more comparable and accurate measure for assessing the phenomenon of pop-out and underlying attentional mechanisms in infants. [source]


Pathological gambling: an increasing public health problem

ACTA PSYCHIATRICA SCANDINAVICA, Issue 4 2001
Article first published online: 7 JUL 200
Gambling has always existed, but only recently has it taken on the endlessly variable and accessible forms we know today. Gambling takes place when something valuable, usually money, is staked on the outcome of an event that is entirely unpredictable. It was only two decades ago that pathological gambling was formally recognized as a mental disorder, when it was included in the DSM-III in 1980. For most people, gambling is a relaxing activity with no negative consequences. For others, however, gambling becomes excessive. Pathological gambling is a disorder that manifests itself through the irrepressible urge to wager money. This disorder ultimately dominates the gambler's life, and has a multitude of negative consequences for both the gambler and the people they interact with, i.e. friends, family members, employers. In many ways, gambling might seem a harmless activity. In fact, it is not the act of gambling itself that is harmful, but the vicious cycle that can begin when a gambler wagers money they cannot afford to lose, and then continues to gamble in order to recuperate their losses. The gambler's 'tragic flaw' of logic lies in their failure to understand that gambling is governed solely by random, chance events. Gamblers fail to recognize this and continue to gamble, attempting to control outcomes by concocting strategies to 'beat the game'. Most, if not all, gamblers try in some way to predict the outcome of a game when they are gambling. A detailed analysis of gamblers' self-verbalizations reveals that most of them behave as though the outcome of the game relied on their personal 'skills'. From the gambler's perspective, skill can influence chance, but in reality, the random nature of chance events is the only determinant of the outcome of the game. The gambler, however, either ignores or simply denies this fundamental rule (1). Experts agree that the social costs of pathological gambling are enormous. Changes in gaming legislation have led to a substantial expansion of gambling opportunities in most industrialized countries around the world, mainly in Europe, America and Australia. Figures for the United States' leisure economy in 1996 show gross gambling revenues of $47.6 billion, which was greater than the combined revenue of $40.8 billion from film box offices, recorded music, cruise ships, spectator sports and live entertainment (2). Several factors appear to be motivating this growth: the desire of governments to identify new sources of revenue without invoking new or higher taxes; tourism entrepreneurs developing new destinations for entertainment and leisure; and the rise of new technologies and forms of gambling (3). As a consequence, prevalence studies have shown increased gambling rates among adults. It is currently estimated that 1–2% of the adult population gambles excessively (4, 5). Given that the prevalence of gambling is related to the accessibility of gambling activities, and that new forms of gambling are constantly being legalized throughout most western countries, this figure is expected to rise. Consequently, physicians and mental health professionals will need to know more about the diagnosis and treatment of pathological gamblers. This disorder may be under-diagnosed because, clinically, pathological gamblers usually seek help for the problems associated with gambling such as depression, anxiety or substance abuse, rather than for the excessive gambling itself.
This issue of Acta Psychiatrica Scandinavica includes the first national survey of problem gambling completed in Sweden, conducted by Volberg et al. (6). This paper is based on a large sample (N = 9917) with an impressively high response rate (89%). Two instruments were used to assess gambling activities: the South Oaks Gambling Screen-Revised (SOGS-R) and an instrument derived from the DSM-IV criteria for pathological gambling. Current (1 year) and lifetime prevalence rates were collected. Results show that 0.6% of the respondents were classified as probable pathological gamblers, and 1.4% as problem gamblers. These data reveal that the prevalence of pathological gamblers in Sweden is significantly lower than what has been observed in many western countries. The authors have pooled the rates of problem (1.4%) and probable pathological gamblers (0.6%) to provide a total of 2.0% for the current prevalence. This 2% should be interpreted with caution, however, as we do not have information on the long-term evolution of these subgroups of gamblers; for example, we do not know how many of each subgroup will become pathological gamblers, and how many will decrease their gambling or stop gambling altogether. Until this information is known, it would be preferable to keep in mind that only 0.6% of the Swedish population has been identified as pathological gamblers. In addition, recent studies show that the SOGS-R may be producing inflated estimates of pathological gambling (7). Thus, future research in this area might benefit from the use of an instrument based on DSM criteria for pathological gambling, rather than the SOGS-R only. Finally, the authors suggest in their discussion that the lower rate of pathological gamblers obtained in Sweden compared to many other jurisdictions may be explained by the greater availability of games based on chance rather than games based on skill or a mix of skill and luck. Before accepting this interpretation, researchers will need to demonstrate that the outcomes of all games are determined by factors other than chance and randomness. Many studies have shown that randomness is the sole determinant of gambling outcomes (1). Inferring that skill is an important issue in gambling may be misleading. While these are important issues to consider, the Volberg et al. survey nevertheless provides crucial information about gambling in a Scandinavian country. Gambling will be an important issue over the next few years in Sweden, and the publication of the Volberg et al. study is a landmark for the Swedish community (scientists, industry, policy makers, etc.). This paper should stimulate interesting discussions and inspire new, much-needed scientific investigations of pathological gambling.
Guido Bondolfi and Robert Ladouceur, Invited Guest Editors, Acta Psychiatrica Scandinavica
References
1. Ladouceur R & Walker M. The cognitive approach to understanding and treating pathological gambling. In: Bellack AS, Hersen M, eds. Comprehensive clinical psychology. New York: Pergamon, 1998:588–601.
2. Christiansen EM. Gambling and the American economy. In: Frey JH, ed. Gambling: socioeconomic impacts and public policy. Thousand Oaks, CA: Sage, 1998;556:36–52.
3. Korn DA & Shaffer HJ. Gambling and the health of the public: adopting a public health perspective. J Gambling Stud 2000;15:289–365.
4. Volberg RA. Problem gambling in the United States. J Gambling Stud 1996;12:111–128.
5. Bondolfi G, Osiek C, Ferrero F. Prevalence estimates of pathological gambling in Switzerland. Acta Psychiatr Scand 2000;101:473–475.
6. Volberg RA, Abbott MW, Rönnberg S, Munck IM. Prevalence and risks of pathological gambling in Sweden. Acta Psychiatr Scand 2001;104:250–256.
7. Ladouceur R, Bouchard C, Rhéaume N, et al. Is the SOGS an accurate measure of pathological gambling among children, adolescents and adults? J Gambling Stud 2000;16:1–24. [source]


Two-Dimensional Assessment of Right Ventricular Function: An Echocardiographic,MRI Correlative Study

ECHOCARDIOGRAPHY, Issue 5 2007
Nagesh S. Anavekar M.D.
Background: While echocardiography is used most frequently to assess right ventricular (RV) function in clinical practice, echocardiography is limited in its ability to provide an accurate measure of RV ejection fraction (RVEF). Hence, quantitative estimation of RV function has proven difficult in clinical practice. Objective: We sought to determine which commonly used echocardiographic measures of RV function were most accurate in comparison with an MRI-derived estimate of RVEF. Methods: We analyzed RV function in 36 patients who had cardiac MRI studies and echocardiograms within a 24 hour period. 2D parameters of RV function, namely right ventricular fractional area change (RVFAC), tricuspid annular motion (TAM), and transverse fractional shortening (TFS), were obtained from the four-chamber view. RV volumes and EFs were derived from volumetric reconstruction based on endocardial tracing of the RV chamber from the short axis images. Echocardiographic assessment of RV function was correlated with MRI findings. Results: RVFAC measured by echocardiography correlated best with MRI-derived RVEF (r = 0.80, P < 0.001). Neither TAM (r = 0.17; P = 0.30) nor TFS (r = 0.12; P < 0.38) was significantly correlated with RVEF. Conclusions: RVFAC is the best of the commonly utilized echocardiographic 2D measures of RV function and correlated best with MRI-derived RV ejection fraction. Condensed Abstract: While echocardiography is used most frequently to assess RV function in clinical practice, echocardiography is limited in its ability to provide an accurate measure of RV ejection fraction (RVEF). Using cardiac MRI, RV fractional area change (RVFAC), determined either by MRI or echocardiography, was found to correlate best with MRI-derived RVEF. [source]
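As an illustration of the 2D metric favored here, right ventricular fractional area change is simply the relative change in RV area between diastole and systole. A minimal sketch in Python, using invented areas and MRI ejection fractions rather than the study's data:

    # Sketch: computing right ventricular fractional area change (RVFAC) from
    # 2D echo areas and correlating it with MRI-derived RVEF across patients.
    # The area/EF values below are made-up illustrations, not data from the study.
    import numpy as np

    def rvfac(end_diastolic_area_cm2, end_systolic_area_cm2):
        """RVFAC = (EDA - ESA) / EDA, expressed as a fraction."""
        return (end_diastolic_area_cm2 - end_systolic_area_cm2) / end_diastolic_area_cm2

    eda = np.array([24.0, 30.0, 22.0, 28.0])       # end-diastolic areas (cm^2), hypothetical
    esa = np.array([12.0, 21.0, 10.0, 18.0])       # end-systolic areas (cm^2), hypothetical
    mri_rvef = np.array([0.52, 0.33, 0.55, 0.38])  # MRI-derived RVEF, hypothetical

    fac = rvfac(eda, esa)
    r = np.corrcoef(fac, mri_rvef)[0, 1]           # Pearson correlation, analogous to the reported r
    print(f"RVFAC values: {np.round(fac, 2)}, correlation with MRI RVEF: r = {r:.2f}")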


Cost-effectiveness analysis based on the number-needed-to-treat: common sense or non-sense?

HEALTH ECONOMICS, Issue 1 2004
Ivar Sønbø Kristiansen
Abstract This paper explores and critically discusses some of the methodological limitations of using the number-needed-to-treat (NNT) in economic evaluation. We argue that NNT may be a straightforward measure of benefit when the effect of an intervention is immediate, but that serious problems arise when the effect is a delay, rather than the avoidance, of an adverse event. In this case, NNT is not a robust or accurate measure of effect, but will vary considerably and inconsistently over time. This weakness will naturally spill over onto any CEA based on NNT. A literature review demonstrated that CEAs based on NNT were all published within the last five years, and that all studies suffered from important limitations. A major weakness of using NNT is the imposed restrictions on the outcome measure, which can only be strictly uni-dimensional and non-generic. The use of NNT in economic evaluations comes at a cost in terms of both methodological shortcomings and a reduced ability for such evaluations to serve as a useful tool in decision-making processes. The use of NNT in economic evaluations might be better avoided. To every complicated question, there is a simple, straightforward, easy, and probably wrong answer (Occam's Sledgehammer). Copyright © 2003 John Wiley & Sons, Ltd. [source]
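The time-dependence problem the authors describe can be illustrated with a toy calculation: when a treatment merely delays events, the absolute risk reduction, and hence the NNT (its reciprocal), shifts with the follow-up horizon chosen. A hedged sketch with invented exponential event rates, not figures from the paper:

    # Sketch: NNT = 1 / (absolute risk reduction). With exponential time-to-event
    # curves (hypothetical hazards), a treatment that only delays events gives an
    # NNT that changes markedly with the follow-up horizon.
    import math

    def cumulative_risk(hazard_per_year, years):
        return 1.0 - math.exp(-hazard_per_year * years)

    hazard_control, hazard_treated = 0.10, 0.07   # hypothetical yearly hazards

    for years in (1, 5, 10, 20):
        arr = cumulative_risk(hazard_control, years) - cumulative_risk(hazard_treated, years)
        print(f"{years:>2} y follow-up: ARR = {arr:.3f}, NNT = {1/arr:.1f}")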


Rapid assessment of internodal myelin integrity in central nervous system tissue

JOURNAL OF NEUROSCIENCE RESEARCH, Issue 4 2010
Daniel A. Kirschner
Abstract Monitoring pathology/regeneration in experimental models of de-/remyelination requires an accurate measure not only of functional changes but also of the amount of myelin. We tested whether X-ray diffraction (XRD), which measures periodicity in unfixed myelin, can assess the structural integrity of myelin in fixed tissue. From laboratories involved in spinal cord injury research and in studying the aging primate brain, we solicited "blind" samples and used an electronic detector to record rapidly the diffraction patterns (30 min each pattern) from them. We assessed myelin integrity by measuring its periodicity and relative amount. Fixation of tissue itself introduced ±10% variation in periodicity and ±40% variation in relative amount of myelin. For samples having the most native-like periods, the relative amounts of myelin detected allowed distinctions to be made between normal and demyelinating segments, between motor and sensory tracts within the spinal cord, and between aged and young primate CNS. Different periodicities also allowed distinctions to be made between samples from spinal cord and nerve roots and between well-fixed and poorly fixed samples. Our findings suggest that, in addition to evaluating the effectiveness of different fixatives, XRD could also be used as a robust and rapid technique for quantitating the relative amount of myelin among spinal cords and other CNS tissue samples from experimental models of de- and remyelination. © 2009 Wiley-Liss, Inc. [source]


A method for interleaved acquisition of a vascular input function for dynamic contrast-enhanced MRI in experimental rat tumours

NMR IN BIOMEDICINE, Issue 3 2004
Dominick J. O. McIntyre
Abstract Dynamic contrast-enhanced MRI is widely used for the evaluation of the response of experimental rodent tumours to antitumour therapy, particularly for the newly developing antiangiogenic and antivascular agents. However, standard models require a time-course for the plasma concentration of contrast agent (usually referred to as the arterial input function) to calculate the transfer constant Ktrans from the dynamic time-course data. Ideally, the plasma concentration time-course should be measured during each experiment to obtain the most accurate measure of Ktrans. This is technically difficult in rodents, so assumed values are generally used. A method is presented here using interleaved acquisitions from a tail coil to obtain the plasma concentration simultaneously with DCE-MRI data obtained from a solenoid coil around the tumour. The SNR of the resulting vascular input function data is high compared with methods using a volume coil to acquire plasma concentrations from the aorta and vena cava. Copyright © 2004 John Wiley & Sons, Ltd. [source]


The Dynamics of Ethnic Fragmentation

AMERICAN JOURNAL OF ECONOMICS AND SOCIOLOGY, Issue 2 2005
A Proposal for an Expanded Measurement Index
This paper identifies problems associated with the current empirical measurement of ethnic diversity in economic development literature. An expanded index of ethnic diversity is proposed to include variables such as religion and race, and the results are compared to the prevailing index utilized in empirical literature. The mean of the proposed index is significantly larger and the standard deviation is significantly smaller than that of the prevailing index. This would suggest that disparities in ethnic diversity among countries are not as wide as previously assumed. Further, these results confirm that a comprehensive and more accurate measure of ethnic diversity requires more than a linguistic measurement, which is the primary factor utilized in the prevailing index. [source]
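For context, the prevailing fractionalization index is usually computed as one minus the sum of squared group shares, i.e. the probability that two randomly drawn individuals belong to different groups; an expanded index along the lines proposed could combine such scores across several dimensions. A minimal sketch with invented shares (the simple averaging across dimensions is an assumption for illustration, not the paper's exact construction):

    # Sketch: fractionalization = 1 - sum(p_i^2) for group shares p_i.
    # Shares and the averaging across dimensions are illustrative assumptions.
    def fractionalization(shares):
        return 1.0 - sum(p * p for p in shares)

    dimensions = {
        "language": [0.60, 0.30, 0.10],
        "religion": [0.50, 0.40, 0.10],
        "race":     [0.80, 0.15, 0.05],
    }

    per_dimension = {k: fractionalization(v) for k, v in dimensions.items()}
    expanded_index = sum(per_dimension.values()) / len(per_dimension)
    print(per_dimension, f"expanded index ~ {expanded_index:.3f}")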


Hedging or Market Timing?

THE JOURNAL OF FINANCE, Issue 2 2005
Selecting the Interest Rate Exposure of Corporate Debt
ABSTRACT This paper examines whether firms are hedging or timing the market when selecting the interest rate exposure of their new debt issuances. I use a more accurate measure of the interest rate exposure chosen by firms by combining the initial exposure of newly issued debt securities with their use of interest rate swaps. The results indicate that the final interest rate exposure is largely driven by the slope of the yield curve at the time the debt is issued. These results suggest that interest rate risk management practices are primarily driven by speculation or myopia, not hedging considerations. [source]


Implied correlation index: A new measure of diversification

THE JOURNAL OF FUTURES MARKETS, Issue 2 2005
Vasiliki D. Skintzi
Most approaches in forecasting future correlation depend on the use of historical information as their basic information set. Recently, there have been some attempts to use the notion of "implied" correlation as a more accurate measure of future correlation. This study proposes an innovative methodology for backing out implied correlation measures from index options. This new measure, called the implied correlation index, reflects the market view of the future level of diversification in the market portfolio represented by the index. The methodology is applied to the Dow Jones Industrial Average index, and the statistical properties and the dynamics of the proposed implied correlation measure are examined. The evidence of this study indicates that the implied correlation index fluctuates substantially over time and displays strong dynamic dependence. Moreover, there is a systematic tendency for the implied correlation index to increase when the market index returns decrease and/or the market volatility increases, indicating limited diversification when it is needed most. Finally, the forecast performance of the implied correlation index is assessed. Although the implied correlation index is a biased forecast of realized correlation, it has a high explanatory power, and it is orthogonal to the information set compared to a historical forecast. © 2005 Wiley Periodicals, Inc. Jrl Fut Mark 25:171–197, 2005 [source]
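A common way of backing an average implied correlation out of index options treats the index variance as the weighted combination of component variances plus a single common pairwise correlation. The sketch below illustrates that identity with invented weights and implied volatilities; it is not necessarily the estimator used in the paper:

    # Sketch: average implied correlation rho solving
    #   sigma_index^2 = sum_i (w_i s_i)^2 + rho * sum_{i != j} w_i w_j s_i s_j,
    # where s_i are option-implied volatilities of the index components.
    # All numbers are hypothetical, not Dow Jones data.
    import numpy as np

    w = np.array([0.4, 0.35, 0.25])    # index weights (hypothetical)
    s = np.array([0.25, 0.30, 0.20])   # component implied vols (hypothetical)
    sigma_index = 0.22                 # index option implied vol (hypothetical)

    own_var = np.sum((w * s) ** 2)
    cross = np.sum(np.outer(w * s, w * s)) - own_var   # sum over i != j of w_i w_j s_i s_j
    implied_corr = (sigma_index ** 2 - own_var) / cross
    print(f"implied correlation index ~ {implied_corr:.3f}")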


The primary FRCA OSCE – an accurate measure of communication skills?

ANAESTHESIA, Issue 1 2009
D. J. McColl
No abstract is available for this article. [source]


Risk Assessment for Quantitative Responses Using a Mixture Model

BIOMETRICS, Issue 2 2000
Mehdi Razzaghi
Summary. A problem that frequently occurs in biological experiments with laboratory animals is that some subjects are less susceptible to the treatment than others. A mixture model has traditionally been proposed to describe the distribution of responses in treatment groups for such experiments. Using a mixture dose-response model, we derive an upper confidence limit on additional risk, defined as the excess risk over the background risk due to an added dose. Our focus will be on experiments with continuous responses for which risk is the probability of an adverse effect defined as an event that is extremely rare in controls. The asymptotic distribution of the likelihood ratio statistic is used to obtain the upper confidence limit on additional risk. The method can also be used to derive a benchmark dose corresponding to a specified level of increased risk. The EM algorithm is utilized to find the maximum likelihood estimates of model parameters and an extension of the algorithm is proposed to derive the estimates when the model is subject to a specified level of added risk. An example is used to demonstrate the results, and it is shown that by using the mixture model a more accurate measure of added risk is obtained. [source]
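The additional-risk idea for a continuous endpoint can be sketched as follows: under a two-component (susceptible/non-susceptible) normal mixture, risk at a given dose is the probability of an extreme response, and additional risk is that probability minus its background value. The parameters below are invented, and the paper's EM estimation and likelihood-ratio confidence limit are not reproduced here:

    # Sketch: additional risk under a mixture dose-response model for a
    # continuous endpoint. "Adverse" = response below a cutoff that is rare in
    # controls. Parameters are hypothetical illustrations only.
    from scipy.stats import norm

    def risk(dose, cutoff, p_susceptible=0.3, slope=-2.0, mu0=100.0, sigma=10.0):
        # Susceptible animals shift their mean response with dose; others do not.
        mu_susc = mu0 + slope * dose
        return p_susceptible * norm.cdf(cutoff, mu_susc, sigma) + \
               (1 - p_susceptible) * norm.cdf(cutoff, mu0, sigma)

    cutoff = 100.0 - 2.33 * 10.0          # roughly 1% of controls fall below this
    background = risk(0.0, cutoff)
    for dose in (1.0, 5.0, 10.0):
        print(f"dose {dose:>4}: additional risk = {risk(dose, cutoff) - background:.4f}")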


A limited sampling strategy for tacrolimus in renal transplant patients

BRITISH JOURNAL OF CLINICAL PHARMACOLOGY, Issue 4 2008
Binu S. Mathew
WHAT IS ALREADY KNOWN ABOUT THIS SUBJECT
• Tacrolimus trough concentration is currently used for dose individualization.
• Limited sampling strategies (LSS) have been developed and validated for renal transplant patients.
• Earlier literature has suggested that measurement of tacrolimus AUC is more reliable than trough concentration with respect to both rejection and nephrotoxicity.
WHAT THIS STUDY ADDS
• Four thousand renal transplants take place annually in India, with many patients prescribed tacrolimus in combination with mycophenolate and steroid.
• In this study an LSS with two points, i.e. trough and 1.5 h postdose, was developed and validated to estimate AUC0–12.
• The added benefit of only a single additional sample, with completion of blood collection in 1.5 h and minimum additional cost, makes this a viable LSS algorithm in renal transplant patients.
• In patients having tacrolimus trough concentrations outside the recommended range (<3 and >10 ng ml−1 in the treatment protocol in our institution) or having side-effects in spite of trough concentrations in the desired range, we can estimate AUC using this LSS for a better prediction of exposure.
AIMS To develop and validate limited sampling strategy (LSS) equations to estimate area under the curve (AUC0–12) in renal transplant patients. METHODS Twenty-nine renal transplant patients (3–6 months post transplant) who were at steady state with respect to tacrolimus kinetics were included in this study. The blood samples, starting with the predose (trough) sample and collected at fixed time points for 12 h, were analysed by microparticle enzyme immunoassay. Linear regression analysis estimated the correlations of tacrolimus concentrations at different sampling time points with the total measured AUC0–12. By applying multiple stepwise linear regression analysis, LSS equations with acceptable correlation coefficients (R2), bias and precision were identified. The predictive performance of these models was validated by the jackknife technique. RESULTS Three models were identified, all with R2 ≥ 0.907. Two-point models included one with trough (C0) and 1.5 h postdose (C1.5) concentrations, and another with trough and 4 h postdose concentrations. Increasing the number of sampling time points to more than two increased R2 only marginally (0.951 to 0.990). After jackknife validation, the two sampling time point (trough and 1.5 h postdose) model accurately predicted AUC0–12: regression coefficient R2 = 0.951, intraclass correlation = 0.976, bias [95% confidence interval (CI)] 0.53% (−2.63, 3.69) and precision (95% CI) 6.35% (4.36, 8.35). CONCLUSION The two-point LSS equation [AUC0–12 = 19.16 + (6.75 × C0) + (3.33 × C1.5)] can be used as a predictable and accurate measure of AUC0–12 in stable renal transplant patients prescribed prednisolone and mycophenolate. [source]
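Applying the reported two-point equation is straightforward; a small sketch (coefficients as quoted in the abstract, patient concentrations hypothetical):

    # Sketch: estimating tacrolimus AUC(0-12 h) from the two-point limited
    # sampling equation reported in the abstract:
    #   AUC(0-12) = 19.16 + 6.75 * C0 + 3.33 * C1.5
    # C0 = trough concentration, C1.5 = concentration 1.5 h post-dose (ng/ml).
    def estimate_auc_0_12(c_trough_ng_ml, c_1p5h_ng_ml):
        return 19.16 + 6.75 * c_trough_ng_ml + 3.33 * c_1p5h_ng_ml

    # Hypothetical patient: trough 6.2 ng/ml, 1.5 h post-dose 18.4 ng/ml
    print(f"estimated AUC(0-12) ~ {estimate_auc_0_12(6.2, 18.4):.1f} ng·h/ml")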


Comparison of apoptosis and mortality measurements in peripheral blood mononuclear cells (PBMCs) using multiple methods

CELL PROLIFERATION, Issue 5 2005
S. Glisic-Milosavljevic
Death through apoptosis is the main process by which aged cells that have lost their function are eliminated. Apoptotic cells are usually detected microscopically by changes in their morphology. However, determination of early apoptotic events is important for in vitro (and ex vivo) studies. The main objective of the present study is to find the most sensitive method for apoptosis detection in human peripheral blood mononuclear cells (PBMCs) by comparing six different methods following five different means of immunological stimulation at 3 and 5 days. Each of six apoptosis quantification methods, except the trypan blue exclusion test, is a combination of two stains, one for the specific detection of apoptotic cells and the other for the unspecific detection of dead cells. Values for apoptosis and mortality were compared with a reference method. The choice of apoptosis detection method is more important following 3 days of stimulation than after 5 days of stimulation (P = 2 × 10−6 versus P = 1 × 10−2). In contrast, we find mortality measurements following the different means of stimulation highly significant at both 3 and 5 days (F2,28 = 7.9, P = 1.4 × 10−6 at 3 days and F2,28 = 8.5, P = 4.5 × 10−7 at 5 days). Variation as a result of the combination of specific PBMC stimulation and the method used to detect apoptosis is reduced considerably with time (F1,58 = 3.7, P = 3 × 10−7 at 3 days to F1,58 = 0.97, P = 0.5 at 5 days). Based on Tukey's test, YO-PRO-1 is the most sensitive stain for apoptosis and, when combined with 7-AAD, provides an accurate measure of apoptosis and mortality. In conclusion, we propose YO-PRO-1/7-AAD as a new combination and low-cost alternative for the sensitive detection of early apoptosis. [source]


Current Technique of Fluid Status Assessment

CONGESTIVE HEART FAILURE, Issue 2010
W. Frank Peacock MD, FACEP
Congest Heart Fail. 2010;16(4)(suppl 1):S45–S51. ©2010 Wiley Periodicals, Inc. Early in the management of acute illness, it is critically important that volume status is accurately estimated. If inappropriate therapy is given because of errors in volume assessment, acute mortality rates are increased. Unfortunately, as the gold standard of radioisotopic volume measurement is costly and time-consuming, in the acute care environment clinicians are forced to rely on less accurate measures. In this manuscript, the authors review the currently available techniques of volume assessment for patients presenting with acute illness. In addition to discussing the accuracy of the history, physical examination, and radiography, acoustic cardiography and bedside ultrasonography are presented. [source]


Phenotypic variation and FMRP levels in fragile X

DEVELOPMENTAL DISABILITIES RESEARCH REVIEW, Issue 1 2004
Danuta Z. Loesch
Abstract Data on the relationships between cognitive and physical phenotypes, and a deficit of fragile X mental retardation 1 (FMR1) gene-specific protein product, FMRP, are presented and discussed in context with earlier findings. The previously unpublished results obtained, using standard procedures of regression and correlations, showed highly significant associations in males between FMRP levels and the Wechsler summary and subtest scores and in females between these levels and the full-scale intelligence quotient (FSIQ), verbal and performance IQ, and some Wechsler subtest scores. The published results based on data from 144 extended families with fragile X, recruited from Australia and the United States within a collaborative NIH-supported project, were obtained using robust modification of maximum likelihood in pedigrees. The results indicated that processing speed, short-term memory, and the ability to control attention, especially in the context of regulating goal-directed behavior, may be primarily affected by the FMRP depletion. The effect of this depletion on physical phenotype was also demonstrated, especially on body and head height and extensibility of finger joints. It is recommended that further studies should rely on more accurate measures of FMRP levels, and use of larger samples, to overcome extensive variability in the data. MRDD Research Reviews 2004;10:31–41. © 2004 Wiley-Liss, Inc. [source]


QTL Analysis of Trabecular Bone in BXD F2 and RI Mice

JOURNAL OF BONE AND MINERAL RESEARCH, Issue 8 2006
Abbey L Bower
Abstract A sample of 693 mice was used to identify regions of the mouse genome associated with trabecular bone architecture as measured using µCT. QTLs for bone in the proximal tibial metaphysis were identified on several chromosomes, indicating regions containing genes that regulate properties of trabecular bone. Introduction: Age-related osteoporosis is a condition of major concern because of the morbidity and mortality associated with osteoporotic fractures in humans. Osteoporosis is characterized by reduced bone density, strength, and altered trabecular architecture, all of which are quantitative traits resulting from the actions of many genes working in concert with each other and the environment over the lifespan. µCT gives accurate measures of trabecular bone architecture providing phenotypic data related to bone volume and trabecular morphology. The primary objective of this research was to identify chromosomal regions called quantitative trait loci (QTLs) that contain genes influencing trabecular architecture as measured by µCT. Materials and Methods: The study used crosses between C57BL/6J (B6) and DBA/2J (D2) as progenitor strains of a second filial (F2) generation (n = 141 males and 148 females) and 23 BXD recombinant inbred (RI) strains (n ≈ 9 of each sex per strain). The proximal tibial metaphyses of the 200-day-old mice were analyzed by µCT to assess phenotypic traits characterizing trabecular bone, including bone volume fraction, trabecular connectivity, and quantitative measures of trabecular orientation and anisotropy. Heritabilities were calculated and QTLs were identified using composite interval mapping. Results: A number of phenotypes were found to be highly heritable. Heritability values for measured phenotypes using RI strains ranged from 0.15 for degree of anisotropy in females to 0.51 for connectivity density in females and total volume in males. Significant and confirmed QTLs, with LOD scores ≥4.3 in the F2 cohort and ≥1.5 in the corresponding RI cohort, were found on chromosomes 1 (43 cM), 5 (44 cM), 6 (20 cM), and 8 (49 cM). Other QTLs with LOD scores ranging from 2.8 to 6.9 in the F2 analyses were found on chromosomes 1, 5, 6, 8, 9, and 12. Some QTLs were identified using data sets comprised of both male and female quantitative traits, suggesting similar genetic action in both sexes, whereas others seemed to be associated exclusively with one sex or the other, suggesting the possibility of sex-dependent effects. Conclusions: Identification of the genes underlying these QTLs may lead to improvements in recognizing individuals most at risk for developing osteoporosis and in the design of new therapeutic interventions. [source]


THE TEN COMMANDMENTS FOR OPTIMIZING VALUE-AT-RISK AND DAILY CAPITAL CHARGES

JOURNAL OF ECONOMIC SURVEYS, Issue 5 2009
Michael McAleer
Abstract Credit risk is the most important type of risk in terms of monetary value. Another key risk measure is market risk, which is concerned with stocks and bonds, and related financial derivatives, as well as exchange rates and interest rates. This paper is concerned with market risk management and monitoring under the Basel II Accord, and presents Ten Commandments for optimizing value-at-risk (VaR) and daily capital charges, based on choosing wisely from (1) conditional, stochastic and realized volatility; (2) symmetry, asymmetry and leverage; (3) dynamic correlations and dynamic covariances; (4) single index and portfolio models; (5) parametric, semi-parametric and non-parametric models; (6) estimation, simulation and calibration of parameters; (7) assumptions, regularity conditions and statistical properties; (8) accuracy in calculating moments and forecasts; (9) optimizing threshold violations and economic benefits; and (10) optimizing private and public benefits of risk management. For practical purposes, it is found that the Basel II Accord would seem to encourage excessive risk taking at the expense of providing accurate measures and forecasts of risk and VaR. [source]


Monitoring bird migration with a fixed-beam radar and a thermal-imaging camera

JOURNAL OF FIELD ORNITHOLOGY, Issue 3 2006
Sidney A. Gauthreaux Jr
ABSTRACT Previous studies using thermal imaging cameras (TI) have used target size as an indicator of target altitude when radar was not available, but this approach may lead to errors if birds that differ greatly in size are actually flying at the same altitude. To overcome this potential difficulty and obtain more accurate measures of the flight altitudes and numbers of individual migrants, we have developed a technique that combines a vertically pointed stationary radar beam and a vertically pointed thermal imaging camera (VERTRAD/TI). The TI provides accurate counts of the birds passing through a fixed, circular sampling area in the TI display, and the radar provides accurate data on their flight altitudes. We analyzed samples of VERTRAD/TI video data collected during nocturnal fall migration in 2000 and 2003 and during the arrival of spring trans-Gulf migration during the daytime in 2003. We used a video peak store (VPS) to make time exposures of target tracks in the video record of the TI and developed criteria to distinguish birds, foraging bats, and insects based on characteristics of the tracks in the VPS images and the altitude of the targets. The TI worked equally well during daytime and nighttime observations and best when skies were clear, because thermal radiance from cloud heat often obscured targets. The VERTRAD/TI system, though costly, is a valuable tool for measuring accurate bird migration traffic rates (the number of birds crossing 1609.34 m [1 statute mile] of front per hour) for different altitudinal strata above 25 m. The technique can be used to estimate the potential risk of migrating birds colliding with man-made obstacles of various heights (e.g., communication and broadcast towers and wind turbines), a subject of increasing importance to conservation biologists. SYNOPSIS Previous studies in which radar was not used have relied on thermal imaging cameras (TI) and target size as an indicator for determining flight altitude. However, this method can lead to errors if birds flying at the same altitude vary in size. To overcome this difficulty and obtain more accurate measures of flight altitude and of the number of individuals, we developed a technique that combines a fixed-beam radar with a parabolic antenna and a thermal imaging camera (VERTRAD/TI). The TI provides an accurate count of the birds passing through a fixed circular sampling area, and the radar provides accurate data on flight altitude. Using a digital video recorder, we analyzed samples taken with the VERTRAD/TI combination during nocturnal fall migration in 2000 and 2003 and during diurnal spring migration across the Gulf of Mexico in 2003. We used the video camera to make time-lapse exposures of the TI record and developed criteria to distinguish among birds, bats, and insects using the tracks left in the video and the altitude of the targets. The TI worked efficiently both by day and by night, and best when the sky was clear (when interference from heat radiated by clouds is absent). The VERTRAD/TI system, although costly, is a valuable tool for accurately measuring migration traffic rates and the number of birds moving at different altitudes. The system is very useful for assessing the risk of collisions of migratory birds with man-made obstacles of different heights (e.g., communication towers or wind turbines), matters of great relevance and importance for bird conservation. [source]


Daily volatility forecasts: reassessing the performance of GARCH models

JOURNAL OF FORECASTING, Issue 6 2004
David G. McMillan
Abstract Volatility plays a key role in asset and portfolio management and derivatives pricing. As such, accurate measures and good forecasts of volatility are crucial for the implementation and evaluation of asset and derivative pricing models in addition to trading and hedging strategies. However, whilst GARCH models are able to capture the observed clustering effect in asset price volatility in-sample, they appear to provide relatively poor out-of-sample forecasts. Recent research has suggested that this relative failure of GARCH models arises not from a failure of the model but a failure to specify correctly the 'true volatility' measure against which forecasting performance is measured. It is argued that the standard approach of using ex post daily squared returns as the measure of 'true volatility' includes a large noisy component. An alternative measure for 'true volatility' has therefore been suggested, based upon the cumulative squared returns from intra-day data. This paper implements that technique and reports that, in a dataset of 17 daily exchange rate series, the GARCH model outperforms smoothing and moving average techniques which have been previously identified as providing superior volatility forecasts. Copyright © 2004 John Wiley & Sons, Ltd. [source]
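The contrast drawn here between the noisy daily-squared-return proxy and a realized-volatility proxy cumulated from intra-day squared returns can be illustrated on simulated data; the sketch below assumes constant intra-day volatility and is not the paper's exchange-rate dataset:

    # Sketch: comparing the ex post daily squared return with realized volatility
    # built from intra-day squared returns. Data are simulated for illustration.
    import numpy as np

    rng = np.random.default_rng(0)
    n_days, intervals_per_day = 250, 48
    true_daily_vol = 0.01

    # Simulate intra-day returns with constant volatility for simplicity.
    intraday = rng.normal(0.0, true_daily_vol / np.sqrt(intervals_per_day),
                          size=(n_days, intervals_per_day))
    daily_returns = intraday.sum(axis=1)

    squared_return_proxy = daily_returns ** 2          # noisy daily proxy
    realized_vol_proxy = (intraday ** 2).sum(axis=1)   # cumulative intra-day squares

    true_var = true_daily_vol ** 2
    print("mean abs error, squared-return proxy:", np.mean(np.abs(squared_return_proxy - true_var)))
    print("mean abs error, realized-vol proxy  :", np.mean(np.abs(realized_vol_proxy - true_var)))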


An outlier robust GARCH model and forecasting volatility of exchange rate returns

JOURNAL OF FORECASTING, Issue 5 2002
Beum-Jo Park
Abstract Since volatility is perceived as an explicit measure of risk, financial economists have long been concerned with accurate measures and forecasts of future volatility and, undoubtedly, the Generalized Autoregressive Conditional Heteroscedasticity (GARCH) model has been widely used for doing so. It appears, however, from some empirical studies that the GARCH model tends to provide poor volatility forecasts in the presence of additive outliers. To overcome the forecasting limitation, this paper proposes a robust GARCH model (RGARCH) using least absolute deviation estimation and introduces a valuable estimation method from a practical point of view. Extensive Monte Carlo experiments substantiate our conjectures. As the magnitude of the outliers increases, the one-step-ahead forecasting performance of the RGARCH model has a more significant improvement in two forecast evaluation criteria over both the standard GARCH and random walk models. Strong evidence in favour of the RGARCH model over other competitive models is based on empirical application. By using a sample of two daily exchange rate series, we find that the out-of-sample volatility forecasts of the RGARCH model are apparently superior to those of other competitive models. Copyright © 2002 John Wiley & Sons, Ltd. [source]
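The least-absolute-deviation idea can be sketched as follows: rather than maximizing a Gaussian quasi-likelihood, the GARCH(1,1) parameters are chosen to minimize absolute deviations between log squared returns and the log conditional variance, which down-weights additive outliers. This is a simplified stand-in for the paper's RGARCH estimator, run on simulated heavy-tailed data:

    # Sketch: a least-absolute-deviation flavour of GARCH(1,1) estimation.
    # The exact RGARCH specification in the paper is not reproduced here.
    import numpy as np
    from scipy.optimize import minimize

    def garch_variance(params, returns):
        omega, alpha, beta = params
        sigma2 = np.empty_like(returns)
        sigma2[0] = returns.var()
        for t in range(1, len(returns)):
            sigma2[t] = omega + alpha * returns[t - 1] ** 2 + beta * sigma2[t - 1]
        return sigma2

    def lad_objective(params, returns):
        # Absolute deviations on the log scale: a robust alternative to the
        # Gaussian likelihood when additive outliers are present.
        if np.any(np.asarray(params) <= 0) or params[1] + params[2] >= 1:
            return 1e10
        sigma2 = garch_variance(params, returns)
        return np.mean(np.abs(np.log(returns ** 2 + 1e-12) - np.log(sigma2)))

    rng = np.random.default_rng(1)
    r = rng.standard_t(df=5, size=1000) * 0.01   # heavy-tailed stand-in for FX returns
    fit = minimize(lad_objective, x0=[1e-5, 0.05, 0.9], args=(r,), method="Nelder-Mead")
    print("LAD-style GARCH(1,1) estimates (omega, alpha, beta):", fit.x)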


Measurements of area and the (island) species–area relationship: new directions for an old pattern

OIKOS, Issue 10 2008
Kostas A. Triantis
The species–area relationship is one of the strongest empirical generalizations in geographical ecology, yet controversy persists about some important questions concerning its causality and application. Here, using more accurate measures of island surface size for five different island systems, we show that increasing the accuracy of the estimation of area has negligible impact on the fit and form of the species–area relationship, even though our analyses included some of the most topographically diverse island groups in the world. In addition, we show that the inclusion of general measurements of environmental heterogeneity (in the form of the so-called choros model), can substantially improve the descriptive power of models of island species number. We suggest that quantification of other variables, apart from area, that are also critical for the establishment of biodiversity and at the same time have high explanatory power (such as island age, distance, productivity, energy, and environmental heterogeneity), is necessary if we are to build up a more predictive science of species richness variation across island systems. [source]
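For readers unfamiliar with the models compared here, the classical species–area relationship S = cA^z is usually fitted by log-log regression, and the choros model replaces area with the product of area and a count of habitat types (K = AH). A minimal sketch with invented island data, not the five systems analysed in the paper:

    # Sketch: fitting the power-law species-area relationship S = c * A^z by
    # log-log regression, and a "choros"-style variant using K = A * H.
    # Island areas, habitat counts, and species counts below are invented.
    import numpy as np

    area = np.array([1.0, 10.0, 50.0, 200.0, 1000.0])   # km^2, hypothetical
    habitats = np.array([2, 4, 5, 8, 12])                # habitat types, hypothetical
    species = np.array([5, 14, 22, 38, 70])              # species counts, hypothetical

    def loglog_fit(x, s):
        z, log_c = np.polyfit(np.log(x), np.log(s), 1)
        r2 = np.corrcoef(np.log(x), np.log(s))[0, 1] ** 2
        return z, np.exp(log_c), r2

    for name, predictor in [("area only", area), ("choros (area * habitats)", area * habitats)]:
        z, c, r2 = loglog_fit(predictor, species)
        print(f"{name:>25}: z = {z:.2f}, c = {c:.2f}, R^2 = {r2:.3f}")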


How well do patients report noncompliance with antihypertensive medications?: a comparison of self-report versus filled prescriptions

PHARMACOEPIDEMIOLOGY AND DRUG SAFETY, Issue 1 2004
Philip S. Wang MD
Abstract Purpose To address poor patient compliance with antihypertensives, clinicians and researchers need accurate measures of adherence with prescribed regimens. Although self-reports are often the only means available in routine practice, their accuracy and agreement with other data sources remain questionable. Methods A telephone survey was conducted on 200 hypertensive patients treated with a single antihypertensive agent in a large health maintenance organization (HMO) or a Veterans Affairs medical center (VAMC) to obtain self-reports of the frequency of missing antihypertensive therapy. We then analyzed records of all filled prescriptions to calculate the number of days that patients actually had antihypertensive medications available for use. Agreement between the two data sources was measured with correlation coefficients and kappa statistics. Logistic regression models were used to identify demographic, clinical and psychosocial correlates of overstating compliance. Results There was very poor agreement between self-reported compliance and days actually covered by filled prescriptions (Spearman correlation coefficient,=,0.15; 95%CI: 0.01, 0.28). Very poor agreement was also observed between a categorical measure of self-reported compliance (ever vs. never missing a dose) and categories of actual compliance defined by filled prescriptions (<,80% vs >,80% of days covered; kappa,=,0.12, 95%CI: ,0.02, 0.26). Surprisingly, few factors were associated with inaccurate self-reporting in either crude or adjusted analyses; fewer visits to health care providers was significantly associated with overstating compliance. Conclusions Compliance was markedly overstated in this sample of patients and few characteristics identified those who reported more versus less accurately. Clinicians and researchers who rely on self-reports should be aware of these limits and should take steps to enhance their accuracy. Copyright © 2003 John Wiley & Sons, Ltd. [source]