Expensive
Selected Abstracts

Fast simulation of skin sliding
COMPUTER ANIMATION AND VIRTUAL WORLDS (PREV: JNL OF VISUALISATION & COMPUTER ANIMATION), Issue 2-3 2009
Xiaosong Yang

Abstract: Skin sliding is the phenomenon of the skin moving over underlying layers of fat, muscle and bone. Due to the complex interconnections between these separate layers and their differing elasticity properties, it is difficult to model and expensive to compute. We present a novel method to simulate this phenomenon in real time by remeshing the surface based on a parameter-space resampling. To evaluate the surface parametrization, we borrow a technique from structural engineering known as the force density method (FDM), which solves for an energy-minimizing form with a sparse linear system. Our method creates a realistic approximation of skin sliding in real time, reducing texture distortions in the region of the deformation. In addition, it is flexible, simple to use, and can be incorporated into any animation pipeline. Copyright © 2009 John Wiley & Sons, Ltd. [source]

Approximating character biomechanics with real-time weighted inverse kinematics
COMPUTER ANIMATION AND VIRTUAL WORLDS (PREV: JNL OF VISUALISATION & COMPUTER ANIMATION), Issue 4-5 2007
Michael Meredith

Abstract: In this paper we show how expensive, offline dynamic simulations of character motion can be approximated using a cheaper weighted inverse kinematics (WIK)-based approach. We first show how a dynamics-based approach can be used to produce a motion that is representative of a real target actor, using the motion of a different source actor and the biomechanics of the target actor. This is compared against a process that uses WIK to achieve the same motion-mapping goal without direct biomechanical input. The parallels between the results of the two approaches are described and further reasoned from a mathematical perspective.
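The force density method named in the skin-sliding abstract above reduces form-finding to a single sparse linear solve. A minimal sketch under invented assumptions (a toy three-edge net, a dense numpy solve standing in for a sparse solver; none of this is the paper's implementation):

```python
import numpy as np

def force_density_solve(edges, q, fixed_xyz, free_idx, fixed_idx, loads):
    """Force density method: with branch-node matrix C and Q = diag(q),
    free-node positions solve (Cf^T Q Cf) Xf = P - Cf^T Q Cx Xx."""
    n = len(free_idx) + len(fixed_idx)
    C = np.zeros((len(edges), n))
    for k, (i, j) in enumerate(edges):
        C[k, i], C[k, j] = 1.0, -1.0
    Q = np.diag(q)
    Cf, Cx = C[:, free_idx], C[:, fixed_idx]
    D = Cf.T @ Q @ Cf                        # sparse and SPD in real use
    rhs = loads - Cf.T @ Q @ Cx @ fixed_xyz
    return np.linalg.solve(D, rhs)

# Toy net: one free node tied to three fixed anchors by unit force densities.
edges = [(0, 1), (0, 2), (0, 3)]
anchors = np.array([[0.0, 0.0, 1.0], [1.0, 0.0, 1.0], [0.5, 1.0, 1.0]])
free = force_density_solve(edges, [1.0, 1.0, 1.0], anchors,
                           free_idx=[0], fixed_idx=[1, 2, 3],
                           loads=np.zeros((1, 3)))
# With zero load, each free node sits at the q-weighted mean of its neighbours.
```

Because the system is linear, one factorization serves any load or anchor configuration, which is what makes the method cheap enough for real-time use.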
Thus we demonstrate how character biomechanics can be approximated with real-time WIK. Copyright © 2007 John Wiley & Sons, Ltd. [source]

Sparsely Precomputing The Light Transport Matrix for Real-Time Rendering
COMPUTER GRAPHICS FORUM, Issue 4 2010
Fu-Chung Huang

Abstract: Precomputation-based methods have enabled real-time rendering with natural illumination, all-frequency shadows, and global illumination. However, a major bottleneck is the precomputation time, which can take hours to days. While the final real-time data structures are typically heavily compressed with clustered principal component analysis and/or wavelets, a full light transport matrix still needs to be precomputed for a synthetic scene, often by exhaustive sampling and raytracing. This is expensive and makes rapid prototyping of new scenes prohibitive. In this paper, we show that the precomputation can be made much more efficient by adaptive and sparse sampling of light transport. We first select a small subset of "dense vertices", where we sample the angular dimensions more completely (but still adaptively). The remaining "sparse vertices" require only a few angular samples, isolating features of the light transport. They can then be interpolated from nearby dense vertices using locally low-rank approximations. We demonstrate sparse sampling and precomputation 5× faster than previous methods. [source]

Scalable real-time animation of rivers
COMPUTER GRAPHICS FORUM, Issue 2 2009
Qizhi Yu

Abstract: Many recent games and applications target the interactive exploration of realistic large-scale worlds. These worlds consist mostly of static terrain models, as the simulation of animated fluids in these virtual worlds is computationally expensive. Adding flowing fluids, such as rivers, to these virtual worlds would greatly enhance their realism, but causes specific issues: as the user is usually observing the world at close range, small-scale details such as waves and ripples are important.
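The "locally low rank" interpolation in the light-transport abstract above can be illustrated on a synthetic matrix: fully sampled "dense" rows plus a handful of observed entries of a "sparse" row suffice to recover that entire row. A sketch with invented data, not the paper's method or scenes:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic rank-2 "transport" matrix: rows = vertices, cols = angular samples.
basis = rng.normal(size=(2, 60))
T = rng.normal(size=(20, 2)) @ basis

dense_rows = [0, 1, 2]                       # fully sampled "dense vertices"
sparse_row = 7                               # a "sparse vertex"
observed = rng.choice(60, size=8, replace=False)  # its few angular samples

# Fit the sparse vertex as a combination of dense vertices using only the
# observed entries, then interpolate every other angular sample for free.
A = T[dense_rows][:, observed].T             # (8 samples) x (3 dense rows)
w, *_ = np.linalg.lstsq(A, T[sparse_row, observed], rcond=None)
reconstructed = w @ T[dense_rows]            # full row recovered from 8 samples
```

Because the rows live in a low-dimensional subspace spanned by the dense rows, the few observed entries pin down the combination weights exactly.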
However, the large scale of the world makes classical methods impractical for simulating these effects. In this paper, we present an algorithm for the interactive simulation of realistic flowing fluids in large virtual worlds. Our method relies on two key contributions: the local computation of the velocity field of a steady flow given boundary conditions, and the advection of small-scale details on a fluid, following the velocity field and uniformly sampled in screen space.
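A semantic-layers style rule of the kind the volume-rendering abstract below describes ("IF density is high AND gradient is high THEN opacity is high") can be evaluated with standard fuzzy machinery: triangular memberships, min for AND, and weighted defuzzification. The membership shapes and output levels here are invented for illustration:

```python
def tri(x, a, b, c):
    """Triangular fuzzy membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def evaluate_opacity(density, gradient):
    # Rule 1: IF density is high AND gradient is high THEN opacity is high (0.9)
    # Rule 2: IF density is low THEN opacity is low (0.1)
    r1 = min(tri(density, 0.5, 1.0, 1.5), tri(gradient, 0.5, 1.0, 1.5))
    r2 = tri(density, -0.5, 0.0, 0.5)
    total = r1 + r2
    if total == 0.0:
        return 0.0                           # no rule fires
    return (0.9 * r1 + 0.1 * r2) / total     # weighted defuzzification

opacity_edge = evaluate_opacity(1.0, 1.0)    # rule 1 dominates
opacity_air = evaluate_opacity(0.0, 0.3)     # rule 2 dominates
```

Each rule is a cheap per-sample arithmetic expression, which is why such rule bases map well onto per-frame evaluation on graphics hardware.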
[source]

High-Quality Adaptive Soft Shadow Mapping
COMPUTER GRAPHICS FORUM, Issue 3 2007
Gaël Guennebaud

Abstract: The recent soft shadow mapping technique [GBP06] allows the real-time rendering of convincing soft shadows on complex and dynamic scenes using a single shadow map. While attractive, this method suffers from shadow overestimation and becomes both expensive and approximate when dealing with large penumbrae. This paper proposes new solutions that remove these limitations and hence provide an efficient and practical technique for soft shadow generation. First, we propose a new visibility computation procedure based on the detection of occluder contours that is more accurate and faster while reducing aliasing. Secondly, we present a shadow map multi-resolution strategy that keeps the computation complexity almost independent of the light size while maintaining high-quality rendering. Finally, we propose a view-dependent adaptive strategy that automatically reduces the screen resolution in regions of large penumbrae, thus allowing us to keep very high frame rates in any situation. [source]

Semi-Automatic 3D Reconstruction of Urban Areas Using Epipolar Geometry and Template Matching
COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, Issue 7 2006
José Miguel Sales Dias

Abstract: The main challenge is to compute the relevant information (the building's height and volume, the roof's description, and texture) algorithmically, because it is very time-consuming and thus expensive to produce manually for large urban areas. The algorithm requires some initial calibration input and is able to compute the above-mentioned building characteristics from the stereo pair, given the availability of the 2D CAD and the digital elevation model of the same area, with no knowledge of the camera pose or its intrinsic parameters.
To achieve this, we have used epipolar geometry, homography computation, and automatic feature extraction, and we have solved the feature correspondence problem in the stereo pair by using template matching. [source]

Welding Automation in Space-Frame Bridge Construction
COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, Issue 3 2001
Alistair Greig

Abstract: The SPACES system has been proposed as an alternative for long-span bridge construction. Tubular space frames offer a structurally more efficient solution for bridges, but they have been considered too expensive because the joints at the nodal intersections of the tubular members are difficult and expensive to weld. The benefits of the SPACES system can only be realized by using a computer-integrated construction system to drive down the fabrication costs. A key component of the computer-integrated construction is the robotic welding system. This article describes the development of a lightweight automated welding system for the joining of tubular members. It addresses the geometry of intersecting cylinders and the kinematics and design of a 5-degree-of-freedom manipulator; summary solutions are given for both. The control software is described briefly, and mention of the welding tests and overall business process is also made. A consortium of U.K. industry and universities is conducting the work. [source]

Adaptable cache service and application to grid caching
CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 9 2010
Laurent d'Orazio

Abstract: Caching is an important element for tackling performance issues in largely distributed data management. However, caches are efficient only if they are well configured for the context of use. As a consequence, they are usually built from scratch. Such an approach is expensive and time-consuming in grids, where the varied characteristics lead to many heterogeneous cache requirements.
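Template matching of the kind used for stereo correspondence in the 3D-reconstruction abstract above is commonly implemented as normalized cross-correlation (NCC). A brute-force sketch on toy arrays (the generic technique, not the authors' pipeline):

```python
import numpy as np

def ncc_match(image, template):
    """Return (row, col) of the best normalized cross-correlation match."""
    th, tw = template.shape
    t = template - template.mean()
    tnorm = np.sqrt((t * t).sum())
    best, best_pos = -2.0, (0, 0)
    for r in range(image.shape[0] - th + 1):
        for c in range(image.shape[1] - tw + 1):
            w = image[r:r + th, c:c + tw] - image[r:r + th, c:c + tw].mean()
            denom = np.sqrt((w * w).sum()) * tnorm
            if denom == 0.0:
                continue                  # flat window: correlation undefined
            score = (w * t).sum() / denom
            if score > best:
                best, best_pos = score, (r, c)
    return best_pos

img = np.zeros((10, 10))
img[4:6, 5:7] = [[1.0, 2.0], [3.0, 4.0]]  # plant the pattern at (4, 5)
tpl = np.array([[1.0, 2.0], [3.0, 4.0]])
match = ncc_match(img, tpl)
```

In a calibrated stereo pair the epipolar geometry restricts this search to a single line per feature, which is what keeps the correspondence problem tractable.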
This paper proposes a framework facilitating the construction of sophisticated and dynamically adaptable caches for heterogeneous applications. The framework has enabled the evaluation of several configurations for distributed data querying systems, and leads us to propose innovative approaches to semantic and cooperative caching. This paper also reports results obtained in bioinformatics data management on grids, showing the relevance of our proposals. Copyright © 2009 John Wiley & Sons, Ltd. [source]

Dynamic file system semantics to enable metadata optimizations in PVFS
CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 14 2009
Michael Kuhn

Abstract: Modern file systems maintain extensive metadata about stored files. While metadata is typically useful, there are situations when the additional overhead of such a design becomes a performance problem. This is especially true for parallel and cluster file systems, whose architecture makes every metadata operation even more expensive. In this paper, several changes made to the parallel cluster file system Parallel Virtual File System (PVFS) are presented. The changes target the optimization of workloads with large numbers of small files. To improve metadata performance, PVFS was modified so that unnecessary metadata is no longer managed. Several tests with a large quantity of files were performed to measure the benefits of these changes. The tests have shown that common file system operations can be sped up by a factor of two, even with relatively few changes. Copyright © 2009 John Wiley & Sons, Ltd. [source]

Childhood cancer: mainly curable, so where next?
ACTA PAEDIATRICA, Issue 4 2000
AW Craft

Abstract: More than 70% of childhood cancer is now curable with best modern therapy. The treatment is expensive, but in terms of cost per life year saved (USD 1750) it compares very favourably with other major health interventions. The rate of improvement in survival is slowing down.
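The adaptable-cache idea above (cache behaviour as a configurable component rather than a fixed design) can be sketched as a cache whose replacement policy is selected at construction time. The class and policy names are invented for illustration and are not the framework's API:

```python
from collections import OrderedDict

class PluggableCache:
    """Toy cache with a swappable eviction policy ("lru" or "fifo")."""

    def __init__(self, capacity, policy="lru"):
        self.capacity = capacity
        self.policy = policy
        self._store = OrderedDict()

    def get(self, key):
        if key not in self._store:
            return None
        if self.policy == "lru":          # refresh recency only under LRU
            self._store.move_to_end(key)
        return self._store[key]

    def put(self, key, value):
        if key in self._store:
            self._store.move_to_end(key)
        self._store[key] = value
        while len(self._store) > self.capacity:
            self._store.popitem(last=False)   # evict the oldest entry

cache = PluggableCache(2, policy="lru")
cache.put("a", 1); cache.put("b", 2)
cache.get("a")                 # touch "a" so "b" becomes the eviction victim
cache.put("c", 3)              # evicts "b" under LRU
```

Making the policy a parameter, rather than hard-coding it, is what lets one cache implementation serve heterogeneous workloads.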
New, "designer" treatments are needed and, better still, prevention. The causes of childhood cancer are beginning to emerge. The origin for many is probably in utero, and may be initiated by dietary and other environmental exposures, perhaps in susceptible individuals. However, one of the great challenges for the future must be to extend the benefits of modern treatment to the 80% of the world's children who currently have little or no access to it in economically disadvantaged and emerging nations. The International Paediatric Oncology Society (SIOP) is leading the way in bringing hope for children with cancer worldwide. In India, with the support of the WHO, there is a "train the trainers" programme. In Africa, pilot studies of cost-effective treatments for Burkitt's lymphoma are producing gratifying results in Malawi, and there are several examples of twinning programmes between major centres in developed and less well-developed countries. Conclusions: The future for children with cancer is bright. Most are curable, and prevention may be just over the horizon. [source]

The J2EE ECperf benchmark results: transient trophies or technology treasures?
CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 10 2004
Paul Brebner

Abstract: ECperf, the widely recognized industry-standard J2EE benchmark, has attracted a large number of results submissions and subsequent publications. However, ECperf places little restriction on the hardware platforms, operating systems and databases utilized in the benchmarking process. This, combined with the existence of only two primary metrics, makes it difficult to accurately compare the performance of the Application Server products themselves. By mining the full-disclosure archives for trends and correlations, we have discovered that J2EE technology is very scalable, both in a scale-up and a scale-out manner.
Other observed trends include a linear correlation between middle-tier total processing power and throughput, as well as between J2EE Application Server license costs and throughput. However, the results clearly indicate that there is an increasing cost per user with increasing system capacity, and that scale-up is proportionately more expensive than scale-out. Finally, the correlation between middle-tier processing power and throughput, combined with results obtained from a different 'lighter-weight' benchmark, facilitates an estimate of throughput for different types of J2EE applications. Copyright © 2004 John Wiley & Sons, Ltd. [source]

Right Ventricular Function in Congenital Heart Defects Assessed by Regional Wall Motion
CONGENITAL HEART DISEASE, Issue 3 2010
Michael R. Nihill MB, FSCAI

Objectives: To develop a simple method to assess right ventricular function by angiography.
Background: Conventional methods of evaluating right ventricular function are inaccurate, cumbersome, and expensive.
Methods: We analyzed biplane right ventricular angiograms taken in the posterior-anterior and lateral projections, using software to measure right ventricular volumes and regional wall motion in 78 patients with normal hearts (n = 29), atrial septal defects (ASD, n = 13), pulmonary valve stenosis (PVS, n = 21), and postoperative atrial switch patients (n = 15). We also measured the shortening fraction (SF) from the midtricuspid annulus to the septum and correlated various angiographic measurements with the right ventricular (RV) ejection fraction.
Results: The volume-overloaded patients (ASD) had larger end diastolic volumes and increased SF compared with normal patients, while the pressure-loaded patients (PVS) had normal volumes and SF. The postoperative atrial switch patients had decreased systolic function and increased end diastolic volume. The SF for all of the patients correlated with the ejection fraction (r = 0.785, P < .0001).
Conclusions: A simple measurement of the end diastolic and end systolic distance from the midtricuspid annulus to the septum (SF) provides a good index of RV function by angiography and correlates well with RV ejection fraction. [source]

Cardiac basal metabolism: energetic cost of calcium withdrawal in the adult rat heart
ACTA PHYSIOLOGICA, Issue 3 2010
P. Bonazzola

Aim: To evaluate cardiac basal metabolism upon extracellular calcium removal and its relationship with intracellular sodium and calcium homeostasis.
Methods: A mechano-calorimetric technique was used that allowed the simultaneous and continuous measurement of both heat rate and resting pressure in arterially perfused, quiescent adult rat hearts. Using pharmacological tools, the possible underlying mechanisms related to sodium and calcium movements were investigated.
Results: Resting heat rate (expressed in mW g^-1 dry wt) increased upon calcium withdrawal (+4.4 ± 0.2). This response was: (1) unaffected by the presence of tetrodotoxin (+4.3 ± 0.6); (2) fully blocked both by the decrease in extracellular sodium concentration and by the increase in extracellular magnesium concentration; (3) partially blocked by the presence of either nifedipine (+2.8 ± 0.4), KB-R7943 (KBR; +2.5 ± 0.2), clonazepam (CLO; +3.1 ± 0.3) or EGTA (+1.9 ± 0.3). The steady heat rate under Ca2+-free conditions was partially reduced by the addition of Ru360 (-1.1 ± 0.2), but not by CLO in the presence of EGTA, KBR or Ru360.
Conclusion: Energy expenditure for maintenance of the resting state upon calcium withdrawal depends on the intracellular rise in both sodium and calcium. Our data are consistent with a mitochondrial Ca2+ cycling that is not detectable at normal diastolic calcium levels. The experimental condition analysed here partially simulates findings reported in certain pathological situations, including heart failure, in which mildly increased levels of both diastolic sodium and calcium have also been found.
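The shortening fraction used in the right-ventricular abstract above is a simple ratio of end diastolic to end systolic distances; a sketch with invented measurements:

```python
def shortening_fraction(edd_mm, esd_mm):
    """SF = (end diastolic - end systolic distance) / end diastolic distance."""
    return (edd_mm - esd_mm) / edd_mm

# Hypothetical distances from the midtricuspid annulus to the septum, in mm.
sf = shortening_fraction(edd_mm=40.0, esd_mm=28.0)   # 0.30, i.e. 30% shortening
```

The appeal of such an index is exactly this arithmetic simplicity: two distances on a biplane angiogram replace a full volumetric reconstruction.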
Therefore, under such pathological conditions, hearts must divert chemical energy to fuel processes associated with sodium and calcium handling, making the maintenance of their functions more expensive. [source]

Preventing the Spread of Invasive Species: Economic Benefits of Intervention Guided by Ecological Predictions
CONSERVATION BIOLOGY, Issue 1 2008
REUBEN P. KELLER

Keywords: bioeconomía; economía de las invasiones; invasiones biológicas; Orconectes rusticus; predicciones ecológicas

Abstract: Preventing the invasion of freshwater aquatic species is the surest way to reduce their impacts, but it is also often expensive. Hence, the most efficient prevention programs will rely on accurate predictions of the sites most at risk of becoming invaded and concentrate resources at those sites. Using data from Vilas County, Wisconsin (U.S.A.), collected in the 1970s, we constructed a predictive occurrence model for rusty crayfish (Orconectes rusticus) and applied it to an independent data set of 48 Vilas County lakes to predict which of these were most likely to become invaded between 1975 and 2005. We nested this invasion model within an economic framework to determine whether targeted management, derived from our quantitative predictions of likely invasion sites, would increase the economic value of lakes in the independent data set. Although the optimum expenditure on lake protection was high, protecting lakes at this level would have produced net economic benefits of at least $6 million over the last 30 years. We did not attempt to determine the value of nonmarket benefits of protection; thus, our results are likely to underestimate the total benefits of preventing invasions. Our results demonstrate that although few data are available early in an invasion, these data may be sufficient to support targeted, effective, and economically rational management.
In addition, our results show that ecological predictions are becoming sufficiently accurate that their application in management can produce net economic benefits.

Resumen: La prevención de la invasión de especies dulceacuícolas es la manera más segura de reducir sus impactos, pero a menudo es costosa. Por lo tanto, los programas de prevención más eficientes dependerán de predicciones precisas de los sitios con mayor riesgo de ser invadidos y concentrarán recursos en esos sitios. Utilizando datos recolectados en los 70s en el Condado Vilas, Wisconsin (E.U.A.), desarrollamos un modelo predictivo de la ocurrencia de Orconectes rusticus y lo aplicamos en un conjunto de datos independientes de 48 lagos en el Condado de Vilas para predecir cuales fueron más susceptibles de ser invadidos entre 1975 y 2005. Anidamos este modelo de invasión en un marco económico para determinar si los objetivos de manejo, derivados de nuestras predicciones cuantitativas de sitios susceptibles a la invasión, incrementarían el valor económico de los lagos del conjunto independiente de datos. Aunque el gasto óptimo para la protección de lagos fue alto, la protección de lagos a este nivel podría haber producido beneficios económicos por un mínimo de $6 millones en los últimos 30 años. No intentamos determinar el valor de los beneficios no comerciables de la protección; por lo tanto, es probable que nuestros resultados subestimen los beneficios totales de la prevención de invasiones. Nuestros resultados demuestran que, aunque se disponga de pocos datos del inicio de una invasión, esos datos pueden ser suficientes para fundamentar acciones de manejo efectivas y económicamente racionales. Adicionalmente, nuestros resultados muestran que las predicciones ecológicas se están volviendo tan precisas que su aplicación en el manejo puede producir beneficios económicos netos.
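A predictive occurrence model of the kind the invasion study above describes is typically a logistic regression whose output ranks sites by invasion risk, so that prevention spending can be targeted. The predictors and coefficients below are invented for illustration and are not the fitted rusty crayfish model:

```python
import math

def invasion_probability(road_access, lake_area_ha, ph,
                         coef=(-4.0, 1.5, 0.002, 0.4)):
    """Logistic occurrence model: p = 1 / (1 + exp(-(b0 + b.x)))."""
    b0, b1, b2, b3 = coef
    z = b0 + b1 * road_access + b2 * lake_area_ha + b3 * ph
    return 1.0 / (1.0 + math.exp(-z))

# Rank hypothetical candidate lakes by predicted risk to target prevention.
lakes = {"A": (1, 500, 7.5), "B": (0, 50, 6.0)}
ranked = sorted(lakes, key=lambda k: invasion_probability(*lakes[k]),
                reverse=True)
```

Concentrating protection on the top-ranked sites is what turns the ecological prediction into the economic benefit the abstract quantifies.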
[source]

Influence of Temporal Scale of Sampling on Detection of Relationships between Invasive Plants and the Diversity Patterns of Plants and Butterflies
CONSERVATION BIOLOGY, Issue 6 2004
RALPH MAC NALLY

Abstract: Monitoring is an important component of restoration and adaptive-management efforts. But monitoring is often neglected because it can be expensive and time-consuming. Accordingly, it is valuable to determine whether the temporal extent of sampling alters the validity of inferences about the response of diversity measures to environmental variables affected by restoration actions. Non-native species alter ecosystems in undesirable ways, frequently homogenizing flora and fauna and extirpating local populations of native species. In the Mojave Desert, invasion of salt-cedar (Tamarix ramosissima Ledeb.) and human efforts to eradicate salt-cedar have altered vegetation structure, vegetation composition, and some measures of faunal diversity. We examined whether similar inferences about relationships between plants and butterflies in the Muddy River drainage (Nevada, U.S.A.) could have been obtained by sampling less intensively (fewer visits per site over the same period of time) or less extensively (equal frequency of visits but over a more limited period of time). We also tested whether the rank order of butterfly species with respect to occurrence rate (proportion of sites occupied) would be reflected accurately in temporal subsamples. Temporal subsampling did not lead to erroneous inferences about the relative importance of six vegetation-based predictor variables on the species richness of butterflies. Regardless of the temporal scale of sampling, the species composition of butterflies was more similar in sites with similar species composition of plants. The rank order of occurrence of butterfly species in the temporal subsamples was highly correlated with the rank order of species occurrence in the full data set.
Thus, similar inferences about associations between vegetation and butterflies, and about relative occurrence rates of individual species of butterflies, could be obtained by less intensive or less extensive temporal sampling. If compromises between temporal intensity and extent of sampling must be made, our results suggest that maximizing temporal extent will better capture variation in biotic interactions and species occurrence.

Resumen: El monitoreo es un componente importante de los esfuerzos de restauración y de manejo adoptivo. Pero el monitoreo a menudo es desatendido porque puede ser costoso y consume tiempo. En consecuencia, es valioso determinar si la extensión temporal del muestreo altera la validez de inferencias sobre la respuesta de medidas de diversidad a variables ambientales afectadas por acciones de restauración. Las especies no nativas alteran a los ecosistemas de manera indeseable, frecuentemente homogenizan la flora y fauna y extirpan poblaciones locales de especies nativas. En el Desierto Mojave, la invasión de Tamarix ramosissima Ledeb. y los esfuerzos humanos para erradicarla han alterado la estructura y composición de la vegetación y algunas medidas de diversidad de fauna. Examinamos si se podían obtener inferencias similares sobre las relaciones entre plantas y mariposas en la cuenca Muddy River (Nevada, E.U.A.) muestreando menos intensivamente (menos visitas por sitio en el mismo período de tiempo) o menos extensivamente (igual frecuencia de visitas pero sobre un período de tiempo más limitado). También probamos si el orden jerárquico de especies de mariposas con respecto a la tasa de ocurrencia (proporción de sitios ocupados) se reflejaba con precisión en las submuestras temporales. El submuestreo temporal no condujo a inferencias erróneas acerca de la importancia relativa de seis variables predictivas basadas en vegetación sobre la riqueza de especies de mariposas.
A pesar de la escala temporal del muestreo, la composición de especies de mariposas fue más similar en sitios con composición de especies de plantas similar. El orden jerárquico de ocurrencia de especies de mariposas en las muestras subtemporales estuvo muy correlacionado con el orden jerárquico de ocurrencia de especies en todo el conjunto de datos. Por lo tanto, se pudieron obtener inferencias similares de las asociaciones entre vegetación y mariposas y de las tasas de ocurrencia relativa de especies individuales de mariposas con muestreo temporal menos intensivo o extensivo. Si se deben hacer compromisos entre la intensidad y extensión de muestreo temporal, nuestros resultados sugieren que la maximización de la extensión temporal capturará la variación en interacciones bióticas y ocurrencia de especies más adecuadamente. [source]

Optimal conservation planning for migratory animals: integrating demographic information across seasons
CONSERVATION LETTERS, Issue 3 2010
Justin Sheehy

Abstract: Conservation strategies for migratory animals are typically based on ad hoc or simple ranking methods and focus on a single period of the annual cycle. We use a density-dependent population model to examine one-time land-purchase strategies for a migratory population with breeding and wintering grounds. Under equal rates of habitat loss, we show that it is optimal to invest more, but never solely, in the habitat with the higher ratio of density dependence to habitat cost. When there are two habitats that vary in quality within a season, the best strategy is to invest in only one habitat. Whether to purchase high- or low-quality habitat depends on the general life history of the species and the ratio of habitat quality to habitat cost. When carry-over effects are incorporated, it is almost always optimal to invest in high-quality habitat during the season that produces the carry-over effect.
We apply this model to a threatened warbler population and show that the optimal strategy is to purchase more breeding than wintering habitat, despite the fact that breeding habitat is over ten times more expensive. Our model provides a framework for developing year-round conservation strategies for migratory animals and has important implications for long-term planning and management. [source]

STRATEGIC BEHAVIORS TOWARD ENVIRONMENTAL REGULATION: A CASE OF TRUCKING INDUSTRY
CONTEMPORARY ECONOMIC POLICY, Issue 1 2007
TERENCE LAM

Abstract: We used the trucking industry's response to the U.S. Environmental Protection Agency's acceleration of the 2004 diesel emissions standards as a case study to examine the importance of accounting for regulatees' strategic behaviors in the drafting of environmental regulations. Our analysis of time-series data on aggregate U.S. and Canada heavy-duty truck production from 1992 through 2003 found that heavy-duty truck production increased by 20%-23% in the 6 months prior to the date of compliance. The increases might be due to truck operators pre-buying trucks with less expensive but noncompliant engines and behaving strategically in anticipation of other uncertainties. (JEL L51, Q25) [source]

THE CRIME-CONTROL EFFECT OF INCARCERATION: DOES SCALE MATTER?
CRIMINOLOGY AND PUBLIC POLICY, Issue 2 2006
RAYMOND V. LIEDKA

Research Summary: Several prominent empirical studies estimate models of a constant proportional effect of prison on crime, finding that the effect is substantial and negative. A separate literature argues against the crime-reducing effect of prison, but mainly on theoretical grounds. This second literature suggests that the elasticity of the prison/crime relationship is not constant. We provide a model that nests these two literatures. Using data from the United States over 30 years, we find strong evidence that the negative relationship between prison and crime becomes less strongly negative as the scale of imprisonment increases.
This revisionist model indicates that (1) at low levels of incarceration, a constant elasticity model underestimates the negative relationship between incarceration and crime, and (2) at higher levels of incarceration, the constant elasticity model overstates the negative effect.
Policy Implications: These results go beyond the claim of declining marginal returns, instead finding accelerating declining marginal returns. As the prison population continues to increase, albeit at a slower rate, after three decades of phenomenal growth, these findings provide an important caution that for many jurisdictions the point of accelerating declining marginal returns may have set in. Any policy discussion of the appropriate scale of punishment should be concerned with the empirical impact of this expensive and intrusive government intervention. [source]

Diagnostic utility of the Quick Inventory of Depressive Symptomatology (QIDS-C16 and QIDS-SR16) in the elderly
ACTA PSYCHIATRICA SCANDINAVICA, Issue 3 2010
P. M. Doraiswamy

Doraiswamy PM, Bernstein IH, Rush AJ, Kyutoku Y, Carmody TJ, Macleod L, Venkatraman S, Burks M, Stegman D, Witte B, Trivedi MH. Diagnostic utility of the Quick Inventory of Depressive Symptomatology (QIDS-C16 and QIDS-SR16) in the elderly.

Objective: To evaluate the psychometric properties and comparability of the Montgomery-Åsberg Depression Rating Scale (MADRS) vs. the Quick Inventory of Depressive Symptomatology Clinician-rated (QIDS-C16) and Self-report (QIDS-SR16) scales in detecting a current major depressive episode in the elderly.
Method: Community and clinic subjects (age ≥ 60 years) were administered the Mini-International Neuropsychiatric Interview (MINI) for DSM-IV and the three depression scales in random order. Statistics included classical test and Samejima item response theories, factor analyses, and receiver operating characteristic methods.
Results: In 229 elderly patients (mean age = 73 years, 39% male, 54% with current depression), all three scales were unidimensional, with nearly equal Cronbach α reliability (0.85-0.89). Each scale discriminated persons with major depression from the non-depressed, but the QIDS-C16 was slightly more accurate.
Conclusion: All three tests are valid for detecting geriatric major depression, with the QIDS-C16 being slightly better. The self-rated QIDS-SR16 is recommended as a screening tool, as it is the least expensive and least time-consuming. [source]

Porcine Sebaceous Cyst Model: An Inexpensive, Reproducible Skin Surgery Simulator
DERMATOLOGIC SURGERY, Issue 8 2005
Jonathan Bowling MBChB

Background: Surgical simulators are an established part of surgical training and are regularly used as part of the objective structured assessment of technical skills. Specific artificial skin models representing cutaneous pathology are available, although they are expensive compared with pigskin. The limitations of artificial skin models include their difficulty in representing lifelike cutaneous pathology.
Objective: Our aim was to devise an inexpensive, reproducible surgical simulator that provides the most lifelike representation of the sebaceous cyst.
Materials and Methods: Pigskin, either pig's feet/trotters or pork belly, was incised, and a paintball was inserted subcutaneously and fixed with cyanoacrylate glue.
Results: This model has regularly been used in cutaneous surgery courses that we have organized. The degree of difficulty can be adjusted either by adding more cyanoacrylate glue or by allowing more time for the paintball to absorb fluid from the surrounding tissue.
Conclusions: The degree of correlation with lifelike cutaneous pathology is such that we recommend that all courses involved in basic skin surgery consider using the porcine sebaceous cyst model when teaching excision of sebaceous cysts.
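The receiver operating characteristic methods cited in the QIDS abstract above reduce, for a single summary of diagnostic accuracy, to the area under the ROC curve. A sketch of the rank-based (Mann-Whitney) computation with made-up labels and scale scores:

```python
def roc_auc(labels, scores):
    """AUC = probability that a random positive outscores a random negative
    (ties count half)."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

labels = [1, 1, 1, 0, 0, 0]                # 1 = major depression on the MINI
scores = [0.9, 0.8, 0.4, 0.5, 0.3, 0.1]    # hypothetical scale totals
auc = roc_auc(labels, scores)
```

An AUC near 1.0 means the scale nearly always ranks depressed above non-depressed subjects, which is the sense in which one instrument can be "slightly more accurate" than another.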
[source] ACCESS TO ESSENTIAL MEDICINES: A HOBBESIAN SOCIAL CONTRACT APPROACH DEVELOPING WORLD BIOETHICS, Issue 2 2005 RICHARD E. ASHCROFT ABSTRACT Medicines that are vital for saving and preserving life in conditions of public health emergency or endemic serious disease are known as essential medicines. In many developing world settings such medicines may be unavailable, or unaffordably expensive for the majority of those in need of them. Furthermore, for many serious diseases (such as HIV/AIDS and tuberculosis) these essential medicines are protected by patents that permit the patent-holder to operate a monopoly on their manufacture and supply, and to price these medicines well above marginal cost. Recent international legal doctrine has placed great stress on the need to globalise intellectual property rights protections, and on the rights of intellectual property holders to have those rights enforced. Although international intellectual property law does permit compulsory licensing of protected inventions in the interests of public health, the use of this right by sovereign states has proved highly controversial. In this paper I give an argument in support of states' sovereign right to expropriate private intellectual property in conditions of public health emergency. This argument turns on a social contract argument for the legitimacy of states. The argument shows, further, that under some circumstances states are not merely permitted to license inventions compulsorily, but are actually obliged to do so, on pain of failure of their legitimacy as sovereign states. The argument draws freely on a loose interpretation of Thomas Hobbes's arguments in his Leviathan, and on an analogy between his state of War and the situation of public health disasters. [source] Preservation of sight in diabetes: developing a national risk reduction programme DIABETIC MEDICINE, Issue 9 2000 L.
Garvican SUMMARY Background Early treatment for diabetic retinopathy is effective at saving sight, but depends on pre-symptomatic detection. Although 60% of people with diabetes have their eyes examined annually, few UK health authorities have systematic programmes that meet the British Diabetic Association's standards for sensitivity (> 80%) and specificity (> 95%). Screening is generally performed by general practitioners and optometrists, with some camera-based schemes operated by dedicated staff. The National Screening Committee commissioned a group to develop a model and cost estimates for a comprehensive national risk-reduction programme. Ophthalmoscopy Evidence indicates that direct ophthalmoscopy using a hand-held ophthalmoscope does not give adequate specificity and sensitivity, and should be abandoned as a systematic screening technique. Indirect ophthalmoscopy using a slit lamp is sensitive and specific enough to be viable, and its widespread availability in high street optometry practices is an advantage, but the method requires considerable skill. Photographic schemes The principal advantage of camera-based screening is the capture of an image, for patient education, review of disease progression, and quality assurance. Digital cameras are becoming cheaper and are now the preferred option. The image is satisfactory for screening and may be transmitted electronically. With appropriate training and equipment, different professional groups might participate in programme delivery, based on local decisions. Cost issues Considerable resources are already invested in ad hoc screening, whose inevitably high referral rates incur heavy outpatient costs. Treatment for advanced disease is expensive, but less likely to be effective. The costs of a new systematic screening and treatment programme appear similar to current expenditure, as a result of savings in the treatment of late-presenting advanced retinopathy.
Conclusion A systematic national programme based on digital photography is proposed. [source] Fine-needle aspiration of primary osseous lesions: A cost-effectiveness study DIAGNOSTIC CYTOPATHOLOGY, Issue 4 2010 Lester J. Layfield M.D. Abstract Fine-needle aspiration (FNA) is not widely used in the work-up of osseous lesions because of concerns regarding its high incidence of nondiagnostic specimens. Although several studies have shown that FNA is less expensive than surgical biopsy, the authors are aware of only one prior study evaluating the cost effectiveness of FNA that includes the cost of the incisional or core needle biopsies necessary to establish a diagnosis when the initial FNA is noncontributory. A computerized search of the pathology records of three medical centers was performed to obtain all FNAs of primary osseous lesions. For each FNA case, all subsequent core needle, incisional, or excisional biopsies were recorded, as was the result of the definitive operative procedure. The cost of obtaining the definitive diagnosis was calculated for each case, including the cost of the FNA, any imaging guidance utilized, and the cost of subsequent surgical biopsy when necessary. The cost of an alternate approach using only surgical biopsy was also calculated, and the average per-patient costs of the two protocols were compared. A total of 165 primary bone tumors underwent FNA. One hundred six of these yielded a definitive cytologic diagnosis. In 59 cases, FNA yielded a result insufficient for definitive therapy, necessitating surgical biopsy. FNA investigation of the 165 bone lesions cost $575,932 (an average of $3,490 per patient). Surgical biopsy alone would have cost $5,760 per patient. FNA thus resulted in a cost savings of $2,215 per patient. Diagn. Cytopathol. 2010 © 2009 Wiley-Liss, Inc. [source] Integration of Different Data Bodies for Humanitarian Decision Support: An Example from Mine Action DISASTERS, Issue 4 2003 Aldo A.
Benini Geographic information systems (GIS) are increasingly used for integrating data from different sources and substantive areas, including in humanitarian action. The challenges of integration are particularly well illustrated by humanitarian mine action. The informational requirements of mine action are expensive, with socio-economic impact surveys costing over US$1.5 million per country, and are feeding a continuous debate on the merits of considering more factors or 'keeping it simple'. National census offices could, in theory, contribute relevant data, but in practice surveys have rarely overcome institutional obstacles to external data acquisition. A positive exception occurred in Lebanon, where the landmine impact survey had access to agricultural census data. The challenges, costs and benefits of this data integration exercise are analysed in a detailed case study. The benefits are considerable, but so are the costs, particularly the hidden ones. The Lebanon experience prompts some wider reflections. In the humanitarian community, data integration has been fostered not only by the diffusion of GIS technology, but also by institutional changes such as the creation of UN-led Humanitarian Information Centres. There is a question of whether analytic capacity is keeping step with aggressive data acquisition. Humanitarian action may yet have to build the kind of strong analytic tradition that public health and poverty alleviation have accomplished. [source] Predicting global abundance of a threatened species from its occurrence: implications for conservation planning DIVERSITY AND DISTRIBUTIONS, Issue 1 2009 Marcos S. L. Figueiredo Abstract Aim: Global abundance is an important characteristic of a species that is correlated with geographical distribution and body size. Despite its importance, these estimates are often not available, since reliable field estimates are either expensive or difficult to obtain.
Based on the relationship between a species' local abundance and distribution, some authors have proposed that abundance can be obtained from spatial distribution data plotted on maps at different scales. This has never been tested over the entire geographical range of a species. Thus, the aim of this study was to estimate the global abundance of the Neotropical primate Brachyteles hypoxanthus (northern muriqui) and compare the results with available field estimates. Location: From southern Bahia to Minas Gerais and Espírito Santo states, in the Brazilian Atlantic rain forest. Methods: We compiled 25 recent occurrence localities of B. hypoxanthus and plotted them in grid cells of five different sizes (1, 25, 50, 75 and 100 km per side) to evaluate the performance and accuracy of abundance estimates over a wide range of scales. The abundance estimates were obtained by the negative binomial distribution (NBD) method and corrected by average group size to take the species' social habits into account. To assess the accuracy of the method, the predicted abundances were then compared with recent independent field estimates. Results: The NBD estimates were quite accurate in predicting the global abundance of B. hypoxanthus once the gregarious habits of this species were taken into account. The predicted abundance estimates were not statistically different from those obtained in the field. Main conclusions: The NBD method appears to be a quick and reliable approach for estimating species abundance once several limiting factors are taken into account, and it can greatly impact conservation planning, but further applications in macroecological and ecological theory testing will require refinement of the method. [source] Organization and mode of secretion of the granular prismatic microstructure of Entodesma navicula (Bivalvia: Mollusca) ACTA ZOOLOGICA, Issue 2 2009 Elizabeth M.
Harper The term homogeneous has been applied to molluscan microstructures that lack a readily discernible microstructure; as a result, it has become rather a 'dustbin' term, covering a multitude of unrelated finely crystalline textures. Here we investigate in detail the outer 'homogeneous' layer of the lyonsiid bivalve Entodesma navicula. The apparently equigranular crystals (up to 10 µm) are, in fact, short prisms which grow in a dense organic matrix with their c-axes and fibre axes coincident, perpendicular to the growth surface. These prisms are distinct from the aragonitic prisms grown by other bivalves in both their morphology and their mode of growth, and so we propose the term granular prismatic microstructure. The organic content of the granular prisms (7.4%) is the highest yet recorded for any molluscan microstructure, and it is apparent that the short prisms have grown within a gel-filled space. Although this high organic content is likely to make the microstructure metabolically expensive to produce, it has the benefit of making the valves very flexible. This may be advantageous in the intertidal zone inhabited by E. navicula by allowing a tight seal between the valves. [source] Accelerating drug development: methodology to support first-in-man pharmacokinetic studies by the use of drug candidate microdosing DRUG DEVELOPMENT RESEARCH, Issue 1 2007 Matthew A. McLean Abstract Microdosing of experimental therapeutics in humans offers a number of benefits to the drug development process. Microdosing, conducted under an exploratory Investigational New Drug (IND) application, entails administration of a sub-pharmacological dose of a new chemical entity (NCE) that allows for early evaluation of human pharmacokinetics. Such information can be pivotal for: (1) selecting a compound for full drug development from a small group of candidates; (2) defining the amount of material needed for early development; and (3) setting the initial Phase I dose regimen in humans.
Appropriate safety studies must be conducted to support microdosing in humans, but the requirements are generally less extensive than those needed to support a traditional IND. To date, microdosing has not been broadly applied by the pharmaceutical industry, owing to concerns about analytical sensitivity and the possibility of non-linear pharmacokinetics at extremely low doses. The primary method for detecting analytes following microdosing has until now been accelerator mass spectrometry, which is expensive, not generally available, and requires test agents to be radiolabeled. Presented in this report is an example of pharmacokinetic analysis using LC/MS/MS following microdosing of an experimental agent in cynomolgus monkeys. The results show good linearity in plasma pharmacokinetics for oral doses of 10 mg/kg (therapeutic dose) and 0.0005 mg/kg (microdose) of the test agent. The results also demonstrate the feasibility of applying standard laboratory analytics to support microdosing in humans, and raise the possibility of establishing an animal model to screen for compounds having non-linear pharmacokinetics at low dose levels. Drug Dev. Res. 68:14–22, 2007. © 2007 Wiley-Liss, Inc. [source] Suspended sediment load estimation and the problem of inadequate data sampling: a fractal view EARTH SURFACE PROCESSES AND LANDFORMS, Issue 4 2006 Bellie Sivakumar Abstract Suspended sediment load estimation at high resolutions is an extremely difficult task, because: (1) it depends on the availability of high-resolution water discharge and suspended sediment concentration measurements, which are often not available; (2) any errors in the measurements of these two components could significantly influence the accuracy of suspended sediment load estimation; and (3) direct measurements are very expensive.
The purpose of this study is to approach this sampling problem from the new perspective of fractals (or scaling), which could provide important information on the transformation of suspended sediment load data from one scale to another. This is done by investigating the possible presence of fractal behaviour in the daily suspended sediment load data for the Mississippi River basin (at St. Louis, Missouri). The presence of fractal behaviour is investigated using five different methods, ranging from general to specific and from mono-fractal to multi-fractal: (1) the autocorrelation function; (2) the power spectrum; (3) the probability distribution function; (4) the box dimension; and (5) the statistical moment scaling function. The results indicate the presence of multi-fractal behaviour in the suspended sediment load data, suggesting the possibility of transforming data from one scale to another using a multi-dimensional model. Copyright © 2005 John Wiley & Sons, Ltd. [source] Implications of Liebig's law of the minimum for the use of ecological indicators based on abundance ECOGRAPHY, Issue 2 2005 J. G. Hiddink Many ecological responses to environmental variables or anthropogenic agents are difficult and expensive to measure. It is therefore attractive to describe such responses in terms of indicators that are easier to measure. In ecosystem management, indicators can be used to monitor spatial and temporal changes in an environmental feature. The aim of this paper is to show that it is important to take Liebig's law of the minimum into consideration in order to understand when it is appropriate or inappropriate to use ecological indicators based on abundance. When developing indicators that relate the abundance of an organism to an environmental factor, this relationship is likely to be polygonal rather than simply linear.
The upper boundary of the distribution describes how abundance is limited by this factor, while the variation below the upper boundary arises in situations where factors other than the one of interest limit abundance. This variation below the upper boundary means that using indicators to examine spatial patterns in the response of abundance to an environmental factor can be problematic. Thus, while abundance-based indicators can identify sites that are in good condition, they are less useful for detecting sites affected by environmental degradation. In contrast, abundance-based ecological indicators may enable temporal monitoring of the impact of environmental factors, as limiting factors are expected to be less variable in time than in space. In conclusion, when multiple factors are limiting, a significant correlation between an indicator and a variable is not enough to validate the status of that variable as an indicator. [source]
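The Liebig's-law abstract above describes polygonal scatterplots whose upper boundary traces the limiting effect of a single factor. The paper does not prescribe an estimation method here; one common illustrative approach, sketched below on synthetic data, is to take an upper quantile of abundance within bins of the factor:

```python
import numpy as np

def upper_boundary(factor, abundance, n_bins=5, q=0.9):
    """Estimate the upper boundary of a polygonal factor-abundance
    relationship as the q-th abundance quantile within each factor bin."""
    edges = np.linspace(factor.min(), factor.max(), n_bins + 1)
    centers, bounds = [], []
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (factor >= lo) & (factor <= hi)
        if mask.any():  # skip empty bins
            centers.append((lo + hi) / 2)
            bounds.append(np.quantile(abundance[mask], q))
    return np.array(centers), np.array(bounds)

# Synthetic polygonal data: abundance is capped by the factor (a Liebig-style
# ceiling) but often falls below it when other, unmeasured factors limit it.
rng = np.random.default_rng(0)
factor = rng.uniform(0, 10, 500)
abundance = factor * rng.uniform(0, 1, 500)  # ceiling equals the factor
centers, bounds = upper_boundary(factor, abundance)
# The estimated boundary rises with the factor, tracing the ceiling, while
# points scattered below it reflect limitation by other factors.
```

This illustrates the paper's point: points below the boundary carry little information about the factor of interest, so only the boundary itself, not an ordinary regression through the cloud, reflects its limiting role.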