Tracing


Kinds of Tracing

  • chain tracing
  • contact tracing
  • lineage tracing
  • process tracing
  • ray tracing
  • retrograde tracing
  • tract tracing

Terms modified by Tracing

  • tracing experiment
  • tracing studies
  • tracing study
  • tracing technique
  • tracing techniques

Selected Abstracts


    TRACING BACK THE ORIGIN OF THE INDO-PACIFIC MOLLUSC FAUNA: BASAL TRIDACNINAE FROM THE OLIGOCENE AND MIOCENE OF THE SULTANATE OF OMAN

    PALAEONTOLOGY, Issue 1 2008
    MATHIAS HARZHAUSER
    Abstract: Two new tridacnine species are described from the Chattian and Aquitanian of the Arabian Peninsula. For these, the new names Omanidacna eos gen. et sp. nov. and Tridacna evae sp. nov. are erected. Omanidacna is interpreted as an Oligocene ancestor of Hippopus, being the oldest record of this tridacnine lineage. The Aquitanian Tridacna evae is the first occurrence of the genus Tridacna. These Arabian taxa imply that the modern tridacnine lineages are rooted in the Palaeogene and early Neogene of the East African-Arabian Province, although their Eocene ancestors, such as Byssocardium, are Western Tethyan taxa. During the Neogene they successfully settled the Indo-Polynesian Province and became typical elements of the entire Indo-West Pacific Region. The tridacnines are thus an example of a successive transformation and gradual eastward dispersal of an originally Tethyan element contributing to late Neogene diversity in the Indo-West Pacific. [source]


    Shallow Bounding Volume Hierarchies for Fast SIMD Ray Tracing of Incoherent Rays

    COMPUTER GRAPHICS FORUM, Issue 4 2008
    H. Dammertz
    Abstract Photorealistic image synthesis is a computationally demanding task that relies on ray tracing for the evaluation of integrals. Rendering time is dominated by tracing long paths that are very incoherent by construction. We therefore investigate the use of SIMD instructions to accelerate incoherent rays. SIMD is used in the hierarchy construction, the tree traversal and the leaf intersection. This is achieved by increasing the arity of acceleration structures, which also reduces memory requirements. We show that the resulting hierarchies can be built quickly and are smaller than acceleration structures known so far while at the same time outperforming them for incoherent rays. Our new acceleration structure speeds up ray tracing by a factor of 1.6 to 2.0 compared to a highly optimized bounding interval hierarchy implementation, and 1.3 to 1.6 compared to an efficient kd-tree. At the same time, the memory requirements are reduced by 10-50%. Additionally, we show how a caching mechanism in conjunction with this memory-efficient hierarchy can be used to speed up shadow rays in a global illumination algorithm without increasing the memory footprint. This optimization decreased the number of traversal steps by up to 50%. [source]
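The core trick described above, raising the branching factor so that several child bounding boxes are tested at once, can be sketched in NumPy, with array lanes standing in for SIMD registers. The node layout and numbers below are illustrative, not the paper's:

```python
import numpy as np

def ray_vs_children(origin, inv_dir, child_min, child_max):
    """Slab test of one ray against all child AABBs of a wide BVH node.

    child_min / child_max are (n_children, 3) arrays; the vectorized
    arithmetic stands in for SIMD lanes across children.
    Returns a boolean hit mask, one entry per child."""
    t0 = (child_min - origin) * inv_dir          # per-axis slab entries
    t1 = (child_max - origin) * inv_dir
    t_near = np.minimum(t0, t1).max(axis=1)      # latest entry per child
    t_far = np.maximum(t0, t1).min(axis=1)       # earliest exit per child
    return (t_near <= t_far) & (t_far >= 0.0)

origin = np.array([0.0, 0.0, 0.0])
direction = np.array([1.0, 0.0, 0.0])
inv_dir = 1.0 / np.where(direction != 0, direction, 1e-30)

# four hypothetical child boxes of one 4-wide node
child_min = np.array([[1.0, -1, -1], [1.0, 2, 2], [-5.0, -1, -1], [3.0, -1, -1]])
child_max = np.array([[2.0, 1, 1], [2.0, 3, 3], [-4.0, 1, 1], [4.0, 1, 1]])
print(ray_vs_children(origin, inv_dir, child_min, child_max))
```

Only the children whose mask entry is true need to be pushed for traversal, which is where the wide node saves work on incoherent rays.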


    Accelerating Ray Tracing using Constrained Tetrahedralizations

    COMPUTER GRAPHICS FORUM, Issue 4 2008
    Ares Lagae
    Abstract In this paper we introduce the constrained tetrahedralization as a new acceleration structure for ray tracing. A constrained tetrahedralization of a scene is a tetrahedralization that respects the faces of the scene geometry. The closest intersection of a ray with a scene is found by traversing this tetrahedralization along the ray, one tetrahedron at a time. We show that constrained tetrahedralizations are a viable alternative to current acceleration structures, and that they have a number of unique properties that set them apart from other acceleration structures: constrained tetrahedralizations are not hierarchical yet adaptive; the complexity of traversing them is a function of local geometric complexity rather than global geometric complexity; constrained tetrahedralizations support deforming geometry without any effort; and they have the potential to unify several data structures currently used in global illumination. [source]
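The traversal the authors describe, stepping from one tetrahedron to its neighbour through the exit face, can be sketched as below. The two-tet mesh and its face-plane encoding are invented for illustration and are not the paper's data structure:

```python
# Each tetrahedron is a list of faces; a face = (outward_normal, offset, neighbor)
# encoding the plane n.x = d, with neighbor = -1 meaning "ray leaves the mesh".
# Hypothetical mesh: tet 0 is the unit corner tet, tet 1 is mirrored across
# the shared face x + y + z = 1.
mesh = [
    [((-1, 0, 0), 0, -1), ((0, -1, 0), 0, -1), ((0, 0, -1), 0, -1),
     ((1, 1, 1), 1, 1)],                          # tet 0; shared face -> tet 1
    [((-1, -1, -1), -1, 0), ((1, 1, -1), 1, -1),
     ((1, -1, 1), 1, -1), ((-1, 1, 1), 1, -1)],   # tet 1
]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def traverse(mesh, start_tet, origin, direction):
    """Walk the tetrahedralization along the ray, one tet at a time.
    Returns the list of visited tet indices."""
    visited, tet = [], start_tet
    while tet != -1:
        visited.append(tet)
        best_t, next_tet = float("inf"), -1
        for normal, offset, neighbor in mesh[tet]:
            denom = dot(normal, direction)
            if denom > 1e-12:                     # ray exits through this face
                t = (offset - dot(normal, origin)) / denom
                if 0 <= t < best_t:
                    best_t, next_tet = t, neighbor
        tet = next_tet                            # step into the neighbour
    return visited

print(traverse(mesh, 0, (0.1, 0.1, 0.1), (1.0, 1.0, 1.0)))
```

Because each step is a few plane tests against the current cell only, the cost per step depends on local, not global, geometric complexity, which is the property the abstract highlights.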


    ReduceM: Interactive and Memory Efficient Ray Tracing of Large Models

    COMPUTER GRAPHICS FORUM, Issue 4 2008
    Christian Lauterbach
    We present a novel representation and algorithm, ReduceM, for memory efficient ray tracing of large scenes. ReduceM exploits the connectivity between triangles in a mesh and decomposes the model into triangle strips. We also describe a new stripification algorithm, Strip-RT, that can generate long strips with high spatial coherence. Our approach uses a two-level traversal algorithm for ray-primitive intersection. In practice, ReduceM can significantly reduce the storage overhead and ray trace massive models with hundreds of millions of triangles at interactive rates on desktop PCs with 4-8GB of main memory. [source]
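Triangle strips are what make such a decomposition memory efficient: after the first triangle, each additional triangle costs a single index instead of three. A minimal decoder (a generic strip format, not ReduceM's actual representation):

```python
def strip_to_triangles(strip):
    """Decode a triangle strip into triangles, flipping the winding of
    odd-numbered triangles so all faces keep a consistent orientation."""
    tris = []
    for i in range(len(strip) - 2):
        a, b, c = strip[i], strip[i + 1], strip[i + 2]
        tris.append((a, b, c) if i % 2 == 0 else (b, a, c))
    return tris

strip = [0, 1, 2, 3, 4, 5]
tris = strip_to_triangles(strip)
print(tris)   # 4 triangles from 6 indices, versus 12 indices unstripped
```

For n triangles a strip needs n + 2 indices rather than 3n, so long strips with high spatial coherence, as Strip-RT aims to produce, directly shrink the storage overhead.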


    GPU-Based Nonlinear Ray Tracing

    COMPUTER GRAPHICS FORUM, Issue 3 2004
    Daniel Weiskopf
    In this paper, we present a mapping of nonlinear ray tracing to the GPU which avoids any data transfer back to main memory. The rendering process consists of the following parts: ray setup according to the camera parameters, ray integration, ray-object intersection, and local illumination. Bent rays are approximated by polygonal lines that are represented by textures. Ray integration is based on an iterative numerical solution of ordinary differential equations whose initial values are determined during ray setup. To improve the rendering performance, we propose acceleration techniques such as early ray termination and adaptive ray integration. Finally, we discuss a variety of applications that range from the visualization of dynamical systems to the general relativistic visualization in astrophysics and the rendering of the continuous refraction in media with varying density. Categories and Subject Descriptors (according to ACM CCS): I.3.3 [Computer Graphics]: Picture/Image Generation I.3.7 [Computer Graphics]: Three-Dimensional Graphics and Realism [source]
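The ray-integration step, numerically solving an ODE from per-ray initial values with early termination, can be illustrated with a toy Euler integrator. The bending field and constants are invented for the example, and the paper's GPU texture representation of polygonal ray lines is omitted:

```python
def integrate_ray(origin, direction, grad, step=0.01, n_steps=1000, bounds=10.0):
    """Euler integration of a bent ray: each step nudges the direction by a
    'bending' field (a stand-in for the ODE right-hand side), renormalizes
    it, and advances the position. Stops early once the ray leaves the
    domain (early ray termination)."""
    x, v = list(origin), list(direction)
    path = [tuple(x)]
    for _ in range(n_steps):
        g = grad(x)
        v = [vi + step * gi for vi, gi in zip(v, g)]
        norm = sum(vi * vi for vi in v) ** 0.5
        v = [vi / norm for vi in v]
        x = [xi + step * vi for xi, vi in zip(x, v)]
        path.append(tuple(x))
        if any(abs(c) > bounds for c in x):       # early termination
            break
    return path

# constant downward bending: the ray curves toward -y, mirage-like
path = integrate_ray((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), lambda x: (0.0, -0.5, 0.0))
print(path[-1])
```

A production version would use a higher-order scheme (e.g. Runge-Kutta) and adapt the step size, which is the "adaptive ray integration" acceleration the abstract mentions.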


    Tracing salmon-derived nutrients and contaminants in freshwater food webs across a pronounced spawner density gradient

    ENVIRONMENTAL TOXICOLOGY & CHEMISTRY, Issue 6 2007
    Irene Gregory-Eaves
    Abstract Many have demonstrated that anadromous Pacific salmon are significant vectors of nutrients from the ocean to freshwaters. Recently, however, it has been recognized that salmon spawners also input significant quantities of contaminants. The objectives of this paper are to delineate the extent to which salmon-derived nutrients are integrated into the freshwater food web using δ15N and δ13C and to assess the influence of the salmon pathway in the accumulation of contaminants in rainbow trout (Oncorhynchus mykiss). We found that the δ15N and δ13C of food web components were positively and significantly related to sockeye salmon (Oncorhynchus nerka) spawner density. Contaminant concentrations in rainbow trout were also positively and significantly related to sockeye salmon spawner density. These data suggest that the anadromous salmon nutrient and contaminant pathways are related and significantly impact the contaminant burden of resident fish. [source]


    GMO Food Labelling in the EU: Tracing ,the Seeds of Dispute'

    EUROCHOICES, Issue 1 2003
    Maria L. Loureiro
    Summary Genetically modified (GM) food labelling has become a critical issue in the international trade arena. Policymakers and consumers in the European Union (EU) seem to agree on the need to control the use of biotechnology in the food industry. As a consequence, the EU Commission recently approved a measure that establishes strict rules on genetically modified organisms (GMOs), but which lifts the moratorium on GMO production and marketing. This new Directive deals with mandatory labelling of GM foods and their traceability along the food chain. In spite of the substantial effort made to reconcile the different opinions in the escalating debate about biotechnology, the new GMO regulation seems to be unsatisfactory for too many interest groups. A system of total traceability from 'farm to fork' and mandatory labelling for genetically modified products may be considered too complex and too expensive to implement, particularly by those countries or industries that have produced GMO foods for many years. Yet, giving European consumers the freedom to choose GMOs may be the only option there is until Europeans restore their confidence in the food system and food regulators. A market- or consumer-driven solution may eventually terminate the GMO dispute between the two transatlantic trading blocs. [source]


    Advanced Experimental and Simulation Approaches to Meet Reliability Challenges of New Electronics Systems

    ADVANCED ENGINEERING MATERIALS, Issue 4 2009
    Dietmar Vogel
    Abstract This paper focuses on some advanced aspects of physics-of-failure approaches. Tracing of failure modes under realistic loading is a key issue in isolating the relevant failure sites to be studied in more detail. In the past, design of experiment (DoE) tools were developed to handle this problem. They allow design and/or material selection to be optimized with respect to different failure mechanisms and sites. The application of these methods is demonstrated by optimizations performed for fracture problems; interface fracture has been chosen as one of the most important failure mechanisms. Finally, local stress and strain measurement tools developed over the past years are presented. They serve to validate simulation results and therefore the underlying mechanical modeling. In particular, the local stress measurement tools under development are needed to make realistic assumptions about loading conditions and to provide residual stress data for FEA. [source]


    Effects of climate change on labile and structural carbon in Douglas-fir needles as estimated by δ13C and Carea measurements

    GLOBAL CHANGE BIOLOGY, Issue 11 2002
    ERIC A. HOBBIE
    Abstract Models of photosynthesis, respiration, and export predict that foliar labile carbon (C) should increase with elevated CO2 but decrease with elevated temperature. Sugars, starch, and protein can be compared between treatments, but these compounds make up only a fraction of the total labile pool. Moreover, it is difficult to assess the turnover of labile carbon between years for evergreen foliage. Here, we combined changes in foliar Carea (C concentration on an areal basis) as needles aged with changes in foliar isotopic composition (δ13C) caused by inputs of 13C-depleted CO2 to estimate labile and structural C in needles of different ages in a four-year, closed-chamber mesocosm experiment in which Douglas-fir (Pseudotsuga menziesii (Mirb.) Franco) seedlings were exposed to elevated temperature (ambient + 3.5 °C) and CO2 (ambient + 179 ppm). Declines in δ13C of needle cohorts as they aged indicated incorporation of newly fixed labile or structural carbon. The δ13C calculations showed that new C was 41 ± 2% and 28 ± 3% of total needle carbon in second- and third-year needles, respectively, with higher proportions of new C in elevated than ambient CO2 chambers (e.g. 42 ± 2% vs. 37 ± 6%, respectively, for second-year needles). Relative to ambient CO2, elevated CO2 increased labile C in both first- and second-year needles. Relative to ambient temperature, elevated temperature diminished labile C in second-year needles but not in first-year needles, perhaps because of differences in sink strength between the two needle age classes. We hypothesize that plant-soil feedbacks on nitrogen supply contributed to higher photosynthetic rates under elevated temperatures that partly compensated for higher turnover rates of labile C. Strong positive correlations between labile C and sugar concentrations suggested that labile C was primarily determined by carbohydrates. Labile C was negatively correlated with concentrations of cellulose and protein. 
Elevated temperature increased foliar %C, possibly due to a shift of labile constituents from low-%C carbohydrates to relatively high-%C protein. Decreased sugar concentrations and increased nitrogen concentrations with elevated temperature were consistent with this explanation. Because foliar constituents that vary in isotopic signature also vary in concentration with leaf age or environmental conditions, inferences of ci/ca values from δ13C of bulk leaf tissue should be made cautiously. Tracing of 13C through foliar carbon pools may provide new insight into foliar C constituents and turnover. [source]
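The quoted fractions of newly fixed carbon follow from a standard two-member isotopic mixing model: the observed δ13C is a weighted mean of the old and new end-members. A sketch with made-up per-mil values, not the study's data:

```python
def fraction_new_carbon(delta_obs, delta_old, delta_new):
    """Two-pool isotope mass balance:
    delta_obs = f * delta_new + (1 - f) * delta_old, solved for f,
    the fraction of newly fixed carbon in the needle."""
    return (delta_old - delta_obs) / (delta_old - delta_new)

# illustrative per-mil values (hypothetical end-members):
f = fraction_new_carbon(delta_obs=-30.28, delta_old=-27.0, delta_new=-35.0)
print(round(f, 2))  # 0.41
```

The approach only works because the chamber CO2 was 13C-depleted, separating the end-member signatures enough for the denominator to be well conditioned.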


    Development of axonal pathways in the human fetal fronto-limbic brain: histochemical characterization and diffusion tensor imaging

    JOURNAL OF ANATOMY, Issue 4 2010
    Lana Vasung
    Abstract The development of cortical axonal pathways in the human brain begins during the transition between the embryonic and fetal period, happens in a series of sequential events, and leads to the establishment of major long trajectories by the neonatal period. We have correlated histochemical markers (acetylcholinesterase (AChE) histochemistry, antibody against synaptic protein SNAP-25 (SNAP-25-immunoreactivity) and neurofilament 200) with the diffusion tensor imaging (DTI) database in order to make a reconstruction of the origin, growth pattern and termination of the pathways in the period between 8 and 34 postconceptual weeks (PCW). Histological sections revealed that the initial outgrowth and formation of joined trajectories of subcortico-frontal pathways (external capsule, cerebral stalk-internal capsule) and limbic bundles (fornix, stria terminalis, amygdaloid radiation) occur by 10 PCW. As early as 11 PCW, major afferent fibers invade the corticostriatal junction. At 13-14 PCW, axonal pathways from the thalamus and basal forebrain approach the deep moiety of the cortical plate, causing the first lamination. The period between 15 and 18 PCW is dominated by elaboration of the periventricular crossroads, sagittal strata and spread of fibers in the subplate and marginal zone. Tracing of fibers in the subplate with DTI is unsuccessful due to the isotropy of this zone. Penetration of the cortical plate occurs after 24-26 PCW. In conclusion, frontal axonal pathways form the periventricular crossroads, sagittal strata and 'waiting' compartments during the path-finding and penetration of the cortical plate. Histochemistry is advantageous in the demonstration of a growth pattern, whereas DTI is unique for demonstrating axonal trajectories. The complexity of fibers is the biological substrate of selective vulnerability of the fetal white matter. [source]


    Bayesian strategy assessment in multi-attribute decision making

    JOURNAL OF BEHAVIORAL DECISION MAKING, Issue 3 2003
    Arndt Bröder
    Abstract Behavioral Decision Research on multi-attribute decision making is plagued with the problem of drawing inferences from behavioral data on cognitive strategies. This bridging problem has been tackled by a range of methodological approaches, namely Structural Modeling (SM), Process Tracing (PT), and comparative model fitting. Whereas SM and PT have been criticized for a number of reasons, the comparative fitting approach has some theoretical advantages as long as the formal relation between theories and data is specified. A Bayesian method is developed that is able to assess whether an empirical data vector was most likely generated by a 'Take The Best' heuristic (Gigerenzer et al., 1991), by an equal weight rule, or by a compensatory strategy. Equations are derived for the two- and three-alternative cases, respectively, and a simulation study supports its validity. The classification also showed convergent validity with Process Tracing measures in an experiment. Potential extensions of the general approach to other applications in behavioral decision research are discussed. Copyright © 2003 John Wiley & Sons, Ltd. [source]
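The comparative-fitting idea, scoring each candidate strategy by the likelihood of the observed choice vector under a constant error rate and picking the best posterior, can be sketched as follows. The strategies' predictions, the error rate, and the data are toy values, not the paper's equations:

```python
from math import log

def log_likelihood(choices, predictions, error=0.1):
    """Log-likelihood of observed binary choices under a strategy that
    predicts each choice but errs with constant probability `error`."""
    ll = 0.0
    for c, p in zip(choices, predictions):
        ll += log(1 - error) if c == p else log(error)
    return ll

def classify(choices, strategy_predictions):
    """Maximum a posteriori strategy under uniform priors (so the MAP
    strategy is simply the maximum-likelihood one)."""
    return max(strategy_predictions,
               key=lambda s: log_likelihood(choices, strategy_predictions[s]))

# hypothetical predictions of three strategies on six paired comparisons
preds = {
    "TTB":          [1, 1, 0, 1, 0, 1],
    "equal_weight": [1, 0, 0, 1, 1, 1],
    "compensatory": [0, 0, 1, 1, 1, 0],
}
observed = [1, 1, 0, 1, 0, 1]
print(classify(observed, preds))
```

A fuller treatment would also estimate the error rate and integrate over it, which is where the Bayesian machinery of the paper goes beyond this sketch.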


    Tracing back seed and pollen flow within the crop-wild Beta vulgaris complex: genetic distinctiveness vs. hot spots of hybridization over a regional scale

    MOLECULAR ECOLOGY, Issue 6 2004
    Frédérique Viard
    Abstract Hybrids between transgenic crops and wild relatives have been documented successfully in a wide range of cultivated species, with implications for conservation and biosafety management. Nonetheless, the magnitude and frequency of hybridization in the wild is still an open question, in particular when considering several populations at the landscape level. The Beta vulgaris complex provides an excellent biological model to tackle this issue. Weed beets contaminating sugar beet fields are expected to act as a relay between wild populations and crops and from crop to crop. In one major European sugar beet production area, nine wild populations and 12 weed populations were genetically characterized using cytoplasmic markers specific to the cultivated lines and nuclear microsatellite loci. Strong overall genetic differentiation between neighbouring wild and weed populations was observed. However, genetic admixture analyses at the individual level revealed clear evidence for gene flow between wild and weed populations. In particular, one wild population displayed a high degree of nuclear genetic admixture, reinforced by direct seed flow as evidenced by cytoplasmic markers. Altogether, weed beets were shown to act as a relay for gene flow from crops to wild populations and from crop to crop via pollen and seeds at a landscape level. [source]


    Conversations with a Polish populist: Tracing hidden histories of globalization, class, and dispossession in postsocialism (and beyond)

    AMERICAN ETHNOLOGIST, Issue 2 2009
    DON KALB
    ABSTRACT Building on the work of Jonathan Friedman and of Andre Gingrich and Marcus Banks, I explain the rise of populist, neonationalist sensibilities in Poland as a set of defensive responses by working-class people to the silences imposed by liberal rule. I trace in detail a sequence of all-around dispossessions experienced by Polish working-class sodalities since 1989, when activists with substantial legitimacy among organized workers had claimed de facto and de jure control over assets crucial for working-class reproduction. "Democratization" and "markets" were shrewd legal ways by which the new liberal capitalist state reappropriated and recentralized those assets from local constituencies. Meanwhile, the reputation of workers, whose fights with the party-state had been essential for regaining national sovereignty and establishing parliamentary democracy, was systematically annihilated in the public sphere by discourses of "internal orientalism." [postsocialism, dispossession, class, neonationalism, populism, neoliberalism, globalization, privatization, Europe] [source]


    Bilateral testicular tuberculomas: a case detection

    ANDROLOGIA, Issue 2 2009
    A. Hassan
    Summary Genitourinary tuberculosis (TB) is the most frequent manifestation of extrapulmonary TB; the epididymides, seminal vesicles and prostate are the most commonly infected sites, followed by the testes. We report a 29-year-old man who presented with primary infertility of 2 years' duration. He had a history of bilateral painful scrotal swelling with fever 4 years earlier, diagnosed as a pyogenic scrotal abscess and managed by incision and drainage. At presentation, fever, weight loss and night sweats were absent. On examination, he had ovoid, slightly tender, firm-to-hard irregular masses in the lower poles of both testes with no line of separation, encroaching on both epididymides. Both testes were not felt distinctly and the overlying scrotal skin showed no signs of inflammation. Semen analysis revealed azoospermia. Scrotal colour-coded duplex ultrasonography demonstrated moderately enlarged testes containing well-defined hypoechoic masses with foci of calcification. Magnetic resonance imaging confirmed these findings. Biopsy and histopathology detected caseating granulomas, and Ziehl-Neelsen staining of paraffin sections demonstrated acid-fast bacilli. The patient was treated with combination therapy. Tracing of the condition is discussed. [source]


    Homing of transplanted bone marrow cells in livers of Schistosoma mansoni-infected mice

    APMIS, Issue 4 2010
    NAGWA ELKHAFIF
    Elkhafif N, Voss B, Hammam O, Yehia H, Mansy S, Akl M, Boehm S, Mahmoud S, El Bendary O, El Fandy G. Homing of transplanted bone marrow cells in livers of Schistosoma mansoni-infected mice. APMIS 2010; 118: 277-87. The efficiency of differentiation of bone marrow cells (BMCs) into hepatocytes in vivo and its importance in physiopathological processes is still debated. Murine schistosomiasis was used as a liver injury model and unfractionated male mice BMCs were transplanted through intrahepatic injection into non-irradiated Schistosoma mansoni-infected female mice in their 16th week post-infection. Two weeks after bone marrow transplantation, mice were sacrificed on a weekly basis until 10 weeks. Tracing of male donor-derived cells in female recipient mice livers was carried out by the detection of Y chromosome expression by fluorescent in situ hybridization (FISH) and also of chromodomain Y-linked (CDYL) protein by indirect immunofluorescence (IF). Their transformation into hepatocytes was studied by double-labelling indirect IF using antibodies directed against CDYL and mouse albumin. Histopathological and electron microscopic examinations revealed the presence of small hepatocyte-like cells in the periportal tracts and in between the hepatocytes facing the sinusoids. Donor-derived cells showing the Y chromosome by FISH and expressing CDYL protein by IF were recovered in the infected transplanted livers. The initial number of these cells increased with post-transplantation time. Cells were mainly localized in the periphery of schistosoma granuloma. Few donor-derived cells appeared within the hepatic parenchymal tissue and showed positivity for albumin secretion by double labelling with IF. We suggest that transplanted bone marrow stem cells can repopulate the Schistosoma-infected liver of immunocompetent mice. Their differentiation is a complex event controlled by many factors and needs to be further characterized extensively. 
The extent and type of liver injury and the number of transplanted cells are important variables in the process of stem cell engraftment and differentiation into functioning hepatic cells that still need to be defined. [source]


    Tracing 8,600 participants 36 years after recruitment at age seven for the Tasmanian Asthma Study

    AUSTRALIAN AND NEW ZEALAND JOURNAL OF PUBLIC HEALTH, Issue 2 2006
    Cathryn Wharton
    Objective: To trace all participants 36 years after the original Tasmanian Asthma Study (TAS). Methods: In 1968, the TAS investigated asthma in 8,583 children who were born in 1961. We attempted to trace these participants in 2002-04 using names, dates of birth and gender. Current addresses were sought by computer linkage to the Commonwealth Electoral Roll, the Medicare database and the Tasmanian marriage records. Computer linkage was conducted with the National Death Index (NDI). Siblings of participants were also linked to the Commonwealth Electoral Roll and those identified were sent a letter requesting the participant's address. The Australian Twin Registry (ATR) and the 1991-93 TAS substudy were used to locate participant addresses. Results: After three rounds of electoral roll linkage, 56% of all cohort members were traced. Name changes were identified for 49% of the 3,477 females not initially matched to the electoral roll using linkage to marriage records. NDI linkage yielded a 0.7% match. Medicare linkage identified addresses for 27% of the 1,982 remaining participants. Writing to siblings located 60% of 1,661 participants. One hundred and eighty-three participants were matched to the 1991-93 TAS and 23 twins matched to the ATR. Overall, 81.5% of the cohort members were identified. Conclusions: With these methods, we have been able to trace a possible address for a large portion of the original participants, with the electoral roll linkage being the most useful. Implications: It is possible to trace Australians for follow-up studies using electronic linkage, although without unique identifiers it is labour and resource intensive and requires matching to several databases. [source]
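The repeated database linkages described above amount, at their core, to deterministic key matching on names and dates of birth. A minimal sketch, with invented field names and records; real linkage of this kind also handles name changes (e.g. via marriage records) and typically needs probabilistic or fuzzy matching:

```python
def normalize(name, dob):
    """Deterministic linkage key. Real studies add fuzzy matching and
    alternative keys for name changes; this sketch only lowercases."""
    return (name.strip().lower(), dob)

def link(cohort, register):
    """Match cohort members to a register on (name, date of birth).
    Returns matched {id: address} and the list of unmatched ids."""
    index = {normalize(r["name"], r["dob"]): r["address"] for r in register}
    matched, unmatched = {}, []
    for person in cohort:
        key = normalize(person["name"], person["dob"])
        if key in index:
            matched[person["id"]] = index[key]
        else:
            unmatched.append(person["id"])
    return matched, unmatched

cohort = [{"id": 1, "name": "Jane Doe", "dob": "1961-05-02"},
          {"id": 2, "name": "John Roe", "dob": "1961-07-19"}]
register = [{"name": "JANE DOE", "dob": "1961-05-02", "address": "1 High St"}]
matched, unmatched = link(cohort, register)
print(matched, unmatched)
```

Each successive round in the study (electoral roll, marriage records, Medicare, NDI) effectively reruns this loop over the still-unmatched remainder with a different register.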


    Tracing of intracellular zinc(II) fluorescence flux to monitor cell apoptosis by using FluoZin-3AM

    CELL BIOCHEMISTRY AND FUNCTION, Issue 7 2009
    Yi-Ming Li
    Abstract Changes in the free zinc(II) concentration are closely related to cell proliferation and apoptosis, especially during the early apoptotic process. In the present paper, we demonstrate that the zinc(II) probe FluoZin-3AM is sensitive enough to distinguish different stages of apoptotic cells (induced by the anticancer agent etoposide) by tracing intracellular zinc(II) fluorescence flux. When apoptosis was artificially induced in HeLa or K562 cells, FluoZin-3AM selectively and strongly stained apoptotic cells only at the early and middle stages, which was attributed to the significantly increased free zinc(II) flux during these stages. This conclusion was further verified by comparison with the conventional apoptosis detection probes Annexin-V-FITC and PI. Furthermore, FluoZin-3AM was found to be cell permeable, detecting a threefold enhancement of intracellular zinc(II) fluorescence within 120 s with low cytotoxicity when zinc(II) was incorporated into the cell by the zinc(II) ionophore pyrithione. All the above implied that monitoring intracellular zinc fluorescence flux is an effective method to distinguish cell apoptosis from necrosis, and that FluoZin-3AM is a suitable probe acting alone to fulfil this task. Copyright © 2009 John Wiley & Sons, Ltd. [source]


    Replica Exchange Light Transport

    COMPUTER GRAPHICS FORUM, Issue 8 2009
    Shinya Kitaoka
    I.3.7 [Computer Graphics]: Three-Dimensional Graphics and Realism; I.3.3 [Computer Graphics]: Picture/Image Generation Abstract We solve the light transport problem by introducing a novel unbiased Monte Carlo algorithm called replica exchange light transport, inspired by the replica exchange Monte Carlo method in the fields of computational physics and statistical information processing. The replica exchange Monte Carlo method is a sampling technique whose operation resembles simulated annealing in optimization algorithms, using a set of sampling distributions. We apply it to the solution of light transport integration by extending the probability density function of an integrand of the integration to a set of distributions. That set of distributions is composed of combinations of the path densities of different path generation types: uniform distributions in the integral domain, explicit and implicit paths in light (particle/photon) tracing, indirect paths in bidirectional path tracing, explicit and implicit paths in path tracing, and implicit caustics paths seen through specular surfaces including the delta function in path tracing. The replica exchange light transport algorithm generates a sequence of path samples from each distribution and samples the simultaneous distribution of those distributions as a stationary distribution by using the Markov chain Monte Carlo method. Then the algorithm combines the obtained path samples from each distribution using multiple importance sampling. We compare the images generated with our algorithm to those generated with bidirectional path tracing and Metropolis light transport based on the primary sample space. Our proposed algorithm has better convergence properties than bidirectional path tracing and Metropolis light transport, and it is easy to implement by extending the Metropolis light transport. [source]
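Replica exchange itself, the sampling technique this method builds on, runs several Markov chains at different "temperatures" and occasionally swaps their states using a Metropolis acceptance rule, so the cold chain can escape isolated modes. A toy one-dimensional version on a double-well energy (this is the generic MCMC technique, not the light-transport setting; all constants are illustrative):

```python
import math
import random

def energy(x):                         # double-well: modes near x = +/-1
    return 8.0 * (x * x - 1.0) ** 2

def metropolis_step(x, beta, rng, scale=0.5):
    """One Metropolis update of a single replica at inverse temperature beta."""
    y = x + rng.uniform(-scale, scale)
    if rng.random() < math.exp(min(0.0, -beta * (energy(y) - energy(x)))):
        return y
    return x

def replica_exchange(betas, n_iters, rng):
    """Parallel tempering: betas sorted cold (large) to hot (small).
    Returns the samples drawn from the coldest (target) replica."""
    xs = [1.0 for _ in betas]          # one state per temperature
    samples = []
    for it in range(n_iters):
        xs = [metropolis_step(x, b, rng) for x, b in zip(xs, betas)]
        if it % 10 == 0:               # attempt a swap of a neighbour pair
            i = rng.randrange(len(betas) - 1)
            de = (betas[i] - betas[i + 1]) * (energy(xs[i]) - energy(xs[i + 1]))
            if rng.random() < math.exp(min(0.0, de)):
                xs[i], xs[i + 1] = xs[i + 1], xs[i]
        samples.append(xs[0])
    return samples

rng = random.Random(0)
samples = replica_exchange([4.0, 1.0, 0.25], 20000, rng)
left = sum(s < 0 for s in samples) / len(samples)
print(round(left, 2))
```

The cold chain alone would almost never cross the barrier at x = 0; swaps with the hot replicas are what carry states between the two wells, which is the same role the set of path-density distributions plays in the paper's light-transport setting.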


    Fast BVH Construction on GPUs

    COMPUTER GRAPHICS FORUM, Issue 2 2009
    C. Lauterbach
    We present two novel parallel algorithms for rapidly constructing bounding volume hierarchies on manycore GPUs. The first uses a linear ordering derived from spatial Morton codes to build hierarchies extremely quickly and with high parallel scalability. The second is a top-down approach that uses the surface area heuristic (SAH) to build hierarchies optimized for fast ray tracing. Both algorithms are combined into a hybrid algorithm that removes existing bottlenecks in GPU construction performance and scalability, leading to significantly decreased build time. The resulting hierarchies are close in quality to optimized SAH hierarchies, but the construction process is substantially faster, leading to a significant net benefit when both construction and traversal cost are accounted for. Our preliminary results show that current GPU architectures can compete with CPU implementations of hierarchy construction running on multicore systems. In practice, we can construct hierarchies of models with up to several million triangles and use them for fast ray tracing or other applications. [source]
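Morton codes order primitives along a space-filling curve by interleaving the bits of quantized coordinates; sorting primitives by this key is what gives the first algorithm its parallel-friendly linear ordering. A standard 30-bit encoder using the widely used bit tricks (not necessarily the paper's exact kernel):

```python
def expand_bits(v):
    """Spread the lower 10 bits of v so there are two zero bits between
    consecutive bits (the standard bit-interleaving trick)."""
    v &= 0x3FF
    v = (v * 0x00010001) & 0xFF0000FF
    v = (v * 0x00000101) & 0x0F00F00F
    v = (v * 0x00000011) & 0xC30C30C3
    v = (v * 0x00000005) & 0x49249249
    return v

def morton3(x, y, z):
    """30-bit Morton code from 10-bit quantized coordinates: bit i of
    x, y, z lands at bit positions 3i, 3i+1, 3i+2 respectively."""
    return expand_bits(x) | (expand_bits(y) << 1) | (expand_bits(z) << 2)

print(morton3(1, 0, 0), morton3(1, 1, 1), morton3(2, 0, 0))
```

Once primitives carry their Morton keys, a parallel radix sort plus a pass that splits at the highest differing bit yields the hierarchy, which is why this construction maps so well to GPUs.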


    Gradient-based Interpolation and Sampling for Real-time Rendering of Inhomogeneous, Single-scattering Media

    COMPUTER GRAPHICS FORUM, Issue 7 2008
    Zhong Ren
    Abstract We present a real-time rendering algorithm for inhomogeneous, single scattering media, where all-frequency shading effects such as glows, light shafts, and volumetric shadows can all be captured. The algorithm first computes source radiance at a small number of sample points in the medium, then interpolates these values at other points in the volume using a gradient-based scheme that is efficiently applied by sample splatting. The sample points are dynamically determined based on a recursive sample splitting procedure that adapts the number and locations of sample points for accurate and efficient reproduction of shading variations in the medium. The entire pipeline can be easily implemented on the GPU to achieve real-time performance for dynamic lighting and scenes. Rendering results of our method are shown to be comparable to those from ray tracing. [source]
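
    The recursive sample-splitting idea can be illustrated in one dimension: subdivide an interval until a midpoint test says linear interpolation is accurate enough, so samples automatically cluster where the function varies. A toy sketch with an invented tolerance and depth limit, not the paper's 3D procedure:

```python
import math

def adaptive_samples(f, a, b, tol=1e-2, depth=8):
    """Recursively split [a, b] until the midpoint of f is well
    predicted by linear interpolation of the endpoints -- a 1D
    caricature of recursive sample splitting in a medium."""
    mid = 0.5 * (a + b)
    linear = 0.5 * (f(a) + f(b))
    if depth == 0 or abs(f(mid) - linear) < tol:
        return [a, b]
    left = adaptive_samples(f, a, mid, tol, depth - 1)
    right = adaptive_samples(f, mid, b, tol, depth - 1)
    return left + right[1:]   # drop the duplicated midpoint

# A sharp Gaussian bump: samples concentrate near the peak at x = 0
# and stay sparse in the flat tails.
pts = adaptive_samples(lambda x: math.exp(-50 * x * x), -1.0, 1.0)
print(len(pts))
```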


    Shallow Bounding Volume Hierarchies for Fast SIMD Ray Tracing of Incoherent Rays

    COMPUTER GRAPHICS FORUM, Issue 4 2008
    H. Dammertz
    Abstract Photorealistic image synthesis is a computationally demanding task that relies on ray tracing for the evaluation of integrals. Rendering time is dominated by tracing long paths that are very incoherent by construction. We therefore investigate the use of SIMD instructions to accelerate incoherent rays. SIMD is used in the hierarchy construction, the tree traversal, and the leaf intersection. This is achieved by increasing the arity of acceleration structures, which also reduces memory requirements. We show that the resulting hierarchies can be built quickly and are smaller than acceleration structures known so far, while at the same time outperforming them for incoherent rays. Our new acceleration structure speeds up ray tracing by a factor of 1.6 to 2.0 compared to a highly optimized bounding interval hierarchy implementation, and 1.3 to 1.6 compared to an efficient kd-tree. At the same time, the memory requirements are reduced by 10–50%. Additionally, we show how a caching mechanism in conjunction with this memory-efficient hierarchy can be used to speed up shadow rays in a global illumination algorithm without increasing the memory footprint. This optimization decreases the number of traversal steps by up to 50%. [source]
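
    The benefit of a shallow, wide hierarchy comes from testing one ray against several child bounding boxes at once. The sketch below spells out a four-wide slab test; the loop over the four boxes stands in for SIMD lanes, and the geometry is invented for illustration:

```python
def intersect4(ray_o, inv_d, boxes):
    """Slab-test one ray against four AABBs 'at once'.

    boxes: four (min, max) pairs of 3D points. In a four-wide BVH
    node this loop body is what the SIMD lanes compute in parallel;
    inv_d holds the precomputed reciprocal of the ray direction.
    """
    hits = []
    for bmin, bmax in boxes:
        tmin, tmax = 0.0, float("inf")
        for axis in range(3):
            t0 = (bmin[axis] - ray_o[axis]) * inv_d[axis]
            t1 = (bmax[axis] - ray_o[axis]) * inv_d[axis]
            if t0 > t1:
                t0, t1 = t1, t0
            tmin = max(tmin, t0)   # latest slab entry
            tmax = min(tmax, t1)   # earliest slab exit
        hits.append(tmin <= tmax)
    return hits

# Ray from the origin along (1, 1, 1); inv_d = 1 / direction per axis.
hits = intersect4((0.0, 0.0, 0.0), (1.0, 1.0, 1.0), [
    ((1, 1, 1), (2, 2, 2)),        # straddles the ray: hit
    ((5, 0, 0), (6, 1, 1)),        # slab intervals don't overlap: miss
    ((-2, -2, -2), (-1, -1, -1)),  # behind the origin: miss
    ((0, 0, 0), (3, 3, 3)),        # contains the origin: hit
])
print(hits)  # → [True, False, False, True]
```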


    Accelerating Ray Tracing using Constrained Tetrahedralizations

    COMPUTER GRAPHICS FORUM, Issue 4 2008
    Ares Lagae
    Abstract In this paper we introduce the constrained tetrahedralization as a new acceleration structure for ray tracing. A constrained tetrahedralization of a scene is a tetrahedralization that respects the faces of the scene geometry. The closest intersection of a ray with a scene is found by traversing this tetrahedralization along the ray, one tetrahedron at a time. We show that constrained tetrahedralizations are a viable alternative to current acceleration structures, and that they have a number of unique properties that set them apart from other acceleration structures: constrained tetrahedralizations are not hierarchical yet adaptive; the complexity of traversing them is a function of local geometric complexity rather than global geometric complexity; constrained tetrahedralizations support deforming geometry without any effort; and they have the potential to unify several data structures currently used in global illumination. [source]
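
    The traversal itself is simple to state: from the current cell, find the face the ray exits through and step to the neighbor across it. A 2D analogue (triangles instead of tetrahedra, over a hand-built two-triangle mesh) sketches the loop; the mesh and query are invented for illustration:

```python
def side(p, a, b):
    # Positive if p lies to the left of the directed line a -> b.
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

def crosses(p, q, a, b):
    # Proper crossing test for segments pq and ab.
    return ((side(a, p, q) > 0) != (side(b, p, q) > 0) and
            (side(p, a, b) > 0) != (side(q, a, b) > 0))

def walk(verts, tris, neighbors, start, p, q):
    """Visit triangles along segment p -> q, starting inside `start`.

    neighbors[t][i] is the triangle sharing the edge opposite
    vertex i of triangle t, or None on the boundary -- the 2D
    analogue of traversing a tetrahedralization one cell at a time.
    """
    visited, cur, prev = [start], start, None
    while True:
        stepped = False
        for i in range(3):
            a = verts[tris[cur][(i + 1) % 3]]
            b = verts[tris[cur][(i + 2) % 3]]
            nb = neighbors[cur][i]
            if nb is not None and nb != prev and crosses(p, q, a, b):
                prev, cur = cur, nb
                visited.append(cur)
                stepped = True
                break
        if not stepped:            # q lies in the current triangle
            return visited

# Unit square split along its diagonal into two triangles.
verts = [(0, 0), (1, 0), (1, 1), (0, 1)]
tris = [(0, 1, 2), (0, 2, 3)]
neighbors = [(None, 1, None), (None, None, 0)]
path = walk(verts, tris, neighbors, 0, (0.9, 0.1), (0.1, 0.9))
print(path)  # → [0, 1]
```

    As in the paper's 3D setting, the work done is proportional to the number of cells the segment actually passes through, not to the total scene complexity.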


    ReduceM: Interactive and Memory Efficient Ray Tracing of Large Models

    COMPUTER GRAPHICS FORUM, Issue 4 2008
    Christian Lauterbach
    We present a novel representation and algorithm, ReduceM, for memory efficient ray tracing of large scenes. ReduceM exploits the connectivity between triangles in a mesh and decomposes the model into triangle strips. We also describe a new stripification algorithm, Strip-RT, that can generate long strips with high spatial coherence. Our approach uses a two-level traversal algorithm for ray-primitive intersection. In practice, ReduceM can significantly reduce the storage overhead and ray trace massive models with hundreds of millions of triangles at interactive rates on desktop PCs with 4–8 GB of main memory. [source]
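
    The storage win from strips is easy to see: a strip of n + 2 indices encodes n triangles, versus 3n indices for an independent triangle list. A minimal decoder with the usual alternating winding and degenerate-triangle restarts (a generic sketch, not the ReduceM format):

```python
def strip_to_triangles(strip):
    """Decode an indexed triangle strip into individual triangles.

    Each new index forms a triangle with the previous two, with the
    winding flipped on every other triangle so all faces keep a
    consistent orientation.
    """
    tris = []
    for i in range(len(strip) - 2):
        a, b, c = strip[i], strip[i + 1], strip[i + 2]
        if a == b or b == c or a == c:
            continue                  # degenerate triangle used as a restart
        tris.append((a, b, c) if i % 2 == 0 else (b, a, c))
    return tris

decoded = strip_to_triangles([0, 1, 2, 3, 4])
print(decoded)  # → [(0, 1, 2), (2, 1, 3), (2, 3, 4)]
```

    Here 5 indices stand in for the 9 an independent triangle list would need; for long strips the ratio approaches 3x.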


    GPU-Based Nonlinear Ray Tracing

    COMPUTER GRAPHICS FORUM, Issue 3 2004
    Daniel Weiskopf
    In this paper, we present a mapping of nonlinear ray tracing to the GPU which avoids any data transfer back to main memory. The rendering process consists of the following parts: ray setup according to the camera parameters, ray integration, ray-object intersection, and local illumination. Bent rays are approximated by polygonal lines that are represented by textures. Ray integration is based on an iterative numerical solution of ordinary differential equations whose initial values are determined during ray setup. To improve the rendering performance, we propose acceleration techniques such as early ray termination and adaptive ray integration. Finally, we discuss a variety of applications that range from the visualization of dynamical systems to general relativistic visualization in astrophysics and the rendering of continuous refraction in media with varying density. Categories and Subject Descriptors (according to ACM CCS): I.3.3 [Computer Graphics]: Picture/Image Generation; I.3.7 [Computer Graphics]: Three-Dimensional Graphics and Realism [source]
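
    The ray-integration step can be sketched as stepping the coupled ODEs x' = v, v' = a(x) and collecting the polyline vertices. A toy 2D version with an invented constant "bending" acceleration, using simple Euler-style steps (velocity first, then position), rather than the paper's GPU solver:

```python
def integrate_ray(pos, vel, accel, dt=0.01, steps=200):
    """Integrate a bent ray as a polyline: x' = v, v' = a(x).

    The returned list of points is the polygonal approximation of
    the curved ray; in the GPU mapping such polylines are stored
    in textures.
    """
    points = [pos]
    for _ in range(steps):
        ax, ay = accel(pos)
        vel = (vel[0] + dt * ax, vel[1] + dt * ay)
        pos = (pos[0] + dt * vel[0], pos[1] + dt * vel[1])
        points.append(pos)
    return points

# Toy bending: constant downward acceleration, so the initially
# horizontal ray curves into (approximately) a parabola.
path = integrate_ray(pos=(0.0, 0.0), vel=(1.0, 0.0),
                     accel=lambda p: (0.0, -1.0))
print(path[-1])
```

    Adaptive ray integration, as proposed in the paper, would shrink dt where the acceleration field changes quickly instead of using a fixed step.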


    The Scalasca performance toolset architecture

    CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 6 2010
    Markus Geimer
    Abstract Scalasca is a performance toolset that has been specifically designed to analyze parallel application execution behavior on large-scale systems with many thousands of processors. It offers an incremental performance-analysis procedure that integrates runtime summaries with in-depth studies of concurrent behavior via event tracing, adopting a strategy of successively refined measurement configurations. Distinctive features are its ability to identify wait states in applications with very large numbers of processes and to combine these with efficiently summarized local measurements. In this article, we review the current toolset architecture, emphasizing its scalable design and the role of the different components in transforming raw measurement data into knowledge of application execution behavior. The scalability and effectiveness of Scalasca are then surveyed from experience measuring and analyzing real-world applications on a range of computer systems. Copyright © 2010 John Wiley & Sons, Ltd. [source]
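
    One analysis such trace-based tools run is the search for wait states, the classic example being the "late sender": a receiver that blocks in a receive call before the matching send is issued. A simplified illustration of that pattern with invented event tuples, not Scalasca's actual data model:

```python
def late_sender_wait(messages):
    """Compute per-message 'late sender' waiting time from trace events.

    messages: (send_time, recv_enter_time, recv_exit_time) per message,
    on a common clock. If the receiver entered the receive before the
    matching send was issued, the difference is time spent waiting.
    """
    return [max(0.0, send - enter) for send, enter, _ in messages]

waits = late_sender_wait([
    (5.0, 2.0, 6.0),   # receiver blocked for 3.0 time units
    (1.0, 4.0, 4.5),   # sender was early: no waiting
])
print(waits)  # → [3.0, 0.0]
```

    Summing such waiting times per process and call path is the kind of efficiently summarized local measurement the toolset combines at scale.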


    Clock synchronization in Cell/B.E. traces

    CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 14 2009
    M. Biberstein
    Abstract Cell/B.E. is a heterogeneous multicore processor that was designed for the efficient execution of parallel and vectorizable applications with high computation and memory requirements. The transition to multicores introduces the challenge of providing tools that help programmers tune the code running on these architectures. Tracing tools, in particular, often help locate performance problems related to thread and process communication. A major impediment to implementing tracing on Cell is the absence of a common clock that can be accessed at low cost from all cores. The OS clock is costly to access from the auxiliary cores and the hardware timers cannot be simultaneously set on all the cores. In this paper, we describe an offline trace analysis algorithm that assigns wall-clock time to trace records based on their thread-local time stamps and event order. Our experiments on several Cell SDK workloads show that the indeterminism in assigning wall-clock time to events is low, on average 20–40 clock ticks (translating into 1.4–2.8 µs on the system used in our experiments). We also show how various practical problems, such as the imprecision of time measurement, can be overcome. Copyright © 2009 John Wiley & Sons, Ltd. [source]
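
    The core of such offline alignment is that event order bounds the unknown clock offsets: a message cannot be received before it is sent. A minimal two-core sketch with invented timestamps, illustrating the style of reasoning rather than the paper's algorithm:

```python
def estimate_offset(a_to_b, b_to_a):
    """Bound core B's clock offset from message orderings.

    a_to_b: (send time on A's clock, receive time on B's clock) pairs.
    b_to_a: (send time on B's clock, receive time on A's clock) pairs.
    With true_B = local_B + offset, each A->B message forces
    offset > send_A - recv_B, and each B->A message forces
    offset < recv_A - send_B. Return the midpoint of the feasible
    interval together with its bounds.
    """
    lo = max(sa - rb for sa, rb in a_to_b)
    hi = min(ra - sb for sb, ra in b_to_a)
    assert lo <= hi, "inconsistent event ordering"
    return 0.5 * (lo + hi), (lo, hi)

# B's clock runs about 100 ticks behind A's.
offset, bounds = estimate_offset(
    a_to_b=[(1000, 905), (2000, 1910)],
    b_to_a=[(950, 1060), (1950, 2055)],
)
print(offset, bounds)  # → 100.0 (95, 105)
```

    The width of the feasible interval is exactly the indeterminism the paper measures (tens of clock ticks on its workloads).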


    Individual-based Computational Modeling of Smallpox Epidemic Control Strategies

    ACADEMIC EMERGENCY MEDICINE, Issue 11 2006
    Donald S. Burke MD
    In response to concerns about possible bioterrorism, the authors developed an individual-based (or "agent-based") computational model of smallpox epidemic transmission and control. The model explicitly represents an "artificial society" of individual human beings, each implemented as a distinct object, or data structure in a computer program. These agents interact locally with one another in code-represented social units such as homes, workplaces, schools, and hospitals. Over many iterations, these microinteractions generate large-scale macroscopic phenomena of fundamental interest such as the course of an epidemic in space and time. Model variables (incubation periods, clinical disease expression, contagiousness, and physical mobility) were assigned following realistic values agreed on by an advisory group of experts on smallpox. Eight response scenarios were evaluated at two epidemic scales, one being an introduction of ten smallpox cases into a 6,000-person town and the other an introduction of 500 smallpox cases into a 50,000-person town. The modeling exercise showed that contact tracing and vaccination of household, workplace, and school contacts, along with prompt reactive vaccination of hospital workers and isolation of diagnosed cases, could contain smallpox at both epidemic scales examined. [source]
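
    The modeling style (though not the study's actual smallpox parameterization) can be sketched with a toy agent-based outbreak in which agents live in households and contact tracing isolates detected cases and vaccinates their housemates; all parameters below are invented:

```python
import random

def simulate(trace, n_homes=200, home_size=5, p_transmit=0.3,
             p_detect=0.5, p_recover=0.2, days=60, seed=1):
    """Minimal agent-based outbreak with household contact tracing.

    Each day every infectious agent exposes its housemates plus one
    random outside contact. With tracing enabled, a detected case is
    isolated and its susceptible housemates are vaccinated.
    """
    rng = random.Random(seed)
    n = n_homes * home_size
    home = [i // home_size for i in range(n)]
    members = [[] for _ in range(n_homes)]
    for i in range(n):
        members[home[i]].append(i)
    state = ["S"] * n                   # S, I, R (removed), V (vaccinated)
    ever_infected = 0
    for i in rng.sample(range(n), 10):  # index cases
        state[i] = "I"
        ever_infected += 1
    for _ in range(days):
        for i in [j for j in range(n) if state[j] == "I"]:  # today's cases
            mates = [j for j in members[home[i]] if j != i]
            for j in mates + [rng.randrange(n)]:
                if state[j] == "S" and rng.random() < p_transmit:
                    state[j] = "I"
                    ever_infected += 1
            if trace and rng.random() < p_detect:
                state[i] = "R"          # isolate the detected case
                for j in mates:
                    if state[j] == "S":
                        state[j] = "V"  # vaccinate household contacts
            elif rng.random() < p_recover:
                state[i] = "R"
    return ever_infected

contained = simulate(trace=True)
uncontrolled = simulate(trace=False)
print(contained, uncontrolled)
```

    With tracing on, cases are removed quickly and their households are protected, which typically yields far fewer total infections than the uncontrolled run; the real model adds workplaces, schools, hospitals, and expert-validated disease parameters.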


    Management of cutaneous tuberculosis

    DERMATOLOGIC THERAPY, Issue 3 2008
    Evangeline B Handog
    ABSTRACT: Cutaneous tuberculosis (TB) is an extrapulmonary form of tuberculosis, which may be classified based on the immunologic state of the host. Chemotherapy still remains the treatment of choice. The management of cutaneous TB follows the same guidelines as that of TB of other organs, which can be treated with a short course four-agent chemotherapeutic regimen given for 2 months followed by a two-drug regimen for the next 4 months. This chapter highlights current treatment recommendations for cutaneous TB. The important factors to consider in the choice of optimal treatment include the type of cutaneous involvement, stage of the disease, level of immunity, and general condition of the patient. The highest priority in any cutaneous TB control program is the proper, accurate, and rapid detection of cases and the availability of chemotherapy to all tuberculosis patients until cure. Contact tracing is also an important component of efficient tuberculosis control. [source]


    R-wave Amplitude in Lead II of an Electrocardiograph Correlates with Central Hypovolemia in Human Beings

    ACADEMIC EMERGENCY MEDICINE, Issue 10 2006
    John G. McManus MD
    Abstract Objectives Previous animal and human experiments have suggested that reduction in central blood volume either increases or decreases the amplitude of R waves in various electrocardiograph (ECG) leads depending on underlying pathophysiology. In this investigation, we used graded central hypovolemia in adult volunteer subjects to test the hypothesis that moderate reductions in central blood volume increase R-wave amplitude in lead II of an ECG. Methods A four-lead ECG tracing, heart rate (HR), estimated stroke volume (SV), systolic blood pressure, diastolic blood pressure, and mean arterial pressure were measured during baseline supine rest and during progressive reductions of central blood volume to an estimated volume loss of >1,000 mL with application of lower-body negative pressure (LBNP) in 13 healthy human volunteer subjects. Results Lower-body negative pressure resulted in a significant progressive reduction in central blood volume, as indicated by a maximal decrease of 65% in SV and maximal elevation of 56% in HR from baseline to -60 mm Hg LBNP. R-wave amplitude increased (p < 0.0001) linearly with progressive LBNP. The amalgamated correlation (R2) between average stroke volume and average R-wave amplitude at each LBNP stage was -0.989. Conclusions These results support our hypothesis that reduction of central blood volume in human beings is associated with increased R-wave amplitude in lead II of an ECG. [source]
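
    The reported association is an ordinary correlation between stage-averaged stroke volume and R-wave amplitude across LBNP levels. A quick sketch with hypothetical stage values (not the study's data) shows how such a strongly negative coefficient arises when one variable falls as the other rises:

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Hypothetical stage-averaged values across six LBNP levels,
# NOT the study's measurements:
stroke_volume = [95, 88, 78, 65, 50, 33]             # mL, falling with LBNP
r_amplitude = [0.80, 0.83, 0.88, 0.94, 1.01, 1.10]   # mV, rising with LBNP

r = pearson_r(stroke_volume, r_amplitude)
print(round(r, 3))  # strongly negative, the same direction as the study's -0.989
```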


    Progressive neurogenesis defines lateralis somatotopy

    DEVELOPMENTAL DYNAMICS, Issue 7 2010
    Jesús Pujol-Martí
    Abstract Fishes and amphibians localize hydromechanical variations along their bodies using the lateral-line sensory system. This is possible because the spatial distribution of neuromasts is represented in the hindbrain by a somatotopic organization of the lateralis afferent neurons' central projections. The mechanisms that establish lateralis somatotopy are not known. Using BAPTI and neuronal tracing in the zebrafish, we demonstrate growth anisotropy of the posterior lateralis ganglion. We characterized a new transgenic line for in vivo imaging to show that although peripheral growth-cone structure adumbrates somatotopy, the order of neurogenesis represents a more accurate predictor of the position of a neuron's central axon along the somatotopic axis in the hindbrain. We conclude that progressive neurogenesis defines lateralis somatotopy. Developmental Dynamics 239:1919,1930, 2010. © 2010 Wiley-Liss, Inc. [source]