Generation
Selected Abstracts

INNOVATION AND COMPETITION IN GENERATION AND RETAIL POWER MARKETS
ECONOMIC AFFAIRS, Issue 2 2010. Elizabeth Hooper
There has been considerable merger activity in EU energy markets in recent years. It could be argued that competition authorities should be required to take into account potential innovation effects of mergers. In the UK, regulators are now trying to achieve multiple objectives within the current framework. There is a danger that if markets are expected to deliver mutually incompatible objectives they will be unable to achieve any of them.

THE INTERSECTIONS OF GENDER AND GENERATION IN ALBANIAN MIGRATION, REMITTANCES AND TRANSNATIONAL CARE
GEOGRAFISKA ANNALER SERIES B: HUMAN GEOGRAPHY, Issue 1 2009. Russell King
ABSTRACT. The Albanian case represents the most dramatic instance of post-communist migration: about one million Albanians, a quarter of the country's total population, are now living abroad, most of them in Greece and Italy, with the UK becoming increasingly popular since the late 1990s. This paper draws on three research projects based on fieldwork in Italy, Greece, the UK and Albania. These projects have involved in-depth interviews with Albanian migrants in several cities, as well as with migrant-sending households in different parts of Albania. In this paper we draw out those findings which shed light on the intersections of gender and generations in three aspects of the migration process: the emigration itself, the sending and receiving of remittances, and the care of family members (mainly the migrants' elderly parents) who remain in Albania. Theoretically, we draw on the notion of 'gendered geographies of power' and on how spatial change and separation through migration reshapes gender and generational relations.
We find that, at all stages of the migration, Albanian migrants are faced with conflicting and confusing models of gender, behavioural and generational norms, as well as unresolved questions about their legal status and the likely economic, social and political developments in Albania, which make their future life plans uncertain. Legal barriers often prevent migrants and their families from enjoying the kinds of transnational family lives they would like.

NEW GENERATION OF HEALTHY SNACK FOOD BY SUPERCRITICAL FLUID EXTRUSION
JOURNAL OF FOOD PROCESSING AND PRESERVATION, Issue 2 2010. K.Y. CHO
ABSTRACT. A supercritical fluid extrusion (SCFX) process has been successfully developed for the production of a novel healthy snack containing 40-60 wt% protein with a unique porous structure and texture. The supercritical carbon dioxide (SC-CO2) injection rate and the product temperature at the die were found to be critical for controlling the expansion and texture of the final product. Maximum cross-sectional expansion was obtained at 0.3 wt% added SC-CO2, whereas a more uniform internal structure was achieved at the 0.7 wt% SC-CO2 level. As whey protein concentrate (80 wt%) concentration was increased from 52.8 to 78.2 wt% in the formulation, the cross-sectional expansion of baked and fried products increased by 65.8 and 44.4%, respectively. It was observed that the lower viscosity of whey protein compared with starch decreased expansion but helped enhance further expansion during post-extrusion drying. The findings showed that an extrusion process at a temperature below the protein denaturation temperature using SC-CO2 can help to prevent hard texture due to the thermosetting property of whey protein and to create a uniformly expanded structure. The textural properties of SCFX chips were comparable to commercial extruded or fried chip products.

PRACTICAL APPLICATIONS. The American snack market is one of the fastest-growing markets in the world as snacking becomes more popular.
Because of increasing concerns about health, there is also an increasing demand for new healthy snacks as an alternative to fried starch-based snacks with low nutrient density. This study shows the potential of supercritical fluid extrusion (SCFX) technology for the production of healthy snack foods containing whey protein. SCFX chips had a uniform cellular microstructure that cannot be obtained using conventional steam-based extrusion. As supercritical carbon dioxide can deliver certain flavors, an expanded snack not only with high nutrient density and unique texture but also with encapsulated flavors can be produced using the SCFX process and marketed as a novel healthy snack.

GENERATION OF BIOLUMINESCENT MORGANELLA MORGANII AND ITS POTENTIAL USAGE IN DETERMINATION OF GROWTH LIMITS AND HISTAMINE PRODUCTION
JOURNAL OF FOOD SAFETY, Issue 2 2009. MEHDI ZAREI
ABSTRACT. A mini-Tn5 promoter probe carrying the intact lux operon of Photorhabdus luminescens (pUT mini-Tn5 luxCDABE), which allowed measurement of light output without the addition of exogenous substrate, was constructed. It was used to create a pool of chromosomally lux-marked strains of Morganella morganii. Plasmid-mediated expression of bioluminescence in M. morganii was also assessed using plasmid pT7-3 luxCDABE. No significant differences in the growth and histamine formation characteristics of the lux-marked strains and the wild-type M. morganii strain were observed. A luminescent strain of M. morganii was used in experiments in which the correlation between light output, viable cell count and histamine formation was assessed. During the exponential growth phase, a positive linear correlation was observed between these three parameters in trypticase soy broth-histidine medium at 37°C. It was demonstrated that expression of bioluminescence did not have a significant effect on either growth rate or histamine production.
Thus, the measurement of bioluminescence was found to be a simple, fast and reliable method for the determination of viable cell count and histamine content.

PRACTICAL APPLICATIONS. Constructing predictive models in microbiology requires a large amount of data on the relevant factors. Commonly used traditional methods of counting viable cells and measuring histamine, e.g., to model the growth limits of M. morganii as a function of different intrinsic and extrinsic factors, are time-consuming and laborious, and require a lot of laboratory space and materials. According to the results of this research, measurement of bioluminescence is a simple, fast and reliable method for the determination of viable cell count and histamine content during the exponential growth phase. Thus, it can be used as a labor- and material-saving selective data capture method for constructing predictive models in many different areas.

RELATIONSHIP BETWEEN ORGANIC MATTER, SULPHUR AND PHOSPHATE CONTENTS IN UPPER CRETACEOUS MARINE CARBONATES (KARABOGAZ FORMATION, SE TURKEY): IMPLICATIONS FOR EARLY OIL GENERATION
JOURNAL OF PETROLEUM GEOLOGY, Issue 4 2010. S. Inan
In this paper, we discuss the relationship between the organic matter, sulphur and phosphate contents of Upper Cretaceous marine carbonates (Karabogaz Formation) in the Adiyaman Petroleum Province of SE Turkey. The results of organic geochemical analyses of core samples obtained from the Karabogaz Formation suggest that phosphate deposition occurred in settings where the water column was oxic to sub-oxic. However, the preservation of organic matter was favoured in anoxic environments. Moreover, the presence of sulphur (especially sulphur incorporated into kerogen) in organic matter-rich layers led to early oil generation. The results of stepwise py-gc analyses are consistent with a model in which, with increasing maturity, S-S and C-S bonds are the first to be eliminated from the macromolecular kerogen structure.
Study of the maturity evolution of S-rich kerogen by laboratory pyrolysis implies that marginally mature and/or mature kerogen in the Karabogaz Formation, which may be classified as classic "Type II" kerogen, was most probably Type II/S at lower maturity stages. This enabled oil generation to occur at relatively shallow burial depths and relatively early stages of maturation. It is reasonable to conclude that Type II/S kerogen, overlooked in previous studies, was abundant in TOC-rich intervals in the Karabogaz Formation. Early generation (and expulsion) from Type II/S kerogen may have sourced the sulphur-rich oils in the Adiyaman area oilfields.

KINETICS OF HYDROCARBON GAS GENERATION FROM MARINE KEROGEN AND OIL: IMPLICATIONS FOR THE ORIGIN OF NATURAL GASES IN THE HETIANHE GASFIELD, TARIM BASIN, NW CHINA
JOURNAL OF PETROLEUM GEOLOGY, Issue 4 2007. Yunpeng Wang
In this paper we derive kinetic parameters for the generation of gaseous hydrocarbons (C1-5) and methane (C1) from closed-system laboratory pyrolysis of selected samples of marine kerogen and oil from the SW Tarim Basin. The activation energy distributions for the generation of both C1-5 (Ea = 59-72 kcal, A = 1.0×10¹⁴ s⁻¹) and C1 (Ea = 61-78 kcal, A = 6.06×10¹⁴ s⁻¹) hydrocarbons from the marine oil are narrower than those for the generation of these hydrocarbons from marine kerogen (Ea = 50-74 kcal, A = 1.0×10¹⁴ s⁻¹ for C1-5; and Ea = 48-72 kcal, A = 3.9×10¹³ s⁻¹ for C1, respectively). Using these kinetic parameters, both the yields and timings of C1-5 and C1 hydrocarbons generated from Cambrian source rocks and from in-reservoir cracking of oil in Ordovician strata were predicted for selected wells along a north-south profile in the SW of the basin. Thermodynamic conditions for the cracking of oil and kerogen were modelled within the context of the geological framework.
It is suggested that marine kerogen began to crack at temperatures of around 120°C (or 0.8 %Ro) and entered the gas window at 138°C (or 1.05 %Ro), whereas the marine oil began to crack at about 140°C (or 1.1 %Ro) and entered the gas window at 158°C (or 1.6 %Ro). The main geological controls identified for gas accumulations in the Bachu Arch (Southwest Depression, SW Tarim Basin) include the remaining gas potential following Caledonian uplift; oil trapping and preservation in basal Ordovician strata; the extent of breaching of Ordovician reservoirs; and whether reservoir burial depths are sufficiently deep for oil cracking to have occurred. In the Maigaiti Slope and Southwest Depression, the timing of gas generation was later than in the Bachu Arch, with much higher yields and generation rates, and hence better prospects for gas exploration. It appears from the gas generation kinetics that the primary source for the gases in the Hetianhe gasfield was the Southwest Depression.

A REVIEW OF GEOLOGICAL DATA THAT CONFLICT WITH THE PARADIGM OF CATAGENIC GENERATION AND MIGRATION OF OIL
JOURNAL OF PETROLEUM GEOLOGY, Issue 3 2005. H. Hugh Wilson
The majority of petroleum geologists today agree that the complex problems that surround the origin, generation, migration and accumulation of hydrocarbons can be resolved by accepting the geochemical conclusion that the process originates by catagenic generation in deeply-buried, organically-rich source rocks. These limited source rock intervals are believed to expel hydrocarbons when they reach organic maturity in oil kitchens. The expelled oil and gas then follow migration pathways to traps at shallower levels. However, there are major geological obstacles that cast doubt upon this interpretation.
The restriction of the source rock to a few organically rich levels in a basin forces the conclusion that the basin plumbing system is leaky and allows secondary horizontal and vertical migration through great thicknesses of consolidated sedimentary rocks in which there are numerous permeability barriers that are known to effectively prevent hydrocarbon escape from traps. The sourcing of lenticular traps points to the enclosing impermeable envelope as the logical origin of the trapped hydrocarbons. The linchpin of the catagenic theory of hydrocarbon origin is the expulsion mechanism from deeply-buried consolidated source rock under high confining pressures. This mechanism is not understood and is termed an "enigma". Assuming that expulsion does occur, the pathways taken by the hydrocarbons to waiting traps can be ascertained by computer modelling of the basin. However, subsurface and field geological support for purported migration pathways has yet to be provided. Many oilfield studies have shown that oil and gas are preferentially trapped in synchronous highs that were formed during, or very shortly after, the deposition of the charged reservoir. An unresolved problem is how catagenically generated hydrocarbons, expelled during a long-drawn-out maturation period, can have filled synchronous highs but have avoided later traps along the assumed migration pathways. From many oilfield studies, it has also been shown that the presence of hydrocarbons inhibits diagenesis and compaction of the reservoir rock. This "Füchtbauer effect" points not only to the early charging of clastic and carbonate reservoirs, but also to the development of permeability barriers below the early-formed accumulations. These barriers would prevent later hydrocarbon additions during the supposed extended period of expulsion from an oil kitchen. Early-formed traps that have been sealed diagenetically will retain their charge even if the trap is opened by later structural tilting.
Diagenetic traps have been discovered in clastic and carbonate provinces, but their recognition as viable exploration targets is discouraged by present-day assumptions of late hydrocarbon generation and a leaky basin plumbing system. Because there are so many geological realities that cast doubt upon the assumptions that devolve from the paradigm of catagenic generation, the alternative concept of early biogenic generation and accumulation of immature oil, with in-reservoir cracking during burial, is again worthy of serious consideration. This concept envisages hydrocarbon generation by bacterial activity in many anoxic environments and the charging of synchronous highs from adjacent sources. The resolution of the fundamental problem of hydrocarbon generation and accumulation, which is critical to exploration strategies, should be sought in the light of a thorough knowledge of the geologic factors involved, rather than by computer modelling which may be guided by questionable geochemical assumptions.

SO YOU ALREADY HAVE A SURVEY DATABASE? A SEVEN-STEP METHODOLOGY FOR THEORY BUILDING FROM SURVEY DATABASES: AN ILLUSTRATION FROM INCREMENTAL INNOVATION GENERATION IN BUYER-SELLER RELATIONSHIPS
JOURNAL OF SUPPLY CHAIN MANAGEMENT, Issue 4 2010. SUBROTO ROY
Across business disciplines, the importance of database research for theory testing continues to increase. The availability of data has also increased, though methods to analyze and interpret these data lag. This research proposes a method for extracting strong measures from survey databases through a progression from qualitative to quantitative techniques. To test the proposed method, this study uses the Industrial Marketing and Purchasing (IMP) survey database, which includes data from firms in several European countries. The proposed method consists of two phases and seven steps, as illustrated in the context of the firm's incremental innovation generation in buyer-seller relationships.
This systematic progression moves from a broad but valid empirical case study to the development of a narrow and reliable measure of incremental innovation generation in the IMP database. The proposed method can use supply chain survey databases for theory development without requiring primary data collection, assuming certain conditions.

FINDING FAITH: THE SPIRITUAL QUEST OF THE POST-BOOMER GENERATION by Richard Flory and Donald E. Miller
NEW BLACKFRIARS, Issue 1028 2009. KIERAN FLANAGAN
No abstract is available for this article.

DYNAMIC SEARCH SPACE TRANSFORMATIONS FOR SOFTWARE TEST DATA GENERATION
COMPUTATIONAL INTELLIGENCE, Issue 1 2008. Ramón Sagarna
Among the tasks in software testing, test data generation is particularly difficult and costly. In recent years, several approaches that use metaheuristic search techniques to automatically obtain the test inputs have been proposed. Although work in this field is very active, little attention has been paid to the selection of an appropriate search space. The present work describes an alternative approach to this issue. More precisely, two approaches which employ an Estimation of Distribution Algorithm as the metaheuristic technique are explained. In both cases, different regions are considered in the search for the test inputs. Moreover, to depart from a region near to the one containing the optimum, the definition of the initial search space incorporates static information extracted from the source code of the software under test. If this information is not enough to complete the definition, then a grid search method is used. According to the results of the experiments conducted, it is concluded that this is a promising option that can be used to enhance the test data generation process.

OPTIMAL DISCOUNTING IN CONTROL PROBLEMS THAT SPAN MULTIPLE GENERATIONS
NATURAL RESOURCE MODELING, Issue 3 2005. FRANK CALIENDO
ABSTRACT.
The principal contribution of this paper is the linking together of separate control problems across multiple generations using the bequest motive, intergenerational altruism, rational expectations, and solution boundary conditions. We demonstrate that discounting at the market rate of interest is an endogenous characteristic of a general equilibrium, optimal control problem that spans multiple generations. Within the confines of our model, we prove that it is optimal to discount at the market rate of interest the social benefits to distant generations from immediate cleanup at toxic waste sites if the current generation that bears the cleanup cost is perfectly altruistic towards future generations. Also, we show that this result holds for alternative assumptions regarding pure time preference. Moreover, the result holds regardless of whether selfish interim generations attempt to undo the provisions made for distant generations. In our distortion-free deterministic model, the evidence for intergenerational discounting at the market rate of interest is compelling.

BISHOPS, WIVES AND CHILDREN: SPIRITUAL CAPITAL ACROSS THE GENERATIONS by Douglas J. Davies and Mathew Guest
NEW BLACKFRIARS, Issue 1019 2008. KIERAN FLANAGAN
No abstract is available for this article.

Generation of tree movement sound effects
COMPUTER ANIMATION AND VIRTUAL WORLDS (PREV: JNL OF VISUALISATION & COMPUTER ANIMATION), Issue 5 2005. Katsutsugu Matsuyama
Abstract. This paper presents a method for automatically generating sound effects for an animation of branches and leaves moving in the wind. Each tree is divided into branches and leaves, and an independent sound effect generation process is employed for each element. The individual results are then compounded into one sound effect. For the branches, we employ an approach based on the frequencies of experimentally obtained Karman vortex streets.
For the leaves, we use the leaf blade state as the input and assume a virtual musical instrument that uses wave tables as the sound source. All computations can be performed independently for each frame step. Therefore, each frame step can be executed on completion of the animation step. The results of the implementation of the approach are presented, and it is shown that the process offers the possibility of real-time operation through the use of parallel computing techniques. Copyright © 2005 John Wiley & Sons, Ltd.

Generation of a virtual reality-based automotive driving training system for CAD education
COMPUTER APPLICATIONS IN ENGINEERING EDUCATION, Issue 2 2009. Janus Liang
Abstract. Designing and constructing a virtual reality-based system is useful for educating students about scenario planning, geometric modeling and computer graphics. In particular, students are exposed to the practical issues surrounding topics such as geometric modeling, rendering, collision detection, model animation and graphical design. Meanwhile, building an application system exposes students to the real-world side of software engineering that they are typically shielded from in the traditional computer class. This study describes the experience of instructing "Computer-aided Industrial Design" and "OOP", two introductory classes that focus on designing and generating a VR-based system within a single semester, followed by "VR System", an advanced course in the next semester. This study emphasizes the continuing evolution in the training and educational needs of students of CAD systems. This study breaks down an automobile driving training system into different components that are suitable for individual student projects and discusses the use of modern graphical design tools such as 3ds MAX for artistic design in this system.
The conclusion of this study proposes a rough schedule for developing a VR-based system during the course of a semester, and an overview is given of the concept of a virtual reality-based design and construction system that is being developed. © 2008 Wiley Periodicals, Inc. Comput Appl Eng Educ 17: 148-166, 2009; Published online in Wiley InterScience (www.interscience.wiley.com); DOI 10.1002/cae.20178

Application of Visual Analytics for Thermal State Management in Large Data Centres
COMPUTER GRAPHICS FORUM, Issue 6 2010. M. C. Hao
I.3.3 [Computer Graphics]: Picture/Image Generation - Display Algorithms; H.5.0 [Information Systems]: Information Interfaces and Presentation - General
Abstract. Today's large data centres are the computational hubs of the next generation of IT services. With the advent of dynamic smart cooling and rack-level sensing, the need for visual data exploration is growing. If administrators know the rack-level thermal state changes and catch problems in real time, energy consumption can be greatly reduced. In this paper, we apply a cell-based spatio-temporal overall view with high-resolution time series to simultaneously analyze complex thermal state changes over time across hundreds of racks. We employ cell-based visualization techniques for troubleshooting and abnormal state detection. These techniques are based on the detection of sensor temperature relations and events to help identify the root causes of problems. In order to optimize data centre cooling system performance, we derive new non-overlapped scatter plots to visualize the correlations between temperatures and chiller utilization. All these techniques have been used successfully to monitor various time-critical thermal states in real-world large-scale production data centres and to derive cooling policies. We are starting to embed these visualization techniques into a handheld device to add mobile monitoring capability.
Time-Adaptive Lines for the Interactive Visualization of Unsteady Flow Data Sets
COMPUTER GRAPHICS FORUM, Issue 8 2009. N. Cuntz
I.3.3 [Computer Graphics]: Line and Curve Generation; I.3.1 [Computer Graphics]: Parallel Processing
Abstract. The quest for the ideal flow visualization reveals two major challenges: interactivity and accuracy. Interactivity stands for explorative capabilities and real-time control. Accuracy is a prerequisite for every professional visualization in order to provide a reliable base for analysis of a data set. Geometric flow visualization has a long tradition and comes in very different flavors. Among these, stream, path and streak lines are known to be very useful for both 2D and 3D flows. Despite their importance in practice, appropriate algorithms suited to contemporary hardware are rare. In particular, the adaptive construction of the different line types has not been sufficiently studied. This study provides an in-depth presentation and discussion of stream, path and streak lines. Two algorithms are proposed for efficiently and accurately generating these lines using modern graphics hardware, each including a scheme for adaptive time-stepping. The adaptivity for stream and path lines is achieved through a new processing idea we call 'selective transform feedback'. The adaptivity for streak lines combines adaptive time-stepping and a geometric refinement of the curve itself. Our visualization is applied, among others, to a data set representing a simulated typhoon. The storage as a set of 3D textures requires special attention. Both algorithms explicitly support this storage, as well as the use of precomputed adaptivity information.
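The adaptive time-stepping idea in the abstract above can be illustrated outside the GPU setting. The sketch below is a plain NumPy version of the adaptivity scheme only, not the paper's 'selective transform feedback' mechanism, and the unsteady velocity field is invented for the example: a particle is advected along a path line while an embedded Euler/midpoint pair estimates the local error and grows or shrinks the step accordingly.

```python
import numpy as np

def velocity(p, t):
    # Hypothetical unsteady 2D field (a vortex whose centre orbits the
    # origin); a stand-in for real simulation data such as the typhoon set.
    cx, cy = 0.3 * np.cos(t), 0.3 * np.sin(t)
    return np.array([-(p[1] - cy), p[0] - cx])

def path_line(p0, t0, t1, tol=1e-5, h=0.01):
    """Trace a path line with adaptive time-stepping.

    An embedded Euler/midpoint pair (orders 1 and 2) gives a local error
    estimate; the step size h is grown or shrunk to keep it near `tol`.
    """
    p, t = np.asarray(p0, dtype=float), t0
    pts = [p.copy()]
    while t1 - t > 1e-9:
        h = min(h, t1 - t)
        v0 = velocity(p, t)
        p_euler = p + h * v0                                      # order 1
        p_mid = p + h * velocity(p + 0.5 * h * v0, t + 0.5 * h)   # order 2
        err = np.linalg.norm(p_mid - p_euler)
        if err <= tol or h <= 1e-6:      # accept (with a minimum-step guard)
            p, t = p_mid, t + h
            pts.append(p.copy())
        # Standard step-size controller for an order-2 embedded pair.
        h *= min(2.0, max(0.2, 0.9 * np.sqrt(tol / max(err, 1e-12))))
    return np.array(pts)

curve = path_line((1.0, 0.0), 0.0, 2.0 * np.pi)
print(curve.shape)
```

In the paper this control runs per line vertex on graphics hardware; the controller logic itself is the same wherever it executes.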
Replica Exchange Light Transport
COMPUTER GRAPHICS FORUM, Issue 8 2009. Shinya Kitaoka
I.3.7 [Computer Graphics]: Three-Dimensional Graphics and Realism; I.3.3 [Computer Graphics]: Picture/Image Generation
Abstract. We solve the light transport problem by introducing a novel unbiased Monte Carlo algorithm called replica exchange light transport, inspired by the replica exchange Monte Carlo method used in the fields of computational physics and statistical information processing. The replica exchange Monte Carlo method is a sampling technique whose operation resembles simulated annealing in optimization algorithms, using a set of sampling distributions. We apply it to the solution of light transport integration by extending the probability density function of an integrand of the integration to a set of distributions. That set of distributions is composed of combinations of the path densities of different path generation types: uniform distributions in the integral domain, explicit and implicit paths in light (particle/photon) tracing, indirect paths in bidirectional path tracing, explicit and implicit paths in path tracing, and implicit caustics paths seen through specular surfaces including the delta function in path tracing. The replica exchange light transport algorithm generates a sequence of path samples from each distribution and samples the simultaneous distribution of those distributions as a stationary distribution by using the Markov chain Monte Carlo method. The algorithm then combines the obtained path samples from each distribution using multiple importance sampling. We compare the images generated with our algorithm to those generated with bidirectional path tracing and Metropolis light transport based on the primary sample space. Our proposed algorithm has better convergence properties than bidirectional path tracing and Metropolis light transport, and it is easy to implement by extending Metropolis light transport.
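The replica exchange Monte Carlo idea that the abstract above builds on can be shown in miniature: several Metropolis chains sample tempered copies of the same target, and occasional state swaps between adjacent temperatures let hot, easy-to-mix chains feed modes to the cold one. The sketch below is that general mechanism only, not the light transport algorithm; the bimodal target and the temperature ladder are invented for the example.

```python
import math
import random

def target(x):
    # Hypothetical bimodal, unnormalized density; a stand-in for a
    # hard-to-sample integrand with isolated modes.
    return math.exp(-(x - 3.0) ** 2) + math.exp(-(x + 3.0) ** 2)

def replica_exchange(n_steps=20000, betas=(1.0, 0.5, 0.2, 0.05), seed=1):
    """Metropolis sampling of target(x)**beta per replica, with
    periodic state swaps between adjacent temperatures."""
    rng = random.Random(seed)
    xs = [0.0] * len(betas)            # one chain state per distribution
    samples = []                       # output: samples from the beta=1 chain
    for step in range(n_steps):
        # Ordinary Metropolis update within each replica.
        for i, beta in enumerate(betas):
            prop = xs[i] + rng.gauss(0.0, 1.0)
            accept_prob = (target(prop) / target(xs[i])) ** beta
            if rng.random() < accept_prob:
                xs[i] = prop
        # Replica exchange: propose swapping the states of two
        # adjacent replicas; acceptance keeps every chain's
        # stationary distribution intact (detailed balance).
        if step % 10 == 0:
            i = rng.randrange(len(betas) - 1)
            ratio = (target(xs[i + 1]) / target(xs[i])) ** (betas[i] - betas[i + 1])
            if rng.random() < min(1.0, ratio):
                xs[i], xs[i + 1] = xs[i + 1], xs[i]
        samples.append(xs[0])
    return samples

samples = replica_exchange()
# Fraction of cold-chain samples near the right-hand mode.
frac_right = sum(s > 0 for s in samples) / len(samples)
print(round(frac_right, 2))
```

A lone Metropolis chain at beta = 1 would tend to get stuck in one of the two modes; the swaps are what move the cold chain between them, which is the same role the exchanged path distributions play in the renderer.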
Flow-Based Automatic Generation of Hybrid Picture Mazes
COMPUTER GRAPHICS FORUM, Issue 7 2009. Fernando J. Wong
Abstract. A method for automatically generating a picture maze from two different images is introduced in this paper. The process begins with the extraction of salient contours and edge tangent flow information from the primary image in order to build the overall maze. Thus, mazes with passages flowing in the main edge directions and walls that effectively represent an abstract version of the primary image can be successfully created. Furthermore, our proposed approach makes it possible to use the solution path as a means of illustrating the main features of the secondary image, while attempting to keep its image motif concealed until the maze has finally been solved. The contour features and intensity of the secondary image are also incorporated into our method in order to determine the areas of the maze to be shaded by allowing the solution path to go through them. Moreover, an experiment has been conducted to confirm that solution paths can be successfully hidden from the participants in the mazes generated using our method.

Automatic Generation of Structure Preserving Multiresolution Models
COMPUTER GRAPHICS FORUM, Issue 3 2005. M. Marinov
First page of article.

Fast Summed-Area Table Generation and its Applications
COMPUTER GRAPHICS FORUM, Issue 3 2005. Justin Hensley
First page of article.

RenderBots: Multi-Agent Systems for Direct Image Generation
COMPUTER GRAPHICS FORUM, Issue 2 2005. Stefan Schlechtweg
Abstract. The term stroke-based rendering collectively describes techniques where images are generated from elements that are usually larger than a pixel. These techniques lend themselves well to rendering artistic styles such as stippling and hatching. This paper presents a novel approach for stroke-based rendering that exploits multi-agent systems.
RenderBots are individual agents, each of which in general represents one stroke. They form a multi-agent system and undergo a simulation to distribute themselves in the environment. The environment consists of a source image and possibly additional G-buffers. The final image is created when the simulation is finished by having each RenderBot execute its painting function. RenderBot classes differ in their physical behavior as well as in their way of painting, so that different styles can be created in a very flexible way.

Instant Volumetric Understanding with Order-Independent Volume Rendering
COMPUTER GRAPHICS FORUM, Issue 3 2004. Benjamin Mora
Rapid, visual understanding of volumetric datasets is a crucial outcome of a good volume rendering application, but few current volume rendering systems deliver this result. Our goal is to reduce the volumetric surfing that is required to understand volumetric features by conveying more information in fewer images. In order to achieve this goal, and in contrast with most current methods which still use optical models and alpha blending, our approach reintroduces the order-independent contribution of every sample along the ray in order to have an equiprobable visualization of all the volume samples. Therefore, we demonstrate how order-independent sampling can be suitable for fast volume understanding, show useful extensions to MIP and X-ray-like renderings, and, finally, point out the special advantage of using stereo visualization in these models to circumvent the lack of depth cues. Categories and Subject Descriptors: I.3.3 [Computer Graphics]: Picture/Image Generation; I.3.7 [Computer Graphics]: Three-Dimensional Graphics and Realism.

Hardware-Accelerated Rendering of Photo Hulls
COMPUTER GRAPHICS FORUM, Issue 3 2004. Ming Li
This paper presents an efficient hardware-accelerated method for novel view synthesis from a set of images or videos.
Our method is based on the photo hull representation, which is the maximal photo-consistent shape. We avoid the explicit reconstruction of photo hulls by adopting a view-dependent plane-sweeping strategy. From the target viewpoint, slicing planes are rendered with the reference views projected onto them. Graphics hardware is exploited to verify the photo-consistency of each rasterized fragment. Visibilities with respect to the reference views are properly modeled, and only photo-consistent fragments are kept and colored in the target view. We present experiments with real images and animation sequences. Thanks to the more accurate shape of the photo hull representation, our method generates more realistic rendering results than methods based on visual hulls. Currently, we achieve rendering frame rates of 2-3 fps. Compared to a pure software implementation, the performance of our hardware-accelerated method is approximately 7 times faster. Categories and Subject Descriptors (according to ACM CCS): I.3.3 [Computer Graphics]: Picture/Image Generation; I.3.7 [Computer Graphics]: Three-Dimensional Graphics and Realism.

Hierarchical Higher Order Face Cluster Radiosity for Global Illumination Walkthroughs of Complex Non-Diffuse Environments
COMPUTER GRAPHICS FORUM, Issue 3 2003. Enrico Gobbetti
We present an algorithm for simulating global illumination in scenes composed of highly tessellated objects with diffuse or moderately glossy reflectance. The solution method is a higher order extension of the face cluster radiosity technique. It combines face clustering, multiresolution visibility, vector radiosity, and higher order bases with a modified progressive shooting iteration to rapidly produce visually continuous solutions with limited memory requirements. The output of the method is a vector irradiance map that partitions input models into areas where global illumination is well approximated using the selected basis.
The programming capabilities of modern commodity graphics architectures are exploited to render illuminated models directly from the vector irradiance map, using hardware acceleration to approximate view-dependent illumination during interactive walkthroughs. Using this algorithm, visually compelling global illumination solutions for scenes of over one million input polygons can be computed in minutes and examined interactively on common graphics personal computers. Categories and Subject Descriptors (according to ACM CCS): I.3.3 [Computer Graphics]: Picture/Image Generation; I.3.7 [Computer Graphics]: Three-Dimensional Graphics and Realism. [source] Progressive Hulls for Intersection Applications. COMPUTER GRAPHICS FORUM, Issue 2 2003. Nikos Platis. Abstract: Progressive meshes are an established tool for triangle mesh simplification. By suitably adapting the simplification process, progressive hulls can be generated which enclose the original mesh in gradually simpler, nested meshes. We couple progressive hulls with a selective refinement framework and use them in applications involving intersection queries on the mesh. We demonstrate that selectively refinable progressive hulls considerably speed up intersection queries by efficiently locating intersection points on the mesh. Concerning the progressive hull construction, we propose a new formula for assigning edge collapse priorities that significantly accelerates the simplification process, and we enhance the existing algorithm with several conditions aimed at producing higher-quality hulls. Using progressive hulls has the added advantage that they can be used instead of the enclosed object when a lower display resolution can be tolerated, thus speeding up the rendering process.
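The enclosure constraint that distinguishes progressive hulls from ordinary progressive meshes can be illustrated with a signed-volume test: an edge may only collapse to a point that lies on or outside every face it displaces, so the hull never shrinks inside the original surface. The paper's actual edge-collapse priority formula is not given in the abstract; the geometry and the volume-based check below are assumptions for illustration only.

```python
import numpy as np

def signed_volume(tri, p):
    """Signed volume of the tetrahedron spanned by triangle tri and
    point p; positive when p lies on the triangle's outward-normal side."""
    a, b, c = (np.asarray(v, dtype=float) for v in tri)
    return np.dot(np.cross(b - a, c - a), np.asarray(p, dtype=float) - a) / 6.0

# Collapsing an edge to point p adds roughly the summed signed volume of
# the tetrahedra formed by p and the triangles removed by the collapse.
tris = [((0, 0, 0), (1, 0, 0), (0, 1, 0))]
p_outside = (0.3, 0.3, 1.0)   # on the outward side: enlarges the hull
p_inside = (0.3, 0.3, -1.0)   # inward: would shrink it, disallowed

added = sum(signed_volume(t, p_outside) for t in tris)
print(added > 0)  # a valid outer-hull collapse target
print(sum(signed_volume(t, p_inside) for t in tris) > 0)
```

A simplifier can then rank candidate collapses by the volume they add (smallest first, a natural though assumed priority), which keeps the nested hulls as tight as possible around the enclosed mesh.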
ACM CCS: I.3.3 [Computer Graphics]: Picture/Image Generation; I.3.5 [Computer Graphics]: Computational Geometry and Object Modeling; I.3.7 [Computer Graphics]: Three-Dimensional Graphics and Realism [source] A Note on the Interdependence between Hypothesis Generation and Information Search in Conducting Analytical Procedures. CONTEMPORARY ACCOUNTING RESEARCH, Issue 2 2003. Stephen K. Asare. Abstract: This study examines the linkage among the initial hypothesis set, the information search, and decision performance in performing analytical procedures. We manipulated the quality of the initial hypothesis set and the quality of the information search to investigate the extent to which deficiencies (or benefits) in either process can be remedied (or negated) by the other phase. The hypothesis-set manipulation entailed inheriting a correct hypothesis set, inheriting an incorrect hypothesis set, or generating a hypothesis set. The information search was manipulated by providing a balanced evidence set to auditors (i.e., evidence on a range of likely causes, including the actual cause, analogous to a standard audit program) or asking them to conduct their own search. One hundred and two auditors participated in the study. The results show that auditors who inherited a correct hypothesis set and received balanced evidence performed better than those who inherited a correct hypothesis set and did their own search, as well as those who inherited an incorrect hypothesis set and were provided a balanced evidence set. The former performance difference arose because auditors who conducted their own search were found to repeatedly test non-errors and to truncate their search. This suggests that having a correct hypothesis set does not ensure that a balanced testing strategy is employed, which, in turn, diminishes part of the presumed benefits of a correct hypothesis set.
The latter performance difference was attributable to auditors' failure to generate new hypotheses when they received evidence about a hypothesis that was not in the current hypothesis set. This demonstrates that balanced evidence does not fully compensate for an initially incorrect hypothesis set. These findings suggest the need for firm training and/or decision aids to facilitate both a balanced information search and an iterative hypothesis generation process. [source] Spectroscopic Diagnostics of Pulsed Arc Plasmas for Particle Generation. CONTRIBUTIONS TO PLASMA PHYSICS, Issue 8 2008. K. Behringer. Abstract: Pulsed arc plasmas were diagnosed by means of emission spectroscopy. A capacitor was discharged through argon and hydrogen, leading to a few cycles of damped current oscillation with a period of about 120 μs and a maximum current of 5–12 kA. Spectroscopic measurements in the visible range were carried out in order to characterise the electron temperature and density in the arc channel as well as the electron and gas temperatures in the afterglow plasmas. Spectra were integrated over 10 μs time windows and shifted in time from pulse to pulse. The plasmas also contained substantial fractions of electrode material (brass), namely copper and zinc. The electron density was measured in the conventional way from hydrogen Balmer-line broadening or from the Ar I Stark width. In the arc channel, it ranged from about 3 · 10²² to 2 · 10²³ m⁻³. The broadening of Zn II lines could also be used. Ratios of Ar I to Ar II and of Zn I to Zn II line intensities were analysed for the electron temperature. Line pairs were found which lay conveniently close together in one frame of the spectrometer, allowing automatic on-line analysis without relying on reproducibility. Atomic physics models including opacity were developed for Ar II and Zn II in order to check the existence of a Boltzmann distribution of their excited states.
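A minimal sketch of the line-intensity-ratio temperature diagnostic mentioned above, assuming Boltzmann-populated upper levels. The atomic data (statistical weights, transition probabilities, level energies, wavelengths) and the measured ratio below are placeholder values, not numbers from the study.

```python
import math

K_B_EV = 8.617333262e-5  # Boltzmann constant in eV/K

def temperature_from_line_ratio(ratio, E1, E2, g1A1, g2A2, lam1, lam2):
    """Excitation temperature from the intensity ratio I1/I2 of two lines
    of the same species, assuming Boltzmann-populated upper levels:
        I1/I2 = (g1*A1/lam1) / (g2*A2/lam2) * exp(-(E1 - E2) / (kB * T))
    Upper-level energies E1, E2 in eV; wavelengths in any common unit."""
    prefactor = (g1A1 / lam1) / (g2A2 / lam2)
    return (E1 - E2) / (K_B_EV * math.log(prefactor / ratio))

# Placeholder line pair: illustrative atomic data only.
T = temperature_from_line_ratio(ratio=0.5, E1=15.0, E2=13.0,
                                g1A1=2.0e8, g2A2=1.0e8,
                                lam1=400.0, lam2=420.0)
print(round(T))  # a few times 10^4 K, the order reported in the abstract
```

With a calibrated spectrometer, evaluating this closed-form expression for a line pair captured in a single spectral frame is what permits the automatic on-line analysis the abstract describes.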
These calculations showed that the observed levels were in fact close to thermodynamic equilibrium, in particular if the resonance lines were optically thick. Electron temperature measurements yielded values between 14,000 K and 21,000 K. The gas temperature in the afterglow, where particles should have formed, was derived from the rotational and vibrational temperatures of C2 molecular bands. Ratios between Cu I line intensities yielded the electron temperatures. Both were found to be a few thousand kelvin. (© 2008 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source] Generation of Thin Surface Plasma Layers for Atmospheric-Pressure Surface Treatments. CONTRIBUTIONS TO PLASMA PHYSICS, Issue 5-6 2004. Černák. Abstract: Thin layers of atmospheric-pressure non-equilibrium plasma can be generated by pulsed surface corona discharges and surface barrier discharges developing on the treated surfaces or brought into close contact with them. Plasma sources based on these discharge types have the potential to meet basic on-line production requirements in industry and can be useful for a wide range of surface treatments and deposition processes, including continuous treatment of textiles. Compared with atmospheric-pressure glow discharge sources, the potential advantages of these plasma sources include their simplicity, robustness, and ability to operate in a wide range of working gases. (© 2004 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source] Drosophila multiplexin (Dmp) modulates motor axon pathfinding accuracy. DEVELOPMENT GROWTH & DIFFERENTIATION, Issue 5 2009. Frauke Meyer. Multiplexins are multidomain collagens typically composed of an N-terminal thrombospondin-related domain, an interrupted triple helix and a C-terminal endostatin domain. They have a clear regulatory function in the development of different tissues, which is chiefly conveyed by the endostatin domain.
This domain can be found in proteolytically released monomeric and trimeric versions, whose diverse and opposed effects on the migratory behavior of epithelial and endothelial cell types have been demonstrated in cell culture experiments. The only Drosophila multiplexin displays specific features of both vertebrate multiplexins, collagens XV and XVIII. We characterized the Drosophila multiplexin (dmp) gene and found that three main isoforms are expressed from it, one of which is the monomeric endostatin version. Generation of dmp deletion alleles revealed that Dmp plays a role in motor axon pathfinding, as the mutants exhibit ventral bypass defects of the intersegmental nerve b (ISNb) similar to other motor axon guidance mutants. Transgenic overexpression of monomeric endostatin, as well as of full-length Dmp, but not of trimeric endostatin, was able to rescue these defects. In contrast, trimeric endostatin increased axon pathfinding accuracy in a wild-type background. We conclude that Dmp plays a modulating role in motor axon pathfinding and may be part of a buffering system that functions to avoid innervation errors. [source] Generation and characterization of a novel neural crest marker allele, Inka1-LacZ, reveals a role for Inka1 in mouse neural tube closure. DEVELOPMENTAL DYNAMICS, Issue 4 2010. Bethany S. Reid. Abstract: Previous studies identified Inka1 as a gene regulated by AP-2α in the neural crest that is required for craniofacial morphogenesis in fish and frog. Here, we extend the analysis of Inka1 function and regulation to the mouse by generating a LacZ knock-in allele. Inka1-LacZ allele expression occurs in the cephalic mesenchyme, heart, and paraxial mesoderm prior to E8.5. Subsequently, expression is observed in migratory neural crest cells and their derivatives. Consistent with the expression of Inka1 in tissues of the developing head during neurulation, a low percentage of Inka1−/− mice show exencephaly, while the remainder are viable and fertile.
Further studies indicate that AP-2α is not required for Inka1 expression in the mouse, and suggest that there is no significant genetic interaction between these two factors during embryogenesis. Together, these data demonstrate that while the expression domain of Inka1 is conserved among vertebrates, its function and regulation are not. Developmental Dynamics 239:1188–1196, 2010. © 2010 Wiley-Liss, Inc. [source]