Pipeline

Selected Abstracts

ASD/PFO Devices: What Is in the Pipeline?
JOURNAL OF INTERVENTIONAL CARDIOLOGY, Issue 6 2007
NICOLAS MAJUNKE
Since the initial description of an atrial septal defect (ASD) occluding device in the mid-1970s by King and Mills, a number of devices have been developed. Various transcatheter devices and methods to close congenital heart defects are currently available commercially or within clinical trials. Devices have been designed specifically for the ASD and the patent foramen ovale (PFO). The trend in interventional treatment of intracardiac shunts is toward defect-specific systems and new devices that minimize the foreign material left in the atria. This review first focuses on new devices that are approved outside the United States but not within it, and then reviews the experimental devices for PFO and ASD closure. [source]

A simulation-optimization framework for research and development pipeline management
AICHE JOURNAL, Issue 10 2001
Dharmashankar Subramanian
The research and development (R&D) pipeline management problem has far-reaching economic implications for new-product-development-driven industries, such as the pharmaceutical, biotechnology and agrochemical industries. Effective decision-making is required with respect to portfolio selection and project task scheduling in the face of significant uncertainty and an ever-constrained resource pool. The here-and-now stochastic optimization problem inherent to the management of an R&D pipeline is described in its most general form, as is a computing architecture, Sim-Opt, that combines mathematical programming and discrete-event system simulation to assess the uncertainty and control the risk present in the pipeline. The R&D pipeline management problem is viewed in Sim-Opt as the control problem of a performance-oriented, resource-constrained, stochastic, discrete-event, dynamic system. The concept of time lines is used to study multiple unique realizations of the controlled evolution of the discrete-event pipeline system. Four approaches of varying rigor were investigated for the optimization module in Sim-Opt, and their relative performance is explored through an industrially motivated case study. Methods are presented to efficiently integrate information across the time lines from this framework. This integration of information, demonstrated in a case study, was used to infer a creative operational policy for the corresponding here-and-now stochastic optimization problem. [source]
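
The "time lines" idea in Sim-Opt can be pictured as running many independent stochastic realizations of the controlled pipeline and aggregating what they say about value and risk. The following minimal Python sketch is only an illustration of that idea under made-up data; the PROJECTS table, the resource limit, the duration-slippage model and the greedy launch policy are all hypothetical and are not the Sim-Opt architecture or policies described in the paper.

```python
import random
from statistics import mean, stdev

# Hypothetical portfolio: (name, expected_reward, success_probability, task_durations)
PROJECTS = [
    ("A", 120.0, 0.6, [2, 3, 4]),
    ("B",  80.0, 0.8, [1, 2, 2]),
    ("C", 200.0, 0.3, [3, 4, 5]),
]
RESOURCE_SLOTS = 2          # how many projects can run in parallel
HORIZON = 20                # planning horizon in periods

def one_time_line(rng):
    """One stochastic realization ('time line') of the pipeline under a greedy policy."""
    # Greedy priority: highest expected reward per unit of work (hypothetical policy).
    queue = sorted(PROJECTS, key=lambda p: -p[1] * p[2] / sum(p[3]))
    active, value, t = [], 0.0, 0
    while t < HORIZON and (queue or active):
        # Launch projects while resource slots are free.
        while queue and len(active) < RESOURCE_SLOTS:
            name, reward, p_success, tasks = queue.pop(0)
            # Stochastic total duration: each task may slip by up to 50%.
            duration = sum(d * rng.uniform(1.0, 1.5) for d in tasks)
            active.append([name, reward, p_success, t + duration])
        t += 1
        finished = [a for a in active if a[3] <= t]
        active = [a for a in active if a[3] > t]
        for name, reward, p_success, _ in finished:
            if rng.random() < p_success:      # a project may still fail at completion
                value += reward
    return value

rng = random.Random(0)
values = [one_time_line(rng) for _ in range(5000)]
print(f"expected portfolio value ~ {mean(values):.1f} +/- {stdev(values):.1f}")
```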

The Geopolitics of Natural Gas in Asia
OPEC ENERGY REVIEW, Issue 3 2001
Gawdat Bahgat
Over the last few years, natural gas has been the fastest-growing component of primary world energy consumption. This study examines the recent efforts by the Islamic Republic of Iran, Qatar, the United Arab Emirates and Saudi Arabia to develop their natural gas resources and capture a large share of the Asian market, particularly in Turkey, India, China, Japan and South Korea. Counter-efforts by rivals, such as the Russian Federation and the Caspian Basin states, are analysed. Finally, international ventures to transport natural gas from producers to consumers, including the Dolphin Project, the Trans-Caspian Pipeline and Blue Stream, are discussed. [source]

Wirkstoffe in der Pipeline. Können wir die Schlaflosigkeit demnächst besser behandeln? (Drugs in the pipeline: will we soon be able to treat insomnia better?)
PHARMAZIE IN UNSERER ZEIT (PHARMUZ), Issue 3 2007
Now that the benzodiazepines have largely been displaced by the Z-hypnotics (zolpidem, zopiclone and zaleplon), major changes are coming to the market for sleep medications. The introduction of new hypnotics, some with innovative mechanisms of action, can be expected over the next few years. The drug candidates are GABAergic, serotonergic and melatonergic agents. [source]

Surfactant Systems: A Survey of the Transcontinental Gas Pipeline from Houston to New Jersey
ARCHITECTURAL DESIGN, Issue 6 2005
Petia Morozov
The Transcontinental Pipeline, Transco, is a 10,560-mile line that traverses the US, transporting natural gas from its source in the Gulf of Mexico to the East Coast. Petia Morozov describes the postwar engineering feat that made the pipeline a reality, and also reveals the web of myriad agreements, with often diametrically opposed parties or interests, that support its rights of way and management. Copyright © 2005 John Wiley & Sons, Ltd. [source]

The Sloan Digital Sky Survey monitor telescope pipeline
ASTRONOMISCHE NACHRICHTEN, Issue 9 2006
D.L. Tucker
The photometric calibration of the Sloan Digital Sky Survey (SDSS) is a multi-step process which involves data from three different telescopes: the 1.0-m telescope at the US Naval Observatory (USNO), Flagstaff Station, Arizona (which was used to establish the SDSS standard star network); the SDSS 0.5-m Photometric Telescope (PT) at the Apache Point Observatory (APO), New Mexico (which calculates nightly extinctions and calibrates secondary patch transfer fields); and the SDSS 2.5-m telescope at APO (which obtains the imaging data for the SDSS proper). In this paper, we describe the Monitor Telescope Pipeline, MTPIPE, the software pipeline used in processing the data from the single-CCD telescopes used in the photometric calibration of the SDSS (i.e., the USNO 1.0-m and the PT). We also describe the transformation equations that convert photometry on the USNO 1.0-m u'g'r'i'z' system to photometry on the SDSS 2.5-m ugriz system, and the results of various validation tests of the MTPIPE software. Further, we discuss the semi-automated PT factory, which runs MTPIPE in the day-to-day standard SDSS operations at Fermilab. Finally, we discuss the use of MTPIPE in current SDSS-related projects, including the Southern u'g'r'i'z' Standard Star project, the u'g'r'i'z' Open Star Clusters project, and the SDSS extension (SDSS-II). (© 2006 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source]
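
The abstract refers to transformation equations between the u'g'r'i'z' and ugriz systems without reproducing them. Photometric transformations of this kind are conventionally written as linear relations in a colour term; the line below is only an illustrative sketch of that shape, with hypothetical coefficients a_g and b_g rather than the values actually fitted for MTPIPE.

```latex
% Illustrative shape of a band-to-band photometric transformation
% (a_g and b_g are hypothetical placeholders, not the MTPIPE calibration):
g = g' + a_g\,(g' - r') + b_g
```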

Assessment of Protection Systems for Buried Steel Pipelines Endangered by Rockfall
COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, Issue 5 2005
Bernhard Pichler
First, a gravel-based protection system (GBPS) is investigated, that is, a pipeline buried in sandy gravel is considered. To assess the load-carrying behavior of this structure when subjected to rockfall, a finite element (FE) model has been developed. The development and the validation of this structural model are strictly separated, that is, they are based on two physically and statistically independent sets of experiments. Subsequently, scenarios of rockfall onto a gravel-buried steel pipe are analyzed considering different boundary conditions and structural dimensions. Following the conclusions drawn from these numerical analyses, an enhanced protection system (EPS) is proposed. It consists of gravel as an energy-absorbing and impact-damping system and a buried steel plate resting on walls made of concrete representing a load-carrying structural component. The potential and the limitations of both protection systems are discussed in detail. [source]

Probabilistic Neural Network for Reliability Assessment of Oil and Gas Pipelines
COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, Issue 5 2002
Sunil K. Sinha
A fuzzy artificial neural network (ANN)-based approach is proposed for reliability assessment of oil and gas pipelines. The proposed ANN model is trained with field observation data collected using magnetic flux leakage (MFL) tools to characterize the actual condition of aging pipelines vulnerable to metal-loss corrosion. The objective of this paper is to develop a simulation-based probabilistic neural network model to estimate the probability of failure of aging pipelines vulnerable to corrosion. The approach is to transform a simulation-based probabilistic analysis framework for estimating pipeline reliability into an adaptable connectionist representation, using supervised training to initialize the weights so that the adaptable neural network predicts the probability of failure for oil and gas pipelines. This ANN model uses eight pipe parameters as input variables; the output variable is the probability of failure. The proposed method is generic, and it can be applied to several decision problems related to the maintenance of aging engineering systems. [source]

Pipelines on heterogeneous systems: models and tools
CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 9 2005
F. Almeida
We study the performance of pipeline algorithms in heterogeneous networks. The concept of heterogeneity is not restricted to differences in the computational power of the nodes; it also refers to the network capabilities. We develop a skeleton tool that allows an efficient block-cyclic mapping of pipelines onto heterogeneous systems. The tool supports pipelines with a number of stages much larger than the number of physical processors available. We derive an analytical formula that allows us to predict the performance of pipelines in heterogeneous systems. Based on this analytical complexity formula, numerical strategies to solve the optimal mapping problem are proposed. The computational results prove the accuracy of the predictions and the effectiveness of the approach. Copyright © 2005 John Wiley & Sons, Ltd. [source]
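
The kind of analytical performance model referred to in the Almeida abstract can be made concrete with a toy calculation: map the pipeline stages block-cyclically onto processors of different speeds and read the steady-state throughput off the most heavily loaded processor. The Python sketch below does this with hypothetical stage costs and processor speeds; it is a simplified illustration, not the formula derived in the paper.

```python
# Toy block-cyclic mapping of pipeline stages onto heterogeneous processors.
# stage_work[i] : abstract work units of stage i per item (hypothetical)
# proc_speed[p] : relative speed of processor p in work units per time unit (hypothetical)
stage_work = [4.0, 2.0, 6.0, 3.0, 5.0, 1.0, 2.0, 4.0]   # 8 pipeline stages
proc_speed = [1.0, 2.0, 0.5]                             # 3 processors

def block_cyclic_owner(stage, block, num_procs):
    """Processor that owns `stage` under a block-cyclic distribution."""
    return (stage // block) % num_procs

def steady_state_period(stage_work, proc_speed, block):
    """Time per item in steady state: the busiest processor dominates the pipeline."""
    load = [0.0] * len(proc_speed)
    for s, w in enumerate(stage_work):
        p = block_cyclic_owner(s, block, len(proc_speed))
        load[p] += w / proc_speed[p]          # time processor p spends per item
    return max(load)                          # the bottleneck processor sets the period

for b in (1, 2, 4):
    period = steady_state_period(stage_work, proc_speed, b)
    print(f"block size {b}: ~{1.0 / period:.3f} items per time unit")
```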

Principles and Practices of Knowledge Creation: On the Organization of "Buzz" and "Pipelines" in Life Science Communities
ECONOMIC GEOGRAPHY, Issue 4 2008
Jerker Moodysson
This article links up with the debate in economic geography on "local buzz" and "global pipelines" as two distinct forms of interactive knowledge creation among firms and related actors, and argues for a rethinking of the way social scientists should approach interactive knowledge creation. It highlights the importance of combining the insights from studies of clusters and innovation systems with an activity-oriented approach in which more attention is paid to the specific characteristics of the innovation processes and the conditions underpinning their organization. To illustrate the applicability and added value of such an alternative approach, the notion of embeddedness is linked with some basic ideas adopted from the literature on knowledge communities. The framework is then applied to a study of innovation activities conducted by firms and academic research groups working with biotechnology-related applications in the Swedish part of the Medicon Valley life science region. The findings reveal that local buzz is largely absent in these types of activities. Most interactive knowledge creation, which appears to be spontaneous and unregulated, is, on closer examination, found to be safely embedded in globally configured professional knowledge communities and attainable only by those who qualify. [source]

Graduate Medical Education and Knowledge Translation: Role Models, Information Pipelines, and Practice Change Thresholds
ACADEMIC EMERGENCY MEDICINE, Issue 11 2007
Barry M. Diner MD
This article reflects the proceedings of a workshop session, Postgraduate Education and Knowledge Translation, at the 2007 Academic Emergency Medicine Consensus Conference on knowledge translation (KT) in emergency medicine (EM). The objective was to develop a research strategy that incorporates KT into EM graduate medical education (GME). To bridge the gap between the best evidence and optimal patient care, Pathman et al. suggested a multistage model for moving from evidence to action. Using this theoretical knowledge-to-action framework, the KT consensus conference group focused on four key components: acceptance, application, ability, and remembering to act on the existing evidence. The possibility that basic familiarity, along with the Pathman et al. pipeline, may improve KT uptake is an initial starting point for research on GME and KT. Current residents are limited by a lack of faculty GME role models who demonstrate bedside KT principles. The rapid uptake of KT theory will depend on developing KT champions, locally and internationally, for resident physicians to emulate. The consensus participants combined published evidence with expert opinion to outline recommendations for identifying the barriers to KT by asking four specific questions: 1) What are the barriers that influence a resident's ability to act on valid health care evidence? 2) How do we break down these barriers? 3) How do we incorporate this into residency training? 4) How do we monitor the longevity of this intervention? Research in the fields of GME and KT is currently limited. GME educators assume that if we teach residents, they will learn and apply what they have been taught. This is a bold assumption with very little supporting evidence. This article is not an attempt to provide a complete overview of KT and GME, but instead aims to create a starting point for future work and discussions in the realm of KT and GME. [source]

Fast simulation of skin sliding
COMPUTER ANIMATION AND VIRTUAL WORLDS (PREV: JNL OF VISUALISATION & COMPUTER ANIMATION), Issue 2-3 2009
Xiaosong Yang
Skin sliding is the phenomenon of the skin moving over underlying layers of fat, muscle and bone. Due to the complex interconnections between these separate layers and their differing elasticity properties, it is difficult to model and expensive to compute. We present a novel method to simulate this phenomenon in real time by remeshing the surface based on a parameter-space resampling. In order to evaluate the surface parametrization, we borrow a technique from structural engineering known as the force density method (FDM), which solves for an energy-minimizing form with a sparse linear system. Our method creates a realistic approximation of skin sliding in real time, reducing texture distortions in the region of the deformation. In addition it is flexible, simple to use, and can be incorporated into any animation pipeline. Copyright © 2009 John Wiley & Sons, Ltd. [source]
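
The force density method that the skin-sliding paper borrows from structural engineering reduces the equilibrium of a pin-jointed network to a single sparse linear solve: once a force density q is fixed on every edge, the nodal equilibrium equations become linear in the free node coordinates. The sketch below assembles and solves that system for a small quad grid with scipy; the grid, the uniform force densities and the four anchored corners are arbitrary illustrative choices, not the paper's skin mesh.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# Small illustrative network: a 5x5 grid of nodes joined by horizontal and vertical edges.
n_side = 5
nodes = [(i, j) for j in range(n_side) for i in range(n_side)]
index = {node: k for k, node in enumerate(nodes)}
edges = []
for (i, j) in nodes:
    if i + 1 < n_side:
        edges.append((index[(i, j)], index[(i + 1, j)]))
    if j + 1 < n_side:
        edges.append((index[(i, j)], index[(i, j + 1)]))
n, m = len(nodes), len(edges)

q = np.ones(m)  # one force density per edge (uniform and arbitrary in this sketch)

# Branch-node incidence matrix C (m x n): +1 at an edge's first node, -1 at its second.
rows = np.repeat(np.arange(m), 2)
cols = np.array([v for edge in edges for v in edge])
vals = np.tile([1.0, -1.0], m)
C = sp.csr_matrix((vals, (rows, cols)), shape=(m, n))
D = (C.T @ sp.diags(q) @ C).tocsr()  # force-density system matrix

# Anchor the four corner nodes; every other node is free to move.
fixed = [index[(0, 0)], index[(n_side - 1, 0)],
         index[(0, n_side - 1)], index[(n_side - 1, n_side - 1)]]
free = [k for k in range(n) if k not in fixed]
x_fixed = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])  # anchor coordinates

# Equilibrium of the free nodes (no external loads here): D_ff x_free = -D_fc x_fixed.
D_ff = D[free, :][:, free]
D_fc = D[free, :][:, fixed]
rhs = -np.asarray(D_fc @ x_fixed)
x_free = np.column_stack([spla.spsolve(D_ff.tocsc(), rhs[:, d]) for d in range(2)])
print("solved 2D positions of", len(free), "free nodes; first three:")
print(x_free[:3])
```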

Interactive shadowing for 2D Anime
COMPUTER ANIMATION AND VIRTUAL WORLDS (PREV: JNL OF VISUALISATION & COMPUTER ANIMATION), Issue 2-3 2009
Eiji Sugisaki
In this paper, we propose an instant shadow generation technique for 2D animation, especially Japanese Anime. In traditional 2D Anime production, the entire animation, including shadows, is drawn by hand, which takes a long time to complete. Shadows play an important role in the creation of symbolic visual effects, yet they are not always drawn, owing to time constraints and a lack of animators, especially when the production schedule is tight. To solve this problem, we develop an easy shadowing approach that enables animators to easily create a layer of shadow and its animation based on the character's shapes. Our approach is both instant and intuitive. The only inputs required are the character or object shapes in the input animation sequence, with the alpha values generally used in the Anime production pipeline. First, shadows are automatically rendered on a virtual plane by using a shadow map based on these inputs. The rendered shadows can then be edited by simple operations and simplified with a Gaussian filter. Several special effects, such as blurring, can be applied to the rendered shadow at the same time. Compared to existing approaches, ours is more efficient and effective for handling automatic shadowing in real time. Copyright © 2009 John Wiley & Sons, Ltd. [source]

A framework for fusion methods and rendering techniques of multimodal volume data
COMPUTER ANIMATION AND VIRTUAL WORLDS (PREV: JNL OF VISUALISATION & COMPUTER ANIMATION), Issue 2 2004
Maria Ferre
Many different direct volume rendering methods have been developed to visualize 3D scalar fields on uniform rectilinear grids. However, little work has been done on rendering simultaneously various properties of the same 3D region measured with different registration devices or at different instants of time. The demand for this type of visualization is rapidly increasing in scientific applications such as medicine, in which the visual integration of multiple modalities allows a better comprehension of the anatomy and a perception of its relationships with activity. This paper presents different strategies of direct multimodal volume rendering (DMVR). It is restricted to voxel models with a known 3D rigid alignment transformation. The paper evaluates at which steps of the rendering pipeline the data fusion must be realized in order to accomplish the desired visual integration and to provide fast re-renders when some fusion parameters are modified. In addition, it analyses how existing monomodal visualization algorithms can be extended to multiple datasets and compares their efficiency and computational cost. Copyright © 2004 John Wiley & Sons, Ltd. [source]
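
One way to make the question of where in the pipeline to fuse concrete is to compare property fusion (mix the two scalar fields before classification) with colour fusion (classify each modality separately, then mix the resulting RGBA values). The numpy sketch below does exactly that for two synthetic, already registered volumes; the transfer functions and the mixing weight are arbitrary illustrative choices, not the strategies evaluated in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
shape = (32, 32, 32)
vol_a = rng.random(shape)            # e.g. an anatomy-like modality (synthetic)
vol_b = rng.random(shape)            # e.g. an activity-like modality (synthetic)

def classify(vol, tint):
    """Toy transfer function: scalar value -> RGBA with a modality-specific tint."""
    rgba = np.zeros(vol.shape + (4,))
    rgba[..., :3] = vol[..., None] * np.asarray(tint)   # colour ramp
    rgba[..., 3] = vol ** 2                             # opacity ramp
    return rgba

w = 0.6   # fusion weight for modality A

# Strategy 1: property fusion -- mix the scalars first, classify once.
fused_scalar = w * vol_a + (1.0 - w) * vol_b
rgba_property = classify(fused_scalar, tint=(1.0, 0.8, 0.6))

# Strategy 2: colour fusion -- classify each modality, then mix the RGBA results.
rgba_colour = (w * classify(vol_a, (1.0, 0.8, 0.6))
               + (1.0 - w) * classify(vol_b, (0.3, 0.6, 1.0)))

# The two strategies generally disagree because classification is non-linear.
print("mean absolute RGBA difference:", float(np.abs(rgba_property - rgba_colour).mean()))
```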

Visual modelling: from images to images
COMPUTER ANIMATION AND VIRTUAL WORLDS (PREV: JNL OF VISUALISATION & COMPUTER ANIMATION), Issue 4 2002
Marc Pollefeys
This paper contains two parts. In the first part, an automatic processing pipeline is presented that analyses an image sequence and automatically extracts camera motion, calibration and scene geometry. The system combines state-of-the-art algorithms developed in computer vision, computer graphics and photogrammetry. The approach consists of two stages. Salient features are extracted and tracked throughout the sequence to compute the camera motion and calibration and the 3D structure of the observed features. Then a dense estimate of the surface geometry of the observed scene is computed using stereo matching. The second part of the paper discusses how this information can be used for visualization. Traditionally, a textured 3D model is constructed from the computed information and used to render new images. Alternatively, it is also possible to avoid the need for an explicit 3D model and to obtain new views directly by combining the appropriate pixels from recorded views. It is interesting to note that even when there is an ambiguity in the reconstructed geometry, correct new images can often still be generated. Copyright © 2002 John Wiley & Sons, Ltd. [source]

Fragment-Parallel Composite and Filter
COMPUTER GRAPHICS FORUM, Issue 4 2010
Anjul Patney
We present a strategy for parallelizing the composite and filter operations suitable for an order-independent rendering pipeline implemented on a modern graphics processor. Conventionally, this task is parallelized across pixels/subpixels but serialized along individual depth layers. Our technique extends the domain of parallelization to individual fragments (samples), avoiding a serial dependence on the number of depth layers, which can be a constraint for scenes with high depth complexity. As a result, our technique scales with the number of fragments and can sustain a consistent and predictable throughput in scenes with both low and high depth complexity, including those with a high variability of depth complexity within a single frame. We demonstrate composite/filter performance in excess of 50M fragments/sec for scenes with more than 1500 semi-transparent layers. [source]
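
To see what is being parallelized, it helps to write down the baseline operation: sort each pixel's fragments by depth and accumulate them front to back with the "over" operator. The numpy sketch below implements that reference computation serially per pixel on random fragment data; the paper's contribution is to parallelize this same reduction over individual fragments rather than one depth layer at a time.

```python
import numpy as np

rng = np.random.default_rng(1)
num_pixels, num_fragments = 16, 400

# Random semi-transparent fragments: pixel id, depth, premultiplied RGB, alpha.
pixel_id = rng.integers(0, num_pixels, num_fragments)
depth    = rng.random(num_fragments)
alpha    = rng.uniform(0.1, 0.6, num_fragments)
rgb      = rng.random((num_fragments, 3)) * alpha[:, None]   # premultiplied colour

def composite_pixel(order):
    """Front-to-back 'over' compositing of one pixel's depth-sorted fragments."""
    out_rgb = np.zeros(3)
    transmittance = 1.0
    for k in order:                      # nearest fragment first
        out_rgb += transmittance * rgb[k]
        transmittance *= (1.0 - alpha[k])
    return out_rgb

# Reference: group fragments by pixel, sort each group by depth, composite serially.
image = np.zeros((num_pixels, 3))
for p in range(num_pixels):
    frags = np.where(pixel_id == p)[0]
    order = frags[np.argsort(depth[frags])]
    image[p] = composite_pixel(order)

print("composited colours of the first four pixels:")
print(image[:4])
```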

A Data-driven Segmentation for the Shoulder Complex
COMPUTER GRAPHICS FORUM, Issue 2 2010
Q Youn Hong
The human shoulder complex is perhaps the most complicated joint in the human body, comprising a set of three bones together with muscles, tendons, and ligaments. Despite this anatomical complexity, computer graphics models for motion capture most often represent this joint as a simple ball and socket. In this paper, we present a method to determine a shoulder skeletal model that, when combined with standard skinning algorithms, generates a more visually pleasing animation that is a closer approximation to the actual skin deformations of the human body. We use a data-driven approach and collect ground-truth skin deformation data with an optical motion capture system with a large number of markers (200 markers on the shoulder complex alone). We cluster these markers during movement sequences and discover that adding one extra joint around the shoulder improves the resulting animation qualitatively and quantitatively, yielding a marker set of approximately 70 markers for the complete skeleton. We demonstrate the effectiveness of our skeletal model by comparing it with ground-truth data as well as with recorded video. We show its practicality by integrating it with the conventional rendering/animation pipeline. [source]

Adding Depth to Cartoons Using Sparse Depth (In)equalities
COMPUTER GRAPHICS FORUM, Issue 2 2010
D. Sýkora
This paper presents a novel interactive approach for adding depth information to hand-drawn cartoon images and animations. In comparison to previous depth assignment techniques, our solution requires minimal user effort and enables the creation of consistent pop-ups in a matter of seconds. Inspired by perceptual studies, we formulate a custom-tailored optimization framework that tries to mimic the way a human reconstructs depth information from a single image. Its key advantage is that it completely avoids inputs requiring knowledge of absolute depth and instead uses a set of sparse depth (in)equalities that are much easier to specify. Since these constraints lead to a solution based on quadratic programming that is time-consuming to evaluate, we propose a simple approximative algorithm yielding similar results with much lower computational overhead. We demonstrate its usefulness in the context of a cartoon animation production pipeline, including applications such as enhancement, registration, composition, 3D modelling and stereoscopic display. [source]

LazyBrush: Flexible Painting Tool for Hand-drawn Cartoons
COMPUTER GRAPHICS FORUM, Issue 2 2009
Daniel Sýkora
In this paper we present LazyBrush, a novel interactive tool for painting hand-made cartoon drawings and animations. Its key advantage is simplicity and flexibility. As opposed to previous custom-tailored approaches [SBv05, QWH06], LazyBrush does not rely on style-specific features such as homogeneous regions or pattern continuity, yet it still requires comparable or even less manual effort for a broad class of drawing styles. In addition, it is not sensitive to imprecise placement of colour strokes, which makes painting less tedious and brings significant time savings in the context of cartoon animation. LazyBrush originally stems from a requirements analysis carried out with professional ink-and-paint illustrators, who established a list of useful features for an ideal painting tool. We incorporate this list into an optimization framework leading to a variant of Potts energy with several interesting theoretical properties. We show how to minimize it efficiently and demonstrate its usefulness in various practical scenarios, including the ink-and-paint production pipeline. [source]

Dominant Texture and Diffusion Distance Manifolds
COMPUTER GRAPHICS FORUM, Issue 2 2009
Jianye Lu
Texture synthesis techniques require nearly uniform texture samples, but identifying suitable texture samples in an image requires significant data preprocessing. To eliminate this work, we introduce a fully automatic pipeline to detect dominant texture samples based on a manifold generated using the diffusion distance. We define the characteristics of dominant texture and three different types of outliers that allow us to efficiently identify dominant texture in feature space. We demonstrate how this method enables the analysis/synthesis of a wide range of natural textures. We compare textures synthesized from a sample image, with and without dominant texture detection. We also compare our approach to that of using a texture segmentation technique alone, and to using Euclidean, rather than diffusion, distances between texture features. [source]

Exposure Fusion: A Simple and Practical Alternative to High Dynamic Range Photography
COMPUTER GRAPHICS FORUM, Issue 1 2009
T. Mertens
I.4.8 [Image Processing]: Scene Analysis, Photometry, Sensor Fusion
We propose a technique for fusing a bracketed exposure sequence into a high-quality image without converting to high dynamic range (HDR) first. Skipping the physically based HDR assembly step simplifies the acquisition pipeline, avoids camera response curve calibration, and is computationally efficient. It also allows for including flash images in the sequence. Our technique blends multiple exposures, guided by simple quality measures like saturation and contrast. This is done in a multiresolution fashion to account for the brightness variation in the sequence. The resulting image quality is comparable to existing tone mapping operators. [source]
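
A single-scale version of the blending idea is easy to write down: score every pixel of every exposure by contrast, saturation and well-exposedness, normalize the scores into weights across the stack, and take the weighted average. The numpy sketch below does that for a synthetic bracketed sequence; it omits the multiresolution blending the abstract mentions, and the particular quality measures and constants are illustrative choices rather than the paper's exact definitions.

```python
import numpy as np

def quality_weights(img, sigma=0.2):
    """Per-pixel quality of one exposure: contrast * saturation * well-exposedness."""
    gray = img.mean(axis=2)
    # Contrast: magnitude of a discrete Laplacian of the grayscale image.
    lap = np.abs(-4 * gray
                 + np.roll(gray, 1, 0) + np.roll(gray, -1, 0)
                 + np.roll(gray, 1, 1) + np.roll(gray, -1, 1))
    saturation = img.std(axis=2)
    well_exposed = np.exp(-((img - 0.5) ** 2) / (2 * sigma ** 2)).prod(axis=2)
    return lap * saturation * well_exposed + 1e-12      # keep weights strictly positive

def exposure_fusion(stack):
    """Naive single-scale exposure fusion of a list of aligned LDR images in [0, 1]."""
    weights = np.stack([quality_weights(img) for img in stack])
    weights /= weights.sum(axis=0, keepdims=True)        # normalize across exposures
    return sum(w[..., None] * img for w, img in zip(weights, stack))

# Synthetic bracketed sequence: the same gradient scene at three exposure levels.
base = np.linspace(0, 1, 64 * 64 * 3).reshape(64, 64, 3)
stack = [np.clip(base * gain, 0, 1) for gain in (0.5, 1.0, 2.0)]
fused = exposure_fusion(stack)
print("fused image range:", float(fused.min()), "to", float(fused.max()))
```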

Transferring the Rig and Animations from a Character to Different Face Models
COMPUTER GRAPHICS FORUM, Issue 8 2008
Verónica Costa Orvalho
I.3.7 [Computer Graphics]: Three-Dimensional Graphics and Realism, Animation
We introduce a facial deformation system that allows artists to define and customize a facial rig and later apply the same rig to different face models. The method uses a set of landmarks that define specific facial features and deforms the rig anthropometrically. We find the correspondence of the main attributes of a source rig, transfer them to different three-dimensional (3D) face models and automatically generate a sophisticated facial rig. The method is general and can be used with any type of rig configuration. We show how the landmarks, combined with other deformation methods, can adapt different influence objects (NURBS surfaces, polygon surfaces, lattices) and skeletons from a source rig to individual face models, allowing high-quality geometric or physically based animations. We describe how it is possible to deform the source facial rig, apply the same deformation parameters to different face models and obtain unique expressions. We enable the reuse of existing animation scripts and show how shapes mix nicely with one another in different face models. We describe how our method can easily be integrated into an animation pipeline. We end with the results of tests done with major film and game companies to show the strength of our proposal. [source]

Ptex: Per-Face Texture Mapping for Production Rendering
COMPUTER GRAPHICS FORUM, Issue 4 2008
Brent Burley
Explicit parameterization of subdivision surfaces for texture mapping adds significant cost and complexity to film production. Most parameterization methods currently in use require setup effort, and none are completely general. We propose a new texture mapping method for Catmull-Clark subdivision surfaces that requires no explicit parameterization. Our method, Ptex, stores a separate texture per quad face of the subdivision control mesh, along with a novel per-face adjacency map, in a single texture file per surface. Ptex uses the adjacency data to perform seamless anisotropic filtering of multi-resolution textures across surfaces of arbitrary topology. Just as importantly, Ptex requires no manual setup and scales to models of arbitrary mesh complexity and texture detail. Ptex has been successfully used to texture all of the models in an animated theatrical short and is currently being applied to an entire animated feature. Ptex has eliminated UV assignment from our studio and significantly increased the efficiency of our pipeline. [source]
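
The core data layout Ptex describes (one small texture per quad face plus a per-face adjacency map) can be mocked up in a few lines. The sketch below stores per-face texel grids in a dictionary and uses the adjacency map to continue a lookup into the neighbouring face when it walks off the +u edge; it assumes the two faces share resolution and orientation, which the real format does not require, and it performs no filtering. The class and method names are invented for illustration.

```python
import numpy as np

RES = 4   # texels per face edge (uniform here; real Ptex allows per-face resolutions)

class PerFaceTexture:
    """Minimal per-face texture store with an adjacency map between quad faces."""

    def __init__(self, num_faces):
        # One RES x RES RGB grid per face, initialised to a per-face constant colour.
        self.texels = {f: np.full((RES, RES, 3), f / num_faces) for f in range(num_faces)}
        # adjacency[face] = face across the +u edge (None at a mesh boundary).
        self.adjacency = {f: None for f in range(num_faces)}

    def connect_u(self, face, neighbour):
        """Declare `neighbour` to lie across the +u edge of `face`."""
        self.adjacency[face] = neighbour

    def fetch(self, face, i, j):
        """Nearest-texel fetch; walking off the +u edge continues into the neighbour."""
        if i >= RES:
            neighbour = self.adjacency[face]
            if neighbour is None:
                i = RES - 1                      # clamp at a mesh boundary
            else:
                face, i = neighbour, i - RES     # continue across the shared edge
        return self.texels[face][np.clip(i, 0, RES - 1), np.clip(j, 0, RES - 1)]

# Two quad faces stitched along one edge.
tex = PerFaceTexture(num_faces=2)
tex.connect_u(0, 1)
print("inside face 0  :", tex.fetch(0, 1, 2))
print("across the seam:", tex.fetch(0, RES, 2))   # lands in face 1
```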

Interactive Visualization with Programmable Graphics Hardware
COMPUTER GRAPHICS FORUM, Issue 3 2002
Thomas Ertl
One of the main scientific goals of visualization is the development of algorithms and appropriate data models that facilitate interactive visual analysis and direct manipulation of the increasingly large data sets resulting from simulations running on massively parallel computer systems, from measurements employing fast high-resolution sensors, or from large databases and hierarchical information spaces. This task can only be achieved by optimizing all stages of the visualization pipeline: filtering, compression, and feature extraction of the raw data sets; adaptive visualization mappings that allow users to choose between speed and accuracy; and exploitation of new graphics hardware features for fast and high-quality rendering. The recent introduction of advanced programmability in widely available graphics hardware has already led to impressive progress in the area of volume visualization. However, besides accelerating the final rendering, flexible graphics hardware is increasingly being used for the mapping and filtering stages of the visualization pipeline as well, giving rise to new levels of interactivity in visualization applications. The talk will present recent results of applying programmable graphics hardware in various visualization algorithms covering volume data, flow data, terrains, NPR rendering, and distributed and remote applications. [source]

Adaptive structured parallelism for distributed heterogeneous architectures: a methodological approach with pipelines and farms
CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 15 2010
Horacio González-Vélez
Algorithmic skeletons abstract commonly used patterns of parallel computation, communication, and interaction. Based on the algorithmic skeleton concept, structured parallelism provides a high-level parallel programming technique that allows the conceptual description of parallel programs while fostering platform independence and algorithm abstraction. This work presents a methodology to improve skeletal parallel programming in heterogeneous distributed systems by introducing adaptivity through resource awareness. As we hypothesise that a skeletal program should be able to adapt to the dynamic resource conditions over time using its structural forecasting information, we have developed adaptive structured parallelism (ASPARA). ASPARA is a generic methodology for incorporating structural information into a parallel program at compilation time, which helps it to adapt at execution time. ASPARA comprises four phases: programming, compilation, calibration, and execution. We illustrate the feasibility of this approach and its associated performance improvements using independent case studies based on two algorithmic skeletons, the task farm and the pipeline, evaluated in a non-dedicated heterogeneous multi-cluster system. Copyright © 2010 John Wiley & Sons, Ltd. [source]
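
The task farm and the pipeline are the two skeletons used in the case studies, and both can be written down in a few lines of plain Python as a reference for what a skeleton library abstracts away. The sketch below is a generic illustration, not the ASPARA interface; the stage functions and input data are made up, the farm runs its tasks on a local process pool, and the pipeline here is just a lazy composition of stages rather than a placed, concurrently executing one.

```python
from multiprocessing import Pool

def stage_one(x):
    """First pipeline stage (hypothetical work)."""
    return x * x

def stage_two(x):
    """Second pipeline stage (hypothetical work)."""
    return x + 1

def farm(worker, tasks, workers=4):
    """Task-farm skeleton: independent tasks fanned out to a pool of workers."""
    with Pool(processes=workers) as pool:
        return pool.map(worker, tasks)

def pipeline(stages, stream):
    """Pipeline skeleton: each stage consumes the previous stage's output stream."""
    for stage in stages:
        stream = map(stage, stream)
    return list(stream)

if __name__ == "__main__":
    data = range(16)
    print("farm    :", farm(stage_one, data))
    print("pipeline:", pipeline([stage_one, stage_two], data))
```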

IRSS Psychology Theory: Telling Experiences Among Underrepresented IS Doctorates
DECISION SCIENCES JOURNAL OF INNOVATIVE EDUCATION, Issue 2 2006
Fay Cobb Payton
With the changing demographics of the American workforce, the National Science Foundation, along with the U.S. Department of Commerce, has highlighted the shortage of minorities in information technology (IT) careers (http://www.ta.doc.gov/Reports/itsw/itsw.pdf). Using data from a 6-year period and the psychology Involvement-Regimen-Self Management-Social (IRSS) network theory as defined by Boice (1992), we discuss lessons learned from mentoring a group of Information Systems doctoral students who are members of a pipeline that can potentially increase the number of underrepresented faculty in business schools and who made conscious decisions to renounce the IT corporate domain. While our lessons speak to the need for more diversity awareness, we conclude that effective mentoring for underrepresented groups can and should include faculty of color (though limited in numbers) as well as majority faculty who are receptive to the needs and cultural differences of these student groups. Lastly, we draw on the work of Ethnic America to provide additional insight, not offered by IRSS network theory, into our findings. [source]

Role of medicines in malaria control and elimination
DRUG DEVELOPMENT RESEARCH, Issue 1 2010
Marian Warsame
Antimalarial medicines are important tools to cure and prevent malaria infections, thereby averting death and disability; their role in reducing the transmission of malaria is becoming increasingly important. Effective medicines that are currently available include artemisinin-based combination therapies (ACTs) for uncomplicated malaria, parenteral and rectal formulations of artemisinin derivatives and quinine injectables for severe malaria, and primaquine as an anti-relapse agent. These medicines are not optimal, however, owing to safety considerations in specific risk groups, complex regimens, and less than optimal formulations. The efficacy of antimalarial medicines, including currently used ACTs, is threatened by parasite resistance. Resistance to artemisinins has recently been identified at the Cambodia-Thailand border. Intermittent preventive treatment is constrained by the lack of a replacement for sulfadoxine-pyrimethamine. Despite increasing financial support to procure medicines, access to medicines by populations at risk of malaria, particularly in African countries, remains poor. This is largely due to weak health systems that are unable to deliver quality diagnostics and medicines through an efficient supply chain, close at hand to the sick patient, especially in remote rural areas. Health systems are also challenged by incorrect prescribing practices in the informal and often unregulated private sector (an important provider of medicines for malaria) and by the proliferation of counterfeit and substandard medicines. The provision of more equitable access to life-saving medicines requires no less than a steady drug development pipeline for new medicines tailored to meet the challenging conditions in endemic countries: ideally single-dose, highly effective against both disease- and relapse-causing parasites and infective forms, extremely safe, long in shelf life, and made available at affordable prices. Drug Dev Res 71: 4-11, 2010. © 2010 Wiley-Liss, Inc. [source]

Multi-scale system reliability analysis of lifeline networks under earthquake hazards
EARTHQUAKE ENGINEERING AND STRUCTURAL DYNAMICS, Issue 3 2010
Junho Song
Recent earthquake events have shown that damage to structural components in a lifeline network may cause prolonged disruption of lifeline services, which eventually results in significant socio-economic losses in the affected area. Despite recent advances in network reliability analysis, the complexity of the problem and various uncertainties still make it challenging to evaluate the post-hazard performance and connectivity of lifeline networks efficiently and accurately. In order to overcome such challenges and take advantage of the merits of multi-scale analysis, this paper develops a multi-scale system reliability analysis method by integrating a network decomposition approach with the matrix-based system reliability (MSR) method. In addition to facilitating system reliability analysis of large networks, the multi-scale approach enables optimizing the level of computational effort on subsystems; identifying the relative importance of components and subsystems at multiple scales; and providing a collaborative risk management framework. The MSR method is applied uniformly to the system reliability analyses at both the lower scale (link failure) and the higher scale (system connectivity) to obtain the probability of general system events, various conditional probabilities, component importance measures, statistical correlation between subsystem failures, and parameter sensitivities. The proposed multi-scale analysis method is demonstrated by its application to a gas distribution network in Shelby County, Tennessee. A parametric study is performed to determine the number of segments used in the lower-scale MSR analysis of each pipeline based on the strength of the spatial correlation of seismic intensity. It is shown that the spatial correlation should be considered at both scales for accurate reliability evaluation. The proposed multi-scale analysis approach provides an effective framework for risk assessment and decision support for lifeline networks under earthquake hazards. Copyright © 2009 John Wiley & Sons, Ltd. [source]
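
The quantity at the heart of such an analysis, the probability that a source node remains connected to a demand node when individual links fail, can be estimated for a small network by plain Monte Carlo simulation, which makes a useful sanity check for more sophisticated approaches. The sketch below does this for a small made-up network with made-up link failure probabilities; it is not the matrix-based system reliability method of the paper.

```python
import random
from collections import defaultdict

# Hypothetical lifeline network: links with per-event failure probabilities.
LINKS = {("source", "a"): 0.05, ("source", "b"): 0.10, ("a", "b"): 0.02,
         ("a", "c"): 0.08, ("b", "c"): 0.05, ("c", "demand"): 0.03,
         ("b", "demand"): 0.15}

def connected(surviving_links, start="source", goal="demand"):
    """Graph search over the links that survived one simulated event."""
    graph = defaultdict(set)
    for u, v in surviving_links:
        graph[u].add(v)
        graph[v].add(u)
    frontier, seen = [start], {start}
    while frontier:
        node = frontier.pop()
        if node == goal:
            return True
        for nxt in graph[node] - seen:
            seen.add(nxt)
            frontier.append(nxt)
    return False

def connectivity_reliability(trials=100_000, seed=0):
    rng = random.Random(seed)
    ok = 0
    for _ in range(trials):
        surviving = [link for link, p_fail in LINKS.items() if rng.random() >= p_fail]
        ok += connected(surviving)
    return ok / trials

print(f"P(source stays connected to demand) ~ {connectivity_reliability():.4f}")
```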

Feasibility of using impedance-based damage assessment for pipeline structures
EARTHQUAKE ENGINEERING AND STRUCTURAL DYNAMICS, Issue 10 2001
Gyuhae Park
This paper examines the feasibility of using an impedance-based health monitoring technique to monitor a critical civil facility. The objective of this research is to exploit the capability of the impedance method to identify structural damage in applications where very quick condition monitoring is urgently needed, such as post-earthquake analysis of a pipeline system. The basic principle behind this technique is to apply high-frequency structural excitation (typically greater than 30 kHz) through surface-bonded piezoelectric sensors/actuators and to detect changes in structural point impedance due to the presence of damage. Real-time damage detection in pipes connected by bolted joints was investigated, and the capability of the impedance method to track and monitor the integrity of a typical civil facility has been demonstrated. Data collected from the tests illustrate the capability of this technology to detect imminent damage under normal operating conditions and immediately after a natural disaster. Copyright © 2001 John Wiley & Sons, Ltd. [source]
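
In the impedance-based monitoring literature, damage is commonly flagged by comparing the measured impedance signature against a healthy baseline using a scalar metric such as the root-mean-square deviation (RMSD) of the real part over the monitored frequency band. The sketch below computes that metric for two synthetic spectra and applies a threshold; the spectra, the simulated damage feature and the threshold value are all made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
freqs = np.linspace(30e3, 40e3, 400)                # monitored band, 30-40 kHz

# Synthetic real-part impedance signatures (placeholders for measured data).
baseline = 100 + 20 * np.sin(freqs / 1e3) + rng.normal(0, 0.5, freqs.size)
damaged  = baseline + 8 * np.exp(-((freqs - 36e3) / 300.0) ** 2)   # simulated local shift

def rmsd_percent(current, reference):
    """Root-mean-square deviation of the impedance signature, in percent."""
    return 100.0 * np.sqrt(np.sum((current - reference) ** 2) / np.sum(reference ** 2))

THRESHOLD = 1.0   # percent; in practice set from repeat measurements on the healthy structure
metric = rmsd_percent(damaged, baseline)
print(f"RMSD damage metric = {metric:.2f}%  ->  "
      f"{'damage indicated' if metric > THRESHOLD else 'healthy'}")
```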