Tedious
Selected Abstracts

BetweenIT: An Interactive Tool for Tight Inbetweening
COMPUTER GRAPHICS FORUM, Issue 2 2010
Brian Whited
Abstract: The generation of inbetween frames that interpolate a given set of key frames is a major component in the production of a 2D feature animation. Our objective is to considerably reduce the cost of the inbetweening phase by offering an intuitive and effective interactive environment that automates inbetweening when possible while allowing the artist to guide, complement, or override the results. Tight inbetweens, which interpolate similar key frames, are particularly time-consuming and tedious to draw. Therefore, we focus on automating these high-precision and expensive portions of the process. We have designed a set of user-guided semi-automatic techniques that fit well with current practice and minimize the number of required artist gestures. We present a novel technique for stroke interpolation from only two keys which combines a stroke motion constructed from logarithmic spiral vertex trajectories with a stroke deformation based on curvature averaging and twisting warps. We discuss our system in the context of a feature animation production environment and evaluate our approach with real production data. [source]

ProcDef: Local-to-global Deformation for Skeleton-free Character Animation
COMPUTER GRAPHICS FORUM, Issue 7 2009
Takashi Ijiri
Abstract: Animations of characters with flexible bodies such as jellyfish, snails, and hearts are difficult to design using traditional skeleton-based approaches. A standard approach is keyframing, but adjusting the shape of the flexible body for each key frame is tedious. In addition, the character cannot dynamically adjust its motion to respond to the environment or user input. This paper introduces a new procedural deformation framework (ProcDef) for designing and driving animations of such flexible objects. Our approach is to synthesize global motions procedurally by integrating local deformations.
ProcDef provides an efficient design scheme for local deformation patterns; the user can control the orientation and magnitude of local deformations as well as the propagation of deformation signals by specifying line charts and volumetric fields. We also present a fast and robust deformation algorithm based on shape-matching dynamics and show some example animations to illustrate the feasibility of our framework. [source]

LazyBrush: Flexible Painting Tool for Hand-drawn Cartoons
COMPUTER GRAPHICS FORUM, Issue 2 2009
Daniel Sýkora
Abstract: In this paper we present LazyBrush, a novel interactive tool for painting hand-made cartoon drawings and animations. Its key advantage is simplicity and flexibility. As opposed to previous custom-tailored approaches [SBv05, QWH06], LazyBrush does not rely on style-specific features such as homogeneous regions or pattern continuity, yet still offers comparable or even less manual effort for a broad class of drawing styles. In addition, it is not sensitive to imprecise placement of color strokes, which makes painting less tedious and brings significant time savings in the context of cartoon animation. LazyBrush originally stems from a requirements analysis carried out with professional ink-and-paint illustrators, who established a list of useful features for an ideal painting tool. We incorporate this list into an optimization framework leading to a variant of Potts energy with several interesting theoretical properties. We show how to minimize it efficiently and demonstrate its usefulness in various practical scenarios, including the ink-and-paint production pipeline. [source]

Performance comparison of checkpoint and recovery protocols
CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 15 2003
Himadri Sekhar Paul
Abstract: Checkpoint and rollback recovery is a well-known technique for providing fault tolerance to long-running distributed applications.
Performance of a checkpoint and recovery protocol depends on the characteristics of the application and the system on which it runs. However, given an application and system environment, there is no easy way to identify which checkpoint and recovery protocol will be most suitable. Conventional approaches require implementing the application with all the protocols under consideration, running them on the desired system, and comparing their performances. This process can be very tedious and time-consuming. This paper first presents the design and implementation of a simulation environment, distributed process simulation or dPSIM, which enables easy implementation and evaluation of checkpoint and recovery protocols. The tool enables the protocols to be simulated under a wide variety of application, system, and network characteristics. The paper then presents a performance evaluation of five checkpoint and recovery protocols. These protocols are implemented and executed in dPSIM under different simulated application, system, and network characteristics. Copyright © 2003 John Wiley & Sons, Ltd. [source]

Follicular Unit Transplantation: The Option of Beard Construction in Eunuchoid Men
DERMATOLOGIC SURGERY, Issue 9 2002
Kayihan Şahinoglu MD
Background: Psychosocial problems are very common in eunuchoid men and may be related to the impact of the underlying disorders on physical appearance, which leaves them unable to overcome a childhood sense of inferiority. A beardless patient treated with follicular unit transplantation (FUT) is reported here. Objective: Such patients desire to get rid of a boyish appearance and want to achieve a masculine one. One of the easiest methods to achieve this goal is FUT. Methods: Using an 18-gauge needle, the recipient bed was prepared under local anesthesia after premedication, and 1200 one- or two-hair micrografts were transplanted to the perioral (goatee) area and its extensions to the sideburns. Results:
After completion of the procedure in the planned area, we achieved restoration of a masculine appearance, with which the patient appeared quite satisfied. Conclusion: The process of beard reconstruction is time-consuming and tedious, but highly effective. [source]

Graphic and movie illustrations of human prenatal development and their application to embryological education based on the human embryo specimens in the Kyoto collection
DEVELOPMENTAL DYNAMICS, Issue 2 2006
Shigehito Yamada
Abstract: Morphogenesis in the developing embryo takes place in three dimensions; in addition, the dimension of time is another important factor in development. Therefore, the presentation of sequential morphological changes occurring in the embryo (4D visualization) is essential for understanding the complex morphogenetic events and the underlying mechanisms. Until recently, 3D visualization of embryonic structures was possible only by reconstruction from serial histological sections, which was tedious and time-consuming. During the past two decades, 3D imaging techniques have made significant advances thanks to progress in imaging and computer technologies, computer graphics, and other related techniques. Such novel tools have enabled precise visualization of the 3D topology of embryonic structures and demonstration of spatiotemporal 4D sequences of organogenesis. Here, we describe a project in which staged human embryos were imaged with a magnetic resonance (MR) microscope, and 3D images of embryos and their organs at each developmental stage were reconstructed from the MR data with the aid of computer graphics techniques. On the basis of the 3D models of staged human embryos, we constructed a data set of 3D images of human embryos and made movies to illustrate the sequential process of human morphogenesis.
Furthermore, a computer-based self-learning program in human embryology is being developed for educational purposes, using the photographs, histological sections, MR images, and 3D models of staged human embryos. Developmental Dynamics 235:468-477, 2006. © 2005 Wiley-Liss, Inc. [source]

An optimal method of DNA silver staining in polyacrylamide gels
ELECTROPHORESIS, Issue 8 2007
Yun-Tao Ji
Abstract: Silver staining is widely used to detect DNA fragments on polyacrylamide gels with high sensitivity. The conventional silver staining procedure is tedious: it takes about 40-60 min and needs five or six chemicals and four solutions. Although our previously improved method eliminated several steps, it still needed six chemicals. The objective of this study was to further improve the existing procedures and develop an optimal method for DNA silver staining on polyacrylamide gels. The novel procedure can be completed with only four chemicals and two solutions within 20 min. The ethanol, acetic acid, and nitric acid steps before silver impregnation have been eliminated, and a minimal AgNO3 dose is used. The silver-stained polyacrylamide gel displayed a golden yellow, transparent background with high sensitivity: as little as 0.44 ng and 3.5 ng of DNA could be detected in denaturing and nondenaturing polyacrylamide gels, respectively. These results indicate that our optimal method saves time and cost while maintaining high sensitivity for DNA staining in polyacrylamide gels. [source]

Ease of reading of mandatory information on Canadian food product labels
INTERNATIONAL JOURNAL OF CONSUMER STUDIES, Issue 4 2009
Mary Alton Mackey
Abstract: Food product labels present individual product information, safety, nutrition, electronic inventory, container, and environmental information in various formats, languages, and images.
Some information is mandatory; much is promotional. The food label is an essential tool for regulators of safe food handling, nutrition policy, and fair competition. Mandatory information on food labels in Canada is required to be presented in both English and French, readily discernible, prominently displayed, and legible. This study examines the ease of finding and reading mandatory label components on selected Canadian food products. A validated typographical scoring system assessed the lists of ingredients on a purposive sample of 100 food labels representing foods in all groups in Canada's Food Guide. Seven percent of the ingredient lists were easy to read; 26% were difficult to read and 67% were very difficult to read. Well-educated, resourceful readers in consumer focus groups examined food labels for key elements that influence ease of finding and reading information. Focus groups and typographical scoring identified colour contrast, case, print style, print size, space between the lines, reverse print, organization, justification, type of surface, hyphenation, and print reproduction as factors that affect ease of reading. Print that curves around a container and lack of paragraphing or point-form organization make reading difficult; text blocks at right angles to each other make comparisons difficult; separation of the nutrition facts table from the list of ingredients makes decision making tedious. Inadequate spacing between lines of print creates problems for readers of English and exacerbates them for readers of French. Words placed over illustrations, busy backgrounds, or watermarks increase reading difficulty. Hazard statements, instructions, and storage information embedded in other information without added space or appropriate headings are difficult to find and read.
Canadian consumers echo consumers in 28 European countries who find label information difficult to find and to read, and who want clear guidelines and regulations on the placement and typography of mandatory food label components. [source]

Silver-Catalyzed One-Pot Cyclization Reaction of Electron-Deficient Alkynes and 2-Yn-1-ols: An Efficient Domino Process to Polysubstituted Furans
ADVANCED SYNTHESIS & CATALYSIS (PREVIOUSLY: JOURNAL FUER PRAKTISCHE CHEMIE), Issue 1 2010
Hua Cao
Abstract: Transition metal-catalyzed domino reactions have been used as powerful tools for the one-pot preparation of polysubstituted furans. In this paper, an efficient synthetic method is developed for the construction of tri- or tetrasubstituted furans from electron-deficient alkynes and 2-yn-1-ols by a silver-catalyzed domino reaction. It is especially noteworthy that a 2,3,5-trisubstituted 4-ynyl-furan was formally obtained in an extremely direct manner without tedious stepwise synthesis. In addition, regioisomeric furans were observed when substituted aryl alkynyl ketones were employed. This methodology represents a highly efficient synthetic route to electron-deficient furans, for which catalytic approaches are scarce. The reaction proceeds efficiently under mild conditions with commercially available catalysts and materials. [source]

A software framework for fast prototyping of meta-heuristics hybridization
INTERNATIONAL TRANSACTIONS IN OPERATIONAL RESEARCH, Issue 2 2007
Hoong Chuin Lau
Abstract: Hybrids of meta-heuristics have been shown to be more effective and adaptable than their parents in solving combinatorial optimization problems. However, hybridized schemes are also more tedious to implement due to their increased complexity. We address this problem by proposing the meta-heuristics development framework (MDF).
In addition to being a framework that promotes software reuse to reduce development effort, the key strength of MDF lies in its ability to model meta-heuristics using a "request, sense and response" schema, which decomposes algorithms into a set of well-defined modules that can be flexibly assembled through a centralized controller. Under this schema, a hybrid becomes an event-based search that can adaptively trigger a desired parent's behavior in response to search events. MDF can hence be used to design and implement a wide spectrum of hybrids with varying degrees of collaboration, offering algorithm designers a quick turnaround in designing and testing their meta-heuristics. This is illustrated in the paper through the construction of hybrid schemes using ant colony optimization and tabu search. [source]

Effect of flow regimes on the presence of Legionella within the biofilm of a model plumbing system
JOURNAL OF APPLIED MICROBIOLOGY, Issue 2 2006
Z. Liu
Aims: Stagnation is widely believed to predispose water systems to colonization by Legionella. A model plumbing system was constructed to determine the effect of flow regimes on the presence of Legionella within microbial biofilms. Methods and Results: The plumbing model contained three parallel pipes in which turbulent, laminar, and stagnant flow regimes were established. Four sets of experiments were carried out, with Reynolds numbers from 10 000 to 40 000 in the turbulent pipe and from 355 to 2000 in the laminar pipe. Legionella counts recovered from biofilm and planktonic water samples of the three sampling pipes were compared to determine the effect of flow regime on the presence of Legionella. Significantly higher colony counts of Legionella were recovered from the biofilm of the pipe with turbulent flow than from the pipe with laminar flow. The lowest counts were in the pipe with stagnant flow.
Conclusions: We were unable to demonstrate that stagnant conditions promoted Legionella colonization. Significance and Impact of the Study: Plumbing modifications to remove areas of stagnation, including deadlegs, are widely recommended, but these modifications are tedious and expensive to perform. Controlled studies in large buildings are needed to validate this unproved hypothesis. [source]

Biodiesel production by direct methanolysis of oleaginous microbial biomass
JOURNAL OF CHEMICAL TECHNOLOGY & BIOTECHNOLOGY, Issue 8 2007
Bo Liu
Abstract: Biodiesel is a renewable fuel conventionally prepared by transesterification of pre-extracted vegetable oils and animal fats of all sources with methanol, catalyzed by strong acids or bases. This paper reports a novel biodiesel production method that features acid-promoted direct methanolysis of the cellular biomass of oleaginous yeasts and filamentous fungi. The process was optimized by tuning operating parameters such as methanol dosage, catalyst concentration, reaction temperature, and time. A yield of up to 98% was reached at 70 °C under ambient pressure for 20 h with a dried-biomass-to-methanol ratio of 1:20 (w/v), catalyzed by either 0.2 mol L-1 H2SO4 or 0.4 mol L-1 HCl. Cetane numbers for these products were estimated to range from 56 to 59. This integrated method is thus effective and technically attractive, as using dried microbial biomass as the feedstock omits the otherwise tedious and time-consuming oil extraction process. Copyright © 2007 Society of Chemical Industry [source]

Combination of support vector machines (SVM) and near-infrared (NIR) imaging spectroscopy for the detection of meat and bone meal (MBM) in compound feeds
JOURNAL OF CHEMOMETRICS, Issue 7-8 2004
J. A. Fernández Pierna
Abstract: This study concerns the development of a new system to detect meat and bone meal (MBM) in compound feeds, which will be used to enforce legislation concerning feedstuffs enacted after the European mad cow crisis.
Focal plane array near-infrared (NIR) imaging spectroscopy, which collects thousands of spatially resolved spectra in a massively parallel fashion, has been suggested as a more efficient alternative to the current methods, which are tedious and require significant expert human analysis. Chemometric classification strategies have been applied to automate the method and reduce the need for constant expert analysis of the data. In this work the performance of a new multivariate classification method, support vector machines (SVM), was compared with that of two classical chemometric methods, partial least squares (PLS) and artificial neural networks (ANN), in classifying feed particles as either MBM or vegetal using the spectra from NIR images. While all three methods were able to model the data effectively, SVM was found to perform substantially better than PLS and ANN, exhibiting a much lower rate of false positive detection. Copyright © 2004 John Wiley & Sons, Ltd. [source]

Assessing the ecological integrity of a grassland ecosystem: the applicability and rapidity of the SAGraSS method
AFRICAN JOURNAL OF ECOLOGY, Issue 3 2009
W. Kaiser
Abstract: The Grassland Biome is currently one of the most threatened biomes in South Africa and is in dire need of a biomonitoring protocol. The components of ecological integrity in these ecosystems are, however, too diverse and time-consuming to measure scrupulously. It is therefore necessary to develop a set of grassland indicators that are efficient and rapid in their assessment of grassland ecosystem integrity. The South African Grassland Scoring System (SAGraSS), based on the grassland insect community, is one such suggested indicator. The present study is the first to investigate the applicability and rapidity of this proposed method.
Although SAGraSS scores correlated significantly with Ecological Index values (the index most commonly used to evaluate veld condition in central South Africa), the method proved tedious and the identification of insects taxing. We offer a number of changes to make SAGraSS a more rapid method of assessment. Résumé: The Grassland Biome is today one of the most threatened in South Africa and is in serious need of a biomonitoring protocol. The components of the ecological integrity of these ecosystems are, however, too diverse, and measuring them scrupulously would take too much time. It is therefore necessary to develop a set of grassland indicators that are efficient and allow rapid assessment of the integrity of these ecosystems. The South African Grassland Scoring System (SAGraSS), based on the grassland insect community, is one proposed indicator. This study is the first to analyse the applicability and rapidity of this method. Although SAGraSS results were significantly related to Ecological Index values (EI, the index most used to evaluate the ecological condition of the veld in central South Africa), the method proved laborious, and insect identification rather slow. We propose a number of changes to make the SAGraSS method a more rapid method of assessment. [source]

Predation and the persistence of melanic male mosquitofish (Gambusia holbrooki)
JOURNAL OF EVOLUTIONARY BIOLOGY, Issue 3 2004
L. Horth
Abstract: The empirical reasons for the persistent rarity of a genotype are typically complex and tedious to identify, particularly in nature. Yet rare morphs occur in a substantial fraction of phenotypic polymorphisms. A colour polymorphism has persisted for decades in the eastern mosquitofish, yet why this is so remains obscure.
Here, I report the results of (1) intensive sampling at 45 natural sites to obtain the frequency distribution of the melanic (black) mosquitofish morph in Florida, (2) predation trials conducted independently in mesocosms with three different predatory species, and (3) two mark-recapture studies conducted in nature. This work (1) documents the rarity of melanic mosquitofish in nature, (2) demonstrates that melanic males experience a selective advantage over silver males in the presence of predators, and (3) indicates no difference in the colour morphs' survival at a natural site essentially devoid of these predators, although it suggests a higher rate of recapture for melanic males at a site rife with predators. Overall, selective predation appears to contribute to the persistence of the melanic morph, despite its rarity in nature. [source]

Magnetic resonance brain perfusion imaging with voxel-specific arterial input functions
JOURNAL OF MAGNETIC RESONANCE IMAGING, Issue 3 2006
Renate Grüner MSc
Abstract: Purpose: To propose an automatic method for estimating voxel-specific arterial input functions (AIFs) in dynamic contrast brain perfusion imaging. Materials and Methods: Voxel-specific AIFs were estimated blindly using the theory of homomorphic transformations and complex cepstrum analysis. Wiener filtering was used in the subsequent deconvolution. The method was verified using simulated data and evaluated in 10 healthy adults. Results: Computer simulations accurately estimated differently shaped, normalized AIFs. Simple Wiener filtering resulted in underestimation of flow values. Preliminary in vivo results showed comparable cerebral flow value ratios between gray matter (GM) and white matter (WM) when using blindly estimated voxel-specific AIFs or a single manually selected AIF. Significant differences (P ≤ 0.0125) in mean transit time (MTT) and time-to-peak (TTP) in GM compared to WM were seen with the new method.
Conclusion: Initial results suggest that the proposed method can replace the tedious and difficult task of manually selecting an AIF, while simultaneously providing better differentiation between time-dependent hemodynamic parameters. J. Magn. Reson. Imaging 2006. © 2006 Wiley-Liss, Inc. [source]

Empirical-based recovery and maintenance of input error-correction features
JOURNAL OF SOFTWARE MAINTENANCE AND EVOLUTION: RESEARCH AND PRACTICE, Issue 6 2007
Minh Ngoc Ngo
Abstract: Most information systems deal with inputs submitted from their external environments. In such systems, input validation is often incorporated to reject erroneous inputs. Unfortunately, many input errors cannot be detected automatically and therefore result in errors in the effects raised by the system. Therefore, the provision of input error-correction features (IECFs) to correct these erroneous effects is critical. However, recovery and maintenance of these features are complicated, tedious, and error-prone, because there are many possible input errors during user interaction with the system, and each input error might in turn result in several erroneous effects. Through empirical study, we have discovered some interesting control flow graph patterns in the implementation of IECFs in information systems. Motivated by these initial findings, in this paper we propose an approach to the automated recovery of IECFs by recognizing these patterns in the source code. On the basis of the recovered information, we further propose a decomposition-slicing technique to aid the maintenance of these features without interfering with other parts of the system. A case study has been conducted to show the usefulness of the proposed approach. Copyright © 2007 John Wiley & Sons, Ltd. [source]

How do APIs evolve? A story of refactoring
JOURNAL OF SOFTWARE MAINTENANCE AND EVOLUTION: RESEARCH AND PRACTICE, Issue 2 2006
Abstract: Frameworks and libraries change their APIs.
Migrating an application to the new API is tedious and disrupts the development process. Although some tools and ideas have been proposed to support the evolution of APIs, most updates are still done manually. To better understand the requirements for migration tools, we studied the API changes of four frameworks and one library. We discovered that the changes that break existing applications are not random, but tend to fall into particular categories. Over 80% of these changes are refactorings. This suggests that refactoring-based migration tools should be used to update applications. Copyright © 2006 John Wiley & Sons, Ltd. [source]

CD146-based immunomagnetic enrichment followed by multiparameter flow cytometry: a new approach to counting circulating endothelial cells
JOURNAL OF THROMBOSIS AND HAEMOSTASIS, Issue 5 2008
A. WIDEMANN
Summary. Background: Circulating endothelial cells (CECs) have emerged as non-invasive biomarkers of vascular dysfunction. The most widely used method for their detection is CD146-based immunomagnetic separation (IMS). Although this approach has provided consensus values in both normal and pathologic situations, it remains tedious and requires a trained operator. Objectives: Our objective was to evaluate a new hybrid assay for CEC measurement that combines pre-enrichment of CD146+ circulating cells with multiparametric flow cytometry (FCM). Patients and Methods: CECs were determined in peripheral blood from 20 healthy volunteers, 12 patients undergoing coronary angioplasty, and 30 renal transplant recipients, and in blood spiked with cultured endothelial cells. CD146+ cells were isolated using CD146-coated magnetic nanoparticles and labeled with CD45-fluorescein isothiocyanate and CD146-PE or an isotype control antibody, plus propidium iodide, before FCM. The same samples were also processed using CD146-based immunomagnetic separation as the reference method.
Results: The hybrid assay detected CECs, identified as CD45dim/CD146bright/propidium iodide+ events with high size-related scatter characteristics, and clearly discriminated them from CD45bright/CD146dim activated T lymphocytes. The method demonstrated both high recovery efficiency and good reproducibility. Both IMS and the hybrid assay identified similarly increased CEC levels in patients undergoing coronary angioplasty and renal transplantation compared with healthy controls. In patients, CEC values from the two methods were of the same order of magnitude and highly correlated. Bland-Altman analysis revealed poor statistical agreement between methods, with ferrofluid-FCM providing higher values than IMS. Conclusion: This new hybrid FCM assay constitutes an accurate alternative to visual counting of CECs following CD146-based IMS. [source]

Transformations and seasonal adjustment
JOURNAL OF TIME SERIES ANALYSIS, Issue 1 2009
Tommaso Proietti
Abstract: We address the problem of seasonal adjustment of a nonlinear transformation of the original time series, measured on a ratio scale, which aims at enforcing two essential features: additivity and orthogonality of the components. The posterior mean and variance of the seasonally adjusted series admit an analytic finite representation only for particular values of the transformation parameter, e.g. for a fractional Box-Cox transformation parameter. Even when available, the analytical derivation can be tedious and difficult. As an alternative, we propose to compute the two conditional moments of the seasonally adjusted series by means of numerical and Monte Carlo integration. The former is both fast and reliable in univariate applications. The latter uses the algorithm known as the 'simulation smoother' and is most useful in multivariate applications. We present two case studies dealing with robust seasonal adjustment under the square root and the fourth root transformations.
Our overall conclusion is that robust seasonal adjustment under transformations is computationally feasible and that the possibility of transforming the scale ought to be considered as a further option for improving the quality of seasonal adjustment. [source]

Locating knowledge sources through keyphrase extraction
KNOWLEDGE AND PROCESS MANAGEMENT: THE JOURNAL OF CORPORATE TRANSFORMATION, Issue 2 2006
Sara Tedmori
There are a large number of tasks for which keyphrases can be useful. Manually identifying keyphrases can be a tedious and time-consuming process that requires expertise, but automating it could save time and aid in creating metadata that could be used to locate knowledge sources. In this paper, the authors present an automated process for keyphrase extraction from e-mail messages. The process enables users to find other people who might hold the knowledge they require from information communicated via the e-mail system. The effectiveness of the extraction system is tested and compared against other extraction systems, and the overall value of extracting information from e-mail is explored. Copyright © 2006 John Wiley & Sons, Ltd. [source]

Multifunctional Magnetoplasmonic Nanoparticle Assemblies for Cancer Therapy and Diagnostics (Theranostics)
MACROMOLECULAR RAPID COMMUNICATIONS, Issue 2 2010
Wei Chen
Abstract: In this work, we describe the preparation and biomedical functionalities of complex nanoparticle assemblies with magnetoplasmonic properties suitable for simultaneous cancer therapy and diagnostics (theranostics). Most commonly, magnetoplasmonic nanostructures are made by careful adaptation of metal reduction protocols, which is both tedious and restrictive. Here we apply a nanoscale assembly strategy to prepare such systems from individual building blocks. The prepared superstructures are based on magnetic Fe3O4 nanoparticles encapsulated in a silica shell, representing the magnetic module.
The cores are surrounded in a corona-like fashion by gold nanoparticles representing the plasmonic module. As an additional functionality, they were also coated with poly(ethylene glycol) chains as a cloaking agent to extend blood circulation time. The preparation is exceptionally simple and allows one to vary the contribution of each function. Both modules can carry drugs, and in this study they were loaded with the potential anticancer drug curcumin. A comprehensive set of microscopy, spectroscopy, and biochemical methods was applied to characterize both the imaging and the therapeutic function of the nanoparticle assemblies against leukemia HL-60 cells. High-contrast magnetic resonance images and high apoptosis rates demonstrate the success of the assembly approach for the preparation of magnetoplasmonic nanoparticles. This technology allows one to easily "dial in" functionalities in the clinical setting for personalized theranostic regimens. [source]

An autonomous phase-boundary detection technique for colloidal hard sphere suspension experiments
MICROSCOPY RESEARCH AND TECHNIQUE, Issue 4 2006
Mark McDowell
Abstract: Colloidal suspensions of monodisperse spheres are used as physical models of thermodynamic phase transitions and as precursors to photonic band gap materials. Current techniques for identifying the phase boundaries involve manually identifying the phase transitions, which is very tedious and time-consuming. In addition, current image analysis techniques are not able to distinguish between densely packed phases within conventional microscope images, which are mainly characterized by degrees of randomness or order with similar grayscale properties. We have developed an intelligent machine vision technique that automatically identifies colloidal phase boundaries.
The technique utilizes intelligent image-processing algorithms that accurately identify and track phase changes, vertically or horizontally, across a sequence of colloidal hard-sphere suspension images. It is readily adaptable to any imaging application in which regions of interest are distinguished from the background by differing patterns of motion over time. Microsc. Res. Tech. 69:236–245, 2006. Published 2006 Wiley-Liss, Inc. [source]

A Strategic Planning Process for a Small Nonprofit Organization: A Hospice Example
NONPROFIT MANAGEMENT & LEADERSHIP, Issue 2 2000
Cynthia Massie Mara
Strategic planning is an essential part of management. However, planning processes can consume great amounts of time and resources that small nonprofit organizations may lack. Moreover, the process used can be tedious and may result in plans that are discarded before or during their implementation. In this article, a strategic planning process is presented that incorporates a Policy Delphi group technique and Situation Structuring, a computer program that assists participants in structuring or defining the problems to be addressed in the plan. The organization to which the process is applied is a small nonprofit hospice. Both the planning process and an evaluation of the implementation of the resultant strategic plan are examined. [source]

Ex vivo generation of cytokine-induced killer cells (CD3+ CD56+) from post-stem cell transplant pediatric patients against autologous Epstein–Barr virus–transformed lymphoblastoid cell lines
PEDIATRIC TRANSPLANTATION, Issue 5 2007
Sawang Petvises
Abstract: EBV-PTLDs affect up to 20% of SCT recipients, especially those with T-cell-depleted grafts, and carry high mortality rates. Adoptive allogeneic and autologous CTLs have therapeutic potential in this setting. However, expansion of these cells is tedious and time-consuming for both allogeneic and autologous CTL generation.
For allogeneic SCT, another major obstacle is the unavailability of donors, especially in an unrelated-SCT setting. The aim of the present study was therefore to investigate the efficacy of autologous CIK cells (CD3+ CD56+) against autologous EBV-LCLs from post-SCT pediatric patients. We demonstrated that CIK cells can be generated within two weeks and show significant cytotoxicity against autologous EBV-LCLs. CIK cells may provide a potent tool for post-transplantation adoptive immunotherapy. [source]

LO + EPSS = just-in-time reuse of content to support employee performance
PERFORMANCE IMPROVEMENT, Issue 6 2007
Frank Nguyen
Those involved in training know that creating instructional materials can become a tedious, repetitive process. They also know that business conditions often require training interventions to be delivered in ways that are not ideally structured or timed. This article examines the notion that learning objects can be reused and adapted for performance support systems. By doing so, a performance technologist can develop content for just-in-case training and reuse it for just-in-time performance support. [source]

An algorithm to derive a numerical daily dose from unstructured text dosage instructions
PHARMACOEPIDEMIOLOGY AND DRUG SAFETY, Issue 3 2006
Anoop D. Shah BSc
Abstract Purpose The General Practice Research Database (GPRD) is a database of longitudinal patient records from general practices in the United Kingdom. It is an important data source for pharmacoepidemiology studies, but until now it has been tedious to calculate the daily dose and duration of exposure for prescribed drugs. This is because general practitioners routinely record dosage instructions as free text rather than in a structured way. The objective was to develop and assess the validity of an automated algorithm to derive the daily dose from text dosage instructions.
Methods A computer program was developed to derive numerical information from unstructured text dosage instructions. It was tested on dosage texts from a random sample of one million prescription entries. A random sample of 1000 of these converted texts was manually checked for accuracy. Results Of the one million prescription entries, 74.5% had text containing the daily dose, 14.5% had text that did not include a quantitative daily dose statement, and 11.0% had no text entered. Of the 1000 manually checked texts, 767 stated the daily dose. The program interpreted 758 (98.8%) of these correctly, produced errors in four cases and failed to extract the dose from five texts. Conclusions An automated algorithm has been developed which can accurately extract the daily dose from almost 99% of general practitioners' text dosage instructions. It increases the utility of the GPRD and other prescription data sources by enabling researchers to estimate the duration of drug exposure more efficiently. Copyright © 2005 John Wiley & Sons, Ltd. [source]

Structural interpretation of mutations and SNPs using STRAP-NT
PROTEIN SCIENCE, Issue 1 2006
Christoph Gille
Abstract Visualization of residue positions in protein alignments and mapping onto suitable structural models is an important first step in the interpretation of mutations or polymorphisms in terms of protein function, interaction, and thermodynamic stability. Selecting and highlighting large numbers of residue positions in a protein structure can be time-consuming and tedious with currently available software. Previously, a series of tasks and analyses had to be performed one by one to map mutations onto 3D protein structures; STRAP-NT is an extension of STRAP that automates these tasks so that users can do so quickly and conveniently.
When the structure of the protein of interest is not yet available, a related protein can frequently be found in the structure databases. In this case, the alignment of the two proteins becomes the crucial part of the analysis. Therefore, we embedded these program modules into the Java-based multiple sequence alignment program STRAP-NT. STRAP-NT can simultaneously map an arbitrary number of mutations denoted using either the nucleotide or amino acid sequence. When the designations of the mutations refer to genomic sites, STRAP-NT translates them into the corresponding amino acid positions, taking intron–exon boundaries into account. STRAP-NT tightly integrates a number of current protein structure viewers (currently PYMOL, RASMOL, JMOL, and VMD) with which mutations and polymorphisms can be displayed directly on the 3D protein structure model. STRAP-NT is available at the PDB site and at http://www.charite.de/bioinf/strap/ or http://strapjava.de. [source]

Annotated regions of significance of SELDI-TOF-MS spectra for detecting protein biomarkers
PROTEINS: STRUCTURE, FUNCTION AND BIOINFORMATICS, Issue 23 2006
Chuen Seng Tan
Abstract Peak detection is a key step in the analysis of SELDI-TOF-MS spectra, but the current default method has low specificity and poor peak annotation. To improve data quality, scientists still have to validate the identified peaks visually, a tedious and time-consuming process, especially for large data sets. Hence, there is a genuine need for methods that minimize manual validation. We have previously reported a multi-spectral signal detection method with improved specificity, called RS for 'region of significance'. Here we extend it to include a peak quantification algorithm based on annotated regions of significance (ARS). For each spectral region flagged as significant by RS, we first identify a dominant spectrum that determines the number of peaks and the m/z region of each peak.
From each m/z peak region, a peak template is extracted from all spectra via principal component analysis. Finally, with the template, we estimate the amplitude and location of the peak in each spectrum by least squares and refine the amplitude estimate via a mixture model. We have evaluated the ARS algorithm on patient samples from a clinical study. Comparison with the standard method shows that ARS (i) inherits the superior specificity of RS, and (ii) gives more accurate peak annotations. In conclusion, we find that ARS alleviates the main problems in the preprocessing of SELDI-TOF spectra. The R package ProSpect that implements ARS is freely available for academic use at http://www.meb.ki.se/~yudpaw. [source]

Software utilities for the interpretation of mass spectrometric data of glycoconjugates: application to glycosphingolipids of human serum
RAPID COMMUNICATIONS IN MASS SPECTROMETRY, Issue 7 2010
Jamal Souady
Glycosphingolipids (GSLs) are major components of the outer leaflet of the cell membrane. These lipids are involved in many cell-surface events and show disease-related changes in expression. GSLs could thus serve as useful targets for biomarker discovery. The GSL structure is characterized by two entities: a hydrophilic glycan and a hydrophobic ceramide moiety. Both components exhibit numerous structural variations, the combination of which results in a large diversity of potential GSL structures. Mass spectrometry (MS) is a powerful tool for high-throughput GSL expression analysis and structural elucidation. Yet, the assignment of GSL structures from MS data is tedious and demands highly specialized expertise. SysBioWare, a software platform developed for MS data evaluation in glycomics, was here applied to the MS analysis of human serum GSLs. The program was tuned to provide automated compositional assignment, supporting a variety of glycan and ceramide structures.
Upon in silico fragmentation, the masses of predicted ions arising from cleavages in both the glycan and the ceramide moiety were calculated, enabling structural characterization of both entities. Proposed structures were validated by matching the in silico calculated fragment ions against experimental MS/MS data. These results indicate that SysBioWare can facilitate data interpretation and, furthermore, help the user handle large data sets by supporting management of MS and non-MS data. SysBioWare has the potential to be a powerful tool for high-throughput glycosphingolipidomics in clinical applications. Copyright © 2010 John Wiley & Sons, Ltd. [source]
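Several of the abstracts above describe rule-based extraction of structured values from free text; the GPRD daily-dose algorithm lends itself to a compact illustration. The sketch below is a hypothetical simplification, not the published program: the function name, unit list, and tiny frequency phrase table are inventions for illustration, and a real system would need a far larger, validated vocabulary.

```python
import re

# Illustrative frequency vocabulary (administrations per day); a real
# system would need a much larger, clinically validated phrase table.
FREQUENCIES = {
    "once daily": 1,
    "twice daily": 2,
    "three times daily": 3,
    "four times daily": 4,
    "every 12 hours": 2,
    "every 8 hours": 3,
    "every 6 hours": 4,
}

WORD_NUMBERS = {"one": 1, "two": 2, "three": 3, "four": 4}


def daily_dose(instruction):
    """Parse a free-text dosage instruction into units per day, or None."""
    text = instruction.lower()

    # Quantity per administration, e.g. "take 2 tablets" / "take one capsule".
    m = re.search(r"\b(\d+(?:\.\d+)?|one|two|three|four)\s+"
                  r"(?:tablet|capsule|puff|ml)s?\b", text)
    if m is None:
        return None
    token = m.group(1)
    quantity = WORD_NUMBERS.get(token) or float(token)

    # Administrations per day, from the frequency phrase table.
    for phrase, per_day in FREQUENCIES.items():
        if phrase in text:
            return quantity * per_day
    return None  # no quantitative frequency statement found


print(daily_dose("Take 2 tablets twice daily"))      # 4.0
print(daily_dose("take one capsule every 8 hours"))  # 3
print(daily_dose("apply to affected area"))          # None
```

Returning None mirrors the study's 14.5% of texts with no quantitative daily-dose statement; texts with no parseable quantity or frequency are flagged rather than guessed.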