User Intervention (user + intervention)

Selected Abstracts


Augmented reality agents for user interface adaptation

COMPUTER ANIMATION AND VIRTUAL WORLDS (PREV: JNL OF VISUALISATION & COMPUTER ANIMATION), Issue 1 2008
István Barakonyi
Abstract Most augmented reality (AR) applications are primarily concerned with letting a user browse a 3D virtual world registered with the real world. More advanced AR interfaces let the user interact with the mixed environment, but the virtual part is typically rather finite and deterministic. In contrast, autonomous behavior is often desirable in ubiquitous computing (Ubicomp), which requires the computers embedded in the environment to adapt to context and situation without explicit user intervention. We present an AR framework enhanced with typical Ubicomp features: it dynamically and proactively exploits previously unknown applications and hardware devices, and adapts the appearance of the user interface to persistently stored and accumulated user preferences. Our framework explores proactive computing, multi-user interface adaptation, and user interface migration. We employ mobile and autonomous agents embodied by real and virtual objects as an interface and interaction metaphor, where agent bodies are able to opportunistically migrate between multiple AR applications and computing platforms to best match the needs of the current application context. We present two pilot applications to illustrate the design concepts. Copyright © 2007 John Wiley & Sons, Ltd. [source]


Live Video Montage with a Rotating Camera

COMPUTER GRAPHICS FORUM, Issue 7 2009
Zilong Dong
Abstract High-quality video editing usually requires accurate layer separation in order to resolve occlusions. However, most existing bilayer segmentation algorithms require either considerable user intervention or a simple stationary-camera configuration with a known background, conditions that are difficult to meet in many real-world online applications. This paper demonstrates that various visually appealing montage effects can be created online from a live video captured by a rotating camera, by accurately retrieving the camera state and segmenting out the dynamic foreground. The key contribution is a novel, fast bilayer segmentation method that effectively extracts the dynamic foreground under a rotational camera configuration and is robust to imperfect background estimation and complex background colors. Our system can create a variety of live visual effects, including, but not limited to, realistic virtual object insertion, background substitution and blurring, non-photorealistic rendering and camouflage effects. A variety of challenging examples demonstrate the effectiveness of our method. [source]
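
The core geometric fact such a system can exploit is that a purely rotating camera relates any two views of the static background by a homography, so the background can be aligned to each live frame before foreground extraction. Below is a minimal sketch of that idea, assuming OpenCV, BGR color frames, and a precomputed background image; it is plain rotation-compensated background differencing, not the authors' bilayer segmentation algorithm:

    import cv2
    import numpy as np

    def foreground_mask(frame, background, thresh=35):
        # For a rotating camera, the background-to-frame mapping is a
        # homography; estimate it robustly from ORB feature matches.
        orb = cv2.ORB_create(1000)
        k1, d1 = orb.detectAndCompute(background, None)
        k2, d2 = orb.detectAndCompute(frame, None)
        matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(d1, d2)
        src = np.float32([k1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
        dst = np.float32([k2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
        H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
        # Warp the background into the current view and threshold the residual.
        warped = cv2.warpPerspective(background, H, frame.shape[1::-1])
        diff = cv2.cvtColor(cv2.absdiff(frame, warped), cv2.COLOR_BGR2GRAY)
        return cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)[1]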


Automatic Creation of Object Hierarchies for Radiosity Clustering

COMPUTER GRAPHICS FORUM, Issue 4 2000
Gordon Müller
Using object clusters for hierarchical radiosity greatly improves the efficiency, and thus the usability, of radiosity computations. By eliminating the quadratic starting phase, very large scenes containing about 100k polygons can be handled efficiently. Although the main algorithm extends rather easily to object clusters, creating 'good' object hierarchies is a difficult task, both in terms of construction time and in how surfaces or objects are grouped into clusters. The quality of an object hierarchy for clustering depends on its ability to accurately simulate the hierarchy of the energy flow in a given scene. Additionally, it should support visibility computations by providing efficient ray acceleration techniques. In this paper we present a new approach to building hierarchies of object clusters. Our hybrid structuring algorithm provides accuracy and speed by combining a highly optimized bounding volume hierarchy with uniform spatial subdivision for nodes with regular object densities. The algorithm works without user intervention and is well suited to a wide variety of scenes. First results of using these hierarchies in a radiosity clustering environment are very promising and are presented here. The combination of very deep hierarchies (we use a binary tree) with an efficient ray acceleration structure shifts the computational effort away from form-factor and visibility calculation towards accurately propagating the energy through the hierarchy. We show how an efficient single-pass gathering can be used to minimize traversal costs. [source]
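
The hybrid construction can be pictured as a median-split bounding volume hierarchy that hands dense, evenly spaced clusters over to a uniform grid. Here is a minimal sketch under that reading; the leaf size, the uniformity test and the grid hand-off are illustrative assumptions, not the authors' actual construction rules:

    import statistics

    def widest_axis(centroids):
        return max(range(3), key=lambda a: max(c[a] for c in centroids)
                                         - min(c[a] for c in centroids))

    def looks_uniform(centroids, tol=0.5):
        # Crude stand-in for a "regular object density" test: gaps between
        # sorted x-coordinates have a low coefficient of variation.
        xs = sorted(c[0] for c in centroids)
        gaps = [b - a for a, b in zip(xs, xs[1:])]
        mean = statistics.mean(gaps)
        return mean > 0 and statistics.pstdev(gaps) / mean < tol

    def build(centroids, leaf_size=4, dense=64):
        if len(centroids) <= leaf_size:
            return ("leaf", centroids)
        if len(centroids) > dense and looks_uniform(centroids):
            return ("grid", centroids)        # defer to uniform subdivision
        axis = widest_axis(centroids)
        pts = sorted(centroids, key=lambda c: c[axis])
        mid = len(pts) // 2                   # binary tree, as in the paper
        return ("split", build(pts[:mid]), build(pts[mid:]))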


Bridging the language gap in scientific computing: the Chasm approach

CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 2 2006
C. E. Rasmussen
Abstract Chasm is a toolkit providing seamless language interoperability between Fortran 95 and C++. Language interoperability is important to scientific programmers because scientific applications are predominantly written in Fortran, while software tools are mostly written in C++. Two design features differentiate Chasm from other related tools. First, we avoid the common-denominator type systems and programming models found in most Interface Definition Language (IDL)-based interoperability systems. Chasm uses the intermediate representation generated by a compiler front-end for each supported language, rather than an IDL, as its source of interface information. Second, bridging code is generated for each pairwise language binding, removing the need for a common intermediate data representation and multiple levels of indirection between the caller and callee. These features make Chasm a simple system that performs well, requires minimal user intervention and, in most instances, can generate bridging code automatically. Chasm is also easily extensible and highly portable. Copyright © 2005 John Wiley & Sons, Ltd. [source]
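
Chasm itself bridges Fortran 95 and C++ from compiler-generated interface information; as a language-neutral illustration of what a direct pairwise binding looks like (no IDL, no common intermediate data representation), here is a Python ctypes call into a Fortran routine compiled to a shared library. The library name and symbol are hypothetical:

    import ctypes

    # Hypothetical build step: gfortran -shared -fPIC -o libsolver.so solver.f90
    lib = ctypes.CDLL("./libsolver.so")

    # Fortran passes arguments by reference; the binding mirrors that, and
    # the trailing underscore reflects a common Fortran name-mangling scheme.
    lib.solve_.argtypes = [ctypes.POINTER(ctypes.c_double),
                           ctypes.POINTER(ctypes.c_int)]
    lib.solve_.restype = None

    x, n = ctypes.c_double(2.0), ctypes.c_int(10)
    lib.solve_(ctypes.byref(x), ctypes.byref(n))
    print(x.value)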


Automated image-based phenotypic analysis in zebrafish embryos

DEVELOPMENTAL DYNAMICS, Issue 3 2009
Andreas Vogt
Abstract Presently, the zebrafish is the only vertebrate model compatible with contemporary paradigms of drug discovery. Zebrafish embryos are amenable to the automation necessary for high-throughput chemical screens, and their optical transparency makes them potentially suited to image-based screening. However, the lack of tools for automated analysis of complex images presents an obstacle to using the zebrafish as a high-throughput screening model. We have developed an automated system for imaging and analyzing zebrafish embryos in multi-well plates regardless of embryo orientation and without user intervention. Images of fluorescent embryos were acquired on a high-content reader and analyzed using an artificial intelligence-based image analysis method termed Cognition Network Technology (CNT). CNT reliably detected transgenic fluorescent embryos (Tg(fli1:EGFP)y1) arrayed in 96-well plates and quantified intersegmental blood vessel development in embryos treated with small-molecule inhibitors of angiogenesis. The results demonstrate that it is feasible to adapt image-based high-content screening methodology to measure complex whole-organism phenotypes. Developmental Dynamics 238:656–663, 2009. © 2009 Wiley-Liss, Inc. [source]
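
The proprietary Cognition Network Technology cannot be reproduced here, but the shape of a fully automated per-well quantitation loop can be sketched with a simple threshold-based stand-in, assuming scikit-image and one fluorescence image per well; the file layout is hypothetical:

    import numpy as np
    from skimage import io, filters, morphology

    def fluorescent_fraction(path):
        img = io.imread(path, as_gray=True)
        mask = img > filters.threshold_otsu(img)          # segment fluorescence
        mask = morphology.remove_small_objects(mask, 64)  # drop small debris
        return mask.sum() / mask.size                     # bright-pixel fraction

    # One score per well, with no user intervention per image:
    # scores = {w: fluorescent_fraction(f"plate/{w}.tif") for w in wells}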


Development of the DYNA3D simulation code with automated fracture procedure for brick elements

INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN ENGINEERING, Issue 14 2003
Ala Tabiei
Abstract Numerical simulation of cracked structures is an important aspect of structural safety assessment. In recent years, numerical codes for modelling fracture processes have been developed at an increasing rate. The subject of this investigation is the implementation of automated fracture models in the DYNA3D non-linear explicit finite element code to simulate a pseudo-3D crack growth procedure. The implemented models are capable of simulating crack propagation automatically, without user intervention. The implementation is carried out on solid elements. The methodology of implementing the fracture models is described. An element deletion-and-replacement remeshing procedure is proposed for updating the explicit geometric description of evolving cracks. Fracture parameters such as stress intensity factors, energy release rates and the crack tip opening angle are evaluated. The maximum circumferential stress criterion is used to predict the direction of crack advancement. Seven crack problems are presented to verify the effectiveness of the methodology. Mesh sensitivity and loading-rate effects are studied in the validation of the presented procedure. Copyright © 2003 John Wiley & Sons, Ltd. [source]
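
The maximum circumferential stress criterion cited in the abstract has a closed-form kink angle in linear elastic fracture mechanics. A small worked instance of that standard textbook result follows; it is not code from the authors' DYNA3D implementation:

    import math

    def kink_angle(KI, KII):
        # Angle maximizing the circumferential stress at the crack tip;
        # it satisfies KI*sin(t) + KII*(3*cos(t) - 1) = 0.
        if KII == 0.0:
            return 0.0  # pure mode I: the crack grows straight ahead
        t = (KI - math.sqrt(KI**2 + 8.0 * KII**2)) / (4.0 * KII)
        return 2.0 * math.atan(t)

    print(math.degrees(kink_angle(1.0, 0.5)))  # mixed mode: about -40 degrees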


Evaluation of an automated screening assay for von Willebrand Disease Type 2N

INTERNATIONAL JOURNAL OF LABORATORY HEMATOLOGY, Issue 6 2002
S. L. Taylor
Summary Evaluating the factor VIII (FVIII) binding activity of von Willebrand factor (VWF) is an important step in the diagnostic work-up of families affected by apparent mild haemophilia A. In von Willebrand's disease (VWD) type 2N (Normandy), mutations in the region of the gene encoding the N-terminal end of the mature VWF subunit prevent the binding of FVIII. Individuals heterozygous for type 2N VWD are generally asymptomatic. Homozygotes and compound heterozygotes present with a clinical picture that mimics haemophilia A, with a markedly reduced FVIII:C activity and VWF within the normal range, but instead of exhibiting X-linked inheritance they show an autosomal recessive inheritance pattern. The distinction between haemophilia A and VWD type 2N has important implications for therapy and genetic counselling. We present a highly specific enzyme-linked immunosorbent assay screening method for the Normandy variant, which measures VWF:FVIII binding activity in parallel with VWF antigen, using monoclonal capture and detection antibodies. The assay is fully automated using a robotic microtitre plate processor, requiring minimal user intervention and providing the capacity to screen large numbers of patients. [source]


FULLPAT: a full-pattern quantitative analysis program for X-ray powder diffraction using measured and calculated patterns

JOURNAL OF APPLIED CRYSTALLOGRAPHY, Issue 6 2002
Steve J. Chipera
FULLPAT is a quantitative X-ray diffraction methodology that merges the advantages of existing full-pattern fitting methods with the traditional reference intensity ratio (RIR) method. Like the Rietveld quantitative analysis method, it uses complete diffraction patterns, including the background. However, FULLPAT can explicitly analyze all phases in a sample, including partially ordered or amorphous phases such as glasses, clay minerals, or polymers. Addition of an internal standard to both library standards and unknown samples eliminates instrumental and matrix effects and allows unconstrained analyses to be conducted by direct fitting of library standard patterns to each phase in the sample. Standard patterns may include data for any solid material including glasses, and calculated patterns may also be used. A combination of standard patterns is fitted to observed patterns using least-squares minimization, thereby reducing user intervention and bias. FULLPAT has been coded into Microsoft EXCEL using standard spreadsheet functions. [source]
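
The fitting step reduces to a constrained least-squares problem: find non-negative weights on library standard patterns, all on a common two-theta grid, that best reproduce the observed pattern. A minimal sketch with synthetic Gaussian patterns follows (FULLPAT itself runs in EXCEL, and converting fitted weights to phase fractions additionally relies on the internal standard):

    import numpy as np
    from scipy.optimize import nnls

    x = np.linspace(5, 65, 3000)                      # two-theta grid
    peak = lambda c, w: np.exp(-((x - c) / w) ** 2)
    quartz = peak(26.6, 0.2) + 0.4 * peak(20.9, 0.2)  # toy crystalline standard
    glass = peak(25.0, 8.0)                           # broad amorphous hump
    library = np.column_stack([quartz, glass])

    observed = 0.7 * quartz + 0.3 * glass             # synthetic "unknown"
    weights, _ = nnls(library, observed)              # least squares, w >= 0
    print(weights / weights.sum())                    # -> [0.7, 0.3]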


Completion of crystal structures from powder data: the use of the coordination polyhedra

JOURNAL OF APPLIED CRYSTALLOGRAPHY, Issue 6 2000
Angela Altomare
Direct methods applied to powder diffraction data often provide well-located heavy atoms and unreliable light-atom positions. The completion of the crystal structure is then not always straightforward and may require a considerable amount of user intervention. The heavy-atom connectivity provided by the trial solution may be used to guess the nature of the coordination polyhedra. A Monte Carlo procedure is described which, in the absence of a well defined structural model, is able to locate the light atoms correctly under the restraints of the experimental heavy-atom connectivity model. The correctness of the final model is assessed by criteria based on the agreement between the whole experimental diffraction pattern and the calculated one. The procedure requires little CPU time and has been implemented as a routine of EXPO [Altomare et al. (1999). J. Appl. Cryst. 32, 339–340]. The method has proved to be sufficiently robust against distortion of the coordination polyhedra and has been successfully applied to a number of test structures. [source]
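
The search itself can be pictured as random moves on the light atoms, accepted when they improve a figure of merit, with the heavy-atom connectivity acting as a geometric restraint. A toy sketch of that loop follows; the real procedure scores agreement with the whole powder pattern, so the purely geometric score here is a hypothetical stand-in:

    import math
    import random

    heavy = (0.0, 0.0, 0.0)
    ideal = 2.0  # assumed heavy-atom to light-atom bonding distance

    def score(atoms):
        # Stand-in figure of merit: penalize deviation from the ideal
        # polyhedron distance; the real score compares diffraction patterns.
        return sum((math.dist(a, heavy) - ideal) ** 2 for a in atoms)

    atoms = [tuple(random.uniform(-3, 3) for _ in range(3)) for _ in range(4)]
    best = score(atoms)
    for _ in range(20000):
        i = random.randrange(len(atoms))
        trial = list(atoms)
        trial[i] = tuple(c + random.gauss(0, 0.1) for c in atoms[i])
        if (s := score(trial)) < best:   # greedy accept; a full Monte Carlo
            atoms, best = trial, s       # run also accepts some uphill moves
    print(round(best, 4))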


Process optimization of injection molding using an adaptive surrogate model with Gaussian process approach

POLYMER ENGINEERING & SCIENCE, Issue 5 2007
Jian Zhou
This article presents an integrated, simulation-based optimization procedure that can determine the optimal process conditions for injection molding without user intervention. The idea is to use a nonlinear statistical regression technique and design of computer experiments to establish an adaptive surrogate model with short turn-around time and adequate accuracy for substituting time-consuming computer simulations during system-level optimization. A special surrogate model based on the Gaussian process (GP) approach, which has not been employed previously for injection molding optimization, is introduced. GP is capable of giving both a prediction and an estimate of the confidence (variance) for the prediction simultaneously, thus providing direction as to where additional training samples could be added to improve the surrogate model. While the surrogate model is being established, a hybrid genetic algorithm is employed to evaluate the model to search for the global optimal solutions in a concurrent fashion. The examples presented in this article show that the proposed adaptive optimization procedure helps engineers determine the optimal process conditions more efficiently and effectively. POLYM. ENG. SCI., 47:684–694, 2007. © 2007 Society of Plastics Engineers. [source]
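
The adaptive loop is easy to state compactly: fit a GP to the samples seen so far, query it for both mean and variance, and place the next expensive simulation where the variance is largest. A minimal sketch with scikit-learn and a toy one-dimensional objective standing in for the injection-molding simulation; the grid argmin at the end stands in for the paper's hybrid genetic algorithm:

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    simulate = lambda x: np.sin(3 * x) + 0.5 * x     # expensive solver stand-in
    X = np.array([[0.0], [1.0], [2.0]])              # initial design
    y = simulate(X).ravel()
    grid = np.linspace(0, 2, 200).reshape(-1, 1)

    for _ in range(10):
        gp = GaussianProcessRegressor(kernel=RBF(0.5)).fit(X, y)
        mean, std = gp.predict(grid, return_std=True)
        nxt = grid[np.argmax(std)]                   # sample where the surrogate
        X = np.vstack([X, [nxt]])                    # is least certain
        y = np.append(y, simulate(nxt)[0])
    print(grid[np.argmin(mean)])                     # surrogate-predicted optimum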


Molecular replacement: the probabilistic approach of the program REMO09 and its applications

ACTA CRYSTALLOGRAPHICA SECTION A, Issue 6 2009
Rocco Caliandro
The method of joint probability distribution functions has been applied to molecular replacement techniques. The rotational search is performed by rotating the reciprocal lattice of the protein with respect to the calculated transform of the model structure; the translational search is performed by fast Fourier transform. Several cases of prior information are studied, for both the rotation and the translation step: e.g. the conditional probability density for the rotation or translation of a monomer is derived both ab initio and when the rotation and/or translation values of other monomers are given. The new approach has been implemented in the program REMO09, which is part of the package for global phasing IL MILIONE [Burla, Caliandro, Camalli, Cascarano, De Caro, Giacovazzo, Polidori, Siliqi & Spagna (2007). J. Appl. Cryst. 40, 609–613]. A large set of test structures has been used to check the efficiency of the new algorithms, which proved to be notably robust in finding correct solutions and in discriminating them from noise. An important design goal was a high degree of automation: REMO09 is often capable of providing a reliable model of the target structure without any user intervention. [source]
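
The fast-Fourier-transform translation search admits a compact illustration: the shift that best superposes the model-derived signal on the observed one is the peak of their cross-correlation, computed with a single pair of transforms. A toy one-dimensional stand-in for the three-dimensional crystallographic search, not REMO09's probabilistic scoring:

    import numpy as np

    rng = np.random.default_rng(0)
    model = rng.normal(size=256)
    observed = np.roll(model, 37)       # ground truth: a translation of 37
    spectrum = np.fft.fft(observed) * np.conj(np.fft.fft(model))
    corr = np.fft.ifft(spectrum).real   # circular cross-correlation
    print(np.argmax(corr))              # -> 37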


Comparison of PDQuest and Progenesis software packages in the analysis of two-dimensional electrophoresis gels

PROTEINS: STRUCTURE, FUNCTION AND BIOINFORMATICS, Issue 10 2003
Arsi T. Rosengren
Abstract Efficient analysis of protein expression using two-dimensional electrophoresis (2-DE) data relies on automated image processing techniques. The overall success of this research depends critically on the accuracy and reliability of the analysis software. In addition, the software has a profound effect on the interpretation of the results and on the amount of user intervention demanded during the analysis. The choice of analysis software that best meets specific needs is therefore of interest to the research laboratory. In this paper we compare two advanced analysis software packages, PDQuest and Progenesis. Their evaluation is based on quantitative tests at three levels of standard 2-DE analysis: spot detection, gel matching and spot quantitation. As test materials we use three gel sets previously used in a similar comparison of Z3 and Melanie, and three sets of gels from our own research. We observed that the quality of the test gels critically influences the spot detection and gel matching results. Both packages were sensitive to parameter and filter settings in their tendency to find true-positive and false-positive spots. Quantitation results were highly accurate for both analysis software packages. [source]


Application of the StrOligo algorithm for the automated structure assignment of complex N-linked glycans from glycoproteins using tandem mass spectrometry

RAPID COMMUNICATIONS IN MASS SPECTROMETRY, Issue 24 2003
Martin Ethier
Oligosaccharides associated with proteins are known to give these molecules specific conformations and functions. Analysis of proteins would not be complete without studying the glycans. However, high-throughput techniques in proteomics will soon overwhelm the current capacity of methods if no automation is incorporated into glycomics. New capabilities of the StrOligo algorithm introduced for this purpose (Ethier et al., Rapid Commun. Mass Spectrom., 2002; 16: 1743) will be discussed here. Experimental tandem mass spectra were acquired to test the algorithm using a hybrid quadrupole-time-of-flight (QqTOF) instrument with a matrix-assisted laser desorption/ionization (MALDI) source. The samples were N-linked oligosaccharides from monoclonal antibody IgG, beta interferon and fetuin, detached by enzymatic deglycosylation and labeled at the reducing end. Improvements to the program were made in order to reduce the need for user intervention. StrOligo strips the spectra down to monoisotopic peaks only. The algorithm first builds a relationship tree, accounting for each observed loss of a monosaccharide moiety, and then analyzes the tree and proposes possible structures from combinations of adducts and fragment ion types. A score, which reflects agreement with experimental results, is then given to each proposed structure. The program then decides which combination is the best one and labels relevant peaks in the experimental mass spectrum using a modified nomenclature. The usefulness of the algorithm has been demonstrated by assigning structures to several glycans released from glycoproteins. The analysis was completed in less than 2 minutes for any glycan, which is a substantial improvement over manual interpretation. Copyright © 2003 John Wiley & Sons, Ltd. [source]
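
The relationship-tree step can be sketched directly: two fragment peaks are connected whenever their mass difference matches a monosaccharide residue mass within a tolerance. The residue masses below are standard values; the peak list is hypothetical, and, as the abstract notes, real spectra first need reduction to monoisotopic peaks and adduct handling:

    residues = {"Hex": 162.0528, "HexNAc": 203.0794,
                "dHex": 146.0579, "NeuAc": 291.0954}
    peaks = [1663.6, 1501.5, 1298.5, 1136.4]   # illustrative m/z values
    tol = 0.3

    edges = []
    for i, a in enumerate(peaks):
        for b in peaks[i + 1:]:
            for name, mass in residues.items():
                if abs((a - b) - mass) <= tol:
                    edges.append((a, b, name))  # a loses `name` to give b
    print(edges)   # Hex and HexNAc losses link the four peaks into a chain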


BALBES: a molecular-replacement pipeline

ACTA CRYSTALLOGRAPHICA SECTION D, Issue 1 2008
Fei Long
The number of macromolecular structures solved and deposited in the Protein Data Bank (PDB) now exceeds 40,000. Using this information in macromolecular crystallography (MX) should in principle increase the efficiency of MX structure solution. This paper describes a molecular-replacement pipeline, BALBES, that makes extensive use of this repository. It uses a reorganized database, derived from the PDB, that reflects multimeric as well as domain organization. A system manager written in Python controls the workflow of the process. Testing the current version of the pipeline on entries from the PDB has shown that this approach has huge potential and that around 75% of structures can be solved automatically, without user intervention. [source]
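
The abstract's "system manager written in Python" suggests the overall shape: a driver that runs each stage, threads state forward, and stops as soon as a stage reports a solution. A minimal sketch of that pattern with entirely hypothetical stage names, not BALBES internals:

    def run_pipeline(reflections, sequence, stages):
        state = {"data": reflections, "seq": sequence, "solved": False}
        for stage in stages:
            state = stage(state)          # e.g. search models, run MR, refine
            if state["solved"]:
                break                     # solution found: stop the cascade
        return state

    def demo_stage(state):
        state["solved"] = True            # placeholder for MR plus scoring
        return state

    print(run_pipeline("refl.mtz", "seq.fasta", [demo_stage])["solved"])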


Pattern-recognition methods to identify secondary structure within X-ray crystallographic electron-density maps

ACTA CRYSTALLOGRAPHICA SECTION D, Issue 3 2002
Thomas Oldfield
The interpretation of macromolecular crystallographic electron-density maps is a difficult and traditionally manual step in the determination of a protein structure. Visualizing the information within an electron-density map can be extremely arduous owing to the amount and complexity of the information present. The ability to see the overall fold and structure of the molecule is usually lost among all the detail, particularly with larger structures. This paper describes a novel method of analysing electron density in real space that can determine the secondary structure of a protein within minutes, without any user intervention. The method works with poor data as well as good data, at resolutions down to 3.5 Å, and is integral to the functionality of QUANTA. This article describes the methodology of the pattern recognition and its use with a number of sets of experimental data. [source]