Specification


Kinds of Specification

  • cell fate specification
  • cell specification
  • correct specification
  • fate specification
  • formal specification
  • general specification
  • model specification

Terms modified by Specification

  • specification issues
  • specification test
  • specification testing

Selected Abstracts


    A Decision Support System Specification for Cost Escalation in Heavy Engineering Industry

    COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, Issue 5 2002
    Nashwan N. Dawood
    The heavy civil engineering industry (railways, sewage treatment, chemical and pharmaceutical facilities, oil and gas facilities, etc.) is one of the major contributors to the British economy and generally involves a high level of investment. Clients in this industry are demanding accurate cost estimates, proper analysis of out-turn cost and cost escalation, and high-quality risk analysis throughout the construction process. Current practice in the industry suggests that there is a lack of structured methodologies and systematic cost escalation approaches for achieving an appropriate cost analysis at the outset of projects and throughout the construction process. In this context, the prime objective of this research is to develop a structured cost escalation methodology for improving estimating management and control in heavy engineering construction. The methodology is composed of a forecasting model to predict cost indices of major items in the industry and a risk knowledge-base model for identifying and quantifying causes of cost escalation. This paper reviews and discusses a knowledge-based model for applying a cost escalation factor. The cost escalation factor is made up of market variation, a risk element, and a component for bias. A knowledge elicitation strategy was employed to obtain the required knowledge for the model. The strategy included questionnaires, interviews, and workshops, and its deliverables took the form of influences and their effects on project cost escalation. From these deliverables, a decision support system and specifications for applying cost escalation to base estimates are presented. [source]
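
    The abstract gives the composition of the escalation factor (market variation, a risk element, and a bias component) but not its formula. A minimal sketch, assuming a simple additive factor applied multiplicatively to a base estimate; the figures and function name are illustrative, not from the paper:

```python
def escalated_estimate(base_cost, market_variation, risk_element, bias):
    """Apply a cost escalation factor built from the three components
    named in the abstract; the additive composition is an assumption."""
    escalation_factor = market_variation + risk_element + bias
    return base_cost * (1.0 + escalation_factor)

# Illustrative figures only: 4% market movement, 2.5% risk, 1% estimator bias.
print(escalated_estimate(10_000_000, 0.04, 0.025, 0.01))  # 10750000.0
```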


    Specification, planning, and execution of QoS-aware Grid workflows within the Amadeus environment

    CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 4 2008
    Ivona Brandic
    Abstract Commonly, Grid applications are specified at a high level of abstraction based on the workflow paradigm. However, the majority of Grid workflow systems either do not support Quality of Service (QoS) or provide only partial QoS support for certain phases of the workflow lifecycle. In this paper we present Amadeus, a holistic service-oriented environment for QoS-aware Grid workflows. Amadeus considers user requirements, in terms of QoS constraints, during workflow specification, planning, and execution. Within the Amadeus environment, workflows and the associated QoS constraints are specified at a high level using an intuitive graphical notation. A distinguishing feature of our system is its support for a comprehensive set of QoS requirements that covers not only performance and economic aspects but also legal and security aspects. A set of QoS-aware service-oriented components is provided for workflow planning to support automatic constraint-based service negotiation and workflow optimization. To improve the efficiency of workflow planning, we introduce a QoS-aware workflow reduction technique. Furthermore, we present our static and dynamic planning strategies for workflow execution in accordance with user-specified requirements. For each phase of the workflow lifecycle we experimentally evaluate the corresponding Amadeus components. Copyright © 2007 John Wiley & Sons, Ltd. [source]
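
    As one way to picture the constraint-based planning step, here is a hedged sketch of filtering candidate services against user QoS constraints covering the four aspects the abstract names (performance, economic, security, legal). The class, field and constraint names are assumptions, not the Amadeus API:

```python
from dataclasses import dataclass

@dataclass
class QoSOffer:
    service: str
    exec_time_s: float   # performance aspect
    price_eur: float     # economic aspect
    encrypted: bool      # security aspect
    jurisdiction: str    # legal aspect

def feasible(offers, max_time, budget, need_encryption, allowed_regions):
    """Keep only offers satisfying every user QoS constraint; a planner
    would then negotiate/optimize (e.g. cheapest feasible) per task."""
    return [o for o in offers
            if o.exec_time_s <= max_time
            and o.price_eur <= budget
            and (o.encrypted or not need_encryption)
            and o.jurisdiction in allowed_regions]

offers = [QoSOffer("render-A", 120, 4.0, True, "EU"),
          QoSOffer("render-B", 90, 9.0, False, "EU")]
print(feasible(offers, max_time=150, budget=5.0,
               need_encryption=True, allowed_regions={"EU"}))
```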


    Specification and detection of performance problems with ASL

    CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 11 2007
    Michael Gerndt
    Abstract Performance analysis is an important step in tuning performance-critical applications. It is a cyclic process of measuring and analyzing performance data, driven by the programmer's hypotheses on potential performance problems. Currently this process is controlled manually by the programmer. The goal of the work described in this article is to automate the performance analysis process based on a formal specification of performance properties. One result of the APART project is the APART Specification Language (ASL) for the formal specification of performance properties. Performance bottlenecks can then be identified based on the specification, since bottlenecks are viewed as performance properties with a large negative impact. We also present the overall design and an initial evaluation of the Periscope system which utilizes ASL specifications to automatically search for performance bottlenecks in a distributed manner. Copyright © 2006 John Wiley & Sons, Ltd. [source]
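
    ASL describes a performance property as a named template with a condition, a confidence and a severity evaluated over measured data. The sketch below mirrors that three-part structure in Python; it is not ASL syntax, and the metric names and threshold are invented for illustration:

```python
regions = [  # summary data per program region (illustrative values)
    {"name": "loop_A", "comm_time": 4.2, "total_time": 10.0},
    {"name": "loop_B", "comm_time": 0.5, "total_time": 10.0},
]

def comm_dominated(region, threshold=0.3):
    """Python analogue of an ASL-style property for one region:
    condition (does it hold?), confidence, and severity (impact)."""
    frac = region["comm_time"] / region["total_time"]
    return {"holds": frac > threshold,  # condition
            "confidence": 1.0,          # exact measurement, full confidence
            "severity": frac}           # used to rank bottlenecks

# The automated search keeps properties that hold, ranked by severity.
bottlenecks = sorted((r for r in regions if comm_dominated(r)["holds"]),
                     key=lambda r: comm_dominated(r)["severity"], reverse=True)
print([r["name"] for r in bottlenecks])  # ['loop_A']
```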


    CCLRC Portal infrastructure to support research facilities

    CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 6 2007
    Asif Akram
    Abstract The emergence of portal technology provides benefits in developing portlet interfaces to applications to meet the current and future requirements of CCLRC facilities support. Portlets can be reused by different projects, e.g. the high-profile Integrative Biology project (with the University of Oxford), and in different Java Specification Request 168 Portlet Specification (JSR 168) compliant portal frameworks. Deployment and maintenance of applications developed as portlets become easier and more manageable. A community process is already beginning, and many portal frameworks come with useful, free-to-use portlets. As rendering is carried out in the framework, applications can be made easily accessible and internationalized. Portlets are compatible with J2EE, thus providing additional capabilities required in a service-oriented architecture (SOA). We also describe how Web service gateways can be used to provide many of the functionalities encapsulated in a portal server in a way that supports Grid applications. Used as a rich client, portals can allow users to customize or personalize their user interfaces and even their workflow and application access. CCLRC facilities will be able to leverage the work carried out so far on the National Grid Service (NGS) and e-HTPX portals, as these are fully functional and have received detailed user feedback. This demonstrates the usefulness of providing advanced capabilities for e-Research and of having the associated business logic in a SOA loosely coupled from the presentation layer for an Integrated e-Science Environment. Copyright © 2006 John Wiley & Sons, Ltd. [source]


    Identification of genes expressed preferentially in the developing peripheral margin of the optic cup

    DEVELOPMENTAL DYNAMICS, Issue 9 2009
    Jeffrey M. Trimarchi
    Abstract Specification of the peripheral optic cup by Wnt signaling is critical for formation of the ciliary body/iris. Identification of marker genes for this region during development provides a starting point for functional analyses. During transcriptional profiling of single cells from the developing eye, two cells were identified that expressed genes not found in most other single cell profiles. In situ hybridizations demonstrated that many of these genes were expressed in the peripheral optic cup in both early mouse and chicken development, and in the ciliary body/iris at subsequent developmental stages. These analyses indicate that the two cells probably originated from the developing ciliary body/iris. Changes in expression of these genes were assayed in embryonic chicken retinas when canonical Wnt signaling was ectopically activated by CA-β-catenin. Twelve ciliary body/iris genes were identified as upregulated following induction, suggesting they are excellent candidates for downstream effectors of Wnt signaling in the optic cup. Developmental Dynamics 238:2327–2339, 2009. © 2009 Wiley-Liss, Inc. [source]


    Specification of the enveloping layer and lack of autoneuralization in zebrafish embryonic explants

    DEVELOPMENTAL DYNAMICS, Issue 1 2005
    Charles G. Sagerström
    Abstract We have analyzed the roles of cell contact during determination of the outermost enveloping layer (EVL) and deeper neurectoderm in zebrafish embryos. Outer cells, but not deeper cells, are specified to express the EVL-specific marker, cyt1 by late blastula. EVL specification requires cell contact or close cell proximity, because cyt1 is not expressed after explant dissociation. The EVL may be homologous to the Xenopus epithelial layer, including the ventral larval epidermis. While Xenopus epidermal cytokeratin gene expression is activated by bone morphogenetic protein (BMP) signaling, zebrafish cyt1 is not responsive to BMPs. Zebrafish early gastrula ectodermal explants are specified to express the neural markers opl (zic1) and otx2, and this expression is prevented by BMP4. Dissociation of zebrafish explants prevents otx2 and opl expression, suggesting that neural specification in zebrafish requires cell contact or close cell proximity. This finding is in contrast to the case in Xenopus, where ectodermal dissociation leads to activation of neural gene expression, or autoneuralization. Our data suggest that distinct mechanisms direct development of homologous lineages in different vertebrates. Developmental Dynamics 232:85–97, 2005. © 2004 Wiley-Liss, Inc. [source]


    Peer Commentaries on Marcy A. Kingsbury and Barbara L. Finlay's The cortex in multidimensional space: where do cortical areas come from?

    DEVELOPMENTAL SCIENCE, Issue 2 2001
    Article first published online: 28 JUN 200
    Elizabeth Bates, Brain evolution and development: passing through the eye of the needle, p. 143
    Serena M. Dudek, Multidimensional gene expression in cortical space, p. 145
    Henry Kennedy and Colette Dehay, Gradients and boundaries: limits of modularity and its influence on the isocortex, p. 147
    Sarah L. Pallas, Specification of mammalian neocortex: the power of the evo–devo approach in resolving the nature–nurture dichotomy, p. 148
    Michel Roger, Embryonic stage of commitment of neocortical cells to develop area-specific connections, p. 151
    M.W. Spratling and M.H. Johnson, Activity-dependent processes in regional cortical specialization, p. 153 [source]


    Retinoid signaling and cardiac anteroposterior segmentation

    GENESIS: THE JOURNAL OF GENETICS AND DEVELOPMENT, Issue 3 2001
    José Xavier-Neto
    Abstract Summary: Establishment of anterior–posterior polarity is one of the earliest decisions in cardiogenesis. Specification of anterior (outflow) and posterior (inflow) structures ensures proper connections between the venous system and the inflow tract and between the arterial tree and the outflow tract. The last few years have witnessed remarkable progress in our understanding of cardiac anteroposterior patterning. Molecular cloning and subsequent studies of RALDH2, the key embryonic retinaldehyde dehydrogenase in retinoic acid (RA) synthesis, provided the missing link between teratogenic studies on RA deficiency and excess and normal chamber morphogenesis. We discuss the work establishing the foundations of our current understanding of the mechanisms of cardiac anteroposterior segmentation, the reasons why early evidence pointing to the role of RA in anteroposterior segmentation was overlooked, and the key experiments unraveling the role of RA in cardiac anteroposterior segmentation. We have also integrated recent experiments into a model of cardiac anteroposterior patterning in which RALDH2 expression determines anteroposterior boundaries in the heart field. genesis 31:97–104, 2001. © 2001 Wiley-Liss, Inc. [source]


    Radiopacity of root filling materials using digital radiography

    INTERNATIONAL ENDODONTIC JOURNAL, Issue 7 2007
    J. R. Carvalho-Junior
    Abstract Aim: To evaluate the radiopacity of root filling materials using digital radiography. Methodology: The sealers tested were AH Plus™, Endofill®, EndoREZ™ and Epiphany™. Gutta-percha (Dentsply Maillefer) and Resilon™ cones were also tested. Acrylic plates containing six wells, each measuring 1 mm in depth and 5 mm in diameter, were prepared for the test and filled with the materials. The test samples were radiographed together with an aluminium stepwedge calibrated in millimetres, according to ANSI/ADA Specification 57. For the radiographic exposures, digital imaging plates and an X-ray machine at 70 kVp and 8 mA were used. The object-to-focus distance was 30 cm, and the exposure time 0.2 s. After the laser optic reading process, the software determined the radiopacity of the standardized areas using grey-scale values, calculating the average radiographic density for each material. Results: The decreasing values of radiopacity of the studied materials, expressed in millimetres of aluminium equivalent, were: Resilon™ (13.0), AH Plus™ (11.2), gutta-percha (9.8), Epiphany™ (8.0), Endofill® (6.9) and EndoREZ™ (6.6). Conclusion: All materials had radiopacity values above the 3 mm of aluminium recommended by ANSI/ADA Specification 57. [source]
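
    The millimetres-of-aluminium figures come from comparing each material's mean grey value against the stepwedge readings. A minimal sketch of that conversion, assuming linear interpolation between stepwedge steps (the study's software may use a different fit; the calibration numbers are invented):

```python
def mm_aluminium_equivalent(grey, step_mm, step_grey):
    """Convert a material's mean grey value to mm of aluminium by linear
    interpolation on the stepwedge calibration curve; step_mm/step_grey
    are matched, monotonically increasing stepwedge readings."""
    points = list(zip(step_mm, step_grey))
    for (mm0, g0), (mm1, g1) in zip(points, points[1:]):
        if g0 <= grey <= g1:
            return mm0 + (mm1 - mm0) * (grey - g0) / (g1 - g0)
    raise ValueError("grey value outside stepwedge range")

steps_mm   = [2, 4, 6, 8, 10, 12, 14]          # illustrative calibration
steps_grey = [60, 95, 125, 150, 172, 190, 205]
print(mm_aluminium_equivalent(163, steps_mm, steps_grey))  # ~9.2 mm Al
```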


    Bayesian Hypothesis Testing: a Reference Approach

    INTERNATIONAL STATISTICAL REVIEW, Issue 3 2002
    José M. Bernardo
    Summary For any probability model M = {p(x|θ, λ), θ∈Θ, λ∈Λ} assumed to describe the probabilistic behaviour of data x∈X, it is argued that testing whether or not the available data are compatible with the hypothesis H0 = {θ = θ0} is best considered as a formal decision problem on whether to use (a0), or not to use (a1), the simpler probability model (or null model) M0 = {p(x|θ0, λ), λ∈Λ}, where the loss difference L(a0, θ, λ) − L(a1, θ, λ) is proportional to the amount of information δ(θ0, θ, λ) which would be lost if the simplified model M0 were used as a proxy for the assumed model M. For any prior distribution π(θ, λ), the appropriate normative solution is obtained by rejecting the null model M0 whenever the corresponding posterior expectation ∫∫ δ(θ0, θ, λ) π(θ, λ|x) dθ dλ is sufficiently large. Specification of a subjective prior is always difficult, and often polemical, in scientific communication. Information theory may be used to specify a prior, the reference prior, which only depends on the assumed model M, and mathematically describes a situation where no prior information is available about the quantity of interest. The reference posterior expectation, d(θ0, x) = ∫ δ π(δ|x) dδ, of the amount of information δ(θ0, θ, λ) which could be lost if the null model were used, provides an attractive nonnegative test function, the intrinsic statistic, which is invariant under reparametrization. The intrinsic statistic d(θ0, x) is measured in units of information, and it is easily calibrated (for any sample size and any dimensionality) in terms of some average log-likelihood ratios. The corresponding Bayes decision rule, the Bayesian reference criterion (BRC), indicates that the null model M0 should only be rejected if the posterior expected loss of information from using the simplified model M0 is too large or, equivalently, if the associated expected average log-likelihood ratio is large enough. The BRC criterion provides a general reference Bayesian solution to hypothesis testing which does not assume a probability mass concentrated on M0 and, hence, it is immune to Lindley's paradox. The theory is illustrated within the context of multivariate normal data, where it is shown to avoid Rao's paradox on the inconsistency between univariate and multivariate frequentist hypothesis testing. [source]
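
    Restated in display form, the decision quantity the summary describes is the reference posterior expectation of the information loss, with the null rejected when it exceeds a calibration threshold (notation reconstructed from the summary; d* stands for the calibrated cut-off expressed via average log-likelihood ratios):

```latex
d(\theta_0, x) \;=\; \int_{\Lambda}\int_{\Theta}
  \delta(\theta_0, \theta, \lambda)\,\pi(\theta, \lambda \mid x)\,
  d\theta\,d\lambda,
\qquad \text{reject } M_0 \text{ iff } d(\theta_0, x) > d^{*}.
```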


    Effect of Thawing and Cold Storage on Frozen Chicken Thigh Meat Quality by High-Voltage Electrostatic Field

    JOURNAL OF FOOD SCIENCE, Issue 4 2010
    Chang-Wei Hsieh
    ABSTRACT: One of the most popular issues in electrostatic biology is the effect of a high-voltage electrostatic field (HVEF) on the thawing of chicken thigh meat. In this study, chicken thigh meat was treated with HVEF (E-group) and compared to samples stored in a common refrigerator (R-group), to investigate how HVEF affects chicken thigh meat quality after thawing at low-temperature storage (−3 and 4 °C). The results showed that there were no significant differences in biochemical and microorganism indices at −3 °C. However, HVEF can significantly shorten the thawing time of frozen chicken thigh meat at −3 °C. After thawing chicken thigh meat and storing at 4 °C, the total viable counts reached the Intl. Commission on Microbiological Specification for Foods limit of 10⁷ CFU/g on the 6th and 8th day for the R- and E-group, respectively. On the 8th day, the volatile basic nitrogen had increased from 11.24 mg/100 g to 21.9 mg/100 g for the E-group and 39.9 mg/100 g for the R-group. The biochemical and microorganism indices also indicated that the E-group treatment yielded better thawing results than the R-group treatment. The application of this model has the potential to keep products fresh. [source]


    The effect of the addition of poly(methyl methacrylate) fibres on some properties of high strength heat-cured acrylic resin denture base material

    JOURNAL OF ORAL REHABILITATION, Issue 3 2003
    D. Jagger
    Summary: The self-reinforcement of acrylic resin with butadiene styrene surface-treated poly(methyl methacrylate) fibres has been reported to have the potential to substantially improve the transverse bend strength of conventional heat-cured acrylic resin. The aim of this study was to investigate the effect of the addition of butadiene styrene surface-treated poly(methyl methacrylate) fibres in cross-ply arrangement to high impact acrylic resin on the transverse and impact strength. Specimens were prepared as specified in the International Organization for Standardization and British Standards for the testing of denture base resins (ISO 1567, 1988; BS 2487, 1989) and the British Standard Specification for Orthodontic Resins (BS 6747, 1987) for transverse bend and impact testing. The impact strength was measured using a Zwick pendulum impact tester and the transverse bend strength using a Lloyds Instruments testing machine. The results showed that the impact strength was not improved by the addition of fibres: high impact acrylic resin with fibres (LF) 11·1 kJ m⁻² versus high impact acrylic resin (L) 12·5 kJ m⁻². The modulus of rupture decreased with the addition of fibres: 57·8 MPa for LF compared with 60·4 MPa for L. The modulus of elasticity was also reduced by the addition of fibres (1834·9 MPa for LF versus 2086·2 MPa for L), as was the peak load (50·8 N for LF versus 55·8 N for L). It was concluded that the addition of surface-treated poly(methyl methacrylate) fibres in cross-ply arrangement to high strength acrylic resin did not produce an improvement in the impact or transverse strength and cannot be recommended as a method of reinforcement. [source]
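
    For context, transverse tests of this kind load a rectangular bar in three-point bending, and the reported quantities are conventionally computed from the standard beam formulas below (not quoted in the abstract; F is the fracture load, F₁ a load in the elastic range with corresponding deflection δ, l the support span, b and d the specimen width and depth):

```latex
\text{modulus of rupture}\quad \sigma = \frac{3\,F\,l}{2\,b\,d^{2}},
\qquad
\text{modulus of elasticity}\quad E = \frac{F_{1}\,l^{3}}{4\,b\,d^{3}\,\delta}.
```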


    Specification and estimation of social interaction models with network structures

    THE ECONOMETRICS JOURNAL, Issue 2 2010
    Lung-fei Lee
    Summary: This paper considers the specification and estimation of social interaction models with network structures and the presence of endogenous, contextual and correlated effects. With macro group settings, group-specific fixed effects are also incorporated in the model. The network structure provides information on the identification of the various interaction effects. We propose a quasi-maximum likelihood approach for the estimation of the model. We derive the asymptotic distribution of the proposed estimator, and provide Monte Carlo evidence on its small sample performance. [source]
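
    One generic way to write a specification consistent with the abstract's description, for group g with network matrix W_g (notation ours, not necessarily the paper's): λ captures the endogenous effect, γ the contextual effects, α_g the group-specific fixed effect, and ρ network-correlated disturbances:

```latex
y_g = \lambda W_g y_g + X_g \beta + W_g X_g \gamma + \alpha_g \iota_g + u_g,
\qquad
u_g = \rho W_g u_g + \varepsilon_g .
```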


    On the Specification and Estimation of the Production Function for Cognitive Achievement*

    THE ECONOMIC JOURNAL, Issue 485 2003
    Petra E. Todd
    This paper considers methods for modelling the production function for cognitive achievement in a way that captures theoretical notions that child development is a cumulative process depending on the history of family and school inputs and on innate ability. It develops a general modelling framework that accommodates many of the estimating equations used in the literatures. It considers different ways of addressing data limitations, and it makes precise the identifying assumptions needed to justify alternative approaches. Commonly used specifications are shown to place restrictive assumptions on the production technology. Ways of testing modelling assumptions and of relaxing them are discussed. [source]
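
    A hedged shorthand for the cumulative production function the paper models, with achievement A for child i at age t depending on the full histories of family inputs X and school inputs S plus innate ability μ (notation ours; the paper's general framework nests many such forms):

```latex
A_{it} = f_t\big(X_{it},\ldots,X_{i1};\; S_{it},\ldots,S_{i1};\; \mu_i\big)
         + \varepsilon_{it}.
```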


    Model Specification and Risk Premia: Evidence from Futures Options

    THE JOURNAL OF FINANCE, Issue 3 2007
    MARK BROADIE
    ABSTRACT This paper examines model specification issues and estimates diffusive and jump risk premia using S&P futures option prices from 1987 to 2003. We first develop a time series test to detect the presence of jumps in volatility, and find strong evidence in support of their presence. Next, using the cross section of option prices, we find strong evidence for jumps in prices and modest evidence for jumps in volatility based on model fit. The evidence points toward economically and statistically significant jump risk premia, which are important for understanding option returns. [source]
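
    The models under examination are of the double-jump stochastic volatility type; a representative specification from this literature (illustrative of the class the paper tests, not quoted from it) adds compound-Poisson jumps to both the return and the variance dynamics:

```latex
\frac{dS_t}{S_t} = \mu\,dt + \sqrt{V_t}\,dW_t^{S}
  + d\!\Big(\sum_{n=1}^{N_t}\big(e^{Z_n^{s}}-1\big)\Big),
\qquad
dV_t = \kappa(\theta - V_t)\,dt + \sigma_v\sqrt{V_t}\,dW_t^{V}
  + d\!\Big(\sum_{n=1}^{N_t} Z_n^{v}\Big).
```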


    1222: Pharmacological overview of the ophthalmic anaesthesia

    ACTA OPHTHALMOLOGICA, Issue 2010
    E FISCHER
    Purpose Review of the chemical properties and differences in effectiveness of the local anaesthetics used in ophthalmology. Historical overview from 1860, the year of the isolation of cocaine, to date. Methods Didactic and substantive summary of the literature. Results Specification of the basicity, lipophilicity and physicochemical properties, and of the advantages and disadvantages of adding adrenalin in different concentrations. Local anaesthetics also have adverse effects. All stimulate the central nervous system, so it is very important to take a proper anamnesis, especially information about drug hypersensitivity from the patient. Conclusion Interventions must be carried out with drug interactions in view, with a personalized choice of drugs and dose. [source]


    CONTEXTUALIZING LEARNING OBJECTS USING ONTOLOGIES

    COMPUTATIONAL INTELLIGENCE, Issue 3 2007
    Phaedra Mohammed
    Educational research over the past three years has increasingly emphasized that the context of learning resources needs to be properly modeled. Many researchers have described and even mandated the use of ontologies in their research, yet the process of actually connecting one or more ontologies to a learning object has not been extensively discussed. This paper describes a practical model for associating multiple ontologies with learning objects while making full use of the IEEE LOM specification. The model categorizes these ontologies according to five major categories of context, based on the most popular fields of study actively being pursued by the educational research community: Thematic context, Pedagogical context, Learner context, Organizational context, and Historical/Statistical context. [source]
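
    A minimal sketch of the association the model describes: one LOM-style classification entry per context category, each pointing at an ontology. The URIs and field layout are invented placeholders, not the paper's schema:

```python
learning_object = {
    "general": {"title": "Intro to recursion", "language": "en"},
    # One classification entry per context category from the model;
    # all ontology URIs below are hypothetical placeholders.
    "classification": [
        {"purpose": "thematic context",       "ontology": "http://example.org/onto/cs#recursion"},
        {"purpose": "pedagogical context",    "ontology": "http://example.org/onto/pedagogy#worked-example"},
        {"purpose": "learner context",        "ontology": "http://example.org/onto/learner#novice"},
        {"purpose": "organizational context", "ontology": "http://example.org/onto/org#cs101"},
        {"purpose": "historical/statistical context",
         "ontology": "http://example.org/onto/stats#usage-log"},
    ],
}
print(len(learning_object["classification"]))  # 5, one per context category
```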


    Automated Negotiation from Declarative Contract Descriptions

    COMPUTATIONAL INTELLIGENCE, Issue 4 2002
    Daniel M. Reeves
    Our approach for automating the negotiation of business contracts proceeds in three broad steps. First, determine the structure of the negotiation process by applying general knowledge about auctions and domain-specific knowledge about the contract subject, along with preferences from potential buyers and sellers. Second, translate the determined negotiation structure into an operational specification for an auction platform. Third, after the negotiation has completed, map the negotiation results to a final contract. We have implemented a prototype which supports these steps by employing a declarative specification (in courteous logic programs) of (1) high-level knowledge about alternative negotiation structures, (2) general-case rules about auction parameters, (3) rules to map the auction parameters to a specific auction platform, and (4) special-case rules for subject domains. We demonstrate the flexibility of this approach by automatically generating several alternative negotiation structures for the domain of travel shopping in a trading agent competition. [source]


    To Commit or Not to Commit: Modeling Agent Conversations for Action

    COMPUTATIONAL INTELLIGENCE, Issue 2 2002
    Roberto A. Flores
    Conversations are sequences of messages exchanged among interacting agents. For conversations to be meaningful, agents ought to follow commonly known specifications limiting the types of messages that can be exchanged at any point in the conversation. These specifications are usually implemented using conversation policies (which are rules of inference) or conversation protocols (which are predefined conversation templates). In this article we present a semantic model for specifying conversations using conversation policies. This model is based on the principles that the negotiation and uptake of shared social commitments entail the adoption of obligations to action, which indicate the actions that agents have agreed to perform. In the same way, obligations are retracted based on the negotiation to discharge their corresponding shared social commitments. Based on these principles, conversations are specified as interaction specifications that model the ideal sequencing of agent participations negotiating the execution of actions in a joint activity. These specifications not only specify the adoption and discharge of shared commitments and obligations during an activity, but also indicate the commitments and obligations that are required (as preconditions) or that outlive a joint activity (as postconditions). We model the Contract Net Protocol as an example of the specification of conversations in a joint activity. [source]


    On-line motion blending for real-time locomotion generation

    COMPUTER ANIMATION AND VIRTUAL WORLDS (PREV: JNL OF VISUALISATION & COMPUTER ANIMATION), Issue 3-4 2004
    Sang Il Park
    Abstract In this paper, we present an integrated framework of on-line motion blending for locomotion generation. We first provide a novel scheme for incremental timewarping, which always guarantees that the time goes forward. Combining the idea of motion blending with that of posture rearrangement, we introduce a motion transition graph to address on-line motion blending and transition simultaneously. Guided by a stream of motion specifications, our motion synthesis scheme moves from node to node in an on-line manner while blending a motion at a node and generating a transition motion at an edge. For smooth on-line motion transition, we also attach a set of example transition motions to an edge. To represent similar postures consistently, we exploit the inter-frame coherency embedded in the input motion specification. Finally, we provide a comprehensive solution to on-line motion retargeting by integrating existing techniques. Copyright © 2004 John Wiley & Sons, Ltd. [source]
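
    The abstract's guarantee that warped time always goes forward can be pictured as clamping each incremental warp step to a positive minimum. A toy sketch under that assumption (not the paper's actual formulation):

```python
def advance(warped_t, dt, speed, min_step=1e-4):
    """One incremental timewarping step: scale the clock increment by the
    blend's playback speed, but never let warped time move backwards."""
    return warped_t + max(speed * dt, min_step)

t = 0.0
for speed in [1.2, 0.8, -0.5, 1.0]:  # the transient negative speed is clamped
    t = advance(t, 1 / 30, speed)
    print(round(t, 4))
```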


    A software player for providing hints in problem-based learning according to a new specification

    COMPUTER APPLICATIONS IN ENGINEERING EDUCATION, Issue 3 2009
    Pedro J. Muñoz-Merino
    Abstract The provision of hints during problem solving has been a successful strategy in the learning process. Several computer systems provide hints to students during problem solving, each covering some specific aspects of hinting. This article presents a novel software player module for providing hints in problem-based learning. We have implemented it in the XTutor Intelligent Tutoring System using its XDOC extension mechanism and the Python programming language. This player includes some of the functionalities present in different state-of-the-art systems, as well as new functionalities based on our own ideas and teaching experience. The article explains each feature for providing hints and gives a pedagogical justification or explanation for it. We have created an XML binding, so any combination of the hint-model functionalities can be expressed as an XML instance, enabling interoperability and reusability. The implemented player tool, together with the XTutor server-side XDOC processor, can interpret and run XML files conforming to this newly defined hints specification. Finally, the article presents several running examples of use of the tool, the subjects where it is in use, and results from quantitative and qualitative analysis that point to the positive impact of this hints tool on the learning process. © 2009 Wiley Periodicals, Inc. Comput Appl Eng Educ 17: 272–284, 2009; Published online in Wiley InterScience (www.interscience.wiley.com); DOI 10.1002/cae.20240 [source]


    Interaction-Dependent Semantics for Illustrative Volume Rendering

    COMPUTER GRAPHICS FORUM, Issue 3 2008
    Peter Rautek
    In traditional illustration the choice of appropriate styles and rendering techniques is guided by the intention of the artist. For illustrative volume visualizations it is difficult to specify the mapping between the 3D data and the visual representation that preserves the intention of the user. The semantic layers concept establishes this mapping with a linguistic formulation of rules that directly map data features to rendering styles. With semantic layers, fuzzy logic is used to evaluate the user-defined illustration rules in a preprocessing step. In this paper we introduce interaction-dependent rules that are evaluated for each frame and are therefore computationally more expensive. Enabling interaction-dependent rules, however, allows the use of a new class of semantics, resulting in more expressive interactive illustrations. We show that the evaluation of the fuzzy logic can be done on the graphics hardware, enabling the efficient use of interaction-dependent semantics. Further, we introduce the flat rendering mode and discuss how different rendering parameters are influenced by the rule base. Our approach provides high quality illustrative volume renderings at interactive frame rates, guided by the specification of illustration rules. [source]
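
    A toy sketch of the kind of rule evaluation the semantic-layers approach describes: fuzzy membership of a data feature, combined with an interaction-dependent term (cursor distance, re-read every frame), mapped to a rendering style parameter. Membership shapes and names are illustrative; the paper evaluates such rules on graphics hardware:

```python
def ramp(x, lo, hi):
    """Piecewise-linear fuzzy membership: 0 below lo, 1 above hi."""
    return min(1.0, max(0.0, (x - lo) / (hi - lo)))

def evaluate_rule(curvature, distance_to_cursor):
    """'If curvature is high and the sample is near the cursor, then
    opacity is high' -- interaction-dependent because it reads the
    cursor position each frame."""
    truth = min(ramp(curvature, 0.2, 0.6),              # fuzzy AND = min
                1.0 - ramp(distance_to_cursor, 0.1, 0.4))
    return 0.1 + 0.9 * truth                            # defuzzified opacity

print(evaluate_rule(curvature=0.5, distance_to_cursor=0.15))  # 0.775
```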


    Generating Animatable 3D Virtual Humans from Photographs

    COMPUTER GRAPHICS FORUM, Issue 3 2000
    WonSook Lee
    We present an easy, practical and efficient full-body cloning methodology. This system utilizes photos taken from the front, side and back of a person in any given imaging environment, without requiring a special background or controlled illumination. A seamless generic body specified in the VRML H-Anim 1.1 format is used to generate an individualized virtual human. The system is composed of two major components: face cloning and body cloning. The face-cloning component uses feature points on the front and side images and then applies DFFD for shape modification. Next, a fully automatic seamless texture mapping is generated for 360° coloring on a 3D polygonal model. The body-cloning component has two steps: (i) feature point specification, which enables automatic silhouette detection against an arbitrary background; (ii) two-stage body modification using feature points and the body silhouette, respectively. The final integrated human model has a photo-realistic animatable face, hands, feet and body. The result can be visualized in any VRML-compliant browser. [source]


    Design of a virtual environment aided by a model-based formal approach using DEVS,

    CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 11 2009
    Azzedine Boukerche
    Abstract A virtual environment (VE) is a modern computing technique that aims to provide an engaging and meaningful human–computer interaction platform, which can help human users learn, play or be trained in a 'like-real' situation. Recent advances in VE techniques have resulted in their wide use in many areas, in particular E-learning-based training applications. Many researchers have developed techniques for designing and implementing 3D virtual environments; however, the existing approaches cannot fully keep pace with the increasing complexity of modern VE applications. In this paper, we design and implement an engaging web-based 3D virtual environment application that aims to support the training of personnel working in the radiology department of a hospital. Furthermore, we present a model-based formal approach using the discrete event system specification (DEVS) to help us validate the X3D components' behavior. As a step further, DEVS also helps to optimize our design by simulating the design alternatives. Copyright © 2009 John Wiley & Sons, Ltd. [source]
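
    DEVS specifies an atomic model by its external and internal transition functions, output function and time advance. A minimal Python rendering of that structure, using a hypothetical two-state component of the sort such a scene might validate (names and timings invented):

```python
class AtomicDEVS:
    """Skeleton of a DEVS atomic model."""
    def delta_ext(self, s, e, x): raise NotImplementedError  # external transition
    def delta_int(self, s):       raise NotImplementedError  # internal transition
    def output(self, s):          raise NotImplementedError  # output function
    def ta(self, s):              raise NotImplementedError  # time advance

class SlidingDoor(AtomicDEVS):
    # States: 'closed' (passive) and 'open' (times out after 5 s).
    def delta_ext(self, s, e, x):
        return "open" if x == "approach" else s  # an approach event opens it
    def delta_int(self, s):
        return "closed"                          # timeout: the door closes
    def output(self, s):
        return "door_closing"                    # emitted at internal transition
    def ta(self, s):
        return 5.0 if s == "open" else float("inf")
```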


    JaMP: an implementation of OpenMP for a Java DSM

    CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 18 2007
    Michael Klemm
    Abstract Although OpenMP is a widely agreed-upon standard for the C/C++ and Fortran programming languages for the semi-automatic parallelization of programs for shared memory machines, not much has been done on the binding of OpenMP to Java that targets clusters with distributed memory. This paper presents three major contributions: (1) JaMP is an adaptation of the OpenMP standard to Java that implements a large subset of the OpenMP specification with an expressiveness comparable to that of OpenMP; (2) we suggest a set of extensions that allow a better integration of OpenMP into the Java language; (3) we present our prototype implementation of JaMP in the research compiler Jackal, a software-based distributed shared memory implementation for Java. We evaluate the performance of JaMP with a set of micro-benchmarks and with OpenMP versions of the parallel Java Grande Forum (JGF) benchmarks. The micro-benchmarks show that OpenMP for Java can be implemented without much overhead. The JGF benchmarks achieve a good speed-up of 5–8 on eight nodes. Copyright © 2007 John Wiley & Sons, Ltd. [source]


    A test suite for parallel performance analysis tools

    CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 11 2007
    Michael Gerndt
    Abstract Parallel performance analysis tools must be tested as to whether they perform their task correctly, which comprises at least three aspects. First, it must be ensured that the tools neither alter the semantics nor distort the run-time behavior of the application under investigation. Next, it must be verified that the tools collect the correct performance data as required by their specification. Finally, it must be checked that the tools perform their intended tasks and detect relevant performance problems. Focusing on the latter (correctness) aspect, testing can be done using synthetic test functions with controllable performance properties, possibly complemented by real-world applications with known performance behavior. A systematic test suite can be built from synthetic test functions and other components, possibly with the help of tools to assist the user in putting the pieces together into executable test programs. Clearly, such a test suite can be highly useful to builders of performance analysis tools. It is surprising that, up until now, no systematic effort has been undertaken to provide such a suite. In this paper we describe the APART Test Suite (ATS) for checking the correctness (in the above sense) of parallel performance analysis tools. In particular, we describe a collection of synthetic test functions which allows one to easily construct both simple and more complex test programs with desired performance properties. We briefly report on experience with MPI and OpenMP performance tools when applied to the test cases generated by ATS. Copyright © 2006 John Wiley & Sons, Ltd. [source]
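
    In the spirit of ATS (which targets MPI/OpenMP codes), a synthetic test function with a controllable performance property can be as simple as busy-work whose duration depends on the process rank, so the load imbalance a tool should detect is known in advance. The sketch below is a Python analogue, not ATS code:

```python
import time

def imbalanced_work(rank, base_s=0.1, imbalance=0.5):
    """Synthetic test function: every process burns base_s seconds of CPU,
    and rank 0 burns (1 + imbalance) times as much, giving a known
    'load imbalance' property for a performance tool to find."""
    duration = base_s * ((1.0 + imbalance) if rank == 0 else 1.0)
    end = time.perf_counter() + duration
    while time.perf_counter() < end:
        pass  # spin to create CPU load rather than idle sleep

for rank in range(4):  # stand-in for four parallel processes
    imbalanced_work(rank)
```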


    Coordinating components in middleware systems

    CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 13 2003
    Matthias Radestock
    Abstract Configuration and coordination are central issues in the design and implementation of middleware systems and are one of the reasons why building such systems is more complex than constructing stand-alone sequential programs. Through configuration, the structure of the system is established: which elements it contains, where they are located and how they are interconnected. Coordination is concerned with the interaction of the various components: when an interaction takes place, which parties are involved, what protocols are followed. Its purpose is to coordinate the behaviour of the various components to meet the overall system specification. The open and adaptive nature of middleware systems makes the task of configuration and coordination particularly challenging. We propose a model that can operate in such an environment and enables the dynamic integration and coordination of components by observing and coercing their behaviour through the interception of the messages exchanged between them. Copyright © 2003 John Wiley & Sons, Ltd. [source]
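
    A toy sketch of coordination by message interception, as the proposed model describes: a coordinator observes messages exchanged between components and coerces them to follow a protocol. The two-phase 'prepare/commit' protocol and all names are invented for illustration:

```python
class Coordinator:
    """Intercepts inter-component messages and enforces an ordering rule:
    a 'commit' may only follow a 'prepare' from the same sender."""
    def __init__(self):
        self.prepared = set()

    def intercept(self, sender, receiver, message, deliver):
        if message == "commit" and sender not in self.prepared:
            raise RuntimeError(f"protocol violation: {sender} committed without prepare")
        if message == "prepare":
            self.prepared.add(sender)
        deliver(receiver, message)  # only protocol-conforming messages pass

bus = Coordinator()
bus.intercept("A", "B", "prepare", lambda r, m: print(r, "received", m))
bus.intercept("A", "B", "commit",  lambda r, m: print(r, "received", m))
```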


    A quality-of-service-based framework for creating distributed heterogeneous software components

    CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 12 2002
    Rajeev R. Raje
    Abstract Component-based software development offers a promising solution for taming the complexity found in today's distributed applications. Today's and future distributed software systems will certainly require combining heterogeneous software components that are geographically dispersed. For the successful deployment of such a software system, it is necessary that its realization, based on assembling heterogeneous components, not only meets the functional requirements, but also satisfies the non-functional criteria such as the desired quality of service (QoS). In this paper, a framework based on the notions of a meta-component model, a generative domain model and QoS parameters is described. A formal specification based on two-level grammar is used to represent these notions in a tightly integrated way so that QoS becomes a part of the generative domain model. A simple case study is described in the context of this framework. Copyright © 2002 John Wiley & Sons, Ltd. [source]