Granularity
Selected Abstracts

Granularity in Relational Formalisms, with Application to Time and Space Representation
COMPUTATIONAL INTELLIGENCE, Issue 4 2001
Jérôme Euzenat

Temporal and spatial phenomena can be seen at a more or less precise granularity, depending on the kind of perceivable details. As a consequence, the relationship between two objects may differ depending on the granularity considered. This can raise problems when merging representations of different granularity. This paper presents general rules of granularity conversion in relation algebras. Granularity is considered independently of the specific relation algebra, by investigating operators for converting a representation from one granularity to another and presenting six constraints that they must satisfy. The constraints are shown to be independent and consistent, and general results about the existence of such operators are provided. The constraints are used to generate the unique pairs of operators for converting qualitative temporal relationships (upward and downward) from one granularity to another. Two fundamental constructors (product and weakening) are then presented: they permit the generation of new qualitative systems (e.g., space algebra) from existing ones and are shown to preserve most of the properties of granularity conversion operators. [source]

Effects of granularity of search results on the relevance judgment behavior of engineers: Building systems for retrieval and understanding of context
JOURNAL OF THE AMERICAN SOCIETY FOR INFORMATION SCIENCE AND TECHNOLOGY, Issue 3 2010
Panos Balatsoukas

Granularity is a novel concept for presenting information in the search result interfaces of hierarchical, query-driven information retrieval systems in a manner that supports understanding and exploration of the context of the retrieved information (e.g., by highlighting its position in the granular hierarchy and exposing its relationship with relatives in the hierarchy).
Little research, however, has been conducted on the effects of search-result granularity on the relevance judgment behavior of engineers. Engineers are highly motivated information users who are particularly interested in understanding the context of the retrieved information. It was therefore hypothesized that designing systems with careful regard for granularity would improve engineers' relevance judgment behavior. To test this hypothesis, a prototype system was developed and evaluated in terms of the time needed for users to find relevant information, the accuracy of their relevance judgments, and their subjective satisfaction. In the evaluation, participants completed a set of tasks, filled in a satisfaction questionnaire, and were interviewed. The findings showed that participants performed better and were more satisfied when the prototype system presented only relevant information in context. Although this study presents some novel findings about the effects of granularity and context on user relevance judgment behavior, the results should be interpreted with caution: participants were recruited by convenience and performed simulated rather than real tasks. Suggestions for further research are presented. [source]

Toward replication in grids for digital libraries with freshness and correctness guarantees
CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 17 2008
Fuat Akal

Abstract: Building digital libraries (DLs) on top of data grids while facilitating data access and minimizing access overheads is challenging. To achieve this, replication in a grid has to provide dedicated features that are only partly supported by existing grid environments. First, it must provide transparent and consistent access to distributed data. Second, it must dynamically control the creation and maintenance of replicas.
Third, it should allow replication granularities beyond individual files. Fourth, users should be able to specify their freshness demands, i.e., whether they need the most recent data or are satisfied with slightly outdated data. Finally, all these tasks must be performed efficiently. This paper presents an approach that will finally allow one to build a fully integrated, self-managing replication subsystem for data grids that provides all the above features. Our approach is to start with an accepted replication protocol for database clusters, namely PDBREP, and to adapt it to the grid. Copyright © 2008 John Wiley & Sons, Ltd. [source]
Study of a highly accurate and fast protein–ligand docking method based on molecular dynamics
CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 14 2005
M. Taufer

Abstract: Few methods use molecular dynamics simulations in concert with atomically detailed force fields to perform protein–ligand docking calculations, because such simulations are considered too time demanding despite their accuracy. In this paper we present a docking algorithm based on molecular dynamics which has highly flexible computational granularity. We compare its accuracy and running time with well-known, commonly used docking methods such as AutoDock, DOCK, FlexX, ICM, and GOLD. We show that our algorithm is accurate and fast and, because of its flexibility, applicable even to loosely coupled distributed systems such as desktop grids for docking. Copyright © 2005 John Wiley & Sons, Ltd. [source]

GridBLAST: a Globus-based high-throughput implementation of BLAST in a Grid computing framework
CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 13 2005
Arun Krishnan

Abstract: Improvements in the performance of processors and networks have made it feasible to treat collections of workstations, servers, clusters and supercomputers as integrated computing resources, or Grids. However, the very heterogeneity that is the strength of computational and data Grids can also make application development for such an environment extremely difficult. Application development in a Grid computing environment faces significant challenges in the form of problem granularity, latency and bandwidth issues, as well as job scheduling. Currently existing Grid technologies limit the development of Grid applications to certain classes, namely embarrassingly parallel, hierarchical parallelism, workflow and database applications. Of these classes, embarrassingly parallel applications are the easiest to develop in a Grid computing framework.
The work presented here deals with creating a Grid-enabled, high-throughput, standalone version of a bioinformatics application, BLAST, using Globus as the Grid middleware. BLAST is a sequence alignment and search technique that is embarrassingly parallel in nature and thus amenable to adaptation to a Grid environment. A detailed methodology for creating the Grid-enabled application is presented, which can be used as a template for the development of similar applications. The application has been tested on a 'mini-Grid' testbed, and the results presented here show that for large problem sizes a distributed, Grid-enabled version can significantly reduce execution times. Copyright © 2005 John Wiley & Sons, Ltd. [source]

Semantic Link Network Builder and Intelligent Semantic Browser
CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 14 2004
H. Zhuge

Abstract: The Semantic Link Network (SLN) is a semantic Web model based on semantic links, the natural extension of the hyperlink-based Web. The SLN-Builder is a software tool that enables definition, modification and verification of, as well as access to, an SLN. The SLN-Builder can convert an SLN definition into XML descriptions for cross-platform information exchange. The Intelligent Semantic Browser is used to visualize the SLN and carries out two types of reasoning at browsing time: small-granularity reasoning by chaining semantic links, and large-granularity reasoning by matching semantic views of the SLN. With the help of the reasoning mechanism, the browser can recommend content that is semantically relevant to the current browsing content, and it enables users to foresee the end-side content of a semantic link chain. This paper presents the design and implementation of the SLN-Builder and the intelligent semantic browser, as well as key algorithms. Copyright © 2004 John Wiley & Sons, Ltd. [source]

A flexible framework for consistency management
CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 1 2002
S. Weber

Abstract: Recent distributed shared memory (DSM) systems provide increasingly more support for the sharing of objects rather than portions of memory. However, like earlier DSM systems, these distributed shared object (DSO) systems still force developers to use a single protocol, or a small set of given protocols, for the sharing of application objects. This limitation prevents applications from optimizing their communication behaviour and results in unnecessary overhead. A current general trend in software systems development is towards customizable systems; for example, frameworks, reflection, and aspect-oriented programming all aim to give developers greater flexibility and control over the functionality and performance of their code. This paper describes a novel object-oriented framework that defines a DSM system in terms of a consistency model and an underlying coherency protocol. Different consistency models and coherency protocols can be used within a single application because they can be customized, by the application programmer, on a per-object basis. This allows application-specific semantics to be exploited at a very fine level of granularity, with a resulting improvement in performance. The framework is implemented in Java, and the speed-up obtained by a number of applications that use the framework is reported. Copyright © 2002 John Wiley & Sons, Ltd. [source]

Osteopontin is produced by mast cells and affects IgE-mediated degranulation and migration of mast cells
EUROPEAN JOURNAL OF IMMUNOLOGY, Issue 2 2008
Akiko Nagasaka

Abstract: Osteopontin (OPN), originally discovered in bone as an extracellular matrix protein, has been identified in many cell types in the immune system and is presumably involved in many aspects of the pathogenesis of inflammatory and immune diseases. Mast cells are also involved in such pathological aspects by secreting multiple mediators.
However, it has not been determined whether mast cells produce OPN and whether it affects their function. To test this, we used murine fetal skin-derived cultured mast cells (FSMC) and bone marrow-derived cultured mast cells. We found that OPN was spontaneously produced by FSMC and was inducible by ionomycin and FcεRI aggregation in bone marrow-derived cultured mast cells. In the presence of mast cell growth factors, FSMC were similarly generated from both OPN-deficient (OPN−/−) and -sufficient (OPN+/+) mice without significant differences in yield, purity, granularity, and viability. Using OPN−/− FSMC, we found that recombinant OPN augmented IgE-mediated degranulation and induced FSMC chemotaxis. Both effects were mediated by OPN receptors (i.e., CD44 and integrin αv). IgE-mediated passive cutaneous anaphylaxis was significantly reduced in OPN−/− mice compared with OPN+/+ mice, indicating the physiological relevance of OPN. These results indicate that OPN is a mast cell mediator that enhances mast cell responses to antigen and thus may influence mast cell-related pathological conditions. See accompanying commentary at http://dx.doi.org/10.1002/eji200738131 [source]

Water-repellent soil and its relationship to granularity, surface roughness and hydrophobicity: a materials science view
EUROPEAN JOURNAL OF SOIL SCIENCE, Issue 4 2005
G. McHale

Summary: Considerable soil water repellency has been observed at a wide range of locations worldwide. The soil exhibiting water repellency is found within the upper part of the soil profile. The reduced rate of water infiltration into these soils leads to severe runoff erosion and reduced plant growth. Soil water repellency is promoted by drying of the soil and can be induced by fire or intense heating of soil containing hydrophobic organic matter. Recent studies outside soil science have shown how the natural water repellency of materials, both porous and granular, can be enhanced into super-hydrophobicity by surface texture (i.e., surface roughness, pattern and morphology). The similarities between these super-hydrophobic materials and the observed properties of water-repellent soil are discussed from a non-soil-scientist, materials-based perspective. A simple model is developed for a hydrophobic granular surface, and it is shown that this can provide a mechanism for enhancement of soil water repellency through the relative size and spacing of grains and pores. The model provides a possible explanation for why soil water repellency should be more prevalent under dry conditions than wet. Consequences for water runoff, raindrop splash and soil erosion are discussed. [source]

Traffic analysis in optical burst switching networks: a trace-based case study
EUROPEAN TRANSACTIONS ON TELECOMMUNICATIONS, Issue 7 2009
Ahmad Rostami

Optical burst switching (OBS) appears to be a promising technology for building dynamic optical transport networks. The main advantage of OBS is that it allows for dynamic allocation of resources at sub-wavelength granularity. Nevertheless, the burst contention problem, which occurs frequently inside the network, has to be addressed before OBS can really be deployed as the next-generation optical transport network. Recently, a lot of attention has been devoted to different approaches for resolving contention in OBS networks. Although the performance analysis of these approaches depends strongly on the traffic characteristics in the network, the majority of studies have so far been based on very hypothetical traffic assumptions. In this study we use traces of real measurements in the Internet to derive realistic data about the traffic that is injected into the OBS network. Specifically, we investigate the marginal distributions of burst size, burst interdeparture time, assembly delay and number of packets per burst, as well as the burstiness of the burst traces.
We demonstrate that the performance of an OBS core node under the real traces is very similar to the results obtained when the traffic arriving at the core node is assumed to be Poisson. In fact, using a Poisson process for burst arrivals at the core node leads, in all the investigated cases, to an upper bound on the burst drop rate at that node. Copyright © 2009 John Wiley & Sons, Ltd. [source]

Expression of GM1, a marker of lipid rafts, defines two subsets of human monocytes with differential endocytic capacity and lipopolysaccharide responsiveness
IMMUNOLOGY, Issue 4 2007
M. Maximina Bertha Moreno-Altamirano

Summary: Monocytes constitute 5–10% of total human peripheral blood leucocytes and remain in circulation for several days before replenishing the tissue macrophage populations. Monocytes display heterogeneity in size, granularity and nuclear morphology, and in the expression of cell membrane molecules, such as CD14, CD16, CD32, CD64, major histocompatibility complex class II, CCR2 and CCR5, among others. This has led to the suggestion that individual monocyte/macrophage populations have specialized functions within their microenvironments. This study provides evidence for the occurrence of two peripheral blood monocyte subpopulations on the basis of their differential expression of GM1, a sphingolipid found mostly in lipid rafts: a CD14+ GM1low population and a CD14+ GM1high population, comprising about 97·5% and 2·5% of total CD14+ cells, respectively. GM1 expression correlates with functional differences in terms of endocytic activity, susceptibility to mycobacterial infection, and response to lipopolysaccharide (LPS) (modulation of Toll-like receptor-4 expression). CD14+ GM1low cells proved to be less endocytic and more responsive to LPS, whereas CD14+ GM1high cells are more endocytic and less responsive to LPS.
In addition, during monocyte-to-macrophage differentiation in vitro, the percentage of CD14+ GM1high cells increases from about 2·5% at day 1 to more than 50% at day 7 of culture. These results suggest that GM1low and GM1high monocytes in peripheral blood represent either different stages of maturation or different subsets with specialized activities. The expression of CD16 on GM1high cells favours the first possibility; on the other hand, the up-regulation of GM1 expression, and probably of lipid raft function, may be involved in the monocyte-to-macrophage differentiation process. [source]

End-user access to multiple sources: incorporating knowledge discovery into knowledge management
INTELLIGENT SYSTEMS IN ACCOUNTING, FINANCE & MANAGEMENT, Issue 4 2002
Katharina Morik

The End-User Access to Multiple Sources (Eams) system integrates given information sources into a knowledge management system. It relates the world of documents with the database world using an ontology. The focus in developing the Eams system is on the acquisition and maintenance of knowledge. Hence, machine learning is applied in both worlds. In the document world, a learning search engine adapts to user behaviour by analysing click-through data. This eases the personalization of selecting appropriate documents for users and does not require further maintenance. In the database world, knowledge discovery in databases (KDD) bridges the gap between the fine granularity of relational databases and the actual information needs of users. KDD extracts knowledge from data and therefore allows the knowledge management system to make good use of already existing company data, without further acquisition or maintenance. A graphical user interface provides users with uniform access to document collections on the Internet (intranet) as well as to relational databases.
Since the ontology generates the items in the user interface, a change in the ontology automatically changes the user interface without further effort. The Eams system has been applied to customer relationship management in the insurance domain. Questions to be answered by the system concern customer acquisition (e.g., direct marketing), customer up- and cross-selling (e.g., which products sell well together), and customer retention (here, which customers are likely to leave the insurance company or ask for a return of a capital life insurance). Documents about other insurance companies and demographic data published on the Internet contribute to the answers, as do the results of data analysis of the company's contracts. Copyright © 2003 John Wiley & Sons, Ltd. [source]

Performance comparison between fixed length switching and variable length switching
INTERNATIONAL JOURNAL OF COMMUNICATION SYSTEMS, Issue 5 2008
Chengchen Hu

Abstract: Fixed length switching (FLS) and variable length switching (VLS) are the two main types of switching architecture in high-speed input-queued switches. FLS is based on cell-by-cell scheduling, while VLS operates at the granularity of variable-length packets. This paper makes a comprehensive comparison between these two switching modes to guide industrial design and academic research. We use stochastic models, Petri net models, analysis and simulations to investigate various performance measures of interest. Average packet latency, bandwidth utilization, segmentation and reassembly overhead, and packet loss are the key parameters identified as influencing the outcome of the comparison. The results achieved in this paper are twofold. On the one hand, FLS achieves smaller packet loss and lower packet delay for short packets. On the other hand, VLS offers better bandwidth utilization, reduced implementation complexity and lower average packet delay.
We recommend VLS in conclusion, since its disadvantages can be compensated for by known methods, while the problems of FLS are difficult to solve. Copyright © 2007 John Wiley & Sons, Ltd. [source]

Argumentation within deductive reasoning
INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, Issue 1 2007
Armin Fiedler

Deductive reasoning is an area related to argumentation where machine-based techniques, notably theorem proving, can contribute substantially to the formation of arguments. However, making use of the functionality of theorem provers for this purpose is associated with a number of difficulties and, as we demonstrate, requires considerable effort to obtain reasonable results. Aiming at the exploitation of machine-oriented reasoning for human-adequate argumentation in a broader sense, we present our model for producing proof presentations from machine-oriented inference structures. Capabilities of the model include adaptation to human-adequate degrees of granularity and explicitness in the underlying argumentation, and interactive exploration of proofs. Enhancing capabilities in all these respects, even just those we have addressed so far, not only improves the interactive use of theorem provers but is an essential ingredient in supporting the functionality of dialog-oriented tutorial systems in formal domains. © 2007 Wiley Periodicals, Inc. Int J Int Syst 22: 49–70, 2007. [source]

Optimal choice of granularity in commonsense estimation: Why half-orders of magnitude?
INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, Issue 8 2006
Jerry R. Hobbs

It has been observed that when people make crude estimates, they feel comfortable choosing between alternatives that differ by a half-order of magnitude (e.g., were there 100, 300, or 1000 people in the crowd?) and less comfortable making a choice on a more detailed scale, with finer granules, or on a coarser scale (like 100 or 1000).
In this article, we describe two models of choosing granularity in commonsense estimates, and we show that for both models, at the optimal granularity, the next estimate is three to four times larger than the previous one. These two optimization results thus explain the commonsense granularity. © 2006 Wiley Periodicals, Inc. Int J Int Syst 21: 843–855, 2006. [source]

An introduction of the condition class space with continuous value discretization and rough set theory
INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, Issue 2 2006
Malcolm J. Beynon

The granularity of an information system has an incumbent effect on the efficacy of the analysis from many machine learning algorithms. An information system contains a universe of objects characterized and categorized by condition and decision attributes. To manage the concomitant granularity, a level of continuous value discretization (CVD) is often undertaken. In the case of the rough set theory (RST) methodology for object classification, the granularity contributes to the grouping of objects into condition classes with the same condition attribute values. This article exposits the effect of a level of CVD on the condition classes subsequently constructed, introducing the condition class space: the domain within which the condition classes exist. This domain elucidates the association of the condition classes with the related decision outcomes, reflecting the inexactness incumbent when a level of CVD is undertaken. A series of measures is defined that quantify this association. Throughout this study, and without loss of generality, the findings are made through the RST methodology. This further offers a novel exposition of the relationship between all the condition attributes and the RST-related reducts (subsets of condition attributes). © 2006 Wiley Periodicals, Inc. Int J Int Syst 21: 173–191, 2006.
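The "three to four times larger" optimum reported in the Hobbs abstract above follows directly from the geometry of a half-order-of-magnitude scale: successive granules are spaced by a factor of 10^(1/2) ≈ 3.16. A minimal illustrative sketch (the function name and exponent range are invented for illustration, not taken from the article):

```python
def half_order_scale(lo_exp, hi_exp):
    """Estimate points spaced half an order of magnitude apart:
    10^(k/2) for every half-integer exponent k/2 in [lo_exp, hi_exp]."""
    return [round(10 ** (k / 2)) for k in range(2 * lo_exp, 2 * hi_exp + 1)]

# Crowd-size alternatives between 10^2 and 10^3: 100, ~316, 1000 --
# the familiar "100, 300, or 1000" choices mentioned in the abstract.
scale = half_order_scale(2, 3)

# Each granule is sqrt(10) ~ 3.16 times the previous one, i.e. the
# "three to four times larger" ratio the two optimization models yield.
ratio = 10 ** 0.5
```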
[source]

Subsessions: A granular approach to click path analysis
INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, Issue 7 2004
Ernestina Menasalvas

The fiercely competitive web-based electronic commerce (e-commerce) environment has made necessary the application of intelligent methods to gather and analyze information collected from consumer web sessions. Knowledge about user behavior and session goals can be discovered from the information gathered about user activities, as tracked by web clicks. Most current approaches to customer behavior analysis study the user session by examining each web page access. However, the abstraction of subsessions provides a more granular view of user activity. Here, we propose a method of increasing the granularity of user session analysis by isolating useful subsessions within sessions. Each subsession represents a high-level user activity such as performing a purchase or searching for a particular type of information. Given a set of previously identified subsessions, we can determine at which point the user begins a preidentified subsession by tracking user clicks. With this information we can (1) optimize the user experience by precaching pages or (2) provide an adaptive user experience by presenting pages according to our estimate of the user's ultimate goal. To identify subsessions, we present an algorithm to compute frequent click paths, from which subsessions can then be isolated. The algorithm functions by scanning all user sessions and extracting all frequent subpaths, using a distance function to determine subpath similarity. Each frequent subpath represents a subsession. An analysis of the pages represented by the subsession provides additional information about semantically related activities commonly performed by users. © 2004 Wiley Periodicals, Inc.
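The abstract above describes, at a high level, mining frequent click subpaths and treating each as a candidate subsession. As an illustration only (the paper's actual algorithm uses a distance function for approximate subpath similarity, which is omitted here, and all names, thresholds, and page labels are invented), a naive exact-match version can be sketched as:

```python
from collections import Counter

def frequent_subpaths(sessions, min_len=2, max_len=4, min_support=2):
    """Count every contiguous click subpath of bounded length across all
    sessions; subpaths meeting the support threshold are candidate
    subsessions."""
    counts = Counter()
    for session in sessions:
        for n in range(min_len, max_len + 1):
            for i in range(len(session) - n + 1):
                counts[tuple(session[i:i + n])] += 1
    return {path: c for path, c in counts.items() if c >= min_support}

# Toy click logs: three user sessions over hypothetical page names.
sessions = [
    ["home", "search", "results", "product", "cart"],
    ["home", "search", "results", "product"],
    ["help", "search", "results"],
]
frequent = frequent_subpaths(sessions)
# ("search", "results") is supported by all three sessions, and
# ("search", "results", "product") by two -- both candidate subsessions.
```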
[source]

Developing Clinical Terms for Health Visiting in the United Kingdom
INTERNATIONAL JOURNAL OF NURSING TERMINOLOGIES AND CLASSIFICATION, Issue 2003
June Clark

BACKGROUND: The UK health visiting service provides a universalist preventive health service that focuses mainly on families with young children and the elderly or vulnerable, but anyone who wishes can access the services. The principles of health visiting have been formally defined as the search for health needs, the stimulation of awareness of health needs, influencing policies that affect health, and the facilitation of health-enhancing activities. The project is currently in its fourth phase. In phase 1, 17 health visitors recorded their encounters with families with new babies over a period of 3 months; in phase 2, 27 health visitors recorded their encounters with a wider range of clients (769 encounters with 205 families) over a period of 9 months; in phase 3, the system is being used by a variety of healthcare professionals in a specialist program that provides intensive parenting support; phase 4 is developing a prototype of an automated version for point-of-contact recording. UK nursing has no tradition of standardized language, and the concept of nursing diagnosis is almost unknown. Over the past decade, however, the government has initiated the development of a standardized terminology (Read codes) to cover all disciplines and all aspects of health care, and it is likely that the emerging SNOMED-CT terminology (a merger of the Read codes with the SNOMED terminology) will be mandated for use throughout the National Health Service (NHS). MAIN CONTENT POINTS: The structure and key elements of the Omaha System were retained, but the terminology was modified to take account of the particular field of practice and emerging UK needs. Modifications made were carefully tracked. The Problem Classification Scheme was modified as follows:
- All terms were anglicized.
- Some areas, notably relating to antepartum/postpartum care, neonatal care, child protection, and growth and development, were expanded.
- The qualifiers "actual," "potential," and "health promotion" were changed to "problem," "risk," and "no problem."
- Risk factors were included as modifiers of "risk," alongside the "signs and symptoms" that qualify problems.
The Intervention Classification was modified by substituting synonymous terms for "case management" and "surveillance" and by dividing "health teaching, guidance, and counseling" into two categories. The Omaha System "targets" were renamed "focus," and a new axis of "recipient" was introduced in line with SNOMED-CT. The revised terminologies were tested in use and also sent for review to 3 nursing language experts and 12 practitioners, who were asked to review them for domain completeness, appropriate granularity, parsimony, synonymy, nonambiguity, nonredundancy, context independence, and compatibility with emerging multiaxial and combinatorial nomenclatures. Review comments were generally very favourable, and the modifications suggested are being incorporated. CONCLUSIONS: The newly published government strategy for information management and technology in the NHS in Wales requires the rapid development of an electronic patient record, for which the two prerequisites are structured documentation and the use of standardized language. The terminology developed in this project will enable nursing concepts to be incorporated into the new systems. The experiences of the project team also offer many lessons that will be useful for developing the necessary educational infrastructure.
[source]

Standardized Care Planning: Evaluation of ICNP Beta in the Areas of Nutrition and Skin Care in a Practice Setting
INTERNATIONAL JOURNAL OF NURSING TERMINOLOGIES AND CLASSIFICATION, Issue 2003
Jan Florin

PURPOSE: To evaluate completeness, granularity, multiple axial content, and clinical utility of the beta version of the International Classification of Nursing Practice (ICNP®). METHODS: Standardized care plans were developed based on research in the areas of nutrition and skin care and clinically tested in a 35-bed infectious disease unit at a Swedish university hospital. A convenience sample of 56 computerized and manual patient records was content analyzed and mapped to the terms in ICNP® beta. FINDINGS: A total of 1,771 phrases were identified. Approximately 60% of the record content describing nursing phenomena, and about one third of the nursing interventions, in the areas of nutrition and skin care could be expressed satisfactorily using the terminology of ICNP® beta. For about 25% of the content describing both nursing phenomena and interventions, no corresponding term was found. The most common deficiencies were focus terms for stating patient perspective or collaboration, nonhuman focus, normal findings, more qualitative judgments, and different expressions for stating duration. Some terms are available in the ICNP beta as a whole, but the organization of the axes impedes or restricts the use of terms beyond the ICNP categories. Terms needed to express nursing phenomena could sometimes be found in the nursing actions axes. CONCLUSIONS: The ICNP® beta needs to be further developed to capture relevant data in nursing care. The axial structure needs to be evaluated, and the completeness and granularity of terms need to be addressed further, before ICNP beta can be used on a daily basis in the clinical setting.
Terms need to be developed to express patient participation and preferences, normal conditions, qualitative dimensions and characteristics, nonhuman focus, and duration. Empirical studies covering the complexity of information in nursing care are needed. [source] Applicability of the International Classification of Nursing Practice (ICNP®) in the Areas of Nutrition and Skin Care INTERNATIONAL JOURNAL OF NURSING TERMINOLOGIES AND CLASSIFICATION, Issue 1 2003 Margareta Ehnfors PhD PURPOSE. To evaluate completeness, granularity, multiple axial content, and clinical utility of the beta version of the ICNP® in the context of standardized nursing care planning in a clinical setting. METHODS. A 35-bed acute care ward for infectious diseases at a Swedish university hospital was selected for clinical testing. A convenience sample of 56 patient records with data on nutrition and skin care was analyzed and mapped to the ICNP. FINDINGS. Using the ICNP terminology, 59%–62% of the record content describing nursing phenomena and 30%–44% of the nursing interventions in the areas of nutrition and skin care could be expressed satisfactorily. For about a quarter of the content describing nursing phenomena and interventions, no corresponding ICNP term was found. CONCLUSIONS. The ICNP needs to be further developed to allow representation of the entire range of nursing care. Terms need to be developed to express patient participation and preferences, normal conditions, qualitative dimensions and characteristics, nonhuman focus, and duration. PRACTICE IMPLICATIONS. The practical usefulness of the ICNP needs further testing before conclusions about its clinical benefits can be determined. 
Search terms: ICNP®, nursing classification, standardized terminology, VIPS [source] Areca nut extract-treated gingival fibroblasts modulate the invasiveness of polymorphonuclear leukocytes via the production of MMP-2 JOURNAL OF ORAL PATHOLOGY & MEDICINE, Issue 1 2009 Hsuan-Hsuan Lu Background: Areca nut chewing is associated with an increase in the incidence of oral neoplastic or inflammatory diseases. Aberrations in matrix metalloprotease (MMP) expression are associated with the pathogenesis of oral diseases. This study investigated the potential effects of areca nut extract (ANE) on human gingival fibroblasts and the consequential impacts on inflammatory pathogenesis. Methods: Analyses of senescence markers, cell viability, changes of the cell cycle, and cell granularity in gingival fibroblasts, together with an assessment of the invasiveness of polymorphonuclear (PMN) leukocytes after treatment with the supernatant of ANE-treated gingival fibroblasts, were performed to characterize the phenotypic impacts. Western blotting and gelatin zymography were used to assay the expression and activity of MMP-2. Results: Chronic subtoxic (<10 µg/ml) ANE treatment resulted in premature growth arrest, appearance of senescence-associated β-galactosidase activity, and various other senescence-associated phenotypes in gingival fibroblasts. Gingival fibroblasts established from older individuals had a higher propensity to become ANE-induced senescent gingival fibroblasts. An activation of MMP-2 was identified in senescent cells. PMN leukocytes treated with the supernatant of ANE-induced senescent cells exhibited a significant increase in invasiveness, which was abrogated by both an MMP-2 blocker and an MMP-2 nullifying antibody. Conclusions: This study provides evidence that MMP-2 secreted from ANE-induced senescent gingival fibroblasts facilitates the invasiveness of PMN leukocytes, which could be associated with the oral inflammatory process in areca chewers. 
[source] Apoptosis inducing activity of viscin, a lipophilic extract from Viscum album L. JOURNAL OF PHARMACY AND PHARMACOLOGY: AN INTERNATIONAL JOURNAL OF PHARMACEUTICAL SCIENCE, Issue 1 2005 K. Urech Detection of antiproliferative activity and bioactivity-guided fractionation of viscin, a lipophilic extract from Viscum album L., led to the isolation of betulinic acid, oleanolic acid and ursolic acid as active components. Viscin, betulinic acid, oleanolic acid and ursolic acid inhibited growth and induced apoptotic cell death in Molt4, K562 and U937 leukaemia cells. The growth inhibitory effect of viscin was more pronounced in Molt4 and U937 cells (IC50 (concentration that inhibited cell proliferation by 50%): 118 ± 24 and 138 ± 24 µg mL⁻¹) than in K562 cells (IC50: 252 ± 37 µg mL⁻¹). Oleanolic acid was the least effective in all cell lines (7.5–45.5% inhibition at 10 µg mL⁻¹) and ursolic acid the most active in Molt4 and U937 cells (81.8 and 97.8% inhibition, respectively, at 5 µg mL⁻¹). A dose-dependent loss of membrane phospholipid asymmetry associated with apoptosis was induced in all cell lines, as shown in flow cytometry by the externalization of phosphatidylserine and morphological changes in cell size and granularity. There were differences in individual cell lines' responses towards the apoptosis-inducing effect of viscin, betulinic acid, oleanolic acid and ursolic acid. The triterpenoids β-amyrin, β-amyrin acetate, lupeol, lupeol acetate, β-sitosterol and stigmasterol, and the fatty acids oleic acid, linoleic acid, palmitic acid and stearic acid were also present in the lipophilic extract. [source] Scatter matters: Regularities and implications for the scatter of healthcare information on the Web JOURNAL OF THE AMERICAN SOCIETY FOR INFORMATION SCIENCE AND TECHNOLOGY, Issue 4 2010 Suresh K. Bhavnani Abstract Despite the development of huge healthcare Web sites and powerful search engines, many searchers end their searches prematurely with incomplete information. 
Recent studies suggest that users often retrieve incomplete information because of the complex scatter of relevant facts about a topic across Web pages. However, little is understood about the regularities underlying such information scatter. To probe the regularities within the scatter of facts across Web pages, this article presents the results of two analyses: (a) a cluster analysis of Web pages that reveals the existence of three page clusters that vary in information density and (b) a content analysis that suggests the role each of these page clusters plays in providing comprehensive information. These results have implications for the design of Web sites, search tools, and training to help users find comprehensive information about a topic, and they suggest a hypothesis describing the underlying mechanisms causing the scatter. We conclude by briefly discussing how the analysis of information scatter, at the granularity of facts, complements existing theories of information-seeking behavior. [source] Effects of granularity of search results on the relevance judgment behavior of engineers: Building systems for retrieval and understanding of context JOURNAL OF THE AMERICAN SOCIETY FOR INFORMATION SCIENCE AND TECHNOLOGY, Issue 3 2010 Panos Balatsoukas Granularity is a novel concept for presenting information in search result interfaces of hierarchical query-driven information retrieval systems in a manner that can support understanding and exploration of the context of the retrieved information (e.g., by highlighting its position in the granular hierarchy and exposing its relationship with relatives in the hierarchy). Little research, however, has been conducted on the effects of granularity of search results on the relevance judgment behavior of engineers. Engineers are highly motivated information users who are particularly interested in understanding the context of the retrieved information. 
Therefore, it is hypothesized that designing systems with careful regard for granularity would improve engineers' relevance judgment behavior. To test this hypothesis, a prototype system was developed and evaluated in terms of the time needed for users to find relevant information, the accuracy of their relevance judgments, and their subjective satisfaction. In the evaluation, a user study was conducted in which participants were asked to complete tasks, fill in a satisfaction questionnaire, and be interviewed. The findings showed that participants performed better and were more satisfied when the prototype system presented only relevant information in context. Although this study presents some novel findings about the effects of granularity and context on user relevance judgment behavior, the results should be interpreted with caution: for example, participants were recruited by convenience and performed a set of simulated tasks rather than real ones. Suggestions for further research are presented. [source] Context-based generic cross-lingual retrieval of documents and automated summaries JOURNAL OF THE AMERICAN SOCIETY FOR INFORMATION SCIENCE AND TECHNOLOGY, Issue 2 2005 Wai Lam We develop a context-based generic cross-lingual retrieval model that can deal with different language pairs. Our model considers contexts in the query translation process, exploiting contexts in the query as well as in the documents based on co-occurrence statistics computed from passages of different granularity. We also investigate cross-lingual retrieval of automatically generated generic summaries. We have implemented our model for two different cross-lingual settings, namely, retrieving Chinese documents from English queries as well as retrieving English documents from Chinese queries. 
Extensive experiments have been conducted on a large-scale parallel corpus, enabling studies of retrieval performance in both cross-lingual settings for full-length documents as well as automated summaries. [source] Quantitative assessment of chondrocyte viability after laser-mediated reshaping: A novel application of flow cytometry LASERS IN SURGERY AND MEDICINE, Issue 1 2003 Alexandre Rasouli BS Abstract Background and Objectives Lasers can be used to reshape cartilage by accelerating mechanical stress relaxation. In this study, fluorescent differential cell viability staining and flow cytometry were used to determine chondrocyte viability following laser heating. Study Design/Materials and Methods Porcine septal cartilages were irradiated with an Nd:YAG laser (λ = 1.32 µm, 25 W/cm²) while surface temperature, stress relaxation, and diffuse reflectance were recorded. Each slab received one, two, or three laser exposures (respective exposure times of 6.7, 7.2, 10 seconds). Irradiated samples were then divided into two groups analyzed immediately and at 5 days following laser exposure. Chondrocytes were isolated following serial enzymatic digestion and stained using SYTO®/DEAD Red™ (Molecular Probes, Eugene, OR). A flow cytometer was then used to detect differential cell fluorescence; size; granularity; and the number of live cells, dead cells, and post-irradiation debris in each treatment population. Results Nearly 60% of chondrocytes from reshaped cartilage samples isolated shortly after one irradiation were viable, while non-irradiated controls were 100% viable. Specimens irradiated two or three times demonstrated increasing amounts of cellular debris along with a reduction in chondrocyte viability: 31 and 16% after two and three exposures, respectively. In those samples maintained in culture medium and assayed 5 days after irradiation, viability was reduced by 28–88%, with the least deterioration in untreated and singly irradiated samples. 
Conclusions Functional fluorescent dyes combined with flow cytometric analysis successfully determine the effect of laser irradiation on the viability of reshaped cartilage. Lasers Surg. Med. 32:3–9, 2003. © 2003 Wiley-Liss, Inc. [source] Packet OADMs for the next generation of ring networks BELL LABS TECHNICAL JOURNAL, Issue 4 2010 Dominique Chiaroni The deployment of fiber-to-the-home (FTTH) technology in access networks is creating new demands on metropolitan area and backbone networks. The increasing bit rate per user and the simplification of access networks will make the traffic profile more bursty, requiring new, flexible techniques in the metropolitan area network. This paper describes a ring network exploiting optical transparency and packet granularity. After a description of the packet optical add/drop multiplexer (POADM)-based network model, motivated by specifications derived from expected needs, the paper addresses the advantages of the approach and the feasibility of the concept. © 2010 Alcatel-Lucent. [source] Monitoring infrastructure for converged networks and services BELL LABS TECHNICAL JOURNAL, Issue 2 2007 Shipra Agrawal Network convergence is enabling service providers to deploy a wide range of services such as Voice over Internet Protocol (VoIP), Internet Protocol television (IPTV), and push-to-talk on the same underlying IP networks. Each service has unique performance requirements, and IP networks have not been designed to satisfy these diverse requirements easily. These requirements drive the need for a robust, scalable, and easy-to-use network management platform that enables service providers to monitor and manage their networks to provide the necessary quality, availability, and security. In this paper, we describe monitoring mechanisms that give service providers critical information on the performance of their networks at a per-user, per-service granularity in real time. 
This allows service providers to ensure that their networks adequately satisfy the requirements of the various services. We present various methods to acquire data, which can be analyzed to determine the performance of the network. This platform enables service providers to offer carrier-grade services over their converged networks, giving their customers a high-quality experience. © 2007 Alcatel-Lucent. [source] Mixing Deposition of Upper Carboniferous in Jiangshan, Zhejiang Province and its Tectonic Significance ACTA GEOLOGICA SINICA (ENGLISH EDITION), Issue 2 2010 Fusheng GUO Abstract: The Outangdi Formation in Jiangshan, Zhejiang, is a mixed deposit of terrigenous clastics and carbonates of Weiningian (late Carboniferous) age. The mixed deposits take two forms: interbedding, which constitutes a series of alternating clastic and carbonate beds, and mixing within a single bed, which forms "hunji rock". The Outangdi Formation shows features of intercalated marine and terrestrial deposits with progradational sequences, grading from finer-grained sediments in the lower part of the section to coarser-grained sediments in the upper part. Hunji rock formed in a seashore environment: it is a mixed carbonate sediment of beach or tidal-flat facies, with quartz sand carried from a bayou or beach by longshore currents and circulation. There are two kinds of hunji sequences: (1) interbeds of sandstone and carbonate rock in seashore environments, and (2) interbeds of fluvial clastics and marine carbonate rock. The mixed deposits are interpreted as "facies mixing," controlled mainly by regional tectonic uplift, rise of the global sea level, and the dynamics of the water medium in the basin. Periodic regional sea-level changes and the progradational sequences probably resulted from intense uplift of the old land called Cathaysia. The classification and naming of mixed sediments are also discussed in the present study. 
Interbedded and alternating clastic and carbonate beds are named the "hunji sequence," a new genetic term. It is suggested that "hunji rock" denotes a special depositional event mixing terrigenous clastics and carbonates, rather than the name of a specific rock type. [source]