Specifications

Distribution by Scientific Domains
Distribution within Business, Economics, Finance and Accounting

Kinds of Specifications

  • alternative model specifications
  • alternative specifications
  • design specifications
  • different specifications
  • model specifications
  • performance specifications
  • quality specifications
  • standard specifications
  • technical specifications


  • Selected Abstracts


    THE IMPACT OF CLEAN FUEL SPECIFICATIONS ON ADELAIDE RETAIL PETROL PRICES

    AUSTRALIAN ECONOMIC PAPERS, Issue 1 2009
    ALISTAIR DAVEY
    In March 2001, the South Australian Government introduced a clean fuel policy which it claimed was designed to protect air quality. This paper quantifies the policy's impact on relative Adelaide retail prices for unleaded petrol through Box-Jenkins autoregressive integrated moving average methodology coupled with Box and Tiao intervention analysis. The analysis uses weekly price data spanning from January 2000 until the beginning of June 2002. It finds the clean fuel policy had a statistically significant impact on relative retail petrol prices, resulting in an increase of almost 1.9 cents per litre and, therefore, costing Adelaide motorists around an extra $15.8 million per annum. Based on claims that the quality of fuel produced by the local Adelaide refiner did not change in response to the implementation of the clean fuel policy, the paper concludes that the increase in relative retail petrol prices was most likely associated with the exercise of market power rather than an increase in refinery production costs. [source]
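    As a rough illustration of the intervention-analysis methodology described above, the sketch below fits an ARIMA model with a step dummy marking the policy date. It assumes the statsmodels library; the price series, the March 2001 step, and the (1,1,1) order are placeholders, not the paper's data or fitted specification.

```python
# Minimal sketch of Box-Jenkins ARIMA estimation with a Box-Tiao style step
# intervention. All data and the model order are illustrative assumptions.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# hypothetical weekly relative retail price series
prices = pd.Series(np.random.randn(130).cumsum() + 80.0,
                   index=pd.date_range("2000-01-07", periods=130, freq="W"))

# step dummy: 0 before the clean fuel policy, 1 from March 2001 onwards
step = (prices.index >= "2001-03-01").astype(float)

model = SARIMAX(prices, exog=step, order=(1, 1, 1))
result = model.fit(disp=False)

# the exogenous coefficient estimates the permanent level shift attributed
# to the intervention (here, cents per litre)
print(result.params["x1"])
```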


    Dynamic energy and exergy analyses of an industrial cogeneration system

    INTERNATIONAL JOURNAL OF ENERGY RESEARCH, Issue 4 2010
    Yilmaz Yoru
    Abstract The study deals with the energetic and exergetic analyses of a cogeneration (combined heat and power, CHP) system installed in a ceramic factory, located in Izmir, Turkey. This system has three gas turbines with a total capacity of 13 MW, six spray dryers and two heat exchangers. In the analysis, actual operational data over a one-month period are utilized. The so-called CogeNNexT code is written in C++ and developed to analyze energetic and exergetic data from a database. This code is also used to analyze turbines, spray dryers and heat exchangers in this factory. Specifications of some parts of system components have been collected from the factory. Based on the 720 h data pattern (including 43,200 data points), the mean energetic and exergetic efficiency values of the cogeneration system are found to be 82.3% and 34.7%, respectively. Copyright © 2009 John Wiley & Sons, Ltd. [source]
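    For orientation, the two efficiency figures quoted above correspond to a first-law (energy) and a second-law (exergy) ratio of useful output to input. The sketch below is a generic illustration with invented numbers and stream names, not the plant's measured data or the CogeNNexT code.

```python
# Generic first-law and second-law efficiency ratios for a CHP train.
# Numbers are invented placeholders; the study works on a month of stream data.

def energy_efficiency(useful_energy_out_kw: float, fuel_energy_in_kw: float) -> float:
    """First-law efficiency: useful energy output (power + heat) over fuel energy input."""
    return useful_energy_out_kw / fuel_energy_in_kw

def exergy_efficiency(exergy_out_kw: float, exergy_in_kw: float) -> float:
    """Second-law efficiency: useful exergy output over exergy input."""
    return exergy_out_kw / exergy_in_kw

print(energy_efficiency(useful_energy_out_kw=10_700.0, fuel_energy_in_kw=13_000.0))  # ~0.82
print(exergy_efficiency(exergy_out_kw=4_700.0, exergy_in_kw=13_500.0))               # ~0.35
```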


    Robustness of Spatial Autocorrelation Specifications: Some Monte Carlo Evidence

    JOURNAL OF REGIONAL SCIENCE, Issue 2 2003
    Robin Dubin
    The generated data are then used to estimate all of the models. The estimated models are evaluated primarily on their predictive power. [source]


    Buchbesprechung (Book Review): Concrete Mix Design, Quality Control and Specifications.

    BETON- UND STAHLBETONBAU, Issue 12 2008
    By Ken W. Day.
    No abstract is available for this article. [source]


    Utility Functions for Ceteris Paribus Preferences

    COMPUTATIONAL INTELLIGENCE, Issue 2 2004
    Michael McGeachie
    Ceteris paribus (all-else equal) preference statements concisely represent preferences over outcomes or goals in a way natural to human thinking. Although deduction in a logic of such statements can compare the desirability of specific conditions or goals, many decision-making methods require numerical measures of degrees of desirability. To permit ceteris paribus specifications of preferences while providing quantitative comparisons, we present an algorithm that compiles a set of qualitative ceteris paribus preferences into an ordinal utility function. Our algorithm is complete for a finite universe of binary features. Constructing the utility function can, in the worst case, take time exponential in the number of features, but common independence conditions reduce the computational burden. We present heuristics using utility independence and constraint-based search to obtain efficient utility functions. [source]
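    The core comparison step can be pictured as follows; the rule representation is a hypothetical simplification for binary features, not the article's compilation algorithm, which goes on to assign ordinal utilities over the order these comparisons induce.

```python
# Hypothetical ceteris paribus comparison over binary-feature outcomes.
# A rule ({"garage": 1}, {"garage": 0}) reads: "all else equal, prefer a garage".
def prefers(rule, a, b):
    better, worse = rule
    mentioned = set(better) | set(worse)
    # "ceteris paribus": every feature the rule does not mention must agree
    if any(a[f] != b[f] for f in set(a) - mentioned):
        return False
    return (all(a[f] == v for f, v in better.items()) and
            all(b[f] == v for f, v in worse.items()))

rule = ({"garage": 1}, {"garage": 0})
a = {"garage": 1, "pool": 0}
b = {"garage": 0, "pool": 0}
print(prefers(rule, a, b))  # True: the outcomes differ only on the stated feature
```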


    Web Discovery and Filtering Based on Textual Relevance Feedback Learning

    COMPUTATIONAL INTELLIGENCE, Issue 2 2003
    Wai Lam
    We develop a new approach for Web information discovery and filtering. Our system, called WID, allows the user to specify long-term information needs by means of various topic profile specifications. An entire example page or an index page can be accepted as input for the discovery. It makes use of a simulated annealing algorithm to automatically explore new Web pages. Simulated annealing algorithms possess some favorable properties for fulfilling the discovery objectives. Information retrieval techniques are adopted to evaluate the content-based relevance of each page being explored. The hyperlink information, in addition to the textual content, is considered in the relevance score evaluation of a Web page. WID allows users to provide three forms of relevance feedback, namely positive page feedback, negative page feedback, and positive keyword feedback. The system is domain independent and does not rely on any prior knowledge or information about the Web content. Extensive experiments have been conducted to demonstrate the effectiveness of the discovery performance achieved by WID. [source]
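    The exploration step relies on a standard simulated-annealing acceptance rule, sketched generically below; the neighbour generator and relevance function are placeholders, not WID's actual scoring, which also folds in hyperlink structure and user feedback.

```python
import math
import random

def anneal(start, neighbours, relevance, t0=1.0, cooling=0.95, steps=200):
    """Generic simulated annealing: occasionally accept a less relevant page so
    the search can escape locally relevant but globally poor regions."""
    current, best = start, start
    t = t0
    for _ in range(steps):
        candidate = random.choice(neighbours(current))
        delta = relevance(candidate) - relevance(current)
        if delta >= 0 or random.random() < math.exp(delta / t):
            current = candidate
            if relevance(current) > relevance(best):
                best = current
        t *= cooling  # cooling schedule: fewer downhill acceptances over time
    return best
```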


    To Commit or Not to Commit: Modeling Agent Conversations for Action

    COMPUTATIONAL INTELLIGENCE, Issue 2 2002
    Roberto A. Flores
    Conversations are sequences of messages exchanged among interacting agents. For conversations to be meaningful, agents ought to follow commonly known specifications limiting the types of messages that can be exchanged at any point in the conversation. These specifications are usually implemented using conversation policies (which are rules of inference) or conversation protocols (which are predefined conversation templates). In this article we present a semantic model for specifying conversations using conversation policies. This model is based on the principles that the negotiation and uptake of shared social commitments entail the adoption of obligations to action, which indicate the actions that agents have agreed to perform. In the same way, obligations are retracted based on the negotiation to discharge their corresponding shared social commitments. Based on these principles, conversations are specified as interaction specifications that model the ideal sequencing of agent participations negotiating the execution of actions in a joint activity. These specifications not only specify the adoption and discharge of shared commitments and obligations during an activity, but also indicate the commitments and obligations that are required (as preconditions) or that outlive a joint activity (as postconditions). We model the Contract Net Protocol as an example of the specification of conversations in a joint activity. [source]
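    A toy reading of the commitment bookkeeping described above is sketched below; the store, message names, and tuple layout are invented for illustration and are far simpler than the article's semantic model.

```python
# Toy commitment store: accepting a proposal adopts an obligation to act, and a
# negotiated discharge retracts it. Names and structure are illustrative only.
class CommitmentStore:
    def __init__(self):
        self.obligations = set()   # (debtor, creditor, action)

    def accept_proposal(self, debtor, creditor, action):
        # uptake of a shared social commitment => adoption of an obligation
        self.obligations.add((debtor, creditor, action))

    def discharge(self, debtor, creditor, action):
        # negotiated discharge of the commitment => retraction of the obligation
        self.obligations.discard((debtor, creditor, action))

store = CommitmentStore()
store.accept_proposal("contractor", "manager", "deliver-task")
store.discharge("contractor", "manager", "deliver-task")
print(store.obligations)  # set()
```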


    On-line motion blending for real-time locomotion generation

    COMPUTER ANIMATION AND VIRTUAL WORLDS (PREV: JNL OF VISUALISATION & COMPUTER ANIMATION), Issue 3-4 2004
    Sang Il Park
    Abstract In this paper, we present an integrated framework of on-line motion blending for locomotion generation. We first provide a novel scheme for incremental timewarping, which always guarantees that the time goes forward. Combining the idea of motion blending with that of posture rearrangement, we introduce a motion transition graph to address on-line motion blending and transition simultaneously. Guided by a stream of motion specifications, our motion synthesis scheme moves from node to node in an on-line manner while blending a motion at a node and generating a transition motion at an edge. For smooth on-line motion transition, we also attach a set of example transition motions to an edge. To represent similar postures consistently, we exploit the inter-frame coherency embedded in the input motion specification. Finally, we provide a comprehensive solution to on-line motion retargeting by integrating existing techniques. Copyright © 2004 John Wiley & Sons, Ltd. [source]
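    The guarantee that warped time "always goes forward" can be pictured as a monotone clamp; the sketch below is one illustrative reading of incremental timewarping, not the paper's actual formulation.

```python
# Illustrative monotone timewarp: whatever the raw warp suggests, the emitted
# time never moves backwards, so on-line blending can always proceed.
def incremental_timewarp(raw_times, min_step=1e-3):
    warped, prev = [], float("-inf")
    for t in raw_times:
        t = max(t, prev + min_step)  # clamp: guarantee forward progress
        warped.append(t)
        prev = t
    return warped

print(incremental_timewarp([0.0, 0.4, 0.35, 0.9]))  # [0.0, 0.4, 0.401, 0.9]
```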


    Image modification for immersive projection display based on pseudo-projection models

    COMPUTER ANIMATION AND VIRTUAL WORLDS (PREV: JNL OF VISUALISATION & COMPUTER ANIMATION), Issue 4 2003
    Toshio Moriya
    Abstract This paper describes a practical method that enables actual images to be converted so that they can be projected onto an immersive projection display (IPD) screen. IPD screens are distinctive in that their angle of view is extremely wide; therefore, the images projected onto them need to be shot in a special format. In practice, however, it is generally very difficult to shoot images that completely satisfy the specifications of the targeted IPD environment due to cost, technical problems or other reasons. To overcome these problems, we developed a method to modify the images by abandoning geometrical consistency. We were able to utilize this method by assuming that the given image was shot according to a special projection model. Because this model differed from the actual projection model with which the image was taken, we termed it the pseudo-projection model. Since our method uses simple geometry, and can easily be expressed by a parametric function, the degree of modification or the time sequence for modification can readily be adjusted according to the features of each type of content. Copyright © 2003 John Wiley & Sons, Ltd. [source]


    Training scenario prototyping for VR-based simulation of neonatal decision-making

    COMPUTER APPLICATIONS IN ENGINEERING EDUCATION, Issue 4 2007
    A. Holobar
    Abstract This paper presents the design and implementation of a real-time system for virtual reality (VR)-based training in neonatal medicine, with the main emphasis on simple creation of various training scenarios. This system combines an articulated 3D model of a virtual newborn with text-based descriptions of its physiological and behavioral responses, enabling medical experts to easily construct, simulate and revise an arbitrary postnatal critical situation. Afterwards, the resulting descriptions of the newborn's behavior can be used for technical specifications (and even for automatic generation) of more complex behavioral models, such as finite-state automata. © 2007 Wiley Periodicals, Inc. Comput Appl Eng Educ 15: 317–328, 2007; Published online in Wiley InterScience (www.interscience.wiley.com); DOI 10.1002/cae.20121 [source]


    Incorporating Maintenance Effectiveness in the Estimation of Dynamic Infrastructure Performance Models

    COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, Issue 3 2008
    Chih-Yuan Chu
    Specifically, we consider state-space specifications of autoregressive moving average with exogenous inputs (ARMAX) models to develop deterioration and inspection models for infrastructure facilities, and intervention analysis to estimate transitory and permanent effects of maintenance, for example, performance jumps or deterioration rate changes. To illustrate the methodology, we analyze the effectiveness of an overlay on a flexible pavement section from the AASHO Road Test. The results show the effect of the overlay on improvements both in surface distress, that is, rutting and slope variance, and in the pavement's underlying serviceability. The results also provide evidence that the overlay changes the pavement's response to traffic, that is, the overlay causes a reduction in the rate at which traffic damages the pavement. [source]
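    The transitory and permanent maintenance effects mentioned above can be encoded as intervention regressors. The minimal sketch below uses a step term for a performance jump and a ramp term for a change in deterioration rate; it fits by ordinary least squares for brevity rather than the paper's state-space ARMAX estimation, and all numbers are invented.

```python
# Minimal sketch: encode maintenance at time k as (i) a step regressor for a
# performance jump and (ii) a ramp regressor for a deterioration-rate change.
import numpy as np

n, k = 100, 40                       # observations and maintenance time (hypothetical)
t = np.arange(n)
step = (t >= k).astype(float)        # permanent level shift after the overlay
ramp = np.maximum(0, t - k)          # change in slope after the overlay

# synthetic distress series: baseline trend, jump down at k, slower growth afterwards
y = 2.0 + 0.05 * t - 0.8 * step - 0.02 * ramp + np.random.normal(0, 0.05, n)

# least-squares fit recovers the jump (step) and rate-change (ramp) coefficients
X = np.column_stack([np.ones(n), t, step, ramp])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(coef)  # roughly [2.0, 0.05, -0.8, -0.02]
```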


    A Decision Support System Specification for Cost Escalation in Heavy Engineering Industry

    COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, Issue 5 2002
    Nashwan N. Dawood
    The heavy civil engineering industry (railways, sewage-treatment, chemical and pharmaceutical facilities, oil and gas facilities, etc.) is one of the major contributors to the British economy and generally involves a high level of investment. Clients in this industry are demanding accurate cost estimates, proper analysis of out-turn cost and cost escalation, and a high quality risk analysis throughout the construction processes. Current practices in the industry have suggested that there is a lack of structured methodologies and systematic cost escalation approaches to achieve an appropriate cost analysis at the outset of projects and throughout the construction processes. In this context the prime objective of this research work is to develop a structured cost escalation methodology for improving estimating management and control in the heavy engineering industry construction processes. The methodology is composed of a forecasting model to predict cost indices of major items in industry and a risk knowledge-base model for identifying and quantifying causes of cost escalations. This paper reviews and discusses a knowledge-based model for applying a cost escalation factor. The cost escalation factor is made up of market variation, a risk element, and a component for bias. A knowledge elicitation strategy was employed to obtain the required knowledge for the model. The strategy included questionnaires, interviews, and workshops, and deliverables came in the form of influences and their effect on project cost escalation. From these deliverables, a decision support system and specifications for applying cost escalation to base estimates are presented. [source]
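    The escalation factor described above combines market variation, a risk element, and a bias component; how the three combine is not spelled out in the abstract, so the additive sketch below, with invented values, is only one plausible reading.

```python
# Illustrative cost escalation: base estimate scaled by a factor built from
# market variation, a risk element, and a bias component (additive by assumption).
def escalated_cost(base_estimate, market_variation, risk_element, bias):
    escalation_factor = 1.0 + market_variation + risk_element + bias
    return base_estimate * escalation_factor

print(escalated_cost(base_estimate=10_000_000,   # GBP, hypothetical
                     market_variation=0.03,      # forecast movement in cost indices
                     risk_element=0.02,          # knowledge-base risk allowance
                     bias=0.01))                 # estimator bias correction
# approximately 10,600,000
```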


    Reliability in grid computing systems

    CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 8 2009
    Christopher Dabrowski
    Abstract In recent years, grid technology has emerged as an important tool for solving compute-intensive problems within the scientific community and in industry. To further the development and adoption of this technology, researchers and practitioners from different disciplines have collaborated to produce standard specifications for implementing large-scale, interoperable grid systems. The focus of this activity has been the Open Grid Forum, but other standards development organizations have also produced specifications that are used in grid systems. To date, these specifications have provided the basis for a growing number of operational grid systems used in scientific and industrial applications. However, if the growth of grid technology is to continue, it will be important that grid systems also provide high reliability. In particular, it will be critical to ensure that grid systems are reliable as they continue to grow in scale, exhibit greater dynamism, and become more heterogeneous in composition. Ensuring grid system reliability in turn requires that the specifications used to build these systems fully support reliable grid services. This study surveys work on grid reliability that has been done in recent years and reviews progress made toward achieving these goals. The survey identifies important issues and problems that researchers are working to overcome in order to develop reliability methods for large-scale, heterogeneous, dynamic environments. The survey also illuminates reliability issues relating to standard specifications used in grid systems, identifying existing specifications that may need to be evolved and areas where new specifications are needed to better support reliability. Published in 2009 by John Wiley & Sons, Ltd. [source]


    A context- and role-driven scientific workflow development pattern

    CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 15 2008
    Wanchun Dou
    Abstract Scientific workflow execution often demands data-centric and computation-intensive collaboration efforts, which is typically different from process-centric workflow execution with fixed execution specifications. Scientific workflow execution often challenges the traditional workflow development strategy in dynamic context management and role definition. In view of this observation, application context spectrums are first distinguished from different profiles of scientific workflow development. Then, a role enactment strategy is proposed for enabling workflow execution in a certain application context. They jointly enhance the validity of scientific workflow development by clearly articulating the correlation between the computational subjects and computational objects engaged in a scientific workflow system. Furthermore, a novel context- and role-driven scientific workflow development pattern is proposed for enacting a scientific workflow system on the Grid. Finally, a case study is presented to demonstrate the generic nature of the methods in this paper. Copyright © 2008 John Wiley & Sons, Ltd. [source]


    Specification and detection of performance problems with ASL

    CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 11 2007
    Michael Gerndt
    Abstract Performance analysis is an important step in tuning performance-critical applications. It is a cyclic process of measuring and analyzing performance data, driven by the programmer's hypotheses on potential performance problems. Currently this process is controlled manually by the programmer. The goal of the work described in this article is to automate the performance analysis process based on a formal specification of performance properties. One result of the APART project is the APART Specification Language (ASL) for the formal specification of performance properties. Performance bottlenecks can then be identified based on the specification, since bottlenecks are viewed as performance properties with a large negative impact. We also present the overall design and an initial evaluation of the Periscope system which utilizes ASL specifications to automatically search for performance bottlenecks in a distributed manner. Copyright © 2006 John Wiley & Sons, Ltd. [source]
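    In the spirit of "properties with a large negative impact", a bottleneck search can be pictured as evaluating candidate properties against measurements and ranking the ones that hold by severity. The sketch below is generic Python, not ASL syntax, and the property definitions and metric names are invented.

```python
# Generic sketch (not ASL): a performance property holds when its condition is
# true of the measurements, and candidates are reported in order of severity.
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class PerformanceProperty:
    name: str
    condition: Callable[[Dict[str, float]], bool]
    severity: Callable[[Dict[str, float]], float]

props = [
    PerformanceProperty("load_imbalance",
                        lambda m: m["max_cpu_time"] / m["avg_cpu_time"] > 1.2,
                        lambda m: m["max_cpu_time"] / m["avg_cpu_time"] - 1.0),
    PerformanceProperty("high_communication_share",
                        lambda m: m["comm_time"] / m["total_time"] > 0.3,
                        lambda m: m["comm_time"] / m["total_time"]),
]

measurements = {"max_cpu_time": 12.0, "avg_cpu_time": 8.0,
                "comm_time": 30.0, "total_time": 75.0}

bottlenecks = sorted((p for p in props if p.condition(measurements)),
                     key=lambda p: p.severity(measurements), reverse=True)
print([p.name for p in bottlenecks])  # largest negative impact first
```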


    Simulation of resource synchronization in a dynamic real-time distributed computing environment

    CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 14 2004
    Chen Zhang
    Abstract Today, more and more distributed computer applications are being modeled and constructed using real-time principles and concepts. In 1989, the Object Management Group (OMG) formed a Real-Time Special Interest Group (RT SIG) with the goal of extending the Common Object Request Broker Architecture (CORBA) standard to include real-time specifications. This group's most recent efforts have focused on the requirements of dynamic distributed real-time systems. One open problem in this area is resource access synchronization for tasks employing dynamic priority scheduling. This paper presents two resource synchronization protocols that the authors have developed which meet the requirements of dynamic distributed real-time systems as specified by Dynamic Scheduling Real-Time CORBA (DSRT CORBA). The proposed protocols can be applied to both Earliest Deadline First (EDF) and Least Laxity First (LLF) dynamic scheduling algorithms, allow distributed nested critical sections, and avoid unnecessary runtime overhead. In order to evaluate the performance of the proposed protocols, we analyzed each protocol's schedulability. Since the schedulability of the system is affected by numerous system configuration parameters, we have designed simulation experiments to isolate and illustrate the impact of each individual system parameter. Simulation experiments show the proposed protocols have better performance than one would realize by applying a schema that utilizes dynamic priority ceiling update. Copyright © 2004 John Wiley & Sons, Ltd. [source]
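    For orientation, the two dynamic priority policies named above differ only in the quantity they minimize; a small sketch with hypothetical task fields:

```python
# EDF runs the task with the earliest absolute deadline; LLF runs the task with
# the least laxity, i.e. deadline minus current time minus remaining work.
def edf_pick(tasks):
    return min(tasks, key=lambda t: t["deadline"])

def llf_pick(tasks, now):
    return min(tasks, key=lambda t: t["deadline"] - now - t["remaining"])

tasks = [
    {"name": "A", "deadline": 20.0, "remaining": 14.0},
    {"name": "B", "deadline": 15.0, "remaining": 2.0},
]
print(edf_pick(tasks)["name"])           # B: earlier deadline
print(llf_pick(tasks, now=5.0)["name"])  # A: laxity 1.0 versus B's laxity 8.0
```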


    An approach for quality of service adaptation in service-oriented Grids

    CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 5 2004
    Rashid Al-Ali
    Abstract Some applications utilizing Grid computing infrastructure require the simultaneous allocation of resources, such as compute servers, networks, memory, disk storage and other specialized resources. Collaborative working and visualization is one example of such applications. In this context, quality of service (QoS) is related to Grid services, and not just to the network connecting these services. With the emerging interest in service-oriented Grids, resources may be advertised and traded as services based on a service level agreement (SLA). Such a SLA must include both general and technical specifications, including pricing policy and properties of the resources required to execute the service, to ensure QoS requirements are satisfied. An approach for QoS adaptation is presented to enable the dynamic adjustment of behavior of an application based on changes in the pre-defined SLA. The approach is particularly useful if workload or network traffic changes in unpredictable ways during an active session. Copyright © 2004 John Wiley & Sons, Ltd. [source]


    A set-oriented method definition language for object databases and its semantics

    CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 14 2003
    Elisa Bertino
    Abstract In this paper we propose a set-oriented rule-based method definition language for object-oriented databases. Most existing object-oriented database systems exploit a general-purpose imperative object-oriented programming language as the method definition language. Because methods are written in a general-purpose imperative language, it is difficult to analyze their properties and to optimize them. Optimization is important when dealing with a large amount of objects as in databases. We therefore believe that the use of an ad hoc, set-oriented language can offer some advantages, at least at the specification level. In particular, such a language can offer an appropriate framework to reason about method properties. In this paper, besides defining a set-oriented rule-based language for method definition, we formally define its semantics, addressing the problems of inconsistency and non-determinism in set-oriented updates. Moreover, we characterize some relevant properties of methods, such as conflicts among method specifications in sibling classes and behavioral refinement in subclasses. Copyright © 2003 John Wiley & Sons, Ltd. [source]


    Simulating multiple inheritance in Java

    CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 12 2002
    Douglas Lyon
    Abstract The CentiJ system automatically generates code that simulates multiple inheritance in Java. The generated code inputs a series of instances and outputs specifications that can be combined using multiple inheritance. The multiple inheritance of implementation is obtained by simple message forwarding. The reflection API of Java is used to reverse engineer the instances, and so the program can generate source code, but does not require source code on its input. Advantages of CentiJ include compile-time type checking, speed of execution, automatic disambiguation (name space collision resolution) and ease of maintenance. Simulation of multiple inheritance was previously available only to Java programmers who performed manual delegation or who made use of dynamic proxies. The technique has been applied at a major aerospace corporation. Copyright © 2002 John Wiley & Sons, Ltd. [source]


    Extrapolation of the W7-X Magnet System to Reactor Size

    CONTRIBUTIONS TO PLASMA PHYSICS, Issue 8 2010
    F. Schauer
    Abstract The fusion experiment Wendelstein 7-X (W7-X), presently under construction at the Greifswald branch institute of IPP, shall demonstrate the reactor potential of a HELIAS stellarator. HELIAS reactors with three, four and five periods have been studied at IPP for many years. With a plasma axis induction of 5 T, corresponding to about 10 T maximal induction at the coil, it was shown that such reactors are feasible. Now the possibility is being investigated to increase the conductor induction up to the 12 T range, corresponding to > 5.5 T at the plasma axis. This improves the stellarator confinement properties but does not change the basic physics with respect to the previously analyzed machines. In particular, the five-period HELIAS type HSR5 is considered, which evolves from W7-X by linear scaling of the main dimensions by a factor of four. Recent progress in superconductor technology and the extensive development work performed for ITER are taken into account. The latter is particularly relevant since, by coincidence, the circumferences of the HSR5 and the ITER toroidal field coils are practically the same. For the presented 12 T reactor version, the HSR50a, the conductor and structural requirements are also comparable to the corresponding ITER specifications. Therefore, advantage can be taken of these similarities for the stellarator reactor magnet design. The input was provided by the new code "MODUCO", which was developed for interactive coil layout. It is based on Bézier curve approximations and includes the computation of magnetic surfaces and forces. (© 2010 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source]


    Sustainability quotients and the social footprint

    CORPORATE SOCIAL RESPONSIBILITY AND ENVIRONMENTAL MANAGEMENT, Issue 4 2008
    Mark W. McElroy
    Abstract We argue that most of what passes for mainstream reporting in corporate sustainability management fails to do precisely the one thing it purports to do, which is to make it possible for organizations to measure and report on the sustainability of their operations. It fails because of the lack of what the Global Reporting Initiative calls sustainability context, a shortcoming from which it, too, suffers. We suggest that this missing context calls for a new notion of sustainability (the binary perspective), which can be conceptualized in the form of sustainability quotients. We provide specifications for such quotients in ecological and social contexts, and suggest that sustainability is best understood in terms of the impact organizations can have on the carrying capacity of non-financial capital, or what in the social case we call anthro capital. We conclude by introducing a quantitative quotients-based method for measuring and reporting on the social sustainability of an organization, the social footprint method. Copyright © 2007 John Wiley & Sons, Ltd and ERP Environment. [source]
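    A quotient of the kind proposed can be pictured as a ratio of an organization's actual burden to the carrying capacity available to it; the orientation of numerator and denominator, the threshold reading, and all values below are illustrative assumptions rather than the article's definitions.

```python
# Illustrative sustainability quotient: burden placed on a capital relative to
# the carrying capacity available to the organization. Reading a score of at
# most 1.0 as "sustainable" is an assumption for this sketch.
def sustainability_quotient(actual_burden: float, carrying_capacity: float) -> float:
    return actual_burden / carrying_capacity

q = sustainability_quotient(actual_burden=120.0, carrying_capacity=100.0)
print(q, "sustainable" if q <= 1.0 else "unsustainable")  # 1.2 unsustainable
```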


    THE SHORT-TERM EFFECTS OF EXECUTIONS ON HOMICIDES: DETERRENCE, DISPLACEMENT, OR BOTH?

    CRIMINOLOGY, Issue 4 2009
    KENNETH C. LAND
    Does the death penalty save lives? In recent years, a new round of research using annual time-series panel data from the 50 U.S. states for 25 or so years from the 1970s to the late 1990s has claimed to find many lives saved through reductions in subsequent homicide rates after executions. This research, in turn, has produced a round of critiques, which conclude that these findings are not robust: even small changes in model specifications yield dramatically different results. A principal reason for this sensitivity of the findings is that few state-years exist (about 1 percent of all state-years) in which six or more executions have occurred. To provide a different perspective, we focus on Texas, a state that has used the death penalty with sufficient frequency to make possible relatively stable estimates of the homicide response to executions. In addition, we narrow the observation intervals for recording executions and homicides from the annual calendar year to monthly intervals. Based on time-series analyses and independent-validation tests, our best-fitting model shows that, from January 1994 through December 2005, evidence exists of modest, short-term reductions in homicides in Texas in the first and fourth months that follow an execution, about 2.5 fewer homicides in total. Another model suggests, however, that in addition to homicide reductions, some displacement of homicides may be possible from one month to another in the months after an execution, which reduces the total reduction in homicides after an execution to about .5 during a 12-month period. Implications for additional research and the need for future analysis and replication are discussed. [source]


    RIGHT-TO-CARRY CONCEALED HANDGUNS AND VIOLENT CRIME: CRIME CONTROL THROUGH GUN DECONTROL?

    CRIMINOLOGY AND PUBLIC POLICY, Issue 3 2003
    TOMISLAV V. KOVANDZIC
    Research Summary: "Right-to-Carry" (RTC) concealed-handgun laws mandate that authorities issue concealed handgun permits to qualified applicants. The supposition by those supporting the laws is that allowing private citizens to carry concealed handguns in public can reduce violent crime by deterring prospective criminals afraid of encountering armed civilians. Critics of the laws argue that violent altercations are more likely to turn deadly when more people carry guns. Whether the laws cause violent crime to increase or to decrease has become an important public policy question, as most states have now adopted such legislation. The present study evaluates Florida's 1987 RTC law, which prior research suggests plays a key role in the RTC debate. Specifically, we use panel data for 58 Florida counties from 1980 to 2000 to examine the effects on violent crime from increases in the number of people with concealed-carry permits, rather than before-after dummy and time-trend variables used in prior research. We also address many of the methodological problems encountered in earlier RTC studies. We present numerous model specifications, and we find little evidence that increases in the number of citizens with concealed-handgun permits reduce or increase rates of violent crime. Policy Implications: The main policy implication of this research is that there appears to be little gained in the way of crime prevention by converting restrictive gun carrying laws to "shall-issue" laws, although the laws might still prove beneficial by (1) eliminating arbitrary decisions on gun permit applications, (2) encouraging gun safety, (3) making permit holders feel safer when out in public, (4) providing permit holders with a more effective means of self-defense, and (5) reducing the costs to police departments of enforcing laws prohibiting unlicensed gun carrying. [source]


    Albumin enhanced morphometric image analysis in CLL

    CYTOMETRY, Issue 1 2004
    Matthew A. Lunning
    Abstract BACKGROUND The heterogeneity of lymphocytes from patients with chronic lymphocytic leukemia (CLL) and blood film artifacts make morphologic subclassification of this disease difficult. METHODS We reviewed paired blood films prepared from ethylene-diamine-tetraacetic acid (EDTA) samples with and without bovine serum albumin (BSA) from 82 CLL patients. Group 1 adhered to NCCLS specifications for the preparation of EDTA blood films. Group 2 consisted of blood films containing EDTA and a 1:12 dilution of 22% BSA. Eight patients were selected for digital photomicroscopy and statistical analysis. Approximately 100 lymphocytes from each slide were digitally captured. RESULTS The mean cell area ± standard error was 127.8 μm² ± 1.42 (n = 793) for group 1 versus 100.7 μm² ± 1.39 (n = 831) for group 2. The nuclear area was 88.9 μm² ± 0.85 for group 1 versus 76.4 μm² ± 0.83 for group 2. For the nuclear transmittance, the values were 97.6 ± 0.85 for group 1 and 104.1 ± 0.83 for group 2. The nuclear:cytoplasmic ratios were 0.71 ± 0.003 for group 1 and 0.78 ± 0.003 for group 2. All differences were statistically significant (P < 0.001). CONCLUSIONS BSA addition results in the reduction of atypical lymphocytes and a decrease in smudge cells. BSA also decreases the lymphocyte area and nuclear area, whereas nuclear transmittance and the nuclear:cytoplasmic ratio are increased. A standardized method of slide preparation would allow accurate interlaboratory comparison. The use of BSA may permit better implementation of the blood film-based subclassification of CLL and lead to a better correlation of morphology with cytogenetics and immunophenotyping. Published 2003 Wiley-Liss, Inc. [source]
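    The reported quantities reduce to per-cell areas and their ratios, averaged over each group; a minimal sketch with invented measurements (not the study's data):

```python
# Minimal sketch of the reported morphometric summaries for one group of cells.
# Areas are in square micrometres; the values are invented.
cells = [
    {"cell_area": 128.0, "nuclear_area": 90.0},
    {"cell_area": 101.0, "nuclear_area": 76.0},
    {"cell_area": 115.0, "nuclear_area": 83.0},
]

mean_cell_area = sum(c["cell_area"] for c in cells) / len(cells)
mean_nuclear_area = sum(c["nuclear_area"] for c in cells) / len(cells)
mean_nc_ratio = sum(c["nuclear_area"] / c["cell_area"] for c in cells) / len(cells)

print(round(mean_cell_area, 1), round(mean_nuclear_area, 1), round(mean_nc_ratio, 2))
```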


    Is there a SSRI dose response in treating major depression?

    DEPRESSION AND ANXIETY, Issue 1 2003
    The case for re-analysis of current data, for enhancing future study design
    Abstract It has been widely stated that the available research data has not demonstrated a SSRI dose response for major depression. We re-evaluated the methods used to analyze the SSRI data by clarifying two key alternative definitions of dose response and their implications for enhancing analysis of currently available data as well as future study design. We differentiated "potential" dose response, which focuses exclusively on response excluding tolerability effects and asks whether differences in dose can result in significant differences in response, from "expressed" dose response, which incorporates all tolerability effects currently associated with dose (including those caused by study protocol or treatment practice) and asks whether differences in dose do result in significant differences in response. To analyze potential dose response for all studies, one should use a "dose-tolerant" sample, i.e., an ITT sample from which dropouts due to adverse events have been removed. To analyze an expressed dose response, an ITT sample is the optimum sample if the study conforms to several design specifications. In the absence of conformance to these specifications, an ITT sample may be an approximation of the appropriate sample. Given design limitations of currently available studies, a dose-tolerant sample may provide a more informative approximation of an optimal sample to be used in evaluating the expressed dose response that could be expected in the best clinical practice. Future studies of dose-response relations could be enhanced by taking into account the principles noted above, and currently available data should be reanalyzed based on these principles. This re-analysis is performed in a companion article [Baker et al. 2003, Depress Anxiety 17:1-9]. Depression and Anxiety 17:10–18, 2003. © 2003 Wiley-Liss, Inc. [source]
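    The distinction between the intention-to-treat sample and the proposed "dose-tolerant" sample amounts to a simple filter; the record fields below are hypothetical.

```python
# Sketch: build the two analysis samples discussed above from hypothetical
# subject records; the "dropout_reason" field is invented for illustration.
subjects = [
    {"id": 1, "dropout_reason": None},
    {"id": 2, "dropout_reason": "adverse_event"},
    {"id": 3, "dropout_reason": "lost_to_follow_up"},
]

itt_sample = subjects  # intention-to-treat: everyone randomized

# dose-tolerant sample: ITT minus dropouts due to adverse events
dose_tolerant_sample = [s for s in subjects if s["dropout_reason"] != "adverse_event"]

print([s["id"] for s in itt_sample])            # [1, 2, 3]
print([s["id"] for s in dose_tolerant_sample])  # [1, 3]
```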


    Complementary therapy for psoriasis

    DERMATOLOGIC THERAPY, Issue 2 2003
    Giovanni Luigi Capella
    ABSTRACT: The authors provide some specifications regarding the correct terminology to be applied in the field of complementary medicine, and review and comment on several complementary treatments for psoriasis. Putative psychotherapeutic equivalents are kept distinct from treatments based on the surreptitious administration of physical or pharmacologic agents. Limits on the application of psychotherapeutic techniques are discussed. Risks inherent to complementary treatments (psychological derangements, moral subjugation, physical damage, economic exploitation) are underscored. The authors plead for the application of adequate scientific criticism in complementary medicine, but warn that any approach to the practice of medicine which is not disinterested and patient oriented, as the academic one should be, will be inappropriate, misleading, or even immoral. In the authors' opinion, this could also apply to the evidence-based medicine movement (often perceived as the archenemy of alternative medicine), should this movement be influenced by economic, political, or other nonmedical factors. [source]


    Food Industrialisation and Food Power: Implications for Food Governance

    DEVELOPMENT POLICY REVIEW, Issue 5-6 2003
    Tim Lang
    Food supply chains of developed countries industrialised in the second half of the twentieth century, with significant implications for developing countries over policy priorities, the ensuing external costs and the accompanying concentration of market power. Very powerful corporations dominate many sectors. Primary producers are locked into tight specifications and contracts. Consumers may benefit from cheaper food but there are quality implications and health externalities. As consumer confidence has been shaken, new quality agencies have been created. Tensions have emerged about the state's role as facilitator of industrial efficiencies. Food policy is thus torn between the pursuit of productivity and reduced prices and the demand for higher quality, with implications for both producers and consumers in the developing world. [source]


    Screening for type 2 diabetes: an update of the evidence

    DIABETES OBESITY & METABOLISM, Issue 10 2010
    R. K. Simmons
    A growing body of evidence on diabetes screening has been published during the last 10 years. Type 2 diabetes meets many but not all of the criteria for screening. Concerns about potential harms of screening have largely been resolved. Screening identifies a high-risk population with the potential to gain from widely available interventions. However, in spite of the findings of modelling studies, the size of the benefit of earlier initiation of treatment and the overall cost-effectiveness remain uncertain, in contrast to other screening programmes (such as for abdominal aortic aneurysms) that are yet to be fully implemented. There is also uncertainty about the optimal specifications and implementation of a screening programme, and further work remains on the development and delivery of individual- and population-level preventive strategies. While there is growing evidence of the net benefit of earlier detection of individuals with prevalent but undiagnosed diabetes, there remains limited justification for a policy of universal population-based screening for type 2 diabetes at the present time. Data from ongoing studies should inform the key assumptions in existing modelling studies and further reduce uncertainty. [source]


    On phenomenology and classification of hoarding: a review

    ACTA PSYCHIATRICA SCANDINAVICA, Issue 5 2004
    T. Maier
    Objective: Hoarding is a behavioural abnormity characterized by the excessive collection of poorly useable objects. It is described mainly in association with obsessive-compulsive disorders (OCDs) and in geriatric populations. Yet the literature on the phenomenon is heterogeneous and the notion obviously lacks a consistent definition. This review attempts to describe the psychopathological and clinical spectrum of hoarding and may contribute to clarifying its classification. Method: Systematic review and discussion of the literature on hoarding. Results: Hoarding is a complex behavioural phenomenon associated with different mental disorders. The psychopathological structure is variously composed of elements of OCDs, impulse-control disorders, and ritualistic behaviour. Severe self-neglect is a possible consequence of hoarding. Conclusion: Without further specifications the term hoarding is of limited heuristic value and cannot guide therapeutic interventions satisfactorily. The condition needs to be evaluated carefully in every particular case in relation to the aforementioned psychopathological concepts. [source]


    Scaling of spectral displacement ordinates with damping ratios

    EARTHQUAKE ENGINEERING AND STRUCTURAL DYNAMICS, Issue 2 2005
    Julian J. Bommer
    Abstract The next generation of seismic design codes, especially those adopting the framework of performance-based design, will include the option of design based on displacements rather than forces. For direct displacement-based design using the substitute structure approach, the spectral ordinates of displacement need to be specified for a wide range of response periods and for several levels of damping. The code displacement spectra for damping values higher than the nominal value of 5% of critical will generally be obtained, as is the case in Eurocode 8 and other design codes, by applying scaling factors to the 5% damped ordinates. These scaling factors are defined as functions of the damping ratio and, in some cases, the response period, but are independent of the nature of the expected ground shaking. Using both predictive equations for spectral ordinates at several damping levels and stochastic simulations, it is shown that the scaling factors for different damping levels vary with magnitude and distance, reflecting a dependence of the scaling on the duration of shaking that increases with the damping ratio. The options for incorporating the influence of this factor into design code specifications of displacement response spectra are discussed. Copyright © 2004 John Wiley & Sons, Ltd. [source]
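    For context, the scaling rule in the current Eurocode 8 (EN 1998-1:2004) takes the form eta = sqrt(10 / (5 + xi)), bounded below by 0.55, where xi is the viscous damping ratio in percent; note that it depends only on the damping ratio, which is exactly the simplification examined above. A small sketch:

```python
# Eurocode 8 (EN 1998-1:2004) damping correction factor applied to 5%-damped
# spectral ordinates: eta = sqrt(10 / (5 + xi)) with a floor of 0.55, where xi
# is the viscous damping ratio in percent.
import math

def ec8_damping_factor(xi_percent: float) -> float:
    return max(math.sqrt(10.0 / (5.0 + xi_percent)), 0.55)

for xi in (5, 10, 20, 30):
    print(xi, round(ec8_damping_factor(xi), 3))
# 5 -> 1.0, 10 -> 0.816, 20 -> 0.632, 30 -> 0.55 (floor applied)
```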