Useful Insights (useful + insight)



Selected Abstracts


Performance evaluation of GPON vs EPON for multi-service access

INTERNATIONAL JOURNAL OF COMMUNICATION SYSTEMS, Issue 2 2009
T. Orphanoudakis
Abstract Recently, both the ITU and the IEEE have standardized solutions for passive optical networks (PONs) operating at gigabit-per-second line rates and optimized for the transport of packet-based traffic, to improve the efficiency of previously standardized broadband PONs, which used the ATM cell as the data transport unit. The efficiency and performance of PON systems depend on the transmission convergence layer and mainly on the implemented medium access protocol. Although the latter is not part of the standards and is left to the implementer, the standards describe a set of control fields that constitute the tool-set for the media access control (MAC) operation. Though starting from a common and quite obvious basis, the two standards present significant differences, with the legacy of Ethernet marking the IEEE approach, while the emphasis of the ITU is on demanding services. In this paper we compare the efficiency and performance of the two systems, assuming the implementation of MAC protocols that are as similar as possible. The target is twofold: to assess and compare the traffic-handling potential of each of the two standards and to identify the range of applications they can support. Useful insight can also be gained into the MAC tools that could be designed into the next generation of extra-large WDM PONs. Copyright © 2008 John Wiley & Sons, Ltd. [source]


Sprinklered office fire tests

FIRE AND MATERIALS, Issue 3 2008
I. D. Bennetts
Abstract This paper presents data relating to the performance of sprinklers and detectors in real office fire situations. For sprinklers, these data are additional to those associated with the standardized testing used to determine the design delivery density and pressure requirements for various occupancy situations, and provide a useful insight into the effect of sprinklers on developing fires within various office situations. The data given in this paper include the times for activation of various types of sprinkler heads (normal and fast response), the efficacy of the systems as far as extinguishment is concerned, estimates of the maximum size of the fires prior to commencement of extinguishment, and associated air temperatures at various locations within the office enclosures. Copyright © 2008 John Wiley & Sons, Ltd. [source]


Theta rhythm of navigation: Link between path integration and landmark navigation, episodic and semantic memory

HIPPOCAMPUS, Issue 7 2005
György Buzsáki
Abstract Five key topics have been reverberating in hippocampal-entorhinal cortex (EC) research over the past five decades: episodic and semantic memory, path integration ("dead reckoning") and landmark ("map") navigation, and theta oscillation. We suggest that the systematic relations between single cell discharge and the activity of neuronal ensembles reflected in local field theta oscillations provide a useful insight into the relationship among these terms. In rats trained to run in direction-guided (1-dimensional) tasks, hippocampal cell assemblies discharge sequentially, with different assemblies active on opposite runs, i.e., place cells are unidirectional. Such tasks do not require map representation and are formally identical with learning sequentially occurring items in an episode. Hebbian plasticity, acting within the temporal window of the theta cycle, converts the travel distances into synaptic strengths between the sequentially activated and unidirectionally connected assemblies. In contrast, place representations by hippocampal neurons in 2-dimensional environments are typically omnidirectional, characteristic of a map. Generation of a map requires exploration, essentially a dead reckoning behavior. We suggest that omnidirectional navigation through the same places (junctions) during exploration gives rise to omnidirectional place cells and, consequently, maps free of temporal context. Analogously, multiple crossings of common junction(s) of episodes convert the common junction(s) into context-free or semantic memory. Theta oscillation can hence be conceived as the navigation rhythm through both physical and mnemonic space, facilitating the formation of maps and episodic/semantic memories. © 2005 Wiley-Liss, Inc. [source]


Impact of dilute acid pretreatment on the structure of bagasse for bioethanol production

INTERNATIONAL JOURNAL OF ENERGY RESEARCH, Issue 3 2010
Wei-Hsin Chen
Abstract Dilute acid pretreatment is a commonly used pretreatment method in the course of producing bioethanol from lignocellulosics, and the structure variation of the lignocellulosics is highly related to the pretreatment process. To understand the impact of dilute acid pretreatment on the structure of bagasse, four different pretreatment conditions, obtained by varying the heating time, are considered, and the bagasse and the pretreated materials are examined using a variety of analysis methods. The results indicate that thermogravimetric analysis (TGA) is able to provide a useful insight into the lignocellulosic structure. Specifically, the TGA peak of the pretreated materials moves toward the low-temperature region, revealing that the lignocellulosic structure is loosened. However, the characteristic crystal structure of cellulose remains in the pretreated materials. Increasing the heating time enhances the pretreatment; as a result, the average particle size of the investigated materials increases with heating time. This swelling behavior may be attributed to enlarged holes inside the particles, given that the surface area decreases with increasing heating time. In addition, when the heating time is increased to a certain extent (e.g. 15 min), some fragments are found at the surface and tend to peel off from it. It follows that dilute acid pretreatment has a significant effect on the bagasse structure. Copyright © 2009 John Wiley & Sons, Ltd. [source]


Dimensioning and optimization of push-to-talk over cellular server

INTERNATIONAL JOURNAL OF NETWORK MANAGEMENT, Issue 1 2008
M. T. Alam
The PoC (push-to-talk over cellular) application allows point-to-point or point-to-multipoint voice communication between mobile network users. Related work on PoC focuses on performance analysis only and does not address dimensioning a PoC controller to optimize revenue for service providers. In this paper, we dimension a PoC service under the assumption that the network grade of service is provided. On-demand sessions should have access priority over pre-established sessions. A PoC controller should be able to terminate a PoC session based on an optimal timer. Moreover, the number of simultaneous session initiations by a PoC client is also a configurable parameter. We derive relations to provide access priority to special PoC sessions based on the available transmit/receive units (TRUs) and a threshold level. Load-sharing expressions are reported for a PoC controller using the Lagrange multiplier technique. A simple relation to control the PoC session timer is proposed. Finally, the derivation of the maximum number of allowable simultaneous sessions is presented using two-state Markov models. Numerical results have been computed from the corresponding derivations to provide a useful insight into the system behaviour. A PoC service can benefit from these optimal values during the busy hour. Copyright © 2007 John Wiley & Sons, Ltd. [source]
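The two-state Markov model mentioned in the abstract can be sketched in a few lines. This is an illustrative toy, not the paper's derivation: the talk/idle rates, the TRU count, and the simple expected-occupancy admission rule below are all invented assumptions.

```python
# Hypothetical sketch: stationary occupancy of a two-state (talk/idle) Markov
# model per PoC client, and a crude bound on simultaneous sessions for a pool
# of transmit/receive units (TRUs). All numbers are illustrative assumptions.

def stationary_talk_prob(lam: float, mu: float) -> float:
    """Long-run fraction of time a client holds the floor, for a two-state
    chain with idle->talk rate lam and talk->idle rate mu."""
    return lam / (lam + mu)

def max_simultaneous_sessions(n_tru: int, lam: float, mu: float) -> int:
    """Naive dimensioning bound: admit sessions while the expected number
    of concurrently talking clients still fits the available TRUs."""
    p_talk = stationary_talk_prob(lam, mu)
    return int(n_tru / p_talk)

p = stationary_talk_prob(lam=0.25, mu=0.75)      # client talks 25% of the time
print(p)                                          # 0.25
print(max_simultaneous_sessions(16, 0.25, 0.75))  # 64 sessions on 16 TRUs
```

A real dimensioning exercise would replace the expected-occupancy rule with a blocking-probability target (Erlang-style), as the grade-of-service assumption in the abstract suggests.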


One-Dimensional Rabbit Sinoatrial Node Models: Benefits and Limitations

JOURNAL OF CARDIOVASCULAR ELECTROPHYSIOLOGY, Issue 2003
Introduction: Cardiac multicellular modeling has traditionally focused on ventricular electromechanics. More recently, models of the atria have started to emerge, and there is much interest in addressing sinoatrial node structure and function. Methods and Results: We implemented a variety of one-dimensional sinoatrial models consisting of descriptions of central, transitional, and peripheral sinoatrial node cells, as well as rabbit or human atrial cells. These one-dimensional models were implemented using CMISS on an SGI® Origin® 2000 supercomputer. Intercellular coupling parameters recorded in experimental studies on sinoatrial node and atrial cell-pairs under-represent the electrotonic interactions that any cardiomyocyte would have in a multidimensional setting. Unsurprisingly, cell-to-cell coupling had to be scaled-up (by a factor of 5) in order to obtain a stable leading pacemaker site in the sinoatrial node center. Further critical parameters include the gradual increase in intercellular coupling from sinoatrial node center to periphery, and the presence of electrotonic interaction with atrial cells. Interestingly, the electrotonic effect of the atrium on sinoatrial node periphery is best described as opposing depolarization, rather than necessarily hyperpolarizing, as often assumed. Conclusion: Multicellular one-dimensional models of sinoatrial node and atrium can provide useful insight into the origin and spread of normal cardiac excitation. They require larger than "physiologic" intercellular conductivities in order to make up for a lack of "anatomical" spatial scaling. Multicellular models for more in-depth quantitative studies will require more realistic anatomico-physiologic properties. (J Cardiovasc Electrophysiol, Vol. 14, pp. S121-S132, October 2003, Suppl.) [source]


Rheumatoid arthritis patient education: RA patients' experience

JOURNAL OF CLINICAL NURSING, Issue 14 2009
Paula Mäkeläinen
Aim and objective. The purpose of this paper is to describe the content of patient education as portrayed and evaluated by rheumatoid arthritis (RA) patients. Background. Rheumatology nurses have an important role in educating and supporting RA patients. However, there is a lack of knowledge of RA patients' own perspective on patient education. Design. Survey. Method. Data for this study were collected from 173 RA patients from 11 hospitals and 23 health centers using open-ended questions. Fifty-seven percent (57%) of the patients described the content of patient education and eighty-one percent (81%) evaluated it, expressing their experience of and satisfaction with it. Data were analysed using descriptive and non-parametric statistical tests. Results. Rheumatology nurses mostly gave their RA patients information about how to use the anti-rheumatic drugs prescribed to them (26%). About half (51%) of the patients were satisfied with patient education. However, every fourth patient (24%) was not satisfied, the main reason for dissatisfaction being that nurses did not focus on the patients' emotional support. Patients over 57 years of age and those who had suffered from RA for over five years were more satisfied with their education than the others. Conclusions. It is important that rheumatology nurses, besides passing on medical treatment-related information, concentrate on RA patients' emotional well-being. Relevance to clinical practice. The results provide a useful insight into RA patient education. Nurses should avoid merely passing on information in a routine, workmanlike way. It is important that they take time to discuss their patients' feelings and worries, especially with newly diagnosed patients. RA patient education should balance patients' information needs with their need for emotional support. [source]


Real-time polymerase chain reaction quantification of Epstein–Barr virus in chronic periodontitis patients

JOURNAL OF PERIODONTAL RESEARCH, Issue 4 2005
Antonis Konstantinidis
Background: Although herpes viruses have been implicated in the pathogenesis of chronic and aggressive periodontitis, few data in the literature refer to quantification of these viruses in periodontal sites, especially in relation to serological findings. Objective: The aim of the present study was to compare Epstein–Barr virus (EBV) DNA load in subgingival specimens from chronic periodontitis patients and periodontally healthy subjects, in relation to serologic testing of IgM and IgG antibodies to EBV. Methods: A total of 22 chronic periodontitis patients and 13 controls participated in the present study. Seventy-nine subgingival specimens (one pooled, one from a deep and one from a shallow site), sampled with paper points, were analysed with real-time polymerase chain reaction for EBV. Subjects were also examined for anti-EBV IgG and IgM levels in serum, using an enzyme-linked immunosorbent assay. Results: One subject was seronegative for EBV. Three subjects (one patient and two controls) displayed anti-EBV IgM. Their data were excluded from further analysis. All three displayed EBV in their subgingival samples. Nine out of the remaining 20 chronic periodontitis patients and 10 out of 11 controls did not display EBV subgingivally. A statistically significant difference in viral load was observed between pooled and shallow-pocket samples from periodontitis patients, but not between samples from deep and shallow pockets (Kruskal–Wallis ANOVA, Dunn's multiple comparisons test). Conclusions: Data from the present study do not strongly support the pathogenetic significance of EBV in chronic periodontitis lesions. The data do, however, suggest that parallel serological assessments provide a useful insight into the association of viruses with periodontal disease. [source]
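The Kruskal–Wallis comparison cited in the results can be illustrated with a short pure-Python sketch. The H-statistic computation below is the standard textbook form (tie correction and Dunn's post hoc test omitted for brevity); the viral-load numbers are invented for the example and are not data from the study.

```python
# Illustrative Kruskal-Wallis H statistic across k independent groups,
# with average ranks assigned to ties. Example values are hypothetical.

def average_ranks(values):
    """Rank all observations (1-based), giving tied values their average rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def kruskal_wallis_h(*groups):
    """H statistic: 12/(N(N+1)) * sum(R_i^2 / n_i) - 3(N+1)."""
    pooled = [x for g in groups for x in g]
    n = len(pooled)
    ranks = average_ranks(pooled)
    h, start = 0.0, 0
    for g in groups:
        r_sum = sum(ranks[start:start + len(g)])
        h += r_sum ** 2 / len(g)
        start += len(g)
    return 12.0 / (n * (n + 1)) * h - 3 * (n + 1)

# Hypothetical EBV copy-number readings for pooled, deep and shallow samples
pooled_s = [120, 95, 300, 80]
deep_s = [40, 55, 10, 25]
shallow_s = [5, 0, 8, 2]
print(round(kruskal_wallis_h(pooled_s, deep_s, shallow_s), 2))  # 9.85
```

With 2 degrees of freedom, an H of 9.85 would be significant at the 0.01 level, which is the kind of group difference the abstract reports between pooled and shallow-pocket samples.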


A comprehensive and systematic model of user evaluation of Web search engines: I. Theory and background

JOURNAL OF THE AMERICAN SOCIETY FOR INFORMATION SCIENCE AND TECHNOLOGY, Issue 13 2003
Louise T. Su
The project proposes and tests a comprehensive and systematic model of user evaluation of Web search engines. The project contains two parts. Part I describes the background and the model, including a set of criteria and measures and a method for implementation. It includes a literature review for two periods. The early period (1995–1996) portrays the settings for developing the model, and the later period (1997–2000) places two applications of the model among contemporary evaluation work. Part II presents one of the applications, which investigated the evaluation of four major search engines by 36 undergraduates from three academic disciplines. It reports results from statistical analyses of quantitative data for the entire sample and among disciplines, and content analysis of verbal data containing users' reasons for satisfaction. The proposed model aims to provide systematic feedback to engine developers or service providers for system improvement and to generate useful insight for system design and tool choice. The model can be applied to evaluating other compatible information retrieval systems or information retrieval (IR) techniques. It intends to contribute to developing a theory of relevance that goes beyond topicality to include value and usefulness for designing user-oriented information retrieval systems. [source]


Effect of growth conditions on grown-in defect formation and luminescence efficiency in Ga(In)NP epilayers grown by molecular-beam epitaxy

PHYSICA STATUS SOLIDI (C) - CURRENT TOPICS IN SOLID STATE PHYSICS, Issue 2 2008
D. Dagnelund
Abstract A detailed study of the impact of different growth conditions (i.e. ion bombardment, nitrogen flow and In content) on defect formation in Ga(In)NP epilayers grown on GaP substrates by solid-source molecular beam epitaxy is performed. Reduced nitrogen-ion bombardment during growth is shown to significantly reduce the formation of defects acting as competing recombination centers, such as a Ga interstitial defect and other unidentified defects revealed by optically detected magnetic resonance. Further, high nitrogen flow is found to be even more effective than ion bombardment in introducing the defects. The incorporation of In at 5.1% is, on the other hand, found not to affect the introduction of defects. The results provide a useful insight into the formation mechanism of the defects and will hopefully shed light on how to control defect introduction in these alloys by optimizing growth conditions. (© 2008 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source]


Stalled replication forks: Making ends meet for recognition and stabilization

BIOESSAYS, Issue 8 2010
Hisao Masai
Abstract In bacteria, the PriA protein, a conserved DEXH-type DNA helicase, plays a central role in replication restart at stalled replication forks. Its unique DNA-binding property allows it to recognize and stabilize stalled forks and the structures derived from them. Cells must cope with fork stalls caused by various replication stresses to complete replication of the entire genome. Failure of the stalled-fork stabilization process and eventual restart could lead to various forms of genomic instability. The low viability of priA null cells indicates a frequent occurrence of fork stalls during normal growth that need to be properly processed. PriA specifically recognizes the 3′-terminus of the nascent leading strand, or of the invading strand in a displacement (D)-loop, through the three-prime terminus binding pocket (TT-pocket) present in its unique DNA-binding domain. Elucidation of the structural basis for recognition of arrested forks by PriA should provide useful insight into how stalled forks are recognized in eukaryotes. [source]


Electrochemistry at High Pressures: A Review

ELECTROANALYSIS, Issue 10 2004
Debora Giovanelli
Abstract High-pressure electrochemical studies are potentially dangerous and less immediately implemented than conventional investigations. Technical obstacles related to the properties of the working electrode material, the preparation of its surface, the availability of suitable reference electrodes, and the need for specially designed high-pressure equipment and cells may account for the relative lack of experimental data on electrochemistry at high pressures. However, despite the stringent requirements for system and equipment stability, significant developments have been made in recent years, and the combination of electrochemical methods with high hydrostatic pressure has provided useful insights into the thermodynamics, kinetics, and other physico-chemical characteristics of a wide range of redox reactions. In addition to fundamental information, high-pressure electrochemistry has also led to a better understanding of a variety of processes under non-classical conditions, with potential applications in today's industrial environment, from extraction and electrosynthesis in supercritical fluids to measurement of the pH at the bottom of the ocean. The purpose of this article is to detail the experimental pressurizing apparatus for electroanalytical measurements at high pressures and to review the relevant literature on the effect of pressure on electrode processes and on the properties of aqueous electrolyte solutions. [source]


Space varying coefficient models for small area data

ENVIRONMETRICS, Issue 5 2003
Renato M. Assunção
Abstract Many spatial regression problems using area data require more flexible forms than the usual linear predictor for modelling the dependence of responses on covariates. One direction for doing this is to allow the coefficients to vary as smooth functions of the area's geographical location. After presenting examples from the scientific literature where these spatially varying coefficients are justified, we briefly review some of the available alternatives for this kind of modelling. We concentrate on a Bayesian approach for generalized linear models proposed by the author, which uses a Markov random field to model the coefficients' spatial dependency. We show that, for normally distributed data, Gibbs sampling can be used to sample from the posterior, and we prove a result showing the equivalence between our model and other usual spatial regression models. We illustrate our approach with a number of rather complex applied problems, showing that the method is computationally feasible and provides useful insights into substantive problems. Copyright © 2003 John Wiley & Sons, Ltd. [source]
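As a toy illustration of the Gibbs sampling mentioned above (not the paper's spatially varying coefficient model), the sketch below alternates draws from the full conditionals of a simple conjugate normal model with unknown mean and precision. The priors, data, and hyperparameters are invented for the example.

```python
# Toy Gibbs sampler: alternate draws from the full conditionals of
# mu (mean) and tau (precision) for normally distributed data.
# Priors: near-flat normal on mu, weak Gamma(1, 1) on tau. All values invented.
import math
import random
import statistics

random.seed(1)
data = [2.1, 1.9, 2.4, 2.2, 1.8, 2.0]
n, ybar = len(data), statistics.mean(data)

mu, tau = 0.0, 1.0
mus = []
for step in range(2000):
    # mu | tau, y ~ Normal: posterior precision n*tau plus tiny prior precision
    post_var = 1.0 / (n * tau + 1e-6)
    mu = random.gauss(ybar * n * tau * post_var, math.sqrt(post_var))
    # tau | mu, y ~ Gamma(shape = 1 + n/2, rate = 1 + SS/2); gammavariate takes scale
    ss = sum((y - mu) ** 2 for y in data)
    tau = random.gammavariate(1 + n / 2, 1.0 / (1 + ss / 2))
    if step >= 500:  # discard burn-in
        mus.append(mu)

print(round(statistics.mean(mus), 2))  # posterior mean of mu, near the sample mean 2.07
```

In the paper's setting the full conditional for each area's coefficient vector would involve its Markov random field neighbours, but the alternating-draw structure is the same.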


Cocaine Rapid Efficacy Screening Trials (CREST): lessons learned

ADDICTION, Issue 2005
Kyle M. Kampman
ABSTRACT Aims The Cocaine Rapid Efficacy Screening Trials (CREST) were designed by the National Institute on Drug Abuse Division of Treatment Research and Development (NIDA, DT R&D) to rapidly screen a number of medications potentially useful for the treatment of cocaine dependence. Design Each CREST trial was designed to compare several medications in a single trial against an unmatched placebo. The placebo group was included in each trial to avoid the nearly universal positive response to medications seen in open-label trials. In addition, a common set of procedures and outcome measures were employed throughout to increase comparability of results obtained from different trials and from different times. Participants In all, 18 medications were screened in seven different trials, conducted in four different sites throughout the United States involving 398 cocaine-dependent patients. Findings Three medications were found to be promising enough to include in subsequent larger trials. Common statistical procedures for evaluating medications were developed to facilitate comparisons across sites and across time. A portion of the data were pooled and analyzed, which yielded some useful insights into cocaine dependence and its treatment. Finally, a review of individual trials together with the pooled analysis revealed several potential improvements for future screening trials. Conclusions Overall, the CREST trials proved to be useful for rapidly screening medications for treatment of cocaine dependence, but several modifications in design should be made before this framework is applied further. [source]


Effect of Electric Field on Coulomb-Stabilized Excitons in Host/Guest Systems for Deep-Blue Electrophosphorescence

ADVANCED FUNCTIONAL MATERIALS, Issue 15 2009
Stephan Haneder
Abstract Here, a study of the electric-field-induced quenching of the phosphorescence intensity of a deep-blue triplet emitter dispersed in different host materials is presented. The hosts are characterized by a higher triplet excitonic level with respect to the emitter, ensuring efficient energy transfer and exciton confinement, whereas they differ in the highest occupied molecular orbital (HOMO) alignment, forming type I and type II host/guest heterostructures. While the type I structure shows negligible electric-field-induced quenching, a quenching of up to 25% at a field of 2 MV/cm is reported for the type II structure. A similar quenching behaviour is also reported for thin films of the pure emitter, revealing an important luminescence loss mechanism for aggregated emitter molecules. These results are interpreted by considering Coulomb-stabilized excitons in the type II heterostructure and in the pure emitter, which become very sensitive to dissociation upon application of the field. These results clarify the role of external electric-field quenching on the phosphorescence of triplet emitters and provide useful insights for the design of deep-blue electrophosphorescent devices with a reduced efficiency roll-off. [source]


Earnings characteristics and analysts' differential interpretation of earnings announcements: An empirical analysis

ACCOUNTING & FINANCE, Issue 2 2009
Anwer S. Ahmed
JEL codes: G14; M41. Abstract This study provides empirical evidence on factors that drive differential interpretation of earnings announcements. We document that Kandel and Pearson's forecast measures of differential interpretation are decreasing in proxies for earnings quality and pre-announcement information quality. This evidence yields new and useful insights regarding which earnings announcements are less likely to generate newfound disagreement among analysts and investors. Recent research suggests that investor disagreement can increase investment risk, increase the cost of capital, and cause stock prices to deviate from fundamental value. Therefore, our results support prior intuition that increasing the quality of earnings and pre-announcement information can improve the efficiency of capital markets. [source]


Does Prospective Payment Really Contain Nursing Home Costs?

HEALTH SERVICES RESEARCH, Issue 2 2002
Li-Wu Chen
Objective. To examine whether nursing homes would behave more efficiently, without compromising their quality of care, under prospective payment. Data Sources. Four data sets for 1994: the Skilled Nursing Facility Minimum Data Set, the Online Survey Certification and Reporting System file, the Area Resource File, and the Hospital Wage Indices File. A national sample of 4,635 nursing homes is included in the analysis. Study Design. Using a modified hybrid functional form to estimate nursing home costs, we distinguish our study from previous research by controlling for quality differences (related to both care and life) and addressing the issues of output and quality endogeneity, as well as using more recent national data. Factor analysis was used to operationalize quality variables. To address the endogeneity problems, instrumental measures were created for nursing home output and quality variables. Principal Findings. Nursing homes in states using prospective payment systems do not have lower costs than their counterpart facilities under retrospective cost-based payment systems, after quality differences among facilities are controlled for and the endogeneity problem of quality variables is addressed. Conclusions. The effects of prospective payment on nursing home cost reduction may be through quality cuts, rather than cost efficiency. If nursing home payments under prospective payment systems are not adjusted for quality, nursing homes may respond by cutting their quality levels, rather than controlling costs. Future outcomes research may provide useful insights into the adjustment of quality in the design of prospective payment for nursing home care. [source]


Molecular diagnosis of inherited disorders: lessons from hemoglobinopathies,

HUMAN MUTATION, Issue 5 2005
George P. Patrinos
Abstract Hemoglobinopathies constitute a major health problem worldwide, with a high carrier frequency, particularly in certain regions where malaria has been endemic. These disorders are characterized by a vast clinical and hematological phenotypic heterogeneity. Over 1,200 different genetic alterations that affect the DNA sequence of the human α-like (HBZ, HBA2, HBA1, and HBQ1) and β-like (HBE1, HBG2, HBG1, HBD, and HBB) globin genes are mainly responsible for the observed clinical heterogeneity. These mutations, together with detailed information about the resulting phenotype, are documented in the globin locus-specific HbVar database. Family studies and comprehensive hematological analyses provide useful insights for accurately diagnosing thalassemia at the DNA level. For this purpose, numerous techniques can provide accurate, rapid, and cost-effective identification of the underlying genetic defect in affected individuals. The aim of this article is to review the diverse methodological and technical platforms available for the molecular diagnosis of inherited disorders, using thalassemia and hemoglobinopathies as a model. This article also attempts to shed light on issues closely related to thalassemia diagnostics, such as prenatal and preimplantation genetic diagnoses and genetic counseling, for better-quality disease management. Hum Mutat 26(5), 399–412, 2005. © 2005 Wiley-Liss, Inc. [source]


Interpreting sediment delivery processes using suspended sediment-discharge hysteresis patterns from nested upland catchments, south-eastern Australia

HYDROLOGICAL PROCESSES, Issue 17 2009
Hugh G. Smith
Abstract In this study, suspended sediment concentration (SSC) and discharge (Q) hysteresis patterns recorded at the outlets of two nested upland catchments in south-eastern Australia were examined. Detailed monitoring of sediment flux was undertaken in a 1·64 km² sub-catchment located within a 53·5 km² catchment for which sediment yield was measured and the extent of incised channels mapped. The analysis of SSC–Q hysteresis patterns was supplemented by these additional datasets to contribute to the explanation of observed patterns. Clockwise SSC–Q hysteresis loops (with the suspended sediment peak leading the discharge peak) were recorded most frequently at both sites. This was attributed to initial rapid delivery of sediment from channel banks, the dominant sediment source in the sub-catchment and probably also for the catchment, in conjunction with remobilization of in-channel fine sediment deposits. Sediment exhaustion effects were considered to enhance clockwise hysteresis, with reduced SSC on the falling limb of event hydrographs. Pronounced exhaustion effects were observed in some multi-rise events, with subsequent flow peaks associated with much reduced sediment peaks. To compare SSC–Q hysteresis patterns between the two catchments, a dimensionless similarity function (SF) was derived to differentiate paired-event hysteresis patterns according to the extent of pattern similarity. This analysis, coupled with the other datasets, provided insight into the function of erosion and sediment delivery processes across the spatial scales examined and indicated the dependency of between-scale suspended sediment transfer on defined flow event scenarios. Quantitative measures of event SSC–Q hysteresis pattern similarity may provide a mechanism for linking the timing and magnitude of process response across spatial scales. This may offer useful insights into the between-scale linkage of dominant processes and the extent of downstream suspended sediment delivery.
Copyright © 2009 John Wiley & Sons, Ltd. [source]


Validation of ECMWF (re)analysis surface climate data, 1979–1998, for Greenland and implications for mass balance modelling of the ice sheet

INTERNATIONAL JOURNAL OF CLIMATOLOGY, Issue 2 2001
Edward Hanna
Abstract Climate (re)analysis products are potentially valuable tools, when properly verified, for helping to constrain the surface mass balance of the Greenland Ice Sheet (GIS). Monthly surface fields from European Centre for Medium-Range Weather Forecasts (ECMWF) operational- and re-analyses spanning 1979–1998 were validated using in situ data (surface air pressure and temperature, precipitation, cloud cover, short-/all-wave radiation, and wind speed/direction). These validation data are from coastal or near-coastal Danish Meteorological Institute (DMI) synoptic stations, inland Greenland Climate Network (GC-Net) and University of Wisconsin Automatic Weather Stations (AWSs), and two energy balance stations near the southern ice margin. The ECMWF analyses closely reproduce the seasonal patterns and interannual variations of much of the in situ data. Differences in the mean values of surface air pressure and temperature can mainly be ascribed to orography errors in the analyses' schemes, compared with the latest available accurate digital elevation model. Much of the GIS margin as modelled by ECMWF was too cold, on average by 4°C, and ECMWF precipitation averaged some 136% of the DMI station values. The misrepresentation of the (relatively) steep ice-sheet margin, which tends to be broadened and systematically over-elevated by several hundred metres, orographically reduced temperature and enhanced precipitation there in the ECMWF models. The cloud-cover comparison revealed not dissimilar annual mean cloud covers (ECMWF ∼8%) but the ECMWF analyses had too little cloud and were too 'sunny' during the critical summer melt-season. ECMWF-modelled surface albedo in summer was ∼11% lower than GC-Net values, which was mainly responsible for the disagreement of modelled surface short-wave radiation fluxes with observations. Model albedo and cloud errors need to be rectified if the analyses are to be used effectively to drive energy balance models of Greenland snowmelt.
ECMWF wind speed averaged 66% (62%) of the DMI station (AWS) values. The validation results provide useful insights into how one can best improve the ECMWF Greenland climate data for use in glaciological and climatological studies. Copyright © 2001 Royal Meteorological Society [source]
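The bias statistics quoted above (a ∼4°C cold bias, precipitation at 136% of station values) are simple paired model-versus-observation comparisons. A minimal sketch of how such statistics are computed, using invented numbers rather than the paper's data:

```python
# Toy sketch (not the paper's code): bias statistics of the kind used to
# validate reanalysis fields against in situ station observations.
def validation_stats(model, obs):
    """Return (mean bias, model-to-obs total ratio in %) for paired series."""
    n = len(model)
    bias = sum(m - o for m, o in zip(model, obs)) / n
    ratio = 100.0 * sum(model) / sum(obs)
    return bias, ratio

# Hypothetical monthly 2 m temperatures (deg C): model vs coastal station
model_t = [-20.0, -18.0, -12.0, -5.0, -1.0, 2.0]
station_t = [-16.0, -14.0, -8.0, -1.0, 3.0, 6.0]
bias, _ = validation_stats(model_t, station_t)
print(f"mean temperature bias: {bias:+.1f} C")  # prints -4.0 C: a cold bias

# Hypothetical monthly precipitation totals (mm): model vs station
_, ratio = validation_stats([68.0, 68.0], [50.0, 50.0])
print(f"model precipitation = {ratio:.0f}% of observed")  # prints 136%
```

The ratio statistic only makes sense for positive-definite quantities such as precipitation; for temperature the signed mean bias is the appropriate summary.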


Thermoeconomic analysis of household refrigerators

INTERNATIONAL JOURNAL OF ENERGY RESEARCH, Issue 10 2007
Arif Hepbasli
Abstract This study deals with thermoeconomic analysis of household refrigerators for providing useful insights into the relations between thermodynamics and economics. In the analysis, the EXCEM method based on the quantities exergy, cost, energy and mass is applied to a household refrigerator using the refrigerant R134a. The performance evaluation of the refrigerator is conducted in terms of exergoeconomic aspects based on various reference state temperatures ranging from 0 to 20°C. The exergy destructions in each of the components of the overall system are determined for average values of experimentally measured parameters. Exergy efficiencies of the system components are determined to assess their performances and to elucidate potentials for improvement. Thermodynamic loss rate-to-capital cost ratios for each component of the refrigerator are investigated. Correlations are developed to estimate exergy efficiencies and ratios of exergy loss rate-to-capital cost as a function of reference (dead) state temperature. The ratios of exergy loss rates to capital cost values are found to vary from 2.949 × 10⁻⁴ to 3.468 × 10⁻⁴ kW US$⁻¹. The exergy efficiency values are also found to range from 13.69 to 28.00% and 58.15 to 68.88% on the basis of net rational efficiency and product/fuel at the reference state temperatures considered, respectively. It is expected that the results obtained will be useful to those involved in the development of analysis and design methodologies that integrate thermodynamics and economics. Copyright © 2006 John Wiley & Sons, Ltd. [source]
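The dependence of exergy efficiency on the dead-state temperature can be illustrated with a toy calculation. The sketch below is not the authors' EXCEM analysis; it assumes a hypothetical freezer COP and uses the textbook definition of rational efficiency as the measured COP divided by the reversible (Carnot) COP for the same temperatures:

```python
# Illustrative sketch (hypothetical COP, not the study's data): rational
# (exergetic) efficiency of a refrigerator vs dead-state temperature T0.
def rational_efficiency(cop, t_cold_k, t0_k):
    """Measured COP divided by the Carnot COP = T_cold / (T0 - T_cold)."""
    cop_carnot = t_cold_k / (t0_k - t_cold_k)
    return cop / cop_carnot

# Hypothetical freezer: COP = 1.2, cold space at -10 C (263.15 K)
for t0_c in (0, 10, 20):
    t0_k = t0_c + 273.15
    eff = rational_efficiency(1.2, 263.15, t0_k)
    print(f"T0 = {t0_c:2d} C -> exergy efficiency = {100 * eff:.1f}%")
```

As in the abstract's correlations, the efficiency computed this way rises with the reference (dead) state temperature, because the same cooling duty represents a larger reversible work input when the surroundings are warmer.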


Promoting acute thrombolysis for ischaemic stroke (PRACTISE)

INTERNATIONAL JOURNAL OF STROKE, Issue 2 2007
Protocol for a cluster randomised controlled trial to assess the effect of implementation strategies on the rate and effects of thrombolysis for acute ischaemic stroke (ISRCTN 20405426)
Rationale Thrombolysis with intravenous rtPA is an effective treatment for patients with ischaemic stroke if given within 3 h from onset. Generally, more than 20% of stroke patients arrive in time to be treated with thrombolysis. Nevertheless, in most hospitals, only 1–8% of all stroke patients are actually treated. Interorganisational, intraorganisational, medical and psychological barriers are hampering broad implementation of thrombolysis for acute ischaemic stroke. Aims To evaluate the effect of a high-intensity implementation strategy for intravenous thrombolysis in acute ischaemic stroke, compared with regular implementation; to identify success factors and obstacles for implementation and to assess its cost-effectiveness, taking into account the costs of implementation. Design The PRACTISE study is a national cluster-randomised controlled trial. Twelve hospitals have been assigned to the regular or high-intensity intervention by random allocation after pair-wise matching. The high-intensity implementation consists of training sessions in conformity with the Breakthrough model, and a tool kit. All patients who are admitted with acute stroke and onset of symptoms no longer than 24 h are registered. Study outcomes The primary outcome measure is treatment with thrombolysis. Secondary outcomes are admission within 4 h after onset of symptoms, death or disability at 3 months, the rate of haemorrhagic complications in patients treated with thrombolysis, and costs of implementation and stroke care in the acute setting. Tertiary outcomes are derived from detailed criteria for the organisational characteristics, such as door-to-needle time and protocol violations. These can be used to monitor the implementation process and study the effectiveness of specific interventions. Discussion This study will provide important information on the effectiveness and cost-effectiveness of actively implementing an established treatment for acute ischaemic stroke. 
The multifaceted aspect of the intervention will make it difficult to attribute a difference in the primary outcome measure to a specific aspect of the intervention. However, careful monitoring of intermediate parameters as well as monitoring of accomplished SMART tasks can be expected to provide useful insights into the nature and role of factors associated with implementation of thrombolysis for acute ischaemic stroke, and of effective acute interventions in general. [source]


Accuracy of patient recall of preoperative symptom severity (angina and breathlessness) at one year following aorta-coronary artery bypass grafting

JOURNAL OF CLINICAL NURSING, Issue 3 2009
Grace M Lindsay
Aim and objective: The accuracy with which patients recall their cardiac symptoms prior to aorta-coronary artery bypass grafting is assessed approximately one year after surgery, together with patient-related factors potentially influencing accuracy of recall. Background: This is a novel investigation of patients' rating of preoperative symptom severity before and approximately one year following aorta-coronary artery bypass grafting. Design: Patients undergoing aorta-coronary artery bypass grafting (n = 208) were recruited preoperatively and 177 of these were successfully followed up at 16·4 (SD 2·1) months after surgery and asked to describe current and recalled preoperative symptoms using a 15-point numerical scale. Method: Accuracy of recall was measured and correlated (Pearson's correlation) with current and past symptoms, health-related quality of life and coronary artery disease risk factors. Hypothesis tests used Student's t-test and the chi-squared test. Results: Respective angina and breathlessness scores were recalled accurately by 16·9% and 14·1%, while 59% and 58% were inaccurate by more than one point. Although the mean preoperative and recalled scores for severity of both angina and breathlessness were not statistically different, patients who recalled most accurately their preoperative scores had, on average, significantly higher preoperative scores than those with less accurate recall. Patients whose angina and breathlessness symptoms were relieved by operation had significantly better accuracy of recall than patients with greater levels of symptoms postoperatively. Conclusion: Patients' rating of preoperative symptom severity before and one year following aorta-coronary artery bypass grafting was completely accurate in approximately one sixth of patients, with similar proportions of the remaining patients overestimating and underestimating symptoms. 
The extent to which angina and breathlessness were relieved by operation was a significant factor in improving accuracy of recall. Relevance to clinical practice: Factors associated with accuracy of recall of symptoms provide useful insights for clinicians when interpreting patients' views of the effectiveness of aorta-coronary artery bypass grafting for the relief of symptoms associated with coronary heart disease. [source]


Can MM-PBSA calculations predict the specificities of protein kinase inhibitors?

JOURNAL OF COMPUTATIONAL CHEMISTRY, Issue 16 2006
Christopher S. Page
Abstract An application of the molecular mechanics Poisson–Boltzmann surface area (MM-PBSA) protocol to the prediction of protein kinase inhibitor selectivity is presented. Six different inhibitors are placed in equivalent orientations in each of six different receptors. Fully solvated molecular dynamics is then run for 1 ns on each of the 36 complexes, and the resulting trajectories scored, using the implicit solvent model. The results show some correlation with experimentally determined specificities; anomalies may be attributed to a variety of causes, including difficulties in quantifying induced fit penalties and variabilities in normal modes calculations. Decomposing interaction energies on a per-residue basis yields more useful insights into the natures of the binding modes and suggests that the real value of such calculations lies in understanding interactions rather than outright prediction. © 2006 Wiley Periodicals, Inc. J Comput Chem, 2007 [source]


Modelling consumer entertainment software choice: An exploratory examination of key attributes, and differences by gamer segment

JOURNAL OF CONSUMER BEHAVIOUR, Issue 5 2010
Sunita Prugsamatz
From virtually nowhere 20 years ago to sales of US$9.5 billion in 2007, the video game industry has now overtaken movie industry box-office receipts in terms of annual sales, and blockbuster video games can outperform blockbuster movies for opening-week sales. This dramatic growth is likely to continue in coming years. Yet there has been little scholarly attention to consumers within the industry. This research fills this gap by providing a comprehensive study of consumer behaviour in the gaming industry, using the Theory of Planned Behaviour (TPB), a widely used, robust and reliable consumer research instrument. The study elicits key salient attributes for the major constructs in the TPB model (attitude, subjective norm and perceived behavioural control) and shows how these key constructs affect purchase intention. To avoid aggregation error in analysing overall market data, this study segments the market and examines differences in perspective by gamer type. We therefore examine differences in these key salient attributes by gamer type to understand consumer motivations better. As the first systematic study to examine consumer behaviour issues in the gaming industry, this study provides useful insights into consumers' behaviour in a large, growing industry. Consumer perceptions and behaviour toward entertainment software are complex and this study is not the final word, but it is the first available empirical evidence and can thus move the discussion forward from speculation to replication, extension, and alternative approaches. For managers in this industry, this study demonstrates how a comprehensive model can be applied to entertainment software. Copyright © 2010 John Wiley & Sons, Ltd. [source]


Improving the implementation of evidence-based practice: a knowledge management perspective

JOURNAL OF EVALUATION IN CLINICAL PRACTICE, Issue 3 2006
John Sandars MSc FRCGP MRCP CertEd
Abstract Experience of knowledge management initiatives in non-health care organizations can offer useful insights, and strategies, to implement evidence-based practice in health care. Knowledge management offers a structured process for the generation, storage, distribution and application of knowledge in organizations. This includes both tacit knowledge (personal experience) and explicit knowledge (evidence). Communities of practice are a key component of knowledge management and have been recognized to be essential for the implementation of change in organizations. It is within communities of practice that tacit knowledge is actively integrated with explicit knowledge. Organizational factors that limit the development of knowledge management, including communities of practice, in non-health care organizations need to be overcome if the potential is to be achieved within health care. [source]


Correlations between physiology and lifespan – two widely ignored problems with comparative studies

AGING CELL, Issue 4 2005
John R. Speakman
Summary Comparative differences between species provide a powerful source of information that may inform our understanding of the aging process. However, two problems regularly attend such analyses: the co-variation of traits with body mass is frequently ignored, along with the lack of independence of the data due to a shared phylogenetic history. These problems undermine the use of simple correlations between various factors and maximum lifespan potential (MLSP) across different species as evidence that the factors in question have causal effects on aging. Both of these problems have been widely addressed by comparative biologists working in fields other than aging research, and statistical solutions to these issues are available. Using these statistical approaches (analysing residual traits with the effects of body mass removed, and deriving phylogenetically independent contrasts) will allow analyses of the relationships between physiology and maximum lifespan potential to proceed unhindered by these difficulties, potentially leading to many useful insights into the aging process. [source]
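A minimal sketch of the first remedy: removing the body-mass effect by taking residuals from an ordinary least-squares fit on log-log axes. The species data are invented, and a full analysis would additionally use phylogenetically independent contrasts to handle shared ancestry:

```python
# Minimal sketch (invented data, not from the review): mass-corrected
# lifespan as residuals from a log-log OLS regression of MLSP on body mass.
import math

def ols_residuals(x, y):
    """Residuals of y regressed on x by simple ordinary least squares."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x)
    slope = sum((a - mx) * (b - my) for a, b in zip(x, y)) / sxx
    intercept = my - slope * mx
    return [b - (intercept + slope * a) for a, b in zip(x, y)]

mass_kg = [0.02, 0.3, 5.0, 70.0, 4000.0]   # hypothetical species
mlsp_yr = [3.0, 8.0, 20.0, 90.0, 70.0]
log_mass = [math.log10(m) for m in mass_kg]
log_mlsp = [math.log10(t) for t in mlsp_yr]
resid = ols_residuals(log_mass, log_mlsp)
# Positive residuals mark species living longer than expected for their size;
# these residuals, not raw MLSP, are what should be correlated with physiology.
print([round(r, 2) for r in resid])
```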


Managing new-style currency crises: the swan diagram approach revisited

JOURNAL OF INTERNATIONAL DEVELOPMENT, Issue 5 2007
Ramkishen S. Rajan
Abstract The new-style currency crises that have afflicted a number of developing and emerging economies of late are characterised by sudden stops in capital inflows and adverse balance sheet effects. Given the potential high costs of these crises, there remains an ongoing debate on how they might best be managed when they do arise. This paper argues that the age-old Swan diagram, appropriately modified, is able to provide useful insights into how a country might manage a new-style crisis via a combination of adjustment (which involves expenditure switching and reducing policies) and financing. Copyright © 2006 John Wiley & Sons, Ltd. [source]
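The Swan-diagram logic can be caricatured in a few lines: classify the economy by its internal gap (output relative to potential) and its external gap (current-account balance), then read off a crude expenditure-changing plus expenditure-switching mix. This is an illustrative toy, not the paper's modified framework; in the full diagram the expenditure prescription in the mixed zones is genuinely ambiguous:

```python
# Toy illustration (not the paper's model): the four Swan-diagram zones and a
# deliberately crude policy-mix reading for each.
def swan_zone(output_gap, current_account):
    """Zone labels from the internal gap (% of potential) and CA (% of GDP)."""
    internal = "overheating" if output_gap > 0 else "unemployment"
    external = "surplus" if current_account > 0 else "deficit"
    return internal, external

def policy_mix(output_gap, current_account):
    internal, external = swan_zone(output_gap, current_account)
    # Crude mapping: expenditure policy from the internal gap, switching
    # (real exchange rate) from the external gap. In the mixed zones the
    # expenditure direction is actually ambiguous in the full diagram.
    spending = "reduce" if internal == "overheating" else "expand"
    switching = "depreciate" if external == "deficit" else "appreciate"
    return f"{internal}/{external}: {spending} expenditure, {switching} real exchange rate"

# A stylised 'new-style' crisis economy: output below potential, CA deficit
print(policy_mix(-2.0, -4.5))
```

In a sudden-stop crisis the financing option is constrained, which is precisely why the paper's modified diagram emphasises the adjustment-versus-financing trade-off rather than this mechanical reading.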


Genotype-dependent priming to self- and xeno-cannibalism in heterozygous and homozygous lymphoblasts from patients with Huntington's disease

JOURNAL OF NEUROCHEMISTRY, Issue 4 2006
Elisabetta Mormone
Abstract In the present work, we studied the mitochondrial function and cell death pathway(s) in heterozygous and homozygous immortalized cell lines from patients with Huntington's disease (HD). Heterozygosis was characterized by specific alterations in mitochondrial membrane potential, a constitutive hyperpolarization state of mitochondria, and was correlated with an increased susceptibility to apoptosis. Lymphoblasts from homozygous patients, on the other hand, were characterized by a significant percentage of cells displaying autophagic vacuoles. These cells also demonstrated a striking attitude towards significant cannibalistic activity. Considering the pathogenic role of cell death in HD, our study provides new and useful insights into the role of mitochondrial dysfunction, i.e. hyperpolarization, in hijacking HD heterozygous cells towards apoptosis and HD homozygous cells towards a peculiar phenotype characterized by both self- and xeno-cannibalism. These events can, however, be viewed as an ultimate attempt to survive rather than a way to die. The present work underlines the possibility that HD-associated mitochondrial defects could tentatively be by-passed by the cells by activating cellular 'phagic' activities, including so-called 'mitophagy' and 'cannibalism', that only finally lead to cell death. [source]


Design strategies for controlling the molecular weight and rate using reversible addition–fragmentation chain transfer mediated living radical polymerization

JOURNAL OF POLYMER SCIENCE (IN TWO SECTIONS), Issue 15 2005
Michael J. Monteiro
Abstract Living radical polymerization has allowed complex polymer architectures to be synthesized in bulk, solution, and water. The most versatile of these techniques is reversible addition–fragmentation chain transfer (RAFT), which allows a wide range of functional and nonfunctional polymers to be made with predictable molecular weight distributions (MWDs), ranging from very narrow to quite broad. Because of the great complexity of the RAFT mechanism, how the kinetic parameters affect the rate of polymerization and the MWD is not obvious. Therefore, the aim of this article is to provide useful insights into the important kinetic parameters that control the rate of polymerization and the evolution of the MWD with conversion. We discuss how a change in the chain-transfer constant can affect the evolution of the MWD. It is shown how we can, in principle, use only one RAFT agent to obtain a polymer with any MWD. Retardation and inhibition are discussed in terms of (1) the leaving R group reactivity and (2) the intermediate radical termination model versus the slow fragmentation model. © 2005 Wiley Periodicals, Inc. J Polym Sci Part A: Polym Chem 43: 3189–3204, 2005 [source]