End Users (end + user)

Distribution by Scientific Domains

Selected Abstracts

Networking lessons in delivering 'Software as a Service', Part II

David Greschler
In Part I of this paper, we described the origins and evolution of Software as a Service (SaaS) and its value proposition to Corporate IT, Service Providers, Independent Software Vendors and End Users. SaaS is a model in which software applications are deployed, managed, updated and supported on demand, like a utility, and are served to users centrally using servers that are internal or external to the enterprise. Applications are no longer installed locally on a user's desktop PC; instead, upgrades, licensing and version control, metering, support and provisioning are all managed at the server level. In Part II we examine the lessons learned in researching, building and running an SaaS service. Copyright 2002 John Wiley & Sons, Ltd. [source]

Matching Technologies with Potential End Users: A Knowledge Engineering Approach for Agricultural Research Management

J. David Reece
This paper addresses the problem of priority setting that faces developing country agricultural research, a problem whose relevance has been sharpened by the current context of demands for greater efficiency and targeted impact. A new method for ex ante estimation of the impact of developing each of several alternative proposed technologies is described and illustrated through an example from West Africa. This method is based on the notion of market segmentation, which normally makes intensive use of secondary data-sets that are simply not available for rural areas of developing countries. To circumvent this lack of secondary data, the method adopts a knowledge engineering approach based on the views of an expert panel familiar with the region to be served. Descriptions of proposed technologies are matched with the interests and resources of identified market segments, together with the characteristics of their farming systems and locations, to identify those segments whose members are likely to use the proposed technology. Further development of the method is discussed. [source]

End Users: Actors in the Industrial Relations System?

Guy Bellemare
The paradigm elaborated by John T. Dunlop in his landmark 1958 volume, Industrial Relations Systems, described this system as consisting of three actors: unions, employers and the State. Over the past few years, the call to expand upon the notion of actors in the industrial relations environment has become more and more widespread, but no one has yet suggested how this integration might be implemented. The main objective of this paper is to propose an analytical model of the actor and to explore how the latter could be applied in the case of public urban transit users. [source]

Allen Denver Russell Memorial Lecture, 2006

The use of microbiocides in infection control: a critical look at safety, applications, testing
Abstract Microbial pathogens continue as major threats to health. Indeed, many ongoing societal changes are enhancing our vulnerability and exposure to several frank and opportunistic pathogens. This, together with rampant antimicrobial resistance and reduced prospects for newer drugs and vaccines, is forcing a higher reliance on microbiocides in infection prevention and control. That this reliance may not be well-founded becomes apparent from a closer look at current ways of testing and registering microbiocides, their label claims as well as human and environmental safety of certain widely used microbicidal chemicals. Many methods to test microbiocides for registration are flawed and/or entail test conditions irrelevant to field use. Pathogens listed on product labels may not be among those amenable to interruption through microbiocide use. The wide variations and discrepancies in existing national/regional regulations for registering microbiocides for sale stifle innovation. This is a critical look at the above-mentioned issues with emphasis on chemicals meant for use on environmental surfaces and medical devices. It highlights better ways to test microbiocides and to attain global harmonization of testing and product registration. It also details the known and potential dangers of microbiocide use and what to consider in choosing such formulations for optimal safety and effectiveness. End users are advised to be more critical and prudent in the selection and application of microbicidal chemicals, manufacturers are encouraged to explore infection control products and technologies that are safer in the workplace and for the environment, and regulators are urged to review and update the requirements and procedures for premarket review of microbiocide efficacy data and label claims. 
Independent investigations are also urgently needed to document the proportion of nosocomial infections that would be amenable to prevention through chemical disinfection of environmental surfaces. [source]

Modeling and simulation of mixed traffic on a prioritized shared medium

Jeffrey J. Evans
Network access systems (NAS) such as digital loop carriers (DLC) are increasingly utilizing a shared medium, such as Hybrid Fiber Coax (HFC) to provide point-to-multi-point access from the public switched telephone network (PSTN) to the end user (consumer). New services, such as direct access to the packet switched network (PSN, WWW) have been added to DLC equipment in such a way as to provide for a prioritized set of services over a shared medium in an effort to take advantage of otherwise unused bandwidth. The introduction of such services requires the modeling and analysis of these network access systems. This becomes complex when considering the variability in different service type traffic characteristics. This work identifies a traffic engineering problem of prioritized circuit switched and packet switched (PSTN/PSN) traffic over the same shared medium as it may relate to "perceived" quality of service (QoS). Copyright 2002 John Wiley & Sons, Ltd. [source]
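The scheduling idea the paper models can be sketched as strict-priority service: circuit-switched (PSTN) frames always take the shared medium ahead of packet-switched (PSN) frames, which ride in otherwise unused bandwidth. This is a minimal illustration, not the authors' simulator; the frame names, priorities and slot counts are invented.

```python
import heapq

# Strict-priority service on a shared medium: PSTN (voice) frames are always
# served before PSN (data) frames; the sequence number keeps FIFO order
# within a priority class. Traffic mix and slot counts are illustrative.
PSTN, PSN = 0, 1  # lower value = higher priority

def serve(queue, slots):
    """Drain up to `slots` transmission slots from the prioritized queue."""
    served = []
    while queue and len(served) < slots:
        _prio, _seq, name = heapq.heappop(queue)
        served.append(name)
    return served

def load(frames):
    """Build a priority queue from (priority, name) frames in arrival order."""
    q = []
    for seq, (prio, name) in enumerate(frames):
        heapq.heappush(q, (prio, seq, name))
    return q

mix = [(PSN, "web-1"), (PSTN, "voice-1"), (PSN, "web-2"), (PSTN, "voice-2")]
print(serve(load(mix), 3))  # -> ['voice-1', 'voice-2', 'web-1']
```

Under load, data frames queue behind all voice frames, which is exactly the "perceived QoS" tension the abstract points at: the low-priority class sees highly variable delay.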

Forecasting volatility for options valuation

Mahdjouba Belaifa
The petroleum sector plays a pivotal role in the foundations of the world economy, and market actors (producers, intermediaries, as well as consumers) are continuously subjected to the dynamics of an unstable oil market. Huge amounts are invested along the production chain to make one barrel of crude oil available to the end user. Added to that are the effects of geopolitical dynamics as well as geological risks, expressed as the low chances of successful discoveries. In addition, fiscal regimes and regulations, technology and environmental concerns are among the major factors that contribute to the substantial risk in the oil industry and render the market structure vulnerable to crises. The management of these vulnerabilities requires modern tools to reduce risk to a certain level, which unfortunately is never zero. The aim of this paper is, therefore, to provide a modern technique to capture the stochastic volatility of oil prices that can be implemented to value the exposure of an investor, a company, a corporation or a government. The paper first analyses the regional dependence on oil prices from a historical perspective and then looks at the evolution of the pricing environment since the large price jumps of the 1970s. The main causes of oil price volatility are treated in the third part of the paper. The rest of the article deals with volatility models and forecasts used in risk management, with implications for pricing derivatives. [source]
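The abstract does not name a specific model, but the standard workhorse for this kind of stochastic-volatility forecasting is a GARCH-type recursion. A minimal GARCH(1,1) sketch follows; the parameter values and toy return series are illustrative assumptions, not estimates from the paper.

```python
# Minimal GARCH(1,1)-style conditional-variance recursion:
#   sigma2_t = omega + alpha * r_{t-1}^2 + beta * sigma2_{t-1}
# Parameters below are illustrative, not fitted to oil-price data.

def garch_forecast(returns, omega=0.00001, alpha=0.1, beta=0.85):
    """One-step-ahead conditional variance after filtering the return series."""
    sigma2 = sum(r * r for r in returns) / len(returns)  # start at sample variance
    for r in returns:
        sigma2 = omega + alpha * r * r + beta * sigma2
    return sigma2

daily_returns = [0.01, -0.02, 0.015, -0.03, 0.005]  # toy daily log-returns
next_var = garch_forecast(daily_returns)
annualized_vol = (252 * next_var) ** 0.5  # annualize assuming 252 trading days
print(next_var, annualized_vol)
```

With alpha + beta < 1 the recursion is stationary and volatility shocks decay geometrically; richer specifications used for derivatives pricing follow the same filtering pattern.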

User-Focused Public Space: (M)UTOPIA in Denmark

Serban Cornea
Abstract The Danish practice MUTOPIA brings to public space a strong sense of delight and playfulness, while demonstrating an overriding concern with the end user. As Serban Cornea of MUTOPIA explains, a temporary plaza for the extensive development of Orestad Nord in Copenhagen aims 'to speed up the process of creating the area's own identity', while the practice's housing for Lyngby-Taarbæk, Hovedstaden, audaciously puts the 'garden' back into the 'garden suburb' by relocating the transport infrastructure to the rooftops. Copyright 2008 John Wiley & Sons, Ltd. [source]

MolProbity: all-atom structure validation for macromolecular crystallography

Vincent B. Chen
MolProbity is a structure-validation web service that provides broad-spectrum solidly based evaluation of model quality at both the global and local levels for both proteins and nucleic acids. It relies heavily on the power and sensitivity provided by optimized hydrogen placement and all-atom contact analysis, complemented by updated versions of covalent-geometry and torsion-angle criteria. Some of the local corrections can be performed automatically in MolProbity and all of the diagnostics are presented in chart and graphical forms that help guide manual rebuilding. X-ray crystallography provides a wealth of biologically important molecular data in the form of atomic three-dimensional structures of proteins, nucleic acids and increasingly large complexes in multiple forms and states. Advances in automation, in everything from crystallization to data collection to phasing to model building to refinement, have made solving a structure using crystallography easier than ever. However, despite these improvements, local errors that can affect biological interpretation are widespread at low resolution and even high-resolution structures nearly all contain at least a few local errors such as Ramachandran outliers, flipped branched protein side chains and incorrect sugar puckers. It is critical both for the crystallographer and for the end user that there are easy and reliable methods to diagnose and correct these sorts of errors in structures. MolProbity is the authors' contribution to helping solve this problem and this article reviews its general capabilities, reports on recent enhancements and usage, and presents evidence that the resulting improvements are now beneficially affecting the global database. [source]
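Ramachandran analysis, one of the local-error checks named above, boils down to computing backbone torsion (dihedral) angles and comparing them with favored regions. The geometric core can be sketched as follows; the coordinates are invented test geometry, not MolProbity's actual code, which computes phi/psi from the N, CA and C atoms of deposited models.

```python
import math

# Signed dihedral (torsion) angle from four 3D points, the geometric core of
# Ramachandran checks. Coordinates below are an invented 90-degree geometry.

def dihedral(p0, p1, p2, p3):
    """Signed torsion angle in degrees about the p1-p2 axis."""
    def sub(a, b): return [x - y for x, y in zip(a, b)]
    def dot(a, b): return sum(x * y for x, y in zip(a, b))
    def cross(a, b):
        return [a[1]*b[2] - a[2]*b[1],
                a[2]*b[0] - a[0]*b[2],
                a[0]*b[1] - a[1]*b[0]]
    b0, b1, b2 = sub(p0, p1), sub(p2, p1), sub(p3, p2)
    n = math.sqrt(dot(b1, b1))
    b1u = [x / n for x in b1]
    v = sub(b0, [dot(b0, b1u) * x for x in b1u])  # b0 minus its b1 component
    w = sub(b2, [dot(b2, b1u) * x for x in b1u])  # b2 minus its b1 component
    return math.degrees(math.atan2(dot(cross(b1u, v), w), dot(v, w)))

print(round(dihedral((0, 1, 0), (0, 0, 0), (1, 0, 0), (1, 0, 1)), 1))  # 90.0
```

A residue is flagged as an outlier when its (phi, psi) pair falls outside empirically derived favored/allowed regions; the angle computation itself is all that is shown here.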

Visualizing flood forecasting uncertainty: some current European EPS platforms - COST731 working group 3

M. Bruen
Abstract Cooperation in Science and Technology (COST) funding allows European scientists to establish international links, communicate their work to colleagues, and promote international research cooperation. COST731 was established to study the propagation of uncertainty from hydrometeorological observations through meteorological and hydrological models to the final flood forecast. Our focus is on how information about uncertainty is presented to the end user and how it is used. COST731 has assembled a number of demonstrations/case studies that illustrate a variety of practical approaches, and these are presented here. While there is as yet no consensus on how such information should be presented, many end users do find it useful. Copyright 2010 Royal Meteorological Society [source]

End-to-end diagnostics in IPTV architectures

Kamakshi Sridhar
The introduction of new revenue-generating services like Internet Protocol television (IPTV) promises to bring the end user a much more personalized communication and entertainment experience at an affordable cost. IPTV brings new features like video on demand (VoD), broadcast TV, and customized ad insertion, along with more traditional voice and data services, whose realization requires a wholesale deployment of existing and new protocols and new network elements. Configuration, maintenance and troubleshooting of such networks, customized for each end user, are complex, and it is widely believed that providing diagnostic mechanisms is of substantial importance for the rollout of IPTV. This paper describes research efforts at Alcatel-Lucent toward the definition and development of end-to-end diagnostics for IPTV architectures, comprising probes and mechanisms to detect problems in the network and to issue corrective measures. 2008 Alcatel-Lucent. [source]

A scheme for authentication and dynamic key exchange in wireless networks

Uri Blumenthal
Despite significant shortcomings in the initial security architecture, 802.11 wireless LANs have experienced explosive growth in recent years. Ongoing work in IEEE standards bodies is currently attempting to fix these shortcomings. One specific topic that has received extensive attention is how to enable these networks to authenticate users and to dynamically establish per-user, per-session cryptographic keys. The IEEE 802.1x Port-Based Access Control standard, which formalizes a new EAP-over-LAN (EAPOL) protocol, has emerged as the preferred way to achieve this. The EAPOL protocol employs the extensible authentication protocol (EAP), standardized by the Internet Engineering Task Force, to allow the use of existing and new authentication methods and authentication, authorization, and accounting (AAA) infrastructure. In this paper we present a new EAP scheme, called shared key exchange (SKE), suitable for use in 802.11 private or public access wireless LANs. The scheme relies on secure pre-shared secret keys in wireless LAN mobile node devices and AAA servers. When instantiated with relatively minor changes to RADIUS and EAP, the resulting protocol is provably secure and offers a full set of security features. A second, simplified protocol results from minimal modifications to existing RADIUS and EAP standards, but it provides a lower level of security. Both protocols efficiently support roaming scenarios wherein an end user roams across different networks and requires frequent re-authentication with low latency. The protocols can easily be extended to support migration to new AAA protocols such as DIAMETER. 2002 Lucent Technologies Inc. [source]
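The abstract does not give the SKE message flow, but the general idea behind pre-shared-key schemes of this kind is a challenge-response proof of key possession plus derivation of a fresh per-session key from the shared secret and both parties' nonces. The sketch below illustrates that pattern only; the message formats and the "session" derivation label are invented, not the SKE specification.

```python
import hmac, hashlib, os

# Pre-shared-key mutual authentication with per-session key derivation, the
# general pattern behind schemes like SKE. Formats here are invented.

def respond(psk, challenge):
    """Prove knowledge of the pre-shared key without revealing it."""
    return hmac.new(psk, challenge, hashlib.sha256).digest()

def session_key(psk, client_nonce, server_nonce):
    """Derive a fresh per-session key bound to both nonces."""
    return hmac.new(psk, b"session" + client_nonce + server_nonce,
                    hashlib.sha256).digest()

psk = os.urandom(32)           # provisioned out of band (device and AAA server)
server_nonce = os.urandom(16)  # server's challenge
client_nonce = os.urandom(16)  # client's contribution, ensures key freshness

proof = respond(psk, server_nonce + client_nonce)
# The AAA server recomputes the proof and compares in constant time
assert hmac.compare_digest(proof, respond(psk, server_nonce + client_nonce))

k = session_key(psk, client_nonce, server_nonce)
print(len(k))  # 32-byte per-session key
```

Because both nonces enter the derivation, an eavesdropper replaying old messages cannot force reuse of a previous session key, which is what makes fast re-authentication during roaming safe in this pattern.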

Reference-Free Damage Classification Based on Cluster Analysis

Hoon Sohn
The ultimate goal of this study was to develop an in-situ non-destructive testing (NDT) technique that can continuously and autonomously inspect the bonding condition between a carbon FRP (CFRP) layer and a host reinforced concrete (RC) structure, when the CFRP layer is used for strengthening the RC structure. The uniqueness of this reference-free NDT is twofold: first, features which are sensitive to CFRP debonding but insensitive to operational and environmental variations of the structure have been extracted only from current data, without direct comparison with previously obtained baseline data. Second, damage classification is performed instantaneously, without relying on predetermined decision boundaries. The extraction of the reference-free features is accomplished based on the concept of time reversal acoustics, and the instantaneous decision-making is achieved using cluster analysis. Monotonic and fatigue load tests of large-scale CFRP-strengthened RC beams were conducted to demonstrate the potential of the proposed reference-free debonding monitoring technique. Based on the experimental studies, it has been shown that the proposed reference-free NDT technique may minimize false alarms of debonding and unnecessary data interpretation by end users. [source]

PASSing the provenance challenge

David A. Holland
Abstract Provenance-aware storage systems (PASS) are a new class of storage system treating provenance as a first-class object, providing automatic collection, storage, and management of provenance as well as query capabilities. We developed the first PASS prototype between 2005 and 2006, targeting scientific end users. Prior to undertaking the provenance challenge, we had focused on provenance collection and storage, without much emphasis on a query model or language. The challenge forced us to (quickly) develop a query model and infrastructure implementing this model. We present a brief overview of the PASS prototype and a discussion of the evolution of the query model that we developed for the challenge. Copyright 2007 John Wiley & Sons, Ltd. [source]

Automation in an addiction treatment research clinic: Computerised contingency management, ecological momentary assessment and a protocol workflow system

Abstract Introduction and Aims. A challenge in treatment research is the necessity of adhering to protocol and regulatory strictures while maintaining flexibility to meet patients' treatment needs and to accommodate variations among protocols. Another challenge is the acquisition of large amounts of data in an occasionally hectic environment, along with the provision of seamless methods for exporting, mining and querying the data. Design and Methods. We have automated several major functions of our outpatient treatment research clinic for studies in drug abuse and dependence. Here we describe three such specialised applications: the Automated Contingency Management (ACM) system for the delivery of behavioural interventions, the transactional electronic diary (TED) system for the management of behavioural assessments and the Protocol Workflow System (PWS) for computerised workflow automation and guidance of each participant's daily clinic activities. These modules are integrated into our larger information system to enable data sharing in real time among authorised staff. Results. ACM and the TED have each permitted us to conduct research that was not previously possible. In addition, the time to data analysis at the end of each study is substantially shorter. With the implementation of the PWS, we have been able to manage a research clinic with an 80 patient capacity, having an annual average of 18 000 patient visits and 7300 urine collections with a research staff of five. Finally, automated data management has considerably enhanced our ability to monitor and summarise participant safety data for research oversight. Discussion and Conclusions. When developed in consultation with end users, automation in treatment research clinics can enable more efficient operations, better communication among staff and expansions in research methods. [Vahabzadeh M, Lin J-L, Mezghanni M, Epstein DH, Preston KL. 
Automation in an addiction treatment research clinic: Computerised contingency management, ecological momentary assessment and a protocol workflow system. Drug Alcohol Rev 2009;28:3-11] [source]

An accessible micro-capillary electrophoresis device using surface-tension-driven flow

Swomitra K. Mohanty
Abstract We present a rapidly fabricated micro-capillary electrophoresis chip that utilizes surface-tension-driven flow for sample injection and extraction of DNA. Surface-tension-driven flow (i.e. passive pumping) [G. M. Walker et al., Lab Chip 2002, 2, 131-134] injects a fixed volume of sample that can be predicted mathematically. Passive pumping eliminates the need for tubing, valves, syringe pumps, and other equipment typically needed for interfacing with microelectrophoresis chips. This method requires only a standard micropipette to load samples before separation and to remove the resulting bands after analysis. The device was made using liquid-phase photopolymerization to rapidly fabricate the chip without the special equipment typically associated with the construction of microelectrophoresis chips (e.g. a cleanroom) [A. K. Agarwal et al., J. Micromech. Microeng. 2006, 16, 332-340; S. K. Mohanty et al., Electrophoresis 2006, 27, 3772-3778]. Batch fabrication time for the device presented here was 1.5 h, including the channel coating time to suppress electroosmotic flow. Devices were constructed out of poly-isobornyl acrylate and glass. A standard microscope with a UV source was used for sample detection. Separations were demonstrated using a Promega BenchTop 100 bp ladder in hydroxyethyl cellulose (HEC), and oligonucleotides of 91 and 118 bp were used to characterize sample injection and extraction of DNA bands. The end result was an inexpensive micro-capillary electrophoresis device that uses tools (e.g. micropipettes, electrophoretic power supplies, and microscopes) already present in most labs for sample manipulation and detection, making it more accessible to potential end users. [source]
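The physics behind passive pumping is the Young-Laplace pressure excess inside a droplet: a small drop at the inlet sits at higher pressure than a large reservoir drop at the outlet, so liquid flows through the channel without any external pump. A back-of-the-envelope estimate under that assumption (the droplet radii below are illustrative, not values from the paper):

```python
# Young-Laplace estimate behind passive pumping: pressure excess inside a
# spherical droplet is dP = 2 * gamma / R, so a small inlet drop outranks a
# large outlet reservoir and drives flow. Radii here are illustrative.

GAMMA_WATER = 0.072  # surface tension of water at ~20 C, N/m

def laplace_pressure(radius_m, gamma=GAMMA_WATER):
    """Pressure excess (Pa) inside a spherical droplet of the given radius."""
    return 2 * gamma / radius_m

inlet = laplace_pressure(0.5e-3)    # 0.5 mm sample drop at the inlet port
outlet = laplace_pressure(3.0e-3)   # 3 mm reservoir drop at the outlet port
driving_pressure = inlet - outlet   # Pa; positive -> flow toward reservoir
print(driving_pressure)             # on the order of a few hundred Pa
```

Because the driving pressure depends only on drop geometry and surface tension, the injected volume is predictable, which is the property the chip design exploits.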

Enhancing technology development through integrated environmental analysis: Toward sustainable nonlethal military systems

Oral S. Saulters
Abstract New technologies are not only critical in supporting traditional industrial and military success but also play a pivotal role in advancing sustainability and sustainable development. With the current global economic challenges, resulting in tighter budgets and increased uncertainty, synergistic paradigms and tools that streamline the design and dissemination of key technologies are more important than ever. Accordingly, a proactive and holistic approach can facilitate efficient research, design, testing, evaluation, and fielding for novel and off-the-shelf products, thereby assisting developers, end users, and other diverse stakeholders in better understanding tradeoffs in the defense industry and beyond. By prioritizing mechanisms such as strategic life-cycle environmental assessments (LCEA); programmatic environment, safety, and occupational health evaluations (PESHE); health hazard assessments (HHA); and other innovative platforms and studies early within systems engineering, various nonlethal military technologies have been successfully developed and deployed. These efforts provide a framework for addressing complex environment, safety, and occupational health risks that affect personnel, infrastructure, property, socioeconomic, and natural/cultural resources. Moreover, integrated, comprehensive, multidisciplinary, and iterative analyses involving flexible groups of specialists/subject matter experts can be applied at various spatiotemporal scales in support of collaborations. This paper highlights the Urban Operations Laboratory process utilized for inclusive and transformative environmental analysis, which can translate into advantages and progress toward sustainable systems. Integr Environ Assess Manag 2010;6:281-286. 2009 SETAC [source]

A network-centric approach for access and interface selection in heterogeneous wireless environments

George Koundourakis
Abstract In this paper, we introduce a network-based approach for access and interface selection (AIS) in the context of resource management in heterogeneous wireless environments (UMTS, WLAN and DVB-T). We focus on the optimization of resource utilization, while ensuring acceptable quality of service (QoS) provision to the end users. Our objective is to optimally manage the overall system resources and minimize the possibility of QoS handovers (non-mobility handovers). The adopted architecture applies to typical heterogeneous environments and network entities (Access Routers) are enhanced with extra functionalities. We propose an AIS algorithm that exploits the multihoming concept and globally manages network resources at both radio access and IP backbone networks. The algorithm can estimate near-optimal solutions in real time and we also introduce a novel triggering policy. We present simulation results of typical scenarios that demonstrate the advantages of our approach. System performance metrics, derived from the simulations, show minimum degradations in high load and congestion situations. Copyright 2007 John Wiley & Sons, Ltd. [source]

New product development practices of urban regeneration units: a comparative international study

Roger Bennett
The new product development (NPD) activities of 14 not-for-profit urban regeneration organisations in three cities (London, Copenhagen and Boston) were examined to establish the degree to which they reflected the best practices recommended by the academic NPD literature in the for-profit field. Executives in each organisation were questioned about the stages of the NPD process that they activated most intensively, relationships between marketing staff and technical urban development specialists, mechanisms for consulting end users of place products, methods for generating new ideas and the major problems they experienced. Parallels between the NPD behaviour of nonprofit urban regeneration organisations managing projects involving widespread change and that previously observed among for-profit organisations engaged in the development of radically new products were investigated. Copyright 2004 Henry Stewart Publications [source]

Consumer participation: Ensuring suicide postvention research counts for end users

Anne Wilson PhD RN BN MN
Wilson A. International Journal of Nursing Practice 2010; 16: 7-13 Consumer participation: Ensuring suicide postvention research counts for end users Primary health-care research is about working with those who have a vested interest in the outcomes of that research, including consumers, service providers and service organizations. This article describes how consumers were included in the research processes of a South Australian study into suicide postvention services, and illustrates important principles to consider when including consumers in research. A concurrent mixed-method approach facilitated the collection of mixed data through the application of questionnaires. The study was conducted in an Australian metropolitan area. Because of media releases, a large number of people rang to enquire and volunteer their participation. From over 200 expressions of interest, 161 individuals participated. The participation of consumers in the research process ensured the findings were relevant for end users. A number of recommendations for the care and support of those bereaved through suicide were developed as a result. [source]

Accountability, Participation and Foreign Aid Effectiveness

Matthew S. Winters
Foreign aid involves a chain of accountability relationships stretching from international donors through national governments and implementing agencies to a set of ultimate end users of the goods and services financed by the aid. In this paper, I review five different accountability relationships that exist in foreign aid projects among donors, governments, implementing agencies and end users. Then I summarize existing empirical evidence demonstrating that foreign aid functions better, both at the macro-level of aid flows and at the micro-level of individual aid projects, when there is more government and implementing agency accountability. Specifying several mechanisms that facilitate accountability, I emphasize that participation is a tool often used to produce accountability within aid projects. However, in terms of donor accountability to aid-receiving countries and the end users in them, recent pushes for increased participation have not resulted in more accountability in the design of aid programs. Ultimately, although enthusiasm for participatory models of aid design and delivery is warranted, participation is not a panacea for all the accountability problems in foreign aid programs. [source]

Public disclosure of comparative clinical performance data: lessons from the Scottish experience

Russell Mannion PhD
Abstract There is growing international interest in making information available on the clinical quality and performance of health care providers. In the United States of America, where public reporting is most advanced, comparative performance information in the form of 'report cards', 'provider profiles' and 'consumer reports' has been published for over a decade. In Europe, Scotland has been at the forefront of releasing clinical performance data and has disseminated such information since 1994. This paper reviews the Scottish experience of public disclosure and distils the key lessons for other countries seeking to implement similar programmes. It is based on the findings of the first empirical evaluation of a national clinical reporting initiative outside the United States. The study examined the impact of publication of Scottish (CRAG) clinical outcome indicators on four key stakeholder groups: health care providers, regional government health care purchasers, general practitioners and consumer advocacy agencies. We conclude that those responsible for developing clinical reporting systems should not only pay close attention to developing technically valid and professionally credible data which are tailored to the information needs of different end users, but should also focus on developing a suitable incentive structure and organizational environment that fosters the constructive use of such information. [source]

Generation and visualization of large-scale three-dimensional reconstructions from underwater robotic surveys

Matthew Johnson-Roberson
Robust, scalable simultaneous localization and mapping (SLAM) algorithms support the successful deployment of robots in real-world applications. In many cases these platforms deliver vast amounts of sensor data from large-scale, unstructured environments. These data may be difficult to interpret by end users without further processing and suitable visualization tools. We present a robust, automated system for large-scale three-dimensional (3D) reconstruction and visualization that takes stereo imagery from an autonomous underwater vehicle (AUV) and SLAM-based vehicle poses to deliver detailed 3D models of the seafloor in the form of textured polygonal meshes. Our system must cope with thousands of images, lighting conditions that create visual seams when texturing, and possible inconsistencies between stereo meshes arising from errors in calibration, triangulation, and navigation. Our approach breaks down the problem into manageable stages by first estimating local structure and then combining these estimates to recover a composite georeferenced structure using SLAM-based vehicle pose estimates. A texture-mapped surface at multiple scales is then generated that is interactively presented to the user through a visualization engine. We adapt established solutions when possible, with an emphasis on quickly delivering approximate yet visually consistent reconstructions on standard computing hardware. This allows scientists on a research cruise to use our system to design follow-up deployments of the AUV and complementary instruments. To date, this system has been tested on several research cruises in Australian waters and has been used to reliably generate and visualize reconstructions for more than 60 dives covering diverse habitats and representing hundreds of linear kilometers of survey. 2009 Wiley Periodicals, Inc. [source]
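The "combine local estimates using SLAM-based poses" step above amounts to transforming points triangulated in each vehicle frame into a common georeferenced frame. A minimal 2D sketch of that step follows (the real system uses full 6-DOF poses and stereo meshes; the poses and seafloor points here are invented):

```python
import math

# Map points from the vehicle's local frame into a common global frame using
# a SLAM pose (2D rotation + translation for brevity; real pipelines use
# 6-DOF poses). All poses and points below are invented for illustration.

def to_global(pose, local_points):
    """Transform local (x, y) points by a pose (x, y, heading in radians)."""
    px, py, theta = pose
    c, s = math.cos(theta), math.sin(theta)
    return [(px + c * x - s * y, py + s * x + c * y) for x, y in local_points]

pose_a = (10.0, 5.0, 0.0)          # vehicle pose at frame A, from SLAM
pose_b = (12.0, 5.0, math.pi / 2)  # vehicle pose at frame B
patch = [(1.0, 0.0), (0.0, 1.0)]   # seafloor points in the vehicle frame

# Composite georeferenced structure: the union of transformed local patches
composite = to_global(pose_a, patch) + to_global(pose_b, patch)
print(composite)
```

Errors in the poses show up directly as inconsistencies between overlapping patches, which is why the paper's pipeline must reconcile calibration, triangulation and navigation errors before texturing the merged mesh.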


ABSTRACT The northeastern hills of India are endowed with a rich source of rice germplasm, which may be safely estimated at about 9,000 accessions, excluding redundancies. Even though much of the germplasm has been collected, studies on the nutritional aspects of these local cultivars are still lacking. Fifteen important indigenous rice genotypes collected from different rice-growing ecosystems of this region were studied for physical and nutritional qualities. Kernel color of the genotypes varied from white to dark purple. All the genotypes except Manipuri were of the bold-grain type. Most of the genotypes studied had fat contents of more than 2.0%. The protein content was found to be higher in Chahou angouba and Naga special. Five cultivars were identified as high-protein cultivars of rice, with 10-12.07% protein content. Amylose content varied from 2.27 to 24.5%. Most long-grained genotypes recorded less amylose than short-grained ones. Chahou varieties were found to be aromatic and glutinous, qualities which command higher prices in the local market. PRACTICAL APPLICATION The northeastern hills of India are endowed with a rich source of rice germplasm, and much of the germplasm has been collected, but studies on the basic and advanced nutritional aspects of these local cultivars are still lacking. This part of India has valuable rice genotypes with strong aroma, glutinous character and slender grains with high amounts of protein, fat and fiber. Being unknown to the rest of the world and even to indigenous end users, some such cultivars have already been lost, and some more are on the verge of extinction. The quality evaluation done in the present study provides useful information for their commercial exploitation and utilization in breeding programs for the nutritional enhancement of rice to fight malnutrition among the rice-consuming population, which is the largest in the world. [source]

Half a Century of Public Software Institutions: Open Source as a Solution to the Hold-Up Problem

We argue that the intrinsic inefficiency of proprietary software has historically created a space for alternative institutions that provide software as a public good. We discuss several sources of such inefficiency, focusing on one that has not been described in the literature: underinvestment due to fear of hold-up. An inefficient hold-up occurs when a user of software must make complementary investments, when the return on those investments depends on the future cooperation of the software vendor, and when contracting over that future relationship is not feasible. We also consider how the nature of the software production function makes software cheaper to develop when the code is open to end users. Our framework explains why open source dominates certain sectors of the software industry (e.g., programming languages) while being almost nonexistent in others (e.g., computer games). We then use our discussion of efficiency to examine the history of institutions for the provision of public software, from the early collaborative projects of the 1950s to the modern "open source" software institutions. We look at how such institutions have created a sustainable coalition for the provision of software as a public good by organizing diverse individual incentives, both altruistic and profit-seeking, yielding open source products of tremendous commercial importance that have come to dominate certain segments of the software industry. [source]


ABSTRACT Clothing continuously interacts with the body, both thermally and mechanically, and the various sensations constituting a person's comfort arise from this interaction. The sensation of coolness perceived during skin-fabric contact is one of these, arising from the transient heat flow from skin to fabric, as skin is usually warmer than clothing. In this study, the coolness-to-touch and dampness sensations created by knitted fabrics of different compositions and physical surface characteristics were investigated by a forearm test conducted on seven males. Besides physical properties (weight, yarn count, thickness, density), the surface roughness and friction properties of the inner fabric surfaces touching the skin were also determined. Microscopic photographs were taken to assess the hairiness of the inner surfaces, and optical porosity values were calculated by analyzing the microscopic images with MATLAB software. It was found that the coolness and dampness sensations arising during skin-fabric contact are mostly related to the permeability and surface roughness characteristics of the fabrics, and that fabric material has a greater effect on dampness sensation than on coolness sensation. PRACTICAL APPLICATIONS In recent years, consumers have paid increasing attention to the mechanical, thermal and visual sensations stimulated by dynamic body-clothing interactions, in addition to the aesthetic properties of their clothing. The feelings experienced at first touch enter into their purchase decisions. The coolness-to-touch sensation perceived during first contact with a fabric and the dampness sensation, which is very important during wear involving sweating, are two such feelings, and both relate to the thermophysiological aspect of clothing comfort. To produce garments that give desirable feelings, it is very important to determine the fabric properties influencing these sensations.
A subjective evaluation method, the forearm test, was used to establish the relationships between the coolness and dampness sensations and fabric properties. The results of this study should provide useful data for fabric manufacturers aiming to produce clothing for specific end users. [source]
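The optical porosity measure used in this study, obtained by analyzing microscopic fabric images in MATLAB, amounts to thresholding a backlit image and taking the fraction of bright (pore) pixels. A minimal sketch in Python; the threshold value and function name are assumptions for illustration, not the study's actual procedure:

```python
import numpy as np

def optical_porosity(image, threshold=128):
    """Estimate optical porosity as the fraction of pixels brighter than a
    threshold, i.e. light passing through pores in a backlit fabric image."""
    image = np.asarray(image)
    pore_pixels = np.count_nonzero(image > threshold)
    return pore_pixels / image.size

# Synthetic 4x4 grayscale image with 4 bright "pore" pixels out of 16
img = np.zeros((4, 4), dtype=np.uint8)
img[0, 0] = img[1, 1] = img[2, 2] = img[3, 3] = 255
porosity = optical_porosity(img)
```

In practice the threshold would be chosen adaptively (e.g. by Otsu's method) rather than fixed, since lighting varies between micrographs.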

Twenty-five years of end-user searching, Part 1: Research findings

Karen Markey
This is the first part of a two-part article that reviews 25 years of published research findings on end-user searching in online information retrieval (IR) systems. In Part 1 (Markey, 2007), the author seeks to answer the following questions: What characterizes the queries that end users submit to online IR systems? What search features do people use? What features would enable them to improve on the retrievals they have in hand? What features are hardly ever used? What do end users do in response to the system's retrievals? Are end users satisfied with their online searches? Summarizing searches of online IR systems by the search features people use every day makes information retrieval appear to be a very simplistic, one-stop event. In Part 2, the author examines current models of the information retrieval process, demonstrating that information retrieval is much more complex, involving changes in cognition, feelings, and/or events during the information-seeking process. She poses a host of new research questions that will further our understanding of end-user searching of online IR systems. [source]

State digital library usability: Contributing organizational factors

Hong (Iris) Xie
The authors investigate usage of, and user feedback about, a state digital library in which the developers/designers, content providers, different types of libraries and their staffs, and a variety of user groups form a loose federation of separate organizations with diverse expectations and needs. Through corroboratory evidence from usage statistics of the Internet-based database services available through the digital library, responses to a statewide library survey, and a Web-based survey of end users, they identify contributing factors for the organizational usability of state digital libraries. The authors refine and enhance an organizational usability model for the unique environment of state digital libraries, identifying three modes of interaction (influence, communication, activity) and the challenge each presents: addressing diverse player needs and expectations; unequal awareness of, and training in, the use of state digital libraries; and the lack of sufficient communication channels among players. In addition, the findings highlight the double-edged impact of physical libraries on the state digital library. [source]

Communication in performance-based training and instruction: From design to practice

Josephine A. Larbi-Apau
Communication is integral to instructional design and performance-based training. Promoting effective communication as part of the performance support system improves professional instructional design functions and opens greater avenues for meaningful discourse among end users of the instruction. In this article, we highlight communication in performance-based training and instruction as a vehicle for meaningful learning and the effective exchange of knowledge. Internal and external communications are discussed as means of promoting successful relationships, commitment, and ownership. [source]

Evaluation of photovoltaic modules based on sampling inspection using smoothed empirical quantiles

Ansgar Steland
Abstract An important issue for end users and distributors of photovoltaic (PV) modules is the inspection of the power output specification of a shipment. The question is whether or not the modules satisfy the specifications given in the data sheet, namely the nominal power output under standard test conditions, relative to the power output tolerance. Since taking control measurements of every module is usually unrealistic, decisions have to be based on random samples. In many cases, one has access to tables of final output power measurements (flash data) from the producer. We propose to rely on the statistical acceptance sampling approach as an objective decision framework, one which takes into account both the end user's and the producer's risk of a false decision. A practical solution to the problem, recently developed by the authors, is discussed. The solution consists of estimates of the required optimal sample size and the associated critical value, where the estimation uses the information contained in the additional flash data. We propose and examine an improved solution that yields even more reliable estimated sampling plans, as substantiated by a Monte Carlo study. This is achieved by employing advanced statistical estimation techniques. Copyright 2009 John Wiley & Sons, Ltd. [source]
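The paper's method estimates a variables sampling plan from flash data using smoothed empirical quantiles; the underlying trade-off between the producer's risk and the end user's (consumer's) risk can be illustrated with the simpler attributes single-sampling plan sketched below. This is a generic textbook construction under assumed risk levels, not the authors' procedure:

```python
from math import comb

def acceptance_probability(n, c, p):
    """Probability of accepting a lot with defect rate p under a
    single-sampling plan: sample n modules, accept if at most c of
    them fail the power-output tolerance (binomial CDF at c)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(c + 1))

def find_plan(p_good, p_bad, alpha=0.05, beta=0.10, n_max=500):
    """Smallest (n, c) that keeps the producer's risk below alpha for
    good lots (defect rate p_good) and the end user's risk below beta
    for bad lots (defect rate p_bad)."""
    for n in range(1, n_max + 1):
        for c in range(n + 1):
            if (acceptance_probability(n, c, p_good) >= 1 - alpha and
                    acceptance_probability(n, c, p_bad) <= beta):
                return n, c
    return None

# Example: accept shipments with <=1% out-of-spec modules at least 95%
# of the time, and shipments with >=10% out-of-spec at most 10% of the time
plan = find_plan(p_good=0.01, p_bad=0.10)
```

The paper's contribution lies in doing this for continuous power measurements, where the critical value and sample size are estimated from the producer's flash data rather than fixed in advance.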

Visualizing flood forecasting uncertainty: some current European EPS platforms (COST731 working group 3)

M. Bruen
Abstract Cooperation in Science and Technology (COST) funding allows European scientists to establish international links, communicate their work to colleagues, and promote international research cooperation. COST731 was established to study the propagation of uncertainty from hydrometeorological observations through meteorological and hydrological models to the final flood forecast. Our focus is on how information about uncertainty is presented to the end user and how it is used. COST731 has assembled a number of demonstrations and case studies that illustrate a variety of practical approaches, and these are presented here. While there is as yet no consensus on how best to present such information, many end users do find it useful. Copyright 2010 Royal Meteorological Society [source]