Manager

Kinds of Manager

  • care manager
  • case manager
  • development manager
  • fund manager
  • nurse manager
  • project manager
  • resource manager

Terms modified by Manager

  • manager characteristic
  • manager level
  • manager role

Selected Abstracts


    Clear Communications and Feedback Can Improve Manager and Employee Effectiveness

    EMPLOYMENT RELATIONS TODAY, Issue 2 2002
    Stephen Xavier
    First page of article [source]


    Liquidity: Considerations of a Portfolio Manager

    FINANCIAL MANAGEMENT, Issue 1 2009
    Laurie Simon Hodrick
    This paper examines liquidity and how it affects the behavior of portfolio managers, who account for a significant portion of trading in many assets. We define an asset to be perfectly liquid if a portfolio manager can trade the quantity she desires when she desires at a price not worse than the uninformed expected value. A portfolio manager is limited by both what she needs to attain and the ease with which she can attain it, making her sensitive to three dimensions of liquidity: price, timing, and quantity. Deviations from perfect liquidity in any of these dimensions impose shadow costs on the portfolio manager. By focusing on the trade-off between sacrificing on price and quantity instead of the canonical price-time trade-off, the model yields several novel empirical implications. Understanding a portfolio manager's liquidity considerations provides important insights into the liquidity of many assets and asset classes. [source]
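
    The definition above lends itself to a compact formalization. The following is a stylized sketch in my own notation, not the authors' model: it simply prices the three deviations the abstract names.

```latex
% Stylized shadow-cost sketch (own notation; assumptions: a sale of desired
% quantity q* at desired time t*, v = the uninformed expected value, and
% linear shadow prices lambda and mu on delay and unexecuted quantity).
% Perfect liquidity is the case C = 0; x^+ denotes max(x, 0).
\[
C(p, t, q) \;=\;
\underbrace{(v - p)^{+}\, q}_{\text{price}}
\;+\; \underbrace{\lambda\, (t - t^{*})^{+}}_{\text{timing}}
\;+\; \underbrace{\mu\, (q^{*} - q)^{+}}_{\text{quantity}} .
\]
```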


    IT project managers' construction of successful project management practice: a repertory grid investigation

    INFORMATION SYSTEMS JOURNAL, Issue 3 2009
    Nannette P. Napier
    Abstract Although effective project management is critical to the success of information technology (IT) projects, little empirical research has investigated skill requirements for IT project managers (PMs). This study addressed this gap by asking 19 practicing IT PMs to describe the skills that successful IT PMs exhibit. A semi-structured interview method known as the repertory grid (RepGrid) technique was used to elicit these skills. Nine skill categories emerged: client management, communication, general management, leadership, personal integrity, planning and control, problem solving, systems development and team development. Our study complements existing research by providing a richer understanding of several skills that were narrowly defined (client management, planning and control, and problem solving) and by introducing two new skill categories that had not been previously discussed (personal integrity and team development). Analysis of the individual RepGrids revealed four distinct ways in which study participants combined skill categories to form archetypes of effective IT PMs. We describe these four IT PM archetypes (General Manager, Problem Solver, Client Representative and Balanced Manager) and discuss how this knowledge can be useful for practitioners, researchers and educators. The paper concludes with suggestions for future research. [source]


    Neural Signal Manager: a collection of classical and innovative tools for multi-channel spike train analysis

    INTERNATIONAL JOURNAL OF ADAPTIVE CONTROL AND SIGNAL PROCESSING, Issue 11 2009
    Antonio Novellino
    Abstract Recent developments in the neuroengineering field and the widespread use of microelectrode arrays (MEAs) for electrophysiological investigations have made available new approaches for studying the dynamics of dissociated neuronal networks as well as acute/organotypic slices maintained ex vivo. Importantly, the extraction of relevant parameters from these neural populations is likely to involve long-term measurements, lasting from a few hours to entire days. The processing of huge amounts of electrophysiological data, in terms of computational time and automation of the procedures, is currently one of the major bottlenecks for both in vivo and in vitro recordings. In this paper we present a collection of algorithms implemented within a new software package, named the Neural Signal Manager (NSM), aimed at analyzing large quantities of data recorded by means of MEAs in a fast and efficient way. The NSM offers different approaches for both spike and burst analysis, and integrates classical statistical algorithms, such as the inter-spike interval histogram and the post-stimulus time histogram, with more recent ones, such as burst detection and its related statistics. To show the potential of the software, the Results section presents the application of the developed algorithms to a set of spontaneous activity recordings from dissociated cultures at different ages. Copyright © 2008 John Wiley & Sons, Ltd. [source]
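
    Two of the classical statistics named here, the inter-spike interval histogram and the post-stimulus time histogram, are compact enough to sketch. A minimal Python version follows (my own illustration, not NSM code; millisecond units and function names are assumptions):

```python
import numpy as np

def isi_histogram(spike_times_ms, bin_ms=1.0, max_ms=100.0):
    """Inter-spike interval histogram for one channel."""
    isis = np.diff(np.sort(np.asarray(spike_times_ms)))
    edges = np.arange(0.0, max_ms + bin_ms, bin_ms)
    counts, _ = np.histogram(isis, bins=edges)
    return counts, edges

def psth(spike_times_ms, stim_times_ms, bin_ms=5.0, window_ms=500.0):
    """Post-stimulus time histogram: mean spike count per stimulus,
    binned over a fixed window after each stimulus onset."""
    spikes = np.asarray(spike_times_ms)
    edges = np.arange(0.0, window_ms + bin_ms, bin_ms)
    counts = np.zeros(edges.size - 1)
    for t0 in stim_times_ms:
        rel = spikes[(spikes >= t0) & (spikes < t0 + window_ms)] - t0
        counts += np.histogram(rel, bins=edges)[0]
    return counts / len(stim_times_ms), edges
```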


    Technology in nursing scholarship: Use of citation reference managers

    INTERNATIONAL JOURNAL OF MENTAL HEALTH NURSING, Issue 3 2007
    Cheryl M. Smith
    ABSTRACT: Nurses, especially those in academia, feel the pressure to publish but have limited time to write. One of the more time-consuming and frustrating tasks of research, and subsequent publication, is the collection and organization of accurate citations of sources of information. The purpose of this article is to discuss three types of citation reference managers (personal bibliographic software) and how their use can provide consistency and accuracy in recording all the information needed for the research and writing process. The advantages and disadvantages of three software programs, EndNote, Reference Manager, and ProCite, are discussed. These three software products have a variety of options that can be used in personal data management to assist researchers in becoming published authors. [source]


    SOMALIA: Foreign Fund Manager

    AFRICA RESEARCH BULLETIN: ECONOMIC, FINANCIAL AND TECHNICAL SERIES, Issue 7 2009
    Article first published online: 27 AUG 200
    No abstract is available for this article. [source]


    Meta-analysis: The efficacy and safety of monoclonal antibody targeted to epidermal growth factor receptor in the treatment of patients with metastatic colorectal cancer

    JOURNAL OF DIGESTIVE DISEASES, Issue 4 2009
    Fang NIE
    OBJECTIVE: To evaluate systematically the efficacy and safety of anti-epidermal growth factor receptor (EGFR) monoclonal antibody added to a chemotherapeutic regimen in the treatment of patients with metastatic colorectal cancer (mCRC). METHODS: Eligible articles were identified by searching electronic databases. All randomized trials comparing an arm with an anti-EGFR monoclonal antibody to an arm without one during the treatment of mCRC were included. The statistical analysis was performed with Review Manager 4.2.8. RESULTS: Seven randomized trials (n = 4186) were identified. The pooled response rates by intention-to-treat analysis were 25.4% and 17.6% for patients with and without an anti-EGFR monoclonal antibody, respectively; the OR was 3.36 (95% CI 1.42-7.95). The incidence of grade 3-4 adverse events was 71.2% and 54.3% for the two groups, respectively; the OR was 2.23 (95% CI 1.74-2.86). The incidence of diarrhea, skin toxicity, and hypomagnesemia was 62.3% versus 55.7%, 79.3% versus 19.7%, and 27.2% versus 5.6%, with summary ORs of 1.36 (95% CI 1.03-1.80), 33.47 (95% CI 14.81-75.61), and 6.73 (95% CI 3.84-11.82), respectively. CONCLUSION: Our results confirmed that monoclonal antibody targeted to EGFR could be effective in increasing response rates and could be a key therapeutic agent in the optimal treatment of mCRC, despite a moderate increase in grade 3-4 adverse events. [source]
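
    For readers outside the meta-analysis literature, the odds ratios and confidence intervals quoted above follow the standard per-trial computation (textbook formulas, not reproduced from the paper itself):

```latex
% For one trial with a responders of n1 patients in the antibody arm and
% c responders of n2 patients in the control arm:
\[
\mathrm{OR} \;=\; \frac{a\,(n_2 - c)}{(n_1 - a)\,c},
\qquad
\mathrm{SE}\!\left(\ln \mathrm{OR}\right)
  \;=\; \sqrt{\frac{1}{a} + \frac{1}{n_1 - a} + \frac{1}{c} + \frac{1}{n_2 - c}},
\]
\[
95\%\ \mathrm{CI} \;=\; \exp\!\big(\ln \mathrm{OR} \,\pm\, 1.96\,\mathrm{SE}(\ln \mathrm{OR})\big).
\]
% Review Manager then pools the per-trial log odds ratios across studies,
% e.g. with inverse-variance weights w_i = 1 / SE_i^2.
```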


    A phenomenological exploration of intellectual disability: nurses' experiences of managerial support

    JOURNAL OF NURSING MANAGEMENT, Issue 6 2010
    GERALDINE GALVIN MSc, RNID
    galvin g. & timmins f. (2010) Journal of Nursing Management 18, 726-735. A phenomenological exploration of intellectual disability: nurses' experiences of managerial support. Aim: The present study aimed to explore registered nurses in intellectual disability (RNIDs') experiences of managerial support. Background: The current work environment for RNIDs is undergoing immense change, including the introduction of social care leaders and care staff to care for people with an intellectual disability (ID) and community-based approaches to care. This has led to ambiguity and marginalization for RNIDs, requiring them to re-establish their role boundaries and requiring managers to lead and support them through the change process. Methods: A Heideggerian constructivist phenomenological approach was used. Findings: Four overarching themes emerged from the data: the professional role of the Clinical Nurse Manager (CNM), the leadership role of the CNM, personal supports, and the effects of CNM support. Conclusion: The themes found in this study correlate with the findings of other research on nurses' experiences of managerial support in various nursing settings. Implications for nursing management: The findings illuminate, for nurses, nurse managers and ID services, what managerial support means to this specific group of RNIDs working in a service for people with an ID. [source]


    Valuing knowledge sharing in Lafarge

    KNOWLEDGE AND PROCESS MANAGEMENT: THE JOURNAL OF CORPORATE TRANSFORMATION, Issue 1 2006
    Alexandre Perrin
    This paper describes the detailed process of a knowledge sharing strategy at Lafarge, a global player in the construction materials industry. The case study explains why this company uses such a strategy to create value for stakeholders, provide local access to know-how and build a knowledge sharing culture among divisions. It argues that a well-articulated knowledge sharing strategy consists in creating a knowledge portfolio, supporting a knowledge management structure, providing tools for collaboration and nurturing a culture of knowledge sharing through awards. Finally, we discuss the lessons learned by the Corporate Knowledge Manager from this strategy and study how she can quantify the value brought by her actions. As a whole, this paper gives insight into the critical issues in moving a global company towards a knowledge-sharing organization. Copyright © 2006 John Wiley & Sons, Ltd. [source]


    Strategies and students: beginning teachers' early encounters with national policy

    LITERACY, Issue 2 2006
    Andrey Rosowsky
    Abstract The 1-year Postgraduate Certificate in Education Secondary English method course at the University of Sheffield's School of Education has, since 2001, asked its students to write an essay of around 4000 words on their initial understanding and experience of the National Strategies promoted by the United Kingdom's Department for Education and Skills. The essay expects a critical, reflective and analytical piece of writing that records the student teacher's developing views on the place, role and value of the National Strategies in the classroom. Using grounded theory and content analysis techniques, this small-scale study of the 2005 cohort identifies common perceptions regarding the National Strategies among student teachers of English and seeks to categorise these to account for their developing identities as future English teachers. Drawing on Twiselton's identification of teacher types (Task Manager, Curriculum Deliverer and Concept/Skill Builder) and Shulman's classification of the knowledges necessary for teaching, this article argues that the National Strategies and their respective Frameworks, while successful in moving teachers on from the role of 'Task Managers', run the risk of locking teachers into being 'Curriculum Deliverers' and of not developing the pedagogical content knowledge necessary for teaching English expertly. [source]


    Patient-controlled Analgesia in Intrathecal Therapy for Chronic Pain: Safety and Effective Operation of the Model 8831 Personal Therapy Manager with a Pre-implanted SynchroMed Infusion System

    NEUROMODULATION, Issue 3 2003
    Jan Maeyaert
    Abstract The Model 8831 Personal Therapy Manager (PTM) offers a patient-controlled analgesia (PCA) option for the SynchroMed Infusion System (Medtronic Inc., Minneapolis, MN). The safety and effective operation of the PTM activator was evaluated in 45 patients in five European centers receiving intrathecal drug infusion for the treatment of chronic pain via a SynchroMed pump. The total volume of drug delivered intrathecally over a four-week follow-up period was calculated. Adverse events were recorded and pain levels were measured via the Visual Analog Pain Scale (VAS), Brief Pain Inventory, and SF-12 Quality of Life scores. Patient satisfaction with the device and its instruction manual was also assessed by questionnaire. The expected and calculated intrathecal drug volumes (and therefore drug doses) were the same, demonstrating that the device worked as intended. There were no device-related serious adverse events. Overall, 96% of patients were satisfied with the activator. Patients appreciated being able to control their pain and considered the device and its instructions easy to use. The PTM was shown to be safe and functioning properly in the intrathecal treatment of pain. The successful addition of a PCA function to the SynchroMed system may create a new standard in intrathecal pain therapy. [source]
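
    The volume check described here is plain arithmetic: the pump's expected output is the basal infusion plus all granted bolus activations. A sketch with invented parameters (the study's actual rates are not given in the abstract):

```python
def expected_volume_ml(basal_ml_per_day, days, bolus_ml, n_boluses):
    """Expected pump output over the follow-up: continuous basal
    infusion plus the volume of all granted PTM bolus activations.
    All parameter values below are invented for illustration."""
    return basal_ml_per_day * days + bolus_ml * n_boluses

# e.g. 0.5 mL/day basal over 28 days plus 40 boluses of 0.05 mL -> 16.0 mL
print(expected_volume_ml(0.5, 28, 0.05, 40))
```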


    Clinical outcomes of corneal wavefront customized ablation strategies with SCHWIND CAM in LASIK treatments

    OPHTHALMIC AND PHYSIOLOGICAL OPTICS, Issue 5 2009
    Maria Clara Arbelaez
    Abstract Purpose: To evaluate the clinical outcomes of aspheric corneal wavefront (CW) ablation profiles in LASIK treatments. Methods: Thirty eyes treated with CW ablation profiles were included after a follow-up of 6 months. In all cases, standard examinations including preoperative and postoperative wavefront analysis with a CW topographer (Optikon Keratron Scout) were performed. Custom Ablation Manager (CAM) software was used to plan corneal wavefront customized aspheric treatments, and the ESIRIS flying spot excimer laser system was used to perform the ablations (both SCHWIND eye-tech-solutions, Kleinostheim, Germany). Clinical outcomes were evaluated in terms of predictability, refractive outcome, safety, and wavefront aberration. Results: In general, the postoperative uncorrected visual acuity and the best corrected visual acuity improved (p < 0.001). In particular, the trefoil, coma, and spherical aberrations, as well as the total root-mean-square values of higher order aberrations, were significantly reduced (p < 0.05) when the pre-existing aberrations were greater than the repeatability and the biological noise. Conclusions: The study results indicate that the aspheric corneal wavefront customized CAM approach for planning ablation volumes yields visual, optical, and refractive results comparable to those of other wavefront-guided customized techniques for correction of myopia and myopic astigmatism. The CW customized approach shows its strength in cases where abnormal optical systems are expected. Apart from the risk of additional ablation of corneal tissue, systematic wavefront-customized corneal ablation can be considered a safe and beneficial method. [source]
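
    The "total root-mean-square of higher order aberrations" reported here is conventionally computed from a normalized Zernike expansion of the measured wavefront; assuming the usual refractive-surgery convention of keeping radial orders n >= 3:

```latex
% With the wavefront expanded in orthonormal Zernike polynomials,
% W(\rho, \phi) = \sum_{n,m} c_n^m Z_n^m(\rho, \phi), the higher-order
% RMS keeps the coefficients of radial order n >= 3:
\[
\mathrm{RMS}_{\mathrm{HO}} \;=\; \sqrt{\;\sum_{n \ge 3}\,\sum_{m} \left(c_n^m\right)^{2}\;} .
\]
```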


    Retracted: Static analytical models: Applications within a military domain

    PERFORMANCE IMPROVEMENT, Issue 3 2008
    J. Brett Hollowell
    The following article from Performance Improvement entitled "Static Analytical Models: Applications Within a Military Domain," by J. Brett Hollowell and Vanessa R. Mazurek, published in March 2008, Volume 28, Issue 3, has been retracted by agreement between the authors; the Editor, Holly Burkett; the Publications Manager, John Chen; the President, Armed Forces Chapter, Roger Chevalier; and John Wiley & Sons, Inc. The International Society for Performance Improvement and the Editorial Board of Performance Improvement have concluded that the article did not contain certain necessary references and citations to document N74 (Models and Simulations) created by the Human Performance Center Orlando. A number of uncredited models, figures, text excerpts and lists appear identically in both document N74 and the above article. Although the Editorial Board felt a retraction was necessary, it wishes to note its belief that the authors had the best of intentions in contributing the article with the purpose of broadening our readers' knowledge of the field, and that neither Mr. Hollowell nor Ms. Mazurek has been accused of any misconduct with respect to the contents of the article. John Chen, Publications Manager; Holly Burkett, Editor; Roger Chevalier, President, Armed Forces Chapter [source]


    Improving feature detection and analysis of surface-enhanced laser desorption/ionization-time of flight mass spectra

    PROTEINS: STRUCTURE, FUNCTION AND BIOINFORMATICS, Issue 11 2005
    Scott M. Carlson
    Abstract Discovering valid biological information from surface-enhanced laser desorption/ionization-time of flight mass spectrometry (SELDI-TOF MS) depends on clear experimental design, meticulous sample handling, and sophisticated data processing. Most published literature deals with the biological aspects of these experiments, or with computer-learning algorithms to locate sets of classifying biomarkers. The process of locating and measuring proteins across spectra has received less attention. This process should be tunable between sensitivity and false discovery, and should guarantee that features are biologically meaningful in that they represent chemical species that can be identified and investigated. Existing feature detection in SELDI-TOF MS is not optimal for acquiring biologically relevant data. Most methods have so many user-defined settings that reproducibility and comparability among studies suffer considerably. To address these issues, we have developed an approach, called simultaneous spectrum analysis (SSA), which (i) locates proteins across spectra, (ii) measures their abundance, (iii) subtracts baseline, (iv) excludes irreproducible measurements, and (v) computes normalization factors for comparing spectra. SSA uses only two key parameters for feature detection and one parameter each for quality thresholds on spectra and peaks. The effectiveness of SSA is demonstrated by identifying proteins differentially expressed in SELDI-TOF spectra from plasma of wild-type and knockout mice for plasma glutathione peroxidase. Comparing analyses by SSA and CiphergenExpress Data Manager 2.1 finds similar results for large signal peaks, but SSA improves the number and quality of differences between groups among lower-signal peaks. SSA is also less likely to introduce systematic bias when normalizing spectra. [source]
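
    The five enumerated steps are concrete enough to sketch. The following is a generic toy pipeline over a matrix of aligned spectra; it is not the authors' SSA implementation, and the window size and thresholds are invented:

```python
import numpy as np

def analyze(spectra, win=101, snr=5.0, cv_max=0.5):
    """spectra: (n_spectra, n_mz) matrix of aligned intensities."""
    half = win // 2
    # (iii) baseline: low percentile in a sliding window, per spectrum
    base = np.array([[np.percentile(s[max(0, i - half):i + half + 1], 10)
                      for i in range(s.size)] for s in spectra])
    clean = spectra - base
    # (i) locate features on the mean spectrum: local maxima above noise
    mean = clean.mean(axis=0)
    noise = 1.4826 * np.median(np.abs(mean - np.median(mean)))  # robust sigma
    peaks = [i for i in range(1, mean.size - 1)
             if mean[i - 1] < mean[i] >= mean[i + 1] and mean[i] > snr * noise]
    # (ii) measure the abundance of every feature in every spectrum
    abundance = clean[:, peaks]
    # (iv) exclude irreproducible features (high coefficient of variation)
    cv = abundance.std(axis=0) / np.maximum(abundance.mean(axis=0), 1e-9)
    abundance = abundance[:, cv < cv_max]
    # (v) normalization factor: median ratio of each spectrum to the
    # across-spectra median profile
    ref = np.median(abundance, axis=0)
    factors = np.median(abundance / np.maximum(ref, 1e-9), axis=1)
    return abundance / factors[:, None], factors
```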


    Interfirm Modularity and Its Implications for Product Development

    THE JOURNAL OF PRODUCT INNOVATION MANAGEMENT, Issue 4 2005
    Nancy Staudenmayer
    Industries characterized by interfirm modularity, in which the component products of different firms work together to create a system, are becoming increasingly widespread. In such industries, the existence of a common architecture enables consumers to mix and match the products of different firms. Industries ranging from stereos, cameras, and bicycles to computers, printing, and wireless services are now characterized by interfirm modularity. While the increasing presence of this context has been documented, the implications for the product development process remain underdeveloped. For the present study, in-depth field-based case studies of seven firms experiencing an environment of interfirm modularity were conducted in order to deepen understanding of this important phenomenon. What unique challenges did this context pose and why? What solutions did firms experiment with, and which seemed to work? Based on an inductive process of data analysis from these case studies, three primary categories of challenges raised by this environment were identified. First, firms were frustrated at their lack of control over the definition of their own products. The set of features and functions in products were constrained to a great extent by an architecture that the firm did not control. Second, while an environment of interfirm modularity should in theory eliminate interdependencies among firms since interfaces between products are defined ex-ante, the present study found, ironically, that interdependencies were ubiquitous. Interdependencies continually emerged throughout the product development process, despite efforts to limit them. Third, firms found that the quantity and variegated nature of external relationships made their management exceedingly difficult. The sheer complexity was daunting, given both the size of the external network as well as the number of ties per external collaborator. Partners with whom control over the architecture was shared often had divergent interests, or at least not fully convergent interests. The solutions to these challenges were creative and in many cases counter to established wisdom. For instance, research has suggested many ways for a firm to influence architectural standards. While the firms in the present sample followed some of this advice, they also focused on a more neglected aspect of architecture: the compliance and testing standards that accompany modules and interfaces. By concentrating their efforts in a different area, even smaller firms in this sample were able to have some influence. Instead of focusing on the elimination of interdependencies, it was found that firms benefited from concentrating on the management of interdependencies as they emerged. Finally, while layers of management and "bureaucracy" are often viewed as unproductive, these firms found that adding structure, through positions such as Relationship Manager, was highly beneficial in handling the coordination and control of a wide range of external relationships. [source]


    The diffusion of marketing science in the practitioners' community: opening the black box

    APPLIED STOCHASTIC MODELS IN BUSINESS AND INDUSTRY, Issue 4-5 2005
    Albert C. Bemmaor
    Abstract This editorial discusses an illustration of the potential hindrances to the diffusion of modern methodologies in the practitioners' (i.e. the buyers of research, not the consultants) community. Taking the example of classical regression analysis based on store-level scanner data, the authors discuss the potential limitations of the classical regression model, with the example of the occurrence of 'wrong' signs and of coefficients with unexpected magnitudes. In an interview with one of the authors, a (randomly picked) Senior Marketing Research Manager at a leading firm of packaged goods reports his/her experience with econometric models. To him/her, econometric models are presented as a 'black box' (his/her written words). In his/her experience, they provided results that were 'quite good' in a 'much focused' context only. These were experimental data obtained with a Latin square design, and the analysis included a single brand with only four stock-keeping units (SKUs). The company 'dropped' the more 'ambitious' studies, which analysed the effect of the retail promotions run by all the actors in a market, because of a lack of predictive accuracy (his/her written words are in quotes). The authors suggest that Bayesian methodology can help open the black box and obtain more acceptable results than those obtained at present. Copyright © 2005 John Wiley & Sons, Ltd. [source]
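
    The 'wrong sign' problem and the suggested Bayesian remedy are easy to reproduce in a toy example. Here near-collinear promotion variables make the OLS coefficient unstable (and possibly wrong-signed), while a conjugate Bayesian posterior with an informative prior pulls the estimate back toward plausible values. All numbers are invented:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy store-level data: price cuts and feature ads move together, so the
# OLS estimate of the feature effect is unstable and can take a wrong sign.
n = 60
price_cut = rng.normal(0.0, 1.0, n)
feature = 0.95 * price_cut + 0.05 * rng.normal(0.0, 1.0, n)  # near-collinear
sales = 1.0 * price_cut + 0.5 * feature + rng.normal(0.0, 1.0, n)
X = np.column_stack([price_cut, feature])

beta_ols = np.linalg.lstsq(X, sales, rcond=None)[0]

# Conjugate Bayesian regression with known noise variance: the posterior
# mean blends the data with a prior centered on plausible positive effects.
sigma2 = 1.0
mu0 = np.array([1.0, 0.5])     # prior means (assumed, for illustration)
V0inv = np.eye(2) / 0.25       # prior variance 0.25 per coefficient
Vn = np.linalg.inv(X.T @ X / sigma2 + V0inv)
beta_bayes = Vn @ (X.T @ sales / sigma2 + V0inv @ mu0)

print("OLS:", beta_ols, " Bayes:", beta_bayes)
```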


    Similar effects of disease-modifying antirheumatic drugs, glucocorticoids, and biologic agents on radiographic progression in rheumatoid arthritis: Meta-analysis of 70 randomized placebo-controlled or drug-controlled studies, including 112 comparisons

    ARTHRITIS & RHEUMATISM, Issue 10 2010
    Niels Graudal
    Objective To define the differences in effects on joint destruction in rheumatoid arthritis (RA) patients between therapy with single and combination disease-modifying antirheumatic drugs (DMARDs), glucocorticoids, and biologic agents. Methods Randomized controlled trials in RA patients, investigating the effects of drug treatment on the percentage of the annual radiographic progression rate (PARPR), were included in a meta-analysis performed with the use of Review Manager 5.0 software according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) Statement protocol. Results Data from 70 trials (112 comparisons, 16 interventions) were summarized in 21 meta-analyses. Compared with placebo, the PARPR was 0.65% smaller in the single-DMARD group (P < 0.002) and 0.54% smaller in the glucocorticoid group (P < 0.00001). Compared with single-DMARD treatment, the PARPR was 0.62% smaller in the combination-DMARD group (P < 0.001) and 0.61% smaller in the biologic agent plus methotrexate (MTX) group (P < 0.00001). The effect of a combination of 2 DMARDs plus step-down glucocorticoids did not differ from the effect of a biologic agent plus MTX (percentage mean difference -0.07% [95% confidence interval -0.25, 0.11]) (P = 0.44). Conclusion Treatment with DMARDs, glucocorticoids, biologic agents, and combination agents significantly reduced radiographic progression at 1 year, with a relative effect of 48-84%. A direct comparison between the combination of a biologic agent plus MTX and the combination of 2 DMARDs plus initial glucocorticoids revealed no difference. Consequently, biologic agents should still be reserved for patients whose RA is resistant to DMARD therapy. Future trials of the effects of biologic agents on RA should compare such agents with combination treatments involving DMARDs and glucocorticoids. [source]
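
    The pooled differences quoted above come from standard fixed-effect inverse-variance pooling of per-trial effects, which Review Manager implements; for reference (textbook formulas, not taken from the paper):

```latex
% Fixed-effect inverse-variance pooling of per-trial effects \theta_i
% (here, differences in the percentage annual radiographic progression
% rate) with standard errors SE_i:
\[
w_i = \frac{1}{\mathrm{SE}_i^{\,2}}, \qquad
\hat{\theta} = \frac{\sum_i w_i\, \theta_i}{\sum_i w_i}, \qquad
\mathrm{SE}(\hat{\theta}) = \frac{1}{\sqrt{\sum_i w_i}},
\]
\[
95\%\ \mathrm{CI} \;=\; \hat{\theta} \,\pm\, 1.96\, \mathrm{SE}(\hat{\theta}).
\]
```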


    In Need of Review?

    BRITISH JOURNAL OF SPECIAL EDUCATION, Issue 3 2002
    Statements of Special Educational Needs, The Audit Commission's Report on Statutory Assessment
    Since spring 2001, the Audit Commission has been carrying out research into provision for children with special educational needs. In this article, Anne Pinney, Project Manager with the Public Services Research section at the Audit Commission, summarises the findings presented in an interim report published in June 2002. She reveals widespread dissatisfaction with current approaches to assessment; the process of developing a Statement; the allocation of resources to support children with special educational needs; and the procedures used by schools and LEAs to ensure that SEN provision is effective. Anne Pinney goes on to set out the recommendations made by the Audit Commission in its interim report. These include a collaborative approach to review involving schools and LEAs; increased delegation of resources to schools; and the development of more effective inter-agency approaches to assessment and intervention. This article concludes with a call for a high-level independent review of SEN policy and practice focused on resolving the tensions in the current system. Anne Pinney also looks forward to a number of other outcomes from the Audit Commission's work in relation to children with special educational needs. BJSE will be bringing you news of these developments in future issues. [source]


    α-Adrenoceptor agonists for the treatment of vasovagal syncope: a meta-analysis of worldwide published data

    ACTA PAEDIATRICA, Issue 7 2009
    Ying Liao
    Abstract Aim: The present study aimed to evaluate the available randomized controlled trials (RCTs) regarding the effect of α-adrenoceptor agonists on vasovagal syncope (VVS). Methods: According to inclusion and exclusion criteria, articles were selected from medical electronic databases. RCTs were then assessed with the Juni assessment, and the meta-analysis was completed using the Review Manager 4.2 software. The outcomes used to evaluate the effects were the recurrence of syncope during follow-up treatment or a response in the head-up tilt test (HUT) after treatment. The results were stated as odds ratios (OR) with a 95% confidence interval (CI), at a p < 0.05 significance level. Results: In total, six RCTs were selected. Funnel plot analysis showed possible publication bias. Meta-analysis of the six RCTs, including 165 patients in the treatment group and 164 patients in the control group, indicated that α-adrenoceptor agonists were more effective than placebos in treating VVS (OR = 0.21, 95% CI: 0.06-0.77, p = 0.02). A further weighted independent t-test disclosed that the weighted mean percentage of responders for midodrine (76.3% ± 7.7%) was significantly higher than that for etilefrine (65.5% ± 15.4%) (t = 5.863, p < 0.001). Conclusion: The currently published RCTs support that α-adrenoceptor agonists might be effective for VVS. Midodrine can be regarded as a better choice compared with etilefrine. [source]


    Quality in the outsourcing process: part I. The quality outsourcer

    QUALITY ASSURANCE JOURNAL, Issue 4 2001
    Yvonne Russell Dr.
    Abstract In clinical research, the definition of quality and the overall responsibility for ensuring that performance parameters are adequately tracked and the necessary corrective action taken lie firmly in the hands of the outsourcing project director/manager (outsourcer or sponsor). Meticulous planning of requirements prior to project start and strong 'general management' throughout the life of the outsourced project play a critical role in influencing both the outcome of the study and the quality of the research data. For 'quality in the outsourcing process', read 'quality outsourcing'. The implementation of a carefully formulated project-specific outsourcing strategy means that macromanagement (general management), not micromanagement (defined here as a form of 'intensive therapy for the ailing project' management), will be the primary task of the sponsor. A research team that is well-defined, with all members (sponsor and vendor alike) mindful of their individual responsibilities, in addition to being well-directed, will achieve consensus of opinion faster and deliver a quality product. An outsourcing strategy for the full clinical development program, clearly defined outsourcing standard operating procedures (SOPs) and a strategy tailored to each individually outsourced project form an integral part of the recipe for outsourcing success. Individual components of an outsourcing strategy, and how one can safeguard that an outsourced project is brought to completion successfully and to the prescribed quality standards, are addressed in detail in this article. Part two of this article, entitled 'Quality in the Outsourcing Process: II. The Vendor Selection Process and The Quality Vendor', will provide the quality outsourcer with tips and tools on how to make a quality decision in the vendor selection process and addresses further issues that are fundamental to the maintenance of quality in the outsourcing process. Copyright © 2001 John Wiley & Sons, Ltd. [source]


    DRAMA MANAGEMENT AND PLAYER MODELING FOR INTERACTIVE FICTION GAMES

    COMPUTATIONAL INTELLIGENCE, Issue 2 2010
    Manu Sharma
    A growing research community is working toward employing drama management components in story-based games. These components gently guide the story toward a narrative arc that improves the player's gaming experience. In this article we evaluate a novel drama management approach deployed in an interactive fiction game called Anchorhead. This approach uses players' feedback as the basis for guiding the personalization of the interaction. The results indicate that adding our Case-based Drama manaGer (C-DraGer) to the game guides the players through the interaction and provides a better overall player experience. Unlike previous approaches to drama management, this article focuses on demonstrating the success of our approach by evaluating results using human players in a real game implementation. Based on this work, we report several insights on drama management that were possible only due to an evaluation with real players. [source]
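
    The abstract does not spell out C-DraGer's internals, so the following is only a guess at the general shape of a case-based drama manager: hypothetical names throughout, with k-nearest-neighbour retrieval over player-trace vectors and feedback-weighted choice of the next story action.

```python
import numpy as np

class CaseBasedDramaManager:
    """Hypothetical sketch: cases pair a player's interaction trace with
    the feedback that player gave on plot points; the manager steers new
    players toward plot points that similar players rated highly."""

    def __init__(self):
        self.cases = []  # list of (trace_vector, {plot_point: rating})

    def add_case(self, trace, ratings):
        self.cases.append((np.asarray(trace, float), ratings))

    def choose_action(self, trace, candidates, k=3):
        trace = np.asarray(trace, float)
        order = np.argsort([np.linalg.norm(trace - t) for t, _ in self.cases])
        nearest = order[:k]
        def score(plot_point):
            vals = [self.cases[i][1].get(plot_point) for i in nearest]
            vals = [v for v in vals if v is not None]
            return float(np.mean(vals)) if vals else 0.0
        # hint the story toward the best-rated candidate plot point
        return max(candidates, key=score)
```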


    Management Strategies and Improvement of Performance of Sewer Networks

    COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, Issue 7 2007
    Denys Breysse
    Even when sewer managers are conscious of the maintenance needed to keep their systems in good condition, they lack efficient methods and tools to help them take appropriate decisions. No really satisfactory and efficient tool exists for optimizing Inspection, Maintenance, or Rehabilitation (IMR) strategies on such systems. Sewer managers and researchers have been involved for many years in the French National Research Project for Renewal of Non Man Entry Sewer Systems (RERAU: Réhabilitation des Réseaux d'Assainissement Urbains) to improve their knowledge of these systems and of management policies. During the RERAU project, a specific action was dedicated to the modeling of asset ageing and maintenance. Special attention was given to the description of defects and dysfunctions and to the evaluation and modeling of performance, accounting for its various dimensions (from the point of view of the manager, the user, and the environment). After having defined an Index of Technical Performance (ITp), we introduce the Index of Technical and Economic Performance (ITEp), a combined measure of performance (including social costs) and technical costs. This index provides managers with an objective, standard tool for comparing alternatives, and it is used in the article to compare some simple IMR strategies. It sets the basis of a new method for non-man-entry sewer system management, enabling analysis of the profitability of investment in terms of both technical and economic performance. [source]


    A Probabilistic Framework for Bayesian Adaptive Forecasting of Project Progress

    COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, Issue 3 2007
    Paolo Gardoni
    An adaptive Bayesian updating method is used to assess the unknown model parameters based on recorded data and pertinent prior information. Recorded data can include equality, upper bound, and lower bound data. The proposed approach properly accounts for all the prevailing uncertainties, including model errors arising from an inaccurate model form or missing variables, measurement errors, statistical uncertainty, and volitional uncertainty. As an illustration of the proposed approach, the project progress and final time-to-completion of an example project are forecasted. For this illustration, the construction of civilian nuclear power plants in the United States is considered. The application considers two cases: (1) no information is available prior to observing the actual progress data of a specified plant; and (2) the construction progress of eight other nuclear power plants is available. The example shows that an informative prior is important for making accurate predictions when only a few records are available, which is also when forecasts are most valuable to the project manager. Having or not having prior information has no practical effect on the forecast once progress on a significant portion of the project has been recorded. [source]
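
    The handling of equality and bound data described above can be sketched directly: equality observations enter the likelihood through a density, and bound observations through the corresponding tail probability. A minimal grid-posterior version with invented numbers follows (the paper's model form and data are not reproduced here):

```python
import numpy as np
from scipy.stats import norm

# Unknown parameter: monthly progress rate (% of project per month).
theta = np.linspace(0.01, 5.0, 1000)
prior = norm.pdf(theta, loc=1.5, scale=0.6)   # informative prior, e.g.
                                              # from eight earlier plants
sigma = 0.3                                   # assumed measurement error
post = prior.copy()
for y in (1.1, 1.3):                          # equality data: density terms
    post *= norm.pdf(y, loc=theta, scale=sigma)
for lb in (1.0,):                             # lower-bound data: P(obs >= lb)
    post *= 1.0 - norm.cdf(lb, loc=theta, scale=sigma)
for ub in (2.0,):                             # upper-bound data: P(obs <= ub)
    post *= norm.cdf(ub, loc=theta, scale=sigma)
post /= np.trapz(post, theta)

rate = np.trapz(theta * post, theta)          # posterior mean progress rate
print(rate, 60.0 / rate)   # e.g. months to finish a remaining 60% of work
```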


    Bi-level Programming Formulation and Heuristic Solution Approach for Dynamic Traffic Signal Optimization

    COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, Issue 5 2006
    Dazhi Sun
    Conventional methods of signal timing optimization assume a given traffic flow pattern, whereas traffic assignment is performed under the assumption of fixed signal timing. This study develops a bi-level programming formulation and heuristic solution approach (HSA) for dynamic traffic signal optimization in networks with time-dependent demand and stochastic route choice. In the bi-level programming model, the upper-level problem represents the decision-making behavior (signal control) of the system manager, while user travel behavior is represented at the lower level. The HSA consists of a Genetic Algorithm (GA) and a Cell Transmission Simulation (CTS) based Incremental Logit Assignment (ILA) procedure. GA is used to seek the upper-level signal control variables. ILA is developed to find the user-optimal flow pattern at the lower level, and CTS is implemented to propagate traffic and collect real-time traffic information. The performance of the HSA is investigated in numerical applications in a sample network. These applications compare the efficiency and quality of the global optima achieved by Elitist GA and Micro GA. Furthermore, the impact of different frequencies of updating information and different population sizes of GA on system performance is analyzed. [source]
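
    The bi-level loop can be caricatured in a few lines: an outer GA searches the signal-control variable, and an inner logit assignment returns the user flow pattern and total delay. The sketch below is a toy with one signal and two routes, invented delay functions and parameters, and none of the paper's CTS machinery:

```python
import numpy as np

rng = np.random.default_rng(1)
DEMAND, THETA = 1000.0, 0.05   # total flow and logit dispersion (invented)

def route_times(g, f1):
    """Toy congestion delays; g = green share given to route 1's signal."""
    t = lambda flow, cap: 1.0 + 2.0 * (flow / cap) ** 2
    return t(f1, 800.0 * g), t(DEMAND - f1, 800.0 * (1.0 - g))

def lower_level(g, iters=50):
    """Incremental logit loading toward stochastic user equilibrium."""
    f1 = DEMAND / 2.0
    for _ in range(iters):
        t1, t2 = route_times(g, f1)
        p1 = 1.0 / (1.0 + np.exp(-THETA * (t2 - t1)))  # logit route choice
        f1 += 0.2 * (DEMAND * p1 - f1)                 # partial step
    t1, t2 = route_times(g, f1)
    return f1 * t1 + (DEMAND - f1) * t2                # total travel time

# Upper level: a tiny GA over the green split g in (0, 1).
pop = rng.uniform(0.2, 0.8, 20)
for _ in range(30):
    fitness = np.array([-lower_level(g) for g in pop])
    parents = pop[np.argsort(fitness)[-10:]]           # select best half
    kids = (rng.choice(parents, 10) + rng.choice(parents, 10)) / 2.0
    kids = np.clip(kids + rng.normal(0.0, 0.03, 10), 0.05, 0.95)  # mutate
    pop = np.concatenate([parents, kids])
best_g = pop[np.argmax([-lower_level(g) for g in pop])]
```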


    Maximizing revenue in Grid markets using an economically enhanced resource manager

    CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 14 2010
    M. Macías
    Abstract Traditional resource management has had as its main objective the optimization of throughput, based on parameters such as CPU, memory, and network bandwidth. With the appearance of Grid markets, new variables that determine economic expenditure, benefit and opportunity must be taken into account. The Self-organizing ICT Resource Management (SORMA) project aims at allowing resource owners and consumers to exploit market mechanisms to sell and buy resources across the Grid. SORMA's motivation is to achieve efficient resource utilization by maximizing revenue for resource providers and minimizing the cost of resource consumption within a market environment. An overriding factor in Grid markets is the need to ensure that the desired quality of service levels meet the expectations of market participants. This paper explains the proposed use of an economically enhanced resource manager (EERM) for resource provisioning based on economic models. In particular, this paper describes techniques used by the EERM to support revenue maximization across multiple service level agreements and provides an application scenario to demonstrate its usefulness and effectiveness. Copyright © 2008 John Wiley & Sons, Ltd. [source]
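
    One way to picture revenue maximization across service level agreements, purely as a hypothetical sketch (this is not SORMA's EERM API): score each SLA offer by its expected revenue, i.e. its price minus the expected payout if it is violated, and fill capacity greedily by revenue density.

```python
from dataclasses import dataclass

@dataclass
class Offer:
    price: float      # payment if the SLA is met
    penalty: float    # extra payout if the SLA is violated
    p_violate: float  # estimated violation risk at current load
    cpus: int         # resource demand

def expected_revenue(o: Offer) -> float:
    # If violated, the provider loses the price and pays the penalty.
    return o.price - o.p_violate * (o.price + o.penalty)

def select(offers, capacity):
    """Greedy knapsack by expected revenue per CPU (illustrative only)."""
    chosen, used = [], 0
    for o in sorted(offers, key=lambda o: expected_revenue(o) / o.cpus,
                    reverse=True):
        if used + o.cpus <= capacity and expected_revenue(o) > 0.0:
            chosen.append(o)
            used += o.cpus
    return chosen
```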


    Towards an autonomic approach for edge computing

    CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 14 2007
    Mikael Desertot
    Abstract Nowadays, one of the biggest challenges for companies is to cope with the high cost of their information technology infrastructure. Edge computing is a new computing paradigm designed to allocate on-demand computing and storage resources; those resources are Web cache servers scattered over the ISP backbones. We argue that this paradigm could be applied to on-demand full application hosting, helping to reduce costs. In this paper, we present a J2EE (Java Enterprise Edition) dynamic server able to deploy/host J2EE applications on demand, together with its autonomic manager. For this, we reengineer and experiment with JOnAS, an open-source J2EE static server. Two management policies of the autonomic manager were stressed by a simulation of a worldwide ISP network. Copyright © 2006 John Wiley & Sons, Ltd. [source]
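
    An autonomic manager of this kind is usually a monitor-analyze-plan-execute loop. The following is a hypothetical sketch of such a loop for on-demand hosting; every method name on the pool object is invented, not a JOnAS API:

```python
import time

HIGH_LOAD, LOW_LOAD = 0.8, 0.2  # illustrative thresholds

def autonomic_loop(pool, app, period_s=30):
    """Monitor edge-server load and deploy/undeploy the application
    on demand (hypothetical pool API, for illustration only)."""
    while True:
        load = pool.mean_cpu_load()                    # Monitor
        if load > HIGH_LOAD:                           # Analyze / Plan
            spare = pool.cheapest_idle_server()
            if spare is not None:
                spare.deploy(app)                      # Execute: scale out
        elif load < LOW_LOAD and pool.replicas(app) > 1:
            pool.busiest_replica(app).undeploy(app)    # Execute: scale in
        time.sleep(period_s)                           # next control period
```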


    Object combining: a new aggressive optimization for object intensive programs

    CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 5-6 2005
    Ronald Veldema
    Abstract Object combining tries to put objects together that have roughly the same life times in order to reduce strain on the memory manager and to reduce the number of pointer indirections during a program's execution. Object combining works by appending the fields of one object to another, allowing allocation and freeing of multiple objects with a single heap (de)allocation. Unlike object inlining, which will only optimize objects where one has a (unique) pointer to another, our optimization also works if there is no such relation. Object inlining also directly replaces the pointer by the inlined object's fields. Object combining leaves the pointer in place to allow more combining. Elimination of the pointer accesses is implemented in a separate compiler optimization pass. Unlike previous object inlining systems, reference field overwrites are allowed and handled, resulting in much more aggressive optimization. Our object combining heuristics also allow unrelated objects to be combined, for example, those allocated inside a loop; recursive data structures (linked lists, trees) can be allocated several at a time and objects that are always used together can be combined. As Java explicitly permits code to be loaded at runtime and allows the new code to contribute to a running computation, we do not require a closed-world assumption to enable these optimizations (but it will increase performance). The main focus of object combining in this paper is on reducing object (de)allocation overhead, by reducing both garbage collection work and the number of object allocations. Reduction of memory management overhead causes execution time to be reduced by up to 35%. Indirection removal further reduces execution time by up to 6%. Copyright © 2005 John Wiley & Sons, Ltd. [source]
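
    The core idea is language-independent and small enough to illustrate. In the Python analogy below (the paper targets Java; this is only a conceptual picture), the payload's fields are appended to the node, so one allocation replaces two and the pointer chase from node to payload disappears:

```python
class Point:                       # separate objects: Node -> Point
    __slots__ = ("x", "y")
    def __init__(self, x, y):
        self.x, self.y = x, y

class Node:
    __slots__ = ("point", "next")
    def __init__(self, x, y, nxt=None):
        self.point = Point(x, y)   # second heap allocation + indirection
        self.next = nxt

class CombinedNode:                # combined: Point's fields appended
    __slots__ = ("x", "y", "next")
    def __init__(self, x, y, nxt=None):
        self.x, self.y, self.next = x, y, nxt  # one allocation, no chase
```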


    Testing the "Inverted-U" Phenomenon in Moral Development on Recently Promoted Senior Managers and Partners,

    CONTEMPORARY ACCOUNTING RESEARCH, Issue 2 2004
    RICHARD A. BERNARDI
    Abstract This paper examines the change in the average level of moral development over a 7.5-year period of promotion, attrition, and survival in five Big 6 firms. The study improves upon previous cross-sectional studies that found decreases in the average level of moral development at the senior manager and partner levels, which has been referred to as the "inverted-U" phenomenon. Problems with these studies that limit the generalizability of their findings include their cross-sectional nature and samples that usually come from one or two firms. Over a 7.5-year period, we found that the participating Big 6 firms retained auditors with higher average levels of moral development (measured using the Defining Issues Test), while those with lower average levels left the firms. The average level of moral development for new partners was at least as high as that of the group from which they came. This research suggests that the concern about Big 6 firms retaining a higher proportion of auditors with lower moral development may be an artifact of research design. [source]


    Production Efficiency and the Pricing of Audit Services,

    CONTEMPORARY ACCOUNTING RESEARCH, Issue 1 2003
    Nicholas Dopuch
    Abstract In this paper, we examine the relative efficiency of audit production by one of the then Big 6 public accounting firms for a sample of 247 geographically dispersed audits of U.S. companies performed in 1989. To test the relative efficiency of audit production, we use both stochastic frontier estimation (SFE) and data envelopment analysis (DEA). A feature of our research is that we also test whether any apparent inefficiencies in production, identified using SFE and DEA, are correlated with audit pricing. That is, do apparent inefficiencies cause the public accounting firm to reduce its unit price (billing rate) per hour of labor utilized on an engagement? With respect to results, we do not find any evidence of relative (within-sample) inefficiencies in the use of partner, manager, senior, or staff labor hours using SFE. This suggests that the SFE model may not be sufficiently powerful to detect inefficiencies, even with our reasonably large sample size. However, we do find apparent inefficiencies using the DEA model. Audits range from about 74 percent to 100 percent relative efficiency in production, while the average audit is produced at about an 88 percent efficiency level, relative to the most efficient audits in the sample. Moreover, the inefficiencies identified using DEA are correlated with the firm's realization rate. That is, average billing rates per hour fall as the amount of inefficiency increases. Our results suggest that there are moderate inefficiencies in the production of many of the subject public accounting firm's audits, and that such inefficiencies are economically costly to the firm. [source]
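
    For reference, relative efficiency scores of the kind reported here (74 to 100 percent) are typically solutions of the input-oriented constant-returns DEA program; the abstract does not say which DEA variant the authors used, so take this as the textbook form, with inputs x the labor hours by rank (partner, manager, senior, staff) and outputs y characterizing the completed audit:

```latex
% Input-oriented CCR efficiency of engagement 0 among J engagements,
% with inputs x_{ij} (hours of rank i on engagement j) and outputs y_{rj}:
\[
\theta_0^{*} \;=\; \min_{\theta,\;\lambda \ge 0}\ \theta
\quad \text{s.t.} \quad
\sum_{j=1}^{J} \lambda_j\, x_{ij} \,\le\, \theta\, x_{i0} \;\;\forall i,
\qquad
\sum_{j=1}^{J} \lambda_j\, y_{rj} \,\ge\, y_{r0} \;\;\forall r .
\]
% theta* = 1 places the engagement on the efficient frontier; values below 1
% are within-sample inefficiencies of the kind the paper correlates with
% billing realization rates.
```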


    Audit Review: Managers' Interpersonal Expectations and Conduct of the Review,

    CONTEMPORARY ACCOUNTING RESEARCH, Issue 3 2002
    Michael Gibbins
    Abstract This paper presents an interpersonal model of audit file review centered on the audit manager. A manager's conduct of the review is affected by four components: the manager's expectations about the client, expectations about the preparer, expectations about the partner, and the manager's own approach and circumstances. The paper then presents a comprehensive field-based analysis of how a working paper review is conducted. It supplements the mostly experimental research on working paper review by reporting the results of a retrospective field questionnaire that asked audit managers to report on their behavior and their relationships with preparers and partners on actual audit engagements. The extent of review was sensitive to specific features of the client and the file (including risk factors), to features of the preparer, and particularly to the style of the reviewer, which was quite stable across cases. Although the evidence of managers' awareness of preparers' "stylizing" the file to suit the manager was weak, the evidence of managers' stylizing for the partners was pervasive, affecting both work done and documentation. Managers believed that good reviews emphasized key issues and risks rather than detail. Other new descriptive evidence on the nature of the review process is provided, including the purpose of the review process, how frequently surprises are found in the review process, and the qualities of good reviewers compared with poor reviewers. The implications of our model and our results for future research are outlined. [source]