Many Projects (many + project)

Selected Abstracts


Find project risk before it bites you

JOURNAL OF CORPORATE ACCOUNTING & FINANCE, Issue 4 2008
Timothy Iijima
Projects are a significant part of many companies' spending. Many projects are small and end quickly. But some can last for years and create surprising risk that can come back and bite you. To prevent that, you must audit those projects. The author reveals how to do it. © 2008 Wiley Periodicals, Inc. [source]


MASS LOAD ESTIMATION ERRORS UTILIZING GRAB SAMPLING STRATEGIES IN A KARST WATERSHED

JOURNAL OF THE AMERICAN WATER RESOURCES ASSOCIATION, Issue 6 2003
Alex W. Fogle
ABSTRACT: Developing a mass load estimation method appropriate for a given stream and constituent is difficult due to inconsistencies in hydrologic and constituent characteristics. The difficulty may be increased in flashy flow conditions such as karst. Many projects undertaken are constrained by budget and manpower and do not have the luxury of sophisticated sampling strategies. The objectives of this study were to: (1) examine two grab sampling strategies with varying sampling intervals and determine the error in mass load estimates, and (2) determine the error that can be expected when a grab sample is collected at a time of day when the diurnal variation is most divergent from the daily mean. Results show grab sampling with continuous flow to be a viable data collection method for estimating mass load in the study watershed. Comparing weekly, biweekly, and monthly grab sampling, monthly sampling produces the best results with this method. However, the time of day the sample is collected is important. Failure to account for diurnal variability when collecting a grab sample may produce unacceptable error in mass load estimates. The best time to collect a sample is when the diurnal cycle is nearest the daily mean. [source]
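
The flow-weighted (ratio) estimator is one common way to turn sparse grab-sample concentrations and a continuous flow record into a mass-load figure of the kind evaluated here. The sketch below only illustrates that general approach, not the authors' method: the unit conventions, the synthetic flow series, and the monthly sampling pattern are assumptions.

```python
"""Illustrative sketch (not the study's code) of a grab-sample mass-load
estimator that pairs sparse concentration samples with a continuous flow
record.  The ratio (flow-weighted) estimator and the synthetic data are
assumptions for illustration."""
import numpy as np

def mass_load_ratio(sample_conc_mg_L, sample_flow_m3_s,
                    continuous_flow_m3_s, dt_s):
    """Estimate total mass load (kg) over the monitored period.

    sample_conc_mg_L    : concentrations from grab samples (mg/L)
    sample_flow_m3_s    : flow at the moment each grab sample was taken (m^3/s)
    continuous_flow_m3_s: full continuous flow record (m^3/s)
    dt_s                : spacing of the continuous flow record (s)
    """
    sample_conc = np.asarray(sample_conc_mg_L, dtype=float)
    sample_flow = np.asarray(sample_flow_m3_s, dtype=float)

    # Flow-weighted mean concentration from the grab samples (mg/L == g/m^3).
    c_fw = np.sum(sample_conc * sample_flow) / np.sum(sample_flow)

    # Total discharge volume from the continuous record (m^3).
    volume_m3 = np.sum(continuous_flow_m3_s) * dt_s

    # Load in kg: (g/m^3) * m^3 = g, then divide by 1000.
    return c_fw * volume_m3 / 1000.0

# Example with a synthetic hourly flow record and roughly monthly grab samples.
rng = np.random.default_rng(0)
hourly_flow = 2.0 + rng.gamma(2.0, 1.0, size=24 * 365)       # m^3/s
grab_idx = np.arange(0, hourly_flow.size, 24 * 30)            # ~monthly grabs
grab_conc = 5.0 + rng.normal(0.0, 1.0, size=grab_idx.size)    # mg/L
load_kg = mass_load_ratio(grab_conc, hourly_flow[grab_idx], hourly_flow, 3600)
print(f"Estimated annual load: {load_kg:.0f} kg")
```

Re-running such an estimator on weekly, biweekly, and monthly subsets of the grab samples, and comparing against a reference load computed from dense data, is the style of error comparison the study reports.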


Environmental cleanup of the nation's former nuclear weapons sites: Unprecedented public-private challenges at the largest facilities

REMEDIATION, Issue 3 2006
Henry Mayer
In 1994, the U.S. Department of Energy (DOE) initiated a contract reform program intended to strengthen oversight capabilities and encourage the creation of contract and incentive structures, which would effectively facilitate the treatment of onsite contamination and waste. The remediation and disposal of these legacy wastes is the core of the Department's environmental management mission (Government Accountability Office [GAO], 2003). Despite a concerted effort toward achieving the goals of the reform, progress has been slow. Many projects continue to necessitate cost and time extensions above those originally agreed upon. Although the Department instituted an accelerated cleanup program in 2002, promising to shave some $50 billion and 35 years from its earlier cost and schedule projections, there have been delays in critical project areas that call into question the attainability of the proposed reductions (GAO, 2005). Numerous explanations have been offered as to why achieving these goals has proven so difficult, many of which have concluded that flawed contracting practices are to blame. This article concludes that the root of the problem is much deeper and that the organizational criticisms aimed at DOE are as much a legacy as the waste itself. Although the focus of this article is on large former nuclear weapons sites, these types of contracting and organizational issues are often found at other government and private complex hazardous waste sites. © 2006 Wiley Periodicals, Inc. [source]


Intelligent interaction design: the role of human-computer interaction research in the design of intelligent systems

EXPERT SYSTEMS, Issue 1 2001
Ann Blandford
As more intelligent systems are introduced into the marketplace, it is becoming increasingly urgent to consider usability for such systems. Historically, the two fields of artificial intelligence (AI) and human-computer interaction (HCI) have had little in common. In this paper, we consider how established HCI techniques can usefully be applied to the design and evaluation of intelligent systems, and where there is an urgent need for new approaches. Some techniques - notably those for requirements acquisition and empirical evaluation - can usefully be adopted, and indeed are, within many projects. However, many of the tools and techniques developed within HCI to support design and theory-based evaluation cannot be applied in their present forms to intelligent systems because they are based on inappropriate assumptions; there is consequently a need for new approaches. Conversely, there are approaches that have been developed within AI - e.g. in research on dialogue and on ontologies - that could usefully be adapted and encapsulated to respond to this need. These should form the core of a future research agenda for intelligent interaction design. [source]


Estimation of origin-destination trip rates in Leicester

JOURNAL OF THE ROYAL STATISTICAL SOCIETY: SERIES C (APPLIED STATISTICS), Issue 4 2001
Martin L. Hazelton
The road system in region RA of Leicester has vehicle detectors embedded in many of the network's road links. Vehicle counts from these detectors can provide transportation researchers with a rich source of data. However, for many projects it is necessary for researchers to have an estimate of origin-to-destination vehicle flow rates. Obtaining such estimates from data observed on individual road links is a non-trivial statistical problem, made more difficult in the present context by non-negligible measurement errors in the vehicle counts collected. The paper uses road link traffic count data from April 1994 to estimate the origin-destination flow rates for region RA. A model for the error prone traffic counts is developed, but the resulting likelihood is not available in closed form. Nevertheless, it can be smoothly approximated by using Monte Carlo integration. The approximate likelihood is combined with prior information from a May 1991 survey in a Bayesian framework. The posterior is explored using the Hastings-Metropolis algorithm, since its normalizing constant is not available. Preliminary findings suggest that the data are overdispersed according to the original model. Results for a revised model indicate that a degree of overdispersion exists, but that the estimates of origin-destination flow rates are quite insensitive to the change in model specification. [source]
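
The combination the abstract describes, a likelihood approximated by Monte Carlo integration and a posterior explored with the Hastings-Metropolis algorithm, can be illustrated on a toy network. The sketch below is not the paper's model: the routing matrix, the Poisson-volume-plus-Gaussian-measurement-error structure, the exponential prior, and all tuning constants are invented for illustration.

```python
"""Toy sketch (not the paper's model) of origin-destination (OD) flow rates
inferred from error-prone link counts: the intractable likelihood is
approximated by Monte Carlo and the posterior is explored with a random-walk
Metropolis-Hastings sampler."""
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical tiny network: 3 OD pairs routed over 4 detector-equipped links.
A = np.array([[1, 0, 1],
              [1, 1, 0],
              [0, 1, 1],
              [0, 0, 1]], dtype=float)          # link x OD routing matrix
true_theta = np.array([30.0, 20.0, 10.0])        # true OD rates (veh/period)
sigma = 5.0                                      # detector measurement error

# Simulated observations: Poisson OD volumes plus Gaussian detector noise.
x_true = rng.poisson(true_theta)
y_obs = A @ x_true + rng.normal(0.0, sigma, size=A.shape[0])

def log_lik_mc(theta, n_sim=200):
    """Monte Carlo estimate of log p(y | theta), integrating out the latent
    OD volumes x ~ Poisson(theta)."""
    x = rng.poisson(theta, size=(n_sim, theta.size))      # latent OD volumes
    mu = x @ A.T                                          # implied link flows
    log_p = -0.5 * np.sum(((y_obs - mu) / sigma) ** 2, axis=1)
    m = log_p.max()                                       # stable log-mean-exp
    return m + np.log(np.mean(np.exp(log_p - m)))

def log_prior(theta):
    """Weakly informative exponential prior on the OD rates."""
    return -np.sum(theta) / 100.0 if np.all(theta > 0) else -np.inf

# Random-walk Metropolis over the OD rates.
theta = np.full(3, 15.0)
log_post = log_lik_mc(theta) + log_prior(theta)
samples = []
for _ in range(5000):
    prop = theta + rng.normal(0.0, 1.0, size=3)
    lp_prior = log_prior(prop)
    if np.isfinite(lp_prior):                             # skip invalid rates
        lp = log_lik_mc(prop) + lp_prior
        if np.log(rng.random()) < lp - log_post:
            theta, log_post = prop, lp
    samples.append(theta.copy())

print("Posterior mean OD rates:", np.mean(samples[1000:], axis=0).round(1))
```

Because the likelihood estimate for the current state is retained between iterations, this behaves as a pseudo-marginal-style sampler; in practice the number of simulated draws per evaluation and the proposal scale would both need tuning.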


Diffraction cartography: applying microbeams to macromolecular crystallography sample evaluation and data collection

ACTA CRYSTALLOGRAPHICA SECTION D, Issue 8 2010
Matthew W. Bowler
Crystals of biological macromolecules often exhibit considerable inter-crystal and intra-crystal variation in diffraction quality. This requires the evaluation of many samples prior to data collection, a practice that is already widespread in macromolecular crystallography. As structural biologists move towards tackling ever more ambitious projects, new automated methods of sample evaluation will become crucial to the success of many projects, as will the availability of synchrotron-based facilities optimized for high-throughput evaluation of the diffraction characteristics of samples. Here, two examples of the types of advanced sample evaluation that will be required are presented: searching within a sample-containing loop for microcrystals using an X-ray beam of 5 µm diameter and selecting the most ordered regions of relatively large crystals using X-ray beams of 5-50 µm in diameter. A graphical user interface developed to assist with these screening methods is also presented. For the case in which the diffraction quality of a relatively large crystal is probed using a microbeam, the usefulness and implications of mapping diffraction-quality heterogeneity (diffraction cartography) are discussed. The implementation of these techniques in the context of planned upgrades to the ESRF's structural biology beamlines is also presented. [source]
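
As a rough picture of what such a scan involves, the sketch below rasters a beam position over a grid in 5 µm steps, scores each short exposure, and reports the best-diffracting point; the resulting quality map is the kind of "diffraction cartography" the article discusses. It is not the ESRF beamline software: collect_image and count_bragg_spots are hypothetical stand-ins for the detector acquisition and spot-finding steps, and the synthetic signal is invented.

```python
"""Schematic sketch (not the beamline software described in the article) of a
diffraction-cartography style mesh scan: raster a microbeam over a grid of
sample positions, score the diffraction at each point, and pick the
best-diffracting region."""
import numpy as np

def collect_image(x_um, y_um):
    """Placeholder for acquiring a short diffraction exposure at (x, y).
    Returns a synthetic 'spot count' peaked near the centre of a crystal."""
    rng = np.random.default_rng(hash((round(x_um), round(y_um))) % 2**32)
    signal = 120 * np.exp(-((x_um - 40) ** 2 + (y_um - 25) ** 2) / 400.0)
    return max(0.0, signal + rng.normal(0.0, 5.0))

def count_bragg_spots(image_score):
    """Placeholder scoring step; a real pipeline would run a spot finder."""
    return image_score

def mesh_scan(x_range_um, y_range_um, step_um=5.0):
    """Raster the beam over a grid and return positions plus a quality map."""
    xs = np.arange(*x_range_um, step_um)
    ys = np.arange(*y_range_um, step_um)
    quality = np.zeros((ys.size, xs.size))
    for i, y in enumerate(ys):
        for j, x in enumerate(xs):
            quality[i, j] = count_bragg_spots(collect_image(x, y))
    return xs, ys, quality

# Scan an 80 x 50 µm region in 5 µm steps and report the best position.
xs, ys, quality = mesh_scan((0, 80), (0, 50), step_um=5.0)
best_i, best_j = np.unravel_index(np.argmax(quality), quality.shape)
print(f"Best-diffracting position: x={xs[best_j]:.0f} µm, y={ys[best_i]:.0f} µm, "
      f"score={quality[best_i, best_j]:.0f}")
```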