Performance Metrics (performance + metric)



Selected Abstracts


Integrated Environmental and Financial Performance Metrics for Investment Analysis and Portfolio Management

CORPORATE GOVERNANCE, Issue 3 2007
Simon Thomas
This paper introduces a new measure, based on a study by Trucost and Dr Robert Repetto, combining external environmental costs with established measures of economic value added, and demonstrates how this measure can be incorporated into financial analysis. We propose that external environmental costs are relevant to all investors: universal investors are concerned about the scale of external costs whether or not regulations to internalise them are likely; mainstream investors need to understand external costs as an indication of future regulatory compliance costs; and SRI investors need to evaluate companies on both financial and social performance. The paper illustrates our new measure with data from US electric utilities and illustrates how the environmental exposures of different fund managers and portfolios can be compared. With such measures fund managers can understand and control portfolio-wide environmental risks, demonstrate their environmental credentials quantitatively and objectively and compete for the increasing number of investment mandates that have an environmental component. [source]


Tunable scheduling in a GridRPC framework

CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 9 2008
A. Amar
Abstract Among existing grid middleware approaches, one simple, powerful, and flexible approach consists of using servers available in different administrative domains through the classic client–server or remote procedure call paradigm. Network Enabled Servers (NES) implement this model, also called GridRPC. Clients submit computation requests to a scheduler, whose goal is to find a server available on the grid using some performance metric. The aim of this paper is to give an overview of a NES middleware developed in the GRAAL team called the Distributed Interactive Engineering Toolbox (DIET) and to describe recent developments around plug-in schedulers, workflow management, and tools. DIET is a hierarchical set of components used for the development of applications based on computational servers on the grid. Copyright © 2007 John Wiley & Sons, Ltd. [source]


Bank and Business Performance Measurement

ECONOMIC NOTES, Issue 2 2002
Stuart M. Turnbull
Given the objective of maximizing the wealth of existing shareholders, this paper discusses some of the issues that arise in attempting to measure the performance of individual businesses within a bank. The paper describes two return measures – return on assets within a business and the return on the 'equity' of an individual business – and discusses the appropriate benchmarks. The paper ends with a discussion of the cost of unused allocated capital and the appropriate performance metric. (J.E.L.: G30, G31). [source]


Risk-Based Capital and Credit Insurance Portfolios

FINANCIAL MARKETS, INSTITUTIONS & INSTRUMENTS, Issue 1 2010
Van Son Lai
This paper analyzes the risk-management practices of a vulnerable credit insurer by studying the effects of time-varying correlations, asset risks and loan maturities on the risk-based capital that backs credit insurance portfolios. Since asset correlations may change over a business cycle, we have analyzed these effects by means of a one-factor Gaussian stochastic model as part of an extended contingent claims analysis. Our results show the need to account for cyclical changes to correlations in the pricing of credit insurance. When compared with the reserve of risk-based capital recommended by the Basel II Internal Ratings-Based (IRB) approach, our model provides a better capital buffer against extreme credit losses, especially in times of recession and/or in a risky business environment. Using a risk-adjusted performance metric (RAPM), we find insurers perform better when insuring relatively short-term loans. We also make several policy recommendations on creating a reserve of risk-based capital to protect against possible loan losses. [source]
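
As a rough numerical illustration of how a risk-adjusted performance metric favours portfolios that need a smaller capital buffer, the sketch below computes a generic RAROC-style ratio. The formula and the portfolio figures are illustrative assumptions, not the RAPM specification or the data used in the paper.

```python
def rapm(premium_income, expected_loss, expenses, risk_based_capital,
         risk_free_rate=0.03):
    """Generic RAROC-style risk-adjusted performance metric:
    (net income + return earned on the capital buffer) / risk-based capital."""
    net_income = premium_income - expected_loss - expenses
    return (net_income + risk_free_rate * risk_based_capital) / risk_based_capital

# Illustrative comparison: a portfolio of shorter-term loans needs a smaller
# capital buffer against extreme losses, which lifts the metric.
print(rapm(premium_income=12.0, expected_loss=5.0, expenses=2.0,
           risk_based_capital=60.0))   # longer-term, riskier portfolio
print(rapm(premium_income=10.0, expected_loss=4.5, expenses=2.0,
           risk_based_capital=35.0))   # shorter-term portfolio
```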


Ontogeny of escape swimming performance in the spotted salamander

FUNCTIONAL ECOLOGY, Issue 3 2010
Tobias Landberg
Summary 1. The life stage suffering the highest predation rate is expected to have the highest escape performance unless developmental or functional constraints interfere. Peak aquatic escape performance in ephemeral pond-breeding amphibians is expected to develop early in the larval period, and metamorphosis is expected to reduce or completely disrupt aquatic escape performance. In anurans, exceptionally low escape performance during metamorphosis creates selection favouring rapid metamorphosis, which minimizes the time individuals spend in the vulnerable transition between tadpole and frog. 2. We investigated the development of aquatic escape performance in the spotted salamander, Ambystoma maculatum (Shaw, 1802), from embryonic development through metamorphosis. We expected performance to peak early in the larval period, as hatchlings face high rates of predation but embryos must first develop escape behaviours. We also tested whether escape performance during metamorphosis was intermediate, as predicted by tail fin resorption, or lower than in larvae and adults, indicating a major physiological disruption. 3. Escape performance shows a complex ontogeny that is first positively influenced by embryonic and early larval development and then negatively correlated with tail resorption and body size. Escape distance was the only performance metric not affected by life stage. In contrast, both escape velocity and duration showed ontogenetic peaks early in the larval period, with the lowest performance found in early embryos and adults and intermediate performance during metamorphosis. 4. This pattern suggests that metamorphosis does not impose a major physiological disruption on escape performance. Because spotted salamanders do not pass through a frog-like 'ontogenetic performance valley' during metamorphosis, they may be less subject than anurans to selection favouring rapid metamorphosis. 5. Functional implications of phenotypic variation should be considered in an ontogenetic framework because the relationship between body size and escape performance can be reversed on either side of an ontogenetic performance peak. The assumption that metamorphosis radically disrupts basic functions such as predator evasion does not seem universally warranted and suggests examination of ontogenetic performance trajectories in a diversity of animals with complex life cycles. [source]


Error resilient data transport in sensor network applications: A generic perspective

INTERNATIONAL JOURNAL OF CIRCUIT THEORY AND APPLICATIONS, Issue 2 2009
Rachit Agarwal
Abstract The error recovery problem in wireless sensor networks is studied from a generic resource-constrained energy-optimization perspective. To characterize the features of error recovery schemes that suit the majority of applications, an energy model is developed and inferences are drawn based on a suitable performance metric. For applications that require error control coding, an efficient scheme is proposed based on an interesting observation related to shortened Reed–Solomon (RS) codes for packet reliability. It is shown that combining multiple instances of RS codes defined on a smaller alphabet with interleaving results in smaller resource usage, while the performance exceeds the benefits of a shortened RS code defined over a larger alphabet. In particular, the proposed scheme can have an error correction capability up to as many times larger as the number of interleaved instances, relative to the conventional RS scheme, without changing the rate of the code and with much lower power, timing and memory requirements. Implementation results show that such a scheme is 43% more power efficient compared with the RS scheme with the same code rate. Besides, such an approach results in 46% faster computations and 53% reduction in memory requirements. Copyright © 2008 John Wiley & Sons, Ltd. [source]
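
The burst-error advantage of interleaving several short codewords can be illustrated without implementing RS arithmetic itself. The sketch below is a minimal, hypothetical example: it only interleaves symbols and applies a per-codeword error budget t, standing in for the correction capability of each short RS code; the function names and the budget-checking rule are assumptions for illustration, not the scheme proposed in the paper.

```python
def interleave(codewords):
    """Column-wise interleave symbols from several equal-length codewords."""
    return [cw[i] for i in range(len(codewords[0])) for cw in codewords]

def deinterleave(stream, n_codewords):
    """Inverse of interleave()."""
    return [stream[i::n_codewords] for i in range(n_codewords)]

def burst_correctable(burst_len, n_codewords, t_per_codeword):
    """A contiguous burst of burst_len symbol errors in the interleaved stream
    lands on at most ceil(burst_len / n_codewords) symbols of any single
    codeword, so it is correctable if that share stays within each code's
    correction capability t."""
    worst_per_codeword = -(-burst_len // n_codewords)   # ceiling division
    return worst_per_codeword <= t_per_codeword

# Round-trip check of the interleaver on four toy codewords.
cws = [list("AAAA"), list("BBBB"), list("CCCC"), list("DDDD")]
assert deinterleave(interleave(cws), 4) == cws

# Four interleaved codewords, each correcting t = 2 symbol errors, jointly
# tolerate a burst of up to 8 consecutive interleaved symbols.
print(burst_correctable(burst_len=8, n_codewords=4, t_per_codeword=2))   # True
print(burst_correctable(burst_len=9, n_codewords=4, t_per_codeword=2))   # False
```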


Integration of mobility and intrusion detection for wireless ad hoc networks

INTERNATIONAL JOURNAL OF COMMUNICATION SYSTEMS, Issue 6 2007
Bo Sun
Abstract One of the main challenges in building intrusion detection systems (IDSs) for mobile ad hoc networks (MANETs) is to integrate mobility impacts and to adjust the behaviour of IDSs correspondingly. In this paper, we first introduce two different approaches, a Markov chain-based approach and a Hotelling's T² test-based approach, to construct local IDSs for MANETs. We then demonstrate that nodes' moving speed, a commonly used parameter in tuning IDS performance, is not an effective metric for tuning IDS performance under different mobility models. To solve this problem, we further propose an adaptive scheme, in which suitable normal profiles and corresponding proper thresholds can be selected adaptively by each local IDS through periodically measuring its local link change rate, a proposed unified performance metric. We study the proposed adaptive mechanism at different mobility levels, using different mobility models such as the random waypoint model, the random drunken model, and the obstacle mobility model. Simulation results show that our proposed adaptive scheme is less dependent on the underlying mobility models and can further reduce the false positive ratio. Copyright © 2006 John Wiley & Sons, Ltd. [source]


Flexible design-planning of supply chain networks

AICHE JOURNAL, Issue 7 2009
José Miguel Laínez
Abstract Nowadays, market competition is essentially associated with supply chain (SC) improvement. Therefore, the locus of value creation has shifted to the chain network. The strategic decision of determining the optimal SC network structure plays a vital role in the later optimization of SC operations. This work focuses on the design and retrofit of SCs. Traditional approaches available in the literature addressing this problem usually take a rigid predefined network structure as their point of departure, which may restrict the opportunities for adding business value. Instead, a novel flexible formulation approach, which translates a recipe representation to the SC environment, is proposed to solve the challenging design-planning problem of SC networks. The resulting mixed integer linear programming model aims to achieve the best NPV as the key performance metric. The potential of the presented approach is highlighted through illustrative examples of increasing complexity, where results of traditional rigid approaches and those offered by the flexible framework are compared. The implications of exploiting this potential flexibility to improve SC performance are highlighted and are the subject of our further research work. © 2009 American Institute of Chemical Engineers AIChE J, 2009 [source]


Modeling and Managing the Percentage of Satisfied Customers in Hidden and Revealed Waiting Line Systems

PRODUCTION AND OPERATIONS MANAGEMENT, Issue 1 2006
Chester Chambers
We perform an analysis of various queueing systems with an emphasis on estimating a single performance metric. This metric is defined to be the percentage of customers whose actual waiting time was less than their individual waiting time threshold. We label this metric the Percentage of Satisfied Customers (PSC). This threshold is a reflection of the customers' expectation of a reasonable waiting time in the system given its current state. Cases in which no system state information is available to the customer are referred to as "hidden queues." For such systems, the waiting time threshold is independent of the length of the waiting line, and it is randomly drawn from a distribution of threshold values for the customer population. The literature generally assumes that such thresholds are exponentially distributed. For these cases, we derive closed form expressions for our performance metric for a variety of possible service time distributions. We also relax this assumption for cases where service times are exponential and derive closed form results for a large class of threshold distributions. We analyze such queues for both single and multi-server systems. We refer to cases in which customers may observe the length of the line as "revealed queues." We perform a parallel analysis for both single and multi-server revealed queues. The chief distinction is that for these cases, customers may develop threshold values that are dependent upon the number of customers in the system upon their arrival. The new perspective this paper brings to the modeling of the performance of waiting line systems allows us to rethink and suggest ways to enhance the effectiveness of various managerial options for improving the service quality and customer satisfaction of waiting line systems. We conclude with many useful insights on ways to improve customer satisfaction in waiting line situations that follow directly from our analysis. [source]
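
As a minimal, textbook illustration of the hidden-queue case (not the closed forms derived in the paper), the sketch below computes the PSC for a single-server M/M/1 queue whose customers draw exponentially distributed waiting-time thresholds, and checks the standard result PSC = 1 - ρα/(α + μ - λ) against a Lindley-recursion simulation. The rates lam, mu and alpha are illustrative parameters.

```python
import random

def psc_closed_form(lam, mu, alpha):
    """PSC for an M/M/1 'hidden' queue with Exp(alpha) patience thresholds:
    P(Wq < T) = 1 - rho * alpha / (alpha + mu - lam), with rho = lam / mu."""
    rho = lam / mu
    return 1.0 - rho * alpha / (alpha + mu - lam)

def psc_simulated(lam, mu, alpha, n_customers=200_000, seed=1):
    """Estimate PSC by simulating successive queue waits (Lindley recursion)."""
    rng = random.Random(seed)
    wq, satisfied = 0.0, 0
    for _ in range(n_customers):
        threshold = rng.expovariate(alpha)          # this customer's patience
        if wq < threshold:
            satisfied += 1
        service = rng.expovariate(mu)
        interarrival = rng.expovariate(lam)
        wq = max(0.0, wq + service - interarrival)  # wait of the next arrival
    return satisfied / n_customers

if __name__ == "__main__":
    lam, mu, alpha = 0.8, 1.0, 0.5                  # illustrative rates
    print("closed form:", round(psc_closed_form(lam, mu, alpha), 4))
    print("simulation :", round(psc_simulated(lam, mu, alpha), 4))
```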


Do Older Rural and Urban Veterans Experience Different Rates of Unplanned Readmission to VA and Non-VA Hospitals?

THE JOURNAL OF RURAL HEALTH, Issue 1 2009
William B. Weeks MD
ABSTRACT: Context: Unplanned readmission within 30 days of discharge is an indicator of hospital quality. Purpose: We wanted to determine whether older rural veterans who were enrolled in the VA had different rates of unplanned readmission to VA or non-VA hospitals than their urban counterparts. Methods: We used the combined VA/Medicare dataset to examine 3,513,912 hospital admissions for older veterans that occurred in VA or non-VA hospitals between 1997 and 2004. We calculated 30-day readmission rates and odds ratios for rural and urban veterans, and we performed a logistic regression analysis to determine whether living in a rural setting or initially using the VA for hospitalization were independent risk factors for unplanned 30-day readmission, after adjusting for age, sex, length of stay of the index admission, and morbidity. Findings: Overall, rural veterans had slightly higher 30-day readmission rates than their urban counterparts (17.96% vs 17.86%; OR 1.006, 95% CI: 1.0004, 1.013). For both rural- and urban-dwelling veterans, readmission after using a VA hospital was more common than after using a non-VA hospital (20.7% vs 16.8% for rural veterans, 21.2% vs 16.1% for urban veterans). After adjusting for other variables, readmission was more likely for rural veterans and following admission to a VA hospital. Conclusions: Our findings suggest that VA should consider using the unplanned readmission rate as a performance metric, using the non-VA experience of veterans as a performance benchmark, and helping rural veterans select higher performing non-VA hospitals. [source]
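
For readers unfamiliar with the reported effect size, the sketch below shows how an odds ratio and its 95% confidence interval are obtained from a 2×2 readmission table. Only the 17.96% and 17.86% rates come from the abstract; the rural/urban split of the 3,513,912 admissions is not given, so the counts used here are hypothetical.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% CI from a 2x2 table:
    a/b = readmitted/not readmitted (rural), c/d = same (urban)."""
    or_ = (a / b) / (c / d)
    se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo = math.exp(math.log(or_) - z * se_log_or)
    hi = math.exp(math.log(or_) + z * se_log_or)
    return or_, (lo, hi)

# Hypothetical counts chosen to match the abstract's 17.96% vs 17.86% rates;
# the true rural/urban split of the 3,513,912 admissions is not reported here.
rural_total, urban_total = 1_000_000, 2_513_912
a = round(0.1796 * rural_total); b = rural_total - a
c = round(0.1786 * urban_total); d = urban_total - c
print(odds_ratio_ci(a, b, c, d))   # OR close to the reported 1.006
```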


MUTUAL VERSUS PROPRIETARY OWNERSHIP: AN EMPIRICAL STUDY FROM THE UK UNIT TRUST INDUSTRY WITH A COMPANY-PRODUCT MEASURE

ANNALS OF PUBLIC AND COOPERATIVE ECONOMICS, Issue 2 2010
Yoshikatsu Shinozawa
ABSTRACT: In the debate over the relative merits of differing ownership forms, most empirical studies examine either corporate performance or the product characteristics of the financial products that are available in the financial services industry. Based on the UK unit trust industry, this paper assesses which ownership form, mutual or proprietary, is more efficient in managing unit trust operations and providing unit trusts that generate high returns. Using a combined corporate performance and product range performance metric, this study reveals no significant differences between the two ownership forms in terms of the corporate-product performance score. The results indicate that the owner-customer fused role in the mutual organization must be considered in the mutual versus proprietary ownership debate. [source]


A comparative study of awareness methods for peer-to-peer distributed virtual environments

COMPUTER ANIMATION AND VIRTUAL WORLDS (PREV: JNL OF VISUALISATION & COMPUTER ANIMATION), Issue 5 2008
S. Rueda
Abstract The increasing popularity of multi-player online games is leading to the widespread use of large-scale Distributed Virtual Environments (DVEs) nowadays. In these systems, peer-to-peer (P2P) architectures have been proposed as an efficient and scalable solution for supporting massively multi-player applications. However, the main challenge for P2P architectures consists of providing each avatar with updated information about which other avatars are its neighbors. This problem is known as the awareness problem. In this paper, we propose a comparative study of the performance provided by those awareness methods that are supposed to fully solve the awareness problem. This study is performed using well-known performance metrics in distributed systems. Moreover, while the evaluations shown in the literature are performed by executing P2P simulations on a single (sequential) computer, this paper evaluates the performance of the considered methods on actually distributed systems. The evaluation results show that only a single method actually provides full awareness to avatars. This method also provides the best performance results. Copyright © 2008 John Wiley & Sons, Ltd. [source]


Performance and effectiveness trade-off for checkpointing in fault-tolerant distributed systems

CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 1 2007
Panagiotis Katsaros
Abstract Checkpointing has a crucial impact on systems' performance and fault-tolerance effectiveness: excessive checkpointing results in performance degradation, while deficient checkpointing incurs expensive recovery. In distributed systems with independent checkpoint activities there is no easy way to determine checkpoint frequencies optimizing response-time and fault-tolerance costs at the same time. The purpose of this paper is to investigate the potentialities of a statistical decision-making procedure. We adopt a simulation-based approach for obtaining performance metrics that are afterwards used for determining a trade-off between checkpoint interval reductions and efficiency in performance. Statistical methodology including experimental design, regression analysis and optimization provides us with the framework for comparing configurations, which use possibly different fault-tolerance mechanisms (replication-based or message-logging-based). Systematic research also allows us to take into account additional design factors, such as load balancing. The method is described in terms of a standardized object replication model (OMG FT-CORBA), but it could also be applied in other (e.g. process-based) computational models. Copyright © 2006 John Wiley & Sons, Ltd. [source]
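
A toy analogue of the simulate-fit-optimize procedure described above might look like the sketch below: hypothetical mean response times observed at several checkpoint intervals are fitted with a quadratic response-surface model, and the fitted minimum suggests an interval. The data points and the quadratic-in-log model are illustrative assumptions, not the paper's experimental design or fault-tolerance configurations.

```python
import numpy as np

# Hypothetical simulation output: mean response time (ms) observed at several
# checkpoint intervals (s). Frequent checkpoints add overhead, rare ones
# inflate recovery cost, so the curve is roughly convex.
intervals = np.array([5, 10, 20, 40, 80, 160], dtype=float)
response  = np.array([410, 325, 290, 284, 310, 368], dtype=float)

# Fit a quadratic response-surface model on the log of the interval and locate
# its minimizer, mirroring the regression-plus-optimization step sketched above.
a, b, c = np.polyfit(np.log(intervals), response, deg=2)
best_log = -b / (2 * a)                      # vertex of the fitted parabola
print("suggested checkpoint interval ≈ %.1f s" % np.exp(best_log))
```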


SCALEA: a performance analysis tool for parallel programs

CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 11-12 2003
Hong-Linh Truong
Abstract Many existing performance analysis tools lack the flexibility to control instrumentation and performance measurement for code regions and performance metrics of interest. Performance analysis is commonly restricted to single experiments. In this paper we present SCALEA, which is a performance instrumentation, measurement, analysis, and visualization tool for parallel programs that supports post-mortem performance analysis. SCALEA currently focuses on performance analysis for OpenMP, MPI, HPF, and mixed parallel programs. It computes a variety of performance metrics based on a novel classification of overhead. SCALEA also supports multi-experiment performance analysis that allows one to compare and to evaluate the performance outcome of several experiments. A highly flexible instrumentation and measurement system is provided which can be controlled by command-line options and program directives. SCALEA can be interfaced by external tools through the provision of a full Fortran 90 OpenMP/MPI/HPF frontend that allows one to instrument an abstract syntax tree at a very high level with C-function calls and to generate source code. A graphical user interface is provided to view a large variety of performance metrics at the level of arbitrary code regions, threads, processes, and computational nodes for single- and multi-experiment performance analysis. Copyright © 2003 John Wiley & Sons, Ltd. [source]


Using Buyer–Supplier Performance Frontiers to Manage Relationship Performance

DECISION SCIENCES, Issue 1 2009
Anthony D. Ross
ABSTRACT This article presents a consensus-building methodology to implement dyadic performance measurement. It focuses on transmuting supplier performance and buyer performance metrics on several important attributes into actionable relationship management plans using Clark's (1996) theory of performance frontiers. Access to the supplier performance management program of a Fortune 100 corporation was granted to the research team. Direct observation of practice and in-depth discussions with several managers provided a roadmap for investigating both the literature on quantitative evaluation methods and the empirically derived theory on buyer–supplier relationships from several perspectives. This study describes a multiphase, iterative framework that uses current methods and theory on dyadic buyer–supplier evaluation to consider: (i) evaluation criteria and their importance; (ii) whether the improvement focus should be on strengths, weaknesses, or both; and (iii) whether the referent role supplier should be the ideal supplier, best supplier, or best-in-strategic-group supplier in the focal supply base. We illustrate a unifying approach by reporting results from a large buyer and 35 of its key suppliers. This research makes the case for managing supplier relationships through the dyadic performance lens. The outputs from this framework provide individual supplier improvement paths, which are actionable prescriptions for each buyer–supplier dyad, as well as recommendations for strategic group formation. [source]


Making the case for objective performance metrics in newborn screening by tandem mass spectrometry

DEVELOPMENTAL DISABILITIES RESEARCH REVIEW, Issue 4 2006
Piero Rinaldo
Abstract The expansion of newborn screening programs to include multiplex testing by tandem mass spectrometry requires understanding and close monitoring of performance metrics. This is not done consistently because of a lack of defined targets, and interlaboratory comparison is almost nonexistent. Between July 2004 and April 2006 (N = 176,185 cases), the overall performance metrics of the Minnesota program, limited to MS/MS testing, were as follows: detection rate 1:1,816, positive predictive value 37% (54% in 2006 to date), and false positive rate 0.09%. The repeat rate and the proportion of cases with abnormal findings that are actually reported are new metrics proposed here as an objective means to express the overall noise in a program, where noise is defined as the total number of abnormal results obtained using a given set of cut-off values. On the basis of our experience, we propose the following targets as evidence of adequate analytical and postanalytical performance: detection rate 1:3,000 or higher, positive predictive value >20%, and false positive rate <0.3%. © 2006 Wiley-Liss, Inc. MRDD Research Reviews 2006;12:255–261. [source]
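
The Minnesota figures quoted above are mutually consistent, as the short calculation below shows; the true-positive and total-report counts (97 and 262) are back-calculated approximations from the reported detection rate and positive predictive value, not figures quoted in the paper.

```python
def screening_metrics(n_screened, true_positives, total_positive_reports):
    """Detection rate, positive predictive value (PPV) and false positive rate,
    following the definitions used in the abstract."""
    false_positives = total_positive_reports - true_positives
    return {
        "detection rate": f"1:{round(n_screened / true_positives)}",
        "PPV": true_positives / total_positive_reports,
        "false positive rate": false_positives / n_screened,
    }

# Roughly reproduces the cited figures (N = 176,185; detection 1:1,816;
# PPV 37%; false positive rate about 0.09%).
print(screening_metrics(n_screened=176_185, true_positives=97,
                        total_positive_reports=262))
```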


Lateralised motor behaviour leads to increased unevenness in front feet and asymmetry in athletic performance in young mature Warmblood horses

EQUINE VETERINARY JOURNAL, Issue 5 2010
M. C. V. Van HEEL
Summary Reason for performing study: Foot stance in grazing significantly influences hoof conformation and development from foal to yearling age. Objectives: To conduct a longitudinal study to establish whether the relationship between motor laterality and uneven front feet persisted in 3-year-old horses at the time of studbook selection, and to investigate whether such laterality and unevenness might influence the horses' ability to perform symmetrically while trotting, cantering and free jumping. Methods: Seventeen clinically sound but untrained Warmblood horses (with only minimal experience of handling) that had participated in a previous study were assessed as per the protocol reported. Laterality was tested in a preference test (PT) and z-values were calculated for analysis purposes. Laterality and hoof unevenness were related to both relative limb length and relative head size, while the ability to perform symmetrically was tested in free trot-canter transitions and free jumping exercises. Differences in performance metrics between horses with and without a limb preference in the PT, and between those with 'uneven' and 'even' feet, were tested using Student's t test, while linearity was tested using a regression analysis (P<0.05). Results: Significant laterality was still present in 24% of the 3-year-old horses and the relationship between laterality and uneven feet pairs was stronger than at the foal and yearling stages. Horses with significant motor laterality had almost 4 times more unevenness, a smaller head and longer limbs, and the relationship between body conformation and laterality was still present. There was a strong linear relation between unevenness, laterality and a bias or side preference for trot-canter transitions. However, this relationship was not significant during the free jumping exercise. Conclusion: Motor laterality and uneven feet pairs were still present and significantly related in the 3-year-old horses, and both variables were also strongly related to sidedness in trot-canter transitions. Potential relevance: Warmblood studbooks should include quantitative data on laterality at the time of studbook admission as part of the selection criteria. [source]


Analytical modelling of users' behaviour and performance metrics in key distribution schemes

EUROPEAN TRANSACTIONS ON TELECOMMUNICATIONS, Issue 1 2010
Massimo Tornatore
Access control for group communications must ensure that only legitimate users can access the authorised data streams. This could be done by distributing an encrypting key to each member of the group to be secured. To achieve a high level of security, the group key should be changed every time a user joins or leaves the group, so that a former group member has no access to current communications and a new member has no access to previous communications. Since group memberships could be very dynamic, the group key should be changed frequently. So far, different schemes for efficient key distribution have been proposed to limit the key-distribution overhead. In previous works, the performance comparison among these different schemes has been based on simulative experiments, where users join and leave secure groups according to a basic statistical model of users' behaviour. In this paper, we propose a new statistical model to account for the behaviour of users and compare it to the modelling approach so far adopted in the literature. Our new model is able to lead the system to a steady state (allowing superior statistical confidence in the results), as opposed to current models in which the system is permanently in a transient and diverging state. We also provide analytical formulations of the main performance metrics usually adopted to evaluate key distribution systems, such as rekey overheads and storage overheads. Then, we validate our simulative outcomes with results obtained by analytical formulations. Copyright © 2009 John Wiley & Sons, Ltd. [source]
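
As one concrete example of the kind of analytical overhead formulation mentioned above, the sketch below evaluates the standard rekey and storage costs of a balanced binary logical key hierarchy (LKH) against a flat per-member scheme. These are textbook approximations (assuming a full balanced key tree) and not necessarily the specific schemes or formulas compared in the paper.

```python
import math

def lkh_overheads(n_members):
    """Approximate overheads for a balanced binary logical key hierarchy:
    keys stored per member, keys stored at the server, and the rekey-message
    cost of one departure, versus O(n) messages for a flat scheme."""
    depth = math.ceil(math.log2(max(n_members, 2)))
    return {
        "keys per member": depth + 1,          # keys on the path from leaf to root
        "keys at server": 2 * n_members - 1,   # every node of a full key tree
        "rekey msgs per leave (LKH)": 2 * depth,
        "rekey msgs per leave (flat)": n_members - 1,
    }

for n in (1_000, 100_000):
    print(n, lkh_overheads(n))
```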


Circle hooks, 'J' hooks and drop-back time: a hook performance study of the south Florida recreational live-bait fishery for sailfish, Istiophorus platypterus

FISHERIES MANAGEMENT & ECOLOGY, Issue 2 2007
E. D. PRINCE
Abstract This study evaluates the performance of two types of non-offset circle hooks (traditional and non-traditional) and a similar-sized 'J' hook commonly used in the south Florida recreational live-bait fishery for Atlantic sailfish, Istiophorus platypterus (Shaw). A total of 766 sailfish were caught off south Florida (Jupiter to Key West, FL, USA) to assess hook performance and drop-back time, which is the interval between the fish's initial strike and exertion of pressure by the fisher to engage the hook. Four drop-back intervals were examined (0–5, 6–10, 11–15 and >15 s), and hook performance was assessed in terms of proportions of successful catch, undesirable hook locations, bleeding events and undesirable release condition associated with physical hook damage and trauma. In terms of hook performance, the traditionally-shaped circle hook had the greatest conservation benefit for survival after release. In addition, this was the only hook type tested that performed well during each drop-back interval for all performance metrics. Conversely, 'J' hooks resulted in higher proportions of undesirable hook locations (as much as twofold), bleeding and fish released in undesirable condition, particularly during long drop-back intervals. Non-traditional circle hooks had performance results intermediate to the other hook types, but also had the worst performance relative to undesirable release condition during the first two drop-back intervals. Choice of hook type and drop-back interval can significantly change hook wounding, and different models of non-offset circle hooks should not be assumed to perform equivalently. [source]


Australian retail fund performance persistence

ACCOUNTING & FINANCE, Issue 1 2005
Chris Bilson
G11; G12; G23 Abstract The present study extends the Australian fund performance persistence literature through the use of five performance metrics: raw returns, the Sharpe ratio, the single-factor model and two multifactor models, the Carhart (1997) model and the Gruber (1996) model, in analysis of Australian retail fund performance over the period 1991–2000. Analysis suggests that performance persistence is sensitive to fund objective and appears to be driven by inadequate adjustment for risk. [source]
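
Two of the five metrics are easy to reproduce on any return series. The sketch below computes a Sharpe ratio and a single-factor (Jensen's) alpha by OLS on synthetic monthly excess returns; the data are simulated purely for illustration and have no connection to the Australian retail funds studied.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic monthly excess returns for one fund and the market (illustrative only).
market = rng.normal(0.006, 0.04, 120)
fund   = 0.001 + 0.9 * market + rng.normal(0.0, 0.02, 120)

# Metric 1: Sharpe ratio on monthly excess returns.
sharpe = fund.mean() / fund.std(ddof=1)

# Metric 2: single-factor alpha and beta from an OLS regression of fund
# excess returns on market excess returns.
X = np.column_stack([np.ones_like(market), market])
alpha, beta = np.linalg.lstsq(X, fund, rcond=None)[0]

print(f"Sharpe {sharpe:.3f}  alpha {alpha:.4f}/month  beta {beta:.2f}")
```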


On the effects of triangulated terrain resolution on distributed hydrologic model response

HYDROLOGICAL PROCESSES, Issue 11 2005
Enrique R. Vivoni
Abstract Distributed hydrologic models based on triangulated irregular networks (TIN) provide a means for computational efficiency in small to large-scale watershed modelling through an adaptive, multiple resolution representation of complex basin topography. Despite previous research with TIN-based hydrology models, the effect of triangulated terrain resolution on basin hydrologic response has received surprisingly little attention. Evaluating the impact of adaptive gridding on hydrologic response is important for determining the level of detail required in a terrain model. In this study, we address the spatial sensitivity of the TIN-based Real-time Integrated Basin Simulator (tRIBS) in order to assess the variability in the basin-averaged and distributed hydrologic response (water balance, runoff mechanisms, surface saturation, groundwater dynamics) with respect to changes in topographic resolution. Prior to hydrologic simulations, we describe the generation of TIN models that effectively capture topographic and hydrographic variability from grid digital elevation models. In addition, we discuss the sampling methods and performance metrics utilized in the spatial aggregation of triangulated terrain models. For a 64 km² catchment in northeastern Oklahoma, we conduct a multiple resolution validation experiment by utilizing the tRIBS model over a wide range of spatial aggregation levels. Hydrologic performance is assessed as a function of the terrain resolution, with the variability in basin response attributed to variations in the coupled surface–subsurface dynamics. In particular, resolving the near-stream, variable source area is found to be a key determinant of model behaviour as it controls the dynamic saturation pattern and its effect on rainfall partitioning. A relationship between the hydrologic sensitivity to resolution and the spatial aggregation of terrain attributes is presented as an effective means for selecting the model resolution. Finally, the study highlights the important effects of terrain resolution on distributed hydrologic model response and provides insight into the multiple resolution calibration and validation of TIN-based hydrology models. Copyright © 2005 John Wiley & Sons, Ltd. [source]
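
One common way to score hydrologic performance as terrain resolution changes is the Nash–Sutcliffe efficiency. The abstract does not name the specific metrics used, so the sketch below, with hypothetical discharge series for a fine and a coarse TIN, is an illustrative assumption rather than the study's actual procedure or data.

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 means the model does no
    better than predicting the mean observed discharge."""
    observed, simulated = np.asarray(observed), np.asarray(simulated)
    return 1.0 - (np.sum((observed - simulated) ** 2)
                  / np.sum((observed - observed.mean()) ** 2))

# Hypothetical hourly discharge (m^3/s) for one storm event, simulated at a
# fine and a coarse TIN resolution of the same basin.
obs    = [1.0, 2.5, 6.0, 9.5, 7.0, 4.0, 2.0, 1.2]
fine   = [1.1, 2.3, 5.6, 9.1, 7.3, 4.2, 2.1, 1.3]
coarse = [1.4, 1.9, 4.2, 7.0, 7.8, 5.1, 2.9, 1.8]

print("NSE fine  :", round(nash_sutcliffe(obs, fine), 3))
print("NSE coarse:", round(nash_sutcliffe(obs, coarse), 3))
```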


Functional Covalent Chemistry of Carbon Nanotube Surfaces

ADVANCED MATERIALS, Issue 6 2009
Xiaohui Peng
Abstract In this Progress Report, we update covalent chemical strategies commonly used for the focused functionalization of single-walled carbon nanotube (SWNT) surfaces. In recent years, SWNTs have been treated as legitimate nanoscale chemical reagents. Hence, herein we seek to understand, from a structural and mechanistic perspective, the breadth and types of controlled covalent reactions SWNTs can undergo in solution phase, not only at ends and defect sites but also along sidewalls. We explore advances in the formation of nanotube derivatives that essentially maintain and even enhance their performance metrics after precise chemical modification. We especially highlight molecular insights (and corresponding correlation with properties) into the binding of functional moieties onto carbon nanotube surfaces. Controllable chemical functionalization suggests that the unique optical, electronic, and mechanical properties of SWNTs can be much more readily tuned than ever before, with key implications for the generation of truly functional nanoscale working devices. [source]


A network-centric approach for access and interface selection in heterogeneous wireless environments

INTERNATIONAL JOURNAL OF COMMUNICATION SYSTEMS, Issue 5 2008
George Koundourakis
Abstract In this paper, we introduce a network-based approach for access and interface selection (AIS) in the context of resource management in heterogeneous wireless environments (UMTS, WLAN and DVB-T). We focus on the optimization of resource utilization, while ensuring acceptable quality of service (QoS) provision to the end users. Our objective is to optimally manage the overall system resources and minimize the possibility of QoS handovers (non-mobility handovers). The adopted architecture applies to typical heterogeneous environments and network entities (Access Routers) are enhanced with extra functionalities. We propose an AIS algorithm that exploits the multihoming concept and globally manages network resources at both radio access and IP backbone networks. The algorithm can estimate near-optimal solutions in real time and we also introduce a novel triggering policy. We present simulation results of typical scenarios that demonstrate the advantages of our approach. System performance metrics, derived from the simulations, show minimum degradations in high load and congestion situations. Copyright © 2007 John Wiley & Sons, Ltd. [source]


Joint packet scheduling and dynamic base station assignment for CDMA data networks

INTERNATIONAL JOURNAL OF COMMUNICATION SYSTEMS, Issue 2 2008
Christian Makaya
Abstract In current code division multiple access (CDMA) based wireless systems, a base station (BS) schedules packets independently of its neighbours, which may lead to resource wastage and the degradation of the system's performance. In wireless networks, in order to achieve efficient packet scheduling, there are two conflicting performance metrics that have to be optimized: throughput and fairness. Their maximization is a key goal, particularly in next-generation wireless networks. This paper proposes joint packet scheduling and BS assignment schemes for a cluster of interdependent neighbouring BSs in CDMA-based wireless networks, in order to enhance the system performance through dynamic load balancing. The proposed schemes are based on sector subdivision in terms of the average required resource per mobile station and a utility function approach. Fairness is achieved by minimizing the variance of the delay for the remaining head-of-queue packets. Inter-cell and intra-cell interferences from scheduled packets are also minimized in order to increase the system capacity and performance. The simulation results show that our proposed schemes perform better than existing schemes available in the open literature. Copyright © 2007 John Wiley & Sons, Ltd. [source]


Combinatorial Material Mechanics: High-Throughput Polymer Synthesis and Nanomechanical Screening

ADVANCED MATERIALS, Issue 21 2005
A. Tweedie
Combinatorial materials science requires parallel advances in materials characterization. A high-throughput nanoscale synthesis/nanomechanical profiling approach capable of accurately screening the mechanical properties of 1,700 photopolymerizable materials (see Figure; scale bar: 100 µm) within a large, discrete polymer library is presented. This approach enables rapid correlation of polymer composition, processing, and structure with mechanical performance metrics. [source]


Towards a debugging system for sensor networks

INTERNATIONAL JOURNAL OF NETWORK MANAGEMENT, Issue 4 2005
Nithya Ramanathan
Due to their resource constraints and tight physical coupling, sensor networks afford limited visibility into an application's behavior. As a result it is often difficult to debug issues that arise during development and deployment. Existing techniques for fault management focus on fault tolerance or detection; before we can detect anomalous behavior in sensor networks, we need first to identify what simple metrics can be used to infer system health and correct behavior. We propose metrics and events that enable system health inferences, and present a preliminary design of Sympathy, a debugging tool for pre- and post-deployment sensor networks. Sympathy will contain mechanisms for collecting system performance metrics with minimal memory overhead; mechanisms for recognizing application-defined events based on these metrics; and a system for collecting events in their spatiotemporal context. The Sympathy system will help programmers draw correlations between seemingly unrelated, distributed events, and produce graphs that highlight those correlations. As an example, we describe how we used a preliminary version of Sympathy to help debug a complex application, Tiny Diffusion. Copyright © 2005 John Wiley & Sons, Ltd. [source]


Efficient packet scheduling for heterogeneous multimedia provisioning over broadband satellite networks: An adaptive multidimensional QoS-based design

INTERNATIONAL JOURNAL OF SATELLITE COMMUNICATIONS AND NETWORKING, Issue 1 2009
Hongfei Du
Abstract With their inherent broadcast capabilities and reliable extensive geographical coverage, broadband satellite networks are emerging as a promising approach for the delivery of multimedia services in 3G and beyond systems. Given the limited capacity of the satellite component, to meet the diverse quality of service (QoS) demands of multimedia applications, it is highly desirable that the available resources be adaptively utilized in an optimized way. In this paper, we focus on the development and evaluation of an efficient packet scheduling scheme in a representative broadband satellite system, namely satellite digital multimedia broadcasting (SDMB), which is positioned as one of the most attractive solutions in the convergence of a closer integration with the terrestrial mobile networks for a cost-effective delivery of point-to-multipoint services. By taking into account essential aspects of successful QoS provisioning while preserving the system power/resource constraints, the proposed adaptive multidimensional QoS-based (AMQ) packet scheduling scheme aims to effectively satisfy diverse QoS requirements and adaptively optimize resource utilization for satellite multimedia broadcasting. The proposed scheme is formulated via an adaptive service prioritization algorithm and an adaptive resource allocation algorithm. By taking into account essential performance criteria, the former is capable of prioritizing contending flows based on QoS preferences and performance dynamics, while the latter allocates the resources, in an adaptive manner, according to the current QoS satisfaction degree of each session. Simulation results show that the AMQ scheme achieves significantly better performance than existing schemes on multiple performance metrics, e.g. delay, throughput, channel utilization and fairness. Copyright © 2008 John Wiley & Sons, Ltd. [source]


Large Shareholder Entrenchment and Performance: Empirical Evidence from Canada

JOURNAL OF BUSINESS FINANCE & ACCOUNTING, Issue 1-2 2008
Yves Bozec
Abstract: Recent empirical evidence indicates that the largest publicly traded companies throughout the world have concentrated ownership. This is the case in Canada, where voting rights are often concentrated in the hands of large shareholders, mostly wealthy families. Such concentrated ownership structures can generate specific agency problems, such as large shareholders expropriating wealth from minority shareholders. These costs are aggravated when large shareholders don't bear the full costs of their decisions because of the presence of mechanisms (dual class voting shares, pyramids) which lead to voting rights being greater than the cash flow rights (separation). We assess the impact of separation on various performance metrics while controlling for situations when the large shareholder has (1) the opportunity to expropriate (high free cash flows in the firm) and (2) the incentive to expropriate (low cash flow rights). We also control for when the large shareholder has the power to expropriate (high voting rights, outright control and insider management) and for the presence of family ownership. The results support our hypotheses and indicate that firm performance is lower when large shareholders have both the incentives and the opportunity to expropriate minority shareholders. [source]
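
A small worked example clarifies how pyramids separate voting from cash-flow rights. The sketch below uses the common weakest-link convention for measuring voting rights down a control chain; the two-layer 51% stakes are purely illustrative, not data from the Canadian sample.

```python
def pyramid_rights(stakes):
    """Cash-flow and voting rights of the ultimate owner in a control pyramid,
    given the equity stake held at each layer. Cash-flow rights multiply down
    the chain; voting rights are commonly measured by the weakest link."""
    cash_flow = 1.0
    for s in stakes:
        cash_flow *= s
    voting = min(stakes)
    return cash_flow, voting, voting / cash_flow   # last value = separation

# Family -> holding (51%) -> listed firm (51%): control with ~26% of the cash flows.
cf, vote, separation = pyramid_rights([0.51, 0.51])
print(f"cash-flow {cf:.1%}, voting {vote:.0%}, separation ratio {separation:.2f}")
```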


Experimental validation and field performance metrics of a hybrid mobile robot mechanism

JOURNAL OF FIELD ROBOTICS (FORMERLY JOURNAL OF ROBOTIC SYSTEMS), Issue 3 2010
Pinhas Ben-Tzvi
This paper presents the experimental validation and field testing of a novel hybrid mobile robot (HMR) system using a complete physical prototype. The mobile robot system consists of a hybrid mechanism whereby the locomotion platform and manipulator arm are designed as one entity to support both locomotion and manipulation symbiotically and interchangeably. The mechanical design is briefly described along with the related control hardware architecture based on an embedded onboard wireless communication network between the robot's subsystems, including distributed onboard power using Li-ion batteries. The paper focuses on demonstrating through extensive experimental results the qualitative and quantitative field performance improvements of the mechanical design and how it significantly enhances mobile robot functionality in terms of the new operative locomotion and manipulation capabilities that it provides. In terms of traversing challenging obstacles, the robot was able to surmount cylindrical obstacles up to 0.6-m diameter; cross ditches with at least 0.635-m width; climb and descend step obstacles up to 0.7-m height; and climb and descend stairs of different materials (wood, metal, concrete, plastic, plaster, etc.), different stair riser and run sizes, and inclinations up to 60 deg. The robot also demonstrated the ability to manipulate objects up to 61 kg before and after flipping over, including pushing capacity of up to 61 kg when lifting objects from underneath. The above-mentioned functions are critical in various challenging applications, such as search and rescue missions, military and police operations, and hazardous site inspections. © 2010 Wiley Periodicals, Inc. [source]


Operational performance metrics for Mars Exploration Rovers

JOURNAL OF FIELD ROBOTICS (FORMERLY JOURNAL OF ROBOTIC SYSTEMS), Issue 8-9 2007
Edward Tunstel
The concept of operational performance metrics and associated measurement issues is explored for deployed robotic systems. The focus is on the performance of mobility and robotic arm autonomy exercised on the NASA Mars Exploration Rovers surface mission. This planetary rover mission has been underway for nearly 3 years since January 2004, using two rovers performing separate missions. The autonomy functions of surface navigation, short-distance approach to surface science targets, and robotic placement of arm-mounted instruments on science targets are considered. Operational metrics that measure performance of these functions relative to system requirements are advocated. Specific metrics are defined and computed using telemetry from the rovers' multiyear operations on Mars and applied to rate performance during their respective missions. An existing methodology is applied to compute metrics with respect to required performance and to aggregate multiple metrics into a composite performance score for each rover. The formulation is augmented to accommodate importance weights that add flexibility in the use of the metrics by different potential end-users, e.g., sponsors, program managers, systems engineers, and technologists. A brief example illustrates the application and effect of importance weights on overall rover performance scores. © 2007 Wiley Periodicals, Inc. [source]
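
A minimal sketch of the weighted aggregation step might look like the following; the metric names, requirement levels and stakeholder weights are assumptions for illustration, not the mission's actual figures.

```python
def composite_score(metrics, requirements, weights):
    """Aggregate operational metrics into one score: each metric is expressed
    relative to its required value (capped at 'meets requirement'), then the
    ratios are combined as a weighted average."""
    total_w = sum(weights.values())
    score = 0.0
    for name, achieved in metrics.items():
        ratio = min(achieved / requirements[name], 1.0)
        score += weights[name] * ratio
    return score / total_w

rover_metrics = {"autonomous traverse rate": 0.92,   # illustrative values
                 "target approach success": 0.85,
                 "instrument placement success": 0.97}
requirements  = {"autonomous traverse rate": 0.90,
                 "target approach success": 0.90,
                 "instrument placement success": 0.95}

# Two stakeholders weighting the same metrics differently yield different
# composite scores for the same telemetry.
for label, w in [("technologist", {"autonomous traverse rate": 3,
                                   "target approach success": 1,
                                   "instrument placement success": 1}),
                 ("scientist",    {"autonomous traverse rate": 1,
                                   "target approach success": 2,
                                   "instrument placement success": 3})]:
    print(label, round(composite_score(rover_metrics, requirements, w), 3))
```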