Cost Minimization
Selected Abstracts

Cost minimization in wireless networks with a bounded and unbounded number of interfaces
NETWORKS: AN INTERNATIONAL JOURNAL, Issue 3 2009
Ralf Klasing

Given a graph G = (V,E) with |V| = n and |E| = m, which models a set of wireless devices (nodes V) connected by multiple radio interfaces (edges E), the aim is to switch on the minimum-cost set of interfaces at the nodes so that all connections are satisfied. A connection is satisfied when the endpoints of the corresponding edge share at least one active interface. Every node holds a subset of all the possible k interfaces. Depending on whether k is a priori bounded or not, the problem is called Cost Minimization in Multi-Interface Networks or Cost Minimization in Unbounded Multi-Interface Networks, respectively. We distinguish two main variations for both problems by treating the cost of maintaining an active interface as uniform (i.e., the same for all interfaces) or nonuniform. For bounded k, we show that the problem is APX-hard, while we obtain an approximation factor of min for the uniform case and a (k − 1)-approximation for the nonuniform case. For unbounded k, i.e., when k is not set a priori but depends on the given instance, we prove that the problem is not approximable within O(log k), while the same approximation factor as in the k-bounded case holds in the uniform case and a min-approximation factor holds for the nonuniform case. We also provide hardness and approximation results for several classes of networks: graphs of bounded degree, trees, planar graphs, and complete graphs. © 2008 Wiley Periodicals, Inc. NETWORKS, 2009

Deterministic and stochastic scheduling with teamwork tasks
NAVAL RESEARCH LOGISTICS: AN INTERNATIONAL JOURNAL, Issue 6 2004
Xiaoqiang Cai

Abstract: We study a class of new scheduling problems involving teamwork tasks. Each teamwork task consists of several components and requires a team of processors to complete, with each team member processing a particular component of the task. Once a processor completes its work on a task, it is immediately available to work on the next task, regardless of whether the other components of the previous task have been completed. Thus, the processors in a team neither have to start nor have to finish at the same time as they process a task. A task is completed only when all of its components have been processed. The problem is to find an optimal schedule to process all tasks under a given objective measure. We consider both deterministic and stochastic models. For the deterministic model, we find that the optimal schedule exhibits the pattern that all processors must adopt the same sequence to process the tasks, even under a general objective function GC = F(f1(C1), f2(C2), …, fn(Cn)), where fi(Ci) is a general, nondecreasing function of the completion time Ci of task i. We show that the optimal sequence to minimize the maximum cost MC = max fi(Ci) can be derived by a simple rule if there exists an order f1(t) ≤ … ≤ fn(t) for all t between the functions {fi(t)}. We further show that the optimal sequence to minimize the total cost TC = Σ fi(Ci) can be constructed by a dynamic programming algorithm. For the stochastic model, we study three optimization criteria: (A) almost sure minimization; (B) stochastic ordering; and (C) expected cost minimization. For criterion (A), we show that the results for the corresponding deterministic model can be easily generalized. However, stochastic problems with criteria (B) and (C) become quite difficult. Conditions under which the optimal solutions can be found for these two criteria are derived. © 2004 Wiley Periodicals, Inc. Naval Research Logistics, 2004
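To make the feasibility condition in the Klasing abstract concrete, the sketch below shows a simple greedy heuristic for the nonuniform-cost variant: scan the edges and, whenever an edge's endpoints do not yet share an active interface, switch on the shared interface with the cheapest incremental cost at both endpoints. This is only an illustrative sketch under assumed data structures (dictionaries of interface sets and a cost table), not the approximation algorithms analysed in the paper.

```python
# Illustrative greedy heuristic for cost minimization in multi-interface networks.
# Not the approximation algorithms from the paper; it only enforces the feasibility
# condition that the endpoints of every edge share at least one active interface.

def greedy_activation(edges, available, cost):
    """edges: iterable of (u, v) node pairs (the connections to satisfy);
    available[v]: set of interfaces that node v holds;
    cost[i]: cost of keeping interface i active, paid by every node that activates it."""
    active = {v: set() for v in available}       # interfaces switched on at each node
    total = 0.0
    for u, v in edges:
        if active[u] & active[v]:                # edge already satisfied
            continue
        common = available[u] & available[v]
        if not common:
            raise ValueError(f"edge ({u}, {v}) cannot be satisfied")
        # pick the shared interface whose *incremental* activation cost is cheapest
        i = min(common, key=lambda j: sum(cost[j] for n in (u, v) if j not in active[n]))
        for n in (u, v):
            if i not in active[n]:
                active[n].add(i)
                total += cost[i]
    return active, total

# Hypothetical example:
# available = {1: {"wifi", "bt"}, 2: {"wifi"}, 3: {"bt", "lte"}}
# cost = {"wifi": 2.0, "bt": 1.0, "lte": 5.0}
# print(greedy_activation([(1, 2), (1, 3)], available, cost))
```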
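The Cai abstract notes that the total-cost objective TC = Σ fi(Ci) can be minimized by dynamic programming once every processor follows the same task sequence. The sketch below shows one standard way such a DP can be organized, not necessarily the authors' algorithm: when processors work through their components in sequence order without idling, the completion time of the task placed last in a prefix set depends only on the set itself (the largest per-processor workload over that set), so states can be indexed by subsets of tasks. The data layout (a processor-by-task time matrix t and per-task cost functions f) is an assumption for illustration, and the subset state space is exponential in the number of tasks.

```python
from math import inf

def min_total_cost(t, f):
    """t[p][i]: processing time of processor p's component of task i;
    f[i]: nondecreasing cost function of task i's completion time.
    Returns min over common task sequences of sum_i f[i](C_i)."""
    n = len(t[0])                       # number of tasks
    procs = range(len(t))               # processors
    tasks = range(n)

    def completion(mask):
        # When the tasks in `mask` are sequenced first, each processor has processed
        # exactly its components of those tasks, so the task scheduled last in `mask`
        # completes at the largest per-processor workload over `mask`.
        return max(sum(t[p][i] for i in tasks if mask >> i & 1) for p in procs)

    best = [inf] * (1 << n)             # best[mask] = min cost of scheduling `mask` first
    best[0] = 0.0
    for mask in range(1, 1 << n):
        c = completion(mask)
        for j in tasks:                 # try each task j as the last one in `mask`
            if mask >> j & 1:
                cand = best[mask ^ (1 << j)] + f[j](c)
                if cand < best[mask]:
                    best[mask] = cand
    return best[(1 << n) - 1]

# Hypothetical example: 2 processors, 3 tasks, weighted completion-time costs f_i(c) = w_i * c.
# t = [[2, 1, 3], [1, 2, 2]]
# f = [lambda c, w=w: w * c for w in (1.0, 2.0, 0.5)]
# print(min_total_cost(t, f))
```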
The Cost Effectiveness of the UK's Sovereign Debt Portfolio
OXFORD BULLETIN OF ECONOMICS & STATISTICS, Issue 4 2005
Patrick J. Coe

Abstract: This paper provides a recursive empirical analysis of the scope for cost minimization in public debt management when the debt manager faces a given short-term interest rate dictated by monetary policy, as well as risk and market-impact constraints. It simulates the 'real-time' interest costs of alternative portfolios for UK government debt between April 1985 and March 2000. These portfolios are constructed using forecasts of return spreads based on a recursive modelling procedure. While we find statistically significant evidence of predictability, the interest cost savings are quite small when portfolio shares are constrained to lie within historical bounds.

A new approach to solving problems of multi-state system reliability optimization
QUALITY AND RELIABILITY ENGINEERING INTERNATIONAL, Issue 2 2001
Gregory Levitin

Abstract: Engineers usually try to achieve the required reliability level at minimal cost. The problem of total investment-cost minimization, subject to reliability constraints, is well known as the reliability optimization problem. When applied to multi-state systems (MSS), the system has many performance levels, and reliability is considered a measure of the ability of the system to meet the demand (required performance). In this case, the outage effect will be essentially different for units with different performance rates. Therefore, the performance of system components, as well as the demand, should be taken into account. In this paper, we present a technique for solving a family of MSS reliability optimization problems, such as structure optimization, optimal expansion, maintenance optimization and optimal multistage modernization. This technique combines a universal generating function (UGF) method, used for fast reliability estimation of MSS, with a genetic algorithm (GA) used as an optimization engine (an illustrative sketch of the UGF composition appears after the last abstract below). The UGF method provides the ability to estimate relatively quickly different MSS reliability indices for series-parallel and bridge structures. It can be applied to MSS with different physical natures of the system performance measure. The GA is a robust, universal optimization tool that uses only estimates of solution quality to determine the direction of search. Copyright © 2001 John Wiley & Sons, Ltd.

Optimization and its discontents in regulatory design: Bank regulation as an example
REGULATION & GOVERNANCE, Issue 1 2010
William H. Simon

Abstract: Economists and lawyers trained in economics tend to speak about regulation from a perspective organized around the basic norm of optimization. In contrast, an important managerial literature espouses a perspective organized around the basic norm of reliability. The perspectives are not logically inconsistent, but the economist's view sometimes leads in practice to a preoccupation with decisional simplicity and cost minimization at the expense of complex judgment and learning. Drawing on a literature often ignored by economists and lawyers, I elaborate the contrast between the optimization and reliability perspectives. I then show how the contrast illuminates current discussions of the reform of bank regulation.
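As a companion to the Levitin abstract above, here is a minimal sketch of the universal generating function (u-function) idea for a series-parallel MSS, using the common flow-transmission conventions: a component's performance distribution is a polynomial Σ p·z^g, parallel composition adds performance levels, series composition takes their minimum (the bottleneck), and reliability is the probability that system performance meets the demand. The dictionary representation and the example numbers are illustrative assumptions, not the paper's implementation.

```python
# Minimal u-function (UGF) sketch for a series-parallel multi-state system.
# A u-function is represented as a dict {performance_level: probability}.

def compose(u1, u2, op):
    """Combine two u-functions with a structure operator `op` on performance levels."""
    out = {}
    for g1, p1 in u1.items():
        for g2, p2 in u2.items():
            g = op(g1, g2)
            out[g] = out.get(g, 0.0) + p1 * p2
    return out

def parallel(u1, u2):
    return compose(u1, u2, lambda a, b: a + b)   # performances add in parallel

def series(u1, u2):
    return compose(u1, u2, min)                  # bottleneck performance in series

def reliability(u, demand):
    """Probability that system performance meets or exceeds the demand."""
    return sum(p for g, p in u.items() if g >= demand)

# Hypothetical example: two parallel 5-unit generators feeding an 8-unit line in series.
gen = {0: 0.1, 5: 0.9}                           # each generator: off, or 5 units
line = {0: 0.05, 8: 0.95}                        # transmission line: off, or 8 units
system = series(parallel(gen, gen), line)
print(reliability(system, demand=8))             # P(system delivers at least 8 units)
```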