Incremental
Terms modified by Incremental
Selected Abstracts

Blending Incremental and Stratified Layering Techniques to Produce an Esthetic Posterior Composite Resin Restoration with a Predictable Prognosis
JOURNAL OF ESTHETIC AND RESTORATIVE DENTISTRY, Issue 2 2001
DAVID KLAFF BDS
ABSTRACT Composite resin restorations play an ever-increasing role as routine restorations in everyday clinical practice. However, the long-term prognosis of these restorations is still widely debated and open to question. The restorative protocols are still evolving, whether for direct or indirect placement, and little evidence is available in the scientific literature as to the ideal choice of site, technique, and category for placement. This article discusses the problems encountered and suggests a clinical restorative protocol to optimize composite resin placement. [source]

Asymmetric Taxation under Incremental and Sequential Investment
JOURNAL OF PUBLIC ECONOMIC THEORY, Issue 5 2005
PAOLO M. PANTEGHINI
This paper discusses the effects of an asymmetric tax scheme on incremental and sequential investment strategies. The tax base is equal to the firm's return, net of an imputation rate. When the firm's return is less than this rate, however, no tax refunds are allowed. This scheme is neutral under both income and capital uncertainty. [source]

Cost utility analysis of physical activity counselling in general practice
AUSTRALIAN AND NEW ZEALAND JOURNAL OF PUBLIC HEALTH, Issue 1 2006
Kim Dalziel
Objective: To evaluate the economic performance of the 'Green Prescription' physical activity counselling program in general practice. Methods: Cost utility analysis using a Markov model was used to estimate the cost utility of the Green Prescription program over full life expectancy. Program effectiveness was based on published trial data (878 inactive patients presenting to NZ general practice). Costs were based on detailed costing information and were discounted at 5% per annum.
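The two computations named in this abstract, discounting at 5% per annum and an incremental cost-per-QALY ratio, can be sketched in a few lines. The streams and numbers below are invented illustrations, not the study's actual model inputs:

```python
def discount(values, rate=0.05):
    """Present value of a yearly stream at `rate` per annum.

    Year 0 is undiscounted; year t is divided by (1 + rate)**t.
    """
    return sum(v / (1.0 + rate) ** t for t, v in enumerate(values))

def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Incremental cost-effectiveness ratio: extra cost per extra QALY."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Hypothetical programme: NZ$200 up front, slower quality-of-life decline
# than usual care over a five-year horizon.
prog_costs  = [200, 0, 0, 0, 0]
prog_qalys  = [0.80, 0.80, 0.79, 0.78, 0.77]
usual_costs = [0, 0, 0, 0, 0]
usual_qalys = [0.80, 0.79, 0.77, 0.75, 0.73]

ratio = icer(discount(prog_costs), discount(prog_qalys),
             discount(usual_costs), discount(usual_qalys))
print(round(ratio), "dollars per QALY gained")
```

A full Markov model would generate the cost and QALY streams from health-state transition probabilities; the discounting and ICER steps at the end are the same.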
The main outcome measure is cost per quality-adjusted life year (QALY) gained. Extensive one-way sensitivity analyses were performed along with probabilistic (stochastic) analysis. Results: Incremental, modelled cost utility of the Green Prescription program compared with 'usual care' was NZ$2,053 per QALY gained over full life expectancy (range NZ$827 to NZ$37,516 per QALY). Based on the probabilistic sensitivity analysis, 90% of ICERs fell below NZ$7,500 per QALY. Conclusions: Based on a plausible and conservative set of assumptions, if decision makers are willing to pay at least NZ$2,000 per QALY gained the Green Prescription program is likely to represent better value for money than 'usual care'. Implications: The Green Prescription program performs well, representing a good buy relative to other published cost effectiveness estimates. Policy makers should consider encouraging general practitioners to prescribe physical activity advice in the primary care setting, in association with support from exercise specialists. [source]

Incremental Benefit of 80-Lead Electrocardiogram Body Surface Mapping Over the 12-Lead Electrocardiogram in the Detection of Acute Coronary Syndromes in Patients Without ST-elevation Myocardial Infarction: Results from the Optimal Cardiovascular Diagnostic Evaluation Enabling Faster Treatment of Myocardial Infarction (OCCULT MI) Trial
ACADEMIC EMERGENCY MEDICINE, Issue 9 2010
Brian J. O'Neil MD
ACADEMIC EMERGENCY MEDICINE 2010; 17:932–939 © 2010 by the Society for Academic Emergency Medicine
Abstract Background: The initial 12-lead (12L) electrocardiogram (ECG) has low sensitivity to detect myocardial infarction (MI) and acute coronary syndromes (ACS) in the emergency department (ED). Yet, early therapies in these patients have been shown to improve outcomes.
Objectives: The Optimal Cardiovascular Diagnostic Evaluation Enabling Faster Treatment of Myocardial Infarction (OCCULT-MI) trial was a multicenter trial comparing a novel 80-lead mapping system (80L) to standard 12L ECG in patients with chest pain and presumed ACS. This secondary analysis analyzed the incremental value of the 80L over the 12L in the detection of high-risk ECG abnormalities (ST-segment elevation or ST depression) in patients with MI and ACS, after eliminating all patients diagnosed with ST-elevation MI (STEMI) by 12L ECG. Methods: Chest pain patients presenting to one of 12 academic EDs were diagnosed and treated according to the standard care of that site and its clinicians; the clinicians were blinded to 80L results. MI was defined by discharge diagnosis of non-ST-elevation MI (NSTEMI) or unstable angina (UA) with an elevated troponin. ACS was defined as discharge diagnosis of NSTEMI or UA with at least one positive test result (troponin, stress test, angiogram) or revascularization procedure. Results: Of the 1,830 patients enrolled in the trial, 91 patients with physician-diagnosed STEMI and 225 patients with missing 80L or 12L data were eliminated from the analysis; no discharge diagnosis was available for one additional patient. Of the remaining 1,513 patients, 408 had ACS, 206 had MI, and one had missing status. The sensitivity of the 80L was significantly higher than that of the 12L for detecting MI (19.4% vs. 10.4%, p = 0.0014) and ACS (12.3% vs. 7.1%, p = 0.0025). Specificities remained high for both tests, but were somewhat lower for 80L than for 12L for detecting both MI and ACS. Negative and positive likelihood ratios (LR) were not statistically different between groups.
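The sensitivity, specificity, and likelihood-ratio comparisons above all derive from a 2×2 table of test results against diagnoses. A minimal sketch follows; the counts are hypothetical, chosen only to roughly reproduce the reported 19.4% sensitivity for MI among the 206 MI patients:

```python
def diagnostics(tp, fn, fp, tn):
    """Sensitivity, specificity and likelihood ratios from 2x2 counts."""
    sens = tp / (tp + fn)          # fraction of diseased correctly flagged
    spec = tn / (tn + fp)          # fraction of non-diseased correctly cleared
    lr_pos = sens / (1 - spec)     # how much a positive test raises the odds
    lr_neg = (1 - sens) / spec     # how much a negative test lowers them
    return sens, spec, lr_pos, lr_neg

# Hypothetical counts: 40 of 206 MI patients flagged, 65 false positives
# among roughly 1,306 patients without MI.
sens, spec, lr_pos, lr_neg = diagnostics(tp=40, fn=166, fp=65, tn=1241)
print(round(sens, 3), round(spec, 3), round(lr_pos, 2), round(lr_neg, 2))
```

Running the same function on each device's counts and comparing the paired results is the shape of the analysis the abstract reports.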
In patients with severe disease (defined by stenosis > 70% at catheterization, percutaneous coronary intervention, coronary artery bypass graft, or death from any cause), the 80L had significantly higher sensitivity for detecting MI (with equivalent specificity), but not ACS. Conclusions: Among patients without ST elevation on the 12L ECG, the 80L body surface mapping technology detects more patients with MI or ACS than the 12L, while maintaining a high degree of specificity. [source]

Prescribed Burning to Restore Mixed-Oak Communities in Southern Ohio: Effects on Breeding-Bird Populations
CONSERVATION BIOLOGY, Issue 5 2001
Vanessa L. Artman
Fire is being artificially reintroduced into the forests of southern Ohio to determine its effectiveness in restoring and maintaining mixed-oak (Quercus spp.) communities. We studied the effects of repeated burning (1–4 years of annual burning) and recovery (1 year after burning) on the breeding bird community. Burning resulted in incremental but temporary reductions in the availability of leaf litter, shrubs, and saplings, but it did not affect trees, snags, or understory vegetation cover. Of 30 bird species monitored, 4 were affected negatively and 2 were affected positively by burning. Population densities of Ovenbirds (Seiurus aurocapillus), Worm-eating Warblers (Helmitheros vermivorus), and Hooded Warblers (Wilsonia citrina) declined incrementally in response to repeated burning and did not recover within 1 year after burning, suggesting a lag time in response to the changes in habitat conditions. Densities of Northern Cardinals (Cardinalis cardinalis) fluctuated among years in the control units, but remained low in the burned units. Densities of American Robins (Turdus migratorius) and Eastern Wood-Pewees (Contopus virens) increased in response to burning, but these increases were apparent only after several years of repeated burning. In general, burning resulted in short-term reductions in the suitability of habitat for ground- and low-shrub-nesting birds, but it improved habitat for ground- and aerial-foraging birds.
Overall, there were no changes in the composition of the breeding-bird community. Total breeding-bird population levels were also unaffected by burning. Our results suggest that prescribed burning applied on a long-term basis or across large spatial scales is likely to have adverse effects on ground- and low-shrub-nesting bird species, but other changes in the composition of the breeding-bird community are likely to be minimal as long as the closed-canopy forest structure is maintained within the context of prescribed burning.
[source]

Impaired oxygen kinetics in beta-thalassaemia major patients
ACTA PHYSIOLOGICA, Issue 3 2009
I. Vasileiadis
Abstract Aim: Beta-thalassaemia major (TM) affects oxygen flow and utilization and reduces patients' exercise capacity. The aim of this study was to assess phase I and phase II oxygen kinetics during submaximal exercise testing in thalassaemics and make possible considerations about the pathophysiology of the energy-producing mechanisms and their expected exercise limitation. Methods: Twelve TM patients with no clinical evidence of cardiac or respiratory disease and 10 healthy subjects performed incremental, symptom-limited cardiopulmonary exercise testing (CPET) and submaximal, constant workload CPET. Oxygen uptake (Vo2), carbon dioxide output and ventilation were measured breath-by-breath. Results: Peak Vo2 was reduced in TM patients (22.3 ± 7.4 vs. 28.8 ± 4.8 mL kg−1 min−1, P < 0.05) as was anaerobic threshold (13.1 ± 2.7 vs. 17.4 ± 2.6 mL kg−1 min−1, P = 0.002). There was no difference in oxygen cost of work at peak exercise (11.7 ± 1.9 vs. 12.6 ± 1.9 mL min−1 W−1 for patients and controls respectively, P = ns). Phase I duration was similar in TM patients and controls (24.6 ± 7.3 vs.
23.3 ± 6.6 s respectively, P = ns) whereas phase II time constant in patients was significantly prolonged (42.8 ± 12.0 vs. 32.0 ± 9.8 s, P < 0.05). Conclusion: TM patients present prolonged phase II on-transient oxygen kinetics during submaximal, constant workload exercise, compared with healthy controls, possibly suggesting a slower rate of high energy phosphate production and utilization and reduced oxidative capacity of myocytes; the latter could also account for their significantly limited exercise tolerance. [source]

Continuous and Discontinuous Innovation: Overcoming the Innovator Dilemma
CREATIVITY AND INNOVATION MANAGEMENT, Issue 4 2007
Mariano Corso
Challenged by competition pressures and an unprecedented pace of change, firms can no longer choose whether to concentrate on the needs of today's customers or on the anticipation of those of tomorrow: they must be excellent in both. This requires managing two related balancing acts: on the one side, being excellent in both exploitation and exploration of their capabilities and, on the other side, being excellent in managing both incremental and radical innovation. These balances are critical since exploitation and exploration, on the one side, and incremental and radical innovation, on the other, require different approaches that have traditionally been considered difficult to combine within the same organization. Working on evidence and discussion from the 7th CINet Conference held in Lucca (Italy) in 2006, this Special Section is aimed at contributing to theory and practice on these two complex balancing acts that today represent a hot issue in innovation management. [source]

Cracking the Incremental Paradigm of Japanese Creativity
CREATIVITY AND INNOVATION MANAGEMENT, Issue 4 2004
Tony Proctor
This paper points out the differences between incremental and paradigm-shift approaches to creativity in management that exist between Japanese and Western schools of thought.
A number of examples are used to illustrate how a systematic incremental process that places emphasis on continuous improvement is key to Japanese creativity in management. A framework that captures the cornerstones of Japanese creativity is outlined. The paper concludes by discussing the contribution of this research and outlines a plan for further work. [source]

A NEW THEORY OF THE BUDGETARY PROCESS
ECONOMICS & POLITICS, Issue 1 2006
SOUMAYA M. TOHAMY
This paper offers an alternative to the view that budgetary decisions are incremental because they are complex, extensive, and conflicted. Our model interprets incrementalism as the result of a legislative political strategy in response to interest group politics and economic conditions. Accordingly, a legislator chooses between single-period budgeting or multiperiod budgeting, where single-period budgeting is associated with a greater chance of non-incremental budgeting outcomes. We use a statistical procedure developed by Dezhbakhsh et al. (2003) for identifying non-incremental outcomes to test the implications of the model. Results support the model's predictions: a higher discount rate and a persistently large deficit appear to cause departures from incremental budgeting; Democrats' control over the political process has a similar effect, while a higher inflation rate has an opposite effect. [source]

Abnormal vascular reactivity at rest and exercise in obese boys
EUROPEAN JOURNAL OF CLINICAL INVESTIGATION, Issue 2 2009
L. Karpoff
Abstract Background: Obese children exhibit vascular disorders at rest depending on their pubertal status, degree of obesity, and level of insulin resistance. However, data regarding their vascular function during exercise remain scarce. The aims of the present study were to evaluate vascular morphology and function at rest, and lower limb blood flow during exercise, in prepubertal boys with mild-to-moderate obesity and in lean controls.
Materials and methods: Twelve moderately obese prepubertal boys [Body Mass Index (BMI): 23·9 ± 2·6 kg m−2] and thirteen controls (BMI: 17·4 ± 1·8 kg m−2), matched for age (mean age: 11·6 ± 0·6 years), were recruited. We measured carotid intima-media thickness (IMT) and wall compliance and incremental elastic modulus, resting brachial flow-mediated dilation (FMD) and nitrate-dependent dilation (NDD), lower limb blood flow during local knee-extensor incremental and maximal exercise, body fat content (DEXA), blood pressure, blood lipids, insulin and glucose. Results: Compared to lean controls, obese boys had greater IMT (0·47 ± 0·06 vs. 0·42 ± 0·03 mm, P < 0·05) but lower FMD (4·6 ± 2·8 vs. 8·8 ± 3·2%, P < 0·01) in spite of similar maximal shear rate, without NDD differences. Lower limb blood flow (mL min−1 100 g−1) increased significantly from rest to maximal exercise in both groups, although obese children reached lower values than lean counterparts whatever the exercise intensity. Conclusions: Mild-to-moderate obesity in prepubertal boys without insulin resistance is associated with impaired endothelial function and blunted muscle perfusion response to local dynamic exercise without alteration of vascular smooth muscle reactivity. [source]

Does a critical mass exist? A comparative analysis of women's legislative representation since 1950
EUROPEAN JOURNAL OF POLITICAL RESEARCH, Issue 2 2002
It has often been argued theoretically that a 'critical mass', ranging from 10 to 35 per cent women, is needed before major changes occur in legislative institutions, behaviour, policy priorities and policy voting. This paper examines one of the less-explored dimensions of the critical mass concept: Is there a process by which women reaching a critical mass of the legislature accelerates the election of further women?
Using data from the Inter-Parliamentary Union and the International Institute for Democracy and Electoral Assistance, we analyze this question for 20 industrialized democracies over a period of half a century, longer than any other relevant research. Descriptive results indicate that gains in women's representation have been incremental rather than a critical mass accelerating the election of women to legislatures. In a multivariate analysis of the percentage of women in the lower house of the legislature, the critical mass is tested against established explanations of women's gains in seats: institutional rules, egalitarian political culture, political parties and economic development. Of two measures of the critical mass theory, one has no impact and the second results in only a small increase in women's gains. Far from being clearly demonstrated, critical mass theories need empirical testing. [source]

Methods for crack opening load and crack tip shielding determination: a review
FATIGUE & FRACTURE OF ENGINEERING MATERIALS AND STRUCTURES, Issue 11 2003
S. STOYCHEV and
ABSTRACT In this paper a review of the literature on crack closure/opening load and crack tip shielding effects determination methods is presented. Commonly used 'subjective' (visual) and 'non-subjective' approaches have been included. Procedures associated with the determination of an effective crack driving force for both Elber-type and partial (or incremental) crack closure models have been covered. Comparison among different methods of analyses based on compliance and fatigue crack growth rate measurements is discussed together with their implications and difficulties in fatigue crack growth correlations. [source]

FIGHTING FIRE WITH A BROKEN TEACUP: A COMPARATIVE ANALYSIS OF SOUTH AFRICA'S LAND-REDISTRIBUTION PROGRAM
GEOGRAPHICAL REVIEW, Issue 3 2008
WILLIAM G. MOSELEY
ABSTRACT.
Since the rise of its first democratically elected government in 1994, South Africa has sought to redress its highly inequitable land distribution through a series of land-reform programs. In this study we examine land-redistribution efforts in two of South Africa's provinces, the Western Cape and Limpopo. By analyzing a cross-section of projects in these two locales we develop a political ecology of stymied land-reform possibilities to explain the limited progress to date. Given South Africa's ambitious goal of redistributing 30 percent of its white-owned land by 2014 and the incremental and flawed nature of its redistribution program, we argue that the process is like trying to put out a fire with a broken teacup. Our results are based on interviews with policymakers, commercial farmers, and land-redistribution beneficiaries, as well as on an analysis of land-use change in Limpopo Province. [source]

The Hidden Politics of Administrative Reform: Cutting French Civil Service Wages with a Low-Profile Instrument
GOVERNANCE, Issue 1 2007
PHILIPPE BEZES
The article addresses the internal and hidden politics of change in bureaucracies by focusing on the introduction and use of policy instruments as institutional change without radical or explicit shifts in administrative systems. Beneath public administrative reforms, it examines the use of "low-profile instruments" characterized by their technical and goal-oriented dimension but also by their low visibility to external actors due to the high complexity of their commensurating purpose and the automaticity of their use. The core case study of the paper offers a historical sociology of a technique for calculating the growth of the French civil service wage bill from the mid-1960s to the 2000s.
The origins, uses, and institutionalisation of this method in the French context are explored to emphasize the important way of governing the bureaucracy at times of crisis through automatic, unobtrusive, incremental, and low-profile mechanisms. While insisting on the salience of techniques for calculating, measuring, classifying, and indexing in the contemporary art of government, it also suggests the need for observing and explaining "everyday forms of retrenchment" in bureaucracies. [source]

Banishing Bureaucracy or Hatching a Hybrid? The Canadian Food Inspection Agency and the Politics of Reinventing Government
GOVERNANCE, Issue 2 2000
The Canadian Food Inspection Agency (CFIA) is a means to overcoming long-standing bureaucratic politics while attaining some major policy ends. Contrary to some of the new public management bravado of transforming the public sector, the CFIA is not a bureaucratic revolution in reshaping the Canadian State. Changes in scientific staffing, funding, and inspection have been more incremental than fundamental. Moreover, the CFIA is something less than the special and separate operating agency models discussed in the alternative service delivery literature in terms of autonomy and market orientation, but something more autonomous and entrepreneurial than traditional government departments. These organizational and managerial reforms are modest extensions providing a means for achieving economies and enhanced effectiveness in carrying out the mandate of safety, consumer protection, and market access for Canadian food, animal, plant, and forestry products.
[source]

FROM REVOLUTION TO MODERNIZATION: THE PARADIGMATIC TRANSITION IN CHINESE HISTORIOGRAPHY IN THE REFORM ERA
HISTORY AND THEORY, Issue 3 2010
HUAIYIN LI
ABSTRACT Chinese historiography of modern China in the 1980s and 1990s underwent a paradigmatic transition: in place of the traditional revolutionary historiography that bases its analyses on Marxist methodologies and highlights rebellions and revolutions as the overarching themes in modern Chinese history, the emerging modernization paradigm builds its conceptual framework on borrowed modernization theory and foregrounds top-down, incremental reforms as the main force propelling China's evolution to modernity. This article scrutinizes the origins of the new paradigm in the context of a burgeoning modernization discourse in reform-era China. It further examines the fundamental divides between the two types of historiography in their respective constructions of master narratives and their different approaches to representing historical events in modern China. Behind the prevalence of the modernization paradigm in Chinese historiography is Chinese historians' unchanged commitment to serving present political needs by interpreting the past. [source]

Smoothing Mechanisms in Defined Benefit Pension Accounting Standards: A Simulation Study
ACCOUNTING PERSPECTIVES, Issue 2 2009
Cameron Morrill
ABSTRACT The accounting for defined benefit (DB) pension plans is complex and varies significantly across jurisdictions despite recent international convergence efforts. Pension costs are significant, and many worry that unfavorable accounting treatment could lead companies to terminate DB plans, a result that would have important social implications. A key difference in accounting standards relates to whether and how the effects of fluctuations in market and demographic variables on reported pension cost are "smoothed".
Critics argue that smoothing mechanisms lead to incomprehensible accounting information and induce managers to make dysfunctional decisions. Furthermore, the effectiveness of these mechanisms may vary. We use simulated data to test the volatility, representational faithfulness, and predictive ability of pension accounting numbers under Canadian, British, and international standards (IFRS). We find that smoothed pension expense is less volatile, more predictive of future expense, and more closely associated with contemporaneous funding than is "unsmoothed" pension expense. The corridor method and market-related value approaches allowed under Canadian GAAP have virtually no smoothing effect incremental to the amortization of actuarial gains and losses. The pension accrual or deferred asset is highly correlated with the pension plan deficit/surplus. Our findings complement existing, primarily archival, pension accounting research and could provide guidance to standard-setters. [source]

Regularized sequentially linear saw-tooth softening model
INTERNATIONAL JOURNAL FOR NUMERICAL AND ANALYTICAL METHODS IN GEOMECHANICS, Issue 7-8 2004
Jan G. Rots
Abstract After a brief discussion on crack models, it is demonstrated that cracking is often accompanied by snaps and jumps in the load-displacement response which complicate the analysis. This paper provides a solution by simplifying non-linear crack models into sequentially linear saw-tooth models, either saw-tooth tension-softening for unreinforced material or saw-tooth tension-stiffening for reinforced material. A linear analysis is performed, the most critical element is traced, the stiffness and strength of that element are reduced according to the saw-tooth curve, and the process is repeated. This approach circumvents the use of incremental-iterative procedures and negative stiffness moduli and is inherently stable.
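The sequentially linear loop just described (run a linear analysis, find the most critical element, knock its stiffness and strength down one saw-tooth, repeat) can be illustrated on a toy system of parallel bars sharing a single displacement. Everything below (bar data, drop factors, number of teeth) is an invented illustration of the idea, not the paper's model:

```python
# Toy sequentially linear analysis: parallel bars sharing one displacement,
# each with a saw-tooth softening law. All numbers are hypothetical.

def sequentially_linear(bars, teeth=5, e_drop=0.5, f_drop=0.6):
    """bars: list of dicts with stiffness 'k' and tensile strength 'f'.

    Each event: find the bar that reaches its current strength at the
    smallest shared displacement (the 'most critical element'), record the
    load at that displacement, then reduce that bar's stiffness and
    strength by one saw-tooth. Stops when every bar has used all teeth.
    Returns the (displacement, total load) point recorded at each event.
    """
    state = [dict(b, used=0) for b in bars]
    history = []
    while True:
        active = [b for b in state if b["used"] < teeth]
        if not active:
            break
        # Linear analysis: critical bar minimises f/k, the displacement at
        # which its stress first reaches its current strength.
        crit = min(active, key=lambda b: b["f"] / b["k"])
        u = crit["f"] / crit["k"]
        load = sum(b["k"] * u for b in state)
        history.append((u, load))
        crit["k"] *= e_drop        # one saw-tooth: softer ...
        crit["f"] *= f_drop        # ... and weaker
        crit["used"] += 1
    return history

events = sequentially_linear([{"k": 10.0, "f": 1.0}, {"k": 8.0, "f": 1.0}],
                             teeth=2)
for u, p in events:
    print(f"u = {u:.4f}, load = {p:.4f}")
```

Each recorded point costs one linear solve; plotting them traces a softening envelope without any incremental-iterative solver or negative stiffness, which is the stability argument the abstract makes.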
The main part of the paper is devoted to a regularization procedure that provides mesh-size objectivity of the saw-tooth model. The procedure is similar to the one commonly used in the smeared crack framework but, in addition, both the initial tensile strength and the ultimate strain are rescaled. In this way, the dissipated fracture energy is invariant with respect not only to the mesh size, but also to the number of saw-teeth adopted to discretize the softening branch. Finally, the potential of the model for large-scale fracture analysis is demonstrated. A masonry façade subjected to tunnelling-induced settlements is analysed. The very sharp snap-backs associated with brittle fracture of the façade automatically emerge with sequentially linear analysis, whereas non-linear analysis of the façade using smeared or discrete crack models shows substantial difficulties despite the use of arc-length schemes. Copyright © 2004 John Wiley & Sons, Ltd. [source]

PID adaptive control of incremental and arclength continuation in nonlinear applications
INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN FLUIDS, Issue 11 2009
A. M. P. Valli
Abstract A proportional-integral-derivative (PID) control approach is developed, implemented and investigated numerically in conjunction with continuation techniques for nonlinear problems. The associated algorithm uses PID control to adapt the parameter stepsize for branch-following strategies such as those applicable to turning point and bifurcation problems. As representative continuation strategies, incremental Newton, Euler-Newton and pseudo-arclength continuation techniques are considered. Supporting numerical experiments are conducted for finite element simulation of the 'driven cavity' Navier-Stokes benchmark over a range in Reynolds number, the classical Bratu turning point problem over a reaction parameter range, and for coupled fluid flow and heat transfer over a range in Rayleigh number.
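The idea of a PID controller for the continuation stepsize can be sketched with a standard error-based update: the step is scaled by proportional, integral, and derivative powers of recent error measures. The gains, tolerance, clamp, and the toy scalar problem below are our assumptions for illustration, not the paper's settings:

```python
def pid_step(ds, e0, e1, e2, kP=0.075, kI=0.175, kD=0.01, tol=1e-2):
    """Adapt the continuation step ds from the three most recent error
    measures e0 (newest), e1, e2: grow the step when the error is below
    tol, shrink it when above. The clamp guards against wild jumps."""
    factor = (e1 / e0) ** kP * (tol / e0) ** kI * (e1 * e1 / (e0 * e2)) ** kD
    return ds * min(max(factor, 0.1), 10.0)

def newton(f, df, x):
    # Plain Newton iteration for a scalar equation f(x) = 0.
    for _ in range(50):
        x = x - f(x) / df(x)
        if abs(f(x)) < 1e-12:
            break
    return x

# Incremental (natural) continuation of x**3 + x - lam = 0 in lam,
# with PID-adapted parameter steps.
lam, dlam, x = 0.0, 0.1, 0.0
errs = [1e-2, 1e-2]                      # seed the controller's history
path = []
while lam < 5.0 and len(path) < 5000:
    lam += dlam
    x_new = newton(lambda y: y**3 + y - lam, lambda y: 3 * y**2 + 1, x)
    # Error signal: relative change of the solution over the last step.
    e = max(abs(x_new - x) / max(abs(x_new), 1e-12), 1e-12)
    dlam = pid_step(dlam, e, errs[-1], errs[-2])
    errs.append(e)
    x = x_new
    path.append((lam, x))
```

On this monotone toy branch the controller simply settles the relative solution change near the tolerance; the paper's contribution is applying the same control law inside incremental and pseudo-arclength continuation of large finite element systems.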
Computational performance using PID stepsize control in conjunction with inexact Newton-Krylov solution for coupled flow and heat transfer is also examined for a 3D test case. Copyright © 2009 John Wiley & Sons, Ltd. [source]

Entropy-based metrics in swarm clustering
INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, Issue 9 2009
Bo Liu
Ant-based clustering methods have received significant attention as robust methods for clustering. Most ant-based algorithms use local density as a metric for determining the ants' propensities to pick up or deposit a data item; however, a number of authors in classical clustering methods have pointed out the advantages of entropy-based metrics for clustering. We introduced an entropy metric into an ant-based clustering algorithm and compared it with other closely related algorithms using local density. The results strongly support the value of entropy metrics, obtaining faster and more accurate results. Entropy governs the pickup and drop behaviors, while movement is guided by the density gradient. Entropy measures also require fewer training parameters than density-based clustering. The remaining parameters are subjected to robustness studies, and a detailed analysis is performed. In the second phase of the study, we further investigated Ramos and Abraham's (In: Proc 2003 IEEE Congr Evol Comput, Hoboken, NJ: IEEE Press; 2003. pp 1370–1375) contention that ant-based methods are particularly suited to incremental clustering. Contrary to expectations, we did not find substantial differences between the efficiencies of incremental and nonincremental approaches to data clustering. © 2009 Wiley Periodicals, Inc. [source]

Flexible models with evolving structure
INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, Issue 4 2004
Plamen P. Angelov
A type of flexible model in the form of a neural network (NN) with evolving structure is discussed in this study. We refer to models with amorphous structure as flexible models.
There is a close link between different types of flexible models: fuzzy models, fuzzy NNs, and general regression models. All of them are proven universal approximators and some of them (the Takagi-Sugeno fuzzy model with singleton outputs and the radial-basis function network) are interchangeable. The evolving NN (eNN) considered here makes use of the recently introduced on-line approach to identification of Takagi-Sugeno fuzzy models with evolving structure (eTS). Both eTS and eNN differ from the other model schemes by their gradually evolving structure as opposed to fixed-structure models, in which only parameters are subject to optimization or adaptation. The learning algorithm is incremental and combines unsupervised on-line recursive clustering and supervised recursive on-line output parameter estimation. The eNN has potential in modeling, control (if combined with the indirect learning mechanism), fault detection and diagnostics, etc. Its computational efficiency is based on the noniterative and recursive procedure, which combines the Kalman filter with proper initializations and on-line unsupervised clustering. The eNN has been tested with data from a real air-conditioning installation. Applications to real-time adaptive nonlinear control, fault detection and diagnostics, performance analysis, time-series forecasting, and knowledge extraction and accumulation are possible directions of future research. © 2004 Wiley Periodicals, Inc. [source]

Incremental learning of collaborative classifier agents with new class acquisition: An incremental genetic algorithm approach
INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, Issue 11 2003
Sheng-Uei Guan
A number of soft computing approaches such as neural networks, evolutionary algorithms, and fuzzy logic have been widely used for classifier agents to adaptively evolve solutions on classification problems. However, most work in the literature focuses on the learning ability of the individual classifier agent.
This article explores incremental, collaborative learning in a multiagent environment. We use the genetic algorithm (GA) and incremental GA (IGA) as the main techniques to evolve the rule set for classification and apply new class acquisition as a typical example to illustrate the incremental, collaborative learning capability of classifier agents. Benchmark data sets are used to evaluate the proposed approaches. The results show that GA and IGA can be used successfully for collaborative learning among classifier agents. © 2003 Wiley Periodicals, Inc. [source]

The successful management of organisational change in tourism SMEs: initial findings in UK visitor attractions
INTERNATIONAL JOURNAL OF TOURISM RESEARCH, Issue 4 2008
Rune Todnem By
Abstract Organisational change management theory for small and medium-sized enterprises (SMEs) within the tourism industry is an under-researched field. Changing political, economic, social and technological factors can leave unprepared SMEs exposed to external as well as internal pressures, which can lead to underperformance or, in the worst-case scenario, business failure. This paper, reporting on the findings of exploratory research of nine UK-based visitor attractions, all qualifying as SMEs, suggests that the successful management of change is crucial for SMEs' survival and success. The findings argue that the current approach taken to organisational change management within the industry is bumpy incremental, bumpy continuous and planned. Hence, the paper provides a framework for managing organisational change based on eight critical success factors identified by the study: adaptability and flexibility, commitment and support, communication and co-operation, continuous learning and improvement, formal strategies, motivation and reward, pragmatism, and the right people. Copyright © 2008 John Wiley & Sons, Ltd.
[source] Japan's never-ending social security reforms INTERNATIONAL SOCIAL SECURITY REVIEW, Issue 4 2002 Noriyuki Takayama This paper examines the implications of the 2002 population projections for future trends in pension and healthcare costs in Japan. Current redistributive pension and healthcare programmes have resulted in considerably higher per capita income for the aged than for the non-aged population. Substantive reforms are needed to lessen the extent of such redistribution, but political considerations have meant that only incremental reforms have been feasible. A start, however, has been made on introducing private initiatives in pensions and on shifting from the command-and-control model operated by the central government to a contracting model for healthcare. [source] Highly Valued Equity and Discretionary Accruals JOURNAL OF BUSINESS FINANCE & ACCOUNTING, Issue 1-2 2010 Robert E. Houmes Abstract: Overvalued equity provides a strong incentive for managers to report earnings that do not disappoint the market (Jensen, 2005). We find that this can be extended to highly valued equity more generally. In the year following classification as highly valued, and compared to firms with less extreme valuations, highly valued firms have significantly higher discretionary accruals and exhibit a more pronounced positive association between discretionary accruals and proxies for the likelihood of failing to meet earnings targets. These findings are consistent with the use of discretionary accruals to manage earnings in support of extreme valuation. Because highly valued equity will likely result in CEOs holding valuable stock and stock-option portfolios, we test whether, and show that, the overvalued equity incentive is incremental to a CEO's equity portfolio incentive.
One implication is that directors and audit committees should be especially on guard for possible earnings management when a firm has extremely high valuation multiples and when the CEO has a lot of equity at risk. [source] Implied Standard Deviations and Post-earnings Announcement Volatility JOURNAL OF BUSINESS FINANCE & ACCOUNTING, Issue 3-4 2002 Daniella Acker This paper investigates volatility increases following annual earnings announcements. Standard deviations implied by option prices are used to show that announcements of bad news result in a lower volatility increase than those of good news, and delay the increase by a day. Reports that are difficult to interpret also delay the volatility increase. This delay is incremental to that caused by reporting bad news, although the effect of bad news on slowing down the reaction time is dominant. It is argued that the delays reflect market uncertainty about the implications of the news. [source] Factors Associated with Differences in the Magnitude of Abnormal Returns Around NYSE Versus Nasdaq Firms' Earnings Announcements JOURNAL OF BUSINESS FINANCE & ACCOUNTING, Issue 9-10 2001 Youngsoon Susan Cheon This study provides an explanation for the 'exchange effect' puzzle documented in prior accounting research. Grant (1980) finds that the magnitude of earnings announcement week abnormal returns is higher, on average, for firms traded over-the-counter than for NYSE firms. Atiase (1987) shows that this incremental 'exchange effect' persists even after controlling for firm size. We investigate potential explanations for this incremental exchange effect. We first show that even after controlling for differences in firm size, Nasdaq firms have less rich information environments and enjoy greater growth opportunities than NYSE firms. We then investigate whether differential predisclosure information environments and/or growth opportunities can explain the incremental exchange effect.
The results indicate that although the absolute magnitude of the earnings announcement-related abnormal returns is inversely related to proxies for the amount of predisclosure information, the incremental exchange effect cannot be explained by differences in the predisclosure information environment. In contrast, after controlling for differences in growth opportunities across NYSE versus Nasdaq firms, and investors' heightened sensitivity to Nasdaq firms' growth opportunities in particular, there is no significant incremental exchange effect (whether or not we control for predisclosure information). These results suggest that the incremental exchange effect puzzle documented in prior research is more likely to reflect growth-related phenomena than differences in the predisclosure information environment. [source] Psychometric properties of the Brief Life Satisfaction Scales JOURNAL OF CLINICAL PSYCHOLOGY, Issue 1 2004 Bernard Lubin The development, psychometric characteristics, and clinical utility of a brief measure of life satisfaction, the Brief Life Satisfaction Scales (BLSS), are presented. Factor analysis of the 10-item scale yielded three factors: self-satisfaction, interpersonal satisfaction, and social role satisfaction. Reliability (internal consistency and retest) and validity (concurrent, construct, discriminant, known group, predictive, and incremental) were adequate to good. Clinical uses of the BLSS and additional studies are discussed. © 2003 Wiley Periodicals, Inc. J Clin Psychol. [source] Community organizational learning: Case studies illustrating a three-dimensional model of levels and orders of change JOURNAL OF COMMUNITY PSYCHOLOGY, Issue 3 2007 Douglas D.
Perkins We present a three-dimensional cube framework to help community organizational researchers and administrators think about an organization's learning and empowerment-related structures and processes in terms of first-order (incremental or ameliorative) and second-order (transformative) change at the individual, organizational, and community levels. To illustrate application of the framework, case studies of three different types of exemplary nonprofit organizations (a participatory neighborhood planning organization, a grassroots faith-based social action coalition, and a larger community-based human service agency) were based on qualitative interviews and participant observations. Our analysis, rooted in organizational learning theory, suggests that organizations that empower staff and volunteers through opportunities for learning and participation at the individual level are better able to succeed in terms of organizational-level learning and transformation. Community-level change is particularly difficult but must be made a more explicit goal. Learning that can lead to second-order change at each level must help participants engage in critical analysis of (a) the organization's demonstrated goals and values; (b) the power relationships implicit in decision making at each level; (c) the interdependent role of participant stakeholders and organizations as part of a complex, community-wide (or larger) system; and (d) how to work toward transformative change of all of the above. © 2007 Wiley Periodicals, Inc. [source] Sustainable development and institutional change: evidence from the Tiogo Forest in Burkina Faso JOURNAL OF INTERNATIONAL DEVELOPMENT, Issue 8 2007 Philippe Dulbecco Abstract The management of forest resources in developing countries is often inefficient, and this is particularly the case when forests are a public good managed by the state. These inefficiencies are generally the result of both externalities and free-riding behaviour.
The solution usually considered is to change the property rights structure of the resource, that is, privatisation of forests. It appears, however, that privatisation also has inefficiencies of its own, particularly when it is imposed on local populations. The aim of our contribution is to go beyond the usual state management versus privatisation debate, and to propose instead a property rights structure and related co-ordination scheme which take into account the specific institutional circumstances of the economic setting in which the natural resources are being exploited. The purpose is to suggest solutions based on the need to attain coherence between the external institutional structure and the behaviour of local players. In other words, the challenge is to establish the conditions necessary for an induced, rather than imposed, institutional change. A property rights structure of a resource must consequently be analysed from two perspectives. The first, and more traditional one, sees property rights as an efficient institutional structure of production enabling a reduction in transaction costs. The second proposes to evaluate any given property rights structure from the standpoint of its ability to offer a solution to the issue of an effective link between the legal framework and the behaviour of the players. Our analysis will make use of our knowledge of the forest of Tiogo in Burkina Faso, based on a survey organised in 12 riverside villages using a sample of 300 households. The case of the Tiogo Forest suggests that institutional change needs to follow an incremental and path-dependent process within which the state is invited to play a major role together with the local communities. Indeed, the institutional choices of the Tiogo Forest households indicate that they favour an inclusion of the local population in resource management and co-administration of forestry resources with the state.
Such an institutional structure favours a negotiated rather than an imposed scheduling of measures, and seeks a minimum of consensus to ensure the adherence of actors and users to the new institutional arrangements, whilst limiting the number of bad players. Copyright © 2007 John Wiley & Sons, Ltd. [source]