New Variable (new + variable)

Selected Abstracts


Operational Risk Measurement in Banking Institutions and Investment Firms: New European Evidences

FINANCIAL MARKETS, INSTITUTIONS & INSTRUMENTS, Issue 4 2008
Enrique Bonsón
The banking/investment sector must deal with a new variable, Operational Risk, to explain various recent crises and bankruptcies. Operational Risk, which can be defined briefly as the risk generated by possible failures of an entity's Information Systems (IS), must be measured, covered, mitigated, and managed by applying a series of methodologies, each of which assumes that the IS of the bank operates at a certain Stage of Sophistication. The present study proposes a scheme of evolution that details the stages of enhancement in IS sophistication that banking entities may implement, so as to be capable of capturing, mitigating, and managing Operational Risk. Using econometric methods, we create a proxy variable to capture the IS sophistication of each entity. The effect of entity size is then analyzed, the country effect is explored, and the importance of intangible assets is weighed, among other entity characteristics. Entity size is revealed as the variable with the most influence on the plans formulated in this respect by European entities, against the other variables considered in the present study, such as the country effect or the importance of intangible assets. The work shows that IS decisions relating to Operational Risk management are strongly influenced by size, which could introduce competitive differences in the European banking system. [source]


Financial Intermediaries and Interest Rate Risk: II

FINANCIAL MARKETS, INSTITUTIONS & INSTRUMENTS, Issue 5 2006
Sotiris K. Staikouras
The current work extends and updates the previous survey (Staikouras, 2003) by looking at other aspects of financial institutions' yield sensitivity. The study starts with an extensive discussion of the origins of asset-liability management and the subsequent work to identify effective ways of measuring and managing interest rate risk. The discussion encompasses both regulatory and market-based approaches, along with the issues surrounding their applicability. The literature is enriched by recognizing that structural and regulatory shifts affect financial institutions in different ways depending on the size and nature of their activities. It is also noted that such shifts could change a bank's riskiness and force banks to adjust their balance sheet size by altering their maturity intermediation function. Besides yield changes, market cycles are also held responsible for asymmetric effects on corporate values. Furthermore, nonstandard investigations are considered, in which embedded options and basis risk are significant above and beyond the intermediary's rate sensitivity, while shocks to the slope of the yield curve are identified as a new variable. When the discount privilege is modeled as an option, it is shown that its value is incorporated in the equities of qualifying banks. Finally, volatility clustering is further established, while constant relative risk aversion is not present in the U.S. market. Although some empirical findings may be quite mixed, there is a general consensus that all forms of systematic risk, risk premia, and the risk-return trade-off exhibit some form of variability, not only over time but also across corporate sizes and segments. [source]


Optimal Thermal Unit Commitment Integrated with Renewable Energy Sources Using Advanced Particle Swarm Optimization

IEEJ TRANSACTIONS ON ELECTRICAL AND ELECTRONIC ENGINEERING, Issue 5 2009
Shantanu Chakraborty Student member
Abstract This paper presents a methodology for solving the generation planning problem for thermal units integrated with wind and solar energy systems. The renewable energy sources are included in this model because of their low electricity cost and positive effect on the environment. The generation planning problem, also known as the unit commitment problem, is solved by an improved binary particle swarm optimization (PSO) algorithm driven by genetic algorithm operators. Unlike conventional PSO, this algorithm runs the refinement process through the solutions within multiple populations. Genetic algorithm operators such as crossover, elitism, and mutation are stochastically applied to the higher-potential solutions to generate new solutions for the next population. The PSO includes a new variable for updating velocity in accordance with the population best, along with the conventional particle best and global best. The algorithm performs effectively on thermal power systems of various sizes with equivalent solar and wind energy systems and is able to produce high-quality (minimized production cost) solutions. The solution model is also beneficial for restructured, deregulated power systems. The simulation results show the effectiveness of this algorithm by comparing the outcome with several established methods. Copyright © 2009 Institute of Electrical Engineers of Japan. Published by John Wiley & Sons, Inc. [source]
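The abstract does not spell out the update rule; below is a minimal sketch, assuming the population best enters the velocity update as a third attraction term alongside the particle best and global best (the coefficient values and variable names are hypothetical, not the authors'):

```python
import numpy as np

def update_velocity(v, x, pbest, gbest, popbest,
                    w=0.7, c1=1.5, c2=1.5, c3=1.5, rng=np.random):
    """Velocity update with an extra population-best attraction term.

    v, x     : current velocity and position of one particle
    pbest    : best position found so far by this particle
    gbest    : best position found across all populations
    popbest  : best position within this particle's own population
    """
    r1, r2, r3 = rng.random(3)          # independent uniform random factors
    return (w * v
            + c1 * r1 * (pbest - x)     # cognitive (particle-best) pull
            + c2 * r2 * (gbest - x)     # social (global-best) pull
            + c3 * r3 * (popbest - x))  # new population-best pull
```

In the binary PSO setting described here, the updated velocity would then typically be passed through a sigmoid to yield bit-flip probabilities for the unit on/off states.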


From mixed finite elements to finite volumes for elliptic PDEs in two and three dimensions

INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN ENGINEERING, Issue 3 2004
Anis Younes
Abstract The link between Mixed Finite Element (MFE) and Finite Volume (FV) methods applied to elliptic partial differential equations has been investigated by many authors. Recently, a FV formulation of the mixed approach has been developed. That approach was restricted to 2D problems with a scalar parameter used to calculate fluxes from the gradient of the state variable. This new approach is extended to 2D problems with a full parameter tensor and to 3D problems. The objective of this new formulation is to reduce the total number of unknowns while keeping the same accuracy. This is achieved by defining one new variable per element. For the 2D case with a full parameter tensor, this new formulation exists for any kind of triangulation. It allows the number of unknowns to be reduced to the number of elements instead of the number of edges. No additional assumptions are required concerning the averaging of the parameter in heterogeneous domains. For 3D problems, we demonstrate that the new formulation cannot exist for a general 3D tetrahedral discretization, unlike in the 2D problem. However, it does exist when the tetrahedra are regular or deduced from rectangular parallelepipeds, and it allows a reduction of the number of unknowns. Numerical experiments and comparisons between both formulations in 2D show the efficiency of the new formulation. Copyright © 2003 John Wiley & Sons, Ltd. [source]
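For orientation, a minimal sketch of the setting in our notation (the abstract itself does not give the discrete construction). The model problem is

$$ \mathbf{q} = -\mathsf{K}\,\nabla u, \qquad \nabla\cdot\mathbf{q} = f \quad \text{in } \Omega. $$

In the hybridized lowest-order mixed method the unknowns are edge pressures, one per edge; the formulation discussed here works instead with one unknown per element, and since a 2D triangulation has roughly 1.5 edges per element, this is where the reduction in system size comes from.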


Implementation of a stabilized finite element formulation for the incompressible Navier–Stokes equations based on a pressure gradient projection

INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN FLUIDS, Issue 4 2001
Ramon Codina
Abstract We discuss in this paper some implementation aspects of a finite element formulation for the incompressible Navier–Stokes equations which allows the use of equal-order velocity–pressure interpolations. The method consists of introducing the projection of the pressure gradient and adding the difference between the pressure Laplacian and the divergence of this new field to the incompressibility equation, both multiplied by suitable algorithmic parameters. The main purpose of this paper is to discuss how to deal with the new variable in the implementation of the algorithm. Obviously, it could be treated as one extra unknown, either explicitly or as a condensed variable. However, we take for granted that the only way for the algorithm to be efficient is to uncouple it from the velocity–pressure calculation in one way or another. Here we discuss some iterative schemes to perform this uncoupling of the pressure gradient projection (PGP) from the calculation of the velocity and the pressure, both for the stationary and the transient Navier–Stokes equations. In the first case, the strategies analyzed refer to the interaction of the linearization loop and the iterative segregation of the PGP, whereas in the second the main dilemma concerns the explicit or implicit treatment of the PGP. Copyright © 2001 John Wiley & Sons, Ltd. [source]
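A minimal sketch of the modified incompressibility equation, assuming the usual discrete form of this kind of stabilization (notation is ours: $\tau_K$ is the algorithmic parameter on element $K$, $P_h$ the $L^2$ projection onto the velocity space, $q_h$ a pressure test function):

$$ (q_h,\, \nabla\cdot\mathbf{u}_h) + \sum_K \tau_K \big( \nabla p_h - \boldsymbol{\pi}_h,\; \nabla q_h \big)_K = 0, \qquad \boldsymbol{\pi}_h = P_h(\nabla p_h). $$

The $\tau$-weighted term is the discrete counterpart of the difference between the pressure Laplacian and the divergence of the projected gradient; lagging $\boldsymbol{\pi}_h$ between iterations is what uncouples it from the velocity–pressure solve.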


Common Risk Factors Versus a Mispricing Factor of Tokyo Stock Exchange Firms: Inquiries into the Fundamental Value Derived from Analyst Earnings Forecasts

INTERNATIONAL REVIEW OF FINANCE, Issue 3 2009
KEIICHI KUBOTA
ABSTRACT We search for common factors and/or a mispricing factor for Tokyo Stock Exchange firms. We utilize the Edwards–Bell–Ohlson model to compute the firms' fundamental value and divide this value by the firms' market price to construct a new variable called the 'value-to-price ratio' (VPR). We find that this VPR variable can generate abnormal returns even after adjusting for the risk factors related to portfolio style differences. To find out whether it is indeed a risk factor or simply a characteristic, we construct return difference portfolios of the high-VPR stocks minus the low-VPR stocks and call this portfolio the upward-forecast minus downward-forecast (UMD) factor. Fama and MacBeth tests indicate that the risk premium for this UMD factor is positive. The best model in terms of the adjusted R2 value is the four-factor model in which the UMD factor is added to the Fama and French three factors. GMM Euler condition tests reveal that the UMD factor helps to price assets and that the four-factor model is not rejected. We conclude that the VPR variable contains new information content that is not captured by the conventional Fama and French three factors. [source]
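A sketch of the constructions named here, in our notation and assuming the standard factor-model form ($V_{it}$ is the Edwards–Bell–Ohlson fundamental value, $P_{it}$ the market price):

$$ \mathrm{VPR}_{it} = \frac{V_{it}}{P_{it}}, \qquad \mathrm{UMD}_t = r_t^{\text{high VPR}} - r_t^{\text{low VPR}}, $$

$$ r_{it} - r_{ft} = \alpha_i + \beta_i\,\mathrm{MKT}_t + s_i\,\mathrm{SMB}_t + h_i\,\mathrm{HML}_t + u_i\,\mathrm{UMD}_t + \varepsilon_{it}, $$

where the last equation is the four-factor specification that adds UMD to the Fama and French three factors.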


Third-Person Effects and the Environment: Social Distance, Social Desirability, and Presumed Behavior

JOURNAL OF COMMUNICATION, Issue 2 2005
Jakob D. Jensen
Previous research has documented third-person effects (persons presuming that others will be more susceptible to media effects than they themselves are) and explored moderators such as social desirability (the effect reverses when the media effects are undesirable) and social distance (the effect increases as the social distance from the self increases). In a study of environmental news coverage, the authors observed the general third-person effect and the moderating role of social desirability; however, they also found that social distance affected presumed influence in complex ways reflecting varying perceptions of issue relevance for the comparison groups. A new variable, presumed behavior (the presumed effect of media coverage on others' behavior), was found to be independent of presumed influence and to offer improved prediction of perceivers' behavioral intentions. [source]


Fast, three-dimensional free-breathing MR imaging of myocardial infarction: A feasibility study

MAGNETIC RESONANCE IN MEDICINE, Issue 5 2004
Manojkumar Saranathan
Abstract Imaging delayed hyperenhancement of myocardial infarction is most commonly performed using an inversion recovery (IR) prepared 2D breathhold segmented k-space gradient echo (FGRE) sequence. Since only one slice is acquired per breathhold in this technique, 12–16 successive breathholds are required for complete anatomical coverage of the heart. This prolongs the overall scan time and may be exhausting for patients. A navigator-echo gated, free-breathing, 3D FGRE sequence is proposed that can be used to acquire a single slab covering the entire heart with high spatial resolution. The use of a new variable sampling in time (VAST) acquisition scheme enables the entire 3D volume to be acquired in 1.5–2 min, minimizing artifacts from bulk motion and diaphragmatic drift, as well as contrast variations due to contrast media washout. Magn Reson Med 51:1055–1060, 2004. © 2004 Wiley-Liss, Inc. [source]


On the numerical approach of the enthalpy method for the Stefan problem

NUMERICAL METHODS FOR PARTIAL DIFFERENTIAL EQUATIONS, Issue 4 2004
Khaled Omrani
Abstract In this article an error bound is derived for a piecewise linear finite element approximation of an enthalpy formulation of the Stefan problem. We analyze a semidiscrete Galerkin approximation and a completely discrete scheme based on the backward Euler method; a linearized scheme is also given and its convergence proved. Second-order error estimates are derived for the Crank–Nicolson Galerkin method. In the second part, a new class of finite difference schemes is proposed. Our approach is to introduce a new variable and transform the given equation into an equivalent system of equations. We then prove that the difference scheme is second-order convergent. © 2004 Wiley Periodicals, Inc. Numer Methods Partial Differential Eq, 2004 [source]
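For context, a minimal sketch of the enthalpy formulation in question, assuming the standard two-phase Stefan model with the melting temperature normalized to zero and latent heat $L$ ($u$ is temperature, $H$ enthalpy):

$$ \frac{\partial H}{\partial t} = \Delta u, \qquad H \in \beta(u) = \begin{cases} u, & u < 0, \\ [0,\,L], & u = 0, \\ u + L, & u > 0, \end{cases} $$

so the moving phase boundary is absorbed into the multivalued graph $\beta$ rather than tracked explicitly.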


Resource abundance vs. resource dependence in cross-country growth regressions

OPEC ENERGY REVIEW, Issue 2 2010
Annika Kropf
Having analysed the macroeconomic performance of large oil exporters, I found that, in many cases, rents from natural resources have been successfully used to enhance economic growth. Nevertheless, adherents of the 'resource curse' seem to have found ample evidence suggesting that resource-abundant countries grow more slowly than resource-poor countries. A review of empirical research on the 'resource curse' reveals that the variables used were usually proxies for resource dependence. These variables introduce a bias, making less developed economies per se more resource 'abundant' than developed economies. As a consequence, a new variable, not containing any information on a country's stage of development, was introduced. Comparing the variables on resource dependence and resource abundance in a model by Sachs and Warner, resource abundance was not significant. In a new model, resource abundance was even positively correlated with growth. [source]


Island clustering analysis for the comparison of the membrane and the soluble protein fractions of human brain proteome

PROTEINS: STRUCTURE, FUNCTION AND BIOINFORMATICS, Issue 6 2008
Kyung-Hoon Kwon
Abstract A protein identified in multiple separate bands of a 1-D gel reflects variation in the molecular weight caused by alternative splicing, endoproteolytic cleavage, or PTMs, such as glycosylation or ubiquitination. To characterize such a protein distribution over the bands, we defined an entity called an 'island' as the band region comprising the sequential bands in which the same protein is identified. We quantified the island distribution using a new variable called the Iscore. Previously, as described in Park et al. (Proteomics 2006, 6, 4978–4986), we analyzed human brain tissue using a multidimensional MS/MS separation method. Here, the new method of island analysis was applied to the previous proteome data. The soluble and membrane protein fractions of human brain tissue were reanalyzed using the island distribution. The proteome of the soluble fraction exhibited more variation in island positions than that of the membrane fraction. Through the island analysis, we identified protein modifications and protein complexes over the 1-D gel bands. [source]
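The abstract defines an island only informally; the sketch below shows just the grouping step, assuming bands are indexed from the top of the gel and an island is a maximal run of consecutive band indices in which the same protein is identified (the Iscore formula is not given in the abstract, so it is not reproduced here):

```python
from itertools import groupby

def find_islands(band_indices):
    """Group the gel-band indices of one protein into islands:
    maximal runs of consecutive bands, e.g. [3, 4, 5, 9] -> [[3, 4, 5], [9]]."""
    bands = sorted(set(band_indices))
    islands = []
    # consecutive indices share a constant (value - position) offset
    for _, run in groupby(enumerate(bands), key=lambda t: t[1] - t[0]):
        islands.append([b for _, b in run])
    return islands

print(find_islands([3, 4, 5, 9]))  # [[3, 4, 5], [9]]
```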


Advanced monitoring of high-rate anaerobic reactors through quantitative image analysis of granular sludge and multivariate statistical analysis

BIOTECHNOLOGY & BIOENGINEERING, Issue 2 2009
J.C. Costa
Abstract Four organic loading disturbances were performed in lab-scale EGSB reactors fed with ethanol. In load disturbances 1 (LD1) and 2 (LD2), the organic loading rate (OLR) was increased from 5 to 18.5 kg COD m⁻³ day⁻¹, through an increase in the influent ethanol concentration and a decrease in the hydraulic retention time from 7.8 to 2.5 h, respectively. Load disturbances 3 (LD3) and 4 (LD4) were applied by increasing the OLR to 50 kg COD m⁻³ day⁻¹ for 3 days and 16 days, respectively. The granular sludge morphology was quantified by image analysis and was related to the reactor performance, including effluent volatile suspended solids, an indicator of washout events. In general, selective washout of filamentous forms was observed, associated with granule erosion/fragmentation and with a decrease in the specific acetoclastic activity. These phenomena induced a transitory deterioration of reactor performance in LD2, LD3, and LD4, but not in LD1. Extending the exposure time in LD4 promoted acetogenesis inhibition after 144 h. The application of Principal Components Analysis determined a latent variable that encompasses a weighted sum of performance, physiological, and morphological information. This new variable was highly sensitive to reactor efficiency deterioration, encompassing variations between 27% and 268% in the first hours of the disturbances. The high loadings attained by the image analysis parameters, especially filament length per aggregate area (LfA), revealed that morphological changes of the granular sludge should be considered to monitor and control load disturbances in high-rate anaerobic (granular) sludge bed digesters. Biotechnol. Bioeng. 2009;102:445–456. © 2008 Wiley Periodicals, Inc. [source]
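A minimal sketch of this kind of latent-variable monitoring, assuming standardized variables and scikit-learn; the stand-in data matrix and column choices are illustrative, not the authors' dataset:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# stand-in monitoring matrix: rows = sampling times, columns = performance,
# physiological and morphological variables (e.g. COD removal efficiency,
# acetoclastic activity, filament length per aggregate area LfA)
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 8))

# first principal component = latent variable: a weighted sum of all
# measured variables, tracked over time to flag load-disturbance effects
latent = PCA(n_components=1).fit_transform(StandardScaler().fit_transform(X))
print(latent.ravel()[:5])
```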


Maximizing revenue in Grid markets using an economically enhanced resource manager

CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 14 2010
M. Macías
Abstract Traditional resource management has had as its main objective the optimization of throughput, based on parameters such as CPU, memory, and network bandwidth. With the appearance of Grid markets, new variables that determine economic expenditure, benefit and opportunity must be taken into account. The Self-organizing ICT Resource Management (SORMA) project aims at allowing resource owners and consumers to exploit market mechanisms to sell and buy resources across the Grid. SORMA's motivation is to achieve efficient resource utilization by maximizing revenue for resource providers and minimizing the cost of resource consumption within a market environment. An overriding factor in Grid markets is the need to ensure that the desired quality of service levels meet the expectations of market participants. This paper explains the proposed use of an economically enhanced resource manager (EERM) for resource provisioning based on economic models. In particular, this paper describes techniques used by the EERM to support revenue maximization across multiple service level agreements and provides an application scenario to demonstrate its usefulness and effectiveness. Copyright © 2008 John Wiley & Sons, Ltd. [source]


Security Theory in the "New Regionalism"

INTERNATIONAL STUDIES REVIEW, Issue 2 2007
Robert E. Kelly
The relevance of regional security theories has grown in the wake of the Cold War. The global system has more participants (it is less Eurocentric, with Third World states having greater autonomy and involvement) and is clearly unipolar, shifting the locus of conflict down from the global level. A new wave of regionalist scholarship has arisen in response. This review identifies this literature's central themes and suggested new variables. Its foundational and most contested challenge to international relations (IR) theory revolves around the autonomy of a regional level of analysis between the state and the globe. Accepting such autonomy, the literature broadly settles on three variables specific to regional structures. First, regional subsystems are porous: intervention from above can overlay local dynamics. Second, proximity qualifies the security dilemma dramatically; most states only threaten their neighbors, thus creating meaningful and distinct regional dynamics. Third, weak-state-dominant regional complexes generate a shared internal security dilemma that trumps the external one. Regional organizations serve to repress shared centrifugal threats through pooled rather than ceded sovereignty. [source]


Hierarchical principal component analysis (PCA) and projection to latent structure (PLS) technique on spectroscopic data as a data pretreatment for calibration

JOURNAL OF CHEMOMETRICS, Issue 4 2001
K. Janné
Abstract Spectroscopic data consist of several hundred to several thousand variables, most of which are autocorrelated. When PCA and PLS techniques are used for the interpretation of these kinds of data, the loading plots are usually complex due to the covariation in the spectrum, and therefore difficult to correlate to the corresponding score plot. One of the standard methods used to decrease the influence of light scatter or shifts of the spectra is the multiplicative scatter correction technique. Another technique is the hierarchical multiblock segmentation technique, where new variables are created from the original data by blocking the spectra into sub-spectra and then projecting the sub-spectra by PCA. These new variables are then used in the subsequent PCA or PLS calculations. This technique reduces the random and unwanted signals from, for example, light scatter, while conserving all systematic information in the signals; the greatest advantage, however, is that it gives an easier interpretation of the correlation between the scores and the loadings. Two examples are presented, attenuated total reflection (ATR) and NIR, which show the advantages as well as the implementation of the method. Copyright © 2001 John Wiley & Sons, Ltd. [source]
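A minimal sketch of the blocking step, assuming equal-width contiguous blocks and using scikit-learn for the per-block PCA (the block count and component numbers are illustrative, not the authors' settings):

```python
import numpy as np
from sklearn.decomposition import PCA

def block_scores(X, n_blocks=10, n_comp=2):
    """Split each spectrum into contiguous wavelength blocks, run a PCA
    per block, and return the concatenated block scores as new variables."""
    blocks = np.array_split(X, n_blocks, axis=1)  # X: samples x wavelengths
    return np.hstack([PCA(n_components=n_comp).fit_transform(B)
                      for B in blocks])

X = np.random.rand(40, 1050)   # stand-in for a real spectral matrix
X_new = block_scores(X)        # feeds the top-level PCA/PLS model
print(X_new.shape)             # (40, 20)
```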


INTERCITY RENT DIFFERENTIALS IN THE U.S. HOUSING MARKET 2000: UNDERSTANDING RENT VARIATIONS AS A SOCIOLOGICAL PHENOMENON

JOURNAL OF URBAN AFFAIRS, Issue 4 2009
JOHN I. GILDERBLOOM
ABSTRACT: This study extends the intercity rent differentials investigation by Gilderbloom and Appelbaum (1988) in relatively independent housing markets to see how it can be replicated using U.S. census data from the year 2000 against the 1970 and 1980 models, with the addition of several new variables to measure their impact on intercity rents. We find that region, race, and climate no longer explain rent differentials in 2000 as they did in the 1980 research, while affirming that a large percentage of old houses and small mom-and-pop landlords cause rents to fall. We find that both the cost of homeownership and the level of household income remain critical factors in explaining the level of median rent across cities. We also find a strong correlation between cities with extensive anti-war activity in the late 1960s and same-sex households having higher rents, although more research needs to be done before we argue a causal relationship. We contend that sociology needs to be put back into the equation in order to understand how rents vary from city to city. Our explanation of rent variations adds a social dimension that most other research misses. We also show how the amount of explanatory power is increased significantly by adding a sociological dimension. [source]


OF POLITICS AND PURPOSE: POLITICAL SALIENCE AND GOAL AMBIGUITY OF US FEDERAL AGENCIES

PUBLIC ADMINISTRATION, Issue 3 2009
JUNG WOOK LEE
As scholars have observed, government agencies have ambiguous goals. Very few large-sample empirical studies, however, have tested such assertions and analysed variations among organizations in the characteristics of their goals. Researchers have developed concepts of organizational goal ambiguity, including 'evaluative goal ambiguity' and 'priority goal ambiguity', and found that these goal ambiguity variables related meaningfully to financial publicness (the degree of government funding versus prices or user charges), regulatory responsibility, and other variables. This study analyses the influence of the external political environment (external political authorities and processes) on goal ambiguity in government agencies; many researchers have analysed external influences on government bureaucracies, but very few have examined the effects on the characteristics of the organizations, such as their goals. This analysis of 115 US federal agencies indicates that higher 'political salience' to Congress, the president, and the media relates to higher levels of goal ambiguity. A newly developed analytical framework for the analysis includes components for external environmental influences, organizational characteristics, and managerial influences, with new variables that represent components of the framework. Higher levels of political salience relate to higher levels of both types of goal ambiguity; components of the framework, however, relate differently to evaluative goal ambiguity than to priority goal ambiguity. The results contribute evidence of the viability of the goal ambiguity variables and the political environment variables. The results also show the value of bringing together concepts from organization theory and political science to study the effects of political environments on characteristics of government agencies. [source]


International Organization as a Seal of Approval: European Union Accession and Investor Risk

AMERICAN JOURNAL OF POLITICAL SCIENCE, Issue 4 2009
Julia Gray
Much of the literature on international institutions argues that membership regularizes expectations about members' future behavior. Using the accession of the postcommunist countries as a test case, this article argues that the EU can send strong signals to financial markets about the trajectory of a particular country. Examining spreads on sovereign debt from 1990 to 2006, this article shows that closing negotiation chapters on domestic economic policy (in other words, receiving a seal of approval from Brussels that previously existing policy reform is acceptable to the wider EU) substantially decreases perceptions of default risk in those countries. That decrease operates independently from policy reform that the country has taken and is also distinct from selection processes (modeled here with new variables, including UNESCO World Heritage sites and domestic movie production, that proxy for cultural factors). Thus, this particular international organization has played an important role in coordinating market sentiment on members, conferring confidence that policy reform alone could not accomplish. [source]


Finding the most variable stars in the Orion Belt with the All Sky Automated Survey

ASTRONOMISCHE NACHRICHTEN, Issue 3 2010
J.A. Caballero
Abstract We look for high-amplitude variable young stars in the open clusters and associations of the Orion Belt. We use public data from the ASAS-3 Photometric V-band Catalogue of the All Sky Automated Survey, infrared photometry from the 2MASS and IRAS catalogues, proper motions, and the Aladin sky atlas to obtain a list of the most variable stars in a survey area of side 5° centred on the bright star Alnilam (ε Ori) in the centre of the Orion Belt. We identify 32 highly variable stars, of which 16 had not been reported to vary before. They are mostly variable young stars and candidates (16) and background giants (8), but there are also field cataclysmic variables, contact binaries, and eclipsing binary candidates. Of the young stars, which typically are active Herbig Ae/Be and T Tauri stars with Hα emission and infrared flux excess, we discover four new variables and confirm the variability status of another two. Some of them belong to the well-known σ Orionis cluster. In addition, six of the eight giants are new variables, and three are new periodic variables (© 2010 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source]