Easy Task (easy + task)

Selected Abstracts


Comparison of Additional Costs for Several Replacement Strategies of Randomly Ageing Reinforced Concrete Pipes

COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, Issue 7 2009
Franck Schoefs
Some reinforced concrete pipes (RCPs) carry seawater and can deteriorate over time because of internal corrosion. Because of the low O2 content of the aggressive water, slow corrosion is expected for such applications. If the RCPs are not periodically replaced, they will eventually fail. Replacement strategies for these pipes depend on (1) the risks associated with the failure of the water distribution network, and (2) the costs associated with replacing the pipes, including the removal of existing pipes, installation of new pipes, and associated production losses. Because of the lack of statistical data regarding RCP failure, the development of a risk-based replacement strategy is not an easy task. This article demonstrates how predictive models for the evolution of the failure of RCPs and the associated consequences of failure can be used to develop risk-based replacement strategies for RCPs. An application for the replacement strategies of a network modeled as a system consisting of 228 RCPs is presented as a case study. We focus on assessing the number of replaced components, which governs the costs. The main objective of this article is to provide a theoretical approach for comparing replacement strategies, based on (1) the results of a reliability study, (2) the representation of the distributions of failed components (binomial distribution), and (3) the decision tree representation for replacement of RCPs. A focus on the scatter of the induced costs themselves is suggested to emphasize the financial risk. [source]
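The abstract's costing ingredients, a binomial count of failed components scaled by a unit replacement cost, can be sketched in a few lines. This is a minimal illustration, not the authors' model: only the 228-component system size comes from the case study, while the failure probability and unit cost below are hypothetical placeholders for what a reliability study would supply.

```python
from math import comb

def binomial_pmf(k: int, n: int, p: float) -> float:
    """Probability that exactly k of n independent pipes have failed."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def expected_replacement_cost(n: int, p: float, unit_cost: float) -> float:
    """Expected cost when every failed pipe is replaced (equals n*p*unit_cost)."""
    return sum(k * binomial_pmf(k, n, p) * unit_cost for k in range(n + 1))

def replacement_cost_std(n: int, p: float, unit_cost: float) -> float:
    """Scatter of the induced cost: binomial std dev sqrt(n*p*(1-p)) times unit cost."""
    return (n * p * (1 - p)) ** 0.5 * unit_cost

# 228 pipes from the case study; p and unit_cost are hypothetical inputs
# that a reliability study would supply.
n, p, unit_cost = 228, 0.05, 10_000.0
```

The standard-deviation helper speaks to the paper's closing point: two strategies with equal expected cost can carry very different financial risk.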


RESTRUCTURING U.S. FEDERAL FINANCIAL REGULATION

CONTEMPORARY ECONOMIC POLICY, Issue 3 2007
ROSE M. KUSHMEIDER
Despite changes over the past 70 yr, the U.S. federal financial regulatory system remains rooted in the reforms of the 1930s. The institutions governed by this system have, nevertheless, continued to evolve. Today, regulation of large, multiproduct, internationally active financial organizations poses challenges for a system designed largely to regulate smaller, distinct, locally based organizations. Reform of the regulatory system, however, is not an easy task: complex issues regarding deposit insurance, the role of the central bank, and the dual banking system must be addressed. In the absence of a crisis, however, regulatory restructuring will not likely generate much political interest. (JEL G28) [source]


Managing stakeholders or the environment?

CORPORATE SOCIAL RESPONSIBILITY AND ENVIRONMENTAL MANAGEMENT, Issue 1 2009
The challenge of relating indicators in practice
Abstract Many organizations present their environmental work in the form of annual reports and use the indicators in them for follow-up. However, internal communication and management are needed for environmental improvements. The indicators found in reports may be suitable for external communication, but are they also suitable internally and operationally? This article reviews the existing literature on environmental indicators. Indicators are then analysed with the help of an operational approach drawn from organisation theory, together with a life-cycle approach. The analysis shows that formulating indicators for internal management is not an easy task, and available guidelines are of little help. It is concluded that the environment can be managed internally by relating indicators. Therefore, an additional set of indicators for internal management and a wider responsibility for the life cycle are recommended. The analysis and recommendations are illustrated with examples drawn from the field of property management. Copyright © 2008 John Wiley & Sons, Ltd and ERP Environment. [source]


The international monetary system in the last and next 20 years

ECONOMIC POLICY, Issue 47 2006
Barry Eichengreen
SUMMARY The evolution of exchange rate regimes The last two decades have seen far-reaching changes in the structure of the international monetary system. Europe moved from the European Monetary System to the euro. China adopted a dollar peg and then moved to a basket, band and crawl in 2005. Emerging markets passed through a series of crises, leading some to adopt regimes of greater exchange rate flexibility and others to rethink the pace of capital account liberalization. Interpreting these developments is no easy task: some observers conclude that recent trends are confirmation of the 'bipolar view' that intermediate exchange rate arrangements are disappearing, while members of the 'fear of floating school' conclude precisely the opposite. We show that the two views can be reconciled if one distinguishes countries by their stage of economic and financial development. Among the advanced countries, intermediate regimes have essentially disappeared; this supports the bipolar view for the group of countries for which it was first developed. Within this subgroup, the dominant movement has been toward hard pegs, reflecting monetary unification in Europe. While emerging markets have also seen a decline in the prevalence of intermediate arrangements, these regimes still account for more than a third of the relevant subsample. Here the majority of the evacuees have moved to floats rather than fixes, reflecting the absence of EMU-like arrangements in other parts of the world. Among developing countries, the prevalence of intermediate regimes has again declined, but less dramatically. Where these regimes accounted for two-thirds of the developing country subsample in 1990, they account for a bit more than half of that subsample today. As with emerging markets, the majority of those abandoning the middle have moved to floats rather than hard pegs.
The gradual nature of these trends does not suggest that intermediate regimes will disappear outside the advanced countries anytime soon. - Barry Eichengreen and Raul Razo-Garcia [source]


Molecular biology of aromatic plants and spices.

FLAVOUR AND FRAGRANCE JOURNAL, Issue 5 2010
A review.
Abstract In recent years, molecular tools have been used to help to elucidate some aspects of genetic diversity in aromatic species, the genetic relationships between different cultivars and comparisons of molecular marker analysis to the chemical composition of plants. In this review, an explanation of the most important techniques involving molecular markers is given. A literature survey on molecular markers is presented, with some examples from aromatic plants and spices. However, understanding what controls flavour and aroma production in plants is not an easy task to accomplish. Several aspects of plant secondary metabolism, in particular volatiles production in aromatic plants, are still unknown. The route from genomics to proteomics is not well documented, although some research with model plants has already been performed. To address the question of the synthesis of volatiles, two different approaches are possible and summarized in this review: first, the biochemical and genetic approach; and second, approaches involving functional genomics. Finally, a brief survey of bioinformatics resources is presented. Copyright © 2010 John Wiley & Sons, Ltd. [source]


An Applied Econometricians' View of Empirical Corporate Governance Studies

GERMAN ECONOMIC REVIEW, Issue 3 2002
Axel Börsch-Supan
The economic analysis of corporate governance is in vogue. In addition to a host of theoretical papers, an increasing number of empirical studies analyze how ownership structure, capital structure, board structure, and the market for corporate control influence firm performance. This is not an easy task, and indeed, for reasons explained in this survey, empirical studies on corporate governance have more than the usual share of econometric problems. This paper is a critical survey of the recent empirical literature on corporate governance, with the aim of showing which methodological lessons can be learned for future empirical research in the field, paying particular attention to German institutions and data availability. [source]


A comparison of techniques for hydrograph recession analysis

HYDROLOGICAL PROCESSES, Issue 3 2004
Joko Sujono
Abstract A comparison between commonly used techniques for hydrograph recession analysis, namely the semi-logarithmic plot of a single recession segment, the master recession and a relatively new approach based on the wavelet transform, was carried out. These methods were applied to a number of flood hydrograph events from two catchments in West Java, Indonesia. The results show that all the methods tested produce reasonable and comparable results. However, problems arise with the semi-logarithmic plot and the master recession: determining the recession parameter K is not an easy task, especially where the plotted data form a curve rather than a straight line on the semi-logarithmic plot. On a curved line, the end of direct flow (the starting point of baseflow) is unclear and difficult to identify, so the best line on which to base the computation of the recession parameter K becomes uncertain. The wavelet transform approach, however, produces promising results and minimizes a number of problems associated with hydrograph recession analysis. The end of direct flow and the location of the baseflow component are easily determined from the wavelet maps. Copyright © 2004 John Wiley & Sons, Ltd. [source]
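The semi-logarithmic technique under discussion can be sketched directly: for an exponential recession Q(t) = Q0*K^t, ln Q is linear in t, so K is recovered from the slope of a least-squares fit. The data below are synthetic; on a real curved limb the fitted K depends on which segment is chosen, which is precisely the difficulty the authors describe.

```python
from math import log, exp

def recession_constant(t, q):
    """Fit Q(t) = Q0 * K**t by least squares on the semi-log
    plot ln Q = ln Q0 + t * ln K, and return K = exp(slope)."""
    y = [log(qi) for qi in q]
    n = len(t)
    t_mean = sum(t) / n
    y_mean = sum(y) / n
    slope = (sum((ti - t_mean) * (yi - y_mean) for ti, yi in zip(t, y))
             / sum((ti - t_mean) ** 2 for ti in t))
    return exp(slope)  # values close to 1 mean slow recession

# Synthetic recession limb with K = 0.9 (illustrative, not the paper's data)
t = list(range(10))
q = [50.0 * 0.9 ** ti for ti in t]
```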


Predictor-based repetitive learning control for a class of remote control nonlinear systems

INTERNATIONAL JOURNAL OF ROBUST AND NONLINEAR CONTROL, Issue 16 2007
Ya-Jun Pan
Abstract In this paper, a repetitive learning control (RLC) approach is proposed for a class of remote control nonlinear systems satisfying the global Lipschitz condition. The proposed approach deals with the remote tracking control problem when the environment is periodic or repeatable over an infinite time domain. Since there are time delays in the two transmission channels, from the controller to the actuator and from the sensor to the controller, tracking a desired trajectory through a remote controller is not an easy task. In order to solve the problem caused by time delays, a predictor is designed on the controller side to predict the future state of the nonlinear system based on the delayed measurements from the sensor. The convergence of the estimation error of the predictor is ensured. The gain design of the predictor applies linear matrix inequality (LMI) techniques developed via the Lyapunov-Krasovskii method for time-delay systems. The RLC law is constructed based on the feedback error from the predicted state. The overall tracking error tends to zero asymptotically over iterations. The proof of stability is based on a constructed Lyapunov function related to the Lyapunov-Krasovskii functional used in the proof of the predictor's convergence. By incorporating the predictor and the RLC controller, the system state tracks the desired trajectory independently of the influence of time delays. A numerical simulation example is shown to verify the effectiveness of the proposed approach. Copyright © 2007 John Wiley & Sons, Ltd. [source]
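The predictor idea can be illustrated on a scalar linear stand-in for the nonlinear plant (the paper's LMI-based gain design and the RLC law itself are omitted here): given a measurement delayed by d steps, the controller rolls the model forward through the d inputs it has applied since that measurement was taken.

```python
def predict_state(x_delayed: float, inputs_since: list, a: float, b: float) -> float:
    """Open-loop predictor for the scalar model x(k+1) = a*x(k) + b*u(k).

    x_delayed:    delayed state measurement received from the sensor
    inputs_since: the buffered control inputs applied since that measurement
    Returns the predicted current state after len(inputs_since) steps.
    """
    x = x_delayed
    for u in inputs_since:
        x = a * x + b * u
    return x
```

With a perfect model the prediction matches the true state exactly; in the paper the plant is nonlinear and uncertain, which is why predictor convergence has to be proven separately.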


Analyzing the Trade-off Between Investing in Service Channels and Satisfying the Targeted User Service for Brazilian Internet Service Providers

INTERNATIONAL TRANSACTIONS IN OPERATIONAL RESEARCH, Issue 3 2002
Gisele C. Fontanella
The computer connection to the Internet is provided by firms known as internet service providers (ISPs). The simplest mode of physical connection is when the user connects to an ISP's service channel by an ordinary telephone line (dial-up). Finding an available channel may not be an easy task, especially during the peak hours of many Brazilian ISPs. This results in a problem for the ISPs, which is how to achieve the most appropriate trade-off between investing in capacity and satisfying the target user service level. This paper analyzes this trade-off based on a three-step approach: (i) determine user arrival and service processes in chosen periods, (ii) select an appropriate queueing model using some simplifying assumptions, and (iii) generate trade-off curves between system performance measures. To illustrate the application of this approach, some results derived from a case study performed at an ISP in Sao Paulo state are given. [source]
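Step (ii) is typically an Erlang loss model for this kind of dial-up access: c channels, Poisson arrivals, blocked calls cleared. The sketch below is an assumption about the model choice, not the paper's stated formulation; it produces the two quantities a trade-off curve needs, blocking probability versus channel count, and the smallest channel count meeting a target service level.

```python
def erlang_b(channels: int, traffic: float) -> float:
    """Blocking probability for `traffic` erlangs offered to `channels` lines,
    via the standard recursion B(0) = 1, B(n) = a*B(n-1) / (n + a*B(n-1))."""
    b = 1.0
    for n in range(1, channels + 1):
        b = traffic * b / (n + traffic * b)
    return b

def channels_needed(traffic: float, target_blocking: float) -> int:
    """Smallest number of dial-up channels meeting the target user service level."""
    n = 1
    while erlang_b(n, traffic) > target_blocking:
        n += 1
    return n
```

Sweeping `channels_needed` over peak-hour traffic estimates gives exactly the investment-versus-service trade-off curve the paper analyzes.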


Improvements in the production of bacterial synthesized biocellulose nanofibres using different culture methods

JOURNAL OF CHEMICAL TECHNOLOGY & BIOTECHNOLOGY, Issue 2 2010
Amir Sani
Abstract This review summarizes previous work done to improve the production of bacterial cellulose nanofibres. Production of biocellulose nanofibres is a subject of interest owing to the wide range of unique properties that makes this product an attractive material for many applications. Bacterial cellulose is a natural nanomaterial that has a native dimension of less than 50 nm in diameter. It is produced in the form of nanofibres, yielding a very pure cellulose product with unique physical properties that distinguish it from plant-derived cellulose. Its high surface-to-volume ratio, combined with unique properties such as poly-functionality, hydrophilicity and biocompatibility, makes it a potential material for applications in the biomedical field. The purpose of this review is to summarize the methods that might help in delivering microbial cellulose to the market at a competitive cost. Different feedstocks, in addition to different bioreactor systems that have been used previously, are reviewed. The main challenge is the low yield of the cellulosic nanofibres, which can be produced in static and agitated cultures. The static culture method has been used for many years; however, producing this nanomaterial in bioreactor systems is less expensive than the static culture method, and biosynthesis in bioreactors will also be less labour intensive when scaled up. Developing intermediate-scale fermentation would therefore make the conversion to an efficient large-scale fermentation technology an easy task. Copyright © 2009 Society of Chemical Industry [source]


Towards a System-Oriented Framework for Analysing and Evaluating Emergency Response

JOURNAL OF CONTINGENCIES AND CRISIS MANAGEMENT, Issue 1 2010
Marcus Abrahamsson
Information can be provided by studying and evaluating past emergencies and the response in connection to them. This information would then be useful in efforts directed at preventing, mitigating and/or preparing for future emergencies. However, the analysis and evaluation of emergency response operations is not an easy task, especially when the operation involves several cooperating actors (e.g. the fire and rescue services, the police, the emergency medical services, etc.). Here, we identify and discuss four aspects of this challenge: (1) issues related to the values governing the evaluation, (2) issues related to the complexity of the systems involved, (3) issues related to the validity of the information on which the analysis and evaluation is based and (4) issues related to the limiting conditions under which the emergency response system operated. An outline of a framework for such an analysis and evaluation, influenced by systems theory, accident investigation theories and programme evaluation theories dealing with the above aspects, is introduced, discussed and exemplified using empirical results from a case study. We conclude that the proposed framework may provide a better understanding of how an emergency response system functioned during a specific operation, and help to identify the potential events and/or circumstances that could significantly affect the performance of the emergency response system, either negatively or positively. The insights gained from using the framework may allow the actors involved in the response operation to gain a better understanding of how the emergency response system functioned as a whole, as well as how the actors performed as individual components of the system. Furthermore, the information can also be useful for actors preparing for future emergencies. [source]


Segmentation of 3D microtomographic images of granular materials with the stochastic watershed

JOURNAL OF MICROSCOPY, Issue 1 2010
M. FAESSEL
Summary Segmentation of 3D images of granular materials obtained by microtomography is not an easy task. Because of the conditions of acquisition and the nature of the media, the available images are not exploitable without a reliable method of extraction of the grains. The high connectivity in the medium, the disparity of object shapes and the presence of image imperfections make classical segmentation methods (using the image gradient and a watershed constrained by markers) extremely difficult to perform efficiently. In this paper, we propose a non-parametric method using the stochastic watershed, which allows a 3D probability map of contours to be estimated. Procedures for extracting the final segmentation from this probability map are then presented. [source]


PERMEABILITY ANISOTROPY DISTRIBUTIONS IN AN UPPER JURASSIC CARBONATE RESERVOIR, EASTERN SAUDI ARABIA

JOURNAL OF PETROLEUM GEOLOGY, Issue 2 2007
A. Sahin
Most classical reservoir engineering concepts are based on homogeneous reservoirs despite the fact that homogeneous reservoirs are the exception rather than the rule. This is especially true of carbonate reservoirs in the Middle East which are known to be highly heterogeneous. The realistic petrophysical characterization of these kinds of reservoirs is not an easy task and must include the study of directional variations of permeability. Such variation can be incorporated into engineering calculations as the square root of the ratio of horizontal to vertical permeability, a parameter known as the anisotropy ratio. This paper addresses the distribution of anisotropy ratio values in an Upper Jurassic carbonate reservoir in the Eastern Province of Saudi Arabia. Based on whole core data from a number of vertical wells, statistical distributions of horizontal and vertical permeability measurements as well as anisotropy ratios were determined. The distributions of both permeability measurements and anisotropy ratios have similar patterns characterized by considerable positive skewness. The coefficients of variation for these distributions are relatively high, indicating their very heterogeneous nature. Comparison of plots of anisotropy ratios against depth for the wells and the corresponding core permeability values indicate that reservoir intervals with lower vertical permeability yield consistently higher ratios with considerable fluctuations. These intervals are represented by lower porosity mud-rich and/or mud-rich/granular facies. Granular facies, on the other hand, yielded considerably lower ratios without significant fluctuations. [source]
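The two statistics the study relies on are straightforward to compute. The permeability values below are illustrative placeholders, since the whole-core data are not given in the abstract; the anisotropy-ratio definition sqrt(kh/kv) is the paper's.

```python
from statistics import mean, stdev

def anisotropy_ratio(k_h: float, k_v: float) -> float:
    """Anisotropy ratio as defined in the paper: sqrt(k_h / k_v)."""
    return (k_h / k_v) ** 0.5

def coefficient_of_variation(values) -> float:
    """Sample std dev over the mean; a high value is read as 'very heterogeneous'."""
    return stdev(values) / mean(values)

# Illustrative whole-core pairs (k_h, k_v) in mD, NOT data from the paper.
cores = [(120.0, 30.0), (45.0, 5.0), (300.0, 150.0), (10.0, 0.5)]
ratios = [anisotropy_ratio(kh, kv) for kh, kv in cores]
```

Low-k_v, mud-rich intervals show up exactly as the high-ratio outliers the authors describe, and a coefficient of variation of the ratios quantifies their "considerable fluctuations".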


From Fundamental Polymerization Kinetics to Process Application: A Realistic Vision?

MACROMOLECULAR CHEMISTRY AND PHYSICS, Issue 5 2010
Christian Bauer
Abstract This contribution presents a ramble through the past, inspecting the developments of fundamental scientific investigations into the elementary reaction steps of polymerizations. Often the question arises whether benefits from fundamental scientific investigation find their way into industrial practice and how these pay off in better understanding large-scale processes. The high-pressure ethene polymerization is the system of interest here because 1) it is a complex process with a world production of several million tons per year and 2) although dealing with system pressures up to 3000 bar is not an easy task, a long history, if not tradition, exists between the scientific efforts to improve the process and the research groups of Franck and Buback in Karlsruhe, Schönemann and Luft in Darmstadt, and later Buback in Göttingen. The scope of this discussion will range from non-invasive on- and in-line monitoring by spectroscopic methods up to process modeling by advanced kinetic models and their potential for explorative work. [source]
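As a small, hedged illustration of why kinetics at 3000 bar matters: transition-state theory gives d ln k/dp = -ΔV‡/(RT), so a rate coefficient's pressure dependence follows from its activation volume. The numerical values below are illustrative, not taken from the work discussed.

```python
from math import exp

R = 8.314  # gas constant, J mol^-1 K^-1

def rate_at_pressure(k_ref: float, dv_act_cm3: float,
                     p_bar: float, p_ref_bar: float, temp_k: float) -> float:
    """Pressure-shifted rate coefficient k(p) = k_ref * exp(-DV* (p - p_ref) / (R T)).

    dv_act_cm3 is the activation volume in cm^3/mol; a negative value
    (typical for propagation steps) makes the rate grow with pressure.
    """
    dv_m3 = dv_act_cm3 * 1e-6           # cm^3/mol -> m^3/mol
    dp_pa = (p_bar - p_ref_bar) * 1e5   # bar -> Pa
    return k_ref * exp(-dv_m3 * dp_pa / (R * temp_k))
```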


How to use molecular marker data to measure evolutionary parameters in wild populations

MOLECULAR ECOLOGY, Issue 7 2005
DANY GARANT
Abstract Estimating the genetic basis of phenotypic traits and the selection pressures acting on them is central to our understanding of the evolution and conservation of wild populations. However, obtaining such evolutionary parameters is not an easy task, as it requires accurate information on both relatedness among individuals and their breeding success. Polymorphic molecular markers are very useful in estimating relatedness between individuals, and parentage analyses are now extensively used in most taxa. The next step in the application of molecular data to wild populations is to use them to derive estimates of evolutionary parameters for quantitative traits, such as quantitative genetic parameters (e.g. heritability, genetic correlations) and measures of selection (e.g. selection gradients). Despite their great appeal and potential, the optimal use of molecular tools is still debated, and it remains unclear how they should best be used to obtain reliable estimates of evolutionary parameters in the wild. Here, we review the methods available for estimating quantitative genetic and selection parameters, and discuss their merits and shortcomings, to provide a tool that summarizes the potential uses of molecular data to obtain such parameters in wild populations. [source]
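For orientation, the classical pedigree-based route that marker data aim to substitute for can be sketched as an offspring-on-midparent regression, whose slope estimates narrow-sense heritability. The data below are synthetic; marker-based methods replace the known pedigree with marker-estimated relatedness, but the quantity being estimated is the same.

```python
def heritability_midparent(midparent, offspring):
    """Narrow-sense heritability h^2 estimated as the slope of the
    offspring-on-midparent regression (classical quantitative genetics)."""
    n = len(midparent)
    mx = sum(midparent) / n
    my = sum(offspring) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(midparent, offspring)) / (n - 1)
    var = sum((x - mx) ** 2 for x in midparent) / (n - 1)
    return cov / var

# Synthetic trait values: offspring regress halfway to the midparent value,
# i.e. a true h^2 of 0.5.
midparent = [1.0, 2.0, 3.0, 4.0]
offspring = [1.5, 2.0, 2.5, 3.0]
```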


Nanoparticle phagocytosis and cellular stress: involvement in cellular imaging and in gene therapy against glioma

NMR IN BIOMEDICINE, Issue 1 2010
Anne-Karine Bouzier-Sore
Abstract In gene therapy against glioma, targeting the tumoral tissue is not an easy task. In this study we exploited the tumor-infiltrating property of microglia. These cells are well suited to this therapy since they can phagocytose nanoparticles, allowing their visualization by MRI. Indeed, while many studies have used microglia transfected with a suicide gene, and others have used internalized nanoparticles to visualize microglia, none have combined both approaches during gene therapy. Microglia cells were transfected with the TK-GFP gene under the control of the HSP70 promoter. First, possible cellular stress induced by nanoparticle internalization was checked, to avoid non-specific activation of the suicide gene. Then, MR images were obtained on tubes containing microglia loaded with superparamagnetic nanoparticles (VUSPIO) to characterize their MR properties, as well as their potential for tracking cells in vivo. VUSPIO were efficiently internalized by microglia, were found to be non-toxic, and their internalization did not induce any cellular stress. The VUSPIO relaxivity r2 was 224 mM⁻¹ s⁻¹. Such results could generate very high contrast between loaded and unloaded cells on T2-weighted images. The intracellular presence of VUSPIO does not prevent suicide gene activity, since TK is expressed in vitro and functional in vivo. It allows MRI detection of gene-modified macrophages during cell therapy strategies. Copyright © 2009 John Wiley & Sons, Ltd. [source]
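The reported relaxivity translates into image contrast through additive relaxation rates, 1/T2 = 1/T2,0 + r2*C. The r2 of 224 mM⁻¹ s⁻¹ is from the abstract; the baseline T2 and the intracellular iron concentration below are illustrative assumptions.

```python
def t2_with_contrast(t2_baseline_s: float, r2_per_mm_s: float, conc_mm: float) -> float:
    """Observed T2 when relaxation rates add: 1/T2 = 1/T2_0 + r2 * C.

    t2_baseline_s: tissue T2 without contrast agent, in seconds (assumed)
    r2_per_mm_s:   relaxivity in mM^-1 s^-1 (224 for VUSPIO, per the abstract)
    conc_mm:       iron concentration in mM (assumed)
    """
    return 1.0 / (1.0 / t2_baseline_s + r2_per_mm_s * conc_mm)

# Illustrative numbers: a 100 ms baseline T2 and 0.05 mM intracellular iron.
t2_loaded = t2_with_contrast(0.1, 224.0, 0.05)
```

The shortened T2 of loaded cells is what darkens them on T2-weighted images relative to unloaded cells.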


Electron gyration modified in the magnetic field tilted to the symmetry species of a crystalline metal

PHYSICA STATUS SOLIDI (B) BASIC SOLID STATE PHYSICS, Issue 8 2006
S. Olszewski
Abstract When a crystal electron is gyrating in the magnetic field being normal to the crystallographic plane, the calculation of the gyration frequency represents a relatively easy task. The paper approaches a more complicated problem of the gyration frequency in the case when the magnetic field is tilted to the crystallographic axes. The tightly-bound s-electrons in crystal lattices of cubic symmetry are considered as examples. Another problem concerns a metal plate for which the changes of the electron gyration frequency are examined as a function of the inclination angle of the magnetic field with respect to the planar boundaries of that plate. (© 2006 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source]
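The free-electron baseline that the band-structure calculation perturbs is the cyclotron relation omega_c = eB/m. The sketch below computes only that baseline; the paper's point is that in a crystal the effective mass, and hence the gyration frequency, depends on the orientation of the field relative to the crystallographic axes.

```python
E_CHARGE = 1.602176634e-19     # elementary charge, C (exact SI value)
M_ELECTRON = 9.1093837015e-31  # electron rest mass, kg

def cyclotron_frequency(b_tesla: float, mass_kg: float = M_ELECTRON) -> float:
    """Gyration (cyclotron) angular frequency omega_c = e*B/m in rad/s.

    Passing an effective mass instead of the free-electron mass is the
    crudest stand-in for the band-structure effects the paper computes.
    """
    return E_CHARGE * b_tesla / mass_kg
```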


Psychogeriatrics in the New Century: Issues and Challenges

PSYCHOGERIATRICS, Issue 3 2001
Kazuo Hasegawa
Abstract: Many huge changes have taken place in our own field of psychogeriatrics and psychogeriatric care at the beginning of the new century. The speed of these changes has been extremely rapid, and increasingly difficult problems have emerged. We must thus, at this juncture, gather together our collective wisdom, devise an appropriate approach to the new century, and aggressively tackle those areas that need to be addressed. While predicting the future is by no means an easy task, I would like to discuss the following three points: educational issues, research activities and the psychogeriatric service. I sincerely pray for the future of our psychogeriatric society with the words of the famous religious philosopher Dr. Reinhold Niebuhr: "God, grant me the serenity to accept the things I cannot change, the courage to change the things I can, and the wisdom to know the difference." [source]


A KOREAN PERSPECTIVE ON DEVELOPING A GLOBAL POLICY FOR ADVANCE DIRECTIVES

BIOETHICS, Issue 3 2010
SOYOON KIM
ABSTRACT Despite the wide and daunting array of cross-cultural obstacles that the formulation of a global policy on advance directives will clearly pose, the need is equally evident. Specifically, the expansion of medical services driven by medical tourism, just to name one important example, makes this issue urgently relevant. While ensuring consistency across national borders, a global policy will have the additional and perhaps even more important effect of increasing the use of advance directives in clinical settings and enhancing their effectiveness within each country, regardless of where that country's state of the law currently stands. One cross-cultural issue that may represent a major obstacle in formulating, let alone applying, a global policy is whether patient autonomy as the underlying principle for the use of advance directives is a universal norm or a construct of western traditions that must be reconciled with alternative value systems that may place lesser significance on individual choice. A global policy, at a minimum, must emphasize respect for patient autonomy, provision of medical information, limits to the obligations for physicians, and portability. And though the development of a global policy will be no easy task, active engagement in close collaboration with the World Health Organization can make it possible. [source]