Same Time (same + time)

Terms modified by Same Time

  • same time interval
  • same time period
  • same time point
  • same time scale

Selected Abstracts


    How and Why to Support Common Schooling and Educational Choice at the Same Time

    JOURNAL OF PHILOSOPHY OF EDUCATION, Issue 4 2007
    ROB REICH
    The common school ideal is the source of one of the oldest educational debates in liberal democratic societies. The movement in favour of greater educational choice is the source of one of the most recent. Each has been the cause of major and enduring controversy, not only within philosophical thought but also within political, legal and social arenas. Echoing conclusions reached by Terry McLaughlin, but taking the historical and legal context of the United States as my backdrop, I argue that the ideal of common schooling and the existence of separate schools, which is to say, the existence of educational choice, are not merely compatible but necessarily co-exist in a liberal democratic society. In other words, we need both common schooling and educational choice. The essay proceeds in four parts. First, I explain why we need to understand something about pluralism in order to understand common schooling and school choice. In the second and third parts, I explore the normative significance of pluralism for common schooling and educational choice, respectively. In the fourth part, I show how the two can be reconciled, given a certain understanding of what pluralism demands. [source]


    PREDICTING THE IMPACT OF ANTICIPATORY ACTION ON U.S. STOCK MARKET: AN EVENT STUDY USING ANFIS (A NEURAL FUZZY MODEL)

    COMPUTATIONAL INTELLIGENCE, Issue 2 2007
    P. Cheng
    In this study, the adaptive neural fuzzy inference system (ANFIS), a hybrid fuzzy neural network, is adopted to predict the actions of investors (when and whether they buy or sell) in a stock market in anticipation of an event: a change in interest rates, the announcement of earnings by a major corporation in the industry, or the outcome of a political election, for example. Generally, the model is more successful in predicting when investors take actions than in predicting what actions they take and the extent of their activities. The findings demonstrate the learning and predictive potential of the ANFIS model in financial applications, but at the same time suggest that some market behaviors are too complex to be predictable. [source]
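
    For readers unfamiliar with ANFIS, the sketch below shows the kind of first-order Sugeno fuzzy inference that ANFIS tunes by learning: inputs are fuzzified by Gaussian membership functions, rule firing strengths weight linear consequents, and the output is their normalized sum. The rule set, membership parameters and input variables are invented for illustration and are not the authors' model.

    ```python
    import numpy as np

    def gaussian_mf(x, c, s):
        """Gaussian membership function with center c and width s."""
        return np.exp(-0.5 * ((x - c) / s) ** 2)

    def anfis_forward(x1, x2, rules):
        """One forward pass of a first-order Sugeno fuzzy system.

        Each rule is ((c1, s1), (c2, s2), (p, q, r)): membership parameters
        for the two inputs and a linear consequent z = p*x1 + q*x2 + r.
        """
        w = np.array([gaussian_mf(x1, c1, s1) * gaussian_mf(x2, c2, s2)
                      for (c1, s1), (c2, s2), _ in rules])   # rule firing strengths
        z = np.array([p * x1 + q * x2 + r
                      for _, _, (p, q, r) in rules])         # rule outputs
        return float(np.dot(w, z) / w.sum())                 # normalized weighted sum

    # Two toy rules over (interest-rate change, earnings surprise); all numbers invented.
    rules = [((-0.25, 0.2), (0.0, 1.0), (0.8, 0.3, 0.1)),
             ((0.25, 0.2), (0.0, 1.0), (-0.6, 0.2, 0.0))]
    print(anfis_forward(-0.1, 0.5, rules))  # a toy "action" signal
    ```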


    Volume fraction based miscible and immiscible fluid animation

    COMPUTER ANIMATION AND VIRTUAL WORLDS (PREV: JNL OF VISUALISATION & COMPUTER ANIMATION), Issue 3-4 2010
    Kai Bao
    Abstract We propose a volume fraction based approach to effectively simulate miscible and immiscible flows simultaneously. In this method, a volume fraction is introduced for each fluid component, and the mutual interactions between different fluids are simulated by tracking the evolution of the volume fractions. Different techniques are employed to handle the miscible and immiscible interactions, and special treatments are introduced to handle flows involving multiple fluids and different kinds of interactions at the same time. With this method, second-order accuracy is preserved in both space and time. The experimental results show that the proposed method handles both immiscible and miscible interactions between fluids well and generates much richer mixing detail. The method also shows good controllability: different mixing effects can be obtained by adjusting the dynamic viscosities and diffusion coefficients. Copyright © 2010 John Wiley & Sons, Ltd. [source]
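
    As a rough illustration of volume-fraction tracking (not the authors' solver), the sketch below advects per-fluid volume fractions on a 1-D grid with a first-order upwind scheme and renormalizes them so they sum to one in every cell; the coupling to a flow solver and the distinct miscible/immiscible treatments are omitted.

    ```python
    import numpy as np

    def advect_fractions(f, u, dt, dx):
        """Upwind-advect per-fluid volume fractions f[k, i] with velocity u > 0,
        then renormalize so the fractions sum to one in every cell."""
        fn = f.copy()
        fn[:, 1:] -= u * dt / dx * (f[:, 1:] - f[:, :-1])  # first-order upwind
        fn = np.clip(fn, 0.0, 1.0)
        return fn / fn.sum(axis=0, keepdims=True)          # enforce sum_k f_k = 1

    # Two fluids on a 1-D grid: fluid 0 on the left half, fluid 1 on the right.
    f = np.zeros((2, 100)); f[0, :50] = 1.0; f[1, 50:] = 1.0
    for _ in range(100):                                   # CFL = u*dt/dx = 0.5
        f = advect_fractions(f, u=1.0, dt=0.005, dx=0.01)
    ```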


    Interactive shadowing for 2D Anime

    COMPUTER ANIMATION AND VIRTUAL WORLDS (PREV: JNL OF VISUALISATION & COMPUTER ANIMATION), Issue 2-3 2009
    Eiji Sugisaki
    Abstract In this paper, we propose an instant shadow generation technique for 2D animation, especially Japanese Anime. In traditional 2D Anime production, the entire animation, including shadows, is drawn by hand, so it takes a long time to complete. Shadows play an important role in the creation of symbolic visual effects. However, shadows are not always drawn, owing to time constraints and a shortage of animators, especially when the production schedule is tight. To solve this problem, we develop an easy shadowing approach that enables animators to easily create a layer of shadow and its animation based on the character's shapes. Our approach is both instant and intuitive. The only inputs required are the character or object shapes in the input animation sequence, with the alpha values generally used in the Anime production pipeline. First, shadows are automatically rendered on a virtual plane by using a shadow map based on these inputs. The rendered shadows can then be edited with simple operations and simplified with a Gaussian filter. Several special effects, such as blurring, can be applied to the rendered shadow at the same time. Compared to existing approaches, ours handles automatic shadowing in real time more efficiently and effectively. Copyright © 2009 John Wiley & Sons, Ltd. [source]
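
    A minimal sketch of the shadow-layer idea, assuming the character's alpha matte is available as a NumPy array: shift the matte to fake a planar projection, then Gaussian-smooth it into a soft, editable layer. The real pipeline renders the shadow with a shadow map and supports interactive editing; the offset, sigma and strength values here are invented.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def soft_shadow_layer(alpha, offset=(12, 8), sigma=3.0, strength=0.5):
        """Build a simple drop-shadow layer from a character's alpha matte."""
        dy, dx = offset
        shadow = np.zeros_like(alpha)
        shadow[dy:, dx:] = alpha[:-dy, :-dx]         # crude planar projection
        shadow = gaussian_filter(shadow, sigma)       # simplify / soften edges
        return np.clip(strength * shadow, 0.0, 1.0)   # opacity of the shadow layer

    alpha = np.zeros((128, 128)); alpha[32:96, 48:80] = 1.0  # toy character matte
    layer = soft_shadow_layer(alpha)
    ```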


    Design and realization of a cooperative work system based on equipment sharing

    COMPUTER APPLICATIONS IN ENGINEERING EDUCATION, Issue 4 2009
    Bo Yan
    Abstract Based on an analysis of the necessity and functions of an equipment-sharing platform and cooperative work system for colleges and universities, this paper designs a cooperative work system that provides cooperative support for resource query and reservation. The system classifies users' resource application roles, divides users' application information into different cooperative grades, and provides a basis for users' cooperative work. The functions, authorization, page flow, operating methods, and relevant database tables of the cooperative roles are presented in detail. At the same time, an ASP system is introduced, and a special fee management system is established for effective management of the system. The functions and page flow of the fee management system are also designed. © 2009 Wiley Periodicals, Inc. Comput Appl Eng Educ 17: 372–378, 2009; Published online in Wiley InterScience (www.interscience.wiley.com); DOI 10.1002/cae.20195 [source]


    Shallow Bounding Volume Hierarchies for Fast SIMD Ray Tracing of Incoherent Rays

    COMPUTER GRAPHICS FORUM, Issue 4 2008
    H. Dammertz
    Abstract Photorealistic image synthesis is a computationally demanding task that relies on ray tracing for the evaluation of integrals. Rendering time is dominated by tracing long paths that are very incoherent by construction. We therefore investigate the use of SIMD instructions to accelerate incoherent rays. SIMD is used in the hierarchy construction, the tree traversal and the leaf intersection. This is achieved by increasing the arity of acceleration structures, which also reduces memory requirements. We show that the resulting hierarchies can be built quickly and are smaller than previously known acceleration structures, while at the same time outperforming them for incoherent rays. Our new acceleration structure speeds up ray tracing by a factor of 1.6 to 2.0 compared to a highly optimized bounding interval hierarchy implementation, and 1.3 to 1.6 compared to an efficient kd-tree. At the same time, the memory requirements are reduced by 10–50%. Additionally, we show how a caching mechanism in conjunction with this memory-efficient hierarchy can be used to speed up shadow rays in a global illumination algorithm without increasing the memory footprint. This optimization decreased the number of traversal steps by up to 50%. [source]
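
    To make the "increased arity" idea concrete, the sketch below slab-tests one ray against the four child boxes of a shallow (4-wide) BVH node in a single vectorized operation, which is the data layout that maps onto 4-wide SIMD instructions; tree construction and traversal order are omitted, and the box data are toy values.

    ```python
    import numpy as np

    def intersect_children(ray_o, ray_inv_d, lo, hi):
        """Slab-test a ray against the 4 child AABBs of a shallow BVH node.

        lo and hi are (4, 3) arrays of box corners; all four children are
        tested with one vectorized operation, mirroring 4-wide SIMD.
        """
        t0 = (lo - ray_o) * ray_inv_d             # (4, 3) slab parameters
        t1 = (hi - ray_o) * ray_inv_d
        tmin = np.minimum(t0, t1).max(axis=1)     # latest entry over the 3 axes
        tmax = np.maximum(t0, t1).min(axis=1)     # earliest exit
        return tmax >= np.maximum(tmin, 0.0)      # hit mask for the 4 children

    lo = np.array([[0, 0, 0], [1, 1, 1], [2, 0, 0], [0, 2, 0]], float)
    hi = lo + 1.0
    hits = intersect_children(np.array([0.5, 0.5, -1.0]),
                              1.0 / np.array([1e-9, 1e-9, 1.0]), lo, hi)
    ```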


    Illustrative Hybrid Visualization and Exploration of Anatomical and Functional Brain Data

    COMPUTER GRAPHICS FORUM, Issue 3 2008
    W. M. Jainek
    Abstract Common practice in brain research and brain surgery involves the multi-modal acquisition of brain anatomy and brain activation data. These highly complex three-dimensional data have to be displayed simultaneously in order to convey spatial relationships. Unique challenges in information and interaction design have to be solved in order to keep the visualization sufficiently complete and uncluttered at the same time. The visualization method presented in this paper addresses these issues by using a hybrid combination of polygonal rendering of brain structures and direct volume rendering of activation data. Advanced rendering techniques including illustrative display styles and ambient occlusion calculations enhance the clarity of the visual output. The presented rendering pipeline produces real-time frame rates and offers a high degree of configurability. Newly designed interaction and measurement tools are provided, which enable the user to explore the data at large, but also to inspect specific features closely. We demonstrate the system in the context of a cognitive neurosciences dataset. An initial informal evaluation shows that our visualization method is deemed useful for clinical research. [source]
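
    A minimal sketch of the final step of such a hybrid pipeline, assuming both passes are already rendered: the volume-rendered activation layer (premultiplied RGBA) is composited with the standard "over" operator onto the polygonal anatomy pass. The rendering passes themselves, the illustrative styles and the ambient occlusion are beyond this sketch.

    ```python
    import numpy as np

    def composite_activation_over_surface(surf_rgb, act_rgba):
        """Blend a volume-rendered activation layer over a surface rendering.

        act_rgba holds premultiplied color and alpha per pixel (the output of
        the direct-volume-rendering pass); surf_rgb is the polygonal pass.
        """
        rgb, a = act_rgba[..., :3], act_rgba[..., 3:4]
        return rgb + (1.0 - a) * surf_rgb          # standard 'over' operator

    surf = np.full((4, 4, 3), 0.6)                              # gray anatomy pass
    act = np.zeros((4, 4, 4)); act[1, 1] = (0.8, 0.1, 0.1, 0.8) # one active pixel
    img = composite_activation_over_surface(surf, act)
    ```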


    Pose Controlled Physically Based Motion

    COMPUTER GRAPHICS FORUM, Issue 4 2006
    Raanan Fattal
    Abstract In this paper we describe a new method for generating and controlling physically-based motion of complex articulated characters. Our goal is to create motion from scratch, where the animator provides a small amount of input and gets in return a highly detailed and physically plausible motion. Our method relieves the animator from the burden of enforcing physical plausibility, but at the same time provides full control over the internal DOFs of the articulated character via a familiar interface. Control over the global DOFs is also provided by supporting kinematic constraints. Unconstrained portions of the motion are generated in real time, since the character is driven by joint torques generated by simple feedback controllers. Although kinematic constraints are satisfied using an iterative search (shooting), this process is typically inexpensive, since it only adjusts a few DOFs at a few time instances. The low expense of the optimization, combined with the ability to generate unconstrained motions in real time yields an efficient and practical tool, which is particularly attractive for high inertia motions with a relatively small number of kinematic constraints. [source]
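
    The "simple feedback controllers" driving the joints can be pictured as proportional-derivative (PD) laws like the one below; the gains, the unit inertia and the explicit Euler step are our own toy choices, not the paper's.

    ```python
    def joint_torque(theta, theta_dot, theta_target, kp=80.0, kd=8.0):
        """PD feedback torque driving one joint toward a target pose.

        A generic controller of the family the paper uses; gains are invented.
        """
        return kp * (theta_target - theta) - kd * theta_dot

    # One-joint simulation (unit inertia, explicit Euler integration).
    theta, theta_dot, dt = 0.0, 0.0, 1e-3
    for _ in range(1000):
        tau = joint_torque(theta, theta_dot, theta_target=1.2)
        theta_dot += tau * dt
        theta += theta_dot * dt
    ```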


    Gradient Estimation in Volume Data using 4D Linear Regression

    COMPUTER GRAPHICS FORUM, Issue 3 2000
    László Neumann
    In this paper a new gradient estimation method is presented which is based on linear regression. Previous contextual shading techniques try to fit an approximate function to a set of surface points in the neighborhood of a given voxel, so a system of linear equations has to be solved using computationally expensive Gaussian elimination. In contrast, our method approximates the density function itself in a local neighborhood with a 3D regression hyperplane. This approach also leads to a system of linear equations, but we will show that it can be solved with an efficient convolution. At each voxel location, our method provides the normal vector and the translation of the regression hyperplane, which are taken as the gradient and a filtered density value, respectively. Therefore this technique can be used for surface smoothing and gradient estimation at the same time. [source]
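
    Because a 3x3x3 neighborhood is symmetric, the least-squares hyperplane fit decouples and each gradient component reduces to a fixed convolution, which is the efficiency claim of the paper. The sketch below implements that reduction with SciPy; the distance weighting the authors also discuss is omitted.

    ```python
    import numpy as np
    from scipy.ndimage import correlate

    def regression_gradient(volume):
        """Gradient and filtered density from a 3x3x3 hyperplane fit per voxel.

        For a symmetric stencil the fit decouples: each gradient component is
        g_x = sum(x_k * f_k) / sum(x_k**2) over offsets x_k in {-1, 0, 1},
        i.e. a fixed convolution, and the hyperplane offset is simply the
        neighborhood mean (a smoothed density value).
        """
        offs = np.array([-1.0, 0.0, 1.0])
        kx = np.broadcast_to(offs[:, None, None], (3, 3, 3)).copy()
        norm = (kx ** 2).sum()                        # = 18 for this stencil
        gx = correlate(volume, kx) / norm             # gradient along axis 0
        gy = correlate(volume, kx.transpose(1, 0, 2)) / norm
        gz = correlate(volume, kx.transpose(2, 1, 0)) / norm
        f0 = correlate(volume, np.full((3, 3, 3), 1.0 / 27.0))  # filtered density
        return gx, gy, gz, f0

    vol = np.random.rand(16, 16, 16)
    gx, gy, gz, f0 = regression_gradient(vol)
    ```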


    Scalable Algorithm for Resolving Incorrect Occlusion in Dynamic Augmented Reality Engineering Environments

    COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, Issue 1 2010
    Amir H. Behzadan
    As a result of introducing real-world objects into the visualization, fewer virtual models have to be deployed to create a realistic visual output, which directly translates into less time and effort required to create, render, manipulate, manage, and update the three-dimensional (3D) virtual contents (CAD model engineering) of the animated scene. At the same time, using the existing layout of land or plant as the background of the visualization significantly alleviates the need to collect data about the surrounding environment prior to creating the final visualization, while providing visually convincing representations of the processes being studied. In an AR animation, virtual and real objects must be simultaneously managed and accurately displayed to the user to create a visually convincing illusion of their coexistence and interaction. A critical challenge impeding this objective is the problem of incorrect occlusion, which manifests itself when real objects in an AR scene partially or wholly block the view of virtual objects. In the presented research, a new AR occlusion handling system based on depth-sensing algorithms and frame buffer manipulation techniques was designed and implemented. This algorithm is capable of resolving incorrect occlusion occurring in dynamic AR environments in real time using depth-sensing equipment such as laser detection and ranging (LADAR) devices, and can be integrated into any mobile AR platform that allows a user to navigate freely and observe a dynamic AR scene from any vantage position. [source]
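
    The core of depth-based occlusion handling reduces to a per-pixel depth test, sketched below under the assumption that the sensor depth map is already registered to the camera image; the actual frame-buffer manipulation and sensor processing are simplified away.

    ```python
    import numpy as np

    def resolve_occlusion(virtual_rgb, virtual_depth, real_rgb, real_depth):
        """Per-pixel depth test between the real scene and virtual objects.

        Wherever a real surface is closer than the virtual one, the real
        pixel wins, which resolves the incorrect-occlusion problem.
        """
        virtual_wins = virtual_depth < real_depth          # boolean mask per pixel
        return np.where(virtual_wins[..., None], virtual_rgb, real_rgb)

    h, w = 4, 4
    virtual_rgb = np.ones((h, w, 3));  virtual_depth = np.full((h, w), 2.0)
    real_rgb = np.zeros((h, w, 3));    real_depth = np.full((h, w), 1.0)
    real_depth[0, 0] = 3.0             # one pixel where the real scene is farther
    img = resolve_occlusion(virtual_rgb, virtual_depth, real_rgb, real_depth)
    ```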


    A pulse programmer for nuclear magnetic resonance spectrometers

    CONCEPTS IN MAGNETIC RESONANCE, Issue 2 2007
    C.C. Odebrecht
    Abstract A pulse programmer (PP) designed to control a nuclear magnetic resonance (NMR) spectrometer is reported on. The heart of the PP is a complex programmable logic device (CPLD) that provides flexibility to the design and, at the same time, reduces the number of electronic components needed and the dimensions of the printed circuit board. The PP works as follows: first, a pulse sequence defined by a set of instructions is loaded into the RAM memory of the PP. Then, when the process is started, the instructions are read, decoded, and executed one by one. Four types of instructions (functions) were defined: PRINT A, PRINT B, WAIT, and STOP. PRINT A and PRINT B change the status of the output channels A and B, respectively, WAIT generates a time delay, and STOP terminates the sequence. The output ports A and B have 14 channels each, and the shortest pulse and the resolution are both 200 ns. The design of the PP is versatile, and new functions can be added through software without modifying the printed circuit board. To control the PP from a personal computer, a program named PulseJr was developed. It contains a graphical user interface (GUI), and pulse sequences can be drawn on the monitor screen with the mouse of the computer. Once the pulse sequence is sketched, clicking a button makes the program compile the pulse sequence, generate the set of instructions, load them into the RAM memory of the PP, and start the pulse sequence. © 2007 Wiley Periodicals, Inc. Concepts Magn Reson Part A 30A: 127–131, 2007. [source]
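
    A toy software model of the instruction set described above; the real decoding happens in the programmable logic with 200 ns resolution, and the program encoding below is invented for illustration.

    ```python
    import time

    def run(program):
        """Fetch, decode and execute a list of (opcode, argument) pairs."""
        port_a, port_b = 0, 0
        for op, arg in program:
            if op == "PRINT_A":
                port_a = arg                 # set the 14 channels of port A
            elif op == "PRINT_B":
                port_b = arg                 # set the 14 channels of port B
            elif op == "WAIT":
                time.sleep(arg * 1e-9)       # delay given in nanoseconds
            elif op == "STOP":
                break
            print(f"{op:8s} A={port_a:014b} B={port_b:014b}")

    # Toy pulse: raise channel 0 of port A for 5 microseconds, then stop.
    run([("PRINT_A", 0b1), ("WAIT", 5000), ("PRINT_A", 0b0), ("STOP", 0)])
    ```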


    Scheduling dense linear algebra operations on multicore processors

    CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 1 2010
    Jakub Kurzak
    Abstract State-of-the-art dense linear algebra software, such as the LAPACK and ScaLAPACK libraries, suffers performance losses on multicore processors due to its inability to fully exploit thread-level parallelism. At the same time, the coarse-grain dataflow model is gaining popularity as a paradigm for programming multicore architectures. This work looks at implementing classic dense linear algebra workloads, the Cholesky factorization, the QR factorization and the LU factorization, using dynamic data-driven execution. Two emerging approaches to implementing coarse-grain dataflow are examined: the model of nested parallelism, represented by the Cilk framework, and the model of parallelism expressed through an arbitrary Directed Acyclic Graph, represented by the SMP Superscalar framework. Performance and coding effort are analyzed and compared against code manually parallelized at the thread level. Copyright © 2009 John Wiley & Sons, Ltd. [source]
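
    To illustrate what "parallelism expressed through an arbitrary DAG" means for one of these workloads, the sketch below enumerates the task graph of a tiled Cholesky factorization (POTRF/TRSM/SYRK/GEMM on tiles); a runtime like SMP Superscalar infers dependencies from the tiles each task touches. The tile kernels and the scheduler itself are not shown.

    ```python
    def cholesky_tasks(nt):
        """Enumerate the task DAG of a tiled Cholesky factorization.

        Returns (task, tiles) pairs, where each task names the tile kernel
        and the tiles it reads/writes; a dataflow runtime derives the
        dependency edges from these tile accesses.
        """
        tasks = []
        for k in range(nt):
            tasks.append((("POTRF", k), [("A", k, k)]))
            for i in range(k + 1, nt):
                tasks.append((("TRSM", i, k), [("A", k, k), ("A", i, k)]))
            for i in range(k + 1, nt):
                tasks.append((("SYRK", i, k), [("A", i, k), ("A", i, i)]))
                for j in range(k + 1, i):
                    tasks.append((("GEMM", i, j, k),
                                  [("A", i, k), ("A", j, k), ("A", i, j)]))
        return tasks

    for task, tiles in cholesky_tasks(3):
        print(task, "touches", tiles)
    ```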


    Segregation and scheduling for P2P applications with the interceptor middleware system

    CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 6 2008
    Cosimo Anglano
    Abstract Very large peer-to-peer systems are often required to implement efficient and scalable services, but usually they can be built only by assembling resources contributed by many independent users. Among the guarantees that must be provided to convince these users to join the P2P system, particularly important is the ability to ensure that P2P applications and services running on their nodes will not unacceptably degrade the performance of their own applications because of excessive resource consumption. In this paper we present the Interceptor, a middleware-level application segregation and scheduling system which is able to strictly enforce quantitative limitations on node resource usage and, at the same time, to make P2P applications achieve satisfactory performance even in the face of these limitations. A proof-of-concept implementation has been carried out for the Linux operating system and used to perform an extensive experimentation aimed at quantitatively evaluating the Interceptor. The results we obtained clearly demonstrate that the Interceptor is able to strictly enforce quantitative limitations on node resource usage, and at the same time to effectively schedule P2P applications. Copyright © 2007 John Wiley & Sons, Ltd. [source]
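
    A deliberately crude sketch of middleware-level CPU segregation in the spirit of the Interceptor: keep a guest process below a fixed CPU share by duty-cycling it with SIGSTOP/SIGCONT. The real system enforces richer quantitative limits; the duty-cycle mechanism and all constants here are our own illustration, not the paper's design.

    ```python
    import os
    import signal
    import time

    def throttle(pid, share=0.25, period=0.1, cycles=100):
        """Let process `pid` run for only `share` of every `period` seconds."""
        for _ in range(cycles):
            os.kill(pid, signal.SIGCONT)         # run slice
            time.sleep(period * share)
            os.kill(pid, signal.SIGSTOP)         # stopped slice
            time.sleep(period * (1.0 - share))
        os.kill(pid, signal.SIGCONT)             # leave the process running
    ```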


    Performance and effectiveness trade-off for checkpointing in fault-tolerant distributed systems

    CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 1 2007
    Panagiotis Katsaros
    Abstract Checkpointing has a crucial impact on systems' performance and fault-tolerance effectiveness: excessive checkpointing results in performance degradation, while deficient checkpointing incurs expensive recovery. In distributed systems with independent checkpoint activities there is no easy way to determine checkpoint frequencies that optimize response-time and fault-tolerance costs at the same time. The purpose of this paper is to investigate the potential of a statistical decision-making procedure. We adopt a simulation-based approach for obtaining performance metrics that are afterwards used for determining a trade-off between checkpoint interval reductions and efficiency in performance. Statistical methodology including experimental design, regression analysis and optimization provides us with the framework for comparing configurations, which may use different fault-tolerance mechanisms (replication-based or message-logging-based). Systematic research also allows us to take into account additional design factors, such as load balancing. The method is described in terms of a standardized object replication model (OMG FT-CORBA), but it could also be applied in other (e.g. process-based) computational models. Copyright © 2006 John Wiley & Sons, Ltd. [source]
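
    The trade-off being optimized can be made concrete with the classic first-order model below (Young's approximation for the optimal checkpoint interval); the paper itself derives its trade-off empirically via simulation and regression rather than from this closed form.

    ```python
    import math

    def young_interval(checkpoint_cost, mtbf):
        """Young's approximation for the optimal checkpoint interval.

        Checkpointing more often than this wastes time writing checkpoints;
        less often wastes time recomputing lost work after a failure.
        """
        return math.sqrt(2.0 * checkpoint_cost * mtbf)

    print(young_interval(checkpoint_cost=30.0, mtbf=24 * 3600.0))  # ~2277 s
    ```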


    GAUGE: Grid Automation and Generative Environment

    CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 10 2006
    Francisco Hernández
    Abstract The Grid has proven to be a successful paradigm for distributed computing. However, constructing applications that exploit all the benefits that the Grid offers is still not easy for either inexperienced or experienced users. Recent approaches to solving this problem employ a high-level abstract layer to ease the construction of applications for different Grid environments. These approaches help facilitate construction of Grid applications, but they are still tied to specific programming languages or platforms. A new approach is presented in this paper that uses concepts of domain-specific modeling (DSM) to build a high-level abstract layer. With this DSM-based abstract layer, users are able to create Grid applications without knowledge of specific programming languages or being bound to specific Grid platforms. An additional benefit of DSM is the capability to generate software artifacts for various Grid environments. This paper presents the Grid Automation and Generative Environment (GAUGE). The goal of GAUGE is to automate the generation of Grid applications to allow inexperienced users to exploit the Grid fully. At the same time, GAUGE provides an open framework that experienced users can build upon and extend to tailor their applications to particular Grid environments or specific platforms. GAUGE employs domain-specific modeling techniques to accomplish this challenging task. Copyright © 2005 John Wiley & Sons, Ltd. [source]


    Sapphire: copying garbage collection without stopping the world

    CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 3-5 2003
    Richard L. Hudson
    Abstract The growing use in concurrent systems of languages that require garbage collection (GC), such as Java, is raising practical interest in concurrent GC. Sapphire is a new algorithm for concurrent copying GC for Java. It stresses minimizing the amount of time any given application thread may need to block to support the collector. In particular, Sapphire is intended to work well in the presence of a large number of application threads, on small- to medium-scale shared memory multiprocessors. Sapphire extends previous concurrent copying algorithms, and is most closely related to replicating copying collection, a GC technique in which application threads observe and update primarily the old copies of objects. The key innovations of Sapphire are: (1) the ability to 'flip' one thread at a time (changing the thread's view from the old copies of objects to the new copies), as opposed to needing to stop all threads and flip them at the same time; (2) exploiting Java semantics and assuming any data races occur on volatile fields, to avoid a barrier on reads of non-volatile fields; and (3) working in concert with virtually any underlying (non-concurrent) copying collection algorithm. Copyright © 2003 John Wiley & Sons, Ltd. [source]
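
    A toy model of the replicating-copying idea Sapphire builds on: mutators keep reading the old copy while a write barrier mirrors every update into the replica the collector is building. Sapphire's actual contributions (per-thread flips, the volatile-field read-barrier elision) are not modeled, and all names below are ours.

    ```python
    class Obj:
        """A heap object with an optional replica built by the collector."""
        def __init__(self, **fields):
            self.fields = dict(fields)
            self.replica = None        # set once the collector has copied us

    def write_barrier(obj, name, value):
        """Apply a mutator write to the old copy and mirror it to the replica."""
        obj.fields[name] = value       # threads still read the old copy
        if obj.replica is not None:
            obj.replica.fields[name] = value

    a = Obj(x=1)
    a.replica = Obj(x=1)               # collector copied `a` concurrently
    write_barrier(a, "x", 42)          # mutator write reaches both copies
    assert a.replica.fields["x"] == 42
    ```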


    Community mediation: Reflections on a quarter century of practice

    CONFLICT RESOLUTION QUARTERLY, Issue 4 2000
    Scott Bradley
    It is fitting that this issue of Mediation Quarterly, one of the last before it transforms into a joint publication of confederating organizations, is devoted to community mediation. During the past twenty-five years, community mediation has provided much of the momentum for the growth and diversity of the alternative dispute resolution movement in the United States. At the same time, it has faced many challenges as the larger dispute resolution field grows and evolves. How community mediation responds to these challenges will shape its role and place for the next generation. In this issue, we have asked some key leaders and practitioners in the field to reflect on the development of community mediation and the challenges as we move into another century of practice. [source]


    Necessity to establish new risk assessment and risk communication for human fetal exposure to multiple endocrine disruptors in Japan

    CONGENITAL ANOMALIES, Issue 2 2002
    Emiko Todaka
    ABSTRACT: Our recent study clearly shows that fetuses in Japan are exposed to multiple chemicals, including endocrine disruptors. Although the embryonic and fetal stages are the most sensitive period to chemicals in the human life cycle, the health effects of chemicals such as endocrine disruptors on them are largely unknown. The conventional risk assessment method cannot assess the risk to fetuses precisely. We therefore need a new risk assessment, in which the target is the fetus rather than the adult, in addition to the conventional risk assessment. At the same time, we also need a new strategy to practically eliminate the risk for future generations. To make the strategy effective, we suggest a new approach to reduce the risk and avoid possible adverse health effects, using primary, secondary and tertiary preventions as they are used in public health. We also suggest a new concept of "pre-primary prevention" to reduce the risk to fetuses. Furthermore, to make this method even more practical, we suggest a new risk communication method. In this paper, we present a framework for risk avoidance of multiple chemical exposure to fetuses. [source]


    The Atrial Fibrillation Paradox of Heart Failure

    CONGESTIVE HEART FAILURE, Issue 1 2010
    Rhidian J. Shelton MRCP
    Congest Heart Fail. 2010;16:3–9. ©2009 Wiley Periodicals, Inc. The prevalence of atrial fibrillation (AF) in patients with heart failure (HF) is high, but longitudinal studies suggest that the incidence of AF is relatively low. The authors investigated this paradox prospectively in an epidemiologically representative population of patients with HF and persistent AF. In all, 891 consecutive patients with HF [mean age, 70±10 years; 70% male; left ventricular ejection fraction, 32%±9%] were enrolled. The prevalence of persistent AF at baseline was 22%. The incidence of persistent AF at 1 year was 26 per 1000 person-years, ranging from 15 in New York Heart Association class I/II to 44 in class III/IV. AF occurred either at the same time as or prior to HF in 76% of patients and following HF in 24%. A risk score was developed to predict the occurrence of persistent AF. The annual risk of persistent AF developing was 0.5% (0%–1.3%) for those in the low-risk group compared with 15% (3.4%–26.6%) in the high-risk group. Despite a high prevalence of persistent AF in patients with HF, the incidence of persistent AF is relatively low. This is predominantly due to AF coinciding with or preceding the development of HF. The annual risk of persistent AF developing can be estimated from clinical variables. [source]


    Hyponatremia and Heart Failure: Treatment Considerations

    CONGESTIVE HEART FAILURE, Issue 1 2006
    Domenic A. Sica MD
    Hyponatremia as it occurs in the heart failure patient is a multifactorial process. The presence of hyponatremia in the heart failure patient correlates with both the severity of the disease and its ultimate outcome. The therapeutic approach to the treatment of hyponatremia in heart failure has traditionally relied on attempts to improve cardiac function while at the same time limiting fluid intake. In more select circumstances, hypertonic saline, loop diuretics, and/or lithium or demeclocycline have been used. The latter two compounds act by retarding the antidiuretic effect of vasopressin but carry with their use the risk of serious renal and/or cardiovascular side effects. Alternatively, agents that selectively block the type 2 vasopressin receptor increase free water excretion without any of the adverse consequences of other therapies. Conivaptan, lixivaptan, and tolvaptan are three such aquaretic drugs. Vasopressin receptor antagonists will redefine the treatment of heart failure-related hyponatremia and may possibly evolve as adjunct therapies to loop diuretics in diuretic-resistant patients. [source]


    Cerebral oxygenation is reduced during hyperthermic exercise in humans

    ACTA PHYSIOLOGICA, Issue 1 2010
    P. Rasmussen
    Abstract Aim: Cerebral mitochondrial oxygen tension (PmitoO2) is elevated during moderate exercise, while it is reduced when exercise becomes strenuous, reflecting an elevated cerebral metabolic rate for oxygen (CMRO2) combined with hyperventilation-induced attenuation of cerebral blood flow (CBF). Heat stress challenges exercise capacity as expressed by an increased rating of perceived exertion (RPE). Methods: This study evaluated the effect of heat stress during exercise on PmitoO2 calculated based on a Kety-Schmidt-determined CBF and the arterial-to-jugular venous oxygen differences in eight males [27 ± 6 years (mean ± SD); maximal oxygen uptake (VO2max) 63 ± 6 mL kg−1 min−1]. Results: The CBF, CMRO2 and PmitoO2 remained stable during 1 h of moderate cycling (170 ± 11 W, ~50% of VO2max, RPE 9–12) in normothermia (core temperature of 37.8 ± 0.4 °C). In contrast, when hyperthermia was provoked by dressing the subjects in watertight clothing during exercise (core temperature 39.5 ± 0.2 °C), PmitoO2 declined by 4.8 ± 3.8 mmHg (P < 0.05 compared to normothermia) because CMRO2 increased by 8 ± 7% at the same time as CBF was reduced by 15 ± 13% (P < 0.05). During exercise with heat stress, RPE increased to 19 (19–20; P < 0.05); the RPE correlated inversely with PmitoO2 (r2 = 0.42, P < 0.05). Conclusion: These data indicate that strenuous exercise in the heat lowers cerebral PmitoO2, and that exercise capacity in this condition may be dependent on maintained cerebral oxygenation. [source]


    Multiple fixed drug eruption due to drug combination

    CONTACT DERMATITIS, Issue 6 2005
    A. Yokoyama
    We report the case of a multiple fixed drug eruption (FDE) after taking 1 g of PL® and 100 mg of levofloxacin (Cravit®) at the same time. Patch tests with PL® alone, levofloxacin alone and the combination of PL® and levofloxacin were all negative on the involved and uninvolved sites. Lymphocyte stimulation tests were also negative for PL® alone, levofloxacin alone and the combination of PL® and levofloxacin. Oral provocation tests with PL® alone or levofloxacin alone produced no reactivation. However, we could provoke multiple erythematous plaques on the involved areas by taking a 1/10th dose of the combination of PL® and levofloxacin at the same time. Drug eruption due to a drug combination appears to be very rare. This is the first case of multiple FDE caused by taking the PL®–levofloxacin combination. [source]


    Contact allergy to farnesol in 2021 consecutively patch-tested patients: results of the IVDK

    CONTACT DERMATITIS, Issue 3 2004
    Farnesol is one of the fragrances considered to be a significant contact allergen. Therefore, it was decided by the European Union to label products containing farnesol. Farnesol was tested [5% petrolatum (pet.)] together with the standard series between 1 January 2003 and 30 June 2003 in 2021 consecutive patients, 1243 females and 778 males. Of these, 22 [1.1%, 95% confidence interval (CI): 0.7–1.6%] had a positive reaction to farnesol. 147 (8.1%) of the 1825 patients tested to Myroxylon pereirae resin (balsam of Peru, 25% pet.) at the same time reacted positively, as did 143 (7.8%) of the 1823 tested to the fragrance mix (FM) (8% pet.) and 34 (1.9%) of the 1831 tested to propolis (10% pet.). With regard to concomitant reactions in farnesol-positive patients, 5 of 22 reacted additionally to the FM [odds ratio (OR): 4.3; CI: 1.53–12.15] and 2 (of these 5) additionally to M. pereirae resin (OR: 1.27; CI: 0.29–5.54). The strongest association was seen with propolis (OR: 6.2; 95% CI: 1.4–27.7). Compared to those with negative reactions to farnesol, the group of patients allergic to farnesol was characterized by a higher proportion of young females and office workers, and the hand and the face were more often affected. In conclusion, farnesol is an important allergen. We recommend that farnesol be included in a fragrance patch-test preparation and that its use be regulated for consumer safety reasons. Furthermore, the extent of exposure to farnesol should be studied further. [source]


    Ball-Pen Probe Measurements in L-Mode and H-Mode on ASDEX Upgrade

    CONTRIBUTIONS TO PLASMA PHYSICS, Issue 9 2010
    J. Adamek
    Abstract Experimental investigations of the plasma potential, the poloidal electric field and the electron temperature during L-mode and ELMy H-mode were performed on ASDEX Upgrade by means of a probe head containing four ball-pen probes and four Langmuir probes. This allows the floating potential and the plasma potential, which are related through the electron temperature, to be measured at the same time. Thus a combination of ball-pen probes and Langmuir probes offers the possibility to determine the electron temperature directly with high temporal resolution. This novel temperature measurement method is compared to standard techniques. The influence of the electron temperature on the usual calculation of the poloidal electric field from the gradient of the floating potential is determined by comparison to the poloidal electric field derived from the plasma potential (© 2010 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source]
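
    The relation underlying the method: a ball-pen probe measures (approximately) the plasma potential and a Langmuir probe the floating potential, and the two differ by a multiple of the electron temperature. The sketch below applies this relation; the coefficient alpha depends on the gas and probe geometry, and the value 2.2 and the sample potentials are assumptions for illustration, not values from the paper.

    ```python
    def electron_temperature(phi_plasma, phi_floating, alpha=2.2):
        """Electron temperature (eV) from combined probe potentials.

        Standard probe theory gives phi_floating = phi_plasma - alpha * Te,
        so Te = (phi_plasma - phi_floating) / alpha; alpha is gas- and
        geometry-dependent (the 2.2 default here is an assumption).
        """
        return (phi_plasma - phi_floating) / alpha

    print(electron_temperature(phi_plasma=20.0, phi_floating=-15.0))  # ~15.9 eV
    ```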


    Sum rules and exact relations for quantal Coulomb systems

    CONTRIBUTIONS TO PLASMA PHYSICS, Issue 5-6 2003
    V.M. Adamyan
    Abstract A complex response function describing the reaction of a multi-particle system to a weak alternating external field is the boundary value of a Nevanlinna-class function (i.e. a holomorphic function with non-negative imaginary part in the upper half-plane). Attempts at direct calculation of response functions based on standard approximations of kinetic theory for real Coulomb condensed systems often result in considerable discrepancies with experiments and computer simulations. At the same time, a relatively simple approach using only the exact values of the leading asymptotic terms of the response function permits one to restrict essentially the subset of Nevanlinna-class functions containing this response function, and in this way to obtain sufficient data to explain and predict experimental results. Mathematical details of this approach are demonstrated on an example in which the response function is the (external) dynamic electrical conductivity of cold dense hydrogen-like plasmas. In particular, the exact values of the leading terms of the asymptotic expansions of the conductivity are calculated. (© 2003 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source]


    Patterns of Compliance with the German Corporate Governance Code

    CORPORATE GOVERNANCE, Issue 4 2008
    Till Talaulicar
    ABSTRACT Manuscript Type: Empirical Research Question/Issue: This study investigates whether the form of compliance with the recommendations of the German Corporate Governance Code (GCGC) appears to be idiosyncratic to a specific company or features similarities across firms. The major aim of this research is thus to explore the ability of a classification of compliance patterns to account for the similarities and differences between firms regarding their conformity with the GCGC. Research Findings/Insights: Based on seven dimensions of code compliance, cluster analysis is used to identify discrete groups of companies with similar patterns of code observance. We determine eight patterns of compliance which are characterized by distinct forms of code conformity. Theoretical Academic Implications: The identified cluster solution does not merely reflect the number of rejected code recommendations. Rather, companies with very similar rates of overall compliance with the GCGC are assigned to different clusters because they feature, at the same time, different patterns of code conformity. These findings imply that governance prediction and governance performance studies have to overcome overly aggregated measures of code compliance which only incorporate the number of rejected code recommendations. Practitioner/Policy Implications: This study provides evidence to practitioners and policy makers that firms can be classified regarding their compliance with the code recommendations. In-depth analyses of the identified patterns of compliance furthermore reveal that some patterns may indicate less well substantiated deviations from the code and partly even decouplings of the declared compliance practices. [source]


    Oversight and Delegation in Corporate Governance: deciding what the board should decide

    CORPORATE GOVERNANCE, Issue 1 2006
    Michael Useem
    American boards of directors increasingly treat their delegation of authority to management as a careful and self-conscious decision. Numerically dominated by non-executives, boards recognize that they cannot run the company, and many are now seeking to provide stronger oversight of the company without crossing the line into management. Based on interviews with informants at 31 major companies, we find that annual calendars and written protocols are often used to allocate decision rights between the board and management. Written protocols vary widely, ranging from detailed and comprehensive to skeletal and limited in scope. While useful, such calendars and protocols do not negate the need for executives to make frequent judgement calls on what issues should go to the board and what should remain within management. Executives still set much of the board's decision-making agenda, and despite increasingly asserting their sovereignty in recent years, directors remain substantially dependent upon the executives' judgement on what should come to the board. At the same time, a norm is emerging among directors and executives that the latter must be mindful of what directors want to hear and believe they should decide. [source]


    Evolutionary origins of the purinergic signalling system

    ACTA PHYSIOLOGICA, Issue 4 2009
    G. Burnstock
    Abstract Purines appear to be the most primitive and widespread chemical messengers in the animal and plant kingdoms. The evidence for purinergic signalling in plants, invertebrates and lower vertebrates is reviewed. Much is based on pharmacological studies, but important recent studies have utilized the techniques of molecular biology and receptors have been cloned and characterized in primitive invertebrates, including the social amoeba Dictyostelium and the platyhelminth Schistosoma, as well as the green algae Ostreococcus, which resemble P2X receptors identified in mammals. This suggests that contrary to earlier speculations, P2X ion channel receptors appeared early in evolution, while G protein-coupled P1 and P2Y receptors were introduced either at the same time or perhaps even later. The absence of gene coding for P2X receptors in some animal groups [e.g. in some insects, roundworms (Caenorhabditis elegans) and the plant Arabidopsis] in contrast to the potent pharmacological actions of nucleotides in the same species, suggests that novel receptors are still to be discovered. [source]


    A multicriterion classification approach for assessing the impact of environmental policies on the competitiveness of firms

    CORPORATE SOCIAL RESPONSIBILITY AND ENVIRONMENTAL MANAGEMENT, Issue 1 2007
    V. Hontou
    Abstract The key objective of the European Union's environmental policy is to successfully combine environmental protection with sustainable economic growth in the long term. Nowadays, it is increasingly recognized that environmental policies, besides increasing production cost, may at the same time give firms incentives to undertake innovative actions and/or to develop and exploit differentiation opportunities. Both differentiation capacity and cost increase are strongly dependent on a multiplicity of internal and external factors, such as energy intensity, the type of technology used, and the characteristics of the competitive environment. This paper presents a multicriterion approach for classifying firms into discrete categories of possible impact, according to their sensitivity to cost increases and their differentiation potential. The resulting environment–competitiveness matrix can be exploited for establishing sustainability strategies and designing effective policies in the industrial sector. Copyright © 2006 John Wiley & Sons, Ltd and ERP Environment. [source]


    Managing the Co-operation–Competition Dilemma in R&D Alliances: A Multiple Case Study in the Advanced Materials Industry

    CREATIVITY AND INNOVATION MANAGEMENT, Issue 1 2010
    Dries Faems
    Generating value in R&D alliances requires intensive and fine-grained interaction between collaborating partners. At the same time, more intensive co-operation increases the risk of competitive abuse of the R&D alliance by one or more partners. In this study, we explore how managers address the fundamental tension between the need for co-operation and the risk of competition, using an in-depth case study of five R&D alliances in the advanced materials industry. Based on our data, we identify two relational strategies to enhance co-operation between engineers of different partners (i.e., adopting boundary-spanning activities and installing similar technical equipment) and three structural strategies to mitigate the risk of such intensified co-operation (i.e., definition of partner-specific task domains, definition of partner-specific knowledge domains and definition of partner-specific commercial domains). In addition, we find that partners tend to use particular combinations of such relational and structural strategies at different stages of the alliance life-cycle to address the co-operation–competition dilemma. [source]