Key Problem


Selected Abstracts


DECENTRALIZING HEALTH SERVICES IN THE UK: A NEW CONCEPTUAL FRAMEWORK

PUBLIC ADMINISTRATION, Issue 2 2008
STEPHEN PECKHAM
Decentralization is a central plank of current government health policy. However, it is possible to discern both centralist and decentralist movements in the UK. This paper examines existing frameworks of decentralization, asking whether they can establish whether a policy is decentralist or not, and identifies a number of problems that limit their value. Key problems relate to the way decentralization is conceptualized and defined. Existing frameworks are also highly contextualized and are therefore of limited value when applied in different contexts. The paper then presents a new framework which, it is argued, provides a more useful way of examining centralization and decentralization: it offers a way of categorizing policies and actions while avoiding the problem of contextual constraint. The paper ends with a discussion of how the framework can be applied in a health context and shows how it helps avoid the problems found in previous discussions of decentralization. [source]


Evolving modular networks with genetic algorithms: application to nonlinear time series

EXPERT SYSTEMS, Issue 4 2004
A.S. Cofiño
Abstract: A key problem of modular neural networks is finding the optimal aggregation of the different subtasks (or modules) of the problem at hand. Functional networks provide a partial solution to this problem, since the inter-module topology is obtained from domain knowledge (functional relationships and symmetries). However, the learning process may be too restrictive in some situations, since the resulting modules (functional units) are assumed to be linear combinations of selected families of functions. In this paper, we present a non-parametric learning approach for functional networks using feedforward neural networks for approximating the functional modules of the resulting architecture; we also introduce a genetic algorithm for finding the optimal intra-module topology (the appropriate balance of neurons for the different modules according to the complexity of their respective tasks). Some benchmark examples from nonlinear time-series prediction are used to illustrate the performance of the algorithm for finding optimal modular network architectures for specific problems. [source]
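
To make the search concrete, the following is a minimal sketch of the idea, not the authors' implementation: a genetic algorithm evolves the number of hidden neurons assigned to each module, and the fitness function (here a synthetic stand-in for "train the modular network and measure validation error") scores each allocation.

```python
# Hypothetical sketch of the intra-module topology search described above:
# a genetic algorithm evolves the number of hidden neurons per module.
import random

N_MODULES = 3          # functional units in the modular network
MAX_NEURONS = 20       # upper bound on hidden neurons per module
POP_SIZE = 30
GENERATIONS = 50

def fitness(allocation):
    """Stand-in for 'train the modular network and return validation error'.
    Here we simply pretend the tasks need 4, 12 and 7 neurons respectively."""
    target = [4, 12, 7]
    return sum((a - t) ** 2 for a, t in zip(allocation, target))

def random_individual():
    return [random.randint(1, MAX_NEURONS) for _ in range(N_MODULES)]

def crossover(a, b):
    cut = random.randint(1, N_MODULES - 1)
    return a[:cut] + b[cut:]

def mutate(ind, rate=0.2):
    return [min(MAX_NEURONS, max(1, g + random.choice((-1, 1))))
            if random.random() < rate else g for g in ind]

def tournament(pop, k=3):
    return min(random.sample(pop, k), key=fitness)

pop = [random_individual() for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    pop = [mutate(crossover(tournament(pop), tournament(pop)))
           for _ in range(POP_SIZE)]

best = min(pop, key=fitness)
print("best neuron allocation per module:", best)
```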


Study on battlespace ontology construction approach

INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, Issue 12 2005
Jun-feng Song
In Network Centric Warfare (NCW), sensor networks are far more capable than ever before, and a force can obtain a mass of information about the battlespace in real or near-real time. How to use this information effectively and transform information superiority into knowledge superiority is a key problem for NCW research. Solving it first requires a suitable knowledge infrastructure. In this article, battlespace ontology is considered as the knowledge infrastructure of NCW, and we propose a battlespace ontology construction approach based on OWL, which consists of two parts: a formal ontology construction approach to construct subdomain ontologies of the battlespace, and a formal ontology integration approach to integrate them. A concrete application of the approach to an air combat battlespace is then given. © 2005 Wiley Periodicals, Inc. Int J Int Syst 20: 1219-1231, 2005. [source]
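
As a rough illustration of the two-part approach, hypothetical OWL subdomain ontologies can be built and merged with rdflib; every class and property name below is an invented example, not taken from the paper.

```python
# A minimal sketch (not the authors' formal approach): build two OWL
# subdomain ontologies, then integrate them into one battlespace ontology.
from rdflib import Graph, Namespace, RDF, RDFS, OWL

BS = Namespace("http://example.org/battlespace#")  # illustrative namespace

def make_subdomain(class_names):
    g = Graph()
    g.bind("bs", BS)
    for name in class_names:
        g.add((BS[name], RDF.type, OWL.Class))
    return g

# Subdomain ontologies, e.g. for an air-combat battlespace
sensors = make_subdomain(["Sensor", "Radar"])
sensors.add((BS.Radar, RDFS.subClassOf, BS.Sensor))

platforms = make_subdomain(["Platform", "Aircraft"])
platforms.add((BS.Aircraft, RDFS.subClassOf, BS.Platform))

# Integration: merge the subdomain graphs and add cross-domain relations
battlespace = sensors + platforms
battlespace.add((BS.carriesSensor, RDF.type, OWL.ObjectProperty))
battlespace.add((BS.carriesSensor, RDFS.domain, BS.Platform))
battlespace.add((BS.carriesSensor, RDFS.range, BS.Sensor))

print(battlespace.serialize(format="turtle"))
```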


Survey of quantitative feedback theory (QFT)

INTERNATIONAL JOURNAL OF ROBUST AND NONLINEAR CONTROL, Issue 10 2001
Isaac Horowitz
QFT is an engineering design theory devoted to the practical design of feedback control systems. The foundation of QFT is that feedback is needed in control only when plant (P), parameter and/or disturbance (D) uncertainties (sets 𝒫 = {P}, 𝒟 = {D}) exceed the acceptable (A) system performance uncertainty (set 𝒜 = {A}). The principal properties of QFT are as follows. (1) The amount of feedback needed is tuned to the (𝒫, 𝒟, 𝒜) sets. If 𝒜 'exceeds' (𝒫, 𝒟), feedback is not needed at all. (2) The simplest modelling is used: (a) command, disturbance and sensor noise inputs, and (b) the available sensing points and the defined outputs. No special controllability test is needed in either linear or non-linear plants. It is inherent in the design procedure. There is no observability problem because uncertainty is included. The number of independent sensors determines the number of independent loop transmissions (Li), the functions which provide the benefits of feedback. (3) The simplest mathematical tools have been found most useful, primarily frequency response. The uncertainties are expressed as sets in the complex plane. The need for the larger 𝒫, 𝒟 sets to be squeezed into the smaller 𝒜 set results in bounds on the Li(jω) in the complex plane. In the more complex systems a key problem is the division of the 'feedback burden' among the available Li(jω). Point-by-point frequency synthesis tremendously simplifies this problem. This is also true for highly uncertain non-linear and time-varying plants which are converted into rigorously equivalent linear time invariant plant sets and/or disturbance sets with respect to the acceptable output set 𝒜. Fixed point theory justifies the equivalence. (4) Design trade-offs are highly transparent in the frequency domain: between design complexity and cost of feedback (primarily bandwidth), sensor noise levels, plant saturation levels, number of sensors needed, relative sizes of 𝒫 and 𝒟, and cost of feedback. The designer sees the trade-offs between these factors as he proceeds and can decide according to their relative importance in his particular situation. QFT design techniques with these properties have been developed step by step for: (i) highly uncertain linear time invariant (LTI) SISO single- and multiple-loop systems, MIMO single-loop matrix and multiple-loop matrix systems; and (ii) non-linear and time-varying SISO and MIMO plants, and to a more limited extent for plants with distributed control inputs and sensors. QFT has also been developed for single- and multiple-loop dithered non-linear (adaptive) systems with LTI plants, and for a special class (FORE) of non-linear compensation. New techniques have been found for handling non-minimum-phase (NMP) MIMO plants, plants with both zeros and poles in the right half-plane and LTI plants with incidental hard non-linearities such as saturation. [source]
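
The point-by-point idea in property (3) can be illustrated with a toy computation, not Horowitz's actual procedure: at each design frequency the uncertain plant template must be squeezed into the acceptable spread, which puts a lower bound on the loop gain. The plant family, spec and frequencies below are assumptions.

```python
# Toy point-by-point illustration: at each frequency, find the smallest
# controller gain that squeezes the plant template {P} into the spread {A}.
import numpy as np

omegas = [0.1, 1.0, 10.0]        # design frequencies (rad/s), assumed
spec_db = 3.0                     # acceptable closed-loop spread of |T| in dB

def template(omega):
    """Assumed uncertain plant set: P(jw) = k/(jw), gain k in [1, 10]."""
    ks = np.linspace(1.0, 10.0, 25)
    return ks / (1j * omega)

def closed_loop_spread_db(g, P):
    T = g * P / (1.0 + g * P)     # complementary sensitivity over the template
    mags = 20 * np.log10(np.abs(T))
    return mags.max() - mags.min()

for w in omegas:
    P = template(w)
    # bisect (in log space) for the smallest gain meeting the spec here
    lo, hi = 1e-3, 1e6
    for _ in range(60):
        mid = np.sqrt(lo * hi)
        if closed_loop_spread_db(mid, P) <= spec_db:
            hi = mid
        else:
            lo = mid
    print(f"w = {w:5.1f} rad/s: need controller gain g >= {hi:.2f}")
```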


Cautions and Concerns in Experimental Research on the Consumer Interest

JOURNAL OF CONSUMER AFFAIRS, Issue 3 2008
MARLA B. ROYNE
Most published consumer research presents data from surveys or other data analyses that, at best, report that certain things tend to happen at the same time. However, correlation does not mean causation; cause and effect relationships can only be concluded from controlled experiments. A key problem is that the use of experimental designs calls for various conceptual and pragmatic trade-offs that cannot be ignored. [source]


Accountability in the Regulatory State

JOURNAL OF LAW AND SOCIETY, Issue 1 2000
Colin Scott
Accountability has long been both a key theme and a key problem in constitutional scholarship. The centrality of the accountability debates in contemporary political and legal discourse is a product of the difficulty of balancing the autonomy given to those exercising public power with appropriate control. The traditional mechanisms of accountability to Parliament and to the courts are problematic because in a complex administrative state, characterized by widespread delegation of discretion to actors located far from the centre of government, the conception of centralized responsibility upon which traditional accountability mechanisms are based is often fictional. The problems of accountability have been made manifest by the transformations wrought on public administration by the new public management (NPM) revolution, which have further fragmented the public sector. In this article it is argued that if public lawyers are to be reconciled to these changes then it will be through recognizing the potential for additional or extended mechanisms of accountability in supplementing or displacing traditional accountability functions. The article identifies and develops two such extended accountability models: interdependence and redundancy. [source]


Multiculturalism and the Willingness of Citizens to Defer to Law and to Legal Authorities

LAW & SOCIAL INQUIRY, Issue 4 2000
Tom R. Tyler
A key problem in trying to manage diverse societies is finding social policies that will be acceptable to all individuals and groups. Studies suggest that this problem may not be as intractable as is often believed, since people's acceptance of policies is shaped to an important degree by the fairness of the procedures used by authorities to make policy. When policies are fairly made, they gain widespread support, even among those who may feel that the consequences of the policy for them or their group are undesirable or even unfair. These findings support an optimistic view of the ability of authorities to manage diverse societies. On the other hand, research suggests that the ability of procedural justice to bridge differences among individuals and groups may not be equally strong under all conditions. People's willingness to accept policies is more influenced by procedural justice judgments when they identify with the society that the authorities represent and view them as representing a group of which they are members. They are less influenced by procedural justice judgments when they identify more strongly with subgroups than with society and/or view the authorities as representatives of a group to which they do not belong. [source]


Measuring the plasma environment at Mercury: The fast imaging plasma spectrometer

METEORITICS & PLANETARY SCIENCE, Issue 9 2002
P. L. KOEHN
Three primary populations of ions exist at Mercury: solar wind, magnetospheric, and pickup ions. These pickup ions are generated through the ionization of Mercury's exosphere or are sputtered particles from the Mercury surface. A comprehensive mission to Mercury, such as MESSENGER (MErcury Surface, Space ENvironment, GEochemistry, and Ranging), should include a sensor that is able to determine the dynamical properties and composition of all these plasma components. An instrument to measure the composition of these ion populations and their three-dimensional velocity distribution functions must be lightweight, fast, and have a very large field of view. The fast imaging plasma spectrometer (FIPS) is an imaging mass spectrometer, part of NASA's MESSENGER mission, the first Mercury orbiter. This versatile instrument has a very small footprint, and its mass is ~1 order of magnitude less than that of other comparable systems. It maintains a nearly full-hemisphere field of view, suitable for either spinning or three-axis-stabilized platforms. The major innovation enabling this sensor is a new deflection system geometry that provides a large instantaneous (~1.5π) field of view. This novel electrostatic analyzer system is then combined with a position-sensitive time-of-flight system. We discuss the design and prototype tests of the FIPS deflection system and show how this system is expected to address one key problem in Mercury science, that of the nature of the radar-bright regions at the Hermean poles. [source]
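
The time-of-flight principle behind such an instrument can be sketched as follows: the electrostatic analyzer passes ions of a known energy per charge E/q, and the measured flight time t over a path d then yields m/q = 2(E/q)(t/d)². The numbers below are illustrative, not FIPS parameters.

```python
# Back-of-envelope sketch of energy-per-charge analysis plus time-of-flight.
E_PER_Q_EV = 1000.0          # selected energy per charge [eV], assumed
D_TOF = 0.05                 # time-of-flight path length [m], assumed
EV = 1.602176634e-19         # J per eV
AMU = 1.66053906660e-27      # kg per atomic mass unit

def mass_per_charge_amu(t_seconds):
    v = D_TOF / t_seconds                       # ion speed in the TOF section
    m_per_q = 2.0 * E_PER_Q_EV * EV / v**2      # kg per elementary charge
    return m_per_q / AMU

for t_ns in (115, 230, 460):                    # hypothetical flight times
    t = t_ns * 1e-9
    print(f"t = {t_ns:4d} ns  ->  m/q ~ {mass_per_charge_amu(t):6.1f} amu/e")
```

With these assumed settings the three flight times come out near 1, 4 and 16 amu/e, i.e. H+, He+ and O+, which is the kind of species separation the instrument performs.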


Light trapping in organic solar cells

PHYSICA STATUS SOLIDI (A) APPLICATIONS AND MATERIALS SCIENCE, Issue 12 2008
Michael Niggemann
Abstract One key problem in optimizing organic solar cells is to maximize the absorption of incident light while keeping the charge carrier transport paths as short as possible in order to minimize recombination losses during charge carrier extraction. The large versatility of organic semiconductors and compositions requires specific optimization of each system. Because the functional layers are only on the order of several tens of nanometres thick, coherent optics has to be considered, and interference effects therefore play a dominant role. Here we present and discuss concepts for light trapping in organic solar cells: wide-gap layers in planar solar cells, folded solar cell architectures that benefit from illumination under inclined incident angles and from multiple reflections and absorptions, and diffraction gratings embossed into the photoactive layer. (© 2008 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source]
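
Why coherence matters at these thicknesses can be seen from a toy standing-wave calculation (an ideal mirror and a lossless medium, not a full transfer-matrix model of a real stack): the reflected field interferes with the incident one, so a thin absorber's position relative to the back electrode controls how much intensity it sees.

```python
# Minimal sketch: optical intensity in front of a perfectly reflecting back
# electrode (r = -1) at normal incidence.  A ~10 nm absorber sitting at a
# node of the standing wave sees almost no intensity.
import numpy as np

wavelength_nm = 550.0
n_medium = 1.8                    # refractive index of the organic layer, assumed
k = 2 * np.pi * n_medium / wavelength_nm

for z_nm in (0, 20, 40, 60, 80):  # distance from the mirror
    # incident + reflected wave: E(z) = exp(-ikz) - exp(+ikz) = -2i sin(kz)
    intensity = abs(np.exp(-1j * k * z_nm) - np.exp(1j * k * z_nm)) ** 2
    print(f"z = {z_nm:3d} nm: relative |E|^2 = {intensity:5.2f}")
```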


Ideas, bargaining and flexible policy communities: policy change and the case of the Oxford Transport Strategy

PUBLIC ADMINISTRATION, Issue 3 2003
Geoffrey Dudley
Critiques of policy networks have highlighted particularly the inability of concepts such as policy communities to explain policy change. The established construction of policy community places it chiefly as a metaphor for a relatively stable network within the policy process, which emphasizes the resource dependencies between key stakeholders. Typically, a process of bargaining brings about accommodation and a state of negotiated order. However, a key problem arises in explaining major policy change where an established policy community persists. One solution here is to appreciate that, over time, dominant ideas and associated policy meanings may shift appreciably within an otherwise durable policy community. Thus, even a seemingly insulated policy community, under certain conditions, may not be immune to idea mutation and new policy meanings. Given the central importance of policy communities, these shifts may induce significant policy change. A case study of this type is provided by the Oxford Transport Strategy (OTS), where a dual process of change took place. On one level of analysis, a challenge to the policy community produced a typical bargaining strategy, with an emphasis on negotiated order. On another level of analysis, however, the terms of the policy debate shifted markedly, and produced a new meaning for the key concept of integrated transport within the policy community. In turn, this process induced significant policy change. The article concludes that, ironically, the survival of a policy community depends on its ability to re-create itself by visualizing a new future. [source]


Varieties of second modernity: the cosmopolitan turn in social and political theory and research

THE BRITISH JOURNAL OF SOCIOLOGY, Issue 3 2010
Ulrich Beck
Abstract The theme of this special issue is the necessity of a cosmopolitan turn in social and political theory. The question at the heart of this introductory chapter takes the challenge of 'methodological cosmopolitanism', already addressed in a Special Issue on Cosmopolitan Sociology in this journal (Beck and Sznaider 2006), an important step further: How can social and political theory be opened up, theoretically as well as methodologically and normatively, to a historically new, entangled Modernity which threatens its own foundations? How can it account for the fundamental fragility, the mutability of societal dynamics (of unintended side-effects, domination and power), shaped by the globalization of capital and risks at the beginning of the twenty-first century? What theoretical and methodological problems arise and how can they be addressed in empirical research? In the following, we will develop this 'cosmopolitan turn' in four steps: firstly, we present the major conceptual tools for a theory of cosmopolitan modernities; secondly, we de-construct Western modernity by using examples taken from research on individualization and risk; thirdly, we address the key problem of methodological cosmopolitanism, namely the problem of defining the appropriate unit of analysis; and finally, we discuss normative questions, perspectives, and dilemmas of a theory of cosmopolitan modernities, in particular problems of political agency and prospects of political realization. [source]


Imperfect transparency and the strategic use of information: an ever present temptation for central bankers?

THE MANCHESTER SCHOOL, Issue 5 2003
Andrew Hughes Hallett
Most economists argue that transparency in monetary policy is desirable because it helps the private sector make better informed decisions. They also argue that a lack of transparency has been a key problem in Europe's monetary policy. Using standard models, in which there are also opportunities to use fiscal policy, we show that a lack of transparency will have very different effects depending on whether it represents a lack of political transparency or a lack of economic (or information) transparency. The former allows the central bank to create and exploit a 'strategic' reputation to its own advantage. The latter does not. Thus, political transparency helps us understand how monetary policy decisions are made. But economic transparency would reveal what information went into those decisions. [source]


Defining paganism in the Carolingian world

EARLY MEDIEVAL EUROPE, Issue 4 2007
James Palmer
Generations of scholars have looked for evidence of 'paganism' in continental sources from the eighth and ninth centuries. This paper surveys some of the key problems in defining and conceptualizing the available literary evidence for such a project. Part one argues for a return to the sources to help escape the intellectual baggage created by discussions of 'pan-Germanic paganism', interpretatio Romana and, more recently, folk practices. From the perspective of the sources' producers, paganism needs to be understood as a category of difference employed to provide a better definition of Christianity itself. In part two this line of thought is pursued through a brief study of the ways in which classical learning framed not only Carolingian attitudes to paganism, but also related strategies of moralizing. [source]


Consumer 'sovereignty' and policy issues in the development of product ecolabels

ENVIRONMENTAL POLICY AND GOVERNANCE, Issue 1 2001
Alain Nadaï
Quality labels increasingly focus on product characteristics whose assessment requires substantial scientific expertise. Economists approach these labels as market mechanisms (signalling, reputation, or market differentiation) and ignore their institutional dimension. We contend that, by doing so, they fail to address key problems faced by regulators when developing these labels. The first part fleshes out this idea by examining the institutional dimension of the European ecolabel; we present the negotiation of the paints and varnishes ecolabelling criteria, a success story. The second part discusses three theoretical approaches to product labelling and proposes directions for further research on the subject. Copyright © 2001 John Wiley & Sons, Ltd and ERP Environment. [source]


How best to halt and/or revert UV-induced skin ageing: strategies, facts and fiction

EXPERIMENTAL DERMATOLOGY, Issue 3 2008
Ralf Paus
Besides obvious market pressures, increasing insight into the mechanistic overlap between UV-induced skin cancer and UV-induced skin ageing has contributed to this development. Also, as strategies that work to antagonize intrinsic skin ageing/senescence may also be exploited against photoageing (and vice versa!), it has become an important skin research challenge to dissect both the differences and the overlap mechanisms between these intertwined, yet distinct phenomena. Finally, the current surge in putative 'antiageing' products, devices, and strategies, too many of which boldly promise to fight and/or repair the perils that come along with a lifetime spent in the sun in the absence of convincing evidence of efficacy, makes it particularly pertinent to critically review the available evidence behind often-made antiageing claims. The current CONTROVERSIES feature, therefore, aims to provide both guidance through, and critical voices in, the antiageing circus. Here, a panel of experts defines relevant key problems, points the uninitiated to intriguing aspects of photoageing that one may not have considered before, highlights promising strategies for how best to halt and/or revert it, and spiritedly debates some controversially discussed approaches. [source]


The development and evaluation of a telepsychiatry service for prisoners

JOURNAL OF PSYCHIATRIC & MENTAL HEALTH NURSING, Issue 4 2004
S. LEONARD
The introduction of increasingly sophisticated telecommunication systems seems to offer opportunities to respond to some of the key problems around structural and spatial inequalities in access to health care. There is evidence which suggests that serious mental health problems are common among prisoners and that psychiatric comorbidity is the norm. Many prisoners have complex mental health needs, but more often than not these remain unaddressed. Telepsychiatry is one strategy to improve the accessibility and quality of mental health care in the prison setting. This paper first reviews the current prison health care system and then describes a research study focused on the development and evaluation of a telepsychiatry service for prisoners. The study investigated what is lost or gained when a psychiatric assessment is conducted via telepsychiatry. The researcher compared the inter-rater reliability between two raters interviewing 80 participants in an observer/interviewer split configuration in telepsychiatry and same-room settings. The measure used was the Comprehensive Psychopathology Rating Scale. Prisoners and prison staff also took part in semi-structured interviews which focused on their satisfaction with the telepsychiatry service and its acceptability. A cost comparison of the telepsychiatry service with the existing visiting service was conducted. This paper outlines the study design and focuses on the potential impact that telepsychiatry may have upon the practice setting. [source]
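
Inter-rater agreement of the kind compared here is commonly summarized with an intraclass correlation; the following is a minimal two-way random-effects ICC(2,1) computation for two raters, on made-up scores rather than data from the study.

```python
# Minimal ICC(2,1) sketch for two raters scoring the same participants.
import numpy as np

# rows: participants; columns: the two raters (hypothetical rating totals)
scores = np.array([[12, 14], [30, 28], [22, 21], [8, 10],
                   [17, 18], [25, 27], [11, 11], [19, 16]], dtype=float)

n, k = scores.shape
grand = scores.mean()
ms_rows = k * ((scores.mean(axis=1) - grand) ** 2).sum() / (n - 1)  # subjects
ms_cols = n * ((scores.mean(axis=0) - grand) ** 2).sum() / (k - 1)  # raters
ss_err = ((scores - grand) ** 2).sum() \
         - (n - 1) * ms_rows - (k - 1) * ms_cols
ms_err = ss_err / ((n - 1) * (k - 1))

icc21 = (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err
                              + k * (ms_cols - ms_err) / n)
print(f"ICC(2,1) = {icc21:.2f}")
```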


An Approach to Calculating Wear on Annular Non-Return Valves

MACROMOLECULAR MATERIALS & ENGINEERING, Issue 11 2002
Helmut Potente
Abstract The serviceability of non-return valves has a major influence on the productivity of the injection molding process. At a meeting of experts held at our Institute, closing behavior and wear emerged as the key problems encountered in practice. The investigations conducted to address these questions have shown that both an improved closing behavior and a lower level of wear can be achieved by reducing the inside radius of the locking ring. Figure: pressure profile over the length of a non-return valve (n = 0.4; flow rate 25,000 mm³/s). [source]


Microstrip antennas for cellular and wireless communication systems

MICROWAVE AND OPTICAL TECHNOLOGY LETTERS, Issue 5 2002
obodzian
Abstract The Letter describes problems related to the use of microstrip antennas in cellular and wireless telecommunications systems. Because of its unique properties, microstrip technology is now often used to manufacture small internal antennas for portable terminals as well as antenna arrays for base stations. It also seems to be a very promising technology for multisystem antennas, for which there is an ever-growing demand. The Letter also presents a short overview of currently available solutions and of ones under development by the authors, along with some key problems related to their design. © 2002 Wiley Periodicals, Inc. Microwave Opt Technol Lett 34: 380-384, 2002; Published online in Wiley InterScience (www.interscience.wiley.com). DOI 10.1002/mop.10468 [source]


Age estimation of archaeological remains using amino acid racemization in dental enamel: A comparison of morphological, biochemical, and known ages-at-death

AMERICAN JOURNAL OF PHYSICAL ANTHROPOLOGY, Issue 2 2009
R.C. Griffin
Abstract The poor accuracy of most current methods for estimating age-at-death in adult human skeletal remains is among the key problems facing palaeodemography. In forensic science, this problem has been solved for unburnt remains by the development of a chemical method for age estimation, using amino acid racemization in collagen extracted from dentine. Previous application of racemization methods to archaeological material has proven problematic. This study presents the application to archaeological human remains of a new age estimation method utilizing amino acid racemization in a potentially closed system: the dental enamel. The amino acid composition and extent of racemization in enamel from two Medieval cemeteries (Newcastle Blackgate and Grantham, England) and from a documented age-at-death sample from a 19th century cemetery (Spitalfriedhof St Johann, Switzerland) were determined. Alterations in the amino acid composition were detected in all populations, indicating that diagenetic change had taken place. However, in the Medieval populations, these changes did not appear to have substantially affected the relationship between racemization and age-at-death, with a strong relationship being retained between aspartic acid racemization and the morphological age estimates. In contrast, there was a poor relationship between racemization and age in the post-medieval documented age-at-death population from Switzerland. This appears to be due to leaching of amino acids post-mortem, indicating that enamel is not functioning as a perfectly closed system. Isolation of amino acids from a fraction of enamel which is less susceptible to leaching may improve the success of amino acid racemization for archaeological age estimation. Am J Phys Anthropol, 2009. © 2009 Wiley-Liss, Inc. [source]
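
The calibration logic behind racemization dating can be sketched as a regression of a transform of the aspartic acid D/L ratio against known ages, inverted for unknown samples. All numbers below are hypothetical teaching values, not measurements from the cited populations.

```python
# Illustrative calibration sketch: fit D/L against known ages, then invert.
import numpy as np

ages = np.array([18, 25, 34, 42, 55, 63, 71])        # known ages [years]
dl = np.array([0.040, 0.045, 0.052, 0.058, 0.068, 0.074, 0.080])  # D/L ratios

# First-order racemization kinetics suggest ln((1+D/L)/(1-D/L)) grows
# roughly linearly with age over this range.
y = np.log((1 + dl) / (1 - dl))
slope, intercept = np.polyfit(ages, y, 1)

def estimate_age(dl_ratio):
    return (np.log((1 + dl_ratio) / (1 - dl_ratio)) - intercept) / slope

print(f"estimated age for D/L = 0.060: {estimate_age(0.060):.0f} years")
```

Post-mortem leaching of the kind reported for the Swiss sample would shift the measured D/L away from this calibration line, which is exactly why the closed-system assumption matters.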


Polarization anisotropy of X-ray atomic factors and 'forbidden' resonant reflections

ACTA CRYSTALLOGRAPHICA SECTION A, Issue 5 2005
Vladimir E. Dmitrienko
Symmetry and physical aspects of 'forbidden' reflections excited by a local polarization anisotropy of the X-ray susceptibility are surveyed. Such reflections are observed near absorption edges, where the anisotropy is caused by distortions of the atomic electronic states owing to interaction with neighbouring atoms. As a consequence, they allow nontrivial information to be extracted about the resonant atoms' local environment and physical state. The unusual polarization properties of the considered reflections help to distinguish them from other types of 'forbidden' reflections. When such reflections are excited, it is possible, for example, to determine not only the intrinsic anisotropy of an atomic form factor but also additional anisotropy induced by thermal motion, point defects and/or incommensurate modulations. Even the local 'chirality' of atoms in centrosymmetric crystals is accessible. Unsolved key problems and possible future developments are addressed. [source]
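
The mechanism can be illustrated schematically: two atoms related by a screw-axis operation contribute to a forbidden reflection with opposite phase, so scalar atomic factors cancel exactly, while anisotropic (tensor) factors, rotated by the symmetry operation, do not. The geometry and tensor values below are invented for clarity.

```python
# Schematic sketch: scalar atomic factors cancel for a screw-axis-forbidden
# reflection, but symmetry-rotated anisotropic tensors leave a residue.
import numpy as np

# anisotropic atomic factor of atom 1 near the absorption edge (invented values)
f1 = np.diag([1.0, 2.0, 1.5])

# atom 2 is related to atom 1 by a 90-degree rotation about z (e.g. a screw
# axis) and enters this reflection with phase exp(i*pi) = -1
Rz = np.array([[0.0, -1.0, 0.0],
               [1.0,  0.0, 0.0],
               [0.0,  0.0, 1.0]])
f2 = Rz @ f1 @ Rz.T

F_scalar = np.trace(f1) / 3 - np.trace(f2) / 3  # isotropic parts cancel exactly
F_tensor = f1 - f2                              # anisotropic parts do not

print("scalar (isotropic) structure factor:", F_scalar)  # 0.0 -> 'forbidden'
print("tensor structure factor:\n", F_tensor)            # nonzero near the edge
```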


Bayesian Finite Markov Mixture Model for Temporal Multi-Tissue Polygenic Patterns

BIOMETRICAL JOURNAL, Issue 1 2009
Yulan Liang
Abstract Finite mixture models can provide insights into behavioral patterns as a source of heterogeneity in the dynamics of time-course gene expression data, by reducing the high dimensionality and exposing the major components of the underlying structure of the data in terms of unobservable latent variables. The latent structure of the dynamic transition process of gene expression changes over time can be represented by Markov processes. This paper addresses key problems in the analysis of large gene expression data sets that describe systemic temporal response cascades and dynamic changes to therapeutic doses in multiple tissues, such as liver, skeletal muscle, and kidney, from the same animals. A Bayesian finite Markov mixture model with a Dirichlet prior is developed for the identification of differentially expressed time-related genes and dynamic clusters. The deviance information criterion is applied to determine the number of components for model comparison and selection. The proposed Bayesian models are applied to multiple-tissue polygenic temporal gene expression data and compared to a Bayesian model-based clustering method, named CAGED. Results show that the proposed Bayesian finite Markov mixture model can well capture the dynamic changes and patterns of irregular complex temporal data. (© 2009 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source]
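
A drastically simplified sketch of the model class is a Gibbs sampler for a finite Gaussian mixture with a Dirichlet prior on the component weights; the paper's full model additionally places a Markov process on the latent states over time and uses DIC for model selection, neither of which is reproduced here.

```python
# Simplified Gibbs sampler: finite Gaussian mixture, Dirichlet prior on weights.
import numpy as np

rng = np.random.default_rng(0)

# synthetic 'expression' data from two regimes (hypothetical)
x = np.concatenate([rng.normal(-2, 1, 150), rng.normal(3, 1, 100)])
K, alpha, n_iter = 2, 1.0, 200            # components, Dirichlet prior, sweeps

pi = np.full(K, 1.0 / K)                  # mixture weights
mu = rng.normal(0, 3, K)                  # component means (unit variance)

for _ in range(n_iter):
    # 1. sample latent component labels given weights and means
    logp = np.log(pi) - 0.5 * (x[:, None] - mu[None, :]) ** 2
    p = np.exp(logp - logp.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    z = (p.cumsum(axis=1) > rng.random(len(x))[:, None]).argmax(axis=1)

    # 2. sample weights from the Dirichlet posterior
    counts = np.bincount(z, minlength=K)
    pi = rng.dirichlet(alpha + counts)

    # 3. sample means from their Gaussian posteriors (N(0, 10^2) prior)
    for k in range(K):
        var = 1.0 / (counts[k] + 1.0 / 100.0)
        mu[k] = rng.normal(var * x[z == k].sum(), np.sqrt(var))

print("posterior draw of weights:", np.round(pi, 2))
print("posterior draw of means:  ", np.round(mu, 2))
```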


Practical identifiability of biokinetic parameters of a model describing two-step nitrification in biofilms

BIOTECHNOLOGY & BIOENGINEERING, Issue 3 2008
D. Brockmann
Abstract Parameter estimation and model calibration are key problems in the application of biofilm models in engineering practice, where a large number of model parameters usually need to be determined from experimental data with only limited information content. In this article, identifiability of biokinetic parameters of a biofilm model describing two-step nitrification was evaluated based solely on bulk phase measurements of ammonium, nitrite, and nitrate. In addition to the impact of experimental conditions and available measurements, the influence of mass transport limitation within the biofilm and of the initial parameter values on identifiability of biokinetic parameters was evaluated. Selection of parameters for identifiability analysis was based on global mean sensitivities, while parameter identifiability was analyzed using local sensitivity functions. At most, four of the six most sensitive biokinetic parameters were identifiable from results of batch experiments at bulk phase dissolved oxygen concentrations of 0.8 or 5 mg O2/L. High linear dependences between the parameters within these subsets resulted in reduced identifiability. Mass transport limitation within the biofilm did not influence the number of identifiable parameters but in fact decreased collinearity between parameters, especially for parameters that are otherwise correlated with µAOB or µNOB. The choice of the initial parameter values had a significant impact on the identifiability of two parameter subsets, both of which included µAOB. Parameter subsets that did not include these correlated pairs were clearly identifiable independently of the choice of the initial parameter values. Biotechnol. Bioeng. 2008;101: 497-514. © 2008 Wiley Periodicals, Inc. [source]
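
The collinearity diagnostic relied on here can be sketched compactly: from a matrix of local sensitivity functions, the collinearity index of a parameter subset is 1/√(smallest eigenvalue) of the normalized subset Gram matrix, and large values flag near-linearly-dependent, poorly identifiable parameters. The sensitivity matrix below is random stand-in data, not output of the biofilm model.

```python
# Collinearity-index sketch for identifiability analysis from local
# sensitivity functions (in the style of Brun et al.).
import numpy as np

rng = np.random.default_rng(1)

# rows: measurement time points; columns: parameters (e.g. mu_AOB, ...)
S = rng.normal(size=(50, 4))
S[:, 3] = S[:, 0] + 0.05 * rng.normal(size=50)  # make columns 0 and 3 collinear

def collinearity_index(S, subset):
    S_sub = S[:, subset]
    S_norm = S_sub / np.linalg.norm(S_sub, axis=0)   # unit-length columns
    eigvals = np.linalg.eigvalsh(S_norm.T @ S_norm)
    return 1.0 / np.sqrt(eigvals.min())

print("subset (0, 1):", round(collinearity_index(S, [0, 1]), 1))  # low -> identifiable
print("subset (0, 3):", round(collinearity_index(S, [0, 3]), 1))  # high -> poorly identifiable
```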


What tools do we need to improve identification of child abuse?

CHILD ABUSE REVIEW, Issue 6 2005
Eileen Munro
Abstract Child protection work is being transformed by the introduction of information and communication technology (ICT) and other tools to improve frontline work. This article argues that current innovations are being developed without sufficient attention to understanding the needs of frontline workers. Taking the identification of child abuse as an example, the article shows how beginning with the question ,What tools do we need?' produces radically different answers from the current proposed tools such as the Information Sharing and Assessment database (ISA). The approach advocated here involves examining what aspects of the task frontline workers find difficult and identifying where they would most appreciate help. In relation to the problem of sharing information between professionals to ensure accurate assessment of risk, it is argued that the key problems do not lie in the technical process of sharing data but in professionals' ability to collect the necessary information, to interpret it accurately and to communicate it clearly. Copyright © 2005 John Wiley & Sons, Ltd. [source]