General Problem (general + problem)



Selected Abstracts


Critiquing Bachelor candidates' theses: are the criteria useful?

INTERNATIONAL NURSING REVIEW, Issue 2 2002
I. Kapborg RN
Abstract Nursing education programmes should be at an academic level and connected to research. In Sweden, empirical studies are generally required in order to obtain a Bachelor's degree, although in some cases these studies are replaced by a literature review. A study was conducted in which 13 theses produced in a department of nursing science were examined against 11 criteria that had been elaborated by reviewing international and national literature. Thereafter, the criteria themselves were scrutinized. Principal findings when critiquing the theses were that in eight theses the purpose was clearly identified and well defined in relation to the study accomplished; in three theses the purpose was indistinct and vague; and in two the definitions and research questions were lacking. The topic was relevant to the area of nursing in all theses. General problems identified were poor spelling and grammar, and unsatisfactory thesis structure. This article discusses whether criteria are useful when examining Bachelor candidates' theses. The authors report that the criteria seemed to be useful, giving some guidance for scrutinizing theses and facilitating correspondence. Criteria could serve as appropriate guidelines for increasing the quality of the theses as well as the quality of nursing. [source]


A phylogenetic analysis of the monogenomic Triticeae (Poaceae) based on morphology

BOTANICAL JOURNAL OF THE LINNEAN SOCIETY, Issue 1 2001
OLE SEBERG FLS
A cladistic analysis, primarily based on morphology, is presented for 40 diploid taxa representing the 24 monogenomic genera of the Triticeae. General problems related to the treatment of hybrids and supposedly allopolyploid heterogenomic taxa are highlighted. Special emphasis is given to taxa not traditionally included in Aegilops s.l. Most of the 33 characters used in the analysis are coded as binary. The only four multistate characters in the matrix are treated as unordered. Three diploid species of Bromus are used as the outgroup. The number of equally parsimonious trees found is very large (approx. 170,000; length = 107, ci = 0.36, ri = 0.75) and the strict consensus tree has an expectedly low level of resolution. However, most of the equally parsimonious trees owe their existence to an unresolved Aegilops clade. If this clade is replaced by its hypothetical ancestor, the number of equally parsimonious trees drops dramatically (48; length = 78, ci = 0.45, ri = 0.76). When trees for which more highly resolved compatible trees exist are excluded, only two trees remain. Bremer support is used as a measure of branch support. The trees based on morphology and on molecular data are largely incongruent. [source]


Maintaining Case-Based Reasoners: Dimensions and Directions

COMPUTATIONAL INTELLIGENCE, Issue 2 2001
David C. Wilson
Experience with the growing number of large-scale and long-term case-based reasoning (CBR) applications has led to increasing recognition of the importance of maintaining existing CBR systems. Recent research has focused on case-base maintenance (CBM), addressing such issues as maintaining consistency, preserving competence, and controlling case-base growth. A set of dimensions for case-base maintenance, proposed by Leake and Wilson, provides a framework for understanding and expanding CBM research. However, it also has been recognized that other knowledge containers can be equally important maintenance targets. Multiple researchers have addressed pieces of this more general maintenance problem, considering such issues as how to refine similarity criteria and adaptation knowledge. As with case-base maintenance, a framework of dimensions for characterizing more general maintenance activity, within and across knowledge containers, is desirable to unify and understand the state of the art, as well as to suggest new avenues of exploration by identifying points along the dimensions that have not yet been studied. This article presents such a framework by (1) refining and updating the earlier framework of dimensions for case-base maintenance, (2) applying the refined dimensions to the entire range of knowledge containers, and (3) extending the theory to include coordinated cross-container maintenance. The result is a framework for understanding the general problem of case-based reasoner maintenance (CBRM). Taking the new framework as a starting point, the article explores key issues for future CBRM research. [source]


Confusion Effect in a Reptilian and a Primate Predator

ETHOLOGY, Issue 8 2000
Carsten Schradin
The confusion effect is claimed to be one benefit of group living with respect to predator avoidance: it is more difficult for predators to capture prey that is surrounded by conspecifics than to capture an isolated individual. So far, the predictions of the confusion effect have been tested mainly in aquatic predators. As the confusion effect is seen to be a general problem for predators, terrestrial predators of two different vertebrate classes were used to test it. The prey (mealworms and black beetles, Tenebrio molitor) was harmless and had no chance of predator avoidance. Thus, confounding effects of group defence and enhanced vigilance were controlled. Both leopard geckos (Eublepharis macularius) and common marmosets (Callithrix jacchus) took longer to catch one out of several prey items than to catch a single prey item. Leopard geckos showed more fixations (changes of head position) when confronted with 20 mealworms than when confronted with only one mealworm, thus showing indications of being 'confused'. [source]


Multi-variable parameter estimation to increase confidence in hydrological modelling

HYDROLOGICAL PROCESSES, Issue 2 2002
Sten Bergström
Abstract The expanding use and increased complexity of hydrological runoff models have given rise to concern about overparameterization and the risk of compensating errors. One proposed way out is calibration and validation against additional observations, such as snow, soil moisture, groundwater or water quality. A general problem, however, when calibrating the model against more than one variable is the strategy for parameter estimation. The most straightforward method is to calibrate the model components sequentially. Recent results show that in this way the model may be locked into a parameter setting which is good enough for one variable but excludes proper simulation of other variables. This is particularly the case for water quality modelling, where a small compromise in terms of runoff simulation may lead to dramatically better simulations of water quality. This calls for an integrated model calibration procedure with a criterion that integrates more aspects of model performance than just river runoff. The use of multi-variable parameter estimation and internal control of the HBV hydrological model is discussed and highlighted by two case studies. The first example is from a forested basin in northern Sweden and the second is from an agricultural basin in the south of the country. A new calibration strategy, which is integrated rather than sequential, is proposed and tested. It is concluded that comparison of model results with more measurements than runoff alone can lead to increased confidence in the physical relevance of the model, and that the new calibration strategy can be useful for further model development. Copyright © 2002 John Wiley & Sons, Ltd. [source]
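
The integrated calibration idea described above can be illustrated with a small sketch in Python. It scores every candidate parameter set against all observed variables at once, using a weighted sum of Nash-Sutcliffe efficiencies, so a small loss in runoff fit can be traded for a large gain in, say, water-quality fit. The simulate() and sample_params() functions, the variable names and the weights are illustrative assumptions, not part of the HBV model or the authors' actual procedure.

```python
import numpy as np

def nash_sutcliffe(sim, obs):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 is no better than the observation mean."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def integrated_criterion(simulated, observed, weights):
    """Weighted sum of per-variable efficiencies (e.g. runoff, snow water
    equivalent, groundwater level, nitrogen concentration)."""
    return sum(w * nash_sutcliffe(simulated[v], observed[v]) for v, w in weights.items())

def calibrate(simulate, observed, weights, sample_params, n_trials=5000, seed=0):
    """Integrated (not sequential) calibration: each trial parameter set is
    judged on all variables simultaneously."""
    rng = np.random.default_rng(seed)
    best_params, best_score = None, -np.inf
    for _ in range(n_trials):
        params = sample_params(rng)          # draw a candidate parameter set
        score = integrated_criterion(simulate(params), observed, weights)
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score
```

With weights such as {"runoff": 0.7, "snow": 0.2, "nitrogen": 0.1}, a parameter set that sacrifices a little runoff accuracy for a much better nitrogen simulation can win, which is exactly the behaviour a sequential, runoff-first calibration rules out.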


Growing Disability Rates – the Gender Issue: The Dutch Case in an International Perspective

INTERNATIONAL SOCIAL SECURITY REVIEW, Issue 1 2005
Marcel Einerhand
In the Netherlands, with its remarkably high disability rates, a new phenomenon seems to be emerging. Growing disability rates in the past few years have been exclusively caused by the growing inflow of women into the disability schemes. Comparing the Dutch situation internationally shows that roughly the same problem seems to exist in those countries in which there is a more general problem of large inflow into disability. Women are overrepresented in these arrangements. The Dutch literature shows that there are many factors (both work- and non-work-related) that contribute to a larger push of women towards disability. We conclude that the benefit system can be seen as a sort of "filter". If the filter is weak, many persons will enter disability. If the pressure on women to enter is larger (or the forces to stop women from entering disability smaller), then inflow risks for women will be higher. [source]


Systematic quantum chemical study of DNA-base tautomers

JOURNAL OF COMPUTATIONAL CHEMISTRY, Issue 1 2004
M. Piacenza
Abstract The relative energies of the energetically low-lying tautomers of pyridone, cytosine, uracil, thymine, guanine, and iso-cytosine are studied by a variety of different quantum chemical methods. In particular, we employ density functional theory (DFT) using the six functionals HCTH407, PBE, BP86, B-LYP, B3-LYP, and BH-LYP, and the ab initio methods Hartree-Fock (HF), standard second-order Møller-Plesset perturbation theory (MP2), an improved version of it (SCS-MP2), and quadratic configuration interaction including single and double excitations (QCISD) and perturbative triple corrections [QCISD(T)]. A detailed basis set study is performed for the formamide/formamidic acid tautomeric pair. In general, large AO basis sets of at least valence triple-zeta (TZV) quality including f-functions are employed, which are found to be necessary for an accurate energetic description of the various structures. The performance of the more approximate methods is evaluated with QCISD(T)/TZV(2df,2dp) data taken as reference. In general it is found that DFT is not an appropriate method for the problem. For the tautomers of pyridone and cytosine, most density functionals, including the popular B3-LYP hybrid, predict a wrong energetic order, and only for guanine is the correct sequence of tautomers obtained with all functionals. Out of the density functionals tested, BH-LYP, which includes a rather large fraction of HF exchange, performs best. A consistent description of the nonaromatic versus aromatic tautomers seems to be a general problem, especially for pure, nonhybrid functionals. Tentatively, this could be assigned to the exchange potentials used, while the functional itself, including the correlation part, seems to be appropriate. Out of the ab initio methods tested, the new SCS-MP2 approach seems to perform best because it effectively reduces some outliers obtained with standard MP2. It outperforms the much more costly QCISD method and seems to be a very good compromise between computational effort and accuracy. © 2003 Wiley Periodicals, Inc. J Comput Chem 1: 83–98, 2004 [source]


Local and marginal control charts applied to methicillin resistant Staphylococcus aureus bacteraemia reports in UK acute National Health Service trusts

JOURNAL OF THE ROYAL STATISTICAL SOCIETY: SERIES A (STATISTICS IN SOCIETY), Issue 1 2009
O. A. Grigg
Summary. We consider the general problem of simultaneously monitoring multiple series of counts, applied in this case to methicillin resistant Staphylococcus aureus (MRSA) reports in 173 UK National Health Service acute trusts. Both within-trust changes from baseline ('local monitors') and overall divergence from the bulk of trusts ('relative monitors') are considered. After standardizing for type of trust and overall trend, a transformation to approximate normality is adopted and empirical Bayes shrinkage methods are used for estimating an appropriate baseline for each trust. Shewhart, exponentially weighted moving average and cumulative sum charts are then set up for both local and relative monitors: the current state of each is summarized by a p-value, which is processed by a signalling procedure that controls the false discovery rate. The performance of these methods is illustrated by using 4.5 years of MRSA data, and the appropriate use of such methods in practice is discussed. [source]
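
As a rough illustration of the monitoring pipeline summarized above, the Python sketch below runs a one-sided EWMA chart on one trust's counts after a variance-stabilizing square-root transform, converts the chart statistic to p-values, and applies a Benjamini-Hochberg step across trusts to control the false discovery rate. The transform, the smoothing constant and the known-baseline assumption are simplifications of mine; the paper's standardization, empirical Bayes shrinkage and exact signalling rule are not reproduced here.

```python
import numpy as np
from scipy import stats

def ewma_pvalues(counts, baseline, lam=0.2):
    """One-sided EWMA monitor for a single series of counts. The square-root
    transform gives roughly unit-variance normal deviations around an assumed
    known baseline rate."""
    z = 2.0 * (np.sqrt(np.asarray(counts, float)) - np.sqrt(baseline))
    ewma, pvals = 0.0, []
    for t, zt in enumerate(z, start=1):
        ewma = lam * zt + (1 - lam) * ewma
        # exact EWMA variance after t steps for unit-variance inputs
        var = (lam / (2 - lam)) * (1 - (1 - lam) ** (2 * t))
        pvals.append(1 - stats.norm.cdf(ewma / np.sqrt(var)))
    return np.array(pvals)

def benjamini_hochberg(pvals, q=0.05):
    """Boolean signal per series, controlling the false discovery rate at q."""
    p = np.asarray(pvals)
    order = np.argsort(p)
    passed = p[order] <= q * np.arange(1, len(p) + 1) / len(p)
    k = np.max(np.nonzero(passed)[0]) + 1 if passed.any() else 0
    signal = np.zeros(len(p), bool)
    signal[order[:k]] = True
    return signal
```

In use, the most recent p-value from each of the 173 trusts would be collected every reporting period and passed to benjamini_hochberg to decide which trusts signal.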


Identification and differentiation of Heterotardigrada and Eutardigrada species by riboprinting

JOURNAL OF ZOOLOGICAL SYSTEMATICS AND EVOLUTIONARY RESEARCH, Issue 3 2007
R. O. Schill
Abstract In the last decades, the number of known tardigrade species has considerably increased to more than 960 species with new ones being discovered every year. However, the study of tardigrade species presents a general problem which is frequently encountered during the work with invertebrates: small size and remarkable degrees of phenotypic plasticity may sometimes not permit a definite identification of the species. In this investigation we have used riboprinting, a tool to study rDNA sequence variation, in order to distinguish tardigrade species from each other. The method combines a restriction site variation approach of ribotyping with amplified DNAs. In eight investigated species of heterotardigrades and eutardigrades we have amplified the genes for the small subunit ribosomal RNA (SSU; 18S) and subsequently sequenced the genes. Virtual riboprints were used for identification of restriction sites from ten already published 18S rDNA sequences and seven new 18S rDNA sequences. On the basis of the obtained sequences, diagnostic restriction fragment patterns can be predicted with only 11 restriction enzymes. The virtual digestion confirmed the obtained restriction fragment patterns and restriction sites of all amplified and digested tardigrade DNAs. We show that the variation in positions and number of restriction sites obtained by standard restriction fragment analysis on agarose gels can be used successfully for taxonomic identification at different taxonomic levels. The simple restriction fragment analysis provides a fast and convenient method of molecular barcoding for species identification in tardigrades. Zusammenfassung Im Laufe der letzten Jahrzehnte wurden viele neue Tardigradenarten beschrieben. Zur Zeit sind mehr als 960 Arten bekannt und jedes Jahr kommen neue Arten hinzu. Die Arbeit mit Tardigraden stellt jedoch oftmals ein Problem dar, das generell auch bei anderen Organismen von großer Bedeutung ist: die geringe Größe und die außergewöhnliche phenotypische Plastizität machen in vielen Fällen eine genaue Artidentifikation schwierig. In der vorliegenden Untersuchung verwenden wir das Riboprinting, eine Technik, rDNA Sequenzunterschiede zu erfassen, um damit verschiedene Tardigradenarten voneinander zu differenzieren. Diese Methode vereint den Ansatz der Restriktionsschnittstellenanalyse des Riboprinting mit amplifizierten DNAs. Von acht untersuchten Heterotardigraden und Eutardigraden wurden die Gene für die kleine ribosomale RNA Untereinheit (SSU; 18S) amplifiziert und sequenziert. Virtuelle Riboprints wurden zur Identifikation von zehn bereits publizierten 18S rDNA Sequenzen und sieben neuen 18S DNA Sequenzen erstellt. Auf der Basis der vorliegenden Sequenzen können die diagnostischen Restriktionsfragmentmuster mit insgesamt elf Restriktionsenzyme vorhergesagt werden. Der virtuelle Verdau bestätigt die Restriktionsfragmentmuster und Restriktionsschnittstellen aller amplifizierten und verdauten Tardigraden DNAs. Wir zeigen, dass die unterschiedlichen Variationen in den Positionen und Anzahl der Restriktionsschnittstellen erfolgreich zur taxonomischen Identifikation auf verschiedenen taxonomischen Ebenen verwendet werden können. Die einfache Restriktionsfragmentanalyse stellt eine schnelle und geeignete Methode für das molekulare Barcoding zur Artidentifikation bei Tardigraden dar. [source]
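
The "virtual riboprint" step, predicting a restriction fragment pattern from an amplified 18S rDNA sequence, can be sketched in a few lines of Python. The two enzymes shown are common examples chosen purely for illustration (the abstract does not list the 11 enzymes the authors selected), and the sketch cuts at the start of each recognition site rather than at the enzyme's true cleavage position within it.

```python
ENZYMES = {            # enzyme name -> recognition site (illustrative subset only)
    "EcoRI": "GAATTC",
    "HaeIII": "GGCC",
}

def cut_positions(seq, site):
    """All positions at which the recognition site occurs in the sequence."""
    seq = seq.upper()
    hits, pos = [], seq.find(site)
    while pos != -1:
        hits.append(pos)
        pos = seq.find(site, pos + 1)
    return hits

def virtual_digest(seq, site):
    """Fragment lengths produced by cutting a linear sequence at every site occurrence."""
    cuts = [0] + cut_positions(seq, site) + [len(seq)]
    return [b - a for a, b in zip(cuts, cuts[1:]) if b > a]

def riboprint(seq):
    """Diagnostic pattern: one fragment-length list per enzyme, comparable between species."""
    return {name: virtual_digest(seq, site) for name, site in ENZYMES.items()}
```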


Law, Religion, and "Public Health" in the Republic of Brazil

LAW & SOCIAL INQUIRY, Issue 1 2001
Paul Christopher Johnson
The essay evaluates the general problem that, while most modern republican constitutions follow the U.S. and French models in declaring religious freedom, absolute religious freedom is impossible and undesirable. How are religious freedoms constrained, and how much should they be? The essay evaluates the strategies by which limitations on freedoms of religion are constructed and imposed, especially the powerful isomorphism of law and science described by Boaventura de Sousa Santos. Taking the example of Afro-Brazilian religions in relation to the Brazilian state since 1890, post-emancipation, the essay argues that pseudo-scientific discourses of "public health" constrained the religious practice of former slaves, thus allowing the trompe l'oeil of religious freedom to continue in the new republic, even as freedoms were in fact constrained by the state. [source]


Disruption management in production planning

NAVAL RESEARCH LOGISTICS: AN INTERNATIONAL JOURNAL, Issue 5 2005
Jian Yang
Abstract We study the problem of recovering a production plan after a disruption, where the disruption may be caused by incidents such as power failure, market change, machine breakdown, supply shortage, worker no-show, and others. The new recovery plan we seek must not only suit the changed environment brought about by the disruption, but also be close to the initial plan so as not to cause too much customer dissatisfaction or inconvenience for current-stage and downstream operations. For the general-cost case, we propose a dynamic programming method for the problem. For the convex-cost case, a general problem which involves both cost and demand disruptions can be solved by considering the cost disruption first and then the demand disruption. We find that a pure demand disruption is easy to handle, and for a pure cost disruption we propose a greedy method which is provably efficient. Our computational studies also reveal insights that will be helpful to managing disruptions in production planning. © 2005 Wiley Periodicals, Inc. Naval Research Logistics, 2005. [source]
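
The abstract names a dynamic programming method for the general-cost case without giving its details, so the Python sketch below is only a generic illustration of the underlying idea of replanning while penalizing deviation from the initial plan: one product, integer production quantities, bounded inventory, and a per-unit penalty for departing from the original schedule. All of these modelling choices, and every function and parameter name, are assumptions of mine rather than the authors' formulation.

```python
from functools import lru_cache

def recover_plan(demand, initial_plan, prod_cost, deviation_penalty,
                 hold_cost=1.0, max_prod=20, max_inv=50):
    """Replan production after a disruption.

    demand[t]         : demand still to be met in period t (post-disruption)
    initial_plan[t]   : quantity the pre-disruption plan produced in period t
    prod_cost(t, q)   : cost of producing q units in period t
    deviation_penalty : cost per unit of |q - initial_plan[t]|, keeping the
                        new plan close to the old one
    """
    T = len(demand)

    @lru_cache(maxsize=None)
    def best(t, inv):
        if t == T:
            return 0.0, ()
        best_cost, best_plan = float("inf"), None
        for q in range(max_prod + 1):
            new_inv = inv + q - demand[t]
            if not 0 <= new_inv <= max_inv:      # demand must be met, storage is bounded
                continue
            tail_cost, tail = best(t + 1, new_inv)
            cost = (prod_cost(t, q) + hold_cost * new_inv
                    + deviation_penalty * abs(q - initial_plan[t]) + tail_cost)
            if cost < best_cost:
                best_cost, best_plan = cost, (q,) + tail
        return best_cost, best_plan

    return best(0, 0)
```

For example, recover_plan([3, 4, 2], [3, 3, 3], lambda t, q: 2 * q, deviation_penalty=5) returns the minimum total cost together with the recovered period-by-period quantities.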


Scheduling of depalletizing and truck loading operations in a food distribution system

NAVAL RESEARCH LOGISTICS: AN INTERNATIONAL JOURNAL, Issue 3 2003
Zhi-Long Chen
Abstract This paper studies a scheduling problem arising in a beef distribution system where pallets of various types of beef products in the warehouse are first depalletized and then individual cases are loaded via conveyors to the trucks which deliver beef products to various customers. Given each customer's demand for each type of beef, the problem is to find a depalletizing and truck loading schedule that fills all the demands at a minimum total cost. We first show that the general problem where there are multiple trucks and each truck covers multiple customers is strongly NP-hard. Then we propose polynomial-time algorithms for the case where there are multiple trucks, each covering only one customer, and the case where there is only one truck covering multiple customers. We also develop an optimal dynamic programming algorithm and a heuristic for solving the general problem. By comparison with the optimal solutions generated by the dynamic programming algorithm, the heuristic is shown to be capable of generating near-optimal solutions quickly. © 2003 Wiley Periodicals, Inc. Naval Research Logistics, 2003 [source]


The ring grooming problem

NETWORKS: AN INTERNATIONAL JOURNAL, Issue 3 2004
Timothy Y. Chow
Abstract The problem of minimizing the number of bidirectional SONET rings required to support a given traffic demand has been studied by several researchers. Here we study the related ring-grooming problem of minimizing the number of add/drop locations instead of the number of rings; in a number of situations this is a better approximation to the true equipment cost. Our main result is a new lower bound for the case of uniform traffic. This allows us to prove that a certain simple algorithm for uniform traffic is, in fact, a constant-factor approximation algorithm, and it also demonstrates that known lower bounds for the general problem (in particular, the linear programming relaxation) are not within a constant factor of the optimum. We also show that our results for uniform traffic extend readily to the more practically important case of quasi-uniform traffic. Finally, we show that if the number of nodes on the ring is fixed, then ring grooming is solvable in polynomial time; however, whether ring grooming is fixed-parameter tractable is still an open question. © 2004 Wiley Periodicals, Inc. NETWORKS, Vol. 44(3), 194–202, 2004 [source]


Art and Negative Affect

PHILOSOPHY COMPASS (ELECTRONIC), Issue 1 2009
Aaron Smuts
Why do people seemingly want to be scared by movies and feel pity for fictional characters when they avoid situations in real life that arouse these same negative emotions? Although the domain of relevant artworks encompasses far more than just tragedy, the general problem is typically called the paradox of tragedy. The paradox boils down to a simple question: If people avoid pain then why do people want to experience art that is painful? I discuss six popular solutions to the paradox: conversion, control, compensatory, meta-response, catharsis, and rich experience theories. [source]


Sacrifice and the problem of beginning: meditations from Sakalava mythopraxis

THE JOURNAL OF THE ROYAL ANTHROPOLOGICAL INSTITUTE, Issue 1 2007
Michael Lambek
This article addresses the general problem of beginning in human thought and action. It argues for complementing the emphasis on transition in the analysis of ritual with attention to beginning and for supplementing the relative passivity of liminality with the resoluteness of initiating action, while also attending to both the transitive and intransitive aspects of beginning itself. Drawing from representations of the foundation of a Sakalava monarchy in Madagascar, the article presents sacrifice as an exemplary form of beginning. Describing sacrifice in this manner obviates the need for any theory of sacrifice while offering new insights on the gift, ethical personhood, and the temporality of tradition. [source]


IN DEFENCE OF MAGICAL ERSATZISM

THE PHILOSOPHICAL QUARTERLY, Issue 223 2006
David A. Denby
David Lewis' objection to a generic theory of modality which he calls 'magical ersatzism' is that its linchpin, a relation he calls 'selection', must be either an internal or an external relation, and that this is unintelligible either way. But the problem he points out with classifying selection as internal is really just an instance of the general problem of how we manage to grasp underdetermined predicates, is not peculiar to magical ersatzism, and is amenable to some familiar solutions. He provides no compelling grounds for thinking that classifying selection as external is unintelligible, and his argument has a false presupposition. I conclude that magical ersatzism is still a viable option in the metaphysics of modality. [source]


Estimating the risk of rare complications: is the 'rule of three' good enough?

ANZ JOURNAL OF SURGERY, Issue 7-8 2009
John Ludbrook
Abstract The clinical problem: If a surgeon has performed a particular operation on n consecutive patients without major complications, what is the long-term risk of major complications after performing many more such operations? Examples of such operations are endoscopic cholecystectomy, nephrectomy and sympathectomy. The statistical problem and solutions: This general problem has exercised the minds of theoretical statisticians for more than 80 years. They agree only that the long-term risk is best expressed as the upper bound of a 95% confidence interval. We consider many proposed solutions, from those that involve complex statistical theory to the empirical 'rule of three', popular among clinicians, in which the percentage risk is given by the formula 100 × (3/n). Our conclusions: The 'rule of three' grossly underestimates the future risks and can be applied only when the initial complication rate is zero (that is, 0/n). If the initial complication rate is greater than zero, then no simple 'rule' suffices. We give the results of applying the more popular statistical models, including their coverage. The 'exact' Clopper–Pearson interval has wider coverage across all proportions than its nominal 95%, and is thus too conservative. The Wilson score confidence interval gives about 95% coverage on average over all population proportions, except very small ones, so we prefer it to the Clopper–Pearson method. Unlike all the other intervals, Bayesian intervals with uniform priors yield exactly 95% coverage at any observed proportion. Thus, we strongly recommend Bayesian intervals and provide free software for executing them. [source]
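
The competing upper bounds discussed above are easy to compute side by side. The Python sketch below (using scipy) returns the upper limits of two-sided 95% intervals for the Wilson score, 'exact' Clopper-Pearson and uniform-prior Bayesian methods alongside the clinicians' rule of three. The formulas are the standard textbook ones; the abstract does not say whether the paper reports one-sided limits or the upper end of two-sided intervals, so the two-sided convention here is an assumption.

```python
import numpy as np
from scipy import stats

def upper_bounds(x, n, conf=0.95):
    """Upper limits of two-sided 95% intervals for the complication risk after
    x complications in n operations, plus the rule of three."""
    alpha = 1 - conf
    z = stats.norm.ppf(1 - alpha / 2)                         # ~1.96 for a 95% interval
    p = x / n
    out = {"rule of three": 3.0 / n if x == 0 else np.nan}    # only defined for 0/n
    # Wilson score upper limit
    centre = p + z ** 2 / (2 * n)
    half = z * np.sqrt((p * (1 - p) + z ** 2 / (4 * n)) / n)
    out["Wilson"] = (centre + half) / (1 + z ** 2 / n)
    # 'Exact' Clopper-Pearson upper limit
    out["Clopper-Pearson"] = 1.0 if x == n else stats.beta.ppf(1 - alpha / 2, x + 1, n - x)
    # Bayesian upper limit under a uniform Beta(1, 1) prior
    out["Bayes, uniform prior"] = stats.beta.ppf(1 - alpha / 2, x + 1, n - x + 1)
    return out

print(upper_bounds(0, 60))    # e.g. no complications in 60 consecutive operations
```

For 0 complications in 60 operations this gives about 0.050 (rule of three) against roughly 0.060 (Wilson), 0.060 (Clopper-Pearson) and 0.059 (Bayes), illustrating the abstract's point that the rule of three understates the risk even in the simplest case.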


Analysis of call centre arrival data using singular value decomposition

APPLIED STOCHASTIC MODELS IN BUSINESS AND INDUSTRY, Issue 3 2005
Haipeng Shen
Abstract We consider the general problem of analysing and modelling call centre arrival data. A method is described for analysing such data using singular value decomposition (SVD). We illustrate that the outcome from the SVD can be used for data visualization, detection of anomalies (outliers), and extraction of significant features from noisy data. The SVD can also be employed as a data reduction tool. Its application usually results in a parsimonious representation of the original data without losing much information. We describe how one can use the reduced data for some further, more formal statistical analysis. For example, a short-term forecasting model for call volumes is developed, which is multiplicative with a time series component that depends on day of the week. We report empirical results from applying the proposed method to some real data collected at a call centre of a large-scale U.S. financial organization. Some issues about forecasting call volumes are also discussed. Copyright © 2005 John Wiley & Sons, Ltd. [source]
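
A minimal version of the SVD step is sketched below in Python. Arrival counts are arranged as a days-by-periods matrix; the leading singular components give a compact summary of each day and of the dominant intra-day arrival profiles, and the rank-k reconstruction error can flag anomalous days. The matrix layout, the choice of k and the use of reconstruction error as an outlier score are illustrative assumptions, and the forecasting model the paper builds on the reduced data is not reproduced.

```python
import numpy as np

def svd_features(arrivals, k=2):
    """arrivals: (n_days, n_periods) matrix of call counts, e.g. one row per day
    and one column per 15-minute interval. Returns per-day scores, the dominant
    intra-day profiles, and a per-day reconstruction error."""
    X = np.asarray(arrivals, float)
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    day_scores = U[:, :k] * s[:k]                  # low-dimensional summary of each day
    period_profiles = Vt[:k]                       # dominant intra-day arrival shapes
    X_hat = day_scores @ period_profiles           # rank-k reconstruction of the data
    residual = np.linalg.norm(X - X_hat, axis=1)   # large values suggest unusual days
    return day_scores, period_profiles, residual
```

The day scores can then feed a simpler downstream model, for instance one with a multiplicative day-of-week effect, as the abstract describes for short-term forecasting of call volumes.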


Transcriptional and translational control of C/EBPs: The case for "deep" genetics to understand physiological function

BIOESSAYS, Issue 8 2010
Claus Nerlov
Abstract The complexity of organisms is not simply determined by the number of their genes, but to a large extent by how gene expression is controlled. In addition to transcriptional regulation, this involves several layers of post-transcriptional control, such as translational repression, microRNA-mediated mRNA degradation and translational inhibition, alternative splicing, and the regulated generation of functionally distinct gene products from a single mRNA through alternative use of translation initiation sites. Much progress has been made in describing the molecular basis for these gene regulatory mechanisms. However, it is now a major challenge to translate this knowledge into deeper understanding of the physiological processes, both normal and pathological, that they govern. Using the C/EBP family of transcription factors as an example, the present review describes recent genetic experiments addressing this general problem and discusses how the physiological importance of newly discovered regulatory mechanisms might be determined. [source]


Genome Scanning Tests for Comparing Amino Acid Sequences Between Groups

BIOMETRICS, Issue 1 2008
Peter B. Gilbert
Summary Consider a placebo-controlled preventive HIV vaccine efficacy trial. An HIV amino acid sequence is measured from each volunteer who acquires HIV, and these sequences are aligned together with the reference HIV sequence represented in the vaccine. We develop genome scanning methods to identify positions at which the amino acids in infected vaccine recipient sequences either (A) are more divergent from the reference amino acid than the amino acids in infected placebo recipient sequences or (B) have a different frequency distribution than the placebo sequences, irrespective of a reference amino acid. We consider t-test-type statistics for problem A and Euclidean, Mahalanobis, and Kullback–Leibler-type statistics for problem B. The test statistics incorporate weights to reflect biological information contained in different amino acid positions and mismatches. Position-specific p-values are obtained by approximating the null distribution of the statistics either by a permutation procedure or by a nonparametric estimation. A permutation method is used to estimate a cut-off p-value to control the per-comparison error rate at a prespecified level. The methods are examined in simulations and are applied to two HIV examples. The methods for problem B address the general problem of comparing discrete frequency distributions between groups in a high-dimensional data setting. [source]
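
A stripped-down version of a position-specific test of type (A) is sketched below in Python: each infected participant's sequence is scored for divergence from the vaccine reference at one aligned position, the two groups are compared by a difference in means, and a p-value is obtained by permuting group labels. The 0/1 mismatch score, the optional position weight and the mean-difference statistic are simplified stand-ins for the weighted t-test-type statistics the abstract describes, and the multiplicity step (estimating a cut-off p-value) is not shown.

```python
import numpy as np

def divergence(seqs, reference, pos, weight=1.0):
    """0/1 mismatch (optionally weighted) between each aligned sequence and the
    vaccine-reference amino acid at one position."""
    return weight * np.array([s[pos] != reference[pos] for s in seqs], float)

def permutation_pvalue(vaccine_seqs, placebo_seqs, reference, pos, n_perm=10_000, seed=0):
    """One-sided test: are vaccine-recipient sequences more divergent from the
    reference at this position than placebo-recipient sequences?"""
    rng = np.random.default_rng(seed)
    x = divergence(vaccine_seqs, reference, pos)
    y = divergence(placebo_seqs, reference, pos)
    observed = x.mean() - y.mean()
    pooled = np.concatenate([x, y])
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)                     # relabel participants at random
        hits += pooled[:len(x)].mean() - pooled[len(x):].mean() >= observed
    return (hits + 1) / (n_perm + 1)
```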


A Semantic View of Ecological Theories

DIALECTICA, Issue 1 2001
David G.A. Castle
Philosophical analysis of ecological theories has lagged behind the study of evolutionary theory. The semantic conception of scientific theories, which has been employed successfully in the analysis of evolutionary theory, is adopted here to analyse ecological theory. Two general problems in ecology are discussed. One arises from the continued use of covering law models in ecology, and the other concerns the applicability of ecological theory in conservation biology. The semantic conception of ecological theories is used to resolve these problems. [source]


Voluntary environmental policy instruments: two Irish success stories?

ENVIRONMENTAL POLICY AND GOVERNANCE, Issue 1 2002
Brendan Flynn
Voluntary environmental policy instruments have attracted much interest in recent years, yet they remain firmly established only in the industrial setting of the typical environmental policy 'leader' states such as the Netherlands or Germany. This paper examines two Irish examples of innovative voluntary agreements, a farm plastic recycling scheme and a bird conservation project. These both suggest that the voluntary approach can be applied successfully outside the typical industrial sector. Noteworthy in explaining the emergence of the two case studies here was a policy transfer effect from the UK, involving organized Irish farming interests and Irish bird conservationists. A fear of impending state legislation, which is often cited as a vital precondition for successful voluntary approaches, was actually less important than the autonomous initiative of the interest groups themselves. It is suggested that a more important role for the state lies in ensuring good policy co-ordination. A concluding discussion examines the general problems and potential of relying on interest groups to transfer and implement innovative voluntary environmental policy instruments. Copyright © 2002 John Wiley & Sons, Ltd. and ERP Environment [source]


Trends in the literature on eating disorders over 36 years (1965-2000): terminology, interpretation and treatment

EUROPEAN EATING DISORDERS REVIEW, Issue 1 2004
Sten S. Theander
Abstract A database with more than 14,700 references to articles on ED during 1965–2000 was analysed with the aim of identifying long-term and short-term trends in the literature. The main data of the analysis are the percentages of references per 6-year period (or per year for the most recent 6 years) that contain certain words or phrases. The use of words in references should reflect changes in the content of the scientific papers. Trends over time have been noted in the use of terms on diagnostics, personality, comorbidity, psychotherapy, epidemiology, biology and course of illness. The use of words on psychoanalysis has diminished, while the use of biological words has increased. The discussion concerns the use of bibliographic methods and especially the general problems of using data from digital databases as a research tool in the search for trends in the development of a medical field. Copyright © 2004 John Wiley & Sons, Ltd and Eating Disorders Association. [source]
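
The core measurement described, the percentage of references per period that contain a given word or phrase, is straightforward to sketch in Python. The dictionary layout of a reference record and the field names below are assumptions about how such a database might be stored, not the structure of the authors' database.

```python
from collections import defaultdict

def term_trend(references, term, period_years=6, start=1965, end=2000):
    """Percentage of references per period whose text mentions `term`.
    Each reference is assumed to look like {"year": 1987, "text": "..."}."""
    totals, hits = defaultdict(int), defaultdict(int)
    for ref in references:
        if not start <= ref["year"] <= end:
            continue
        period = start + period_years * ((ref["year"] - start) // period_years)
        totals[period] += 1
        hits[period] += term.lower() in ref["text"].lower()
    return {p: 100.0 * hits[p] / totals[p] for p in sorted(totals)}
```

Comparing, say, term_trend(refs, "psychoanalysis") with term_trend(refs, "serotonin") would surface the kind of shift from psychoanalytic to biological vocabulary the abstract reports.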


Passion and Reason: Aristotelian Strategies in Kierkegaard's Ethics

JOURNAL OF RELIGIOUS ETHICS, Issue 2 2002
Norman Lillegard
Both Aristotle and Kierkegaard show that virtues result, in part, from training which produces distinctive patterns of salience. The "frame problem" in AI shows that rationality requires salience. Salience is a function of cares and desires (passions) and thus governs choice in much the way Aristotle supposes when he describes choice as deliberative desire. Since rationality requires salience it follows that rationality requires passion. Thus Kierkegaard is no more an irrationalist in ethics than is Aristotle, though he continues to be charged with irrationalism. The compatibility of an Aristotelian reading of Kierkegaard with the "suspension of the ethical" and general problems with aretaic ethical theories are treated briefly. The author argues that it is possible to preserve a realist ethics in the face of the "tradition relativism" which threatens the version of virtue ethics here attributed to Kierkegaard. [source]


Inference of object-oriented design patterns

JOURNAL OF SOFTWARE MAINTENANCE AND EVOLUTION: RESEARCH AND PRACTICE, Issue 5 2001
Paolo Tonella
Abstract When designing a new application, experienced software engineers usually adopt solutions that have proven successful in previous projects. Such reuse of code organizations is seldom made explicit. Nevertheless, it represents important information, which can be extremely valuable in the maintenance phase by documenting the design choices underlying the implementation. In addition it can be reused whenever a similar problem is encountered. In this paper an approach for the inference of recurrent design patterns directly from the code is proposed. No assumption is made on the availability of any pattern library, and the concept analysis algorithm, adapted for this purpose, is able to infer the presence of class groups which instantiate a common, repeated pattern. In fact, concept analysis provides sets of objects sharing attributes, which, in the case of object-oriented design patterns, become class members or inter-class relations. The approach was applied to three C++ applications for which the structural relations among classes led to the extraction of a set of design patterns, which could be enriched with non-structural information about class members and method invocations. The resulting patterns could be interpreted as meaningful organizations aimed at solving general problems which have several instances in the applications analyzed. Copyright © 2001 John Wiley & Sons, Ltd. [source]
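
The underlying concept analysis step can be illustrated with a brute-force Python sketch: classes are treated as objects and their structural relations as attributes, and each formal concept (a maximal group of classes sharing a maximal set of attributes) is a candidate repeated structure. The attribute encoding and the exhaustive enumeration, which is only workable for small inputs, are choices of mine; the adaptation the authors made to the concept analysis algorithm is not reproduced here.

```python
from itertools import combinations

def concepts(table):
    """Formal concept analysis over a binary class/attribute relation.

    table: dict mapping a class name to the set of attributes it has, e.g.
           {"Composite": {"inherits:Component", "aggregates:Component"},
            "Decorator": {"inherits:Component", "aggregates:Component"}}
    Returns (extent, intent) pairs; extents that recur across applications
    are candidates for repeated design patterns."""
    objects = list(table)
    found = set()
    for r in range(1, len(objects) + 1):
        for group in combinations(objects, r):       # brute force: small inputs only
            intent = set.intersection(*(table[o] for o in group))
            if not intent:
                continue
            # close the extent: every class carrying the whole intent belongs to it
            extent = frozenset(o for o in objects if intent <= table[o])
            closed_intent = frozenset(set.intersection(*(table[o] for o in extent)))
            found.add((extent, closed_intent))
    return found
```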


Non-stoichiometry and non-homogeneity in InN

PHYSICA STATUS SOLIDI (C) - CURRENT TOPICS IN SOLID STATE PHYSICS, Issue 7 2005
K. Scott
Abstract It is shown that the wide variation of apparent band-gap observed for thin films nominally referred to as InN is strongly influenced by variations in the nitrogen:indium stoichiometry. InN samples grown by remote plasma enhanced chemical vapour deposition show a change in band-gap between 1.8 and 1.0 eV that is not due to the Moss-Burstein effect, oxygen inclusion or quantum size effects, but for which changes in the growth temperature result in a strong change in stoichiometry. Material non-homogeneity and non-stoichiometry appear to be general problems for InN growth. Excess nitrogen can be present at very high levels and indium-rich material is also found. This work shows that the extent of the Moss-Burstein effect will have to be reassessed for InN. (© 2005 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source]


Modeling multiple-response categorical data from complex surveys

THE CANADIAN JOURNAL OF STATISTICS, Issue 4 2009
Christopher R. Bilder
Abstract Although "choose all that apply" questions are common in modern surveys, methods for analyzing associations among responses to such questions have only recently been developed. These methods are generally valid only for simple random sampling, but these types of questions often appear in surveys conducted under more complex sampling plans. The purpose of this article is to provide statistical analysis methods that can be applied to "choose all that apply" questions in complex survey sampling situations. Loglinear models are developed to incorporate the multiple responses inherent in these types of questions. Statistics to compare models and to measure association are proposed and their asymptotic distributions are derived. Monte Carlo simulations show that tests based on adjusted Pearson statistics generally hold their correct size when comparing models. These simulations also show that confidence intervals for odds ratios estimated from loglinear models have good coverage properties, while being shorter than those constructed using empirical estimates. Furthermore, the methods are shown to be applicable to more general problems of modeling associations between elements of two or more binary vectors. The proposed analysis methods are applied to data from the National Health and Nutrition Examination Survey. The Canadian Journal of Statistics © 2009 Statistical Society of Canada Quoique les questions du type « Sélectionner une ou plusieurs réponses » sont courantes dans les enquêtes modernes, les méthodes pour analyser les associations entre les réponses à de telles questions viennent seulement d'être développées. Ces méthodes sont habituellement valides uniquement pour des échantillons aléatoires simples, mais ce genre de questions apparaissent souvent dans les enquêtes conduites sous des plans de sondage beaucoup plus complexes. Le but de cet article est de donner des méthodes d'analyse statistique pouvant être appliquées aux questions de type « Sélectionner une ou plusieurs réponses » dans des enquêtes utilisant des plans de sondage complexes. Des modèles loglinéaires sont développés permettant d'incorporer les réponses multiples inhérentes à ce type de questions. Des statistiques permettant de comparer les modèles et de mesurer l'association sont proposées et leurs distributions asymptotiques sont obtenues. Des simulations de Monte-Carlo montrent que les tests basés sur les statistiques de Pearson ajustées maintiennent généralement leur niveau lorsqu'ils sont utilisés pour comparer des modèles. Ces études montrent également que les niveaux des intervalles de confiance pour les rapports de cotes estimés à partir des modèles loglinéaires ont de bonnes propriétés de couverture tout en étant plus courts que ceux utilisant les estimations empiriques. De plus, il est montré que ces méthodes peuvent aussi être utilisées dans un contexte plus général de modélisation de l'association entre les éléments de deux ou plusieurs vecteurs binaires. Les méthodes d'analyse proposées sont appliquées à des données provenant de l'étude américaine « National Health and Nutrition Examination Survey » (NHANES). La revue canadienne de statistique © 2009 Société statistique du Canada [source]


Exploring the potential benefits of Asian participation in the Extractive Industries Transparency Initiative: The case of China

BUSINESS STRATEGY AND THE ENVIRONMENT, Issue 6 2010
Liliane C. Mouan
Abstract This paper is not intended as an empirical assessment of the benefits of the Extractive Industries Transparency Initiative (EITI). It is, rather, an evaluation of the theoretical presumptions that underpin the discussion about its benefits for Asian economies in general, and China in particular. The paper finds that, despite its well-meaning objectives, the EITI might be of limited value for China and its Asian peers, not only because it faces general problems of legitimacy in non-Western circles, as do most Western-led multi-stakeholder partnerships, but also because the principles or values that it promotes are not aligned with China's culture, philosophy and business interests. The paper concludes with suggestions on how a stronger 'business case' for China's participation can be made. Copyright © 2010 John Wiley & Sons, Ltd and ERP Environment. [source]