Other Fields

Selected Abstracts


The weed community affects yield and quality of soybean (Glycine max (L.) Merr.)

JOURNAL OF THE SCIENCE OF FOOD AND AGRICULTURE, Issue 3 2008
David J Gibson
Abstract BACKGROUND: The relationship between the weed community and soybean (Glycine max (L.) Merr.) seed yield and quality was assessed in two experiments in Illinois, USA. In one field, different proportions of target weeds (Ambrosia trifida L., Amaranthus rudis J. Sauer, and Setaria faberi F. Herrm) were sown into experimental plots; the other field was naturally infested with these and other weeds. The composition of the weed communities in both fields was compared to soybean yield, biomass, canopy cover and quality (% protein, oil, relative water content, and seed weight) using non-metric multidimensional scaling ordination. RESULTS: In the experimentally sown plots, low-yield and low-quality soybeans were harvested from plots dominated by the target weeds, particularly A. trifida, and a suite of subordinate volunteers. In the naturally infested field, the highest soybean protein was associated with S. faberi early in the season, Ambrosia artemisiifolia and Ipomoea hederacea later in the season, and low amounts of A. rudis throughout the growing season. CONCLUSION: Similar results from the two experiments indicate that soybean seed yield and quality are affected by the composition of the weed community. Producers need to manage the weed community to optimize seed quality. Copyright © 2007 Society of Chemical Industry [source]


Summer predation rates on ungulate prey by a large keystone predator: how many ungulates does a large predator kill?

JOURNAL OF ZOOLOGY, Issue 4 2008
J. W. Laundré
Abstract Estimates of predation rates by large predators can provide valuable information on their potential impact on ungulate prey populations. This is especially the case for pumas Puma concolor and their main prey, mule deer Odocoileus hemionus. However, only limited information exists on predation rates of pumas where mule deer are the only ungulate prey available. I used VHF telemetry data collected over 24-h monitoring sessions and once daily over consecutive days to derive two independent estimates of puma predation rates on mule deer where they were the only large prey available. For the 24-h data, I had 48 time blocks on female pumas with kittens, 43 blocks on females without kittens and 30 blocks on males. For the daily consecutive data, the average number of consecutive days followed was 51.5±4.2 days. There were data on five female pumas with kittens, five pregnant females and nine females without kittens. Predation rates over an average month of 30 days from the 24-h monitoring sessions were 2.0 mule deer per puma month for males (15.1 days per kill), 2.1 mule deer per puma month (14.3 days per kill) for females without kittens and 2.5 mule deer per puma month (12.0 days per kill) for pregnant females and females with kittens. For the consecutive daily data, females without kittens had an estimated predation rate of 2.1±0.14 mule deer per puma month (14.9±0.90 days per kill). Pregnant females and females with kittens had predation rates of 2.7±0.18 and 2.6±0.21 mule deer per puma month, respectively (11.4±0.72 and 12.0±1.1 days per kill, respectively). Predation rates estimated in this study were comparable to those estimated from energetic demand for pumas in the study area but were lower than other field-derived estimates. These data help increase our understanding of the predation impacts of large predators on their prey. [source]
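The reported per-kill intervals follow directly from the monthly rates under the 30-day month the abstract assumes; small mismatches (e.g., a reported 15.1 or 14.9 days per kill) presumably reflect rounding of the published rates:

```latex
\text{days per kill} = \frac{30\ \text{days}}{\text{kills per month}}, \qquad
\frac{30}{2.1} \approx 14.3, \qquad \frac{30}{2.5} = 12.0
```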


Watching the birdie watching you: eyewitness memory for actions using CCTV recordings of actual crimes

APPLIED COGNITIVE PSYCHOLOGY, Issue 4 2001
Penny S. Woolnough
In this paper we describe a method of assessing eyewitness performance for actual crimes that could prove a valuable addition to more traditional experimental and field-based approaches to the study of eyewitnessing. We present the findings of the first reported attempt to assess the accuracy of information contained in police statements given by eyewitnesses to actual criminal episodes, using CCTV as a means of verification. Considering only those items that could be verified against CCTV recordings (largely action details), both victims and bystanders from eight incidents of assault were found to be highly accurate in their accounts (96% accurate). These results are discussed in terms of what they might indicate about the relationship between arousal and eyewitness performance and how they compare with laboratory and other field-based approaches to the study of eyewitness memory. In addition, we consider some of the methodological, technological and practical constraints associated with this novel approach and its possible future applications to the study of everyday memory as well as memory for unusual events. Copyright © 2001 John Wiley & Sons, Ltd. [source]


Clustering revealed in high-resolution simulations and visualization of multi-resolution features in fluid-particle models

CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 2 2003
Krzysztof Boryczko
Abstract Simulating natural phenomena at greater accuracy results in an explosive growth of data. Large-scale simulations with particles currently involve ensembles consisting of between 10⁶ and 10⁹ particles, covering 10⁵–10⁶ time steps. Thus, the data files produced in a single run can reach from tens of gigabytes to hundreds of terabytes. This data bank allows one to reconstruct the spatio-temporal evolution of both the particle system as a whole and each particle separately. Realistically, looking at a large data set at full resolution at all times is neither possible nor, in fact, necessary. We have developed an agglomerative clustering technique based on the concept of a mutual nearest neighbor (MNN). This procedure can be easily adapted for efficient visualization of extremely large data sets from simulations with particles at various resolution levels. We present the parallel algorithm for MNN clustering and its timings on the IBM SP and SGI/Origin 3800 multiprocessor systems for up to 16 million fluid particles. The high efficiency obtained is mainly due to the similarity in the algorithmic structure of MNN clustering and particle methods. We show various examples drawn from MNN applications in visualization and analysis of the order of a few hundred gigabytes of data from discrete particle simulations, using dissipative particle dynamics and fluid particle models. Because data clustering is the first step in this concept-extraction procedure, we may employ this clustering procedure in many other fields such as data mining, earthquake events and stellar populations in nebula clusters. Copyright © 2003 John Wiley & Sons, Ltd. [source]
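To make the clustering concept concrete, here is a minimal serial sketch of mutual-nearest-neighbour clustering: two points are linked when each is among the other's k nearest neighbours, and connected components of that graph become clusters. The choice of k, the union-find merging, and the toy data are illustrative assumptions, not the authors' parallel implementation:

```python
import numpy as np

def mnn_pairs(points, k=3):
    """Index pairs (i, j) that are mutual k-nearest neighbours."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)                  # a point is not its own neighbour
    knn = np.argsort(d, axis=1)[:, :k]           # k nearest of each point
    return [(i, j) for i in range(len(points)) for j in knn[i]
            if i < j and i in knn[j]]            # keep only mutual relations

def mnn_cluster(points, k=3):
    """Label connected components of the mutual-k-NN graph via union-find."""
    parent = list(range(len(points)))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]        # path halving
            x = parent[x]
        return x
    for i, j in mnn_pairs(points, k):
        parent[find(i)] = find(j)
    return np.array([find(i) for i in range(len(points))])

# toy usage: two well-separated blobs should yield very few clusters
rng = np.random.default_rng(0)
pts = np.vstack([rng.normal(0, 0.1, (50, 2)), rng.normal(3, 0.1, (50, 2))])
print(len(set(mnn_cluster(pts, k=3))))
```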


Staging anorexia nervosa: conceptualizing illness severity

EARLY INTERVENTION IN PSYCHIATRY, Issue 1 2008
Sarah Maguire
Abstract In recent years, there has been increasing attention to the conceptualization of anorexia nervosa (AN) and its diagnostic criteria. While varying levels of severity within the illness category of AN have long been appreciated, neither a precise definition of severity nor an empirical examination of severity in AN has been undertaken. The aim of this article is to review the current state of knowledge on illness severity and to propose a theoretical model for the definition and conceptualization of severity in AN. AN is associated with significant medical morbidity which is related to the 'severity' of presentation on such markers as body mass index and eating and purging behaviours. The development of a functional staging system, based on symptom severity, is indicated for reasons similar to those cited by the cancer lobby. Improving case management and making appropriate treatment recommendations have been the primary purpose of staging in other fields, and might also apply to AN. Such a standardized staging system could potentially ease communication between treatment settings, and increase the specificity and comparability of research findings in the field of AN. [source]


Industrial epidemics, public health advocacy and the alcohol industry: lessons from other fields

ADDICTION, Issue 9 2007
RENÉ I. JAHIEL
First page of article [source]


BROADENING THE APPLICATION OF EVOLUTIONARILY BASED GENETIC PEST MANAGEMENT

EVOLUTION, Issue 2 2008
Fred Gould
Insect- and tick-vectored diseases such as malaria, dengue fever, and Lyme disease cause human suffering, and current approaches for prevention are not adequate. Invasive plants and animals such as Scotch broom, zebra mussels, and gypsy moths continue to cause environmental damage and economic losses in agriculture and forestry. Rodents transmit diseases and cause major pre- and postharvest losses, especially in less affluent countries. Each of these problems might benefit from the developing field of Genetic Pest Management that is conceptually based on principles of evolutionary biology. This article briefly describes the history of this field, new molecular tools in this field, and potential applications of those tools. There will be a need for evolutionary biologists to interact with researchers and practitioners in a variety of other fields to determine the most appropriate targets for genetic pest management, the most appropriate methods for specific targets, and the potential of natural selection to diminish the effectiveness of genetic pest management. In addition to producing environmentally sustainable pest management solutions, research efforts in this area could lead to new insights about the evolution of selfish genetic elements in natural systems and will provide students with the opportunity to develop a more sophisticated understanding of the role of evolutionary biology in solving societal problems. [source]


ON THE ORIGIN OF MODULAR VARIATION

EVOLUTION, Issue 8 2002
Hod Lipson
Abstract We study the dynamics of modularization in a minimal substrate. A module is a functional unit relatively separable from its surrounding structure. Although it is known that modularity is useful both for robustness and for evolvability (Wagner 1996), there is no quantitative model describing how such modularity might originally emerge. Here we suggest, using simple computer simulations, that modularity arises spontaneously in evolutionary systems in response to variation, and that the amount of modular separation is logarithmically proportional to the rate of variation. Consequently, we predict that modular architectures would appear in correlation with high environmental change rates. Because this quantitative model does not require any special substrate to occur, it may also shed light on the origin of modular variation in nature. This observed relationship also indicates that modular design is a generic phenomenon that might be applicable to other fields, such as engineering: Engineering design methods based on evolutionary simulation would benefit from evolving to variable, rather than stationary, fitness criteria, as a weak and problem-independent method for inducing modularity. [source]
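The abstract's central quantitative claim can be written as a simple relation, where M is the degree of modular separation, r the rate of environmental variation, and a, b fitted constants (notation mine, not the authors'):

```latex
M(r) \approx a + b \ln r, \qquad b > 0
```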


Capacitors with an Equivalent Oxide Thickness of <0.5 nm for Nanoscale Electronic Semiconductor Memory

ADVANCED FUNCTIONAL MATERIALS, Issue 18 2010
Seong Keun Kim
Abstract The recent progress in metal-insulator-metal (MIM) capacitor technology is reviewed in terms of the materials and processes, mostly for dynamic random access memory (DRAM) applications. As TiN/ZrO2-Al2O3-ZrO2/TiN (ZAZ) type DRAM capacitors approach their technical limits, there has been renewed interest in the perovskite SrTiO3, which has a dielectric constant of >100 even at a thickness of ~10 nm. However, there are many technical challenges to overcome before this type of MIM capacitor can be used in mass-production compatible processes, despite the large advancements in atomic layer deposition (ALD) technology over the past decade. In the meantime, rutile-structure TiO2 and Al-doped TiO2 films might fill the gap between ZAZ and SrTiO3 MIM capacitors due to their exceptionally high dielectric constant among binary oxides. Achieving a uniform and dense rutile structure is the key technology for TiO2-based dielectrics, which depends on having a dense, uniform and smooth RuO2 layer as the bottom electrode. Although the Ru (and RuO2) layers grown by ALD using metal-organic precursors are promising, recent technological breakthroughs using the RuO4 precursor have produced thin, uniform, and denser Ru and RuO2 layers on a TiN electrode. A minimum equivalent oxide thickness as small as 0.45 nm with a sufficiently low leakage current was confirmed, even in laboratory-scale experiments. The bulk dielectric constant of ALD SrTiO3 films grown at 370 °C was ~150, even at thicknesses of ~15 nm. The recent development of novel group II precursors has made it possible to greatly increase the growth rate while leaving the electrical properties of the ALD SrTiO3 film intact. This is an important advancement toward the commercial application of these MIM capacitors to DRAM as well as to other fields where an extremely high capacitance density and three-dimensional structures are necessary. [source]
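The equivalent-oxide-thickness figures quoted above follow from the standard EOT conversion, which rescales a high-k film to the SiO2 thickness (k ≈ 3.9) giving the same capacitance; plugging in the SrTiO3 numbers from the abstract gives a value consistent with the sub-0.5 nm claim:

```latex
\mathrm{EOT} = t_{\mathrm{film}} \cdot \frac{\kappa_{\mathrm{SiO_2}}}{\kappa_{\mathrm{film}}},
\qquad
\mathrm{EOT} \approx 15\,\mathrm{nm} \times \frac{3.9}{150} \approx 0.39\,\mathrm{nm}
```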


Observations on the Nature and Culture of Environmental History

HISTORY AND THEORY, Issue 4 2003
J. R. McNeill
This article aims to consider the robust field of environmental history as a whole, as it stands and as it has developed over the past twenty-five years around the world. It necessarily adopts a selective approach but still offers more breadth than depth. It treats the links between environmental history and other fields within history, and with other related disciplines such as geography. It considers the precursors of environmental history, its emergence since the 1970s, its condition in several settings and historiographies. Finally it touches on environmental history's relationship to social theory and to the natural sciences as they have evolved in recent decades. It concludes that while there remains plenty of interesting work yet to do, environmental history has successfully established itself as a legitimate field within the historical profession, and has a bright future, if perhaps for discouraging reasons. [source]


A new fast hybrid adaptive grid generation technique for arbitrary two-dimensional domains

INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN ENGINEERING, Issue 3 2010
Mohamed S. Ebeida
Abstract This paper describes a new fast hybrid adaptive grid generation technique for arbitrary two-dimensional domains. This technique is based on a Cartesian background grid with square elements and quadtree decomposition. A new algorithm is introduced for the distribution of boundary points based on the curvature of the domain boundaries. The quadtree decomposition is governed either by the distribution of the boundary points or by a size function when a solution-based adaptive grid is desired. The resulting grid is quad-dominant and ready for the application of finite element, multi-grid, or line-relaxation methods. All the internal angles in the final grid have a lower bound of 45° and an upper bound of 135°. Although our main interest is in grid generation for unsteady flow simulations, the technique presented in this paper can be employed in many other fields. Several application examples are provided to illustrate the main features of this new approach. Copyright © 2010 John Wiley & Sons, Ltd. [source]
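To make the quadtree idea concrete, here is a minimal sketch of size-function-driven refinement over a unit-square background grid. The size function, threshold, and data structure are illustrative assumptions; it omits the paper's boundary-point distribution and the balancing step that enforces the 45°-135° angle bounds:

```python
from dataclasses import dataclass, field

@dataclass
class Quad:
    x: float                 # lower-left corner
    y: float
    size: float
    children: list = field(default_factory=list)

def refine(quad, size_fn, min_size=1/64):
    """Split a cell while the local target size is smaller than the cell."""
    cx, cy = quad.x + quad.size / 2, quad.y + quad.size / 2
    if quad.size > max(size_fn(cx, cy), min_size):
        h = quad.size / 2
        quad.children = [Quad(quad.x + dx * h, quad.y + dy * h, h)
                         for dx in (0, 1) for dy in (0, 1)]
        for c in quad.children:
            refine(c, size_fn, min_size)

def leaves(quad):
    if not quad.children:
        yield quad
    else:
        for c in quad.children:
            yield from leaves(c)

# usage: cells refine toward a circular "boundary" of radius 0.5
root = Quad(0.0, 0.0, 1.0)
refine(root, lambda x, y: 0.02 + 0.3 * abs((x * x + y * y) ** 0.5 - 0.5))
print(sum(1 for _ in leaves(root)))   # number of leaf cells in the grid
```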


Axial symmetric elasticity analysis in non-homogeneous bodies under gravitational load by triple-reciprocity boundary element method

INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN ENGINEERING, Issue 7 2009
Yoshihiro Ochiai
Abstract In general, internal cells are required to solve elasticity problems involving a gravitational load in non-homogeneous bodies with variable mass density when using a conventional boundary element method (BEM). In that case, the benefit of mesh reduction is not achieved and one of the main merits of the BEM, the simplicity of data preparation, is lost. In this study, it is shown that the domain cells can be avoided by using the triple-reciprocity BEM formulation, where the density of the domain integral is expressed in terms of other fields that are represented by boundary densities and/or source densities at isolated interior points. Utilizing the rotational symmetry, the triple-reciprocity BEM formulation is developed for axially symmetric elasticity problems in non-homogeneous bodies under gravitational force. A new computer program was developed and applied to solve several test problems. Copyright © 2008 John Wiley & Sons, Ltd. [source]
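The mechanism behind this kind of reciprocity formulation is repeated application of Green's second identity, which trades a domain integral for boundary terms; schematically, the source density is expressed through a hierarchy of fields satisfying Poisson equations (notation below is generic, not the paper's):

```latex
\int_\Omega \left( g\,\nabla^2 h - h\,\nabla^2 g \right) d\Omega
= \oint_\Gamma \left( g\,\frac{\partial h}{\partial n} - h\,\frac{\partial g}{\partial n} \right) d\Gamma,
\qquad \nabla^2 f^{(k+1)} = f^{(k)}
```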


Comparison of various precipitation downscaling methods for the simulation of streamflow in a rainshadow river basin

INTERNATIONAL JOURNAL OF CLIMATOLOGY, Issue 8 2003
Eric P. Salathé Jr
Abstract Global simulations of precipitation from climate models lack sufficient resolution and contain large biases that make them unsuitable for regional studies, such as forcing hydrologic simulations. In this study, the effectiveness of several methods to downscale large-scale precipitation is examined. To facilitate comparisons with observations and to remove uncertainties in other fields, large-scale predictor fields to be downscaled are taken from the National Centers for Environmental Prediction-National Center for Atmospheric Research (NCEP-NCAR) reanalyses. Three downscaling methods are used: (1) a local scaling of the simulated large-scale precipitation; (2) a modified scaling of simulated precipitation that takes into account the large-scale wind field; and (3) an analogue method with 1000 hPa heights as predictor. A hydrologic model of the Yakima River in central Washington state, USA, is then forced by the three downscaled precipitation datasets. Simulations with the raw large-scale precipitation and with gridded observations are also made. Comparisons among these simulated flows reveal the effectiveness of the downscaling methods. The local scaling of the simulated large-scale precipitation is shown to be quite successful and simple to implement. Furthermore, the tuning of the downscaling methods is valid across phases of the Pacific decadal oscillation, suggesting that the methods are applicable to climate-change studies. Copyright © 2003 Royal Meteorological Society [source]
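A minimal sketch of the "local scaling" approach (method 1), under the common assumption that the scaling factor is the per-calendar-month ratio of local observed to simulated mean precipitation; the synthetic series and grouping are illustrative, not the paper's data:

```python
import numpy as np

def local_scaling_factors(obs, sim, months):
    """Per-calendar-month ratio of observed to simulated mean precipitation."""
    return {m: obs[months == m].mean() / sim[months == m].mean()
            for m in np.unique(months)}

def downscale(sim, months, factors):
    """Rescale each large-scale value by its month's local factor."""
    return np.array([p * factors[m] for p, m in zip(sim, months)])

# usage with a synthetic daily series for one station (two 360-day "years")
rng = np.random.default_rng(1)
months = np.tile(np.repeat(np.arange(1, 13), 30), 2)
sim = rng.gamma(0.8, 2.0, months.size)          # model precipitation
obs = 1.4 * rng.gamma(0.8, 2.0, months.size)    # wetter local station
f = local_scaling_factors(obs, sim, months)
print(round(downscale(sim, months, f).mean() / obs.mean(), 2))  # 1.0 by construction
```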


Improved aetiological diagnosis of ischaemic stroke in a Vascular Medicine Unit: the significance of transesophageal echocardiogram

INTERNATIONAL JOURNAL OF CLINICAL PRACTICE, Issue 3 2008
A. Martignoni
Summary Background: The TOAST study estimates that 34% of ischaemic strokes are of undetermined aetiology. Improvements in the diagnosis of the pathogenetic mechanism of ischaemic stroke would translate into better care, by analogy with other fields of vascular and internal medicine. Objective: To measure the reduction in strokes of undetermined aetiology achieved by performing a set of additional diagnostic tests. Design: Consecutive case series with historical controls. Setting: Internal Medicine Ward with a stroke area (SA) admitting most stroke patients of a large hospital in Italy. Subjects: A total of 179 ischaemic stroke patients admitted to the SA in 2004–2005, compared with 105 ischaemic stroke patients admitted to the whole department in 2001. Intervention: Performing more diagnostic tests, including transesophageal echocardiography (TEE), in the greatest possible number of ischaemic stroke inpatients admitted to the SA of the Internal Medicine Department in the years 2004–2005. Results: More diagnostic tests were performed during the study period than in 2001, especially TEE (56% of patients in 2004–2005 vs. 3% of patients in 2001). We observed a significant reduction of undetermined aetiology from 38% in 2001 to 16% in 2004–2005 (p < 0.0001), largely due to an increased identification of cases with a cardio-embolic mechanism (from 18% to 40%, p = 0.0002). In the years 2004–2005 the fraction of patients on anticoagulant treatment at discharge was 21% vs. 12% in 2001 (p = 0.041). Conclusion: Performing more tests, particularly TEE, improved the aetiological diagnosis of stroke, increasing cardio-embolism diagnoses and anticoagulant treatment. [source]


Laparoscopic adrenalectomy: Troublesome cases

INTERNATIONAL JOURNAL OF UROLOGY, Issue 5 2009
Gaku Kawabata
Among 143 cases of laparoscopic adrenalectomy carried out from 1993 to the present, 13 patients in whom the surgical manipulation presented problems were examined. Problems arose from the condition of the adrenal tumors themselves in six patients and from the patients' operative history in four patients. There were three patients with no operative history but with strong intraperitoneal adhesions. In patients with a history of laparotomy in other fields, such as open cholecystectomy, gastrectomy or colostomy, operations were possible in most cases by examining the trocar site preoperatively. Patients with strong adhesions, even without a history of surgery, could be handled by full separation of the adhesions during surgery. In patients with bleeding in the adrenal tumors, large adrenal tumors, or tumors impacted in the liver, methods such as changing the sequence of separation procedures were required. In patients with a history of renal subcapsular hematomas due to extracorporeal shock wave lithotripsy (ESWL), it was not possible to assess the extent of adrenal or perinephric adhesion by preoperative imaging, but resection was possible by changing the order of separation procedures and by using optimal instruments and devices. As with any surgery, including open surgery, it is necessary to know how to deal with variations in laparoscopic adrenalectomy to assure safe outcomes and to always consider effective methods for coping with unexpected difficulties. [source]


Aquatic Microbial Ecology: Water Desert, Microcosm, Ecosystem.

INTERNATIONAL REVIEW OF HYDROBIOLOGY, Issue 4-5 2008
What's Next?
Abstract Aquatic microbial ecology aims at nothing less than explaining the world from "ecological scratch". It develops theories, concepts and models about the small and invisible living world that is at the bottom of every macroscopic aquatic system. In this paper we propose to look at the development of Aquatic Microbial Ecology as a reiteration of classical (eukaryotic) limnology and oceanography. This was conceptualized moving historically from the so-called water desert to microcosm to ecosystem. Each of these concepts characterizes a particular historical field of knowledge that embraces also practices and theories about living beings in aquatic environments. Concerning the question of "who is there", however, Aquatic Microbial Ecology historically developed in reverse order. Repetition, reiteration and replication notwithstanding, Aquatic Microbial Ecology has contributed new ideas, theories and methods to the whole field of ecology as well as to microbiology. The disciplining of Aquatic Microbial Ecology happened in the larger field of plankton biology, and it is still attached to this biological domain, even conceiving of itself very self-consciously as a discipline of its own. Today, Aquatic Microbial Ecology as a discipline is much broader than plankton ecology ever was, for it includes not only oceans and freshwaters but also benthic, interstitial and groundwater systems. The success of Aquatic Microbial Ecology is expressed by its influence on other fields in ecology. The challenge is to further develop its theoretical and methodological features while at the same time contributing to current pressing problems such as climate change or the management of global water resources. "And then it may not be fanciful to suppose that even in the year nineteen hundred and nineteen a great number of minds are still only partially lit up by the cold light of knowledge. It is the most capricious illuminant. They are still apt to ruminate, without an overpowering bias to the truth, whether a kingfisher's body shows which way the wind blows; whether an ostrich digests iron; whether owls and ravens herald ill-fortune; and the spilling of salt bad luck; what the tingling of ears forebodes, and even to toy pleasantly with more curious speculations as to the joints of elephants and the politics of storks, which came within the province of the more fertile and better-informed brain of the author." (Virginia Woolf, from the essay "Reading" (1919), in: Leonard Woolf (ed.), 1950: The Captain's Death Bed and Other Essays, London: Hogarth Press, p. 157.) (© 2008 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source]


Social governance: Corporate governance in institutions of social security, welfare and healthcare

INTERNATIONAL SOCIAL SECURITY REVIEW, Issue 2 2003
Vanessa Verdeyen
Corporate governance is a concept that attracted the attention of jurists and economists in the United States in the 1970s and 1980s. It has become widespread in Belgian company law since the 1990s. Lately, corporate governance elements have popped up in other fields as well. When applied to social institutions, this model is referred to by the term "social governance". The corporate governance concept can be very valuable in the social area. After all, the debate on corporate governance is much more fundamental than the debate on the relationship between shareholders and management and between minority and majority shareholders. The essence of corporate governance can be found in the pursuit of a situation of "checks and balances", which gives the stakeholders the possibility of complementing and controlling each other. We analyse the existing social governance elements in social security and welfare law. We conclude that the implementation of a social governance model should have a positive influence on the policy and practice of social institutions. [source]


Becoming Undisciplined: Toward the Supradisciplinary Study of Security

INTERNATIONAL STUDIES REVIEW, Issue 1 2005
J. Marshall Beier
In recent years we have seen increasing reflection among scholars of security studies regarding the boundaries of their field and the range of its appropriate subject matter. At the same time, scholars elsewhere in the academy have been developing their own approaches to issues of security. These various pockets of work have been undertaken in nearly complete isolation from one another and with little apparent awareness of relevant developments in the other fields. In this essay, we advance the claim that security cannot be satisfactorily theorized within the confines of disciplinary boundaries – any disciplinary boundaries. The challenge thus becomes how to develop what might be termed a "supradisciplinary" approach to the study of security that will allow us to think and engage our subject matter across a range of discourses without giving rise to an interdisciplinary hybrid or sui generis discipline. [source]


Considerations for Development of Surrogate Endpoints for Antifracture Efficacy of New Treatments in Osteoporosis: A Perspective

JOURNAL OF BONE AND MINERAL RESEARCH, Issue 8 2008
Mary L Bouxsein
Abstract Because of the broad availability of efficacious osteoporosis therapies, conduct of placebo-controlled trials in subjects at high risk for fracture is becoming increasingly difficult. Alternative trial designs include placebo-controlled trials in patients at low risk for fracture or active comparator studies, both of which would require enormous sample sizes and associated financial resources. Another more attractive alternative is to develop and validate surrogate endpoints for fracture. In this perspective, we review the concept of surrogate endpoints as it has been developed in other fields of medicine and discuss how it could be applied in clinical trials of osteoporosis. We outline a stepwise approach and possible study designs to qualify a biomarker as a surrogate endpoint in osteoporosis and review the existing data for several potential surrogate endpoints to assess their success in meeting the proposed criteria. Finally, we suggest a research agenda needed to advance the development of biomarkers as surrogate endpoints for fracture in osteoporosis trials. To ensure optimal development and best use of biomarkers to accelerate drug development, continuous dialog among the health professionals, industry, and regulators is of paramount importance. [source]


Liquid membrane technology: fundamentals and review of its applications

JOURNAL OF CHEMICAL TECHNOLOGY & BIOTECHNOLOGY, Issue 1 2010
M. F. San Román
Abstract OVERVIEW: During the past two decades, liquid membrane technology has grown into an accepted unit operation for a wide variety of separations. Strict environmental regulations and legislation, together with the wider acceptance of this technology in preference to conventional separation processes, have led to spectacular advances in membrane development, module configurations, applications, etc. IMPACT: Liquid membrane technology makes it possible to attain high selectivity as well as efficient use of energy and material relative to many other separation systems. However, in spite of the known advantages of liquid membranes, there are very few examples of industrial applications because of the problems associated with the stability of the liquid membrane. APPLICATIONS: Liquid membrane technology has found applications in the fields of chemical and pharmaceutical technology, biotechnology, food processing and environmental engineering. Meanwhile, its use in other fields, such as hydrogen separation, the recovery of aroma compounds from fruits, and the application of ionic liquids in membrane formulation, is increasing rapidly. Copyright © 2009 Society of Chemical Industry [source]


The information-processing approach to the human mind: Basics and beyond

JOURNAL OF CLINICAL PSYCHOLOGY, Issue 4 2004
Daniel David
Cognitive psychology attempts to understand the nature of the human mind by using the information-processing approach. In this article, the fundamentals of the cognitive approach will be presented. It will be argued that the human mind can be described at three levels (computational, algorithmic-representational, and implementational) and that the cognitive approach has both important theoretical and practical/clinical implications. Finally, it will be suggested that the study of cognitive psychology can provide a foundation for other fields of social science, including the field of clinical psychology. © 2004 Wiley Periodicals, Inc. J Clin Psychol. [source]


Brains and brands: developing mutually informative research in neuroscience and marketing

JOURNAL OF CONSUMER BEHAVIOUR, Issue 4-5 2008
Tyler K. Perrachione
Advances in neuroimaging technology have led to an explosion in the number of studies investigating the living human brain, and thereby our understanding of its structure and function. With the proliferation of dazzling images from brain scans in both scientific and popular media, researchers from other fields in the social and behavioral sciences have naturally become interested in the application of neuroimaging to their own research. Commercial enterprises have long been interested in the prospects of literally "getting inside the heads" of customers and partners, with a variety of goals in mind. Here we consider the ways in which scholars of consumer behavior may draw upon neuroscientific advances to inform their own research. We describe the motivation of neuroscientific inquiry from the point of view of neuroscientists, including an introduction to the technologies and methodologies available; correspondingly, we consider major questions in consumer behavior that are likely to be of interest to neuroscientists and why. Recent key discoveries in neuroscience are presented which will likely have a direct impact on the development of a neuromarketing subdiscipline and for neuroimaging as a marketing research technique. We discuss where and how neuroscience methodologies may reasonably be added to the research inventory of marketers. In sum, we aim to show not only that a neuromarketing subdiscipline may fruitfully contribute to our understanding of the biological bases of human behavior, but also that developing this as a productive research field will rest largely in framing marketing research questions in the brain-centric mindset of neuroscientists. Copyright © 2008 John Wiley & Sons, Ltd. [source]


MARGINAL COMMODITY TAX REFORMS: A SURVEY

JOURNAL OF ECONOMIC SURVEYS, Issue 4 2007
Alessandro Santoro
Abstract As noted 30 years ago by Martin Feldstein, optimal taxes may be useless for practical purposes and emphasis should instead be placed on the possibility of enhancing welfare by reforming existing tax rates. In this perspective, marginal commodity tax reforms are gaining increasing attention due to political and economic constraints on large reforms of direct (or indirect) taxation. In this paper, we summarize the main features and results of the literature on marginal commodity tax reforms pioneered by Ahmad and Stern, further developed by Yitzhaki and Thirsk and recently reinterpreted by Makdissi and Wodon. We establish new links to other fields of research, namely the literature on the use of equivalence scales and on poverty measurement. We also critically examine some issues associated with the implementation of marginal tax reforms with special reference to the calculation of welfare weights and revenue effects. Finally, we suggest directions for future research on poverty-reducing commodity tax reforms. [source]
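In the Ahmad-Stern framework the survey builds on, the key quantity is the marginal social cost of raising one extra unit of revenue through the tax on commodity i; a reform that raises t_i and lowers t_j is welfare-improving when that cost is lower for i than for j. A schematic statement, as usually presented in this literature (notation mine: β^h is household h's welfare weight, x_i^h its consumption of good i, R total revenue):

```latex
\lambda_i = \frac{\sum_h \beta^h x_i^h}{\partial R / \partial t_i},
\qquad \text{reform } (t_i \uparrow,\ t_j \downarrow) \text{ improves welfare if } \lambda_i < \lambda_j
```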


Natural selection maximizes Fisher information

JOURNAL OF EVOLUTIONARY BIOLOGY, Issue 2 2009
S. A. FRANK
Abstract In biology, information flows from the environment to the genome by the process of natural selection. However, it has not been clear precisely what sort of information metric properly describes natural selection. Here, I show that Fisher information arises as the intrinsic metric of natural selection and evolutionary dynamics. Maximizing the amount of Fisher information about the environment captured by the population leads to Fisher's fundamental theorem of natural selection, the most profound statement about how natural selection influences evolutionary dynamics. I also show a relation between Fisher information and Shannon information (entropy) that may help to unify the correspondence between information and dynamics. Finally, I discuss possible connections between the fundamental role of Fisher information in statistics, biology and other fields of science. [source]
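For readers from other fields, the two standard quantities the abstract connects are the Fisher information of a parameterized distribution and, in Fisher's fundamental theorem, the selection-driven change in mean fitness (textbook forms shown here; Frank's derivation of the linkage is not reproduced):

```latex
F(\theta) = \mathbb{E}\!\left[\left(\frac{\partial}{\partial \theta}\ln p(x \mid \theta)\right)^{2}\right],
\qquad
\Delta \bar{w}\big|_{\text{selection}} = \frac{\operatorname{Var}_A(w)}{\bar{w}}
```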


Co-Authorship in Management and Organizational Studies: An Empirical and Network Analysis*

JOURNAL OF MANAGEMENT STUDIES, Issue 5 2006
Francisco José Acedo
In recent decades there has been growing interest in the nature and scale of scientific collaboration. Studies into co-authorship have taken two different approaches. The first attempts to analyse the reasons why authors collaborate and the consequences of such a decision (Laband and Tollison, 2000). The second is based on the idea that co-authorship creates a social network of researchers (Barabási et al., 2002; Moody, 2004; Newman, 2001). In this study we have carried out an exploratory analysis of co-authorships in the field of management from the two aforementioned approaches. The results obtained show a growing tendency towards co-authored papers in the field of management, similar to what can be observed in other disciplines. Our study analyses some of the underpinning factors highlighted in the literature that explain this tendency, notably the progressively quantitative character of research and the influence of collaboration on an article's impact. The network analysis permits the exploration of the peculiarities of management in comparison with other fields of knowledge, as well as the existing linkages between the most central and prominent authors within this discipline. [source]
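The kind of co-authorship network such a study analyses is easy to reproduce in miniature. Below is a sketch (invented bylines; networkx assumed available) that builds the weighted graph and reads off the standard measures this literature reports:

```python
import itertools
import networkx as nx

# each inner list is one paper's byline (invented example data)
papers = [["Acedo", "Barroso"], ["Barroso", "Casanueva"],
          ["Acedo", "Casanueva", "Galan"], ["Smith"]]

G = nx.Graph()
for byline in papers:
    G.add_nodes_from(byline)                 # sole authors become isolates
    for a, b in itertools.combinations(byline, 2):
        # every co-author pair gets an edge; repeat collaborations add weight
        w = G.edges[a, b]["weight"] + 1 if G.has_edge(a, b) else 1
        G.add_edge(a, b, weight=w)

print(nx.degree_centrality(G))               # breadth of collaboration
print(nx.betweenness_centrality(G))          # brokerage between groups
print(nx.number_connected_components(G))     # fragmentation of the field
```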


The impact of a large-scale traumatic event on individual and organizational outcomes: exploring employee and company reactions to September 11, 2001

JOURNAL OF ORGANIZATIONAL BEHAVIOR, Issue 8 2002
Kristin Byron
Much of the literature on stress and organizational outcomes has focused on organizational factors and has ignored extraorganizational stressors that lead to perceived stress. However, research in other fields and recent studies in management suggest that acute extraorganizational stressors, such as traumatic events, may have potentially negative and costly implications for organizations. This study tests a theoretical model of traumatic stress and considers the relationship between strain from an acute extraorganizational stressor, the terrorist attack of September 11, 2001, and absenteeism. Using a sample of 108 MBA and MPA students, this study suggests that strain caused by an acute extraorganizational stressor can have important consequences for organizations. Namely, employees who report more strain from a traumatic life event are more likely to be absent from work in the weeks following the event. Copyright © 2002 John Wiley & Sons, Ltd. [source]


Future Prospects for Biomarkers of Alcohol Consumption and Alcohol-Induced Disorders

ALCOHOLISM, Issue 6 2010
Willard M. Freeman
The lack of reliable measures of alcohol intake is a major obstacle to the diagnosis, treatment, and research of alcohol abuse and alcoholism. Successful development of a biomarker that allows for accurate assessment of alcohol intake and drinking patterns would not only be a major advance in clinical care but also a valuable research tool. A number of advances have been made in testing the validity of proposed biomarkers as well as in identifying potential new biomarkers through systems biology approaches. This commentary will examine the definition of a biomarker of heavy drinking, the types of potential biomarkers, the steps in biomarker development, the current state of biomarker development, and critical obstacles for the field. The challenges in developing biomarkers for alcohol treatment and research are similar to those found in other fields. However, the alcohol research field must reach a competitive level of rigor and organization. We recommend that NIAAA consider taking a leadership role in organizing investigators in the field and providing a common set of clinical specimens for biomarker validation studies. [source]


Bodensystematik und Bodenklassifikation Teil II: Zur Situation in der deutschen Bodenkunde

JOURNAL OF PLANT NUTRITION AND SOIL SCIENCE, Issue 2 2005
Christoph Albrecht
Abstract Soil systematics and classification systems, Part II: The German soil-science situation. In Germany, soils are categorized with an ordering system which is based both on the principles of a systematics and on those of a classification system. The goal is to meet both scientific and practical demands. The soil-horizon and -profile definitions are structured in a way that allows subjective interpretations and a relatively free handling of the existing threshold values. As a consequence, the configuration of the system is somewhat inconsistent and leads to identification problems. The soil-systematic specifications therefore often lack the quality needed for their application in other fields or disciplines. Case examples illustrate the problems in identifying horizons and soils. We suggest that the best solution would be to develop, in parallel, both a scientifically based descriptive soil systematics and a simple, objectively applicable classification key for horizons and soils. German soil science has long been transitioning from a descriptive systematics to a threshold-based classification system, although both categorization systems should be applied in tandem. This tendency shows prominently in the 5th edition of the German Handbook of Soil Mapping (Bodenkundliche Kartieranleitung), which contains a classification key for the divisions (Abteilungen), classes (Klassen), and types (Typen) of the German soil systematics. The key's features and significance are briefly considered, and further requirements for structuring soil-science knowledge and transferring it to practical applications are discussed. [source]


ON SOCIAL NETWORK ANALYSIS IN A SUPPLY CHAIN CONTEXT

JOURNAL OF SUPPLY CHAIN MANAGEMENT, Issue 2 2009
STEPHEN P. BORGATTI
The network perspective is rapidly becoming a lingua franca across virtually all of the sciences from anthropology to physics. In this paper, we provide supply chain researchers with an overview of social network analysis, covering both specific concepts (such as structural holes or betweenness centrality) and the generic explanatory mechanisms that network theorists often invoke to relate network variables to outcomes of interest. One reason for discussing mechanisms is to facilitate appropriate translation and context-specific modification of concepts rather than blind copying. We have also taken care to apply network concepts to both "hard" types of ties (e.g., materials and money flows) and "soft" types of ties (e.g., friendships and sharing-of-information), as both are crucial (and mutually embedded) in the supply chain context. Another aim of the review is to point to areas in other fields that we think are particularly suitable for supply chain management (SCM) to draw network concepts from, such as sociology, ecology, input-output research and even the study of romantic networks. We believe the portability of many network concepts provides a potential for unifying many fields, and a consequence of this for SCM may be to decrease the distance between SCM and other branches of management science. [source]


Understanding information related fields: A conceptual framework

JOURNAL OF THE AMERICAN SOCIETY FOR INFORMATION SCIENCE AND TECHNOLOGY, Issue 13 2007
Ping Zhang
Many scientific fields share common interests in research and education. Yet, very often, these fields do not communicate with each other and are unaware of the work in other fields. Understanding the commonalities and differences among related fields can broaden our understanding of the phenomena of interest from various perspectives, better utilize resources, enhance collaboration, and eventually move the related fields forward together. In this article, we present a conceptual framework, namely the Information-Model or I-model, to describe various aspects of information related fields. We consider this a timely effort in light of the evolution of several information related fields and a set of questions related to the identities of these fields. It is especially timely in defining the newly formed Information Field emerging from a community of twenty-some information schools. We posit that the information related fields are built on a number of other fields but have their own unique foci and concerns. That is, core components from other fundamental fields interact and integrate with each other to form dynamic and interesting information related fields that all have to do with information, technology, people, and organization/society. The conceptual framework can have a number of uses. Besides providing a unified view of these related fields, it can be used to examine old case studies, recent research projects, educational programs and curricula concerns, as well as to illustrate the commonalities and differences within the information related fields. [source]