Single Framework

Selected Abstracts


A grasp-based motion planning algorithm for character animation

COMPUTER ANIMATION AND VIRTUAL WORLDS (previously the JOURNAL OF VISUALISATION AND COMPUTER ANIMATION), Issue 3 2001
Maciej Kalisiak
The design of autonomous characters capable of planning their own motions continues to be a challenge for computer animation. We present a novel kinematic motion-planning algorithm for character animation which addresses some of the outstanding problems. The problem domain for our algorithm is as follows: given a constrained environment with designated handholds and footholds, plan a motion through this space towards some desired goal. Our algorithm is based on a stochastic search procedure which is guided by a combination of geometric constraints, posture heuristics, and distance-to-goal metrics. The method provides a single framework for the use of multiple modes of locomotion in planning motions through these constrained, unstructured environments. We illustrate our results with demonstrations of a human character using walking, swinging, climbing, and crawling in order to navigate through various obstacle courses. Copyright © 2001 John Wiley & Sons, Ltd. [source]
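
As a rough illustration of the kind of planner the abstract describes (a stochastic search guided by geometric feasibility, posture heuristics, and a distance-to-goal metric), the sketch below shows one plausible shape such a loop could take. The callables, acceptance rule, and iteration budget are assumptions made for illustration, not the authors' algorithm.

    import random

    def plan_motion(start_pose, goal_test, sample_posture, is_feasible,
                    posture_score, distance_to_goal, max_iters=10000):
        # Hypothetical sketch of a stochastic, heuristic-guided kinematic planner.
        #   sample_posture(pose)   -> random neighbouring posture (e.g. a new hand/foot hold)
        #   is_feasible(pose)      -> True if geometric constraints (reach, collision) hold
        #   posture_score(pose)    -> heuristic preference for natural, stable postures
        #   distance_to_goal(pose) -> metric that drives progress toward the goal
        current = start_pose
        path = [start_pose]
        for _ in range(max_iters):
            candidate = sample_posture(current)
            if not is_feasible(candidate):
                continue  # reject postures violating geometric constraints
            # Combine progress toward the goal with posture quality.
            cand_cost = distance_to_goal(candidate) - posture_score(candidate)
            curr_cost = distance_to_goal(current) - posture_score(current)
            # Accept improvements; occasionally accept regressions to escape dead ends.
            if cand_cost < curr_cost or random.random() < 0.05:
                current = candidate
                path.append(candidate)
            if goal_test(current):
                return path  # a motion plan reaching the goal
        return None  # no plan found within the iteration budget

Presumably, each locomotion mode (walking, swinging, climbing, crawling) would correspond to a different posture sampler plugged into the same loop, which is what would let one framework cover several modes.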


The maximum entropy formalism and the idiosyncratic theory of biodiversity

ECOLOGY LETTERS, Issue 11 2007
Salvador Pueyo
Abstract Why does the neutral theory, which is based on unrealistic assumptions, predict diversity patterns so accurately? Answering questions like this requires a radical change in the way we tackle them. The large number of degrees of freedom of ecosystems poses a fundamental obstacle to mechanistic modelling. However, there are tools of statistical physics, such as the maximum entropy formalism (MaxEnt), that allow us to transcend particular models and work simultaneously with immense families of models with different rules and parameters, sharing only well-established features. We applied MaxEnt allowing species to be ecologically idiosyncratic, instead of constraining them to be equivalent as the neutral theory does. The answer we found is that neutral models are just a subset of the majority of plausible models that lead to the same patterns. Small variations in these patterns naturally lead to the main classical species abundance distributions, which are thus unified in a single framework. [source]
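
For readers unfamiliar with the formalism, the generic MaxEnt setup (written here in standard textbook form, not with the paper's specific constraints) picks the least-biased species abundance distribution consistent with a chosen set of constraint functions:

    % Generic maximum entropy formalism (illustrative constraints f_k, not Pueyo's):
    \max_{p}\; H[p] = -\sum_{n} p(n)\,\ln p(n)
    \quad \text{subject to} \quad
    \sum_{n} p(n) = 1, \qquad \sum_{n} f_k(n)\, p(n) = \bar{f}_k ,
    % whose Lagrangian solution is the exponential family
    p^{*}(n) = \frac{1}{Z(\lambda)} \exp\!\Big(-\sum_{k} \lambda_k f_k(n)\Big),
    \qquad
    Z(\lambda) = \sum_{n} \exp\!\Big(-\sum_{k} \lambda_k f_k(n)\Big).

Different choices of the constraint functions f_k then yield different members of this family, which is the sense in which many classical species abundance distributions can sit inside one framework.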


Niche breadth, competitive strength and range size of tree species: a trade-off based framework to understand species distribution

ECOLOGY LETTERS, Issue 2 2006
Xavier Morin
Abstract Understanding the mechanisms causing latitudinal gradients in species richness and species range size is a central issue in ecology, particularly in the current context of global climate change. Different hypotheses have been put forward to explain these patterns, emphasizing climatic variability, energy availability and competition. Here we show, using a comparative analysis controlling for phylogeny on 234 temperate/boreal tree species, that these hypotheses can be combined into a single framework in an attempt to explain latitudinal gradients in species range size. We find that species tend to have larger ranges when they (i) occur closer to the poles, (ii) are successionally seral, (iii) have small, light seeds, and (iv) have short generations. These patterns can simply be explained by energy constraints associated with different life-history strategies. Overall, these findings shed new light on our understanding of species distribution and biodiversity patterns, bringing new insights into underlying large-scale evolutionary processes. [source]
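
A comparative analysis "controlling for phylogeny" of the kind reported here is typically run as a phylogenetic regression; the sketch below is only a generic illustration of that idea, with an invented covariance matrix and trait columns, and is not the authors' data or exact method.

    import numpy as np

    def pgls_fit(X, y, V):
        # Minimal phylogenetic generalized least squares (PGLS) sketch:
        # regress a response (e.g. log range size) on traits while accounting
        # for the phylogenetic covariance V among species.
        Vinv = np.linalg.inv(V)
        XtVinv = X.T @ Vinv
        return np.linalg.solve(XtVinv @ X, XtVinv @ y)  # GLS coefficient estimates

    # Hypothetical usage with made-up numbers, for illustration only.
    rng = np.random.default_rng(0)
    n = 30
    V = np.eye(n)                              # stands in for a real phylogenetic covariance
    X = np.column_stack([np.ones(n),           # intercept
                         rng.normal(size=n),   # e.g. absolute latitude of range centre
                         rng.normal(size=n)])  # e.g. log seed mass
    y = X @ np.array([1.0, 0.6, -0.4]) + rng.normal(scale=0.1, size=n)
    print(pgls_fit(X, y, V))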


A neuroanatomically grounded Hebbian-learning model of attention–language interactions in the human brain

EUROPEAN JOURNAL OF NEUROSCIENCE, Issue 2 2008
Max Garagnani
Abstract Meaningful familiar stimuli and senseless unknown materials lead to different patterns of brain activation. A late major neurophysiological response indexing 'sense' is the negative event-related potential component peaking at around 400 ms (N400), which emerges in attention-demanding tasks and is larger for senseless materials (e.g. meaningless pseudowords) than for matched meaningful stimuli (words). However, the mismatch negativity (latency 100–250 ms), an early automatic brain response elicited under distraction, is larger for words than for pseudowords, thus exhibiting the opposite pattern to that seen for the N400. So far, no theoretical account has been able to reconcile and explain these findings by means of a single, mechanistic neural model. We implemented a neuroanatomically grounded neural network model of the left perisylvian language cortex and simulated: (i) brain processes of early language acquisition and (ii) cortical responses to familiar word and senseless pseudoword stimuli. We found that variation of the area-specific inhibition (the model correlate of attention) modulated the simulated brain response to words and pseudowords, producing either an N400- or a mismatch negativity-like response depending on the amount of inhibition (i.e. available attentional resources). Our model: (i) provides a unifying explanatory account, at the cortical level, of experimental observations that, so far, had not been given a coherent interpretation within a single framework; (ii) demonstrates the viability of purely Hebbian, associative learning in a multilayered neural network architecture; and (iii) makes clear predictions on the effects of attention on the latency and magnitude of event-related potentials to lexical items. Such predictions have been confirmed by recent experimental evidence. [source]
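
The two ingredients named in the abstract, Hebbian associative learning and an area-specific inhibition parameter standing in for attentional resources, can be caricatured as follows. The toy network, parameter values, and response readout are assumptions for illustration and do not reproduce the authors' perisylvian architecture or their N400/mismatch-negativity simulations.

    import numpy as np

    def hebbian_step(W, pre, post, lr=0.01, w_max=1.0):
        # One Hebbian update: strengthen connections between co-active units.
        W = W + lr * np.outer(post, pre)
        return np.clip(W, 0.0, w_max)

    def area_response(W, stimulus, inhibition):
        # Crude 'area' response: feedforward drive minus area-specific inhibition.
        # Higher inhibition mimics scarcer attentional resources.
        drive = W @ stimulus
        return np.maximum(drive - inhibition, 0.0).sum()

    # Toy run: repeatedly pair a 'word' pattern with itself (auto-association),
    # then probe the trained word against an untrained pseudoword pattern
    # at low and high inhibition levels.
    rng = np.random.default_rng(1)
    n = 50
    W = np.zeros((n, n))
    word = (rng.random(n) < 0.2).astype(float)
    pseudoword = (rng.random(n) < 0.2).astype(float)
    for _ in range(200):
        W = hebbian_step(W, word, word)
    for inhibition in (0.1, 0.5):
        print(inhibition,
              area_response(W, word, inhibition),
              area_response(W, pseudoword, inhibition))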


ENVIRONMENTAL NICHE EQUIVALENCY VERSUS CONSERVATISM: QUANTITATIVE APPROACHES TO NICHE EVOLUTION

EVOLUTION, Issue 11 2008
Dan L. Warren
Environmental niche models, which are generated by combining species occurrence data with environmental GIS data layers, are increasingly used to answer fundamental questions about niche evolution, speciation, and the accumulation of ecological diversity within clades. The question of whether environmental niches are conserved over evolutionary time scales has attracted considerable attention but has often produced conflicting conclusions. This conflict, however, may result from differences in how niche similarity is measured and the specific null hypothesis being tested. We develop new methods for quantifying niche overlap that rely on a traditional ecological measure and a metric from mathematical statistics. We reexamine a classic study of niche conservatism between sister species in several groups of Mexican animals, and, for the first time, address alternative definitions of "niche conservatism" within a single framework using consistent methods. As expected, we find that environmental niches of sister species are more similar than expected under three distinct null hypotheses, but that they are rarely identical. We demonstrate how our measures can be used in phylogenetic comparative analyses by reexamining niche divergence in an adaptive radiation of Cuban anoles. Our results show that environmental niche overlap is closely tied to geographic overlap, but not to phylogenetic distances, suggesting that niche conservatism has not constrained local communities in this group to consist of closely related species. We suggest various randomization tests that may prove useful in other areas of ecology and evolutionary biology. [source]
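
The "traditional ecological measure" and the "metric from mathematical statistics" mentioned here are commonly implemented as Schoener's D and a Hellinger-distance-based statistic I; treating these as the paper's two measures is an assumption, but the sketch below shows the standard versions computed over gridded habitat-suitability values.

    import numpy as np

    def schoener_D(px, py):
        # Schoener's D between two suitability surfaces flattened to 1-D arrays.
        px = px / px.sum()
        py = py / py.sum()
        return 1.0 - 0.5 * np.abs(px - py).sum()

    def hellinger_I(px, py):
        # I statistic: 1 - H^2 / 2, with H the Hellinger distance between surfaces.
        px = px / px.sum()
        py = py / py.sum()
        h_squared = ((np.sqrt(px) - np.sqrt(py)) ** 2).sum()
        return 1.0 - 0.5 * h_squared

    # Both metrics run from 0 (no overlap) to 1 (identical niches); a niche identity
    # or background randomization test then compares the observed value against a
    # null distribution obtained by resampling occurrence points.
    a = np.random.default_rng(2).random(1000)
    b = np.random.default_rng(3).random(1000)
    print(schoener_D(a, b), hellinger_I(a, b))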


Innovation management measurement: A review

INTERNATIONAL JOURNAL OF MANAGEMENT REVIEWS, Issue 1 2006
Richard Adams
Measurement of the process of innovation is critical for both practitioners and academics, yet the literature is characterized by a diversity of approaches, prescriptions and practices that can be confusing and contradictory. Conceptualized as a process, innovation measurement lends itself to disaggregation into a series of separate studies. The consequence of this is the absence of a holistic framework covering the range of activities required to turn ideas into useful and marketable products. We attempt to address this gap by reviewing the literature pertaining to the measurement of innovation management at the level of the firm. Drawing on a wide body of literature, we first develop a synthesized framework of the innovation management process consisting of seven categories: inputs management, knowledge management, innovation strategy, organizational culture and structure, portfolio management, project management and commercialization. Second, we populate each category of the framework with factors empirically demonstrated to be significant in the innovation process, and illustrative measures to map the territory of innovation management measurement. The review makes two important contributions. First, it takes the difficult step of incorporating a vastly diverse literature into a single framework. Second, it provides a framework against which managers can evaluate their own innovation activity, explore the extent to which their organization is nominally innovative or whether or not innovation is embedded throughout their organization, and identify areas for improvement. [source]


Enterprise Risk Management: Theory and Practice

JOURNAL OF APPLIED CORPORATE FINANCE, Issue 4 2006
Brian W. Nocco
The Chief Risk Officer of Nationwide Insurance teams up with a distinguished academic to discuss the benefits and challenges associated with the design and implementation of an enterprise risk management program. The authors begin by arguing that a carefully designed ERM program, one in which all material corporate risks are viewed and managed within a single framework, can be a source of long-run competitive advantage and value through its effects at both a "macro" or company-wide level and a "micro" or business-unit level. At the macro level, ERM enables senior management to identify, measure, and limit to acceptable levels the net exposures faced by the firm. By managing such exposures mainly with the idea of cushioning downside outcomes and protecting the firm's credit rating, ERM helps maintain the firm's access to capital and other resources necessary to implement its strategy and business plan. At the micro level, ERM adds value by ensuring that all material risks are "owned," and risk-return tradeoffs carefully evaluated, by operating managers and employees throughout the firm. To this end, business unit managers at Nationwide are required to provide information about major risks associated with all new capital projects, information that can then be used by senior management to evaluate the marginal impact of the projects on the firm's total risk. And to encourage operating managers to focus on the risk-return tradeoffs in their own businesses, Nationwide's periodic performance evaluations of its business units attempt to reflect their contributions to total risk by assigning risk-adjusted levels of "imputed" capital on which project managers are expected to earn adequate returns. The second, and by far the larger, part of the article provides an extensive guide to the process and major challenges that arise when implementing ERM, along with an account of Nationwide's approach to dealing with them. Among other issues, the authors discuss how a company should assess its risk "appetite," measure how much risk it is bearing, and decide which risks to retain and which to transfer to others. Consistent with the principle of comparative advantage it uses to guide such decisions, Nationwide attempts to limit "non-core" exposures, such as interest rate and equity risk, thereby enlarging the firm's capacity to bear the "information-intensive, insurance-specific" risks at the core of its business and competencies. [source]
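
The "imputed capital" mechanism at the end of the abstract reduces to a simple risk-adjusted return calculation; the numbers and hurdle-rate logic below are hypothetical, intended only to illustrate the kind of business-unit evaluation being described, not Nationwide's actual methodology.

    def risk_adjusted_performance(unit_earnings, imputed_capital, hurdle_rate):
        # Return on the risk-adjusted ('imputed') capital assigned to a unit,
        # and the earnings in excess of the required return on that capital.
        return_on_capital = unit_earnings / imputed_capital
        excess_earnings = unit_earnings - hurdle_rate * imputed_capital
        return return_on_capital, excess_earnings

    # Hypothetical unit: $12M earnings on $100M of imputed capital, 10% hurdle rate.
    roc, excess = risk_adjusted_performance(12e6, 100e6, 0.10)
    print(f"return on imputed capital: {roc:.1%}, excess earnings: ${excess / 1e6:.1f}M")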


A logistics scheduling model: Inventory cost reduction by batching

NAVAL RESEARCH LOGISTICS: AN INTERNATIONAL JOURNAL, Issue 4 2005
Xiangtong Qi
Abstract Logistics scheduling refers to problems in which job scheduling and transportation decisions are integrated in a single framework. In this paper, we discuss a logistics scheduling model where the raw material is delivered to the shop in batches. By making the batching and scheduling decisions simultaneously, the total inventory and batch setup cost can be reduced. We study several models of this problem, present complexity analyses and optimal algorithms, and conduct computational experiments, from which we draw some managerial insights. © 2005 Wiley Periodicals, Inc. Naval Research Logistics, 2005. [source]
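
The batching trade-off described here has a familiar setup-versus-holding structure: fewer, larger batches mean fewer setups but more raw material sitting in inventory. The sketch below is a generic illustration under simple assumptions (constant parameters, a rough linear proxy for holding cost) and is not the paper's model or algorithm.

    import math

    def total_cost(num_jobs, batch_size, setup_cost, holding_cost_per_job):
        # Generic setup-vs-inventory trade-off: larger batches -> fewer setups,
        # but jobs delivered in a batch wait longer, accruing holding cost.
        num_batches = math.ceil(num_jobs / batch_size)
        setup_total = setup_cost * num_batches
        holding_total = holding_cost_per_job * num_jobs * (batch_size - 1) / 2
        return setup_total + holding_total

    # Scan candidate batch sizes for hypothetical cost parameters.
    costs = {b: total_cost(100, b, setup_cost=50.0, holding_cost_per_job=1.0)
             for b in range(1, 21)}
    best_batch = min(costs, key=costs.get)
    print(best_batch, costs[best_batch])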


A Resource-Process Framework of New Service Development

PRODUCTION AND OPERATIONS MANAGEMENT, Issue 2 2007
Craig M. Froehle
Motivated by the increasing attention given to the operational importance of developing new services, this paper offers a theoretical framework that integrates both process- and resource-oriented perspectives of new service development (NSD) by defining and organizing 45 practice constructs for NSD-related practices and activities that occur in contemporary service firms. We employ a rigorous procedure whereby both quantitative and qualitative data were gathered through multiple rounds of interviews and card-sorting exercises with senior service managers. This iterative refinement process helps ensure that the construct domains and definitions are consistent and that they are applicable across multiple service sectors. A primary contribution of this research is to provide precise operational definitions of theoretically important NSD practice constructs. Importantly, this study expands on the NSD literature by including both resource- and process-centric perspectives within a single framework. A second contribution is to illustrate a general methodology for developing clear, concise, and consistent construct definitions that may be generally useful for production and operations management scholars interested in new construct development for emerging areas. Empirical results suggest that the resource-process framework can help guide and organize future research on, and provide insight into, a more comprehensive view of new service development. [source]


PERSPECTIVE: Establishing an NPD Best Practices Framework

THE JOURNAL OF PRODUCT INNOVATION MANAGEMENT, Issue 2 2006
Kenneth B. Kahn
Achieving NPD best practices is a top-of-mind issue for many new product development (NPD) managers and is often an overarching implicit, if not explicit, goal. The question is: what does one mean when talking about NPD best practices? And how does a manager move toward achieving these? This article proposes a best practices framework as a starting point for much-needed discussion on this topic. Originally presented during the 2004 Product Development Management Association (PDMA) Research Conference in Chicago, the article and the authors' presentation spurred a significant, expansive discussion that included all conference attendees. Given the interest generated, the decision was made to move forward on a series of rejoinders on the topic of NPD best practice, using the Kahn, Barczak, and Moss framework as a focal launching point for these rejoinders. A total of five rejoinders were received and accompany the best practices framework in this issue of JPIM. Each rejoinder brings out a distinct issue because each of the five authors has a unique perspective. The first rejoinder is written by Dr. Marjorie Adams-Bigelow, director of the PDMA's Comparative Performance Assessment Study (CPAS), PDMA Foundation. Based on her findings during the CPAS study, Adams-Bigelow comments on the proposed framework, suggesting limitations in scope. She particularly points out discrepancies between the proposed framework and the framework offered by PDMA's emerging body of knowledge. Dr. Elko Kleinschmidt, professor of marketing and international business at McMaster University, wrote the second rejoinder. Based on his extensive research with Robert G. Cooper on NPD practices, he points out that best practices really raise more questions than answers. Thomas Kuczmarski, president of Kuczmarski and Associates, is the author of the third rejoinder. Kuczmarski highlights that company mindset and metrics are critical elements needing keen attention. Where do these fit, or should they, in the proposed framework? The fourth rejoinder is written by Richard Notargiacomo, consultant for the integrated product delivery process at Eastman Kodak Company. Notargiacomo compares the proposed framework to a best practices framework Kodak has used for new product commercialization and management since 1998. The distinction of the Kodak framework is the inclusion of a product maturity model component. Dr. Lois Peters, associate professor at Rensselaer Polytechnic Institute (RPI), is the author of the fifth rejoinder. She brings out issues of radical innovation, a natural focal issue of RPI's radical innovation project (RRIP). It is highlighted that radical innovation may require unique, distinctive process characteristics that a single framework cannot illustrate. Multiple layers of frameworks may be more appropriate, each corresponding to the level of innovation desired. The overall hope is that the discourse on best practices in this issue of JPIM generates more discussion and debate. Ultimately, the hope is that such discourse will lead to subsequent continued study to help discern what NPD best practice means for our discipline. [source]