Selected Abstracts

Signal representation and approximation: fundamental limits
EUROPEAN TRANSACTIONS ON TELECOMMUNICATIONS, Issue 5, 2007
Holger Boche

The expansion of functions in orthonormal bases is an important analytical and practical instrument in many different areas, such as signal processing, system and information theory, and communications. However, selecting an optimal basis is in general a non-trivial task and depends strongly on the performance measure of the concrete problem. This paper considers the basis selection problem for three different applications: a problem from system theory, entropy-based methods from information theory, and finally the peak-to-average power ratio problem in communication systems. In particular, it investigates under which conditions the problems are solvable, that is, under which conditions an appropriate basis exists. Copyright © 2007 John Wiley & Sons, Ltd.

Opportunities for research about managing the knowledge-based enterprise
INTERNATIONAL JOURNAL OF MANAGEMENT REVIEWS, Issue 1, 2001
D. Sandy Staples

Potentially valuable directions for new research into the management of knowledge-based enterprises are identified in this paper. This was done by reviewing relevant literature to develop research questions, using a model of knowledge-based capabilities to focus the review. The model highlights six knowledge capabilities: acquisition, creation, capture, storage, diffusion and transfer. A knowledge-based enterprise would have to engage in (if not excel at) these activities simply to manage its key resource: knowledge. Forty-two research questions were proposed based on the review. The focus of the research questions varies widely, representing potential opportunities for researchers from many different areas to further our understanding of managing knowledge-based enterprises.
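As an editorial illustration of the orthonormal-basis expansion central to the Boche abstract above, the sketch below projects a signal onto an orthonormal DCT-II basis and forms a best M-term approximation; the basis choice, signal, and all numbers are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Orthonormal DCT-II basis: rows Phi[k] are the basis vectors phi_k.
N = 64
n = np.arange(N)
Phi = np.sqrt(2.0 / N) * np.cos(np.pi * (n[None, :] + 0.5) * n[:, None] / N)
Phi[0] /= np.sqrt(2.0)                      # the k = 0 row needs 1/sqrt(2) scaling

assert np.allclose(Phi @ Phi.T, np.eye(N))  # orthonormality check

# A signal that is sparse in this basis by construction.
x = 2.0 * Phi[3] - 1.0 * Phi[10] + 0.25 * Phi[40]

c = Phi @ x                                 # expansion coefficients <x, phi_k>
assert np.allclose(Phi.T @ c, x)            # the full expansion reconstructs x exactly

# Best M-term approximation: keep only the M largest coefficients.
M = 8
keep = np.argsort(np.abs(c))[-M:]
c_M = np.zeros_like(c)
c_M[keep] = c[keep]
# Relative error is essentially zero here, since x lies in the span of
# three basis vectors; for generic signals it measures how well the
# chosen basis concentrates the signal's energy.
err = np.linalg.norm(x - Phi.T @ c_M) / np.linalg.norm(x)
print(f"relative M-term approximation error: {err:.2e}")
```

How strongly a candidate basis concentrates energy in few coefficients is exactly the kind of performance measure on which the basis selection problem in the abstract turns.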
Ab-initio simulations of materials using VASP: Density-functional theory and beyond
JOURNAL OF COMPUTATIONAL CHEMISTRY, Issue 13, 2008
Jürgen Hafner

Abstract: During the past decade, computer simulations based on a quantum-mechanical description of the interactions between electrons and between electrons and atomic nuclei have had an increasingly important impact on solid-state physics and chemistry and on materials science, promoting not only a deeper understanding, but also the possibility of contributing significantly to materials design for future technologies. This development rests on two pillars: (i) the improved description of electronic many-body effects within density-functional theory (DFT) and the emerging post-DFT methods, and (ii) the implementation of the new functionals and many-body techniques in highly efficient, stable, and versatile computer codes that exploit the potential of modern computer architectures. In this review, I discuss the implementation of various DFT functionals [local-density approximation (LDA), generalized gradient approximation (GGA), meta-GGA, and hybrid functionals mixing DFT and exact (Hartree-Fock) exchange] and post-DFT approaches [DFT + U for strong electronic correlations in narrow bands, many-body perturbation theory (GW) for quasiparticle spectra, and dynamical correlation effects via the adiabatic-connection fluctuation-dissipation theorem (AC-FDT)] in the Vienna ab initio simulation package VASP. VASP is a plane-wave all-electron code using the projector-augmented wave method to describe the electron-core interaction. The code uses fast iterative techniques for the diagonalization of the DFT Hamiltonian, and allows total-energy calculations and structural optimizations for systems with thousands of atoms, as well as ab initio molecular dynamics simulations for ensembles of a few hundred atoms extending over several tens of ps.
Applications in many different areas (structure and phase stability; mechanical and dynamical properties; liquids, glasses and quasicrystals; magnetism and magnetic nanostructures; semiconductors and insulators; surfaces, interfaces and thin films; chemical reactions; and catalysis) are reviewed. © 2008 Wiley Periodicals, Inc. J Comput Chem, 2008

GENERATION OF BIOLUMINESCENT MORGANELLA MORGANII AND ITS POTENTIAL USAGE IN DETERMINATION OF GROWTH LIMITS AND HISTAMINE PRODUCTION
JOURNAL OF FOOD SAFETY, Issue 2, 2009
MEHDI ZAREI

ABSTRACT: A mini-Tn5 promoter probe carrying the intact lux operon of Photorhabdus luminescens (pUT mini-Tn5 luxCDABE), which allowed measurement of light output without the addition of exogenous substrate, was constructed. It was used to create a pool of chromosomally lux-marked strains of Morganella morganii. Plasmid-mediated expression of bioluminescence in M. morganii was also assessed using plasmid pT7-3 luxCDABE. No significant differences in growth and histamine formation characteristics were observed between the lux-marked strains and the wild-type M. morganii strain. A luminescent strain of M. morganii was used in experiments in which the correlation between light output, viable cell count and histamine formation was assessed. During the exponential growth phase, a positive linear correlation was observed between these three parameters in trypticase soy broth-histidine medium at 37 °C. It was demonstrated that expression of bioluminescence had no significant effect on either growth rate or histamine production. Thus, the measurement of bioluminescence was found to be a simple, fast and reliable method for the determination of viable cell count and histamine content.

PRACTICAL APPLICATIONS: Constructing predictive models in microbiology requires a large amount of data on the factors of interest. The commonly used traditional methods of counting viable cells and measuring histamine, e.g., to model the growth limits of M. morganii as a function of different intrinsic and extrinsic factors, are time-consuming and laborious, and require a lot of laboratory space and materials. According to the results of this research, measurement of bioluminescence is a simple, fast and reliable method for the determination of viable cell count and histamine content during the exponential growth phase. Thus, it can be used as a labor- and material-saving data capture method for constructing predictive models in many different areas.

A method to determine protein digestibility of microdiets for larval and early juvenile fish
AQUACULTURE NUTRITION, Issue 6, 2009
J.M. HANSEN

Abstract: A method to evaluate protein quality using in vivo methods was developed for larval fish. FluoSpheres® fluorescent microspheres (10 µm) were incorporated into two test diets: our standard zein microdiet (ZMD) and a microdiet with identical ingredients except for the replacement of high-quality fish meal with the same product cooked for 24 h at 80 °C (ZMD-CF). Several trials were performed to design a reliable method to test digestibility using FluoSpheres® as a marker. The developed in vivo technique was tested on 35 days posthatch (dph) larval Atlantic cod (Gadus morhua L.) and two tropical fish species in the early juvenile stage. The method took into account loss of total protein to the faecal pellet and the water column. Apparent digestibility of protein in larval cod fed ZMD was significantly higher than that of larvae fed ZMD-CF (P < 0.05). A growth study to validate differences between the two diets showed significant differences in growth and survival of larvae fed ZMD versus ZMD-CF (P < 0.05). Further validation of our results was obtained through the use of a pH-stat method using enzymes extracted from 35 dph larval cod guts. This novel technique will be advantageous for researchers evaluating feed ingredients for larval marine fish and is adaptable to many different areas of larval fish nutrition.
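The positive linear correlation reported in the Zarei abstract is what makes bioluminescence usable as a proxy measurement: a calibration line is fitted from light output to viable count, after which counts can be read off luminescence alone. A minimal sketch of that idea, using hypothetical calibration figures that are illustrative assumptions only, not data from the paper:

```python
import numpy as np

# Hypothetical exponential-phase calibration data (illustrative only):
# log10 relative light units (RLU) vs. log10 viable count (CFU/mL).
log_rlu = np.array([2.1, 3.0, 4.2, 5.1, 6.0])
log_cfu = np.array([4.0, 5.0, 6.0, 7.0, 8.0])

# Least-squares calibration line: log10(CFU) = slope * log10(RLU) + intercept.
slope, intercept = np.polyfit(log_rlu, log_cfu, 1)

def cfu_from_luminescence(rlu: float) -> float:
    """Estimate viable count (CFU/mL) from a raw luminescence reading."""
    return 10 ** (slope * np.log10(rlu) + intercept)

# With this calibration, a reading of 10**4.2 RLU maps back to roughly 10**6 CFU/mL.
print(f"slope = {slope:.2f}, estimate = {cfu_from_luminescence(10**4.2):.3g} CFU/mL")
```

As the abstract stresses, the relation holds during the exponential growth phase; outside it, light output per cell drifts and the calibration no longer applies.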
Biochemistry of the envenomation response: a generator theme for interdisciplinary integration
BIOCHEMISTRY AND MOLECULAR BIOLOGY EDUCATION, Issue 2, 2010
Erik Montagna

Abstract: The understanding of complex physiological processes requires information from many different areas of knowledge. This interdisciplinary scenario demands the ability to integrate and articulate information. The difficulty of such an approach arises because, more often than not, information is fragmented across undergraduate education in the Health Sciences. Shifting from a fragmentary, in-depth view of many topics to joining them horizontally in a global view is not a trivial task for teachers to implement. To attain that objective we proposed the course described herein, Biochemistry of the Envenomation Response, aimed at integrating previous content of Health Sciences courses, following international recommendations for an interdisciplinary model. The content was organized in modules of increasing topic complexity. A full understanding of the envenoming pathophysiology of each module would be attained through the integration of knowledge from different disciplines. An active-learning strategy centered on concept map drawing was employed. Evaluation was obtained with a 30-item Likert-type survey answered by ninety students; 84% of the students considered that the number of relations they were able to establish, as seen in their concept maps, increased throughout the course. Similarly, 98% considered that both the theme and the strategy adopted in the course contributed to developing an interdisciplinary view.

On the Meaning of Words and Dinosaur Bones: Lexical Knowledge Without a Lexicon
COGNITIVE SCIENCE - A MULTIDISCIPLINARY JOURNAL, Issue 4, 2009
Jeffrey L. Elman

Abstract: Although for many years a sharp distinction has been made in language research between rules and words, with primary interest in rules, this distinction is now blurred in many theories.
If anything, the focus of attention has shifted in recent years in favor of words. Results from many different areas of language research suggest that the lexicon is representationally rich, that it is the source of much productive behavior, and that lexically specific information plays a critical and early role in the interpretation of grammatical structure. But how much information can or should be placed in the lexicon? This is the question I address here. I review a set of studies whose results indicate that event knowledge plays a significant role in early stages of sentence processing and structural analysis. This poses a conundrum for traditional views of the lexicon: either the lexicon must be expanded to include factors that do not plausibly seem to belong there, or else virtually all information about word meaning is removed, leaving the lexicon impoverished. I suggest a third alternative, which provides a way to account for lexical knowledge without a mental lexicon.