Tutorial

Terms modified by Tutorial

  • tutorial review

Selected Abstracts


    MULTIPLE RATERS IN SURVEY-BASED OPERATIONS MANAGEMENT RESEARCH: A REVIEW AND TUTORIAL

    PRODUCTION AND OPERATIONS MANAGEMENT, Issue 2 2000
    Kenneth K. Boyer
    Research in the area of operations strategy has made significant progress during the past decade in terms of quantity of articles published, as well as the quality of these articles. Recent studies have examined the published literature base and determined that, in general, the field has progressed beyond an exploratory stage to a point where there is a core set of basic terminology and models. Concurrent with the formation and solidification of a core terminology, there is an increasing emphasis on developing and employing a set of reliable, valid, and reproducible methods for conducting research on operations strategy. We provide a review of common methods for assessing the degree of reliability and agreement of the responses provided by multiple raters within a given organization to a set of qualitative questions. In particular, we examine four methods of determining whether there is evidence of disagreement or bias between multiple raters within a single organization in a mail survey. [source]
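
    The four agreement methods compared in the paper are not spelled out in this abstract, but the general idea of quantifying inter-rater agreement can be illustrated with a small, hedged Python sketch: it computes raw percent agreement and Cohen's kappa for two hypothetical raters. All ratings below are invented for demonstration and are not data from the study.

        # Minimal sketch: percent agreement and Cohen's kappa for two raters.
        # The ratings are hypothetical, for illustration only.
        import numpy as np

        rater_a = np.array([3, 2, 5, 4, 4, 1, 3, 5, 2, 4])   # e.g. Likert-scale answers
        rater_b = np.array([3, 3, 5, 4, 3, 1, 3, 5, 2, 5])

        p_observed = np.mean(rater_a == rater_b)              # raw percent agreement

        # Cohen's kappa corrects the observed agreement for chance agreement
        categories = np.union1d(rater_a, rater_b)
        p_a = np.array([np.mean(rater_a == c) for c in categories])
        p_b = np.array([np.mean(rater_b == c) for c in categories])
        p_expected = np.sum(p_a * p_b)
        kappa = (p_observed - p_expected) / (1.0 - p_expected)

        print(f"percent agreement = {p_observed:.2f}, Cohen's kappa = {kappa:.2f}")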


    STUDENTS' PERCEPTIONS OF WORKSHOP-BASED INTRODUCTORY MACROECONOMICS TUTORIALS: A SURVEY

    ECONOMIC PAPERS: A JOURNAL OF APPLIED ECONOMICS AND POLICY, Issue 3 2006
    AMEETA JAIN
    The declining popularity of Economics courses, evident in the last decade, has fuelled a debate on the nature of Economics units and the way in which they are taught in tertiary institutions. The effectiveness of traditional teaching methods has been questioned as lecturers search for alternative ways of presenting material and engaging students. In recent times, workshop-based/cooperative tutorials have become more popular in promoting deeper learning. This paper assesses the application of such an approach at a large tertiary institution. It evaluates student perceptions of this tutorial method in an Introductory Macroeconomics first-year unit. An anonymous questionnaire was used. Whilst the sample size is small (n = 56), the results are important in that this is the first such study in Macroeconomics. Students found workshop-based tutorials useful, preferred them over lecture-style tutorials, and found that they fostered inclusivity. The importance of tutorials per se is reiterated. Students state that tutorials are an important adjunct to lectures. This study also looks at students' study habits, finding that on average they spend less than one hour per week studying Economics and most prepare only occasionally for tutorials. The sample studied indicates that there are notable differences in the perceptions of tutorials and teaching methods between the genders and between local and international students. This may impact on the way in which tutorials are conducted effectively. [source]


    THE EFFECTIVENESS OF COLLABORATIVE PROBLEM-SOLVING TUTORIALS IN INTRODUCTORY MICROECONOMICS

    ECONOMIC PAPERS: A JOURNAL OF APPLIED ECONOMICS AND POLICY, Issue 4 2000
    JOHN MARANGOS
    First page of article [source]


    VIDEOCONFERENCING SURGICAL TUTORIALS: BRIDGING THE GAP

    ANZ JOURNAL OF SURGERY, Issue 4 2008
    Andrew J. A. Holland
    The expansion in medical student numbers has been associated with a move to increase the amount of time students spend in rural and remote locations. Providing an equivalent educational experience for students in surgical subspecialties in this setting is a logistical challenge. We sought to address this issue by providing synchronous tutorials in paediatric surgery using videoconferencing (VC) at two rural sites with the tutor located at a metropolitan paediatric clinical school. Between March 2005 and July 2006, 43 graduate students in the University of Sydney Medical Program were assigned to receive the paediatric component of the course at one of two sites within the School of Rural Health. During this 9-week rotation, students were involved in two or three surgical tutorials by videoconference. Students were then invited to complete a confidential, anonymous 20-point structured evaluation using a Likert scale. Valid responses were received from 40 students, a response rate of 93%. There were 21 females (52%), with 21 students based in Dubbo and 19 in Orange. Students agreed or strongly agreed that VC surgical tutorials were useful, the content well covered and student involvement encouraged (mean scores 4.7, 4.5 and 4.5; standard deviation 0.56, 0.72 and 0.72, respectively). Overall, the majority of students strongly agreed that participation in VC of surgical tutorials was valuable (mean 4.68, standard deviation 0.57). VC surgical tutorials were highly valued by graduate medical students as an educational method. Our data suggest that tutorials can be successfully provided at remote sites using VC. [source]


    Cases in Medical Ethics and Law: An Interactive Tutorial – By David Lloyd, Heather Widdows and Donna Dickenson

    DEVELOPING WORLD BIOETHICS, Issue 1 2007
    ANNE POPE
    No abstract is available for this article. [source]


    The Use of Reference Materials: A Tutorial

    GEOSTANDARDS & GEOANALYTICAL RESEARCH, Issue 1 2001
    Jean S. Kane
    Keywords: reference materials; certified reference materials; method validation; traceability of measurement; geochemical analysis. Any review of the analytical literature shows that, while reference materials are routinely used in laboratories world-wide, not all uses follow ISO Guide 33 (1989), which outlines best practices. Analytical data quality can suffer as a result. This paper reviews the various uses that the geoanalytical community has made of reference materials from a historical perspective, and suggests improvements in practice that would more closely follow ISO Guide 33 recommendations. [source]
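
    One routine use of a certified reference material that ISO Guide 33 covers is checking a laboratory's result against the certified value within combined uncertainties. The hedged sketch below shows such a bias check in Python using a simple z-type score; the replicate values, certified value and uncertainties are invented, and the acceptance threshold |z| < 2 is a common rule of thumb rather than a quotation from the Guide.

        # Minimal sketch: compare lab results on a CRM with its certified value.
        # All numbers are hypothetical, for illustration only.
        import numpy as np

        measurements = np.array([10.12, 10.08, 10.15, 10.11, 10.09])  # lab replicates (e.g. wt%)
        certified_value = 10.00
        certified_uncertainty = 0.05      # standard uncertainty of the certified value

        lab_mean = measurements.mean()
        lab_uncertainty = measurements.std(ddof=1) / np.sqrt(len(measurements))

        # z-type score: measured bias relative to the combined standard uncertainty
        z = (lab_mean - certified_value) / np.sqrt(lab_uncertainty**2 + certified_uncertainty**2)
        print(f"bias = {lab_mean - certified_value:.3f}, z-score = {z:.2f}")
        print("bias not significant" if abs(z) < 2 else "bias significant at ~95% level")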


    Tutorial on Computational Linguistic Phylogeny

    LINGUISTICS & LANGUAGE COMPASS (ELECTRONIC), Issue 5 2008
    Johanna Nichols
    Over the last 10 or more years, there has been a tremendous increase in the use of computational techniques (many of which come directly from biology) for estimating evolutionary histories (i.e., phylogenies) of languages. This tutorial surveys the different methods and different types of linguistic data that have been used to estimate phylogenies, explains the scientific and mathematical foundations of phylogenetic estimation, and presents methodologies for evaluating a phylogeny estimation method. [source]
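
    As a toy illustration of one family of methods the tutorial covers (distance-based phylogeny estimation), the sketch below builds a Hamming distance matrix from hypothetical binary cognate-class codings and clusters the languages with average linkage (UPGMA-style) using SciPy; the languages and codings are invented, and real analyses would use richer data and character-based methods.

        # Minimal sketch: a distance-based tree from binary cognate codings.
        # Languages and codings are hypothetical, for illustration only.
        import numpy as np
        from scipy.spatial.distance import pdist
        from scipy.cluster.hierarchy import average

        languages = ["LangA", "LangB", "LangC", "LangD"]
        # Rows: languages; columns: presence/absence of a cognate class
        cognates = np.array([
            [1, 1, 0, 1, 0, 1, 1, 0],
            [1, 1, 0, 1, 1, 1, 0, 0],
            [0, 1, 1, 0, 1, 0, 0, 1],
            [0, 0, 1, 0, 1, 0, 0, 1],
        ])

        dist = pdist(cognates, metric="hamming")   # proportion of characters that differ
        tree = average(dist)                        # UPGMA-style average-linkage clustering
        print(tree)                                 # linkage matrix: merge order and heights
        # scipy.cluster.hierarchy.dendrogram(tree, labels=languages) would draw the tree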


    Tutorials in Clinical Research: Part VII.

    THE LARYNGOSCOPE, Issue 9 2003
    Part A: General Concepts of Statistical Significance, Understanding Comparative Statistics (Contrast)
    Abstract Objectives/Hypothesis The present tutorial is the seventh in a series of Tutorials in Clinical Research. The specific purpose of the tutorial (Part A) and its sequel (Part B) is to introduce and explain three commonly used statistical tools for assessing contrast in the comparison between two groups. Study Design Tutorial. Methods The authors met weekly for 10 months discussing clinical research studies and the applied statistics. The difficulty was not in the material but in the effort to make the report easy to read and as short as possible. Results The tutorial is organized into two parts. Part A, which is the present report, focuses on the fundamental concepts of the null hypothesis and comparative statistical significance. The sequel, Part B, discusses the application of three common statistical indexes of contrast, the χ2, Mann-Whitney U, and Student t tests. Conclusions Assessing the validity of medical studies requires a working knowledge of research design and statistics; obtaining this knowledge need not be beyond the ability of the busy surgeon. The authors have tried to construct an accurate, easy-to-read, easy-to-apply, basic introduction to comparing two groups. The long-term goal of the present tutorial and others in the series is to facilitate basic understanding of clinical research, thereby stimulating reading of some of the numerous well-written research design and statistical texts. This knowledge may then be applied to the continuing educational review of the literature and the systematic prospective analysis of individual practices. [source]
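
    To make the three indexes of contrast concrete, the hedged sketch below applies a Student t test, a Mann-Whitney U test and a chi-squared test to invented two-group data using SciPy; it illustrates the mechanics only and is not drawn from the tutorial itself.

        # Minimal sketch: chi-squared, Mann-Whitney U and Student t tests on invented data.
        import numpy as np
        from scipy import stats

        # Continuous outcome in two groups (e.g. a symptom score)
        group_1 = np.array([12.1, 14.3, 11.8, 13.5, 15.0, 12.7, 13.9])
        group_2 = np.array([10.2, 11.5,  9.8, 12.0, 10.9, 11.1, 10.4])

        t_stat, t_p = stats.ttest_ind(group_1, group_2)       # Student t test (equal variances)
        u_stat, u_p = stats.mannwhitneyu(group_1, group_2)    # Mann-Whitney U test

        # Categorical outcome: 2x2 table of improved / not improved by group
        table = np.array([[18, 7],
                          [11, 14]])
        chi2, chi_p, dof, expected = stats.chi2_contingency(table)

        print(f"t = {t_stat:.2f} (p = {t_p:.3f}); U = {u_stat:.1f} (p = {u_p:.3f}); "
              f"chi2 = {chi2:.2f} (p = {chi_p:.3f})")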


    Tutorials in Clinical Research: Part III.

    THE LARYNGOSCOPE, Issue 5 2001
    Selecting a Research Approach to Best Answer a Clinical Question
    Abstract Objective This is the third in a series of sequential "Tutorials in Clinical Research."1,2 The objectives of this specific report are to enable the reader to rapidly dissect a clinical question or article to efficiently determine what critical mass of information is required to answer the question and what study design is likely to produce the answer. Study Design Tutorial. Methods The authors met weekly for 3 months exploring clinical problems and systematically recording the logic and procedural pathways from multiple clinical questions to the selection of proper research approaches. The basic elements required to understand the processes of selection were catalogued and field tested, and a report was produced to define and explain these elements. Results Fundamental to a research approach is the assembly of subjects and the allocation of exposures. An algorithm leading to the selection of an approach is presented. The report is organized into three parts. The tables serve as a rapid reference section. The initial two-part narrative explains the process of approach selection. The examples section illustrates the application of the selection algorithm. Conclusions Selecting the proper research approach has six steps: the question, logic and ethics, identification of variables, data display considerations, original data source considerations, and selection of prototypical approaches for assembly of subjects. Field tests of this approach consistently demonstrated its utility. [source]


    Building the Powerful 10-Minute Office Visit: Part II.

    THE LARYNGOSCOPE, Issue 1 2001
    Beginning a Critical Literature Review
    Abstract Objective This is the second part in a series of sequential Tutorials in Clinical Research. The objective of this tutorial is to introduce methods of searching the vast stores of information now available, to review some of the computer resources available, to reintroduce the concept of an a priori design for the search, and to reveal the need for assessment of the clinical importance and validity of each pertinent article found. Study Design Tutorial. Methods An open working group has been formed with the specific aim of surveying and translating the large volume of complex information on research design and statistics into easily understood, useable, and non-threatening tutorials for the busy practitioner. The hypothesis under which this work is conducted is that highly intelligent, but extremely busy, surgeons are interested in evidence-based medicine and will increase personal participation in critical reading of the literature, pending an expanded familiarity with clinical research design and statistics. Results Available resources for literature searching, methods of quick personal overviews, and quick question-specific reviews are discussed. Additionally, the methods, with examples, of beginning a critical literature review are presented. Conclusions Rapid, personal, critical literature review requires succinct formulation of the question, efficient search for the best available evidence, and critical appraisal of the pertinent individual articles to determine if sufficient evidence exists to support a clinical contention. [source]


    Tutorial: Using Confidence Curves in Medical Research

    BIOMETRICAL JOURNAL, Issue 2 2005
    Ralf Bender
    Abstract Confidence intervals represent a routinely used standard method to document the uncertainty of estimated effects. In most cases, for the calculation of confidence intervals the conventional fixed 95% confidence level is used. Confidence curves represent a graphical illustration of confidence intervals for confidence levels varying between 0 and 100%. Although such graphs have been repeatedly proposed under different names during the last 40 years, confidence curves are rarely used in medical research. In this paper, we introduce confidence curves and present a short historical review. We draw attention to the different interpretation of one- and two-sided statistical inference. It is shown that these two options also have influence on the plotting of appropriate confidence curves. We illustrate the use of one- and two-sided confidence curves and explain their correct interpretation. In medical research more emphasis on the choice between the one- and two-sided approaches should be given. One- and two-sided confidence curves are useful complements to the conventional methods of presenting study results. (© 2005 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source]
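
    The idea of a confidence curve can be reproduced in a few lines: compute the two-sided confidence interval at every level between 0 and 100% and plot the interval bounds against the level. The sketch below does this for a hypothetical effect estimate under a normal approximation; the estimate, standard error and reference line are invented and are not the paper's examples.

        # Minimal sketch: a two-sided confidence curve for an effect estimate (normal approximation).
        # Estimate and standard error are hypothetical.
        import numpy as np
        from scipy import stats
        import matplotlib.pyplot as plt

        estimate = 2.4          # e.g. estimated treatment effect
        std_error = 1.1

        levels = np.linspace(0.001, 0.999, 500)          # confidence levels from ~0 to ~100%
        z = stats.norm.ppf(0.5 + levels / 2.0)           # two-sided critical values
        lower = estimate - z * std_error
        upper = estimate + z * std_error

        plt.plot(lower, levels * 100, color="tab:blue")
        plt.plot(upper, levels * 100, color="tab:blue")
        plt.axvline(0.0, linestyle="--", color="grey")   # reference value (no effect)
        plt.xlabel("effect size")
        plt.ylabel("confidence level (%)")
        plt.title("Two-sided confidence curve (illustrative)")
        plt.show()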


    A Survey of Model Evaluation Approaches With a Tutorial on Hierarchical Bayesian Methods

    COGNITIVE SCIENCE - A MULTIDISCIPLINARY JOURNAL, Issue 8 2008
    Richard M. Shiffrin
    Abstract This article reviews current methods for evaluating models in the cognitive sciences, including theoretically based approaches, such as Bayes factors and minimum description length measures; simulation approaches, including model mimicry evaluations; and practical approaches, such as validation and generalization measures. This article argues that, although often useful in specific settings, most of these approaches are limited in their ability to give a general assessment of models. This article argues that hierarchical methods, generally, and hierarchical Bayesian methods, specifically, can provide a more thorough evaluation of models in the cognitive sciences. This article presents two worked examples of hierarchical Bayesian analyses to demonstrate how the approach addresses key questions of descriptive adequacy, parameter inference, prediction, and generalization in principled and coherent ways. [source]
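
    As a generic illustration of the hierarchical Bayesian approach (not one of the paper's two worked examples), the sketch below runs a small Gibbs sampler for a normal hierarchical model in which group means are drawn from a population distribution; the data, the known observation variance and the inverse-gamma prior are all assumptions made for the demonstration.

        # Minimal sketch: Gibbs sampler for a normal hierarchical model
        #   y_ij ~ N(theta_j, sigma^2),  theta_j ~ N(mu, tau^2)
        # with a flat prior on mu and an inverse-gamma prior on tau^2.
        # Data and prior settings are hypothetical, for illustration only.
        import numpy as np

        rng = np.random.default_rng(0)
        groups = [rng.normal(loc=m, scale=1.0, size=8) for m in (0.5, 1.0, 1.5, 2.5)]
        sigma2 = 1.0                          # observation variance, taken as known here
        n = np.array([len(g) for g in groups])
        ybar = np.array([g.mean() for g in groups])
        J = len(groups)

        a0, b0 = 1.0, 1.0                     # inverse-gamma prior on tau^2
        mu, tau2 = 0.0, 1.0
        mu_draws, tau2_draws = [], []

        for it in range(5000):
            # Group means: precision-weighted compromise between data and population mean
            prec = n / sigma2 + 1.0 / tau2
            mean = (n * ybar / sigma2 + mu / tau2) / prec
            theta = rng.normal(mean, np.sqrt(1.0 / prec))
            # Population mean, given the group means
            mu = rng.normal(theta.mean(), np.sqrt(tau2 / J))
            # Between-group variance: conjugate inverse-gamma update
            a_post = a0 + J / 2.0
            b_post = b0 + 0.5 * np.sum((theta - mu) ** 2)
            tau2 = 1.0 / rng.gamma(a_post, 1.0 / b_post)
            if it >= 1000:                    # discard burn-in
                mu_draws.append(mu)
                tau2_draws.append(tau2)

        print("posterior mean of mu  :", np.mean(mu_draws))
        print("posterior mean of tau2:", np.mean(tau2_draws))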


    When Are Tutorial Dialogues More Effective Than Reading?

    COGNITIVE SCIENCE - A MULTIDISCIPLINARY JOURNAL, Issue 1 2007
    Kurt VanLehn
    It is often assumed that engaging in a one-on-one dialogue with a tutor is more effective than listening to a lecture or reading a text. Although earlier experiments have not always supported this hypothesis, this may be due in part to allowing the tutors to cover different content than the noninteractive instruction. In 7 experiments, we tested the interaction hypothesis under the constraint that (a) all students covered the same content during instruction, (b) the task domain was qualitative physics, (c) the instruction was in natural language as opposed to mathematical or other formal languages, and (d) the instruction conformed with a widely observed pattern in human tutoring: Graesser, Person, and Magliano's 5-step frame. In the experiments, we compared 2 kinds of human tutoring (spoken and computer mediated) with 2 kinds of natural-language-based computer tutoring (Why2-Atlas and Why2-AutoTutor) and 3 control conditions that involved studying texts. The results depended on whether the students' preparation matched the content of the instruction. When novices (students who had not taken college physics) studied content that was written for intermediates (students who had taken college physics), then tutorial dialogue was reliably more beneficial than less interactive instruction, with large effect sizes. When novices studied material written for novices or intermediates studied material written for intermediates, then tutorial dialogue was not reliably more effective than the text-based control conditions. [source]


    Teaching, Exploring, Learning – Developing Tutorials for In-Class Teaching and Self-Learning

    COMPUTER GRAPHICS FORUM, Issue 4 2007
    S. Beckhaus
    Abstract This paper presents an experience report on a novel approach for a course on intermediate and advanced computer graphics topics. The approach uses Teachlet Tutorials, a combination of traditional seminar-type teaching with interactive exploration of the content by the audience, plus development of self-contained tutorials on the topic. In addition to a presentation, an interactive software tool is developed by the students to assist the audience in learning and exploring the topic's details. This process is guided through set tasks. The resulting course material is developed for two different contexts: (a) for classroom presentation and (b) as an interactive, self-contained, self-learning tutorial. The overall approach results in a more thorough understanding of the topic both for the student teachers as well as for the class participants. In addition to detailing the Teachlet Tutorial approach, this paper presents our experiences implementing the approach in our Advanced Computer Graphics course and presents the resultant projects. Most of the final Teachlet Tutorials were surprisingly good and we had excellent feedback from the students on the approach and course. [source]


    Diodenlaser sind im Kommen! (Diode lasers are on the rise!)

    LASER TECHNIK JOURNAL, Issue 2 2007
    Chefredakteur Andreas Thoß Dr.
    In November of last year, the Fraunhofer IWS in Dresden hosted an exciting workshop on "Industrial Applications of High-Power Diode Lasers". The talks gave the floor not only to diode laser manufacturers and users but also to micro-optics companies, and it was particularly from this side that some interesting thoughts came on comparing high-power diode lasers with the "new" solid-state lasers. The point was that the system of "diode laser plus micro-optics plus fibre", now delivering up to 10 kW of output power, is thoroughly competitive. Besides the clearly improved parameters of the diode lasers, the know-how of efficiently combining diode, optics and fibre is a decisive factor. The difference is still visible in the fibre diameter, but even there it is of secondary importance for some applications whether the beam source is a fibre laser, a disk laser or "only" a diode laser. The outstanding beam quality of fibre and disk lasers certainly enables some new applications, but for others it is unnecessary; and if the beam is shaped by micro-optics anyway, for example into a narrow line, the diode laser can indeed be superior to the other systems. Added to this is price as a (decisive) argument, and here too much speaks in favour of the diode laser. Besides the comparatively simple construction and the high efficiency, the manufacturers saw further advantages: a positive market development for diode-pumped lasers will also affect the price of pump diodes and thus, of course, the price of high-power diode laser systems. In addition, the further development of high-power diode lasers is being strongly supported within the BRIOLAS funding programme (BMBF), which comprises both the development of a new generation of brilliant diode modules (BRILASI) and the automation of assembly technology (INLAS). One may therefore expect diode lasers to become even better and even cheaper, and diode laser applications will accordingly receive more attention in the coming years. Some examples of new developments in this field can be found in this issue under the heading "Diodenlaseranwendung" (diode laser applications). An important prerequisite for the development of new fields of application is a detailed understanding of the processes involved in welding. The various process-monitoring systems (such as the one on the cover) already provide a wealth of information, which is, however, always limited by the type of detector; numerical simulations promise deeper insights here. The article by Peter Berger (IFSW Stuttgart) on page 31 offers an introduction to this complex field. The topic of the tutorial in this issue is a perennial one: the measurement of beam parameters of high-power lasers. A lot has happened here in recent years; by now there are not only new standards but also a wide variety of detectors. The article on page 46 provides an introduction to the topic and an overview of what can be measured with which detector. [source]


    Strategies for the numerical integration of DAE systems in multibody dynamics

    COMPUTER APPLICATIONS IN ENGINEERING EDUCATION, Issue 2 2004
    E. Pennestrì
    Abstract The number of multibody dynamics courses offered in the university is increasing. Often the instructor needs to go through the steps of an algorithm by working out a simple example, which gives the student a better understanding of the basic theory. This paper provides a tutorial on the numerical integration of differential-algebraic equations (DAE) arising from the dynamic modeling of multibody mechanical systems. In particular, some algorithms based on the orthogonalization of the Jacobian matrix are herein discussed. All the computational steps involved are explained in detail by working out a simple example. A brief description and an application of the multibody code NumDyn3D, which uses the Singular Value Decomposition (SVD) approach, are also reported. © 2004 Wiley Periodicals, Inc. Comput Appl Eng Educ 12: 106-116, 2004; Published online in Wiley InterScience (www.interscience.wiley.com); DOI 10.1002/cae.20005 [source]
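
    As a hedged illustration of the kind of algorithm tutored in the paper (not the NumDyn3D code itself), the sketch below integrates the planar pendulum written as a DAE in Cartesian coordinates, using the SVD of the constraint Jacobian at each step to split the accelerations into a particular solution of the constraint equation and a null-space (tangent) component; the mass, length and tolerances are arbitrary, and no constraint stabilization is applied.

        # Minimal sketch: SVD-based projection for the planar pendulum written as a DAE.
        # q = (x, y), constraint Phi = x^2 + y^2 - L^2 = 0; generic illustration only.
        import numpy as np
        from scipy.integrate import solve_ivp

        m, g, L = 1.0, 9.81, 1.0
        M = m * np.eye(2)                                   # mass matrix

        def accel(q, qd):
            G = np.array([[2.0 * q[0], 2.0 * q[1]]])        # constraint Jacobian (1 x 2)
            gamma = np.array([-2.0 * (qd @ qd)])            # rhs of G qdd = gamma
            f = np.array([0.0, -m * g])                     # applied forces
            # SVD of G: right singular vectors beyond the rank span the null space
            U, s, Vt = np.linalg.svd(G)
            R = Vt[1:].T                                    # tangent (null-space) basis, 2 x 1
            qdd_p = np.linalg.pinv(G) @ gamma               # particular solution of the constraint
            u = np.linalg.solve(R.T @ M @ R, R.T @ (f - M @ qdd_p))
            return R @ u + qdd_p                            # consistent acceleration

        def rhs(t, y):
            q, qd = y[:2], y[2:]
            return np.concatenate([qd, accel(q, qd)])

        y0 = [L, 0.0, 0.0, 0.0]                             # start horizontal, at rest
        sol = solve_ivp(rhs, (0.0, 5.0), y0, rtol=1e-8, atol=1e-10)
        drift = sol.y[0] ** 2 + sol.y[1] ** 2 - L ** 2       # constraint drift (no stabilization)
        print("max |constraint drift| =", np.abs(drift).max())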


    Predicting species distributions from museum and herbarium records using multiresponse models fitted with multivariate adaptive regression splines

    DIVERSITY AND DISTRIBUTIONS, Issue 3 2007
    Jane Elith
    ABSTRACT Current circumstances – that the majority of species distribution records exist as presence-only data (e.g. from museums and herbaria), and that there is an established need for predictions of species distributions – mean that scientists and conservation managers seek to develop robust methods for using these data. Such methods must, in particular, accommodate the difficulties caused by lack of reliable information about sites where species are absent. Here we test two approaches for overcoming these difficulties, analysing a range of data sets using the technique of multivariate adaptive regression splines (MARS). MARS is closely related to regression techniques such as generalized additive models (GAMs) that are commonly and successfully used in modelling species distributions, but has particular advantages in its analytical speed and the ease of transfer of analysis results to other computational environments such as a Geographic Information System. MARS also has the advantage that it can model multiple responses, meaning that it can combine information from a set of species to determine the dominant environmental drivers of variation in species composition. We use data from 226 species from six regions of the world, and demonstrate the use of MARS for distribution modelling using presence-only data. We test whether (1) the type of data used to represent absence or background and (2) the signal from multiple species affect predictive performance, by evaluating predictions at completely independent sites where genuine presence-absence data were recorded. Models developed with absences inferred from the total set of presence-only sites for a biological group, and using simultaneous analysis of multiple species to inform the choice of predictor variables, performed better than models in which species were analysed singly, or in which pseudo-absences were drawn randomly from the study area. The methods are fast, relatively simple to understand, and useful for situations where data are limited. A tutorial is included. [source]
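
    The overall workflow evaluated in the paper (presence records, inferred absences or background points, then a flexible regression) can be sketched as follows. Because MARS is not part of the mainstream scientific Python stack, the example substitutes an ordinary logistic regression from scikit-learn for the MARS step and models a single species; the environment, presence records and background sample are all synthetic.

        # Minimal sketch of presence-only modelling with background (pseudo-absence) points.
        # Logistic regression stands in for MARS here; all data are synthetic.
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(1)

        # Synthetic environment: two predictors (e.g. temperature, rainfall) at each site
        n_sites = 2000
        env = rng.uniform(0.0, 1.0, size=(n_sites, 2))

        # Synthetic "true" suitability and presence-only records sampled from it
        suitability = 1.0 / (1.0 + np.exp(-(4.0 * env[:, 0] - 3.0 * env[:, 1])))
        presence_idx = rng.choice(n_sites, size=150, p=suitability / suitability.sum(), replace=False)

        # Background points drawn at random from the study area (one of the strategies tested)
        background_idx = rng.choice(n_sites, size=500, replace=False)

        X = np.vstack([env[presence_idx], env[background_idx]])
        y = np.concatenate([np.ones(len(presence_idx)), np.zeros(len(background_idx))])

        model = LogisticRegression(max_iter=1000).fit(X, y)   # stand-in for the MARS fit
        prediction = model.predict_proba(env)[:, 1]           # relative suitability over all sites
        print("predicted suitability range:", prediction.min().round(3), "-", prediction.max().round(3))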


    How do hydraulic vibrators work?

    GEOPHYSICAL PROSPECTING, Issue 1 2010
    A look inside the black box
    ABSTRACT In order to have realistic expectations of what output is achievable from a seismic vibrator, an understanding of the machine's limitations is essential. This tutorial is intended to provide some basics on how hydraulic vibrators function and the constraints that arise from their design. With these constraints in mind, informed choices can be made to match machine specifications to a particular application or sweeps can be designed to compensate for performance limits. [source]


    Developing and evaluating an interactive information skills tutorial

    HEALTH INFORMATION & LIBRARIES JOURNAL, Issue 2 2006
    Maria J. Grant
    Objective: To develop and evaluate a web-based interactive information skills tutorial integrated into the curriculum. To determine whether the tutorial was acceptable to students and explore the use of a skills assessment tool in identifying whether the tutorial improved skills. Methods: The development of a tutorial on OVID medline to teach transferable information skills. A small cohort study to evaluate students' views on the tutorial and its effects on information skills. Results: Thirteen objective assessments were usable. There was a statistically significant improvement in mean final assessment scores, compared with mean pre-training scores, F(2,14) = 11.493, P = 0.001. Eleven (85%) students had improved their overall information skills. The improvement in overall searching skills was enhanced by referral to the tutorial. Conclusions: The tutorial was successfully developed and integrated into a Masters programme curriculum. In this setting, it appears to reinforce active learning, and was well received by students, who developed core generic searching skills and demonstrated improved information skills in the short and longer term. Students could use the tutorial for revision and study at a time and place of their choosing. Further evaluation is required to assess the impact of using the tutorial with large groups of students, and as a stand-alone teaching medium. [source]


    Iterative learning control and repetitive control in hard disk drive industry – A tutorial

    INTERNATIONAL JOURNAL OF ADAPTIVE CONTROL AND SIGNAL PROCESSING, Issue 4 2008
    YangQuan Chen
    Abstract This paper presents a tutorial on iterative learning control (ILC) and repetitive control (RC) techniques in the hard disk drive (HDD) industry for compensation of repeatable runouts (RRO). After each tutorial, an application example is given. For ILC, a simple filtering-free implementation for written-in RRO compensation is presented. For the RC part, a new application of RC in dual-stage HDD servo is presented. Copyright © 2007 John Wiley & Sons, Ltd. [source]
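
    A basic ILC update of the form u_{k+1}(t) = u_k(t) + L e_k(t) can be demonstrated on a toy repetitive tracking task; the sketch below uses a hypothetical first-order discrete plant and learning gain, and it does not reproduce the filtering-free written-in RRO scheme or the dual-stage RC design described in the paper.

        # Minimal sketch: P-type iterative learning control on a first-order discrete plant.
        # Plant, gains and reference are hypothetical; this is not the HDD scheme from the paper.
        import numpy as np

        N = 100                                   # samples per repetition (one "revolution")
        t = np.arange(N)
        reference = np.sin(2.0 * np.pi * t / N)   # repetitive reference to track

        a, b = 0.9, 0.1                           # plant: x[k+1] = a*x[k] + b*u[k], y = x
        L_gain = 2.0                              # learning gain

        def run_trial(u):
            x, y = 0.0, np.zeros(N)
            for k in range(N):
                y[k] = x
                x = a * x + b * u[k]
            return y

        u = np.zeros(N)
        for iteration in range(30):
            y = run_trial(u)
            e = reference - y
            u = u + L_gain * np.roll(e, -1)       # P-type update with a one-sample lead
            if iteration % 10 == 0 or iteration == 29:
                print(f"iteration {iteration:2d}: RMS error = {np.sqrt(np.mean(e**2)):.4f}")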


    A tutorial on using genetic algorithms for the design of network topology

    INTERNATIONAL JOURNAL OF NETWORK MANAGEMENT, Issue 4 2006
    Bassam Al-Bassam
    The design of network topology is an important part of network design, since network topology is directly associated with network operational behavior, capacity, reliability, and cost. This tutorial paper illustrates how the optimization capabilities of genetic algorithms can be used to design suitable network topologies, considering basic topology problems. Simple genetic algorithms have been developed for the topology problem of mesh networks, considering single node and single link failure tolerance. The algorithms are based on criteria of two important measures: minimizing the length of communication links; and minimizing traffic flow through these links for given traffic loads. The first measure contributes to minimizing the cost of cabling, while the second measure contributes to minimizing the cost of link capacity. The work provides a useful approach and tools to network students and professionals concerned with the topology design of backbone networks. The developed software is made available on the Internet. Copyright © 2006 John Wiley & Sons, Ltd. [source]
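
    The loop of such a genetic algorithm (encode candidate topologies as link bit strings, score them, select, cross over, mutate) can be sketched as below. The node coordinates, population size, mutation rate and fitness function are invented; the fitness here only minimizes cable length (the paper's first measure), and the failure-tolerance criteria are reduced to a plain connectedness check.

        # Minimal sketch: a genetic algorithm searching for a cheap, connected mesh topology.
        # Node positions and GA parameters are hypothetical.
        import itertools
        import numpy as np

        rng = np.random.default_rng(2)
        nodes = rng.uniform(0, 100, size=(8, 2))              # node coordinates (km, say)
        pairs = list(itertools.combinations(range(len(nodes)), 2))
        lengths = np.array([np.linalg.norm(nodes[i] - nodes[j]) for i, j in pairs])

        def connected(genome):
            # Depth-first search over the links switched on in the genome
            adj = {i: set() for i in range(len(nodes))}
            for bit, (i, j) in zip(genome, pairs):
                if bit:
                    adj[i].add(j); adj[j].add(i)
            seen, stack = {0}, [0]
            while stack:
                for nb in adj[stack.pop()]:
                    if nb not in seen:
                        seen.add(nb); stack.append(nb)
            return len(seen) == len(nodes)

        def fitness(genome):
            cable_cost = float(np.sum(lengths[genome == 1]))
            penalty = 0.0 if connected(genome) else 1e6       # crude feasibility penalty
            return cable_cost + penalty

        pop = rng.integers(0, 2, size=(40, len(pairs)))        # random initial population
        for generation in range(200):
            scores = np.array([fitness(g) for g in pop])
            parents = pop[np.argsort(scores)[:20]]             # truncation selection
            children = [parents[0].copy()]                     # elitism: keep the current best
            while len(children) < len(pop):
                a, b = parents[rng.integers(0, len(parents), size=2)]
                cut = rng.integers(1, len(pairs))              # single-point crossover
                child = np.concatenate([a[:cut], b[cut:]])
                flip = rng.random(len(pairs)) < 0.02           # mutation
                children.append(np.where(flip, 1 - child, child))
            pop = np.array(children)

        best = min(pop, key=fitness)
        print("best cable length:", round(fitness(best), 1), "km using", int(best.sum()), "links")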


    Fusion of digital television, broadband Internet and mobile communications – Part I: Enabling technologies

    INTERNATIONAL JOURNAL OF SATELLITE COMMUNICATIONS AND NETWORKING, Issue 4 2007
    F. L. C. Ong
    Abstract The introduction of digital video broadcasting (DVB) satellite systems has become an important tool for future mobile communication and is currently a focus in several research areas such as the integration of DVB satellite systems with different wireless technologies. This tutorial consists of two parts, Enabling technologies and Future service scenarios, which aim to provide an introduction to the current state-of-the-art of DVB standards over satellite and their fusion with mobile and Internet technologies. This paper, Enabling technologies, focuses on providing an overview of the different technologies and issues that facilitate a better understanding of the current and future operational scenarios, whereas the second paper, Future service scenarios, will emphasize future research directions in this research area. In the first part, the paper initially focuses on the introduction of the different DVB satellite systems, i.e. DVB via satellite (DVB-S), DVB return channel by satellite (DVB-RCS) and the second-generation DVB system for broadband satellite services (DVB-S2). This is then followed by a description of the different Internet Protocol (IP) technologies used to support macro- and micro-mobility and the migration strategies from IP version 4 (IPv4) to IP version 6 (IPv6). Finally, the different security mechanisms for the DVB system and end-to-end satellite network are addressed. Copyright © 2007 John Wiley & Sons, Ltd. [source]


    Radio resource management across multiple protocol layers in satellite networks: a tutorial overview

    INTERNATIONAL JOURNAL OF SATELLITE COMMUNICATIONS AND NETWORKING, Issue 5 2005
    Paolo Barsocchi
    Abstract Satellite transmissions have an important role in telephone communications, television broadcasting, computer communications, maritime navigation, and military command and control. Moreover, in many situations they may be the only possible communication set-up. Trends in telecommunications indicate that four major growth market/service areas are messaging and navigation services (wireless and satellite), mobility services (wireless and satellite), video delivery services (cable and satellite), and interactive multimedia services (fibre/cable, satellite). When using geostationary satellites (GEO), the long propagation delay may have great impact, given the end-to-end delay user's requirements of relevant applications; moreover, atmospheric conditions may seriously affect data transmission. Since satellite bandwidth is a relatively scarce resource compared to the terrestrial one (e.g. in optical transport networks), and the environment is harsher, resource management of the radio segment plays an important role in the system's efficiency and economy. The radio resource management (RRM) entity is responsible for the utilization of the air interface resources, and covers power control, handover, admission control, congestion control, bandwidth allocation, and packet scheduling. RRM functions are crucial for the best possible utilization of the capacity. RRM functions can be implemented in different ways, thus having an impact on the overall system efficiency. This tutorial aims to provide an overview of satellite transmission aspects at various OSI layers, with emphasis on the MAC layer; some cross-layer solutions for bandwidth allocation are also indicated. Far from being an exhaustive survey (mainly due to the extensive nature of the subject), it offers the readers an extensive bibliography, which could be used for further research on specific aspects. Copyright © 2005 John Wiley & Sons, Ltd. [source]


    Teaching diffraction using computer simulations over the Internet

    JOURNAL OF APPLIED CRYSTALLOGRAPHY, Issue 6 2001
    Th. Proffen
    Computer simulations are a versatile tool to enhance the teaching of diffraction physics and related crystallographic topics to students of chemistry, materials science, physics and crystallography. Interactive computer simulations are presented, which have been added to a World Wide Web (WWW) based tutorial. A simple WWW interface is used to choose appropriate values for selected simulation parameters. The resulting structure and diffraction pattern are then plotted on the screen. Simulated structures range from a single atom to complex disordered or modulated structures. The simple interface requires no special computing knowledge and allows students to explore systematically the relationship between a real-space structure and the corresponding diffraction pattern. The large function set of the underlying simulation program (DISCUS) makes it easy to tailor the tutorial to a given syllabus by modifying or extending the current interactive examples. [source]
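
    The real-space-to-diffraction relationship that the tutorial lets students explore reduces, for kinematic scattering, to a structure-factor sum F(Q) = sum_j f_j exp(i Q·r_j) with intensity |F|². The sketch below evaluates this for a hypothetical one-dimensional row of atoms with NumPy; it is independent of the DISCUS program and web interface described in the paper.

        # Minimal sketch: 1D diffraction (structure factor) from a row of point scatterers.
        # Positions and scattering lengths are hypothetical; DISCUS is not used here.
        import numpy as np

        a = 4.0                                    # lattice spacing (arbitrary units)
        positions = a * np.arange(10)              # ten atoms on a perfect 1D lattice
        positions[4] += 0.4                        # introduce a small displacement "defect"
        f_atom = 1.0                               # scattering length (same for all atoms)

        q = np.linspace(0.5, 2.5, 2000)            # scan q across the first Bragg peak
        # Structure factor F(q) = sum_j f_j * exp(i q x_j); intensity is |F|^2
        F = (f_atom * np.exp(1j * np.outer(q, positions))).sum(axis=1)
        intensity = np.abs(F) ** 2

        peak_q = q[np.argmax(intensity)]
        print(f"strongest peak near q = {peak_q:.3f}; 2*pi/a = {2*np.pi/a:.3f}")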


    Practical aspects of PARAFAC modeling of fluorescence excitation-emission data

    JOURNAL OF CHEMOMETRICS, Issue 4 2003
    C. M. Andersen
    Abstract This paper presents a dedicated investigation and practical description of how to apply PARAFAC modeling to complicated fluorescence excitation-emission measurements. The steps involved in finding the optimal PARAFAC model are described in detail based on the characteristics of fluorescence data. These steps include choosing the right number of components, handling problems with missing values and scatter, detecting variables influenced by noise and identifying outliers. Various validation methods are applied in order to ensure that the optimal model has been found and several common data-specific problems and their solutions are explained. Finally, interpretations of the specific models are given. The paper can be used as a tutorial for investigating fluorescence landscapes with multi-way analysis. Copyright © 2003 John Wiley & Sons, Ltd. [source]
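
    To show the mechanics of fitting a PARAFAC model to a three-way fluorescence-style array (choice of rank, inspection of the fit), the sketch below builds a small synthetic sample x emission x excitation cube and fits a rank-3 CP model with a plain alternating-least-squares loop written in NumPy; the synthetic components and noise level are invented, non-negativity is not enforced, and none of the scatter handling or validation steps discussed in the paper are included. In practice a dedicated package (e.g. TensorLy in Python or the N-way toolbox for MATLAB) would normally be used instead.

        # Minimal sketch: rank-3 PARAFAC of a synthetic EEM-like cube via alternating least squares.
        # The synthetic "fluorophores" and loadings are invented for illustration only.
        import numpy as np

        rng = np.random.default_rng(3)
        em = np.linspace(300, 600, 61)        # emission wavelengths (nm)
        ex = np.linspace(240, 450, 43)        # excitation wavelengths (nm)

        def peak(x, centre, width):
            return np.exp(-0.5 * ((x - centre) / width) ** 2)

        # Three synthetic components: emission spectra, excitation spectra, sample scores
        B_true = np.stack([peak(em, c, 25) for c in (350, 420, 510)], axis=1)   # emission loadings
        C_true = np.stack([peak(ex, c, 20) for c in (280, 330, 390)], axis=1)   # excitation loadings
        A_true = rng.uniform(0, 1, size=(20, 3))                                 # sample scores

        X = np.einsum('ir,jr,kr->ijk', A_true, B_true, C_true)
        X += 0.01 * rng.normal(size=X.shape)                                     # measurement noise

        # Alternating least squares for the CP/PARAFAC model X ~ sum_r a_r o b_r o c_r
        R = 3
        A = rng.uniform(size=(X.shape[0], R))
        B = rng.uniform(size=(X.shape[1], R))
        C = rng.uniform(size=(X.shape[2], R))
        for it in range(200):
            A = np.einsum('ijk,jr,kr->ir', X, B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
            B = np.einsum('ijk,ir,kr->jr', X, A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
            C = np.einsum('ijk,ir,jr->kr', X, A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))

        fit = 1.0 - np.linalg.norm(X - np.einsum('ir,jr,kr->ijk', A, B, C)) / np.linalg.norm(X)
        print(f"explained fit: {100 * fit:.1f}%")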


    Grasping determination experiments within the UJI robotics telelab

    JOURNAL OF FIELD ROBOTICS (FORMERLY JOURNAL OF ROBOTIC SYSTEMS), Issue 4 2005
    Raul Marín
    As a result of new technology becoming available, it is increasingly possible to develop more natural human-robot interfaces. In particular, interaction channels based on both voice recognition and synthesis, combined with other sensors, mainly computer vision, are now implemented in current robots. These capabilities enable a more natural face-to-face dialogue in the human-robot interaction. Currently, they are demonstrating their potential in many service robot applications, such as museums, hospitals, and so on. One area where these new forms of interaction have been extensively tested recently is within the educational robotics context. This article addresses a novel user-interface implemented in such a system developed in our lab, namely "The UJI Robotics Telelab", where the word UJI is the acronym for the name of our University. In order to develop this kind of complex system, several years of intensive research have been necessary in both multimedia tutoring systems and robotics. The principal motive for the project was the experimentation and validation of a complete telelaboratory, including an Internet-based robot system, with off-line and on-line control possibilities, and other different facilities (e.g., multimedia tutorial, chat channel, etc.) aimed at teaching undergraduate students in the robotics subject on our university campus. Finally, taking into account experience gained from using this system for regular undergraduate courses in robotics, new facilities have been implemented, and results showing the user performance, usability, and reliability of this novel contribution are discussed, including its advantages and limitations. © 2005 Wiley Periodicals, Inc. [source]