Student Assessment
Selected Abstracts

Knowledge and Skills for PISA: Assessing the Assessment
JOURNAL OF PHILOSOPHY OF EDUCATION, Issue 1 2007
Nina Bonderup Dohn
This article critiques the methodology of the OECD's Programme for International Student Assessment (PISA). It argues that PISA is invalidated because the chosen methodology does not adequately operationalise its question of inquiry. Contrary to PISA's claims, it therefore assesses not students' 'knowledge and skills for life' but only their 'knowledge and skills in assessment situations'. Even this latter form of assessment is not fully reliable, owing to problems at the level of concrete test items and to an inherent confusion of relative and absolute evaluation.

Problem-based Learning in Undergraduate Dental Education: Faculty Development at the University of Southern California School of Dentistry
JOURNAL OF PROSTHODONTICS, Issue 5 2007
Timothy R. Saunders DDS
The University of Southern California School of Dentistry (USCSD) seeks to educate oral health professionals through a balanced curriculum covering health promotion, risk assessment and disease prevention, diagnostics, treatments, and therapeutics. Based on critical analyses of a 5-year educational demonstration project, the USCSD proposed to use problem-based learning (PBL) to achieve these goals. Among the many changes required to convert a traditional dental curriculum to PBL, none is more important than faculty development. To this end, the USCSD Curriculum Subcommittee on Faculty Development, Mentoring, and Evaluation has designed and implemented a series of workshops to train its faculty as facilitators. There are four Core Skills Workshops: PBL Process, Facilitation of Learning, Student Assessment and Feedback, and PBL in the Clinical Environment.
PISA 2006: An assessment of scientific literacy
JOURNAL OF RESEARCH IN SCIENCE TEACHING, Issue 8 2009
Rodger Bybee
This article introduces the essential features of the science component of the 2006 Program for International Student Assessment (PISA). Administered every three years, PISA alternates its emphasis among reading, mathematics, and science literacy; in 2006 it emphasized science. The article discusses PISA's definition of scientific literacy, the three competencies that constitute scientific literacy, the contexts used for assessment units and items, the role of scientific knowledge, and the importance placed on attitudes toward science. PISA 2006 included a student test, a student questionnaire, and a questionnaire for school administrators. The student test employed a balanced incomplete block design involving thirteen 30-minute clusters of items, including nine science clusters. The 13 clusters were arranged into thirteen 2-hour booklets, and each sampled student was assigned one booklet at random. Mean literacy scores are presented for all participating countries, and the percentages of OECD students at the six levels of proficiency are given for the combined scale and for the competency scales. © 2009 Wiley Periodicals, Inc. J Res Sci Teach 46: 865–883, 2009

Student Assessment in Tribal Colleges
NEW DIRECTIONS FOR INSTITUTIONAL RESEARCH, Issue 118 2003
Anna M. Ortiz
The unique culture and traditions of Native American communities affect student outcomes and attendance patterns, and the interpretation of assessment and accountability mechanisms needs to account for these contextual issues.
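The balanced incomplete block design described in the PISA 2006 abstract above (thirteen 30-minute clusters rotated through thirteen 2-hour booklets, one booklet per sampled student) can be sketched in a few lines. The cyclic rotation below is an illustrative assumption, not PISA's actual cluster-to-booklet mapping:

```python
import random

N_CLUSTERS = 13          # nine science clusters plus four others
CLUSTERS_PER_BOOKLET = 4 # 4 x 30 minutes = one 2-hour booklet

# Cyclic rotation: booklet b holds clusters b, b+1, b+2, b+3 (mod 13),
# so every cluster appears in exactly four booklets.
booklets = [
    [(b + pos) % N_CLUSTERS for pos in range(CLUSTERS_PER_BOOKLET)]
    for b in range(N_CLUSTERS)
]

def assign_booklet(rng=random):
    """Each sampled student receives one of the 13 booklets at random."""
    return rng.choice(booklets)
```

Under this rotation each booklet holds four distinct clusters and each cluster is administered in four booklets, which is the balance property such designs exploit to link all clusters onto a common scale.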
Bridging the educational research–teaching practice gap: Tools for evaluating the quality of assessment instruments
BIOCHEMISTRY AND MOLECULAR BIOLOGY EDUCATION, Issue 1 2010
Student assessment is central to the educational process and can be used for multiple purposes, including promoting student learning, grading student performance, and evaluating the educational quality of qualifications. It is therefore of utmost importance that assessment instruments be of high quality. In this article, we present various tools that instructors can use both to improve instrument design and validity before presentation to students and to evaluate the reliability and quality of the assessment after students have answered the questions. In keeping with the goals of the Bridging-the-Gap series, we also present ideas from the educational literature on evaluating assessment quality and end with a list of criteria that instructors can use to guide their evaluation process.

Preliminary evaluation of 'interpreter' role plays in teaching communication skills to medical undergraduates
MEDICAL EDUCATION, Issue 3 2001
K. C. J. Lau
Rationale and objectives: Multiculturalism presents linguistic obstacles to health care provision. We explored the early introduction of 'interpreter' role-play exercises in teaching communication skills to medical undergraduates. The interpreter role creates a natural barrier in communication, providing an active prompt for recognising learning needs in this area.
Methods: Bilingual Cantonese first-year medical students (n = 160) were randomly allocated to either 'Observer' or 'Interpreter' role plays at a small-group introductory communication skills workshop, using a quasi-experimental design counterbalanced across tutors.
Students assessed their own skill competence before the workshop and again after it, together with their perceptions of the effectiveness of the different role plays, using an anonymous 16-item Likert-type scale analysed with ANOVA and MANOVA.
Results: Students' assessments of their skills improved significantly following the workshop (F = 73.19 [1,156], P = 0.0009). Students in the observer group reported greater changes in their scores following the workshop than did students in the interpreter group (F = 4.84 [1,156], P = 0.029), largely due to improvement in perceived skill (F = 4.38 [1,156], P = 0.038) rather than perceived programme effectiveness (F = 3.13 [1,156], P > 0.05). A subsequent MANOVA indicated no main effect of the observer/interpreter conditions, suggesting these differences could be attributed to chance alone (F = 1.41 [16,141], P > 0.05).
Conclusion: The workshop positively influenced students' perceived communication skills, but the 'Interpreter' role was less effective than the 'Observer' role in achieving this. Future studies should examine whether interpreter role plays introduced later in the medical programme are beneficial.

Standard-Setting Methods as Measurement Processes
EDUCATIONAL MEASUREMENT: ISSUES AND PRACTICE, Issue 1 2010
Paul Nichols
Some writers in the measurement literature have been skeptical of the meaningfulness of achievement standards and have described the standard-setting process as blatantly arbitrary. We argue that standard setting is more appropriately conceived of as a measurement process similar to student assessment: the construct being measured is the panelists' representation of student performance at the threshold of an achievement level. In the first section of this paper, we argue that standard setting is an example of stimulus-centered measurement. In the second section, we elaborate on this idea by comparing some popular standard-setting methods to the stimulus-centered scaling methods known as psychophysical scaling.
In the third section, we use the lens of standard setting as a measurement process to take a fresh look at two criticisms of standard setting: the role of judgment and the variability of results. In the fourth section, we offer a vision of standard-setting research and practice grounded in the theory and practice of educational measurement.

Rethinking the OSCE as a Tool for National Competency Evaluation
EUROPEAN JOURNAL OF DENTAL EDUCATION, Issue 2 2004
M. A. Boyd
The relatively recent curriculum change to problem-based learning/case-based education has stimulated the development of new evaluation tools for student assessment. The Objective Structured Clinical Examination (OSCE) has become a popular method for such assessment. The National Dental Examining Board of Canada (NDEB) began using an OSCE format as part of the national certification testing process for licensure of beginning dentists in Canada in 1996. The OSCE has been well received by provincial licensing authorities, dental schools, and students. 'Hands-on' clinical competency is entrusted to the dental programs and verified through NDEB participation in the accreditation process. The desire to refine the OSCE has resulted in the development of a new format. Previously, OSCE stations consisted of case-based materials and related multiple-choice questions; the new format presents case-based material with extended-match questions, in which candidates select one or more correct answers from a group of up to 15 options. The blueprint is referenced to the national competencies for beginning practitioners in Canada. The new format will be available to students on the NDEB website for information and study purposes. Question stems and options will remain constant, while case histories and case materials will change each year. The new OSCE will be easier to administer and less expensive in terms of test development.
Reliability and validity are enhanced by involving content experts from all faculties in test development, by having the OSCE verified by general practitioners, and by making the format available to candidates. The new OSCE will be pilot tested in September 2004. Examples will be provided for information and discussion.

Report of a workshop on student assessment in undergraduate medical education in the United Kingdom
MEDICAL EDUCATION, Issue 2000
Article first published online: 5 JAN 200

Practical guide to medical student assessment
BIOCHEMISTRY AND MOLECULAR BIOLOGY EDUCATION, Issue 1 2007
E. J. Wood
No abstract is available for this article.

Cross-institutional assessment: Development and implementation of the On-line Student Survey System
COMPUTER APPLICATIONS IN ENGINEERING EDUCATION, Issue 2 2002
Raymond Hoare
As ABET has increased the need for routine student assessments, engineering faculty face the problem of conducting them efficiently, minimizing the time required to conduct, tabulate, and analyze the requisite surveys. To meet this need, researchers at the University of Pittsburgh have developed the On-line Student Survey System (OS3) to facilitate EC 2000 assessment and cross-institutional benchmarking. OS3 allows multiple engineering schools to conduct customized, routine program evaluations using Web-based surveys specifically designed to meet EC 2000 objectives. Since its inception, seven engineering schools have adopted OS3. This article provides an overview of the system, a description of its survey instruments, and an evaluation of the system. © 2002 Wiley Periodicals, Inc.
Comput Appl Eng Educ 10: 88–97, 2002; published online in Wiley InterScience (www.interscience.wiley.com); DOI 10.1002/cae.10013

The role of the instructor in business games: a comparison of face-to-face and online instruction
INTERNATIONAL JOURNAL OF TRAINING AND DEVELOPMENT, Issue 3 2010
Ana Beatriz Hernández
This study analyses the role of the instructor in the e-learning process fostered by a business game. A comparative analysis was conducted with two groups of students regarding their perceptions of the instructor's role in a business game. The first group, of 33 participants, was facilitated by an instructor face to face; the second, of 23 participants, was facilitated by the same instructor online. The results indicate that the students' assessment of the instructor's role differs clearly between the two settings: the face-to-face group valued the relevance of the instructor's role in the learning process more highly than the online group. The findings also highlight the importance of the instructor's role in improving the students' learning experience and suggest that extra efforts by online instructors are needed to maximize the e-learning process through business games in management training.
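Several of the abstracts above turn on the reliability of assessment instruments (the 16-item Likert-type scale in the interpreter study, and the instrument-quality tools in the Bridging-the-Gap article). A minimal sketch of one standard reliability index, Cronbach's alpha, is shown below; the response matrix is purely hypothetical and is not drawn from any of the studies listed:

```python
from statistics import variance

def cronbach_alpha(item_scores):
    """Cronbach's alpha for a response matrix:
    rows = students, columns = items on the scale."""
    n_items = len(item_scores[0])
    items = list(zip(*item_scores))                 # one tuple per item
    item_var = sum(variance(col) for col in items)  # sum of item variances
    totals = [sum(row) for row in item_scores]      # each student's total score
    return (n_items / (n_items - 1)) * (1 - item_var / variance(totals))

# Hypothetical 4-item Likert responses from five students
responses = [
    [4, 5, 4, 4],
    [3, 4, 3, 3],
    [5, 5, 4, 5],
    [2, 3, 2, 3],
    [4, 4, 4, 4],
]
alpha = cronbach_alpha(responses)  # ~0.96 for this illustrative matrix
```

Values near 1 indicate high internal consistency among the items; sample (n − 1) variance is used consistently for both the item and total-score terms.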