OSCE Stations (osce + stations)

Selected Abstracts


Rethinking the OSCE as a Tool for National Competency Evaluation

EUROPEAN JOURNAL OF DENTAL EDUCATION, Issue 2 2004
M. A. Boyd
The relatively recent curriculum change to Problem-Based Learning/Case-Based Education has stimulated the development of new evaluation tools for student assessment. The Objective Structured Clinical Examination (OSCE) has become a popular method for such assessment. The National Dental Examining Board of Canada (NDEB) began using an OSCE format as part of the national certification testing process for licensure of beginning dentists in Canada in 1996. The OSCE has been well received by provincial licensing authorities, dental schools and students. 'Hands-on' clinical competency is entrusted to the dental programs and verified through NDEB participation in the accreditation process. The desire to refine the OSCE has resulted in the development of a new format. Previously, OSCE stations consisted of case-based materials and related multiple-choice questions. The new format pairs case-based material with an extended-match presentation: candidates 'select one or more correct answers' from a group of up to 15 options. The blueprint is referenced to the national competencies for beginning practitioners in Canada. The new format will be available to students on the NDEB website for information and study purposes. Question stems and options will remain constant; case histories and case materials will change each year. The new OSCE will be easier to administer and less expensive in terms of test development. Reliability and validity are enhanced by involving content experts from all faculties in test development, by having the OSCE verified by general practitioners and by making the format available to candidates. The new OSCE will be pilot tested in September 2004. Examples will be provided for information and discussion. [source]
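
The abstract does not say how the extended-match items are scored. As a purely illustrative sketch, the Python below scores a 'select one or more correct answers' item against a keyed set of options; both the exact-match rule and the partial-credit variant are assumptions, not the NDEB's actual scoring rule.

```python
# Hypothetical scorer for an extended-match OSCE item: the candidate
# selects one or more options from a list of up to 15. Exact-set matching
# and proportional partial credit are illustrative assumptions only.

def score_extended_match(selected: set[str], correct: set[str],
                         partial_credit: bool = False) -> float:
    """Return a score in [0, 1] for one extended-match item."""
    if not partial_credit:
        return 1.0 if selected == correct else 0.0
    # Partial credit: reward correct selections, penalise wrong ones.
    hits = len(selected & correct)
    false_picks = len(selected - correct)
    return max(0.0, (hits - false_picks) / len(correct))

# Example: options labelled A..O (15 options), three keyed answers.
print(score_extended_match({"A", "C"}, {"A", "C", "K"}, partial_credit=True))  # ~0.67
```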


Assessment: Checking the checklist: a content analysis of expert- and evidence-based case-specific checklist items

MEDICAL EDUCATION, Issue 9 2010
Agatha M Hettinga
Medical Education 2010; 44: 874–883. Objectives: Research on objective structured clinical examinations (OSCEs) is extensive. However, relatively little has been written on the development of case-specific checklists on history taking and physical examination. Background information on the development of these checklists is a key element of the assessment of their content validity. Usually, expert panels are involved in the development of checklists. The objective of this study is to compare expert-based items on OSCE checklists with evidence-based items identified in the literature. Methods: Evidence-based items covering both history taking and physical examination for specific clinical problems and diseases were identified in the literature. Items on nine expert-based checklists for OSCE examination stations were evaluated by comparing them with items identified in the literature. The data were grouped into three categories: (i) expert-based items; (ii) evidence-based items; and (iii) evidence-based items with a specific measure of their relevance. Results: Out of 227 expert-based items, 58 (26%) were not found in the literature. Of 388 evidence-based items found in the literature, 219 (56%) were not included in the expert-based checklists. Of these 219 items, 82 (37%) had a specific measure of importance, such as an odds ratio for a diagnosis, making that diagnosis more or less probable. Conclusions: Expert-based, case-specific checklist items developed for OSCE stations do not coincide with evidence-based items identified in the literature. Further research is needed to ascertain what this inconsistency means for test validity. [source]
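
The comparison lends itself to simple set arithmetic. In the sketch below the item strings are invented placeholders; only the counts (227, 58, 388, 219, 82) come from the study.

```python
# Illustrative sketch of the checklist-versus-literature comparison as
# set operations. Item names are placeholders, not real checklist items.
expert = {"asks about onset", "palpates abdomen", "checks temperature"}
evidence = {"asks about onset", "checks temperature", "tests rebound tenderness"}
has_measure = {"tests rebound tenderness"}   # items carrying e.g. an odds ratio

unsupported = expert - evidence              # expert items not found in the literature
omitted = evidence - expert                  # evidence items missing from the checklists
measured_omissions = omitted & has_measure   # omissions with a relevance measure

# The reported proportions follow from the study's raw counts:
print(f"{58 / 227:.0%}")   # expert items without literature support -> 26%
print(f"{219 / 388:.0%}")  # evidence items off the checklists       -> 56%
print(f"{82 / 219:.0%}")   # measured omissions among those 219      -> 37%
```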


The effect of differential rater function over time (DRIFT) on objective structured clinical examination ratings

MEDICAL EDUCATION, Issue 10 2009
Kevin McLaughlin
Context: Despite the impartiality implied in its title, the objective structured clinical examination (OSCE) is vulnerable to systematic biases, particularly those affecting raters' performance. In this study our aim was to examine OSCE ratings for evidence of differential rater function over time (DRIFT), and to explore potential causes of DRIFT. Methods: We studied ratings for 14 internal medicine resident doctors over the course of a single formative OSCE, comprising ten 12-minute stations, each with a single rater. We evaluated the association between time-slot and rating for a station. We also explored a possible interaction between time-slot and station difficulty, which would support the hypothesis that rater fatigue causes DRIFT, and considered 'warm-up' as an alternative explanation for DRIFT by repeating our analysis after excluding the first two OSCE stations. Results: Time-slot was positively associated with rating on a station (regression coefficient 0.88, 95% confidence interval [CI] 0.38–1.38; P = 0.001). There was an interaction between time-slot and station difficulty: for the more difficult stations the regression coefficient for time-slot was 1.24 (95% CI 0.55–1.93; P = 0.001) compared with 0.52 (95% CI −0.08 to 1.13; P = 0.09) for the less difficult stations. Removing the first two stations from our analyses did not correct DRIFT. Conclusions: Systematic biases, such as DRIFT, may compromise internal validity in an OSCE. Further work is needed to confirm this finding and to explore whether DRIFT also affects ratings on summative OSCEs. If confirmed, the factors contributing to DRIFT, and ways to reduce these, should then be explored. [source]
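
The abstract reports regression coefficients but not the model specification. Below is a minimal sketch of one plausible analysis, assuming a long-format table with one row per candidate-station rating; the file name, column names and plain-OLS specification are placeholders, not the study's actual model.

```python
# Sketch of a DRIFT check: regress station ratings on the candidate's
# time-slot, then test a time-slot x station-difficulty interaction.
import pandas as pd
import statsmodels.formula.api as smf

# Long format: one row per (candidate, station) rating.
df = pd.read_csv("osce_ratings.csv")  # assumed columns: rating, time_slot, difficult
df["difficult"] = df["difficult"].astype(int)  # 1 = more difficult station

# Main effect: a positive time_slot coefficient means later candidates
# receive higher ratings on the same stations, i.e. evidence of DRIFT.
main = smf.ols("rating ~ time_slot", data=df).fit()
print(main.params["time_slot"], main.conf_int().loc["time_slot"].values)

# Interaction: if DRIFT is larger on difficult stations (consistent with
# rater fatigue), the time_slot:difficult term is positive. Note that
# plain OLS ignores clustering by candidate and rater; the study may
# have modelled this differently.
inter = smf.ols("rating ~ time_slot * difficult", data=df).fit()
print(inter.params[["time_slot", "time_slot:difficult"]])
```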


A study of pre-registration house officers' clinical skills

MEDICAL EDUCATION, Issue 12 2000
R A Fox
Background: Little is known about the ability of pre-registration house officers (PRHOs) to perform basic clinical skills just prior to entering the medical register. Objectives: To find out whether PRHOs have deficiencies in basic clinical skills and to determine whether the PRHOs themselves or their consultants are aware of them. Method: All 40 PRHOs at the Chelsea and Westminster and Whittington Hospitals were invited to undertake a 17-station OSCE of basic clinical skills. Each station was marked by one examiner, who completed an itemised checklist and then an overall global score. Adequate station performance was defined as a pass or borderline-pass grade. Prior to the OSCE, a questionnaire was given to each PRHO asking them to rate their own abilities (on a 5-point scale) in the skills tested. A similar questionnaire was sent to the educational supervisors of each PRHO asking them to rate their house officer's ability in each of the same skills. Results: Twenty-two PRHOs participated. Each PRHO failed to perform adequately on a mean of 2.4 OSCE stations (SD 1.8, range 1–8). There were no significant correlations between OSCE performance and either self- or educational supervisor ratings. The supervisors felt unable to give an opinion on PRHO abilities in 18% of the skills assessed. Discussion: This study suggests that PRHOs may have deficiencies in basic clinical skills at the time they enter the medical register. Neither the PRHOs themselves nor their consultants identified these deficiencies. A large regional study with sufficient power is required to explore the generalizability of these concerns in more detail. [source]
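
A brief sketch of the correlation check reported above. The abstract states only that no significant correlations were found; Spearman's rank correlation is an assumption here (it suits ordinal 5-point ratings), and the file and column names are placeholders.

```python
# Sketch: correlate each PRHO's OSCE result with self- and supervisor
# ratings of the same skills. All names and the choice of Spearman's
# rho are assumptions, not the study's stated method.
import pandas as pd
from scipy.stats import spearmanr

# One row per PRHO: stations failed plus mean 5-point ratings.
df = pd.read_csv("prho_skills.csv")

for col in ("self_rating", "supervisor_rating"):
    # Supervisors declined to rate 18% of skills, so drop missing pairs.
    sub = df[["stations_failed", col]].dropna()
    rho, p = spearmanr(sub["stations_failed"], sub[col])
    print(f"{col}: rho = {rho:.2f}, p = {p:.3f}")
```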