Medical Licensing Examination

Selected Abstracts


Modeling Passing Rates on a Computer-Based Medical Licensing Examination: An Application of Survival Data Analysis

EDUCATIONAL MEASUREMENT: ISSUES AND PRACTICE, Issue 3 2004
André F. de Champlain
The purpose of this article was to model United States Medical Licensing Examination (USMLE) Step 2 passing rates using the Cox Proportional Hazards Model, best known for its application in analyzing clinical trial data. The number of months it took to pass the computer-based Step 2 examination was treated as the dependent variable in the model. Covariates in the model were: (a) medical school location (U.S. and Canadian or other), (b) primary language (English or other), and (c) gender. Preliminary findings indicate that examinees were nearly 2.7 times more likely to experience the event (pass Step 2) if they were U.S. or Canadian trained. Examinees with English as their primary language were 2.1 times more likely to pass Step 2, but gender had little impact. These findings are discussed more fully in light of past research and broader potential applications of survival analysis in educational measurement. [source]
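For readers unfamiliar with the technique, the sketch below shows how a Cox proportional hazards model of time-to-pass data might be fit in Python with the lifelines library. The data frame, column names, and values are invented for illustration; this is not the authors' analysis or dataset.

```python
# Illustrative sketch only: fits a Cox proportional hazards model to a
# hypothetical dataset shaped like the one described in the abstract.
# Column names and values are assumptions, not the study's actual data.
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical examinee records: months to pass Step 2, event indicator,
# and the three covariates used in the study.
df = pd.DataFrame({
    "months_to_pass":  [3, 6, 12, 4, 9, 15, 2, 7],
    "passed":          [1, 1, 0, 1, 1, 0, 1, 1],   # 1 = passed (event), 0 = censored
    "us_canadian":     [1, 0, 0, 1, 0, 0, 1, 1],   # trained in U.S. or Canada
    "english_primary": [1, 1, 0, 1, 0, 0, 1, 1],   # English as primary language
    "female":          [0, 1, 1, 0, 1, 0, 1, 0],
})

# Small ridge penalty keeps the toy fit numerically stable on so few rows.
cph = CoxPHFitter(penalizer=0.1)
cph.fit(df, duration_col="months_to_pass", event_col="passed")

# Exponentiated coefficients are the hazard ratios the abstract reports
# (e.g., roughly 2.7 for U.S./Canadian training in the actual study).
print(cph.summary[["coef", "exp(coef)", "p"]])
```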


Selection Criteria for Emergency Medicine Residency Applicants

ACADEMIC EMERGENCY MEDICINE, Issue 1 2000
Joseph T. Crane MD
Abstract: Objectives: To determine the criteria used by emergency medicine (EM) residency selection committees to select their residents, to determine whether there is a consensus among residency programs, to inform programs of areas of possible inconsistency, and to better educate applicants pursuing careers in EM. Methods: A questionnaire consisting of 20 items based on the current Electronic Residency Application Service (ERAS) guidelines was mailed to the program directors of all 118 EM residencies in existence in February 1998. The program directors were instructed to rank each item on a five-point scale (5 = most important, 1 = least important) as to its importance in the selection of residents. Follow-up was done by e-mail and facsimile. Results: The overall response rate was 79.7%, with 94 of 118 programs responding. Items ranking as most important (4.0-5.0) in the selection process included: EM rotation grade (mean ± SD = 4.79 ± 0.50), interview (4.62 ± 0.63), clinical grades (4.36 ± 0.70), and recommendations (4.11 ± 0.85). Moderate emphasis (3.0-4.0) was placed on: elective done at the program director's institution (3.75 ± 1.25), U.S. Medical Licensing Examination (USMLE) Step 2 (3.34 ± 0.93), interest expressed in the program director's institution (3.30 ± 1.19), USMLE Step 1 (3.28 ± 0.86), and awards/achievements (3.16 ± 0.88). The least emphasis (roughly 3.0 or below) was placed on Alpha Omega Alpha Honor Society (AOA) status (3.01 ± 1.09), medical school attended (3.00 ± 0.85), extracurricular activities (2.99 ± 0.87), basic science grades (2.88 ± 0.93), publications (2.87 ± 0.99), and personal statement (2.75 ± 0.96). Items most agreed upon by respondents (lowest standard deviation, SD) included EM rotation grade (SD 0.50), interview (SD 0.63), and clinical grades (SD 0.70). Of the 94 respondents, 37 (39.4%) reported a minimum requirement for USMLE Step 1 (mean 195.11 ± 13.10), while 30 (31.9%) reported a minimum requirement for USMLE Step 2 (mean 194.27 ± 14.96). Open-ended responses to "other" related to personal characteristics, career goals, and medical school performance. Conclusions: The selection criteria with the highest mean values as reported by the program directors were EM rotation grade, interview, clinical grades, and recommendations. Criteria showing the most consistency (lowest SD) included EM rotation grade, interview, and clinical grades. Results are compared with those from previous multispecialty studies. [source]
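As a rough illustration of the kind of summary reported above, the sketch below ranks survey items by mean rating and flags the item with the greatest agreement (lowest SD). The ratings and item subset are fabricated; only the approach mirrors the abstract.

```python
# Illustrative sketch: rank Likert-scale survey items by mean rating and
# identify the item with the most agreement (lowest SD). Data are made up.
import pandas as pd

ratings = pd.DataFrame({
    "EM rotation grade":  [5, 5, 4, 5, 5],
    "Interview":          [5, 4, 5, 4, 5],
    "Personal statement": [3, 2, 3, 3, 2],
})  # rows = responding program directors, 1-5 scale

summary = ratings.agg(["mean", "std"]).T.sort_values("mean", ascending=False)
print(summary)                   # mean and SD per criterion, highest emphasis first
print(summary["std"].idxmin())   # criterion with the most agreement
```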


Judges' Use of Examinee Performance Data in an Angoff Standard-Setting Exercise for a Medical Licensing Examination: An Experimental Study

JOURNAL OF EDUCATIONAL MEASUREMENT, Issue 4 2009
Brian E. Clauser
Although the Angoff procedure is among the most widely used standard setting procedures for tests comprising multiple-choice items, research has shown that subject matter experts have considerable difficulty accurately making the required judgments in the absence of examinee performance data. Some authors have viewed the need to provide performance data as a fatal flaw for the procedure; others have considered it appropriate for experts to integrate performance data into their judgments but have been concerned that experts may rely too heavily on the data. There have, however, been relatively few studies examining how experts use the data. This article reports on two studies that examine how experts modify their judgments after reviewing data. In both studies, data for some items were accurate and data for other items had been manipulated. Judges in both studies substantially modified their judgments whether the data were accurate or not. [source]
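For context, an Angoff cut score is conventionally obtained by averaging the judges' item-level probability estimates for a minimally competent examinee and summing across items. The sketch below illustrates that arithmetic with made-up ratings; it reflects only this basic computation, not the studies' data or the experimental manipulation.

```python
# Minimal sketch of an Angoff-style cut score with fabricated ratings:
# each judge estimates, per item, the probability that a minimally competent
# examinee answers correctly; the cut score is the sum of the item means.
import numpy as np

# rows = judges, columns = items (probabilities between 0 and 1)
ratings = np.array([
    [0.60, 0.75, 0.40, 0.85],
    [0.55, 0.80, 0.45, 0.90],
    [0.65, 0.70, 0.50, 0.80],
])

item_means = ratings.mean(axis=0)   # expected score per item
cut_score = item_means.sum()        # raw-score passing standard
print(item_means, round(cut_score, 2))
```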


COMLEX-1 and USMLE-1 Are Not Interchangeable Examinations

ACADEMIC EMERGENCY MEDICINE, Issue 2 2010
John Sarko MD
ACADEMIC EMERGENCY MEDICINE 2010; 17:1–3 © 2010 by the Society for Academic Emergency Medicine Abstract Objectives: Osteopathic medical students must take the Comprehensive Osteopathic Medical Licensing Examination (COMLEX-USA) series of examinations, but not the United States Medical Licensing Examination (USMLE) series. Few data are available describing the comparability of the two tests. This study sought to determine whether COMLEX-1 scores could predict USMLE-1 scores among osteopathic medical students applying to an emergency medicine (EM) residency and whether the scores are interchangeable. Methods: This was a retrospective analysis of osteopathic medical students applying to an EM residency program in the 2005–2006 and 2006–2007 application seasons. Students were included if they took both the COMLEX-1 and the USMLE-1 examinations. Linear regression was performed, and a Bland-Altman plot of the standardized mean scores of each test was created. Results: Ninety students were included. The mean (±standard deviation [SD]) COMLEX-1 score was 559.5 (±68.6), and the mean (±SD) USMLE-1 score was 207.6 (±15.5). The correlation was 0.79, with an R² of 62.3%. The Bland-Altman plot showed a mean difference between the standardized scores of 0, with 95% confidence intervals (CIs) of −1.28 to +1.28 standard normal units. Limitations include that this was a single-center study and that only students who took both tests could be studied. Conclusions: COMLEX-1 scores predict only 62.3% of the variance in USMLE-1 scores, and the scores are not interchangeable. [source]
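The sketch below illustrates, on fabricated scores, the two analyses named in the abstract: a linear regression of USMLE-1 on COMLEX-1 and a Bland-Altman comparison of the standardized values. Score values and sample size are invented; only the method mirrors the study.

```python
# Illustrative sketch with made-up scores: linear regression of USMLE-1 on
# COMLEX-1, plus a Bland-Altman summary of the standardized (z-scored) values.
import numpy as np
from scipy import stats

comlex = np.array([480, 520, 545, 560, 600, 630, 655, 700], dtype=float)
usmle  = np.array([188, 195, 203, 208, 215, 221, 230, 238], dtype=float)

slope, intercept, r, p, se = stats.linregress(comlex, usmle)
print(f"r = {r:.2f}, R^2 = {r**2:.2f}")   # the study reported R^2 of about 0.62

# Bland-Altman on standardized scores: mean difference and 95% limits of agreement
z_c = (comlex - comlex.mean()) / comlex.std(ddof=1)
z_u = (usmle - usmle.mean()) / usmle.std(ddof=1)
diff = z_c - z_u
loa = 1.96 * diff.std(ddof=1)
print(f"mean difference = {diff.mean():.2f}, limits of agreement = +/-{loa:.2f}")
```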


Conference Attendance Does Not Correlate With Emergency Medicine Residency In-Training Examination Scores

ACADEMIC EMERGENCY MEDICINE, Issue 2009
H. Gene Hern Jr MD
Abstract Objectives: The residency review committee for emergency medicine (EM) requires residents to have greater than 70% attendance at educational conferences during residency training, but it is unknown whether attendance improves clinical competence or scores on the American Board of Emergency Medicine (ABEM) in-training examination (ITE). This study examined the relationship between conference attendance and ITE scores. The hypothesis was that greater attendance would correlate with a higher examination score. Methods: This was a multi-center retrospective cohort study using conference attendance data and examination results from residents in four large county EM residency training programs. Longitudinal multi-level models, adjusting for training site, U.S. Medical Licensing Examination (USMLE) Step 1 score, and sex, were used to explore the relationship between conference attendance and in-training examination scores according to year of training. Each year of training was studied, as was the overall effect of mean attendance on examination score. Results: Four training sites reported data on 405 residents during 2002 to 2008; 386 residents had sufficient data to analyze. In the multi-level longitudinal models, conference attendance was not a significant predictor of in-training percentile score (coefficient = 0.005, 95% confidence interval [CI] = −0.053 to 0.063, p = 0.87). Score on the USMLE Step 1 examination was a strong predictor of ITE score (coefficient = 0.186, 95% CI = 0.155 to 0.217; p < 0.001), as was female sex (coefficient = 2.117, 95% CI = 0.987 to 3.25; p < 0.001). Conclusions: Greater conference attendance does not correlate with an individual resident's ITE score. Conference attendance may be an important part of EM residency training, but it does not appear to drive ITE performance. [source]
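As a rough illustration of the modeling approach described above, the sketch below fits a random-intercept mixed model of ITE percentile score on conference attendance, USMLE Step 1 score, sex, and training year using statsmodels. The simulated data, variable names, and model specification are assumptions for illustration, not the study's actual model or results.

```python
# Illustrative sketch: random-intercept longitudinal model of ITE percentile
# on attendance, USMLE Step 1, sex, and training year. Data are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_res, n_years = 40, 3
df = pd.DataFrame({
    "resident_id":    np.repeat(np.arange(n_res), n_years),
    "year":           np.tile(np.arange(1, n_years + 1), n_res),
    "attendance_pct": rng.uniform(60, 100, n_res * n_years),
    "usmle_step1":    np.repeat(rng.normal(220, 15, n_res), n_years),
    "female":         np.repeat(rng.integers(0, 2, n_res), n_years),
})
# Simulated outcome driven by USMLE score and training year, not attendance.
df["ite_percentile"] = (
    50 + 0.2 * (df["usmle_step1"] - 220) + 5 * df["year"]
    + rng.normal(0, 8, len(df))
)

model = smf.mixedlm(
    "ite_percentile ~ attendance_pct + usmle_step1 + female + year",
    data=df,
    groups=df["resident_id"],   # random intercept per resident
)
print(model.fit().summary())
```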


Receiving: The Use of Web 2.0 to Create a Dynamic Learning Forum to Enrich Resident Education

ACADEMIC EMERGENCY MEDICINE, Issue 2009
Adam Rosh
Receiving (http://www.drhem.com) is a powerful web-based tool that encompasses Web 2.0 technologies. "Web 2.0" is a term used to describe a group of loosely related network technologies that share a user-focused approach to design and functionality. It has a strong bias toward user content creation, syndication, and collaboration (McGee 2008). Web 2.0 technology is rapidly being integrated into undergraduate and graduate education, which dramatically influences the ways learners approach and use information (Sandars 2007). Knowledge transfer has become a two-way process. Users no longer simply consume and download information from the web; they create and interact with it. We created this blog to facilitate resident education, communication, and productivity. Using simple, freely available blog software (WordPress.com), this interdisciplinary web-based forum integrates faculty-created, case-based learning modules with critical essays and articles related to the practice of emergency medicine (EM). Didactic topics are based on the EM model and include multimedia case presentations. The educational modules include a visual diagnosis section (VizD), United States Medical Licensing Examination (USMLE) board-style cases (quizzER), radiographic interpretation (radER), electrocardiogram interpretation (Tracings), and ultrasound image and video clip interpretation (Morrison's Pouch). After viewing each case, residents can submit their answers to the questions asked in each scenario. At the end of each week, a faculty member posts the answer and facilitates an online discussion of the case. A "Top 10 Leader Board" is updated weekly to reflect resident participation and display a running tally of correct answers submitted by the residents. Feedback from the residents has been very positive. In addition to the weekly interactive cases, Receiving also includes critical essays and articles on an array of topics related to EM. For example, "Law and Medicine" is a monthly essay written by an emergency physician who is also a lawyer; this module explores legal issues related to EM. "The Meeting Room" presents interviews with leading scholars in the field. "Got Public Health?", written by a resident, addresses relevant social, cultural, and political issues commonly encountered in the emergency department. "Mini Me" is dedicated to pediatric pearls and is overseen by a pediatric emergency physician. "Sherwin's Critical Care" focuses on critical care principles relevant to EM and is overseen by a faculty member. As in the didactic portion of the website, residents and faculty members are encouraged to comment on these essays and articles, offering their own expertise and interpretation of the various topics. Receiving is updated weekly. Every post has its own URL and tags, allowing for quick and easy searching and archiving. Users can search for topics with a built-in search feature. Receiving is linked to an RSS (Really Simple Syndication) feed, allowing users to get the latest information without having to check the website repeatedly for updates. Residents have access to the website anytime and anywhere the internet is available (e.g., home computer, hospital computer, iPhone™, BlackBerry™), bringing the classroom to them. This unique blend of topics and the ability to create a virtual interactive community create a dynamic learning environment and directly enhance resident education.
Receiving serves as a core educational tool for our residency, presenting interesting and relevant EM information in a collaborative and instructional environment. [source]
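For readers curious how the syndication piece works in practice, the sketch below polls a WordPress-style RSS feed with the feedparser library. The /feed/ path is an assumption based on WordPress defaults and is not confirmed by the abstract.

```python
# Illustrative sketch only: reading a WordPress-style RSS feed with feedparser.
# The /feed/ path is an assumed default, not a URL given in the abstract.
import feedparser

feed = feedparser.parse("http://www.drhem.com/feed/")
for entry in feed.entries[:5]:
    # Each entry carries a title, permalink, and publication date, which is how
    # readers can follow new cases without revisiting the site manually.
    print(entry.title, entry.link, entry.get("published", ""))
```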


Empathy in medical students as related to academic performance, clinical competence and gender

MEDICAL EDUCATION, Issue 6 2002
M Hojat
Context: Empathy is a major component of a satisfactory doctor–patient relationship and the cultivation of empathy is a learning objective proposed by the Association of American Medical Colleges (AAMC) for all American medical schools. Therefore, it is important to address the measurement of empathy, its development and its correlates in medical schools. Objectives: We designed this study to test two hypotheses: firstly, that medical students with higher empathy scores would obtain higher ratings of clinical competence in core clinical clerkships; and secondly, that women would obtain higher empathy scores than men. Materials and subjects: A 20-item empathy scale developed by the authors (Jefferson Scale of Physician Empathy) was completed by 371 third-year medical students (198 men, 173 women). Methods: Associations between empathy scores and ratings of clinical competence in six core clerkships, gender, and performance on objective examinations were studied by using t-tests, analysis of variance, chi-square and correlation coefficients. Results: Both research hypotheses were confirmed. Empathy scores were associated with ratings of clinical competence and gender, but not with performance in objective examinations such as the Medical College Admission Test (MCAT) and Steps 1 and 2 of the US Medical Licensing Examinations (USMLE). Conclusions: Empathy scores are associated with ratings of clinical competence and gender. The operational measure of empathy used in this study provides opportunities to further examine educational and clinical correlates of empathy, as well as stability and changes in empathy at different stages of undergraduate and graduate medical education. [source]
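The sketch below illustrates, with invented scores, the kinds of comparisons the abstract describes: an independent-samples t-test of empathy by gender and a Pearson correlation between empathy and a clinical competence rating. It does not use the Jefferson Scale data or reproduce the study's analysis.

```python
# Illustrative sketch with fabricated data: t-test of empathy scores by gender
# and a Pearson correlation between empathy and a clinical competence rating.
import numpy as np
from scipy import stats

empathy_women = np.array([118, 121, 115, 124, 119, 122])
empathy_men   = np.array([112, 116, 110, 118, 114, 113])
t, p = stats.ttest_ind(empathy_women, empathy_men)
print(f"gender difference: t = {t:.2f}, p = {p:.3f}")

empathy = np.concatenate([empathy_women, empathy_men])
clerkship_rating = np.array([4.5, 4.7, 4.2, 4.8, 4.4, 4.6,
                             4.0, 4.3, 3.9, 4.4, 4.1, 4.0])
r, p = stats.pearsonr(empathy, clerkship_rating)
print(f"empathy vs clinical rating: r = {r:.2f}, p = {p:.3f}")
```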