Medical Science Educator Volume 14: No. 2

Message from the Association Manager

Julie K. Hewett

Hopefully, in the past few weeks you have had the opportunity to visit the IAMSE web site and have seen some of the changes that have taken place. The new website has been under development since late last spring, when the Technology Task Force, chaired by Peter Anderson, gathered and made several recommendations regarding site… Read more »

The Medical Educator’s Resource Guide

John R. Cotter, Ph.D.

The goal of the Medical Educator’s Resource Guide is to identify World Wide Web sites that are judged to be of interest to basic science educators. In this edition of the Guide, all of the reviews presented below should be of particular interest to educators who teach histology. The reviews illustrate how the same… Read more »

Medical Students’ Performance in the IV Year Exit Exam: Effect of Clinical Reasoning Exercises, Self-Observation on Tape, and Faculty Feedback on Clinical Skills

Fabrizia Faustinella, M.D., Ph.D., Philip R. Orlander, M.D., Laura A. Colletti, M.D., Harinder S. Juneja, M.D., and Linda C. Perkowski, Ph.D.

ABSTRACT

The Clinical Performance Examination (CPeX) is administered at the end of the required IV year Internal Medicine Clerkship. The exam consists of eight standardized patient encounters, during which the students are required to perform a focused history and physical exam based on the patient’s presenting complaint. A remediation plan consisting of the following steps was developed for students who fail the exam:

Step I: Students’ performance review.
Step II: Clinical reasoning exercise assignment.
Step III: Formative feedback session with self-observation on videotape.
Step IV: Additional faculty-guided clinical reasoning exercise.

During the academic year 2002-03, 21 of 191 students (11%) failed the initial CPeX. The students who failed went through the remediation process and retook the exam. The post-remediation exam scores showed significant improvement in both history and physical examination skills. We conclude that: clinical reasoning exercises help students understand how to choose the most important components of the history and physical exam to best delineate the patient’s problem and to develop a differential diagnosis; self-observation on videotape helps students gain awareness of their deficiencies and focus on their own areas of weakness; and formative feedback plays a central role in helping students improve their performance, as widely supported by the literature.

A Cross-Institutional Partnership for Teaching and Learning Pharmacology

Susan J. Pasquale, Ph.D.1, Alice Gardner, Ph.D.2 and John R. McCullough, Ph.D.3

ABSTRACT

The focus of this paper is the development of a multi-layered teaching and learning partnership between a school of medicine and a school of pharmacy, designed to introduce interprofessional teaching and learning in the medical school’s pharmacology course. It features the process of building an alliance between a medical school and a school of pharmacy, which includes the students, faculty, and administrators of each organization as key participants. The paper emphasizes that the strategies used to move forward with the partnership were key to facilitating effective change, and highlights the benefits of the multi-layered cross-institutional partnership. The authors also highlight what they found most applicable and useful from the organizational change literature in the development of the partnership. This paper provides faculty with an opportunity to recognize challenges and successes for building new and valuable partnerships for their courses and organizations, and an approach to developing partnerships that optimize teaching and learning in the basic sciences.

The Comparison of OSPE With Conventional Physiology Practical Assessment

Aarti Sood Mahajan, M.B.B.S., M.D.1, Nilima Shankar, M.B.B.S., M.D.2, O.P. Tandon, M.B.B.S., M.D., M.N.A.M.S., F.A.M.S.2

ABSTRACT

The Objective Structured Practical Examination (OSPE) is a new concept in the practical assessment of physiology in our country. It is a modified form of the Objective Structured Clinical Examination (OSCE), but is used for the evaluation of pre- and paraclinical subjects. Although the concept is well known in theory, very few medical colleges have incorporated the OSPE as an assessment tool in the curriculum of first-year medical students. We wanted to compare the marks obtained in the OSPE with those obtained by the other conventional methods. The OSPE marks were similar to the clinical examination marks but differed from the marks for other experimental exercises, such as graphs and charts. A similar result was found in two examinations. We conclude that the OSPE can replace the existing pattern of clinical examination; to replace the other methods, an elaborate and structured OSPE bank would be required. Presently, it can supplement but not replace the conventional methods. Any change must first be thoroughly evaluated before it can uproot a well-defined and time-tested assessment methodology.

Using Problem-based Learning Evaluations to Improve Facilitator Performance and Student Learning

Scott A. Cottrell, Ed.D., Mary Wimmer, Ph.D., Barry T. Linger, Ed.D., James M. Shumway, Ph.D. and Elizabeth A. Jones, Ph.D.

ABSTRACT

This paper reports the relationship between course and faculty evaluations for a problem-based learning (PBL) experience in a medical school curriculum. Identifying relationships between students’ reflections on the problem-based learning experience and how well facilitators guided the group (e.g., helped identify key learning issues) can answer fundamental questions about the potential of PBL to advance essential skills and knowledge. In 45 PBL groups across the 2001 and 2003 academic years, students completed a facilitator evaluation and a PBL course evaluation. The facilitator evaluation included nine questions, each rated on a five-point scale from Poor (1), Fair (2), Somewhat good (3), and Good (4) to Excellent (5). The PBL course evaluation included nine questions on a standard five-point scale ranging from Not at All (1), Slightly (2), Somewhat (3), and Mostly (4) to Completely (5). Two statistical analyses were conducted to address the research questions. First, a factor analysis was used to explore the organization of underlying factors in the facilitator and course evaluations. Factor analysis can provide evidence of construct validity for both instructional and learning dimensions. Using each factor as a variable, factor scores (the mean of the items in each factor) for the facilitator evaluations were used to predict factor scores yielded from the course evaluation. A regression analysis explored the potential for facilitator performance scores (independent) to predict student observations about their own learning (dependent). An analysis of the questions reveals reasonable interpretations of the two factors (Collaboration and Independent Learning Skills). The results revealed significant relationships between the facilitator scale score and both scale scores for the course evaluations. Overall, these results suggest that the facilitator evaluation provides a global indication of facilitator performance. Targeting the quality of specific skills, then, may require additional assessment strategies, such as having trained raters evaluate facilitator performance. An analysis of the course evaluation also reveals that students distinguish self-directed learning skills from collaboration skills. The connection between these factors suggests that facilitator performance, although limited in its effect, does influence the extent of students’ learning and development. Failing to recognize the importance of appropriate facilitation skills may ultimately compromise the learning environment.
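As a rough illustration of the two-step analysis outlined above, the sketch below (not the authors’ code) runs an exploratory factor analysis on a set of evaluation items and then regresses course-evaluation factor scores on a facilitator score. The data are synthetic, and the column names and item-to-factor groupings are hypothetical placeholders; only the overall pipeline follows the abstract.

```python
# Minimal sketch of the abstract's analysis pipeline, using synthetic data.
# Column names and item-to-factor assignments are hypothetical.
import numpy as np
import pandas as pd
from sklearn.decomposition import FactorAnalysis
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Hypothetical data: 45 PBL groups, nine 5-point items per instrument.
n_groups = 45
facilitator_items = pd.DataFrame(
    rng.integers(1, 6, size=(n_groups, 9)),
    columns=[f"fac_q{i}" for i in range(1, 10)],
)
course_items = pd.DataFrame(
    rng.integers(1, 6, size=(n_groups, 9)),
    columns=[f"crs_q{i}" for i in range(1, 10)],
)

# Step 1: exploratory factor analysis to see how the course items group.
fa = FactorAnalysis(n_components=2, random_state=0)
fa.fit(course_items)
loadings = pd.DataFrame(
    fa.components_.T,
    index=course_items.columns,
    columns=["factor_1", "factor_2"],
)
print(loadings.round(2))

# Step 2: factor scores computed as the mean of the items in each factor
# (as described in the abstract), then a regression of each course factor
# score on the facilitator score. The item splits below are placeholders;
# in practice they would come from the loadings above.
facilitator_score = facilitator_items.mean(axis=1).to_frame("facilitator")
collaboration = course_items.iloc[:, :5].mean(axis=1)          # hypothetical grouping
independent_learning = course_items.iloc[:, 5:].mean(axis=1)   # hypothetical grouping

for name, outcome in [("Collaboration", collaboration),
                      ("Independent Learning Skills", independent_learning)]:
    model = LinearRegression().fit(facilitator_score, outcome)
    r2 = model.score(facilitator_score, outcome)
    print(f"{name}: slope={model.coef_[0]:.2f}, R^2={r2:.2f}")
```

With real evaluation data, the slopes and R² values from the second step would indicate how strongly facilitator performance predicts each learning factor; the synthetic data here will show essentially no relationship.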

Evaluation of Student Learning: A Continuum from Classroom to Clerkship: A Webcast Audioseminar Series for Spring 2004

Carol F. Whitfield, Ph.D., Phyllis Blumberg, Ph.D., Byron Crawford, M.D., Debra DaRosa, Ph.D., Rebecca Henry, Ph.D., Brian Mavis, Ph.D., and Sebastian Uijtdehaage, Ph.D.

ABSTRACT

In the spring of 2004, IAMSE sponsored a webcast audioseminar series titled “Evaluation of Student Learning: A Continuum from Classroom to Clerkship”. Six nationally recognized experts in the evaluation of student learning presented seminars that described various ways to develop and use evaluation methods in settings found across the medical curriculum. Our audience included members of institutional faculty development programs and individual faculty members from countries around the world. The webcast format allowed registrants to listen to each presentation in real time while viewing the presenter’s slides in their web browser. The presentations were interactive, allowing the audience to ask questions or contribute observations from their own experience. Audio recordings of the seminars, accompanied by the slides, were archived on the International Association of Medical Science Educators (IAMSE) website and are available to registrants who want to review the seminars. Evaluation of student learning proved to be a very popular topic, and the audience numbered well over 100 for each of the six seminars. We urge educators to read carefully the following philosophical and practical approaches to the evaluation of student learning, and to use these white papers to convince colleagues, Chairs, and Deans that their institutions need a solid evaluation plan. It is also important for educators to measure the returns on education and to make them part of annual reports. Each seminar speaker provided a summary of the content and major points of discussion following their presentation; these summaries are reproduced below.

Preparing A Manuscript for Submission to the Journal of the International Association of Medical Science Educators

David L. McWhorter, Ph.D., Jean-François Bertholon, M.D., Ph.D., David L. Bolender, Ph.D., Jennifer Brueckner, Ph.D., John R. Cotter, Ph.D., Carlos A. Feldstein, Ph.D., E. Pat Finnerty, Ph.D., and Douglas J. Gould, Ph.D.

ABSTRACT

The objective of this paper is to address one of the primary reasons that manuscripts are rejected for publication in the Journal of the International Association of Medical Science Educators (JIAMSE): poor manuscript writing. One of the primary goals of the International Association of Medical Science Educators (IAMSE) annual meeting is to improve the way we teach medical science students. The information that IAMSE members share in their poster presentations represents cutting-edge medical education research. The impact of these presentations is limited if the results are not disseminated beyond the annual IAMSE meeting to a larger audience. It remains a goal of the JIAMSE Editorial Board to encourage IAMSE members to share their medical education research with the community of medical educators by publishing the results of their work in JIAMSE. JIAMSE is the peer-reviewed, biannual (June and December) electronic journal of IAMSE, published in three languages (English, French, and Spanish). JIAMSE publishes multiple types of medical education-related contributions, including original research manuscripts, reviews, editorials, opinion papers, and announcements. Submissions address a wide range of topics that are of interest to IAMSE members, such as the introduction, application, and success of new teaching methods. In this paper, readers will receive practical information on how to strengthen their medical education reports for publication in JIAMSE. Guidelines for each section of a medical education research manuscript are addressed, as well as the key elements that JIAMSE editors consider when reviewing a paper for publication.
