Medical Science Educator Volume 18: No. 2

Message From the Editor-In-Chief

Uldis N. Streips, Ph.D., Editor-in-Chief, JIAMSE

Dear JIAMSE readers!

Welcome to JIAMSE Volume 18-2. This is an important issue, because with it we implement the decision reached at the IAMSE meeting in Salt Lake City to publish the Journal four times a year, with every type of contribution included. Consequently, in this issue you will find research manuscripts, innovations, a case report, MERGE, a letter to the editor, and instructions for publication. Publishing four times a year will allow us to get your manuscripts published more quickly.

All we need to make this publication schedule a success is your cooperation. First of all, please submit the innovative work you do at your schools for peer review, whether as an innovation, an article, an opinion piece, or a letter to the editor. Remember that your “solution” to the educational cases is also considered a publication following editorial review. Educational publication may not come easily to those of you who, like me, are used to publishing classical bench research. This is one reason we ran a workshop in Salt Lake City on publishing in JIAMSE and other education venues, though we hope you will consider JIAMSE first for your work. The workshop will be repeated in Leiden at the next IAMSE meeting. Even if the process is not easy, our editorial team is very user-friendly and will work with you to make your work as publishable as possible.

I look forward to your submissions; in the meantime, enjoy the educational work presented in JIAMSE Volume 18-2.

All best,

Uldis N. Streips, Ph.D.
Editor-in-Chief

The Medical Educator’s Resource Guide

John R. Cotter, Ph.D.

The World Wide Web is always changing. New sites are added, some of the sites that we are familiar with are removed or become inaccessible (password protected), and others are revamped and/or their content is modified. In the process, the Uniform Resource Locator (URL) may no longer work, even when a site still exists, because the locator has been changed.

In this issue, the websites reviewed by the Resource Guide since 1999 that can still be located on the Web have been gathered and listed by subject. Where necessary, the URL has been updated.

The original reviews for the websites in the new listing can be found in back issues of the Journal or its predecessor, the Basic Science Educator. The current list of websites includes the title of the website, the URL, the reviewer’s name, the volume and issue number(s), the page number(s) and the year of publication.

If you know of a website that basic science teachers, clinical instructors, and students working in the medical sciences would find useful, please consider submitting a review to The Medical Educator’s Resource Guide. You can do so by sending the review by e-mail (jrcotter@buffalo.edu).

The Medical University of South Carolina Hosts the National Pops Website

Gabriel Virella, M.D., Ph.D.

Group-based learning is a powerful educational experience, satisfying for both students and faculty. Parker Small’s POPS platform has been available for over 30 years and is widely used. The Clinical Correlation Exercises (CCE) platform is a derivative of POPS that originated at the Medical University of South Carolina (MUSC) and emphasizes differential diagnosis and laboratory tests. At MUSC, student feedback is highly positive; 90-94% of students agree or strongly agree that “POPS and CCE are effective platforms to integrate clinical material in the Infection and Immunity course.”

The POPS website (http://etl2.library.musc.edu/pops/) includes links to the following cases, in both Word and PDF formats (except where noted):

    • AIDS
    • Elderly with Pneumonia
    • Hepatitis
    • Immediate Hypersensitivity
    • Immunodeficiency Disease
    • Influenza: Serologic Diagnosis and Epidemiology
    • Jaundiced Baby
    • MI Sick (PDF only)
    • Opportunistic Infections
    • Painful Rash
    • Paternity
    • Tetanus Immunity
    • Tuberculosis (2 versions)
    • Transplantation Immunology (2 versions)

All cases can be freely downloaded and modified by users. Users who modify cases are requested to forward an electronic copy of their version to virellag@musc.edu.

Case Study Instructions

First – THANKS so much for agreeing to write a case for our Medical Education Case Study. This is a valuable contribution to our journal and to our readership as they think about how to work effectively with situations in their own institutions, all in the effort to improve how we each educate ourselves and others in our medical education responsibilities.

Examples of areas that might be of interest:

    1. Course director interaction with students.
    2. Individual faculty interaction with the curriculum office.
    3. Faculty affairs office issues revolving around teaching and tenure/promotion, faculty development, etc.
    4. Use of IT in teaching.
    5. Student affairs issues.

Our request of you is:

    1. Write a description of a dilemma or interesting situation to which you would like to see how others respond. The description should end with one or a few questions that you expect responses to address, and should be about two pages (single-spaced, Times New Roman, 12-point font, 1-inch margins on all sides).
    2. If possible, give the case a “catchy” title; if not “catchy”, at least make it descriptive. In either case, try to keep the title length to about 50 characters or less.
    3. Provide your name, title, and institutional affiliation and location. Students and residents – please provide your year of training (e.g., MS 1, PG2, or fellow) as your title.
    4. Send your document to Kathryn.mcmahon@ttuhsc.edu.
    5. Upon review of the case, if modifications are needed, send your revisions within 2 weeks of receipt.

We will ask respondents to:

    1. Read the case and give us their “first impression” response to the questions you pose.
    2. Draft a short (3- to 4-paragraph) response to the questions posed, or to at least one of the questions posed in the case.

Any questions or comments can be sent to Kathryn.mcmahon@ttuhsc.edu.

Respondent Instructions

First – THANKS so much for agreeing to be a respondent for our Medical Education Case Study. This is a valuable contribution to our journal and to our readership as they think about how to work effectively with situations in their own institutions, all in the effort to improve how we each educate ourselves and others in our medical education responsibilities.

Our request of you is:

    1. Read the case and give us your “first impression” response to the questions posed.
    2. Draft a short (3- to 4-paragraph) response to the questions posed, or to at least one of the questions posed in the case.
    3. Provide your name, title, and institutional affiliation and location. Students and residents – please provide your year of training (e.g., MS 1, PG2, or fellow) as your title.
    4. Send your document to Kathryn.mcmahon@ttuhsc.edu within 2 weeks of receipt.

Any questions or comments can be sent to Kathryn.mcmahon@ttuhsc.edu.

Increased Acceptance of Group Learning Exercises by Second Year Medical Students from 2001-2007

Laura M. Kasman, Ph.D., Gabriel Virella, M.D., Ph.D., Gene E. Burges, M.D., Ph.D.

INTRODUCTION AND RESULTS: The MUSC medical microbiology and immunology course is an 11-credit-hour, integrated, second-year course that in the 2007-2008 academic year included 36 contact hours of small group learning exercises. This represented a 20% increase in small group contact hours over 2001-2002. Small group exercises included both Patient Oriented Problem Solving (POPS) and Clinical Correlation Exercises (CCE), with the increase in contact hours made up entirely of new POPS exercises.

Over the same period, the percentage of students who strongly agreed that small group exercises were more effective than lectures in improving retention of material increased from 59% to 77% (a 30% increase). The largest increases in student satisfaction appeared to correlate with the addition of post-tests to CCE exercises in 2006-2007, a modification suggested by students in past evaluations. Post-tests were previously only given after POPS exercises.

POPS and CCE are open-book learning exercises administered to groups of four students. The POPS format splits the information into four different packages, so that students have to share information during the activity. The CCE exercises are a derivative of POPS that originated at MUSC and emphasize differential diagnosis and laboratory tests; in CCE, all students receive identical information, including a detailed faculty-generated discussion of each case.

METHODS: From 2001-2008, approximately 35 groups of 4 students met for 15 (2001-2004) or 18 (2005-2008) two-hour small group sessions over the 15-week course. Fourteen faculty in 7 classrooms facilitated the student groups and administered post-tests. Students were allowed to form groups of their choosing, but student groups and faculty facilitators were constant for the duration of the course. Attendance was mandatory. Post-tests were 10- to 14-question computer-graded quizzes. Students were provided with explanatory answer keys immediately after each quiz. Quiz grades were recorded but were used mostly for student self-assessment. Student assessment of the course was by PACE or E-value web-based course evaluation (response rate >90%).

CONCLUSIONS: The data suggest a general increase in satisfaction with group learning techniques among second-year medical students over the past 6 years, which may have been enhanced by addition of post-tests with immediate feedback at the end of all exercises.

RESOURCES:

National POPS Website:
http://etl2.library.musc.edu/pops/med_ed/mededportal_pops.doc

National CCE Website:
http://etl2.library.musc.edu/pops/med_ed/mededportal_cce.doc

Saving Time With PBL?

Edward C. Klatt, M.D.1, Andrew F. Payer, Ph.D.2

Small group teaching, and problem-based learning (PBL) in particular, demands more time from faculty than a lecture format, but rewards active learning for the development of clinical reasoning skills. How can faculty time be used judiciously while still retaining a small group format? We instituted an integrative PBL model at the end of the 2nd year curriculum that combined faculty, subject material, quiz and examination items, and contact hours across 3 existing courses. Each course previously had its own small group sessions. Time was saved by holding fewer faculty development sessions, substituting wrap-up sessions for small group hours, and having students work on their own for a two-hour session without faculty facilitators. There was a 56% reduction in faculty time required with the integrative format. Student adherence to the goals of the sessions was enforced through required attendance, quizzes covering the content of the sessions, randomly calling on students during wrap-up sessions to discuss the findings, and having each small group compose a summary of its findings for each session. The students were engaged, as reported by faculty facilitators; the faculty were enthusiastic; students called on in wrap-up sessions gave excellent responses; and ratings from student evaluations were equivalent to those of other small groups for the whole year. This integrative format had advantages through its placement at the end of the 2nd year: it was not the first PBL session of the year and students were familiar with the format, the knowledge base of students was considerable, and there was a core of faculty already assigned to small group teaching. Through integration, the small group size went down (7 to 8 students per group, instead of 8 to 10 in the existing courses), and in the student-run session the student roles were sometimes different from those with faculty present – some students became more active participants.
The major disadvantages of this format included coordinating the schedules of faculty from multiple departments, the need for the faculty development person to give multiple review sessions because of irregular faculty schedules, and the fact that course directors other than the lead author made little attempt to familiarize themselves with the wrap-up format.

Strategy to Help Medical Students Learn Biochemistry Despite Course Structural Problems

Janet E. Lindsley, Ph.D., Timothy Formosa, Ph.D.

Our Medical Biochemistry course is a compact, primarily lecture-based course. There are four instructors in the course, each with different teaching styles, emphases and goals. The high density of lectures causes a problem for students who do not adopt strong study habits from the outset of the course since there is no opportunity to catch up. Additionally, the very distinct teaching styles of the participating faculty leave many students frustrated and confused about how and what they should be studying. After trying various strategies with mixed success over the past several years, we have recently implemented a tool that helps to solve both of these problems. The course director worked with the other teaching faculty to write a set of USMLE-format quiz questions that cover key concepts from each lecture. After one or a small set of lectures has been presented, a 5-10 question quiz is released on the course WebCT site. Students have 3 days to complete each open-book quiz. They are encouraged to consult other students and use reference materials. In the spirit of self-assessment, students may re-submit the answers once to improve their score. The WebCT program is set for timed access to the quizzes and manages the grades. The 41 quizzes (317 total questions) count for a total of 10% of the course grade, enough to engage the attention of the students. As the course progresses the quiz questions become progressively more complex, building on previous material. The frequency and progressive nature of the quizzes encourage students to adhere to a more optimal study schedule and to retain previous concepts. These quizzes have increased consistency throughout the course and have helped to focus attention on the key objectives of the more complex lectures. All of the students participated and achieved an average cumulative quiz score of 93%. The vast majority of students reported that the quizzes were very (85%) or moderately (10%) helpful. 
Importantly, the mean on the comprehensive final exam increased 11% compared with that on a similar exam given in the two previous years.

Is it Learning or is it a Prelude to Cheating?

Sandy Cook, Ph.D., Janil Puthucheary, M.B., BCh, BAO, MRCPCH, Robert K. Kamei, M.D.

As I (Associate Dean, Curriculum Development) was sitting in my office poring through hundreds of e-mails, a faculty member stopped in to express concerns over students copying material from the tests. “You’ve got to do something about this!” he exclaimed. I sighed, and casually walked to the rear of the classroom to observe what was going on. Sure enough, several students had their heads buried in the test papers, busily typing away on their computers in what looked very much like: question stem, response option A, response option B, etc. They were not engaged in any of the discussions going on around them – just typing. I came up behind one student, who quickly put a piece of paper over his screen. Hmm, I thought, that seemed a bit suspicious. As I neared another student, I reminded the class loudly – “Only key points, not the full questions.” The faculty member and I walked out again, and he reminded me that I had talked to the class once before. What was I going to do now, he asked?

Evaluating the Teaching-Learning Methodology of the Gross Anatomy Course at San Juan Bautista School of Medicine

Ramonita Correa, Ph.D., Yocasta Brugal, M.D., Jorge Pérez, M.D.

ABSTRACT

This is the first time that assessment tests have been applied in the Gross Anatomy course at the San Juan Bautista School of Medicine. The results showed that our teaching/learning methodology was adequate and effective in enhancing student performance, and helped us analyze student outcomes at the individual course level.

The Impact of Online Lecture Recordings on Learning Outcomes in Pharmacology

Lynette Fernandes, Ph.D.1, Moira Maley, Ph.D., Cert.Med.Ed.2, Chris Cruickshank, B. Sc. (Hons)1

ABSTRACT

Interactive and/or active approaches to learning are known to be associated with better outcomes for the student. Increasingly, lecture recordings can be accessed online and offer irresistible convenience, particularly to students. In Pharmacology, online lecture recordings via Lectopia were designed to provide an adjunct for revision and clarification. The introduction of Lectopia in second year Pharmacology was associated with a marked decrease in lecture attendance and an apparent increase in the failure rate. Therefore, the impact of Lectopia on learning outcomes was examined using an online, voluntary survey. Of the 295 students enrolled during the course of this study, 86% completed the survey. Students were sorted into three groups, depending on whether they usually attended both, one, or no lectures per week. Students who reported that they usually did not attend any lectures had a significantly lower mark in continuous summative assessments than students who attended both or one lecture per week (p < 0.03). Students who usually used Lectopia instead of attending lectures scored significantly lower in continuous summative assessments, the exam, and the final mark than students who did not (p < 0.05). The major reasons cited for using Lectopia were revision, clarification, timetable clashes, and missed lectures. Interestingly, females performed better than males, provided they usually attended both lectures per week and did not use Lectopia instead of attending lectures. Attendance at lectures appears important for the achievement of learning outcomes in second year Pharmacology. The use of online lectures as a replacement for face-to-face lectures may be inappropriate in the biological sciences, which require in-depth understanding of mechanistic and fundamental concepts.
