Medical Students’ Performance in the IV Year Exit Exam: Effect of Clinical Reasoning Exercises, Self-Observation on Tape, and Faculty Feedback on Clinical Skills

Fabrizia Faustinella, M.D., Ph.D., Philip R. Orlander, M.D., Laura A. Colletti, M.D., Harinder S. Juneja, M.D., and Linda C. Perkowski, Ph.D.

University of Texas Health Science Center at Houston
Houston, TX 77030, U.S.A.

+1-713-500-6714
+1-713-500-0654

ABSTRACT

The Clinical Performance Examination (CPeX) is administered at the end of the IV year required Internal Medicine Clerkship. The exam consists of eight standardized patient encounters, during which the students are required to perform a focused history and physical exam based on the patient's presenting complaint. A remediation plan consisting of the following steps was developed for students who fail the exam:

Step I: Students’ performance review.
Step II: Clinical reasoning exercise assignment.
Step III: Formative feedback session with self-observation on videotape.
Step IV: Additional faculty-guided clinical reasoning exercise.

During the academic year 2002-03, 21 students out of 191 (11%) failed the initial CPeX. The students who failed went through the remediation process and retook the exam. The post-remediation exam scores show significant improvement in both history and physical examination skills. We conclude that clinical reasoning exercises help students understand how to choose the most important components of the history and physical exam to best delineate the patient's problem and to develop a differential diagnosis; that self-observation on tape helps students gain awareness of their deficiencies and focus on their own areas of weakness; and that formative feedback plays a central role in aiding students to improve their performance, as widely supported by the literature.


INTRODUCTION

In this era of great technological progress, it is becoming common to think that diagnoses can be made more easily by ordering tests and that the complex decisions physicians face every day can be made by consulting computerized practice guidelines. However, this is not the case. Excellence in basic clinical skills (i.e., history-taking, physical examination, clinical reasoning, and patient-physician interaction) remains of critical importance in the professional life of the practicing physician. Despite the obvious importance of these skills, a wealth of studies documents significant deficiencies in their development among students and residents.1-4

At our institution, faculty observation of student performance in the exit exam shows that a large number of students have inadequate clinical skills, especially with respect to history-taking and physical examinations. These data are in line with many publications and with the latest report of the Association of American Medical Colleges on the clinical education of medical students in the United States. The report concluded that there is a clear need to ensure that medical students acquire fundamental clinical skills, particularly history-taking and physical diagnosis skills.5

In the past, students who failed the CPeX were allowed to retake the exam without undergoing any type of intervention. On average, their clinical performance remained poor even though the exam cases did not change. To help students better understand their own deficiencies and improve their performance, we decided to develop a targeted remediation process for students who fail the exam. The remediation process comprises clinical reasoning exercises, self-observation on tape, and formative feedback. In addition, we made changes to the CPeX itself, including revision of the cases by a committee of faculty members and development of case-specific checklists to better identify students' areas of deficiency. We also introduced a formal faculty discussion-feedback session following the patient encounters.

MATERIALS AND METHODS

CPeX
The Clinical Performance Examination (CPeX) is administered at the end of the IV year required Internal Medicine Clerkship. The exam consists of eight standardized patient (SP) encounters, during which the students are required to perform a focused history and physical exam based on the SP's presenting complaint. Prior to entering the examination room, the students are provided with the patient's name, age, gender, chief complaint, and vital signs. Specific instructions are posted on the door to each examination room, reminding the students to perform a focused interview and a physical examination as indicated by the problems suggested in the interview. Faculty observe the students from a video room as the encounters are being taped. SP encounters last 12 minutes each. The students do not receive verbal feedback from the standardized patients. After each patient encounter, the students return to a writing station, where they are given eight minutes to write a problem list and a differential diagnosis. At the end of the exam, students have two 15-minute faculty discussion/feedback sessions in which they are asked programmed, case-specific questions about two of the patient encounters. The students are asked to justify the problem list based on the data gathered and to explain the differential diagnosis based on the problem list. Faculty ask questions about the pathophysiologic principles underlying the patient's problems, following a specific faculty guide to ensure standardization among examiners. The faculty then give students feedback on their performance, pointing out strengths and weaknesses of the interview, physical examination, and patient/physician interaction.

Case-Specific Checklists and Standardized Patient Reliability
To better identify areas of deficiency, we developed case-specific checklists composed of critical items from the history and physical examination. The standardized patients complete the checklists at the end of each student's encounter. A faculty member also observes each student during two clinical encounters and completes the corresponding checklists. A random review of videotapes showed high concordance between SP and faculty checklists, supporting the accuracy of the grades. Specifically, a sample of 164 SP checklists was selected, with all eight cases represented. Each SP checklist was compared to the corresponding faculty checklist, and items were identified for which the SP decision was not confirmed by the faculty observer. A second observer was subsequently assigned to review the videotapes. An item was considered confirmed when it was supported by at least two of the three observers. The percentage of confirmed items for each sampled encounter was calculated; overall, the SP decisions were confirmed 91 percent of the time (Table 1).
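
The confirmation procedure amounts to a simple majority rule over the three observers. The sketch below is illustrative only; the function name and item marks are hypothetical and are not taken from the study data. It shows how a per-encounter confirmation rate could be computed when each checklist item carries the SP's mark, the faculty observer's mark, and, where the two disagreed, a second reviewer's mark.

    # Illustrative sketch: majority-rule confirmation of SP checklist items.
    # The item marks and function name are hypothetical, not study data.
    from typing import List, Optional

    def confirmation_rate(sp: List[bool],
                          faculty: List[bool],
                          reviewer: List[Optional[bool]]) -> float:
        """Fraction of items on which the SP's decision is supported by at
        least two of the three observers (SP, faculty, second video reviewer).
        reviewer[i] is None where the faculty already agreed with the SP."""
        confirmed = 0
        for sp_mark, fac_mark, rev_mark in zip(sp, faculty, reviewer):
            votes = 1                                 # the SP's own decision
            votes += int(fac_mark == sp_mark)         # faculty agreement
            if rev_mark is not None:
                votes += int(rev_mark == sp_mark)     # second reviewer agreement
            if votes >= 2:
                confirmed += 1
        return confirmed / len(sp)

    # Hypothetical 5-item encounter: the faculty disagrees on one item, the
    # second reviewer sides with the SP, so all five items end up confirmed.
    sp_marks      = [True, True, False, True, True]
    faculty_marks = [True, True, True,  True, True]
    review_marks  = [None, None, False, None, None]
    print(confirmation_rate(sp_marks, faculty_marks, review_marks))  # 1.0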

CPeX Grade
In order to pass the exam, students have to correctly perform 65 percent of the critical items on both the history and the physical examination. The final CPeX grade is based on multiple components, weighted as follows (a worked example of the grade calculation is sketched after the list):

History: 40%
Patient/Physician Interaction: 10%
Physical Exam: 40%
Faculty Discussion & Feedback Session: 10%
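
As a worked illustration of the weighting above, the following sketch computes a final grade for a hypothetical student; the component scores, and the assumption that the 65 percent pass rule is applied to the same history and physical-exam percentages used in the weighted grade, are ours and not part of the exam's actual scoring procedure.

    # Illustrative sketch: weighted CPeX grade and the 65% critical-item pass
    # rule. Component scores and the pass-rule mapping are assumptions.
    WEIGHTS = {
        "history": 0.40,
        "interaction": 0.10,
        "physical_exam": 0.40,
        "faculty_session": 0.10,
    }

    def final_grade(scores: dict) -> float:
        """Weighted average of component scores, each given as a percentage."""
        return sum(WEIGHTS[c] * scores[c] for c in WEIGHTS)

    def passes(scores: dict, threshold: float = 65.0) -> bool:
        """Passing requires at least 65% of critical items on BOTH the history
        and the physical exam (here assumed equal to the component scores)."""
        return scores["history"] >= threshold and scores["physical_exam"] >= threshold

    # Hypothetical student: good interpersonal skills, weak physical exam.
    student = {"history": 70, "interaction": 90,
               "physical_exam": 55, "faculty_session": 80}
    print(final_grade(student))  # 0.4*70 + 0.1*90 + 0.4*55 + 0.1*80 = 67.0
    print(passes(student))       # False: physical exam below the 65% threshold

In this hypothetical case the weighted grade is 67 percent, yet the student fails because the physical-exam score falls below 65 percent, illustrating that the stated pass criterion is applied to the history and physical-exam components separately.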

Remediation Process
Students who fail the exam go through a remediation process with a faculty member. The remediation process consists of the following steps:

Step I: Students' performance review. The faculty member reviews the SP checklists and the clinical encounter tapes to identify in advance the student's areas of weakness.

Step II: Clinical reasoning exercise assignment. The faculty member contacts the student and asks him/her to focus on two of the exam cases. Specifically, the student is asked to think about which elements of the history and physical exam are relevant in those clinical situations.

Step III: Formative feedback session. The faculty member meets with the student for a formative feedback session, during which the cases are discussed and pertinent parts of the tape are reviewed, with a focus on the student's areas of weakness.

Step IV: Additional clinical reasoning exercise. The student is asked to read a short clinical case unrelated to the exam and to identify which portions of the history and physical are relevant to the diagnosis under consideration. The case is then discussed with the faculty member, who guides the student through this clinical diagnostic exercise.

After going through the remediation process, the student is allowed to retake the exam.

Statistical Analysis
The Cochran-Armitage trend test was used to analyze the monthly pass and fail data throughout the academic year.
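
For readers unfamiliar with the test, the sketch below shows one standard way to compute a Cochran-Armitage trend statistic from monthly pass/fail counts, using equally spaced month scores and the normal approximation; the monthly counts in the example are invented for illustration and are not the study data.

    # Cochran-Armitage trend test for a linear trend in monthly pass proportions.
    # The monthly counts below are invented for illustration only.
    import numpy as np
    from scipy.stats import norm

    def cochran_armitage_trend(passes, totals, scores=None):
        """Return (z, two-sided p-value) for a linear trend in proportions."""
        passes = np.asarray(passes, dtype=float)
        totals = np.asarray(totals, dtype=float)
        if scores is None:
            scores = np.arange(len(totals), dtype=float)  # month index 0, 1, ...
        p_bar = passes.sum() / totals.sum()               # overall pass proportion
        t_stat = np.sum(scores * (passes - totals * p_bar))
        var = p_bar * (1 - p_bar) * (np.sum(totals * scores ** 2)
                                     - np.sum(totals * scores) ** 2 / totals.sum())
        z = t_stat / np.sqrt(var)
        return z, 2 * norm.sf(abs(z))                     # two-sided p-value

    # Hypothetical monthly pass counts over a 12-rotation academic year.
    monthly_passes = [12, 13, 14, 13, 15, 14, 16, 15, 16, 17, 17, 18]
    monthly_totals = [16] * 12
    z, p = cochran_armitage_trend(monthly_passes, monthly_totals)
    print(f"z = {z:.2f}, p = {p:.3f}")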

RESULTS

During the academic year 2002-03, 21 students out of 191 (11%) failed the initial CPeX (Table 2). The students who failed went through the remediation process and retook the exam. The post-remediation exam scores, with a few exceptions, show significant improvement in both history and physical examination skills (Table 3). Statistical analysis of the monthly CPeX failure rate shows an increasing proportion of students passing the test in the second half of the academic year. Specifically, the Cochran-Armitage trend test shows an increasing trend toward better performance among students later in the year (p = 0.041).

A comparison of the case component percentages shows that many students, including those who pass the exam, demonstrate difficulty with both history-taking and physical examination skills, while they usually have a good command of interpersonal skills.

DISCUSSION

Our data show that, with a few exceptions, there is significant improvement in students' performance on the CPeX after targeted remediation. We believe that clinical reasoning exercises, self-observation on tape, and formative feedback play a central role in aiding students to improve their clinical performance.

Observation of the students reveals that they have difficulty taking a pertinent, focused history and deciding which items of the review of systems, past medical history, social history, and family history are relevant for a specific patient. As a result, it becomes harder for them to choose the appropriate items of the focused physical examination. This problem is certainly due in part to insufficient knowledge and clinical exposure. Indeed, as mentioned above, the Cochran-Armitage trend test shows an increasing trend toward better performance among students later in the year (p = 0.041). This could be due to several factors, including a general improvement in clinical skills resulting from greater exposure to patient care. Nevertheless, even with knowledge and clinical exposure adequate for their level of training, students need help to understand how to choose the most important components of the history and physical exam to best delineate the patient's problem. That is where the clinical reasoning exercises come into play: they help students analyze and integrate data, select and use information effectively, and develop a differential diagnosis.6-7 As a result of our findings, we have introduced a series of clinical reasoning exercises during the IV year Internal Medicine Clerkship. The students break into small groups, and a faculty member helps them work through several of the most common complaints and symptoms physicians face in the daily practice of medicine. There seems to be a consensus among students that the clinical reasoning exercises are very helpful in improving their clinical skills. Preliminary analysis of the CPeX pass/fail data for the academic year 2003-2004, after implementation of the clinical reasoning exercises, shows a trend toward better performance compared with the previous academic year.

Faculty feedback on videotaped performances is also a useful tool for improving medical students' clinical skills. Self-observation on videotape helps students to fully understand and gain awareness of their deficiencies and to focus on their own areas of weakness.8-9 After receiving feedback, students improve their ability to elicit relevant information and to perform the appropriate physical exam.10-12

CONCLUSIONS

Gaps in medical knowledge can be easily identified when they create major obstacles to the student's ability to take a pertinent history and perform an adequate physical exam. The lack of effective clinical reasoning strategies, on the other hand, is harder to diagnose and more difficult to remedy. Direct observation of students' performance plays a central role in identifying problems with clinical skills, and we find that the use of SPs in simulated medical encounters is a valuable tool for assessing students' performance, in addition to direct observation with real patients. The use of clinical reasoning exercises and self-observation on tape with faculty feedback helps students improve their clinical skills.

REFERENCES

  1. Pfeiffer C., Madray H., Ardolino A., Williams J. The rise and fall of students' skill in obtaining a medical history. Medical Education. 1998; 32:283-8.
  2. Meuleman J.R., Caranasos G.J. Evaluating the interview performance of internal medicine interns. Academic Medicine. 1989; 64:277-9.
  3. Li J.T.C. Assessment of basic examination skills of internal medicine residents. Academic Medicine. 1994; 69:296-9.
  4. Johnson J.E., Carpenter J.L. Medical house staff performance in physical examination. Archives of Internal Medicine. 1986; 146:937-41.
  5. Nutter D., Whitcomb M. The AAMC project on the clinical education of medical students. Association of American Medical Colleges. 2002.
  6. Kassirer J.P., Gorry G.A. Clinical problem solving: a behavioral analysis. Annals of Internal Medicine. 1978; 89:245-255.
  7. Elstein A.S., Shulman L.S., Sprafka S.A. Medical problem solving: an analysis of clinical reasoning. Harvard University Press. 1978.
  8. Scheidt P.C., Lazoritz S., Ebbeling W.L., Figelman A.R., Moessner H.F., Singer J.E. Evaluation of a system providing feedback to students on videotaped patient encounters. Journal of Medical Education. 1986; 61:585-590.
  9. Davis E.L. et al. Use of videotape feedback in a communication skills course. Journal of Dental Education. 1988; 52(3):164-166.
  10. Maguire P., Roe P., Goldberg D., Jones S., Hyde C., O'Dowd T. The value of feedback in teaching interviewing skills to medical students. Psychological Medicine. 1978; 8:695-704.
  11. Ende J. Feedback in clinical medical education. Journal of the American Medical Association. 1983; 250:777-781.
  12. Carr C. How performance happens (and how to help it happen better), part 9: Feedback. Performance & Instruction. 1991; 30(8):26-30.
