An Electronic Mail Tutorial to Teach Problem Solving in Pharmacology

Joseph Goldfarb, Ph.D.

Department of Pharmacology

Mount Sinai School of Medicine
New York, NY 10029 U.S.A.

(+) 1-212-659-1710
(+) 1-212-831-0114

INTRODUCTION

Over the past few decades medical schools in both the United States and Great Britain have been urged to decrease lecture hours and use formats that foster problem solving and self-learning.1-4 One response has been the increased use of small group exercises in which students are provided with a case description, a journal article, a data set, or a numerical or analytical problem to solve. During the exercise, students are encouraged to discuss the case or problem with faculty guidance, and to raise any issues that they need clarified. Depending on the instructional model, the faculty member may provide information and/or extract it from the students. Alternatively, the issues raised by students and faculty may form the basis for independent student inquiry, the results of which are discussed at a subsequent group meeting.

The transition to increased reliance on small group sessions raised many problems, including the recruitment of sufficient faculty during regularly scheduled class time; wide disparities in the level of student (and often faculty) facility with the material being discussed; student (and faculty) inhibitions about publicly admitting to lack of knowledge or understanding; and the difficulty, during group sessions, of extended one-on-one student-faculty interaction tailored to the needs of individual students.

The Pharmacology Course at Mount Sinai includes 31 hours of small group conference, performance in which contributes 20% to the final course grade. To facilitate evaluation of students, faculty members spend at least eight conference hours with a single group, often covering diverse drug classes. The wide range of subject matter and the small number of faculty permit us to field only 7 groups of 15 students each, so significant attention to individual student needs is difficult.

An e-mail tutorial was initiated in 1997 as a supplement to the Pharmacology course in an attempt to overcome some of the limitations of our small group teaching, while retaining the focus on problem solving and case discussion. All students have e-mail accounts at Mount Sinai and there are electronic mailing lists for each of the classes. Student participation in the e-mail tutorial is voluntary and is not part of the evaluation that contributes to the course grade. This paper describes the tutorial, and discusses it in relation to small group conference.

METHODS AND RESULTS
Description
Problems similar to those included in small group conference are sent via the class electronic mailing list on Thursdays. Problems are sent weekly save for the last two weekends before exams and weekends during holidays. In the 1998 course, 9 e-mail problems were sent: 6 before the midterm examination, during the first 9 weeks of the course, and 3 during the remaining 6 weeks. Students must reply by e-mail no later than 11:59 PM the following Monday to initiate the tutorial. In their replies, students are expected to explicitly describe their reasoning, not just to provide a numerical answer or a list of drugs. If they do not know how to approach the problem, they are asked to be specific about what they don’t understand.

Individualized responses are sent to each student. If the student answers appropriately and appears to have good command of the material, this is noted and, when applicable, supplemental information that would generally be considered beyond the course requirements may be offered. If the student evinces minor difficulties or has made an easily correctable mistake, the correction is made and the student is invited to continue the e-mail conversation if any uncertainty remains. If, on the other hand, there seems to be some basic deficit in either knowledge or reasoning, then rather than simply providing an answer, the basic principles necessary to solve the problem are provided and the student is asked to try the problem again. In this case, the student still retains the option of doing no further work and simply requesting a correct solution. Although there is a time limit for the first student response to an e-mail problem, once the student-faculty conversation has begun there are no time constraints, and the conversation can continue until the student is satisfied.

The first e-mail tutorial problem from the 1998 series is shown below as an illustration of the format:

Welcome to the Pharmacology E-Mail Tutorial.

In answering the following problem, please briefly explain the rationale for your answers; don’t just give numbers for the numerical parts of the problem. Also please make sure all your numerical answers have correct units.

If you don’t know how to get an answer to a particular part of the problem try to verbalize your concerns. What information do you think you need, but haven’t been given? What is confusing you? Remember, there are no “dumb” questions.

Problem 1:
At therapeutic concentrations, Drug X is eliminated with ZERO ORDER KINETICS. It also obeys a one compartment model (that is, its distribution after intravenous injection is so rapid it can be considered to be instantaneous).

A 200 mg bolus IV injection of X administered at noon yielded an initial plasma concentration of 10 mg/L. Plasma concentration of X measured at 3:00 PM was 8 mg/L.

a) At what time would plasma concentration reach 4 mg/L?

b) If you gave this 200 mg bolus dose as a loading dose, and then immediately started an IV infusion of Drug X, what IV infusion rate of X would you have to give to maintain the plasma concentration of X constant at 10 mg/L?

c) What would happen if you used an IV infusion rate higher than the one you calculated in part b?

d) Under what circumstances, in real life, would you expect a drug to exhibit zero order kinetics? Given this circumstance, what would normally occur as the drug’s plasma concentration decreased?

Remember, for you to receive an answer to your e-mail, I must receive it no later than Monday, January 12.
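
For readers who wish to check the arithmetic, one way to work Problem 1 is sketched below. This is an illustrative sketch only, not the feedback sent to students, and it assumes nothing beyond the zero-order and one-compartment relationships stated in the problem; Python is used here simply as a convenient calculator.

# Illustrative sketch (not part of the tutorial e-mail): working Problem 1
# under the stated zero-order, one-compartment assumptions.
dose_mg = 200.0     # IV bolus dose
c0 = 10.0           # initial plasma concentration, mg/L
c_3h = 8.0          # concentration 3 hours after the bolus, mg/L

# Zero-order elimination: concentration falls at a constant rate k0 (mg/L per hour).
k0 = (c0 - c_3h) / 3.0          # (10 - 8)/3, about 0.67 mg/L per hour

# (a) Time for the concentration to fall from 10 mg/L to 4 mg/L.
t_to_4 = (c0 - 4.0) / k0        # 9 hours after noon, i.e. 9:00 PM

# (b) Infusion rate that exactly replaces the amount of drug eliminated each hour.
vd = dose_mg / c0               # apparent volume of distribution = 20 L
infusion_rate = k0 * vd         # about 13.3 mg per hour

print(f"k0 = {k0:.2f} mg/L/h, Vd = {vd:.0f} L")
print(f"(a) 4 mg/L is reached {t_to_4:.0f} h after noon")
print(f"(b) maintenance infusion rate = {infusion_rate:.1f} mg/h")

# (c) Because elimination is capped at k0 * Vd mg/h, any infusion rate above this
#     value causes the plasma concentration to rise steadily rather than plateau.
# (d) Zero-order behavior usually reflects saturated elimination; as the plasma
#     concentration falls below saturating levels, kinetics revert to first order.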

Although figures or tables can be scanned and attached to the e-mail so that students can work from real data, this was not done because there was no assurance that all students would have the appropriate software for handling these attachments. It was possible, however, to mimic graphical data as part of the text of the e-mail message; this approach was used to display drug-induced changes in heart rate and blood pressure, and to draw simple line graphs.
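
As an illustration of how such text-based “graphs” might be produced, the short sketch below prints a simple character-based plot from an invented series of heart-rate values; both the data and the layout are hypothetical and are not taken from the course e-mails.

# Hypothetical sketch: mimicking a simple graph in plain e-mail text.
# The heart-rate values below are invented for illustration only.
heart_rate = [72, 75, 90, 110, 118, 112, 95, 80]   # beats/min at 5-minute intervals

max_hr = max(heart_rate)
for interval, hr in enumerate(heart_rate):
    bar = "*" * round(40 * hr / max_hr)            # scale each value to a 40-character bar
    print(f"{interval * 5:>3} min | {bar} {hr}")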

Student Participation

In 1997, the year this exercise was initiated, 25 students participated at least once. In 1998, out of a class of 110 students, 51 participated at least once in this exercise. The number of students replying to each of the 9 e-mail problems is shown in Figure 1, along with the number of faculty responses. The differences in the numbers of faculty replies and student participants reflect both continued e-mail conversations and a few instances in which two or three students elected to do the problem as a group. Overall, about one quarter of the initial responses were continued for at least one additional iteration. On two of the nine problems, no students continued the e-mail conversation beyond the initial response and faculty feedback. On the other problems, from 15% to 64% of the initial respondents were asked to redo the problem or had comments about the feedback that prompted them to respond with an additional e-mail. Participation peaked during the middle of the first half of the course, and was lower after the mid-term exam. Only 5 students did all 9 problems. Figure 2 shows the degree of participation by individual students. Data are shown for the entire course.

It should be noted that students may use the problems even if they don’t take part in the tutorial. In response to a request for feedback on the tutorial, one student who had answered only one problem during the course indicated, “I have been saving the [problems] to use as review for the exam, even though I don’t have ‘official’ answers.”

Nature of Student Responses
Almost all students who used the tutorial were explicit in their reasoning. One of the motivating factors for starting the e-mail tutorial was the perception by many of our faculty that some students were hesitant to participate in conference because they were reluctant to state their ignorance about a particular area explicitly, or feared revealing it inadvertently in an attempt to answer a question. Students who used the tutorial were certainly explicit about their uncertainties. Following are two quotes from initial responses to problems:

“I am rather confused about how to attack these problems. Although I understand the concept behind the formulas, I was uncertain as to how to integrate them to solve this problem. Here is my attempt: …”

“I tried to work on this problem, but I really don’t have an idea where to start. Can you give me a feedback in which way is best to approach this problem. Thank you”

Students sometimes raised questions when they recognized that there was something illogical about their answer or when they considered doing the problem a different way. This often revealed a lack of understanding that would have gone undetected had the problem been assigned as a written exercise to be turned in, rather than as the first stage in what could develop into an extended conversation.

For example, a student answering a multi-part problem about a dosing regimen ended up predicting that the highest concentration attained was lower than the mean concentration, an impossible result. She recognized this and asked what error she had made. Analysis of her answer and explanations revealed that she was using equations in a rote way without fully understanding their meaning. However, she was not blindly plugging numbers into equations, because she was aware that her answers to two parts of the problem were incompatible.

Timing of Student Replies
The e-mail format permits the students to answer problems at their convenience, so it is not unusual to have initial replies to the problems posted well after school hours and throughout the weekend. But the time of e-mailing may not indicate when students worked on the problems, because many students told me that they made a hard copy of the problem on Thursday or Friday, worked on it over the weekend and e-mailed the answer on Monday.

Faculty Response to Participants
Faculty feedback to students was sent in the order their e-mails were received, often within 24 hours, sometimes sooner. Virtually all student e-mails received by Sunday evening were answered by Monday evening.

As this suggests, the major disadvantage of the tutorial was that it was time intensive. On weeks when 20 or more students responded, it took from 12 to 20 hours to answer all the e-mails, depending on the difficulty of the problem and the number of students who continued with a second or third iteration of the process. In part, this reflected the fact that many of the problems had multiple sections. The length of feedback on the initial student responses varied widely. For tutorials 3, 4, and 5, for example, replies to students averaged about 300, 350, and 130 words, respectively, with a range of about 40 to 680 words. At the expense of covering less ground, the problems could be simplified so that the length of the feedback to students and the faculty time commitment would be decreased. Another alternative would be to have the initial faculty recipient distribute the student replies to a number of designated faculty who would share the workload. The most obvious way to reduce the time expenditure would be to use pre-composed boilerplate answers for the bulk of the feedback, but this would be in direct conflict with the objectives of the exercise.

Examination Performance
Because 6 of the 9 e-mail tutorial problems occurred before the midterm exam, and because participation was higher during the first section of the course, only midterm exam performance was analyzed. The midterm consisted of 70 multiple choice questions. The mean score for the entire class was 71.2% with a standard deviation of 10.9%. The median score was 73.6%.

Figure 3 compares the distribution of exam scores of students who participated in the e-mail tutorial with that of students who did not. Students who participated were more likely to have total examination scores above the median. This primarily reflected students who did at least half the e-mail problems (24 of 28 such students had grades above the median grade for the entire class), but not those who did only 1 or 2 e-mails (only 10 of 21 such students had grades above the class median grade). These data, of course, do not reveal whether the e-mail tutorial simply attracted the students who had less trouble mastering the material, and who therefore felt they had the time to participate in a supplemental voluntary exercise, or whether participation in the tutorial was instrumental in improving performance. Because our goal was to offer the tutorial to all students, a prospective design in which the intervention was offered to only part of the class as a randomized, controlled experiment was not possible.

However, it is possible to assess the contribution of the tutorial by analyzing students’ performance on the midterm questions that were most closely related to the tutorial problems, compared to the remainder of the exam. Twelve of the 70 exam questions were directly related to problems presented in the e-mail tutorials. Based on the grade distribution shown in Figure 3, the class was divided into two groups: students doing 3 or more of the 6 pre-midterm problems, and those who did not participate or did only 1 or 2 problems.

As shown in Table 1, the 28 students who did 3 or more problems performed better than the remainder of the class on both related and unrelated questions. A 2-way ANOVA comparing the performance of the two groups of students on the two subsets of exam questions showed an overall difference between the two groups of students.
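
For readers who wish to set up this kind of analysis with their own data, a minimal sketch of a 2-way ANOVA, with participation group and question type as factors, is given below. The scores used here are invented placeholders, not the values reported in Table 1, and the Python statsmodels package is used simply as one convenient option.

# Sketch only: 2-way ANOVA of percent-correct scores by participation group
# (3 or more problems vs. 0-2 problems) and question type (tutorial-related
# vs. unrelated). The scores below are invented placeholders, not Table 1 data.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

records = []
for student in range(10):                           # 10 hypothetical students
    group = "high" if student < 5 else "low"
    for qtype, base in (("related", 80 if group == "high" else 65),
                        ("unrelated", 75 if group == "high" else 68)):
        records.append({"group": group, "qtype": qtype, "score": base + student})

df = pd.DataFrame(records)
model = ols("score ~ C(group) * C(qtype)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))              # main effects and interaction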

Student Evaluation of the Tutorial
Students were asked, by e-mail, to comment on the strengths and weaknesses of the tutorial as compared to small group conference. Two of the comments follow.

“I also appreciate you taking time to answer us individually because, as opposed to small group where there are too many of us, my thought processes could be picked apart. On the same token, it was a useful exercise for me to have to verbalize my thought process because it helped me identify my weaknesses for myself.”

“The only way it’s [the e-mail tutorial] worse than conference is that there is more of a delay in feedback. Otherwise it’s great b/c I’m a lot more likely to attempt to answer a question w/ “I don’t know exactly but I think the answer might be….” in a one-on-one situation instead of in a group of 20. Also, the feedback is more focused to my individual weaknesses vs. a group as a whole where different people may have different problems and not all are addressed.”

Students varied in their opinions about the consequences of the delay noted above between a student’s answer and the faculty feedback:

“The dialogue succeeding the original question–most importantly the depth and care of the responses from you…is a chance to get one on one attention with intricate problem solving and have individual problems addressed. The speed of replies keeps the dialogue fresh and valuable.”

Students also had the opportunity in the end-of-course narrative evaluations to comment on this exercise. Of 97 students who wrote narrative evaluations, 12 cited the e-mail tutorial as one of the strengths of the course.

DISCUSSION

E-mail is an extremely common communication medium in academia and elsewhere, but there are few reports of its use as a stand-alone educational tool. Latting5 introduced e-mail in a graduate social work class in which a major goal was the use of electronic communication itself, rather than its use as a teaching and learning tool. Letterie, et al.6 report the use of an e-mail system for didactic teaching in an Obstetrics and Gynecology residency program. In this instance, e-mail was used to distribute questions, and residents replied by e-mail. However, rather than individual feedback keyed to each resident’s response, stock answers and references were distributed by e-mail and the questions were discussed in later group meetings.

Coulehan, et al.7 used e-mail to supplement a small group curriculum in ethical and social issues in medicine. Students formed small e-mail (NET) groups with a tutor. Problems were distributed by e-mail every 3 weeks, and students had a week to respond with an initial analysis that was distributed to the tutor and to the other students in the group. This was followed by student critiques of at least one of their colleagues’ initial submissions. All students received an open copy of at least one tutor’s critique of the student submissions and a discussion summary. In addition, faculty could, and sometimes did, post individual private critiques to students. In this model the NET group was also a small-group conference group, and so there was spillover from the e-mail exercise into the class meetings.

Perhaps the closest parallel in format to the current Pharmacology tutorial model is that of Todd8 who used e-mail in an undergraduate Child Health Nursing course. Students were required to do 10 “critical thinking exercises” in which a scenario was distributed by e-mail along with four possible courses of action. Students were given 5 to 7 days to select an answer to the question and to present the rationale for their answer. The goal was for students to read the relevant material before the issues were discussed in class. Faculty feedback to the students consisted of a “generalized feedback response” which was “personalize[d] as needed.” Performance in this exercise was evaluated and contributed to the course grade.

Some of the advantages of the use of e-mail cited by Coulehan, et al.7 include giving voice to students whose personalities militated against full expression during small group discussions, and permitting students to consider problems at their convenience with fewer time constraints than with in-class discussions. Similar advantages have been recognized by other faculty who have used e-mail in various ways in the classroom9,10 and appear to be at work in the Pharmacology tutorial as well. A specific advantage of using regularly scheduled e-mail problems that was evident in our course, and that has been recognized by others,8,11 is that it often enables faculty to recognize deficits in a student’s basic fund of knowledge or thought processes early on and to correct them immediately. Because such gaps are rarely restricted to a single student, this also allows faculty to review these areas with the entire class during lectures or small group sessions, or to use e-mail to alert the class to a possible widespread misconception.

The Pharmacology tutorial differs from previously described models in many respects. First, it is totally voluntary. Not only do students have the option of participating or not participating, but participating students have no continuing obligations during the course of the semester. One can enter the process at any problem and can pick and choose from week to week whether or not to participate in the tutorial. While this offers maximal flexibility to students, it also means that participation is likely to wane as other factors take precedence. Thus, participation dropped as the Pharmacology midterm exam (as well as the final exams in Endocrine and GI-Liver Pathophysiology) approached. Participation also remained low during the second half of Pharmacology, when many students were starting their review for Step I of the USMLE exams. Second, participation in no way contributed to the course grade, not even as a supplement for those students who were borderline. That is not to say that student responses were not evaluated. In fact, the quality and quantity of the evaluative comments were probably higher than for any other aspect of the course, in both expressions of praise for jobs well done and constructive criticism when responses were less than satisfactory.

Third, improving computer literacy was not one of the goals of the program; ability to use e-mail was assumed. Fourth, this model was focused entirely on faculty-student interaction, as opposed to the student-student interaction described for the NET groups of Coulehan, et al.7 Notwithstanding this, a few students elected to do some of the tutorials in groups of 2 or 3, which shows that the model does not prohibit cooperative interactions among students.

CONCLUSIONS

This exercise is, at its core, a one-on-one interaction between student and faculty with faculty feedback intimately dependent on the student’s particular response, and with the possibility for an extended student-faculty conversation. Unlike almost all the examples in the literature, the Pharmacology tutorial has no feedback component that is “boilerplate.” Except for the fact that each week’s exercise is based on a different faculty-chosen problem, it is the electronic equivalent of the classic model of weekly meetings between a student and tutor.

As an elective component of the course, it provides an opportunity for students who want to extend their understanding by engaging in sophisticated dialog with faculty in a problem-based setting, while simultaneously affording students who need remedial work a private tutorial where they can ask any questions and feel free to openly voice uncertainty and ignorance.

Overall exam grades were higher for students who were the most active tutorial participants, and their performance, relative to that of non-participants, was even better on questions directly related to tutorial content. This suggests that even if higher-achieving students self-selected to participate, the tutorial could improve performance even in this group.

The major disadvantage of this model is the immense faculty time commitment required. But despite the time invested, the e-mail tutorial is, at least to this faculty member, one of the most satisfying endeavors in large group teaching.

REFERENCES

  1. Association of American Medical Colleges. Physicians for the Twenty-First Century: The GPEP Report: Report of the Panel on the General Professional Education of the Physician and College Preparation for Medicine. AAMC, Washington DC, 1984.
  2. Association of American Medical Colleges. Educating Medical Students: Assessing Change in Medical Education–The Road to Implementation. AAMC, Washington DC, 1992.
  3. General Medical Council Education Committee. Recommendations on Undergraduate Medical Education. The General Medical Council, London UK, 1993.
  4. Marston, R.Q. and Jones, R.M. (eds.) Medical Education in Transition: Commission on Medical Education: The Sciences of Medical Practice. Robert Wood Johnson Foundation, Princeton NJ, 1992.
  5. Latting, J.K. Diffusion of Computer-Mediated Communication in a Graduate Social Work Class: Lessons from “the Class From Hell.” Computers in Human Services 10: 21-45, 1994.
  6. Letterie, G.S., Morgenstern, L.L. and Johnson, L. The Role of an Electronic Mail System in the Educational Strategies of a Residency in Obstetrics and Gynecology. Obstetrics & Gynecology 84: 137-139, 1994.
  7. Coulehan, J.L., Williams, P.C. and Naser, C. Using Electronic Mail for a Small-Group Curriculum in Ethical and Social Issues. Academic Medicine 70: 158-160, 1995.
  8. Todd, N.A. Using E-mail in an Undergraduate Nursing Course to Increase Critical Thinking Skills. Computers in Nursing 16: 115-118, 1998.
  9. Folaron, G. Enhancing Learning with E-Mail. Journal of Teaching in Social Work 12: 3-18, 1995.
  10. Manning, L.M. Economics on the Internet: Electronic Mail in the Classroom. Journal of Economic Education 27: 201-204, 1996.
  11. Pitt, M. The Use of Electronic Mail in Undergraduate Teaching. British Journal of Educational Technology 27: 45-50, 1996.

NOTE: Please refer to Vol 10 No 1&2 Complete PDF file for all Tables & Figures.