Expert and Non-expert Tutors: Role in Problem-Based Learning

Ihab A.M. Ahmed, MBBS, FRCS, CME, MD, FRCS (Gen)1, Maha A.M. Elseed, MBBS, MRCPCH2, and Mohamed A. El-Sheikh, MBBS, FRCS, MSc, FRCS (Gen)3




Problem-based learning (PBL) is considered by many to be an important innovation in medical education. Integral to the success of PBL is the role of the tutor, which is considered a prime determinant of how the tutorial group functions. Recently, there has been considerable debate about the roles of subject-expert and non-expert tutors in facilitating PBL sessions, and about which is more effective and efficient in facilitating the learning experience. The aim of this review is to evaluate the role of expert and non-expert tutors in facilitating PBL sessions.

An electronic search of the Medline database was undertaken for articles published on tutoring in PBL, with specific reference to expert and non-expert tutors. Relevant articles were chosen for review and analysis.

Tutoring in PBL is a multifaceted process, with tutor expertise forming only one part of this complex process. The published literature is equivocal on this issue because of inconsistency in defining expert and non-expert tutors and inconsistency in the evaluation tools used. There is no consistent evidence to suggest that groups facilitated by expert tutors do better, in terms of student academic achievement and student perception, than those facilitated by non-expert tutors.


Problem-based learning (PBL) was first developed at McMaster University in the 1960s and is regarded by some as the most important innovation in medical education. Definitions of PBL vary, but a comprehensive example would be “an educational method characterized by the use of patient problems as a context for students to learn problem-solving skills and acquire knowledge about the basic and clinical sciences”.1

The PBL approach is based on principles of adult education and cognitive psychology. It differs fundamentally from traditional curricula, in which students acquire “background” knowledge of the basic sciences in the early years of the course and in the later years apply this knowledge to the diagnosis and management of clinical problems. The traditional approach has been criticized for several reasons: it creates an artificial divide between the basic and clinical sciences, it wastes time on acquiring knowledge that is subsequently forgotten or found not to be useful, and the knowledge acquired is difficult to apply. In addition, in traditional learning, the acquisition and retention of information may seem irrelevant and can even be perceived as boring by students. Problem-based learning, with its educational objectives, can avoid many of these problems. Various disciplines, particularly the basic and clinical sciences, are integrated throughout the curriculum. As students attempt to understand and solve clinical problems, they learn about normal bodily structure and function and apply this knowledge in their search for a solution. Learning occurs in this context and builds on what students already know. PBL can aid retention, add interest and increase motivation to learn. Students, with initial help from tutors, determine their own learning needs and the strategies they need for learning, e.g., the efficient accessing of library resources or the formation of study groups.1

Integral to the success of a PBL program is the role of the tutor. Tutor performance is considered (along with students’ prior knowledge and the quality of the cases discussed) to be a prime determinant of how the tutorial group functions.2 The main role of the tutor in PBL is to facilitate the proceedings, to ensure that the group achieves appropriate learning objectives in line with those set by the curriculum design team, to ensure that all the students have done the appropriate work, and to help students self-assess their understanding of the material by encouraging them to ask questions.3 Some authors have argued that subject-expert tutors facilitate the tutorial process more effectively and more efficiently than non-experts, because such tutors are more comfortable tutoring in areas of their expertise.4 Recently, there has been considerable debate about this theory and about the respective roles of expert and non-expert tutors.


There has been no published uniform definition of “tutor expertise” in PBL. Generally, the published literature on tutoring in PBL has divided expertise into three categories:

  1. Some authors have defined an expert tutor as one with previous PBL experience.2 These authors further subclassified expertise according to the number of PBL modules a particular tutor had facilitated.
  2. Other authors have defined an expert tutor as a subject-matter expert.4-8 However, different authors have employed different definitions of what constitutes subject-matter expertise in tutoring. Some have defined expertise in terms of background training. For example, Schmidt described non-expert tutors as either those whose previous training was unrelated to the problem at hand or those with general background knowledge of the unit but no specific expertise, and expert tutors as those with specific background knowledge relevant to the unit.9 Bochner and coworkers defined an expert as one who displays specific skills or knowledge derived from training or experience.2 Dolmans and colleagues defined expertise as the tutor’s self-perceived subject-matter expertise,8 whereas Schmidt and Moust defined an expert as a tutor with advanced disciplinary training and/or research experience in the problem under study.7
  3. Finally, expertise has been defined by whether the tutor is medically qualified and/or holds an academic appointment (as opposed to being a non-academic or a student tutor).

These inconsistencies in the definition of expertise might have contributed to the variable results from different studies, as discussed later.


Studies that have examined the effect of expertise (with specific regard to subject-matter expertise) have used different methods to measure outcome. Most have employed students’ academic achievement at the end of the unit or module as the major outcome. Others have used different end points, including student and tutor perception or time spent studying, and some have used a combination of two or more assessment methods.


Some authors found that students guided by subject-matter experts achieved better than those guided by non-experts. Schmidt and coworkers examined the effect of expertise in PBL and compared students’ academic achievement under expert and non-expert PBL tutors.4 Achievement was measured using 100-150 true-false items and short essay questions. They found that students guided by subject-matter experts, defined as tutors with specific knowledge derived from training or experience, achieved better than students guided by non-expert tutors. The effect of subject-matter expertise on achievement was strongest in the first curriculum year, suggesting that new students are more dependent on their tutors’ expertise than are more advanced students. In another study assessing the effect of subject-matter expertise on outcome, Hay and Katsikitis measured student achievement by administering a random set of five questions to each student at the end of the module.10 Students taught by experts scored higher in the end-of-course test in the topic area, suggesting that students guided by an expert have a better learning outcome. Expertise was again defined as professional competence in the subject.
Dolmans and coworkers also compared groups guided by subject-matter expert and non-expert tutors. Expertise here was defined as the tutor’s self-perceived subject-matter expertise. In this study, measurement of achievement consisted of three parts for each student: a multiple-choice test, short essay questions and a tutor assessment of the student’s performance.8 No difference emerged from the multiple-choice tests. In one of the two years studied, the students guided by expert tutors performed significantly better on the essay tests. The authors attributed this finding to the fact that the expert tutors were involved in producing the test questions, which may have contaminated the results.

However, other authors demonstrated no difference in the performance of students in PBL groups led by subject-matter expert and non-expert tutors. Steele and colleagues compared learning outcomes in PBL groups led by students with those led by faculty.11 Learning outcomes were assessed by performance on objective examinations covering factual material pertinent to the case. No differences were detected in student performance on the objective evaluation, whether the facilitator was a faculty member or a peer group member. Similarly, in a retrospective study examining whether tutor subject-matter expertise influences student achievement on content-based examinations in a PBL curriculum, no difference was demonstrated between groups led by subject-matter experts and those led by non-experts.5 Assessment was conducted by end-of-block modified essay questions. In this study, the expert tutor was medically qualified while the non-expert tutor was from a humanities background. Matthes and colleagues compared learning outcomes using an end-of-term examination consisting of multiple-choice and short-essay questions. No difference was demonstrated between groups led by subject-matter experts and those led by non-experts.12 In this study, non-expert tutors were peer undergraduate students and junior staff members, while expert tutors were senior staff members who had completed postgraduate education. Other studies also revealed no difference in the level of achievement of students led by expert or non-expert tutors.13, 14


Student perception, tutor perception and student-related factors, e.g., preparation time, have also been assessed as outcomes when comparing expert and non-expert tutors.

Self-report ratings by tutors and students were among the measures used by Hay and Katsikitis to evaluate outcomes in PBL groups led by either expert or non-expert tutors.10 There were no significant differences between students’ overall ratings of expert and non-expert tutors; however, the non-expert tutors were rated more highly for group-management skills. Steele and colleagues compared perceptions of facilitator behaviors and group functioning in PBL groups led by students or by faculty, using a questionnaire completed at the end of each case.11 No differences were detected in the perception of group process, whether the facilitator was a faculty member or a peer group member. Matthes and colleagues compared the influence of tutor expertise on the process and outcome of a PBL basic medical pharmacology course.12 Expert tutors were staff members who had completed postgraduate education, while non-expert tutors were junior staff members; in addition, peer tutors (i.e., undergraduate medical students) were included in the study. The authors found that the tutor performance scores of peer-led groups did not differ from those of staff-led groups. Students’ weekly preparation time tended to be lower in peer-led groups, while time spent specifically on exam preparation seemed higher than in staff-led groups. As part of their study, Matthes and coworkers also examined tutor experience in coaching PBL groups: groups led by experienced staff, defined as tutors with at least one term of previous PBL tutoring, had significantly higher evaluation scores.12


The results of the studies reviewed here were generally equivocal, for the following reasons:

  1. The equivocal results might be related to the inconsistent definition of expertise. While a few studies treated expertise as experience in PBL tutoring, most treated it as subject-matter expertise, and there was a wide spectrum in defining subject-matter expertise. For instance, Schmidt described an expert as someone with specific background knowledge relevant to the unit,9 Davis described an expert as someone with advanced disciplinary training and/or research experience in the problem under study,15 while Dolmans employed a self-report measure of expertise by which tutors indicated the extent to which they considered themselves experts.8
  2. The extent of students’ prior exposure to PBL may be another confounding factor. Students with little or no experience in PBL rely more heavily on their tutors as sources of guidance and information. If these tutors are familiar with the subject matter to be mastered, this may make a difference in learning outcome and level of achievement.4
  3. The methods used to assess learning outcomes may confound the results. Some studies assessed student achievement in the form of examinations, while others analyzed student and tutor perceptions of the PBL sessions as the outcome measure.

For the studies that assessed outcomes using an examination, both short-essay questions and multiple-choice questions (MCQs) were used. MCQs are probably the more reliable form of assessment because of the consistency of the questions and expected answers, whereas short-essay questions are more subjective, as they are open to different interpretations.

Another method of assessment was the questionnaire. Questionnaires can be a very subjective means of assessment, and none of the reviewed studies reported using a validated questionnaire format.


Schmidt and colleagues demonstrated that the effect of tutor expertise appeared in only some of the first-year units, whereas in other units no effect could be found.4 These authors suggested that the effect of tutor expertise on learning must be mediated through an unknown, unit-related factor. Another study by Schmidt concluded that a minimum level of structure is needed for students to benefit from problem-based instruction.9 This structure can be provided externally, through the learning materials; internally, through prior knowledge available for understanding the new subjects; or by the environment, in the form of cues about what is relevant and what should be the focus of the activities. If prior knowledge falls short, or if the environment lacks structure, students will turn to their tutors for help and direction. Under these conditions, students guided by a subject-matter expert tutor may benefit more than students guided by a non-expert.


It is believed by some that subject-expert tutors facilitate the tutorial process more effectively and more efficiently than non-experts, since such tutors are supposedly more comfortable tutoring in areas of their expertise. Some of the studies we reviewed have challenged this hypothesis. Silver and Wilkerson showed that subject-matter experts, compared with non-experts, were significantly more directive, spoke more often and for longer, provided more direct answers, suggested more discussion topics and presided over exchange patterns that were predominantly tutor-to-student rather than student-to-student.16 This study suggested that tutors should recognize the potential effects of their authority and knowledge. It also raised concerns that students with dominant tutors might miss opportunities to prioritize their learning needs, to ask and answer crucial questions and to synthesize their learning. The authors concluded that tutor expertise might have deleterious effects on the process of collaborative learning, endangering the development of students’ skills in active, self-directed learning.


In addition to the use of subject-matter knowledge, five other sets of behaviors guide tutorial groups: use of authority, achievement orientation, an orientation towards cooperation within the tutorial group, role congruence, and cognitive congruence. Role congruence is defined as the willingness of the tutor to be a “student among the students,” that is, to seek an informal relationship with the students and to display an attitude of personal interest and caring. Cognitive congruence is defined as the ability to express oneself in the language of the students, using the concepts they use and explaining things in ways easily grasped by students. This was framed in the context of a theory of problem-based learning.9 The authors assume that tutor behavior is one of three factors affecting the way in which small-group tutorials function (the other two being students’ prior knowledge and the quality of the problem handled). In turn, small-group functioning influences time spent on self-directed learning activities and intrinsic interest in the topic studied. Finally, time spent influences achievement on appropriate tests.9
From the above, it can be concluded that tutoring in PBL is a complex process, with tutor expertise forming only one part of it. Students’ prior knowledge, the quality of the presented problem and the student-tutor relationship form other parts of this process. Most of the reviewed studies failed to take these factors into consideration, which limits their interpretation and external validity.


The question of the effect of tutor expertise on student learning has raised considerable controversy. The published literature is equivocal in answering this question. Some studies demonstrated that expert tutors have a positive impact on student learning and educational outcomes, while others failed to confirm these findings. Several reasons might explain these inconsistencies, including an inconsistent definition of expertise, differences in the methods used to assess outcome, and students’ previous exposure to PBL. The authors suggest that the definition of “expert tutor” should be restricted to those who have background professional knowledge and/or expertise in the subject matter, while “non-expert” tutors should be those who have no background knowledge of the subject.

Tutoring in PBL is a complex process, with tutor subject-matter expertise forming only part of it. Larger future studies, taking into consideration the factors that produced inconsistencies in the published literature as well as the other factors affecting tutoring in PBL, are needed to establish the impact of tutor subject-matter expertise on PBL facilitation and on student learning.


  1. Finucane, P.M., Johnson, S.M., and Prideaux, D.J. Problem based learning: its rationale and efficacy. Medical Journal of Australia. 1998; 168: 445-448.
  2. Bochner, D., Badovinac, R.L., Howell, T.H., and Karimbux, N.Y. Tutoring in a problem-based curriculum: Expert versus nonexpert. Journal of Dental Education. 2002; 66: 1246-1251.
  3. Wood, D. ABC of learning and teaching in medicine: Problem based learning. British Medical Journal. 2003; 326: 328-330.
  4. Schmidt, H.G., Van Der Arend, A., Moust, J.H.C., Kokx, I., and Boon, L. Influence of tutors’ subject-matter expertise on student effort and achievement in problem-based learning. Academic Medicine. 1993; 68: 784-791.
  5. Kwizera, E.N., Dambisya, Y.M. and Aguirre, J.H. Does tutor subject-matter expertise influence student achievement in problem-based learning curriculum at UNITRA Medical School? South African Medical Journal. 2001; 91: 514-516.
  6. Gilkison, A. Techniques used by ‘expert’ and ‘non-expert’ tutors to facilitate problem-based learning tutorials in an undergraduate medical curriculum. Medical Education. 2003; 37: 6-14.
  7. Schmidt, H.G., and Moust, J.H. What makes a tutor effective? A structural-equations modeling approach to learning in problem-based curricula. Academic Medicine. 1995; 70: 708-714.

  8. Dolmans, D.H.J.M., Wolfhagen, I.H.A.P., and Schmidt, H.G. Effect of tutor expertise on student performance in relation to prior knowledge and level of curricular structure. Academic Medicine. 1996; 71: 1008-1011.
  9. Schmidt H.G. Resolving inconsistencies in tutor expertise research: Does lack of structure cause students to seek tutor guidance? Academic Medicine. 1994; 69: 656-662.
  10. Hay, P.J., and Katsikitis, M. The ‘expert’ in problem-based and case-based learning: necessary or not? Medical Education. 2001; 35: 22-26.
  11. Steele, D.J., Medder, J.D., and Turner P. A comparison of learning outcomes and attitudes in student- versus faculty-led problem-based learning: an experimental study. Medical Education. 2000; 34: 23-29.
  12. Matthes, J., Marxen, B., Linke, R.M., Antepohl, W., Coburger, S., Christ, H., Lehmacher, W., and Herzig, S. The influence of tutor qualification on the process and outcome of learning in a problem-based course of basic medical pharmacology. Naunyn-Schmiedeberg’s Archives of Pharmacology. 2002; 366:58-63.
  13. Swanson, B.D., and Case, S.M. Assessment in basic science instruction: directions for practice and research. Advances in Health Sciences Education 1997; 2(1): 71-84.
  14. De Volder, M.L., and De Grave, W.S. Approaches to learning in a problem-based medical program: a developmental study. Medical Education. 1989; 23: 262-264.
  15. Davis, W.K., Nairn, M.E., and Anderson, R.M. Effects of expert and non-expert facilitators on the small-group process and on student performance. Academic Medicine. 1992; 67: 470-474.
  16. Silver, M., and Wilkerson, L.A. Effects of tutors with subject matter expertise on the problem-based tutorial process. Academic Medicine. 1991; 66: 298-300.

Published Page Numbers: 9-13