Assessment of an Online Learning Objective Answer Database

Douglas J. Gould, Ph.D. and Brian R. MacPherson, Ph.D.

Educational Technology Development Group

Department of Anatomy and Neurobiology

University of Kentucky
Lexington, KY 40536-0298 U.S.A.

+1-859-323-5484
+1-859-323-5946

ABSTRACT

The objective of the present study was to create and evaluate an online learning objective answer database. The goal of the answer database was to make information about, and answers to, the learning objectives provided in students' lecture notes available to them at any time. Our hypothesis was threefold: i) that student satisfaction with the availability of, and access to, the learning objective answers would be high; ii) that having the answers available would not affect student performance in the course by raising or lowering the class average artificially; and iii) that less instructor time would be spent discussing routine issues and questions, freeing office hours for students with more serious problems with the course content. The answer database was created using FileMaker Pro® 4.1 and housed on Macintosh computers in faculty members' offices. The effectiveness of the database was evaluated over a three-year period and compared with the previous three-year period, in which the database was not used. Evaluation methods included student and faculty surveys and numerical assessments of the overall course rating, the database rating, and average course grades. Results indicate no significant difference in student satisfaction or grades with or without the database, while faculty time spent in office hours and answering routine questions decreased.


INTRODUCTION

Learning objectives are widely considered to be a valuable tool for both instructor and student [1-3]. Ferguson states that learning objectives are statements of desired, observable, teachable, and learnable behaviors that are evidence of learning [3]. Wyte et al. claim that learning objectives describe in precise, measurable terms what learners must do to meet course objectives, and that they can improve general education instructional outcomes [2]. Properly constructed objectives represent relatively specific statements about what students should be able to do following instruction [4]. Instructors use learning objectives to guide lectures, prepare lecture notes, and formulate test questions. Educators identify certain benefits from the use of objectives: used as statements of intended learning outcomes, they give direction to intended learning opportunities and aid in the selection of appropriate teaching strategies and materials [3]. Students, in turn, often rely on learning objectives to guide their studying, focusing attention on critical areas, and use them as a summary tool for exam preparation. A learning objective describes what students should do to demonstrate that they are competent in an area [5]. Objectives can serve as educational guidelines that help students organize information, establish priorities, and assess progress [1].

However, the literature does contain criticism of the use of learning objectives. Results of a study by Wyte et al. indicated that learning objectives do not augment retention of core content [2]. Other criticisms suggest that learning objectives are too restrictive and may inhibit students' future learning; that they may focus teaching and evaluation efforts on items of a trivial but easily defined nature; and that a large, carefully defined list of objectives may be too cumbersome and inflexible to be used effectively [1]. It is this latter criticism that would seem to be the key to making learning objectives beneficial to instructor and student alike. The stated criticisms are valid if learning objectives are approached dogmatically and rigidly, and if insufficient time and effort are expended in their formulation. Some feel that learning objectives must be continually refined based on careful evaluation of curricula and discussion with colleagues and students [1].

The practice of continually refining and updating learning objectives renders the majority of the criticism of their usefulness irrelevant. It is this same need for continual and ongoing refinement that makes distribution, presentation, and discussion of objectives via the Internet a logical choice. Material made available via the web is easily distributed to a wide audience, allows a great deal of flexibility in presentation method, and is modifiable by one central individual: the instructor. One aspect of learning objectives that may lend itself particularly well to online distribution is the answers to, or discussion of, the objectives.

A further benefit of providing the answers to learning objectives online is the ability of an instructor to provide students with information about specific objectives. In addition, after the initial creation of the information database, instructor time devoted to learning objectives and to student concerns about the answers is greatly reduced. Students consistently indicate their desire to be provided with the answers to learning objectives. Instructors, on the other hand, are somewhat reluctant to hand out the answers en masse, knowing that simple memorization is generally the result, without the student understanding the material or becoming capable of deductive reasoning and extrapolation from the critical information the objectives were designed to ensure they understood. Instructors often prefer that students seek out the answers from their texts and lecture notes. While there is considerable merit to this strategy, such as promoting active learning, it does not improve students' opinion of the course, and it increases student visits to the instructor, often to obtain answers that require no elaboration, extensive discussion, or consultation. In addition, many students simply want each answer they produce verified by the instructor.

Databasing software such as FileMaker Pro® or Microsoft Access® makes it possible to enter answers and points of discussion related to specific learning objectives into a searchable online database for students. While the initial investment of time needed to master the database software, create a user interface, and enter the data can be significant, modifications to, and refinement of, the data set are easily accomplished on a continuing basis and are immediately available to all users. If students can simply print out the entire database, memorization of learning objective answers is facilitated. To stem the rush to print out the answers, the database can be configured so that answers are displayed only one item at a time. In addition, it is reasonable to expect that the average student has no rational reason to check every answer (although some do). Displaying individual learning objective answers makes accumulating an entire list of answers a cumbersome and time-consuming process, thereby increasing the likelihood that students will use the database only for the most difficult objectives, and only after they have failed to answer the question by searching their notes or text. This approach to use of the learning objective database demonstrates that the instructor wants to preserve a high level of active learning for the students.
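
As a rough illustration of this one-answer-at-a-time pattern, the sketch below implements a minimal web lookup service in Python. It is not the FileMaker Pro implementation described in this paper; the framework (Flask), database file, table, and field names are all illustrative assumptions.

    # Minimal sketch of a one-answer-at-a-time lookup service.
    # Illustrative only: Flask and the schema below are assumptions,
    # not the FileMaker Pro setup described in this paper.
    import sqlite3

    from flask import Flask, abort, request

    app = Flask(__name__)
    DB_PATH = "lo_answers.db"  # hypothetical database file

    @app.route("/lo")
    def lookup_objective():
        # Require a single accession number per request, so users
        # cannot dump the entire answer list in one query.
        accession = request.args.get("id")
        if accession is None:
            abort(400, "Supply one learning objective id, e.g. /lo?id=101")
        conn = sqlite3.connect(DB_PATH)
        try:
            row = conn.execute(
                "SELECT question, answer FROM objectives WHERE accession = ?",
                (accession,),
            ).fetchone()
        finally:
            conn.close()
        if row is None:
            abort(404, "No learning objective with that accession number.")
        question, answer = row
        return {"id": accession, "question": question, "answer": answer}

    if __name__ == "__main__":
        app.run()

Because each request returns exactly one record, assembling a complete printable answer list requires hundreds of separate queries, which is precisely the deterrent described above.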

The purpose of the present study was to create and evaluate an online learning objective answer database. The goal of the database was to make information about, and answers to, the objectives provided for each topic in the lecture notes available to students at any time. Our hypothesis was threefold: i) that student satisfaction with the availability of, and access to, the learning objective answers would be high; ii) that having the answers available would not affect student performance in the course by raising or lowering the class average artificially; and iii) that less instructor time would be spent discussing routine issues and questions, freeing office hours for students with more serious problems with the course content.

MATERIALS AND METHODS

A database consisting of 945 individual records was created using FileMaker Pro® 4.1. Each learning objective (L.O.) has a unique identifying accession number. The database was hosted on an older Power Macintosh CPU equipped with an active Ethernet port. FileMaker Pro® version 4.1 has an important advantage over version 5.0 in that it does not limit the number of users who can access the database via the web at any given time. The Web Companion feature of FileMaker Pro® makes the database instantly accessible via the Internet. To prevent students from accessing the entire database and printing a single list of all the answers, access to the database was provided through a Claris Home Page™ HTML interface, which allows users to search for, and display, only one L.O. answer at a time. Links to the database were supplied on the course website, and the URL was provided at the top of each L.O. page in the lecture notes.
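
The record layout can be pictured as a simple keyed table. The following Python sketch shows an assumed equivalent structure; the schema, field names, and sample content are our illustrative assumptions, not the actual FileMaker Pro file:

    # Sketch of an equivalent record structure keyed by accession
    # number. Schema and sample content are illustrative assumptions.
    import sqlite3

    conn = sqlite3.connect("lo_answers.db")
    conn.execute(
        """CREATE TABLE IF NOT EXISTS objectives (
               accession INTEGER PRIMARY KEY,  -- unique identifying number
               lecture   TEXT NOT NULL,        -- lecture the L.O. belongs to
               question  TEXT NOT NULL,        -- the learning objective
               answer    TEXT NOT NULL         -- short, succinct answer
           )"""
    )
    conn.execute(
        "INSERT OR REPLACE INTO objectives VALUES (?, ?, ?, ?)",
        (101, "Taste", "Which cranial nerves are involved in taste sensation?",
         "Cranial nerves VII, IX, and X carry taste sensation."),
    )
    conn.commit()
    conn.close()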

The database was created for undergraduate (ANA 209) and professional undergraduate level (ANA 530) students enrolled in our anatomy courses. There are an average of 40 lectures per course and 23 L.O.s per lecture, all of which are represented in the online database. Each individually numbered L.O. answer is created as a separate record in the database. The records contain short, succinct answers to the L.O.s provided at the end of each lecture. The L.O.s used, especially those in the undergraduate course, are focused and specific, making them more amenable to the definitive answers used in the database. For example, rather than an integrative L.O. indicating that students should understand taste, a combination of focused L.O.s written to accomplish the same goal includes: 1) What structures receive taste sensations? 2) Where are the structures that receive taste sensations located? 3) Which cranial nerves are involved in taste sensation? and 4) What is the function of the serous glands at the base of, and between, the lingual papillae?
Beginning in 1999, the effectiveness of the L.O. answer database was assessed via targeted questions on student course evaluations. A series of questions dealing with its perceived effectiveness was put to four student groups (ANA 209: fall semesters 1999, 2000, and 2001; ANA 530: fall semester 2000). These questions focused on whether the L.O. answer database was a good way to obtain answers to the learning objectives, the effectiveness of the L.O.s themselves, and how the students rated the course overall. Each question was answered using one of five rating levels: 5 = strongly agree; 4 = agree; 3 = not sure; 2 = disagree; and 1 = strongly disagree.

Starting at the beginning of the spring 2001 semester, a 'hit counter' (Nedstat) was used to record each time students accessed the database. The counter was incorporated into the page that linked to the database. Placing the counter immediately 'upstream' of the actual database counted the number of times students visited the database, not the number of L.O. answers searched during each visit. The Web Log feature of FileMaker Pro® can track each user by IP address, record which L.O.s were accessed, and log the time taken for the database to respond to each search request.
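
The distinction between the two measures, visits recorded by the upstream counter versus individual answer searches recorded in the log, can be made concrete with a short analysis sketch. The simplified log format below (IP address, timestamp, requested path) is an assumption for illustration, not FileMaker Pro's actual Web Log layout:

    # Sketch: separating entry-page visits from per-answer searches.
    # The CSV log format (ip, timestamp, path) is an assumed
    # simplification, not FileMaker Pro's actual Web Log format.
    import csv
    from collections import Counter, defaultdict

    visits = Counter()           # hits on the entry page (one per visit)
    lookups = defaultdict(list)  # accession numbers searched, per IP

    with open("weblog.csv", newline="") as f:
        for ip, _timestamp, path in csv.reader(f):
            if path == "/lo_entry":          # page carrying the counter
                visits[ip] += 1
            elif path.startswith("/lo?id="):
                lookups[ip].append(int(path.split("=", 1)[1]))

    total_visits = sum(visits.values())
    total_lookups = sum(len(ids) for ids in lookups.values())
    print(f"{total_visits} visits, {total_lookups} answer look-ups")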

RESULTS

Analysis of the database addressed five main themes: 1) student satisfaction, as measured using student opinion surveys; 2) student use, as measured using the Nedstat counter; 3) pattern of student use, as assessed by the Web Log feature of FileMaker Pro®; 4) student success, as measured by comparing course averages from groups who used the database vs. groups who did not; and 5) faculty time saved, as measured by a survey of faculty involved in courses using the database.

The numerical data presented here were gathered from a large (~200 students per semester) undergraduate basic human anatomy course at the University of Kentucky, ANA 209. Written comments were selected from the student evaluation (opinion survey) of the course and its features, carried out at the end of each semester. Included in the evaluation are several questions that address the effectiveness of the learning objectives in general and the online component of the course (syllabus, learning objective answer database, announcements, grades, etc.). The numerical assessments of two survey items are presented here: i) "The online learning objective answer database was a good way to obtain answers to the learning objectives" (Figure 1), and ii) "How would you rate this course overall?" (Figure 2).

Figure 1 illustrates the student rating of the usefulness of the learning objective database (1 = strongly disagree, 5 = strongly agree). While the database has been continually rated in excess of 4 (agree) since its inception in 1999, there has been a slight decline in student assessment of its usefulness. The evaluation form also provides space for specific comments about features of the course. Selected comments (each occurring more than twice) on the usefulness of the online database are: “I think the online learning objective answers are very helpful/useful”, “Online learning objectives on the Internet were fantastic/wonderful”, and “Pros: online learning objective answers”. Overall, student satisfaction with the online learning objective answer database is high. Criticisms primarily center on its inherent design: the inconvenience of not being able to print or view all of the answers in a single list.

Figure 2 compares overall satisfaction with the course between two separate groups of undergraduate students. Group 1 (n = 90) is composed of students selected at random from each of the three years prior to incorporation of the answer database: 1996 (n = 30), 1997 (n = 30), and 1998 (n = 30). Group 2 (n = 90) is composed of students selected at random from each of the three years after incorporation of the answer database: 1999 (n = 30), 2000 (n = 30), and 2001 (n = 30). Regardless of the availability of the answer database, students continually rate the course in excess of 4.0 (high). The difference between groups was not significant using the unpaired Student's t-test with significance set at P < 0.05.

Figure 3 illustrates the average academic performance levels for students in the course. Again, the two student groups were compared. The average final grade of students in Group 1 (those from 1996 through 1998), who did not have access to the answer database, was compared with that of students in Group 2 (those from 1999 through 2001), who did have access to the database. The average final grade in Group 1 was 75.15% over the three-year period. After integration of the database, the average final grade in the course was 75.11%. The difference between groups was not significant using the unpaired Student's t-test with significance set at P < 0.05.
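
For reference, the comparisons reported in Figures 2 and 3 correspond to a standard unpaired (independent two-sample) t-test. A minimal sketch follows; the grade arrays are placeholders, not the study's raw data:

    # Sketch of the unpaired Student's t-test used to compare group
    # means. The grade lists below are placeholders, not study data.
    from scipy import stats

    group1 = [74.0, 76.5, 75.0]  # placeholder pre-database averages
    group2 = [75.5, 74.8, 75.1]  # placeholder post-database averages

    t_stat, p_value = stats.ttest_ind(group1, group2)
    verdict = "significant" if p_value < 0.05 else "not significant"
    print(f"t = {t_stat:.2f}, p = {p_value:.3f} ({verdict} at P < 0.05)")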

Analysis of the data accumulated by the Web Log tracking feature of FileMaker Pro® indicated that roughly half of the students who used the database (identified by IP address) sequentially checked every learning objective answer; the other half checked only the answers they were unsure about. The Nedstat hit counter recorded student visits to the answer database, from the course web site, over a three-semester period (Figure 4). There were 1,224 visits during spring 2001 (N = 212 students), 1,570 visits during fall 2001 (N = 185 students), and 1,161 visits during spring 2002 (N = 185 students). An interesting fact the counter brought to light was that the most active day of database access in the semester occurred during the week prior to the second exam, the point at which a student can still drop the class without it appearing on their transcript.
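
The sequential-versus-selective usage pattern can be recovered from per-IP look-up records such as those sketched above. The classification rule below is an illustrative assumption about how such a split might be computed, not the authors' actual analysis:

    # Sketch: classifying each user (by IP address) as a "sequential"
    # checker, who walked through every answer in order, or a
    # "selective" checker. The rule and data are illustrative.
    def classify(lo_ids, total_objectives=945):
        """Label one user's list of accessed accession numbers."""
        if len(lo_ids) >= total_objectives and lo_ids == sorted(lo_ids):
            return "sequential"
        return "selective"

    lookups = {
        "10.0.0.1": list(range(1, 946)),  # checked all 945 in order
        "10.0.0.2": [101, 317, 512],      # spot-checked a few answers
    }
    for ip, ids in lookups.items():
        print(ip, classify(ids))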

Instructors in this large undergraduate course who have taught over the six-year period, both with and without the online learning objective database, were asked several questions regarding student visits during office hours. Instructors indicated that there were substantially fewer student visits. Furthermore, instructors felt they were able to spend more time with the smaller number of students who did visit their offices with substantial problems with the course or the concepts covered in it. The general sense of the instructors is that visits for routine verification of learning objective answers have been eliminated by the presence of the answer database. An instructor survey listed the following as the top three benefits of the learning objective answer database: 1) time is spent with the students most in need of attention; 2) questions from these students tend to require more in-depth discussion and explanation; and 3) while the initial expenditure of time in becoming familiar with the software and creating the database is significant, the ease of maintenance, modification, and distribution of the material frees instructor time for the students most in need of it.

DISCUSSION

The results of the student opinion surveys indicate that student satisfaction with the online L.O. answer database is high. Written comments show that students in the courses in which the online database was used were pleased to have unlimited access to the answers to their L.O.s, used the database, and felt that it was generally helpful. Further, there has been no significant difference in satisfaction with the course since the inception of the database (Fig. 2). Thus, the database is a tool that is widely used, accepted, and valued by students. Results from the Nedstat 'hit counter' indicate that the database was used often by students (Fig. 4), particularly prior to the exam immediately before the last day to drop the course. It is not possible to determine the average number of times a single user accessed the database, and it may be that a large number of 'hits' were from the same user repeatedly accessing the site. However, this is not particularly important: regardless of the user, the database was accessed often, especially before exams, and was therefore a heavily utilized resource for students.

The inclusion of the database had no significant effect on average course grades: students who had access to the database performed no differently from those who did not (Fig. 3). This finding becomes more significant in the context of faculty reports that less time was spent answering questions during office hours after inclusion of the database in the course. It is likely that a large number of students received answers to their questions from the database instead.

Further, faculty felt that the nature of student office visits has also evolved since the addition of the database. While fewer students are visiting faculty offices, the visits now involve more in-depth questions, or come from students who are in genuine need of assistance. Overall, faculty felt that simple, straightforward questions were now being addressed online by students. In addition, faculty indicate another probable benefit of the relative anonymity of the online answer database: students who would otherwise be reluctant to seek help from a faculty member now have a resource through which to have their questions answered.

Faculty acknowledge that the initial investment of time necessary to create the database (learning the software, entering data, etc.) is greater than that required to compile paper versions of the L.O.s. However, after the initial investment of effort, it takes almost no time to maintain and alter the database (correct mistakes, add or delete items, etc.). So, while student grades and active learning are maintained, student satisfaction is high and faculty time is saved.

L.O.s may lend themselves particularly well to administration via the Internet. An online, digital L.O. answer database is capable of maintaining all of the benefits associated with providing students with answers to L.O.s while minimizing faculty effort and increasing the flexibility to accommodate change. Proper and thorough evaluation leads us to conclude that this is a valuable new tool for inclusion in the curriculum. As faculty rush to incorporate a multitude of course materials into online presentations, critical evaluation of the value of those materials becomes increasingly important.

REFERENCES

  1. Foulds, D.M. The current place of learning objectives in paediatrics. Medical Education 1989; 23: 407-408.
  2. Wyte, C.D., et al. Effect of learning objectives on the performances of students and interns rotating through an emergency department. Academic Medicine 1995; 70 (12): 1145-1146.
  3. Ferguson, L.M. Writing learning objectives. Journal of Nursing Staff Development 1989; March/April: 87-94.
  4. Gallagher, R.E. and Smith, D.U. Formulation of teaching/learning objectives useful for the development and assessment of lessons, courses, and programs. Journal of Cancer Education 1989; 4 (4): 231-234.
  5. Engle, C.E. For the use of objectives. Medical Teacher 1980; 2: 232-237.