IAMSE Fall 2018 WAS Session 5 Highlights

[The following notes were generated by Mark Slivkoff.]

 IAMSE Webinar Series, Fall 2018

Speaker: Stanley J. Hamstra, PhD
VP, Milestones Research and Evaluation
Accreditation Council for Graduate Medical Education (ACGME)
Title: Realizing the Promise of Big Data: Learning Analytics in Competency-Based Medical Education
Series: Evolution and Revolution in Medical Education: Technology in the 21st Century

    • Hamstra outlined the following for his webinar:
      • A Review of Milestones
      • Learning Analytics
      • Future Directions
    • Overall, milestones are important because they help health professionals remain accountable for the education of our students


    A Review of Milestones

    • A special report on milestones was published by Nasca et al. in The New England Journal of Medicine (2012;366:1051-6).
      • A main point expressed by its authors was that there had been a significant amount of variability in the quality of resident education (graduate medical education, GME)
      • The ACGME, founded in 1981, responded appropriately, and the quality of GME has increased over the past 30 years
    • A competency-based approach was necessary to combat the variability characteristic of the traditional model (curriculum with educational objectives and assessments).
    • In a competency-based education model, the health needs of patients and of health care systems are analyzed first, and then the competencies are built.
    • Competence is multi-dimensional and allows faculty to ask residents:
      • What do they know? (Medical Knowledge)
      • What can they do? (Patient Care)
      • How do they conduct themselves? (Interpersonal and Communication Skills, Professionalism)
      • Are they critical and reflective? (Practice-based Learning and Improvement, Systems-based Practice)
    • Milestones were modeled on the five stages outlined in the Dreyfus Developmental Model of Learning (Novice, Advanced beginner, Competent, Proficient, and Expert).
    • Some of the key points regarding milestones:
      • Articulate shared understanding of expectations
      • Describe trajectory from a beginner in the specialty to an exceptional resident or practitioner
      • Set aspirational goals of excellence
      • Organized under six domains of clinical competency
      • Used as one indicator of a resident’s educational progress
    • Various studies have been done on the effectiveness of milestones. One article considered in the webinar was Conforti et al. (J Surg Educ. 2018;75(1):147-55), who focused on the benefits to the residency Program Director:
      • Changes in the remediation process can be implemented
        • Catching struggling residents earlier
        • Targeted improvements for individual learners
        • Identifying gaps in otherwise high performers
      • Structuring of learning goals
      • Making defensible decisions
        • Milestones provide “built-in” documentation, which potentially helps mitigate residents contesting their evaluations
      • There are numerous sub-competencies (~22 per specialty) rated for more than 130,000 residents, so crunching the data is a significant task: the number of data points approaches about 3.2 million! (A rough check of this arithmetic appears below.)
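
    A quick back-of-the-envelope check of that figure (a sketch only; the inputs are the approximate numbers cited in the webinar, not official ACGME counts):

```python
# Rough estimate of milestone data volume. Inputs are the approximate
# figures cited in the webinar, not official ACGME counts.
sub_competencies = 22     # "~22" sub-competencies rated per resident
residents = 130_000       # "over 130,000" residents, so a lower bound

ratings = sub_competencies * residents
print(f"{ratings:,} ratings")  # 2,860,000

# Both inputs are lower bounds, and residents are rated repeatedly, so a
# snapshot approaching the ~3.2 million data points cited is plausible.
```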


    Learning Analytics

    • Learning analytics was broken down into three pieces:
      • Concepts
      • Examples
      • Implementation
    • The purpose of introducing milestones revolves around entrustability: can we develop a system to ensure residents and fellows are ready for unsupervised practice by graduation?
    • The U.S. Department of Education describes learning analytics as “the interpretation of a wide range of data produced by and gathered on behalf of students in order to assess academic progress, predict future performance, and spot potential issues”
    • Hamstra described a Generic Milestones Template
      • Five levels
      • Emphasis is placed on Level 4 (“What does a graduating resident look like?”). This level serves as the main target for graduation in most specialties; however, it is NOT a requirement for graduation
    • The Data: Cross-sectional analysis at the specialty level…
      • Hamstra noted that his favorite data set, his favorite graph, is the “Proportion of Residents Attaining Level 4 or Higher: PC Sub-Competencies (June 2015) – Neurological Surgery”
        • The data make sense, according to feedback from residency program directors
        • In order of attainment, with PC08 at about 95% and PC03 at about 72%:
          • PC08: Traumatic Brain Injury
          • PC02: Critical Care
          • PC01: Brain Tumor
          • PC06: Spinal Neurosurgery
          • PC05: Pediatric Neurological Surgery
          • PC07: Vascular Neurosurgery
          • PC04: Pain and Peripheral Nerves
          • PC03: Surgical Treatment of Epilepsy and Movement Disorders
        • Conclusion: Pain and Peripheral Nerves and Epilepsy and Movement Disorders are not being covered at specific sites
        • Changes in curriculum design and assessment can then be made to address these gaps
      • The Data: Longitudinal analysis at the individual level…
        • Various graphs were presented which showed milestone “trajectories” (entrance to graduation) within various specialties including:
          • Surgery, pathology (MK01 competency)
          • Wound management
        • Hamstra discussed the odds ratio (OR) for residents not attaining Level 4 when they start below a threshold. Residents at Level 1.5 or above on their first assessment (year 1) have a much greater chance of attaining Level 4 than those who score lower (a worked odds-ratio sketch appears after this list).
      • QUALITATIVE RESEARCH: How do Raters Make Decisions?
        • The phenomenon of “straight-lining” has also been examined extensively. Straight-lining is when an evaluator, such as a busy physician, gives the same score for every milestone, which undermines the value of the ratings.
        • Hamstra’s group is working on how best to mitigate this phenomenon by combing through large amounts of specialty and subspecialty data (a simple screening sketch appears below)
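
    A minimal sketch of the odds-ratio calculation referenced above, using a 2×2 table of hypothetical counts (first assessment at or above 1.5 versus below, attaining Level 4 versus not); the numbers are illustrative placeholders, not Hamstra's data:

```python
# Odds ratio for attaining Level 4 by graduation, comparing residents whose
# first assessment was >= 1.5 with those below that threshold.
# All counts below are hypothetical placeholders, not ACGME data.

def odds_ratio(a: int, b: int, c: int, d: int) -> float:
    """OR for a 2x2 table:
                           attained L4    did not attain L4
    first rating >= 1.5         a                 b
    first rating <  1.5         c                 d
    """
    return (a * d) / (b * c)

# Example: 900 of 1,000 high first-assessment residents attain Level 4,
# versus 600 of 1,000 in the lower-scoring group.
print(odds_ratio(900, 100, 600, 400))  # 6.0 -> odds are six times higher
```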

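    And a minimal sketch of how straight-lining might be flagged during data review, assuming each evaluation is stored as a list of sub-competency ratings (the data layout and function are illustrative; the webinar did not describe the ACGME's actual screening method):

```python
# Flag potential "straight-lining": an evaluator assigning an identical
# score to every sub-competency. Illustrative only; not the ACGME's
# actual screening method.

def is_straight_lined(ratings: list[float]) -> bool:
    """True if an evaluation gives the same rating to every sub-competency."""
    return len(ratings) > 1 and len(set(ratings)) == 1

evaluations = {
    "resident_a": [3.0, 3.0, 3.0, 3.0, 3.0, 3.0],  # suspicious: no spread
    "resident_b": [2.5, 3.0, 2.0, 3.5, 3.0, 2.5],  # plausible variation
}
for resident, ratings in evaluations.items():
    if is_straight_lined(ratings):
        print(f"{resident}: possible straight-lining; review this evaluation")
```
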
    Future Directions

    • Version 2.0 of the milestones is currently underway.
    • Overall, the new version (new tables) contains milestones that are more refined

    Questions asked after the seminar:

    (Note that some questions and/or answers have been reworded for clarity.)

    How do these measures play into the overall evaluation process? Some teachers do not want to personally evaluate or judge residents, and residents may take their evaluations personally.
    There are a couple of ways to mitigate the personal nature of evaluations. First, evaluations can be done by groups rather than by individual faculty members. Second, as the evaluator, you can have a conversation with the resident at the beginning, letting them know that they will start at Level 1.

    What about the variability between residencies?
    We’re looking at the data to help us address a few questions. Do milestone ratings as a whole differ between large and small residencies? Does the size of the program matter? What is the low-hanging fruit, that is, what explains the differences?

    What is a good way to ask for feedback on milestones?
    The “O” score assessment, which can be applied to other specialties and skills, is one option. We’ve created a form called the OCAT (Ottawa Clinical Assessment Tool). Overall, when building evaluation forms, you need to keep them simple.

    Do you think that the milestones should reflect the Dreyfus model?
    The Dreyfus model seems to be the best fit for designing and building milestones.

    If a student finishes the milestones early, do they finish the program early?
    This is a good question and a key debate right now. Again, milestones are used to supplement program directors’ evaluations. But the gist of competency-based education is that if you are comfortable graduating someone early, then go for it. The jury is still out, but in theory we’d strive for this.

    Do you have qualitative data on which students make it to Level 4 earlier?
    We do not, but we also want to ask why those residents didn’t make it to Level 4.

    How can a student reach a level of 2.5 when 2.5 is not defined?
    Half-levels are defined. I didn’t talk about this but there are specific instructions on giving a 2.5. The scale is actually a 9-point scale (1 to 5, in 0.5 increments).

    What software do you use to analyze your data?
    We use SAS, but programmers use other programs as well.


    Dr. Hamstra can be reached at shamstra@acgme.org.