[The following notes were generated by Douglas McKell MS, MSc and Rebecca Rowe, PhD]
The Fall 2023 IAMSE WAS Seminar Series, “Brains, Bots, and Beyond: Exploring AI’s Impact on Medical Education,” began on September 7, 2023, and concluded on October 5, 2023. Over these five sessions, topics ranged from the foundational principles of Artificial Intelligence and Machine Learning to their applications in health science education and their use in teaching and learning essential biomedical science content.
The fourth session in this series is titled Artificial Intelligence (AI) Tools for Medical Educators and is presented by Drs. Dina Kurzweil, Elizabeth Steinbach, Vincent Capaldi, Joshua Duncan, and Mr. Sean Baker from the Uniformed Services University of the Health Sciences (USUHS). Dr. Kurzweil is the Director of the Education & Technology Innovation (ETI) Support Office and an Associate Professor of Medicine. She is responsible for the strategic direction of the ETI, including instructional and educational technology support for the faculty. Dr. Steinbach is the Academic Writing Specialist in the newly established writing center at USUHS. She has 20 years of experience teaching and facilitating the learning of academic writing. LTC (P) Vincent F. Capaldi, II, MD is the Vice Chair of Psychiatry (Research) at USUHS and Senior Medical Scientist at the Center for Military Psychiatry and Neuroscience at the Walter Reed Army Institute of Research in Silver Spring, MD. Dr. Capaldi is also the program director of the National Capital Consortium combined Internal Medicine and Psychiatry residency training program and chair of the Biomedical Ethics Committee at Walter Reed National Military Medical Center. Dr. Joshua Duncan is the Assistant Dean for Assessment. He earned his medical degree and MPH from USUHS and is board-certified in pediatrics, preventive medicine, and clinical informatics. Mr. Sean Baker is the Chief Technology Officer and Senior Information Security Officer, leading a team of 80 technologists to support the IT needs of USUHS and the entire Military Health System.
Dr. Kurzweil reviewed the goals of this webinar presentation and the learning outcomes.
- Understand AI terminology
- Identify AI teaching opportunities
- Review citation options for AI tool use
- Explain course policies on the use of generative AI tools
- Describe two accountability measures for using AI systems
- List several impacts of using AI for assessment
Dr. Duncan briefly described AI as the intersection of Big Data, Computer Science, and Statistics, and defined AI as a computer performing a task that would typically require human cognition. A subset of AI is Machine Learning (ML), in which machines use algorithms to perform some of these tasks; the learning can be supervised or unsupervised with respect to human interaction. Supervised learning underlies applications such as Computer Vision and Natural Language Processing, in contrast to Deep Learning, which can learn without direct supervision and mimics aspects of human cognition.
Dr. Duncan emphasized that understanding and using AI is becoming a required competency in health care, medical education, and research. He provided several examples, such as using AI for large-database statistical analysis, keyword database searching, clinical algorithms in clinical decision support, and support of clinical thinking and dialogue. One specific example he discussed, with references, was using Natural Language Processing in medical education assessment across three categories: Processing Trainee Evaluations, Assessing Supervisory Evaluation Techniques, and Assessing Gender Bias.
Dr. Duncan then presented a demonstration of ChatGPT to illustrate its many uses for medical educators. He used ChatGPT to generate prompts in six topic areas: curriculum development, assessment creation, teaching, teaching methodology, research ideas, and adaptive teaching.
Using the ChatGPT platform, he provided a prompt for each of the above areas. For curriculum development, he asked ChatGPT to create a 6-week course on medical ethics that included lecture topics, readings, and assessments. In a matter of seconds, the 6-week course was designed. He pointed out that while the course topics and sequence generated by ChatGPT may be only a partial version of the course, they provide a great starting point for anyone who wants to create such a course from scratch. Dr. Duncan emphasized that it is essential to be cautious about all references ChatGPT provides because AI models, as text predictors, can hallucinate, meaning that if they do not have access to real answers, they will make some up. The AI user needs to verify all content and references to ensure they are valid and legitimate. He then demonstrated assessment creation using a detailed ChatGPT prompt to create five NBME-style multiple-choice questions, with answer explanations, on cardiovascular physiology, suitable for first-year medical student assessment. As in the first demonstration, the five questions were generated with five possible answers each, the correct answer was indicated, and an explanation was given for why that answer was the most accurate choice. Dr. Duncan stated that there is an art to asking good questions (or prompts) so that the output generated is close to what you were looking for or expecting. The prompts he used during his demo were one-sentence prompts, but prompts can be made more specific, for example, by asking for effective teaching methodologies for imparting clinical skills to medical students. He concluded his presentation by prompting ChatGPT for three research topics in medical education that are currently under-explored and why they are important. Dr. Duncan stated that AI can be an important member of the medical education team by providing the user with a draft that is 80% complete in answer to their prompts.
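The "art of asking good questions" described above amounts to packing the relevant constraints (question style, count, topic, audience, and required output format) into the prompt. The sketch below is purely illustrative; the function name and wording are assumptions, not the exact prompt used in the demonstration.

```python
def build_mcq_prompt(topic, n_questions=5,
                     level="first-year medical student", style="NBME"):
    """Assemble a detailed question-generation prompt from a few parameters.

    Illustrative sketch only: the wording approximates the kind of
    one-sentence prompt described in the session, not the actual text used.
    """
    return (
        f"Create {n_questions} {style}-style multiple-choice questions "
        f"on {topic}, suitable for {level} assessment. "
        "For each question, provide five answer options, indicate the "
        "correct answer, and explain why it is the best choice."
    )

prompt = build_mcq_prompt("cardiovascular physiology")
print(prompt)
```

Parameterizing the prompt this way makes it easy to reuse the same structure for other topics or class levels while keeping the output-format instructions constant.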
Mr. Sean Baker, in charge of IT security at USUHS, discussed the need for careful compliance when using any AI tool. He stressed the importance of not entering information that has not already been cleared for public release, such as personal data, controlled unclassified information, hiring, performance-management, or contract data, student data, evaluations, and Personally Identifiable Information (PII). Mr. Baker then highlighted the need to be aware of the policies at the user’s institution and provided examples of how Generative AI is used at USUHS. He compared using AI to using social media: do not enter anything into an AI tool that you would not post on social media.
Dr. Kurzweil then presented higher education’s need to think critically about user agreements and how we present these agreements to our students and faculty. These policies must be discussed and decided at every level, from Federal and State to university, college, and department, down to individual courses and classrooms. She emphasized that AI will be widely used, and its use will depend on individual institutions’ decisions, especially regarding student use in courses and faculty use in the classroom. She pointed out that it is important to state clearly where AI cannot be used, such as requiring all course assignments to be exclusively the student’s work and specifying that the student cannot use AI applications like Spinbot, DALL-E, or ChatGPT. She also provided examples of when AI use is permitted, such as when an assignment requires a topic or content search strategy or a reference for additional information.
Dr. Kurzweil discussed a 2023 Educause article by McCormack [1], describing use cases clustered around four common work areas for incorporating Generative AI in higher education:
- Dreaming: Brainstorming, summarizing information, research, and asking questions.
- Drudgery: Sending communications, filling out reports, deciding on materials, and gathering information to help develop syllabus readings.
- Design: Using Large Language Models to create presentations, course materials, and exams.
- Development: Creating detailed project plans, drafting institutional policies and strategic plans, or producing images and music.
Many AI tools are currently available, and you, as the user, need to decide how best to use them. It is essential to consider how these tools can be used in teaching and what we must do to prepare our learners and faculty to develop their digital fluency. She cautioned that these tools can hallucinate, i.e., make up sources, so you need to check your work. You need to check all citations to be sure they are real and that the information is correct. Dr. Kurzweil emphasized that nothing comes out of these tools that she would take at face value without first verifying the information source.
Dr. Kurzweil then described opportunities to use AI tools to help you teach, including:
- Active, real-time learning
- Independent thinking and creativity
- Review of data and articles quickly
- Overcoming writer’s block
- Research and Analysis skills
- Real-time response to questions
- Tutoring and Practice
- Creation of Case Studies
She then described several ways to create Curriculum Integration Opportunities with AI in the classroom, including:
- AI formalized curriculum
- Introduction to AI concepts
- Computer literacy and fluency
- Data Science
- Hands-on AI tool practice
- Medical Decision-Making with AI
- Professional Identity Formation
- Ethical Decision-Making
- Computer Science Theory
Dr. Kurzweil presented the application of assessment with AI using several examples, including:
- Project-based learning
- Expectations of draft completeness
- Rubrics created and applied to student work
- Annotated references
- Using pen and paper in class for initial (draft) work development
- Testing centers
She then linked these examples to specific assessment practices impacted by AI, including:
- Requiring students to work collaboratively
- Scaffolding assignments
- Becoming familiar with students’ writing style
- Making assignments personal, timely, and specific
- Creating assignments that require higher-level cognitive skills
- Authentic assessments with Observation and Simulation experiences
Dr. Kurzweil then listed six ways that AI can be incorporated into the medical education curriculum:
- Provide medical students with a basic understanding of what AI is and how it works.
- Introduce medical students to the principles of Data Science
- Introduce medical students to the use of AI in radiology and pathology.
- Teach medical students how AI can be used to analyze patient data and provide treatment recommendations.
- Introduce medical students to ethical considerations of AI, such as privacy, bias, and transparency.
- Provide medical students with an opportunity to apply their AI foundational knowledge in real-life clinical scenarios.
She then turned the session over to Dr. Steinbach to discuss plagiarism.
Dr. Steinbach focused on the need to be aware of plagiarism occurring with AI, especially when students use ChatGPT to complete assignments. Many AI detectors use a perplexity score, which measures the randomness of text, and a burstiness score, which measures the variation in perplexity, to differentiate text composed by humans from text written by AI. She noted that in a paper published in 2023, the software GPTZero correctly classified 99% of human-written articles and 85% of AI-generated content. Educators may be concerned that students are using AI such as ChatGPT to generate text for writing assignments without correctly citing the source of the generated text, which could give them an advantage over students who are not using AI to complete their assignments. Dr. Steinbach stated that writing assignments focused on students’ reflections or interpretations, if generated by ChatGPT, could pass without being identified by AI detectors. The same can be said for scientific papers and abstracts, where the software was able to identify only 68% of human-written texts as human-written. The way to help avoid these issues is to be very clear about the policies and expectations in your course syllabus.
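The perplexity and burstiness scores mentioned above can be sketched in a few lines. This is a simplified, hypothetical illustration, not GPTZero's actual algorithm: it assumes you already have a language model's probability for each token, treats perplexity as the exponentiated average negative log-probability, and treats burstiness as the spread of per-sentence perplexity.

```python
import math
from statistics import pstdev

def perplexity(token_probs):
    """Perplexity of a token sequence, given each token's model probability.

    Lower perplexity = more predictable text, which under this heuristic
    looks more machine-generated. token_probs are values in (0, 1].
    """
    avg_neg_log = -sum(math.log(p) for p in token_probs) / len(token_probs)
    return math.exp(avg_neg_log)

def burstiness(sentence_probs):
    """Variation in perplexity across sentences (population std. dev.).

    Human writing tends to mix predictable and surprising sentences
    (high burstiness); AI text is often uniformly predictable (low).
    """
    return pstdev(perplexity(sent) for sent in sentence_probs)

# Sanity check: with uniform token probability p, perplexity is 1/p.
assert abs(perplexity([0.25, 0.25, 0.25]) - 4.0) < 1e-9
```

In this toy model, a document whose sentences all score the same perplexity has zero burstiness, which is the pattern detectors associate with AI-generated text.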
If you allow your students to use Generative AI in course assignments, you must be clear on how you want them to cite AI-generated information. Dr. Steinbach focused on two main style guides, AMA and APA, and how to cite text generated through AI. First, AI tools cannot be listed as authors because they are not human and cannot answer questions about the work produced. Under both style guides, you can describe in the methods section how AI was used; AMA also allows noting it in the acknowledgments section, and APA allows mentioning it in the introduction. She stated that the APA style guide requires the author to include the prompt and identify the text generated by the AI tool. The AMA style guide’s guidance is not yet clear, nor does it offer advice on in-text citations.
The last speaker, Dr. Capaldi, emphasized that there is no perfect AI detector: as large language models become more sophisticated, AI detectors tend to lag behind. The best AI detectors can do is provide the user with a probability score that the text was AI-generated. When used as an AI detector, Watson was able to identify as AI-generated only about 60% of what ChatGPT produced. Dr. Capaldi stated it is harder to detect text that has been edited, combined, or paraphrased. He also noted that probability scores are not perfect either; there can be false positives and incorrect determinations of whether text was generated using AI tools. He asked the audience to be careful when using AI detectors because they are not entirely accurate or foolproof in the academic setting, since probability scores are not absolute determinations of whether text is AI-generated.
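Why a detector flag is a probability rather than proof can be made concrete with Bayes' rule. The sketch below reuses the GPTZero figures reported earlier in this session (85% of AI text caught, 99% of human text correctly passed); the 10% prior, i.e., the assumed fraction of submissions that are actually AI-generated, is an assumption for illustration only.

```python
def flagged_is_ai_probability(sensitivity, specificity, prior_ai):
    """P(text is AI-generated | detector flags it), via Bayes' rule.

    sensitivity: fraction of AI text the detector catches.
    specificity: fraction of human text the detector correctly passes.
    prior_ai:    assumed base rate of AI-generated submissions.
    """
    true_pos = sensitivity * prior_ai
    false_pos = (1 - specificity) * (1 - prior_ai)
    return true_pos / (true_pos + false_pos)

# Assumed 10% base rate: even with a strong detector, roughly 1 in 10
# flagged submissions would be a false positive (~90% posterior).
ppv = flagged_is_ai_probability(0.85, 0.99, 0.10)
```

Lowering the assumed base rate makes the picture worse: at a 2% prior, the same detector's flag corresponds to only about a two-in-three chance the text was AI-generated, which supports Dr. Capaldi's caution against treating scores as verdicts.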
Dr. Kurzweil ended the session by stating that AI in education holds immense promise, but it also comes with responsibility. She asked that we commit to using AI to empower our learners, faculty, and educational institutions, treating AI as a tool and not as a replacement for us as educators. AI needs to be viewed as a partner working with educators to enhance our ability to make education efficient and effective. She stated we need to embrace innovation and digital fluency while upholding the values of equity, privacy, and ethics in education.
1. McCormack, M. Quick Poll Results: Adopting and Adapting to Generative AI in Higher Ed. Tech. Educause Research Notes, 2023. https://er.educause.edu/articles/2023/4/educause-quickpoll-results-adopting-and-adapting-to-generative-ai-in-higher-ed-tech