Context
The context of this work exploring immersive learning in Medical Ethics was the creation of 10 modules to teach Phase 3 medical students about medical ethics in clinical decision making. Each module consisted of a trigger video, filmed with a 360-degree camera and best watched on a phone using 3D glasses of some type, though the videos could also be watched on a computer or tablet. The videos included interactive elements that let students dig into the perspectives of the stakeholders in the medical interaction, such as the doctor, the patient, and family members. Students then completed an interactive learning module built in Smart Sparrow, consisting of an entry quiz, learning activities and content, and an exit quiz. The exit quiz required 100% correct answers for module accreditation.
Tool
The modules were evaluated during a single evening of online evaluation and focus groups. Sixty students attended the session, during which they each completed 2-3 modules and then evaluated them using an online survey. A subset was also interviewed in one of two focus groups, each consisting of approximately 10 students, and the evening was filmed. The evaluation questionnaire asked about user experience, engagement, and knowledge gains. The focus groups asked questions mainly about engagement and were free flowing. Video and photo footage provided further information about user experience and engagement.
Results
From the user experience perspective, students told us that the modules were realistic, and they valued them highly as relatable to real clinical experience. Some students nevertheless asked for faster approaches to presenting the clinical scenario (such as transcripts). When asked about the added benefit of the immersive VR experience, students were equally divided on whether it added to the learning experience, and more feedback on the exit quiz was requested. Student engagement results were high: students thought the scenarios were relevant and highly valuable, and on Likert scale questions, 98% of students agreed or strongly agreed that the modules were engaging and interesting.
From a knowledge gain perspective, approximately 10% of students achieved full marks on the entry quiz; the average mark was 50% (including that 10%). Between 22% and 50% achieved full marks on the exit quiz on the first try, with an average first-attempt mark of 60%; most students needed 2-3 attempts to reach 100%. In relation to students' assessment of their own knowledge gains, 99% reported that the modules increased their knowledge. Asked about the impact of the modules on their ability to understand the ethical issues that affect clinical decision making, 89% of students said the modules helped either a lot or a great deal, and another 10% felt they helped a moderate amount. Feedback from the small focus groups included that students felt this was a great initiative and wanted more modules created. They found the modules highly relevant to the clinical years; a number of students had been in similar clinical situations and had not known what to do, and they felt the modules really helped them with those situations. They reported disliking technical glitches such as the slow loading of videos.
Impact
The main impact of my evaluation has been significant improvement to the modules. I have tried to 'close the loop', and in doing so I have made changes that respond chiefly to the comments on user experience. For example, the loading speed of the videos was improved, and the instructions for using the modules were also improved, with the focus on clinical experience and practice made even clearer. Students were given flexibility in the timing and place of learning, which improved their understanding. Discussion of the modules was encouraged, as was a reflective component that mapped students' incorporation of learning into practice.

It has been important to evaluate my work as a teacher in an ongoing way. Reflective pieces written by students showed a strong incorporation of learning into practice, and subsequent feedback from students was even more positive, clearly demonstrating excellence in my teaching practice. Presentation, and possible publication, of this innovative approach to teaching Medical Ethics was a further impact.

The development and evaluation of these modules illustrate the use of innovation in teaching: classes were developed to fill a gap and delivered in a student-centred, flexible way using a novel (VR) technology. Innovative teaching methods have the potential to greatly improve our practice as educators, but new techniques, methods, and content must always be evaluated to ensure best practice and the quality of our deliverables.