PIs: Marjorie Zielke, Scotty Craig, Robert Rege, James Wagner
University of Texas at Dallas
Award Details
In the context of medical student education, this Cyberlearning project will investigate how to design rich social learning experiences that integrate real and virtual features (objects or people) to enhance the learning process. To simulate a role-playing experience for patient communication, the project will incorporate augmented reality, a 3D technology that enhances perception of the real world through a contextual overlay of virtual objects/information onto physical objects in real time. This project mirrors how medical students may work in a future telemedicine environment where intelligent virtual entities and human teams seamlessly interact for patient care. Results will inform the design of virtual agents/humans to support learning in a variety of educational domains. The research will provide powerful new tools for medical education, including communication with patients, that foster lifelong and just-in-time training, and will contribute to advancing national health, prosperity, and welfare.
The research employs a human-like, artificial intelligence-driven, high-fidelity Emotive Virtual Patient with life-like emotions, nonverbal expression, and conversational and assessment capabilities based on natural language processing. The virtual patient is integrated with the Microsoft HoloLens to allow a remote participant to review a student’s performance and provide feedback through text, audio, and video. Through iterative design-based research, the project will investigate: 1) how students learn by proxy through observing co-located and remote virtual and real collaborators; 2) whether students prefer to receive educational feedback from a co-located or remote peer, virtual peer, or virtual professor; and 3) the effects of collective agency when students can choose to learn socially from combinations of real or virtual learning companions.