Virtual Reality in Educational Settings

Authors: Britte Cheng and Cynthia D’Angelo

Overview

Virtual reality (VR) is an emerging platform for creating meaningful, engaging user experiences. VR typically refers to a computer-generated experience of a fictional or real place that you can interact with through voice and gesture. Done well, VR can make people feel that they are physically present in the virtual world, and react as if they were in the real world, because the brain buys into the illusion that the experience is, in fact, real. In gaming, VR headsets, hand-held controllers, and wearable gloves and suits offer new levels of immersion and interactivity. Beyond gaming, VR is being used in the workplace to help doctors practice surgery, customer-service employees read body language, and colleagues feel connected in virtual meetings. VR can even help people develop empathy around (for example) climate change, understand scientific phenomena they can’t see, and experience historical events they otherwise couldn’t (Bailenson, 2018).

From a learning science perspective, VR ties into long-standing themes around how dynamic representations and visualization can support conceptual learning. It opens new opportunities to consider how affect and cognition are mutually supportive in learning processes. And it raises challenging issues of how groups collaborate in virtual spaces, and how learning moves between virtual and everyday spaces and contexts. As learning scientists, we see an opportunity here. The next sections of this primer present some key learning science lessons and issues for VR. But first, to orient the reader, we provide a brief overview below of VR technologies and the related concepts of virtual, augmented, and mixed reality.

VR Technologies. VR is making inroads into education as it becomes more user friendly and economically accessible. VR technologies differ in the degree of immersion they afford, from relatively non-immersive 2D computer-based environments to fully interactive spaces in which you can walk around and interact with objects using headsets and other devices. Headsets typically use either smartphones or computers to drive screen graphics. Smartphone headsets (e.g., Google Cardboard, Google Daydream, Samsung Gear VR) offer mobility, but are limited in processing power and display resolution, which constrains how visually immersive an experience can be. Alternatively, computer-based headsets (e.g., PlayStation VR, Oculus Rift, HTC Vive) have the benefit of more processing power and better resolution, but restrict the mobility of the user and require special input devices for each setup. The forms of interaction possible range from no input (e.g., Cardboard), to basic controllers for selecting things within the VR (e.g., Gear VR), to additional features like tracking the motion of the user (e.g., Oculus Rift, HTC Vive).

Augmented reality. In many discussions of VR in education, there is also mention of augmented reality (AR). While VR replaces your current view with a simulated one, AR overlays virtual objects (e.g., labels) onto your current view. A well-known example is Pokémon Go: in one mode of the app, your phone’s camera shows a live view of the world around you, and a virtual Pokémon creature sometimes appears on the screen as part of the game. You can then interact with the creature using your phone. Although AR technologies currently lag behind VR technologies, AR holds promise for a variety of learning applications. For example, AR could project accurately sized dinosaurs into a classroom to help students understand scale, support hearing-impaired learners by projecting an interpreter’s signing into the same field of vision as the object being discussed, and add context or scaffolded support to hands-on activities.

Mixed reality. Mixed reality combines features of virtual and augmented reality, usually by projecting virtual objects into a real space in a way that is more immersive than augmented reality. For example, the Concord Consortium and the University of Virginia have developed a mixed-reality gas laws activity that allows students to interact with a visual molecular dynamics simulation of a gas through tactile inputs spatially aligned with objects in the simulation. Commercial devices like Microsoft HoloLens let users interact with virtual objects through gestures, without a physical controller.
