Design of Interactive Educational Applications Virtual Reality Research

At InteractionDesignVR we design equity-focused educational applications while investigating fundamental VR research questions: how can increased spatial presence, immersion, and embodiment improve performance on different kinds of learning tasks?

 

Last update: June 28, 2016

 

Interaction and Immersion

We design not just single-user simulations in virtual environments, but interactive, activity-based experiences shared with others. Imagine any science or engineering content area integrated into a Computer-Supported Collaborative Learning (CSCL) environment, experienced through virtual reality headsets!

Embodied Interaction Design

In our immersive environment, once the participant puts on the HMD and begins, she is free from the keyboard, mouse, or typical gamepad. She becomes, in a sense, the controller, viewing the environment through the eyes of her avatar. Her hand movements navigate her through virtual space and allow her to select objects for interaction. She can also sometimes see other student avatars in this multi-user (yet private) virtual reality environment; she can explore with them, learn with them, and together they solve game-like problems that involve careful collaboration and re-engineering of spaces, using intuitive body movements to navigate and manipulate objects. The collaborative tasks in the scenario are carefully programmed so that they can only be completed if everyone present is actively contributing.
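The everyone-must-contribute rule can be sketched as a simple completion gate. The following is an illustrative Python model, not the project's actual Unity code; the class and method names are hypothetical:

```python
class CooperativeTask:
    """Completion gate: the task finishes only when every player
    present has actively contributed (hypothetical sketch)."""

    def __init__(self, players):
        # Track each player's contribution; all start at False.
        self.contributions = {p: False for p in players}

    def contribute(self, player):
        # Record that this player has done their part.
        if player in self.contributions:
            self.contributions[player] = True

    def is_complete(self):
        # True only if *everyone* present has contributed.
        return all(self.contributions.values())

task = CooperativeTask(["player_a", "player_b"])
task.contribute("player_a")
assert not task.is_complete()   # one player alone cannot finish
task.contribute("player_b")
assert task.is_complete()       # completes only with everyone on board
```

The gate makes the collaboration requirement explicit: no subset of players, however active, can finish the task on their own.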

Screenshot Above: Player rotates a bridge piece while the other player prepares to assist. The "UW rescue droids" work together to repair the bridge, rescue the wildlife, and put out the fire caused by an earthquake.

Screenshot Above: Starting point and orientation area. Players learn how to navigate and interact with objects in a VR environment before either creating or joining a multi-person cooperative scenario.

Our Current Project

Learning Sciences Research

We are a team of graduate student researchers at the University of Washington working on a VR-based experiment that will measure how VR can improve cognitive performance on difficult tasks.

We have been developing with Unity Pro and the Oculus Rift since November 2013. The second iteration, begun in September 2014, used the Oculus Rift DK2 combined with the Microsoft Kinect 2 for Windows as an input device; our newest design, in development since early 2016, incorporates the HTC Vive and full room-scale tracking. The experiment's final round of pilot testing will take place in Q3 2016, and the study data will be collected in Q4.

Participants will experience an interactive multi-player immersive VR environment we have designed and built in Unity Pro (see screenshots of our prototype below), framed by a story about training rescue droids to save wildlife and perform other heroic tasks. These droids need their spatial reasoning chips calibrated, and the only way to accomplish this is via a cognitive tele-link, which lets the player see through the droid's visual sensors and train it through experience. In the story, the goal is to recalibrate the droid's spatial reasoning chip so that it can go on rescue missions without human guidance. Players work together as a team to solve problems and complete a challenging engineering task in the virtual reality environment. They also work independently to improve their own cognitive skills through a scaffolded support program, all to "accomplish the mission."

With the launch of the HTC Vive Lighthouse VR system, participants have a more comfortable embodied experience because input and movement are mapped one-to-one to physical reality, which was not quite the case with our DK2 and Kinect design.
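The difference between one-to-one mapping and filtered tracking can be illustrated in one dimension. This is a hypothetical Python sketch (the smoothing constant is invented for illustration, and the actual tracking pipelines differ): a direct mapping reproduces a fast hand movement immediately, while an exponentially smoothed mapping, typical for noisy skeletal tracking like the Kinect's, lags behind it.

```python
def one_to_one(samples):
    """Room-scale 1:1 mapping: each tracked position drives the
    avatar directly, with no filtering."""
    return list(samples)

def smoothed(samples, alpha=0.3):
    """Exponentially smoothed mapping (common for noisy skeletal
    tracking): the avatar visibly lags behind fast physical motion."""
    out, prev = [], samples[0]
    for s in samples:
        prev = alpha * s + (1 - alpha) * prev
        out.append(prev)
    return out

track = [0.0, 0.0, 1.0, 1.0, 1.0]   # hand jumps 1 m at the third sample
print(one_to_one(track)[2])          # 1.0 -> avatar is already there
print(round(smoothed(track)[2], 2))  # 0.3 -> avatar lags the hand
```

The lag (and the spatial offset it implies) is one reason filtered camera-based tracking feels less "1 to 1" than direct Lighthouse tracking.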

After the experiment

After our current experiment is completed, we have many plans for further research into educational applications using the upcoming Oculus Touch as well as our current HTC Vive systems (we have a pair of Vives for designing additional CSCL learning spaces). The unprecedented tracking quality of the handheld controllers in these systems could introduce new directions for experiments measuring the advantages of fine hand interactions with objects. This will open up new affordable and accessible avenues for research on embodied cognition, collaborative building of 3D visualizations, and the design of new spaces for equity-focused interactions.

 

Future projects

We envision that our findings will drive our future projects, and that the assets we create will become components of immersive educational VR experiences teaching both concrete and abstract concepts. We hope that research verifying the cognitive effects of VR will help legitimize the medium as an active learning tool that our fellow educators will be interested in exploring as well.

Screenshot Above: Bridge area near the waterfall, without the fire.

Screenshot Above: Player rotates object during mental rotation practice activity. This VR design is based on prior studies suggesting benefits from 3D computer experiences for improving mental rotation skills.
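A mental rotation trial of the classic same-vs-mirrored kind can be sketched in a few lines. This Python model is illustrative only: real stimuli use full 3D rotations, whereas here rotation is restricted to the z-axis, and the sample shape is hypothetical.

```python
import math

def rotate_z(points, degrees):
    """Rotate a list of (x, y, z) points about the z-axis."""
    t = math.radians(degrees)
    c, s = math.cos(t), math.sin(t)
    return [(round(c * x - s * y, 6), round(s * x + c * y, 6), z)
            for x, y, z in points]

def mirror_x(points):
    """Reflect points across the y-z plane (the 'mirrored' foil)."""
    return [(-x, y, z) for x, y, z in points]

def same_object(a, b, step=1):
    """True if b is some z-rotation of a (i.e. not a mirror image)."""
    return any(set(rotate_z(a, d)) == set(b) for d in range(0, 360, step))

# A small asymmetric block figure (hypothetical stimulus).
shape = [(1, 0, 0), (2, 0, 0), (2, 1, 0), (2, 1, 1)]
assert same_object(shape, rotate_z(shape, 90))   # rotated copy: "same"
assert not same_object(shape, mirror_x(shape))   # mirrored copy: "different"
```

The participant's task is exactly this judgment, performed mentally: decide whether two figures are rotations of one another or mirror images.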

View from the Unity Editor showing the prototype starting area and the first multi-player cooperative training space.

Design and Research Team

Suzette - PhD candidate at the UW College of Education, studying equity-focused immersive technologies, psychological presence, and embodied cognition

 

As a former classroom teacher, I have always enjoyed using technology and games as tools to facilitate learning. I now design immersive Computer-Supported Collaborative Learning environments, created in Unity and other software and mediated by next-generation technological tools (such as VR HMDs and body-tracking cameras), and I study the effects these environments have on participants' sense of presence, embodiment, and cognitive performance.

 

Recent advances in virtual reality headset technology, including wider fields of view, lighter weight, smoother positional tracking, and higher-pixel-density displays, have significantly improved these devices' potential, especially for multi-user co-creation and spatial problem-solving. As a result of the work we (and others) are doing, participants can enjoy a true sense of self-presence, spatial presence, and social presence while interacting with others in a VR environment. Immersion -> Presence -> Performance

Matthew - PhD Student, College of Engineering, University of Washington

 

Although Matthew uses computer-based models to study Materials Science & Engineering, he also has a background in game/app development and is the primary programmer of our multi-user virtual reality application (built in Unity Pro) for the HTC Vive, as well as for the Oculus Rift head-mounted display (HMD) and Microsoft Kinect 2. He volunteers on our project as a Research Assistant because of his interest in and enthusiasm for VR technologies. Matthew is interested in developing with input devices that bring us closer to a "holodeck-like experience."

Interaction Board - coming soon

© Suzette Lewis 2013-2016