Immersive interfaces like virtual reality (VR) and augmented reality are becoming ever more powerful, eliciting investment from the likes of Google, Samsung and Microsoft. As these technologies become commercially available, their use in education is expected to grow.
“After 20 minutes of watching a good movie, psychologically you’re in the context of the story, rather than where your body is sitting,” Dede said.
Dede’s research explores how immersive interfaces can deepen what happens in classrooms and homes and complement rich, real-world educational experiences, like internships.
Two decades ago, with funding from the National Science Foundation (NSF), Dede started investigating head-mounted displays and room-sized virtual environments. Some of his early immersive cyberlearning projects included River City (a virtual world in which students learned epidemiology, scientific inquiry and history) and Science Space (virtual reality physics simulations).
In his most recent projects, Dede and his colleagues have tackled the subject of environmental education, teaching students about stewardship.
“Our projects take a stance. They’re not just about understanding,” Dede said. “There’s a values dimension of it that’s important and makes it harder to teach. This is a good challenge for us.”
With support from NSF and Qualcomm’s Wireless Reach initiative, Dede and his team developed EcoMUVE, a virtual world-based curriculum that teaches middle school students about ecosystems, as well as about scientific inquiry and complex causality.
The project uses a multi-user virtual environment (MUVE), a 3-D world similar to those found in video games, to recreate real ecological settings within which students explore and collect information.
At the beginning of the two- to four-week EcoMUVE curriculum, the students discover that all of the fish in a virtual pond have died. They then work in teams to untangle the complex causal relationships that led to the die-off. The experience immerses the students in the role of ecosystem scientists.
At the end of the investigation, all of the students take part in a mini scientific conference where they present their findings and the research behind them.
EcoMUVE was released in 2010 and took first place in the Interactive Category of the 2011 Immersive Learning Award from the Association for Educational Communications and Technology.
Writing in the Journal of Science Education and Technology, Dede and his team concluded that the tools and context provided by the immersive virtual environment helped support student engagement in modeling practices related to ecosystem science.
Whereas EcoMUVE’s investigations occur in a virtual world, the group’s follow-up, EcoMOBILE, takes place in real-world settings enhanced by digital tools and augmented reality.
“Multi-user virtual environments are like ‘Alice in Wonderland.’ You go through the window of the screen and become a digital person in an imaginary world,” Dede said. “Augmented reality is like magic eyes that show an overlay on the real world that make you more powerful.”
The researchers took the same pond (Black’s Nook Pond in Cambridge, Mass.) and asked themselves: Why not augment the pond itself and have different stations around the pond where students can go and learn? What emerged was a cyberlearning curriculum centered on location-based augmented reality (AR).
Dede’s team created a smartphone-based AR game that students play in the field. Initial studies show significant learning gains from the AR version compared with a regular field trip.
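The article does not detail how EcoMOBILE is built, but the core mechanic of location-based AR, unlocking content when a student physically reaches a learning station, can be sketched roughly as follows. The station names, coordinates and trigger radius here are invented for illustration, not taken from the actual curriculum.

```python
import math

# Hypothetical learning stations around a pond (names and coordinates are
# illustrative, not the real EcoMOBILE stations).
STATIONS = [
    {"name": "Water quality probe", "lat": 42.3840, "lon": -71.1260, "radius_m": 15},
    {"name": "Dissolved oxygen hotspot", "lat": 42.3844, "lon": -71.1253, "radius_m": 15},
]

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS fixes."""
    r = 6_371_000  # Earth radius in meters
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def stations_in_range(lat, lon):
    """Return the stations whose AR content should unlock at this position."""
    return [s for s in STATIONS
            if haversine_m(lat, lon, s["lat"], s["lon"]) <= s["radius_m"]]

# Example: a student standing next to the first station
print([s["name"] for s in stations_in_range(42.38401, -71.12602)])
```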
The virtual worlds and augmented reality are containers in which learning can be developed across a range of subjects, Dede said. The same approach could be applied to economics or local history.
But as appealing as the technology may be, truly successful immersive learning technologies are all about the “3 E’s,” according to Dede: evocation, engagement and evidence.
Evocation and engagement are fairly easy in VR and AR applications. And increasingly, gathering evidence is too.
Dede’s project uses the rich, second-by-second record of student behavior gathered by the digital tools to gauge how well it is teaching students the core concepts, allowing for more accurate individual evaluation.
“We can now collect rich kinds of big data,” Dede said. “If we understand how to interpret this data, they can help us with assessment.”
With an award from NSF, Dede and his colleagues developed a rubric for virtual performance assessments, working toward a vision of real-time diagnostics that shape the course of learning.
“The embedded diagnostic assessments—or ‘stealth assessments’—provide rich diagnostic information for students and teachers,” Dede said.
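Neither the logging format nor the assessment rubric is described here, but the stealth-assessment idea, diagnostics computed from behavior logs rather than from a separate test, can be illustrated with a small, hypothetical sketch. The event names and threshold below are invented.

```python
from collections import Counter, defaultdict

# Invented event names and threshold, purely to illustrate turning a
# timestamped behavior log into a simple diagnostic flag for the teacher.
KEY_ACTIONS = {"measure_water_quality", "sample_organism", "record_hypothesis"}
MIN_DISTINCT_ACTIONS = 2  # flag students who have tried fewer than this

def diagnose(events):
    """events: iterable of (timestamp, student_id, action) tuples."""
    actions_by_student = defaultdict(Counter)
    for _ts, student, action in events:
        actions_by_student[student][action] += 1

    report = {}
    for student, counts in actions_by_student.items():
        tried = KEY_ACTIONS & set(counts)
        report[student] = {
            "distinct_key_actions": len(tried),
            "needs_check_in": len(tried) < MIN_DISTINCT_ACTIONS,
        }
    return report

log = [
    (1, "ana", "measure_water_quality"),
    (5, "ana", "record_hypothesis"),
    (2, "ben", "walk"),
    (9, "ben", "walk"),
]
print(diagnose(log))  # flags "ben" for a check-in, not "ana"
```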
The team has even developed animated pedagogical agents (like “Dr. C,” a simulated NASA Mars expert) as a scalable way to bring people-based expertise to learning.
The latest direction for Dede’s research: using GoPro cameras to capture the interpersonal and intrapersonal dimensions of students’ activities, and equipping the AR devices with sensors so students can map critical biomes.
“There’s a whole bunch of great citizen science that we can see coming out of this,” he said.
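The article does not say which sensors would be used or how the mapping would work. As a purely illustrative sketch, geotagged readings could be pooled into a shared file that a class, or a larger citizen-science effort, could build on; the sensor name, field layout and file format below are assumptions.

```python
import csv
from datetime import datetime, timezone

# Illustrative only: the fields and the CSV export are assumptions, not a
# description of the actual tools used in Dede's projects.
def record_observation(path, lat, lon, sensor, value, unit):
    """Append one geotagged sensor reading so a class can pool its data."""
    with open(path, "a", newline="") as f:
        writer = csv.writer(f)
        writer.writerow([
            datetime.now(timezone.utc).isoformat(),
            lat, lon, sensor, value, unit,
        ])

# Example: a student logs a water-temperature reading at the pond's edge.
record_observation("biome_map.csv", 42.3841, -71.1258,
                   "water_temperature", 14.2, "C")
```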
Source: phys.org