Introducing interactive nature simulations – empathy machines for nature.
Our planet is warming, ecosystems are collapsing, and we may already be in the midst of the sixth great mass extinction on Earth, with tens of thousands of species threatened. We know people become more responsive to these issues when they begin to care about the affected animals. For example, global warming often becomes more salient to people after they see images of cute polar bears struggling to survive on melting sea ice. I believe we can help people develop even greater understanding of these pressing issues, and respond to them more readily, by letting them experience these challenges first-hand through realistic, interactive VR.
It is well established that VR apps can serve as “empathy machines”, helping users see the world through the eyes of another person with a very different lived experience, and broadening our perspectives about what it means to be human in the process. I believe we can build on this important work by creating applications that extend that empathy to the animal kingdom: engaging VR experiences that allow people to defy the limitations of their own bodies and experience the world as another living creature.
For a brief time, users will enter a new reality in which they become an animal with a vastly different experience of the world, such as a butterfly, frog, or even an octopus. Toward this end, I am developing a series of short, interactive nature simulations for the Rift in which the player embodies a particular animal. These simulations will be educational and lightly gamified, designed to be fun, immersive, and realistic while providing engaging goals, feedback, and end states. The resulting experiences will give users a simplified but accurate depiction of each animal's lived experience, allowing them to learn about these animals and their environments naturally through gameplay, rather than through more traditional forms of education such as reading, listening to a lecture, or watching a video.
The animals and environments are carefully selected based on a number of factors. First and foremost, they must present an opportunity to defy the everyday reality of our users. Our targets might be animals that live at unfamiliar scales, such as a small insect; move through the world in unfamiliar ways, such as a butterfly; or experience the world through unfamiliar sensory inputs, such as a bat using echolocation. Second, there must be a clear “fun factor” associated with the animal, like the ability to jump twenty times your body length as a frog or to soar from flower to flower on the wings of a butterfly. Third, the simulation should present an opportunity to touch on larger environmental issues, such as the impact of pollution on amphibian ecosystems or the effect of climate change on migrating monarch butterflies. Finally, users must be able to inhabit the body of the animal in VR, with the hand controllers and HMD driving the animal avatar through the virtual space in a believable, intuitive way.
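To make that last criterion concrete, here is a minimal, engine-agnostic sketch of how tracked inputs might drive a butterfly avatar. It is entirely hypothetical (none of these names come from the project, and a real build would read poses and velocities from the VR SDK each frame): a downward stroke of both hand controllers generates lift, like a wing flap, while the avatar's heading simply follows the player's head yaw.

```python
from dataclasses import dataclass

@dataclass
class TrackedInput:
    """One frame of tracked VR input (hypothetical names, not a real SDK)."""
    left_hand_vy: float   # vertical velocity of the left controller, m/s
    right_hand_vy: float  # vertical velocity of the right controller, m/s
    hmd_yaw: float        # head yaw in degrees, used to steer the avatar

def butterfly_update(inp: TrackedInput, lift_gain: float = 0.5) -> dict:
    """Map a downward stroke of both hands to lift, and head yaw to turning."""
    # Averaging the two hand velocities rewards symmetric "flapping";
    # only downward motion (negative velocity) contributes lift.
    stroke = max(0.0, -(inp.left_hand_vy + inp.right_hand_vy) / 2.0)
    return {
        "lift": stroke * lift_gain,  # upward force applied to the avatar
        "turn": inp.hmd_yaw,         # heading follows the player's gaze
    }

# A firm downward flap of both hands produces lift; raising the hands does not.
flap = butterfly_update(TrackedInput(left_hand_vy=-2.0, right_hand_vy=-2.0, hmd_yaw=15.0))
```

The design intuition is that the mapping should feel physical (you flap to fly) while remaining forgiving enough that anyone can do it intuitively on the first try.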
My hope is that our users will learn about the natural world and some of our most pressing environmental challenges while gaining new insights into how the reality we experience can be shaped by the position we hold in our shared environment.
Image Credit: Monado