2015 is shaping up to be an important year for a few technologies that have been around for a long time. Perhaps even more exciting are the new experiences that we can start to imagine when they converge and hit the mainstream. Virtual reality, haptics, and 3D printing are emerging on the popular culture scene and creating quite the stir in the process.
Virtual reality, or VR for short, has been around since at least the mid-1960s. Its goal is to transport users into a completely different reality, one created by a computer, but which is so convincing that users forget their physical bodies and inhabit this new reality. The technology has been used in military and industrial applications for decades but has generally been too expensive for consumers.
In addition to head-mounted displays (e.g. Oculus, Gear VR), virtual reality experiences rely on some type of interface device to enable users to interact with and manipulate the virtual world. These come in many varieties, including sensor gloves, styli, and game controllers. The usefulness and realism of a virtual environment is directly limited by the control interface. The current trend of using eye gaze or touchpads on head-mounted displays to control one's VR experience is barely functional, and certainly not immersive.
It turns out that without some type of tangible interaction, even the best gesture interface does not create the sense of immersion needed for a true virtual reality experience. Humans understand and build belief in the world through the use of their hands. There's overwhelming evidence in the research that users are much more proficient at performing tasks in virtual environments with tactile information than without. For this reason, while camera-based gesture technologies like Leap Motion have their place, they provide no tangibility, and something critical is still missing from the experience.
In virtual reality gaming, there are some good options: game controllers, including the Razer Hydra, provide a decent tangible interface. However, these controllers inherit the form factor and tactile experience of non-VR game controllers.
Things will get really interesting when VR experience designers appreciate the value of an AR concept called Projection Augmentation. The idea is that by projecting an interface directly onto a physical object, it can be made into an intuitive user interface: users can pick up the object and get all the inherited tactile benefits of the actual physical object, but with a flexible digital overlay. As it turns out, this idea can also be used to create perceptual illusions: a normal cylinder can be made to feel convex or concave based on the visual overlay. This is powerful, because it allows us to utilize the way people naturally perceive the world to provide interface flexibility.
For immersive VR, the concept carries over: a tangible interface (for example, a game controller) can be tracked, rendered in the virtual space and then given new physical properties by the VR experience designer. This is where haptics can play a role, expanding the design space with vibration, texture and shape.
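As a concrete (and entirely hypothetical) illustration of that idea, the sketch below tracks a physical prop and lets the experience designer assign it different virtual "materials," each with its own haptic response on contact. The types and the playHaptic call stand in for whatever tracking SDK and haptic driver a real system would use; none of them are a real API.

```cpp
// Minimal sketch of the "tracked prop + haptic overlay" idea described above.
// Pose, HapticMaterial and playHaptic are hypothetical placeholders.
#include <cstdio>

struct Pose { float x, y, z; };            // position of the physical controller
struct HapticMaterial {                    // virtual properties the designer assigns
    const char* name;
    float vibrationHz;                     // texture frequency felt on contact
    float intensity;                       // 0.0 .. 1.0 drive strength
};

// Pretend driver call: in a real system this would go to the controller's actuator.
void playHaptic(const HapticMaterial& m, float contactSpeed) {
    float drive = m.intensity * contactSpeed;      // stronger feedback for faster contact
    if (drive > 1.0f) drive = 1.0f;
    std::printf("contact with %-6s -> %.0f Hz at %.0f%% drive\n",
                m.name, m.vibrationHz, drive * 100.0f);
}

int main() {
    // The same physical prop is "re-skinned" with different virtual materials.
    HapticMaterial wood  {"wood",   60.0f, 0.4f};
    HapticMaterial metal {"metal", 250.0f, 0.9f};

    Pose controller {0.0f, 1.2f, -0.3f};   // would come from the tracker each frame
    (void)controller;                      // rendering of the virtual proxy omitted

    playHaptic(wood,  0.5f);               // slow brush against a virtual wooden rail
    playHaptic(metal, 1.0f);               // firm tap on a virtual metal surface
    return 0;
}
```

The point of the sketch is simply that the physical object stays the same while its felt properties are reassigned in software, which is what gives the designer the expanded design space described above.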
Haptics has been around for more than 50 years. Way back in the 1950s, it was developed as a way to enable technicians to service nuclear reactors by providing an operator with tactile feedback from a robot down in the fuel chamber. The idea was that tactile sensors and video cameras could capture the experience of a robot (in a dangerous situation) and when this was presented to a remote operator, it would create a sense of presence (more accurately, telepresence).
As humans, we perceive and understand the world through our senses. However, touch and vision complement each other in a unique way: vision allows us to get the big picture while touch allows us to explore, understand and feel a specific part of that big picture. Only when we can touch and feel something do we believe with certainty that what we’re looking at is real.
Of course, there are limits to how a preexisting physical object can have its properties expanded virtually, which brings us to 3D printing. If my VR controller is some type of physical artifact with haptic capabilities, then the interactive virtual experiences that can be created are limited only by what I (as a designer) can reasonably convince users that this artifact is. If I can now augment this physical artifact with arbitrary 3D printed models, the possibilities expand dramatically. Together, these three technologies enable experiences that are truly worthy of the name "virtual reality," and the line between virtual experiences and real-world experiences begins to blur.
This year, we are seeing mainstream products with haptics, VR, and 3D printing coming to market. Taken together, these technologies will enable the next generation of human experience. It will be fascinating to see new kinds of entertainment and social interactions unfold as a result.
As anyone can guess, virtual reality was undoubtedly the hot-button topic of this year's GDC.
Even though the industry is still in its infancy, it is predicted to be a $150B market by 2020. One noticeable difference from years past was a new tone and pace of excitement about virtual reality and what it means for gaming. We've come a long way since the beginnings of virtual reality in the 1950s, but we still have far to go.
At the show there were a number of headset manufacturers, peripheral makers and app developers all trying to capitalize on its huge potential. Valve, coming fresh off the announcement of their own VR headset, had an exclusive demo booth that was sold out before the conference even started. I had the opportunity to demo Oculus' Crescent Bay, their most advanced headset, and was thoroughly impressed by the feeling of immersion within the simulation and also by the complete lack of nausea.
From my observations, there is a massive race among all the stakeholders in the virtual reality industry. The headset manufacturers are competing to create the best virtual reality experience, with commercial versions expected to release at the end of this year or early next year. The peripheral makers are also busy at work, creating products that enhance the overall virtual reality experience, like Virtuix's Omni treadmill. The treadmill tracks the position of the player's feet while he is wearing the headset, which lets the player feel as if he is running around in the game. Peripherals like the Omni treadmill are vital to the virtual reality experience. They give players a sense of agency within virtual reality. Instead of watching from a third-person perspective, the player controls his own movements in the game through his actions in real life.
I attended an insightful lecture on Sony’s Project Morpheus and the challenges behind virtual reality. Sony boasted that it achieved immersion in its VR headset (the feeling of being surrounded by an alternate world), but that its next step is to achieve presence (the feeling of agency in the alternate world).
For virtual reality to be complete, it is not enough to be immersed in an alternate world. The user must be able to interact with his surroundings so that the brain is tricked into thinking that the virtual world is reality. At the moment, there is a disconnect between the virtual world and reality. The first thing that I did when I put on the headset was to stick my hands out in front of me. I was slightly disappointed when I didn’t see my hands, and my brain immediately framed it simply as a simulation that I was peering into. When we talk about presence in virtual reality, I believe that haptics needs to be a central part of the discussion.
Haptics bridges the disconnect between the virtual world and reality. It provides confirmation to players that what they are experiencing is real. Imagine if you could feel the sand while walking along a virtual beach. It gives players a greater sense of agency – they actually feel like they're there. Haptics is not a new concept in gaming. We've seen haptics in the console space, most notably in the rumble features of Xbox and PlayStation controllers. When the controller rumbles as you're getting hit by a barrage of bullets, you instinctively know to run for cover and get out of the line of fire. Haptics tricks your brain into thinking that you are the game character. In the same way, haptics can have a significant impact in virtual reality. In fact, it is a necessary feature, since the ultimate goal of virtual reality is to achieve full presence.
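To make that idea concrete, here is a minimal, hypothetical sketch of how a game might map a "hit by gunfire" event onto a two-motor rumble pad, driving the motor on the side the shot came from harder so the cue itself suggests where to run. The Rumble struct and setRumble call are placeholders, not any real controller API.

```cpp
// Hedged sketch of mapping a gameplay event to dual-motor rumble.
#include <algorithm>
#include <cstdio>

struct Rumble { float left; float right; };   // 0.0 .. 1.0 per motor

// Map the direction a hit came from (-1 = left, +1 = right) and its damage
// into asymmetric motor drive, so the feedback hints where to take cover.
Rumble hitFeedback(float direction, float damage) {
    float base = std::min(damage, 1.0f);
    Rumble r;
    r.left  = base * (direction < 0 ? 1.0f : 0.4f);
    r.right = base * (direction > 0 ? 1.0f : 0.4f);
    return r;
}

void setRumble(const Rumble& r) {             // stand-in for the real driver call
    std::printf("rumble L=%.2f R=%.2f\n", r.left, r.right);
}

int main() {
    setRumble(hitFeedback(-1.0f, 0.8f));      // heavy fire from the left
    setRumble(hitFeedback(+1.0f, 0.3f));      // a grazing shot from the right
    return 0;
}
```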
After trying the Crescent Bay, I know now that virtual reality is going to be the next big thing. People are going to be using virtual reality not just for gaming, but in all facets of life – from education to medical to social media. The possibilities are truly endless. It is no surprise that headset manufacturers, peripheral makers, and app developers are all rushing into this industry. It’ll be a glorious day when I can finally experience virtual reality in full presence. In the meantime, I’ll be playing around patiently on my Google Cardboard.
Watch Dogs has been the talk of the town in the gaming world lately.
Going into E3 this week, I’m reminded of when I first saw the Watch Dogs game trailer at the show in 2012. I knew it was going to push the limits of immersive game play.
Throughout the game, it is clear that the developers spent their time detailing the environment to create an enhanced, virtual Chicago landscape. The cityscape sets the tone for the environment, down to the shadows and lighting effects that are specific to the changing weather patterns. The level of interactivity throughout the game is as good as it gets. I personally like how you can switch on your Swiss Army knife of a cellphone to access more game play options (hint: you'll instantly get useful information on anyone in the immediate area, and you'll find yourself standing and looking around quite a bit between missions).
Watch Dogs has haptics (as any good game would), and the audio and haptic vibration effectively immerse gamers in the environment. Gamers experience an array of sounds, from ambient wind and thunder to industrial city sounds of cars, alarms, weapons, gates, forklifts, breaking glass and bursting steam valves. The haptics goes beyond the standard "crash – impact." You can feel the difference between fully colliding with a car and slightly grazing one on the freeway. The ability to sense your car shift gears, land after being airborne, or bump into someone on the street gives you additional feedback that enhances your game play. I believe that any type of overt feedback, and haptics in particular, is an added advantage to a gamer's game play because it sharpens your ability to react.
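As a rough illustration of how that collision-versus-graze distinction might be driven under the hood, here is a hedged sketch (not actual Watch Dogs code) that scales feedback strength and duration with the collision impulse. The numbers and the pulse() call are illustrative assumptions only.

```cpp
// Sketch: feedback strength and duration scale with collision impulse.
#include <algorithm>
#include <cstdio>

void pulse(float strength, int milliseconds) {        // placeholder actuator call
    std::printf("pulse %.2f for %d ms\n", strength, milliseconds);
}

void collisionFeedback(float relativeSpeed, float overlap) {
    // Impulse grows with closing speed and with how squarely the cars meet.
    float impulse = std::min(relativeSpeed * overlap, 1.0f);
    if (impulse < 0.15f)
        pulse(0.2f, 40);      // graze: a brief, light buzz
    else
        pulse(impulse, 250);  // square hit: longer, stronger feedback
}

int main() {
    collisionFeedback(0.9f, 0.1f);   // sideswipe on the freeway
    collisionFeedback(0.9f, 1.0f);   // head-on collision
    return 0;
}
```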
The gaming world is turning into a virtual world of visual vastness, multiple interactive possibilities, and rich audio and rumble feedback, enticing gamers to happily give up hours of real-world time. Okay, so maybe that's not such an arduous task. But between us gamers, we can't argue that we don't enjoy this new virtual world. Watch Dogs effectively reminds us that it is still possible to make game design much more realistic and, more importantly, to expect more of this in the next generation of game titles.
We know that visual elements will continue to improve, and with virtual reality headsets like the Oculus Rift (which is being tested by everyone from aging vets to traders), I expect to see much more development in this area over the next few years. Game audio elements will also expand beyond what is used today. We already have games, like Dead Rising 3, that use the Xbox Kinect microphone to lure zombies into traps using your voice. That points to an area full of innovative potential for different types of game control, creating interactions beyond the gamepad.
In the area of haptics, we're investigating how we can make better game controllers, whether peripherals or personal devices. With the broader range of distinct feedback, cues, and "feels" that we can create with today's haptic technology, there is still a lot to explore.
We've been using the same two-motor rumble pads to feel our games f-o-r-e-v-e-r. Now that gaming platforms are redefining themselves, the time is right to crack open the casing and take a second look at what's next.
Imagine if you could feel the hacking missions in Watch Dogs with subtle tactile feedback, like a safe cracker feels tumblers in a safe. Or imagine feeling a crisp click as you unlock a door instead of the typical slower rumble of today’s rumble pads. If our game controllers made game interactions feel more like our real life experiences, we’d become more connected to these virtual worlds like the one in Watch Dogs.
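To show what that contrast might look like in practice, here is a small, hypothetical sketch of the two "feels": a crisp click rendered as one short, high-frequency burst, next to the familiar long, low-frequency rumble. The Effect struct and play() call are stand-ins, not a real haptics SDK.

```cpp
// Sketch contrasting a crisp click with a slow rumble as haptic effect parameters.
#include <cstdio>

struct Effect {
    const char* name;
    float frequencyHz;     // how fast the actuator oscillates
    int   durationMs;      // how long the effect lasts
    float magnitude;       // 0.0 .. 1.0 drive strength
};

void play(const Effect& e) {                           // stand-in driver call
    std::printf("%-12s %5.0f Hz, %4d ms, %.0f%%\n",
                e.name, e.frequencyHz, e.durationMs, e.magnitude * 100.0f);
}

int main() {
    Effect click  {"lock click", 300.0f,  20, 0.8f};   // sharp and over instantly
    Effect rumble {"old rumble",  40.0f, 400, 0.6f};   // the two-motor standby
    play(click);
    play(rumble);
    return 0;
}
```

The design choice is simply that shorter, higher-frequency effects read as precise mechanical events, while longer, lower-frequency ones read as diffuse impacts, which is why a single rumble motor can't convey a lock tumbler falling into place.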
Watch Dogs is only one representation of the growing world of immersive game play. Ubisoft has done a good job of creating something more than just a game; they've created a worthy experience. It is time to ask: what else can we do? Let me know what you think by contacting me at @BobHeubel.