2015 is shaping up to be an important year for a few technologies that have been around for a long time. Perhaps even more exciting are the new experiences that we can start to imagine when they converge and hit the mainstream. Virtual reality, haptics, and 3D printing are emerging on the popular culture scene and creating quite the stir in the process.
Virtual reality, or VR for short, has been around for a long time – since at least the mid-1960s. Its goal is to transport users into a completely different reality, one created by a computer but so convincing that users forget their physical bodies and inhabit this new reality. The technology has been used in military and industrial applications for decades but has generally been too expensive for consumers.
In addition to head-mounted displays (e.g. Oculus, Gear VR), virtual reality experiences rely on some type of interface device to enable users to interact with and manipulate the virtual world. These come in many varieties, including sensor gloves, styli, and game controllers. The usefulness and realism of a virtual environment is directly limited by the control interface. The current trend of using eye-gaze or touchpads on head-mounted displays to control one’s VR experience is barely functional, and certainly not immersive.
It turns out that without some type of tangible interaction, even the best gesture interface does not create the sense of immersion needed for a true virtual reality experience. Humans really do understand and build belief in the world through the use of their hands. There’s overwhelming evidence in the research that users are much more proficient at performing tasks in virtual environments with tactile information than without. That’s why, although I think camera-based gesture technologies like Leap Motion have their place, they provide no tangibility, and something critical is still missing from the experience.
In virtual reality gaming, there are some good options: game controllers, including the Razer Hydra, provide a decent tangible interface. However, these controllers inherit the form factor and tactile experience of non-VR game controllers.
Things will get really interesting when VR experience designers appreciate the value of an AR concept called Projection Augmentation. The idea is that by projecting an interface directly onto a physical object, it can be made into an intuitive user interface: users can pick up the object and get all the inherited tactile benefits of the actual physical object, but with a flexible digital overlay. As it turns out, this idea can also be used to create perceptual illusions: a normal cylinder can be made to feel convex or concave based on the visual overlay. This is powerful, because it allows us to utilize the way people naturally perceive the world to provide interface flexibility.
For immersive VR, the concept carries over: a tangible interface (for example, a game controller) can be tracked, rendered in the virtual space and then given new physical properties by the VR experience designer. This is where haptics can play a role, expanding the design space with vibration, texture and shape.
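To make that idea concrete, here is a minimal sketch in Kotlin. No particular VR SDK is assumed: Pose, HapticOutput, VirtualOverlay, and onSurfaceContact are hypothetical stand-ins for the tracker, the controller’s actuator, the designer-assigned overlay, and the runtime hook.

```kotlin
// Hypothetical sketch – no real VR SDK is assumed.

data class Pose(val x: Float, val y: Float, val z: Float)

// Assumed abstraction over the controller's vibration actuator.
interface HapticOutput {
    fun pulse(amplitude: Float, durationMs: Long)  // amplitude in 0.0..1.0
}

// The "new physical properties" a designer assigns to the real controller.
class VirtualOverlay(
    val meshName: String,        // what the controller appears to be in VR
    val surfaceRoughness: Float  // 0.0 (glassy) to 1.0 (coarse); drives texture feedback
)

// Called whenever the tracked controller slides along a virtual surface:
// render the overlay mesh at the tracked pose (rendering omitted), then play
// a texture-like pulse keyed to the *virtual* surface, not the real shell.
fun onSurfaceContact(pose: Pose, overlay: VirtualOverlay, haptics: HapticOutput) {
    haptics.pulse(amplitude = overlay.surfaceRoughness, durationMs = 15L)
}
```

The appeal of this design is that the same plastic shell in the user’s hand can be made to read as wood, stone, or metal simply by swapping the overlay and its haptic profile.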
Haptics has been around for more than 50 years. Way back in the 1950s, it was developed as a way to enable technicians to service nuclear reactors by providing an operator with tactile feedback from a robot down in the fuel chamber. The idea was that tactile sensors and video cameras could capture the experience of a robot (in a dangerous situation) and when this was presented to a remote operator, it would create a sense of presence (more accurately, telepresence).
As humans, we perceive and understand the world through our senses. However, touch and vision complement each other in a unique way: vision allows us to get the big picture while touch allows us to explore, understand and feel a specific part of that big picture. Only when we can touch and feel something do we believe with certainty that what we’re looking at is real.
Of course, there are limits to how far a preexisting physical object’s properties can be expanded virtually, which brings us to 3D printing. If my VR controller is some type of physical artifact with haptic capabilities, then the interactive virtual experiences that can be created are limited only by what I (as a designer) can reasonably convince users that this artifact is. If I can now augment this physical artifact with arbitrary 3D-printed models, the possibilities expand dramatically. Together, these three technologies enable experiences that are truly worthy of the name “virtual reality,” and the line between virtual experiences and real-world experiences begins to blur.
This year, we are seeing mainstream products with haptics, VR, and 3D printing coming to market. Taken together, these technologies will enable the next generation of human experience. It will be fascinating to see new kinds of entertainment and social interactions unfold as a result.
After more than a decade of running an audio production company for games, and evangelizing, promoting, and praising the virtues and importance of music, sound effects, and voice over in game play experiences, I have to admit that the irony behind this blog entry is a bit uncanny.
While I will always be a proponent of the value audio provides in the media we consume, I’ve somehow found myself on the other side of the fence when it comes to audio experience in the mobile game industry.
Let me be clear: I still believe that audio is a core component of the mobile gaming experience. Audio on the mobile platform is the salt and pepper on your steak and eggs – the meal is pretty good on its own, a little better with some added flavor.
However, as opposed to the PC/console experience – where without auditory cues you would be severely handicapped as a player, if able to play the game at all – audio is actually somewhat problematic when it comes to mobile.
Candy Crush, as the ubiquitous example, has more than 100 million downloads through the Google Play store. And while most everyone who is familiar with Candy Crush has heard the Candy Crush song, the delightful clicking and popping of candies falling and being crushed, and the deep soulful voice saying “Delicious” after a fortuitous cascade of matches, it’s likely that only a small percentage have kept the audio on while struggling through all the levels up to 78, 325, or 1,495 (yes, they’re on level 1,495 and counting).
This shouldn’t come as a surprise – whatever mobile game you most recently downloaded, there is a high chance you’ve played it with the sound off. Why? Because you’re in a public area, you don’t want anyone to know that you’re playing a game, you don’t want to wear a headset, or the same audio track repeated over and over has become too distracting. It could be any one of these reasons, but you’ve now grown accustomed to playing mobile games without audio.
Candy Crush, or the equivalent highly addictive mobile game, has inadvertently trained you to turn off the audio on your phone while playing the game. The storage budget needed to support a rich and varied landscape of audio cues simply doesn’t exist on the mobile platform. No matter how beautifully arranged or perfectly crafted the music and SFX are, it is simply impossible to avoid repetition when you limit the quantity of music to five minutes of total running time and expect players to stay engaged for dozens of hours.
So, what do we do? As game designers and creators of a fundamentally creative expression of art and design, how do we embrace this new paradigm in which audio, once a key tool for enriching game experiences, is now effectively gone? We are reduced from an A/V experience to just a “V”: a single sense, sight, with which to create the entire experience for engagement; a single layer with which to try to immerse the player. Imagine trying to create a film or television program without the use of audio – we would be back in the world of silent films. George Lucas is famous for stating that audio is 50 percent of the experience, but the reality in mobile gaming today is that audio is closer to five to 20 percent of the experience, depending on the type of game being played.
Now, I’m not saying that audio is an unimportant ingredient of the game design recipe. However often players choose to hear it, the audio must be there for them to enjoy, and its quality must reflect the overall production values of the game itself. But we need to think beyond audio as an engagement layer and look to alternate means of engaging gamers on the mobile platform. One obvious yet under-utilized opportunity to engage is through the sense of touch.
Touch feedback has been a part of gaming experiences since the early rumble packs, and as part of the force feedback in the controllers used by Xbox and PlayStation (such as the DualShock). The console platform has embraced touch as an opportunity to increase engagement in traditional gaming content for years, with “rumble” being a core part of nearly all gaming experiences.
This same concept is available on the mobile platform. However, unlike the rumble in gaming consoles, this is not just crude vibration. Mobile phones and the remarkable actuators within them are capable of providing a rich and dynamic spectrum of tactile effects – much more realistic than a rumble pack. Just imagine feeling the roar of a fire-breathing dragon. Better yet, try it yourself: you can download your choice of games on Google Play at Games You Can Feel.
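As a rough sketch of what that looks like in practice, here is an amplitude-controlled effect built with Android’s standard Vibrator API (available since API level 26, and requiring the VIBRATE permission in the manifest). The timing and amplitude values below are illustrative placeholders, not a tuned dragon roar.

```kotlin
import android.content.Context
import android.os.VibrationEffect
import android.os.Vibrator

// Illustrative sketch: a rising "roar" as three strengthening pulses,
// then a long heavy rumble. Values are placeholders, not a tuned effect.
fun playRoar(context: Context) {
    val vibrator = context.getSystemService(Context.VIBRATOR_SERVICE) as Vibrator
    if (!vibrator.hasAmplitudeControl()) return  // simple motors can't vary strength

    // Segment i lasts timings[i] ms at strength amplitudes[i] (0..255).
    val timings = longArrayOf(0, 40, 60, 40, 60, 40, 60, 400)
    val amplitudes = intArrayOf(0, 60, 0, 120, 0, 200, 0, 255)
    vibrator.vibrate(VibrationEffect.createWaveform(timings, amplitudes, -1))  // -1 = play once
}
```

Compare that to a classic rumble pack, which effectively has two states: on and off.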
When thinking about how game designers should adapt and evolve to provide this critical information to players without audio, we need to recognize that there is only so much that can be done visually, especially on a device with a limited pixel count and a compact viewable area (as compared to PC/console gaming).
Flashing lights and beeping noises can be distracting or annoying (thus the mute switch). To get past this, we need to embrace the reality that people will play with the mute switch on, and look for opportunities to replace this layer of engagement with another sensory experience – the sense of touch. Today we are limited to sight, sound, and touch; let’s use touch to its full ability. Then we can start to explore smell and taste another day.