Mobile has changed the dynamic of gaming in a dramatic way.
Over the last few years, game developers have grown accustomed to mobile as one of the platforms on which they can engage consumers. The approach for designing a top-tier mobile game still includes most of the same components – game design, action, gamer engagement, sound design, user retention strategies. However, the way these components are executed on the mobile gaming platform can have a huge impact on success.
In a recent contributed article, Exploring Mobile Games as an Engaging Platform, posted on Gamesauce last month, I recapped a panel conversation I had with a few industry leaders, namely Jeff Drobick of Tapjoy, Jeffrey Cooper of Samsung, and David Zemke of DeNA. On the panel we debated the balance between the left brain and the right brain when it comes to designing mobile games.
The one consensus across the industry experts was that creative game design remains the highest priority in gaining and retaining users. It makes sense. After all, you can’t even start to think about how you can make money until you have gamers playing your game. From that perspective, every piece of the design is a balancing act. The mobile device is a unique platform because of the way gamers use their handsets. (See my blog post on Embracing the Mute Button.) Maintaining a lasting relationship with the user is exponentially harder on a mobile device; however, the opportunity to reach a broader audience makes it a worthwhile effort.
This is an ever-present component of how we look at tactile design at Immersion. There is huge benefit to adding tactile effects to games. We know it increases retention, improves people’s intent to share their experience with others, and gives users an overall better impression of the game’s design quality. We also know that, at some point, a tactile design can go bad. This happens when design principles are not followed and the strategy for tactile design is not completely thought through.
Touch is a great engagement tool, and doing it right can mean a lot to your users: it can result in a more elegant game design, higher retention, and new monetization opportunities. Getting the balance right means the focus should be on improving game design and increasing retention before monetization is considered. It is possible to use haptics for both. We should continue to explore how users play their mobile games to find other creative avenues for monetization, even as we consciously design haptics to improve gameplay and engagement.
After more than a decade of running an audio production company for games, and evangelizing, promoting, and praising the virtues and importance of music, sound effects, and voice over in game play experiences, I have to admit that the irony behind this blog entry is a bit uncanny.
While I will always be a proponent of the value audio provides in the media we consume, I’ve somehow found myself on the other side of the fence when it comes to audio experience in the mobile game industry.
Let me be clear: I still believe that audio is a core component of the mobile gaming industry. Audio on the mobile platform is the salt and pepper on your steak and eggs – pretty good on its own, a little better with some added flavor.
However, as opposed to the PC/console experience, where without auditory cues you would be severely handicapped as a player, if able to play the game at all, audio is actually somewhat problematic when it comes to mobile.
Candy Crush, as the ubiquitous example, has more than 100 million downloads through the Google Play store. And, while most everyone who is familiar with Candy Crush has heard the Candy Crush song, the delightful clicking and popping of candies falling and being crushed, and the deep soulful voice saying “Delicious” after a fortuitous cascade of matches, it is likely that only a small percentage have continued to play with the audio on while struggling through all the levels up to 78, 325, or 1,495 (yes, they’re on level 1,495 and counting).
This shouldn’t come as a surprise – regardless of the mobile game you most recently downloaded, there is a high chance you’ve played it with the sound off. Why? Because you’re in a public area, you don’t want anyone to know that you’re playing a game, you don’t want to wear a headset, or the same audio track repeating over and over has become too distracting. It could be any one of these reasons, but you’ve now grown accustomed to playing mobile games without audio.
Candy Crush, or the equivalent highly addictive mobile game, has inadvertently trained you to turn off the audio on your phone while playing the game. The storage budget needed to support a rich and varied landscape of audio cues simply doesn’t exist on the mobile platform. No matter how beautifully arranged or perfectly crafted the music and SFX are on mobile, it is simply impossible to avoid repetition when you limit the quantity of music to five minutes of total running time and expect players to stay engaged for dozens of playing hours.
So, what do we do? As game designers and creators of a fundamentally creative expression of art and design, how do we embrace this new paradigm in which audio, once a key tool to enrich game experiences, is now effectively gone? We are reduced from an A/V experience to just an “A”. A single sense, sight, with which to create the entire experience for engagement. A single layer with which to try to immerse the player in the experience. Imagine trying to create a film or television program without the use of audio. We would be back in the world of silent films. George Lucas is famous for stating that audio is 50 percent of the experience, but the reality in mobile gaming today is that audio is closer to five to 20 percent of the experience, depending on the type of game being played.
Now, I’m not saying that audio is an unimportant ingredient of the game design recipe. However frequently audio experiences take place, audio must be there for players to enjoy when they choose, and its quality must reflect the overall production values of the game itself. However, we need to think beyond audio as an engagement layer and look to alternate means to engage gamers on the mobile platform. One obvious, yet under-utilized, opportunity to engage is through the sense of touch.
Touch feedback has been a part of gaming experiences since the early rumble packs, and continues through the force feedback in Xbox and PlayStation controllers such as the DualShock. The console platform has embraced touch as an opportunity to increase engagement in traditional gaming content for years, with “rumble” being a core part of nearly all gaming experiences.
This same concept is available on the mobile platform. However, unlike the DualShock or rumble in gaming consoles, these are not just simple vibrations. Mobile phones and the amazing actuators within them are capable of providing a rich and dynamic spectrum of tactile effects. On a mobile phone you can create tactile experiences that are far more realistic than a rumble pack. Just imagine feeling the roar of a fire-breathing dragon. Or, right now, you can download your choice of games on Google Play at Games You Can Feel.
When thinking about how game designers should adapt and evolve to provide this critical information to players without audio, we need to consider that there is only so much that can be done visually, especially on a device where the pixel count is limited and the viewable area compact (compared to PC and console gaming).
Flashing lights and beeping noises can be distracting or annoying (thus the mute switch). To get past this, we need to embrace the reality that people will play with the mute switch on, and look for opportunities to replace this layer of engagement with another sensory experience – the sense of touch. Today we are limited to sight, sound, and touch. Let’s use them to their full abilities. Then we can start to explore smell and taste another day.