MPEG Standardization is a Watershed Moment for Haptics

By Chris Ulrich, CTO, Immersion Corporation

In Dec. 2019, I hosted and led a lunch discussion at the Pan Pacific Hotel in Seattle as part of the 3rd annual SmartHaptics conference. The topic was ‘how to grow the haptic ecosystem?’, and the attendees included haptics experts from several large CE companies, key component suppliers, and leading researchers. Although haptics was rapidly becoming a commonly known and understood technology, it was clear to many that a collaborative effort would be required to realize the technology’s potential.

Two years later, as I plan my trip to the 5th SmartHaptics, the landscape has materially changed. It is apparent that the combined efforts of the past two years, by many members of the ecosystem, are on track to establish haptics as a true media and experience-enabling technology for the mass market. Recent developments at MPEG136, and in other standards bodies and industry consortia, should be viewed as a watershed moment for the haptic ecosystem.

The Challenge and the Opportunity

Since its earliest days, haptics has demonstrated tremendous potential to enable tangible value for training and simulation, virtual reality, gaming, personal computing, mobile, and automotive markets. The sense of touch has unique properties that distinguish it from auditory and visual sensation. These unique properties have been scientifically demonstrated and are anecdotally well understood by users. It’s now hard to imagine a consumer electronics device that doesn’t have some type of haptic feedback or capability.

It’s apparent to experts that, with all this latent hardware capability, haptics should become a creative medium whose potential is only now being understood. The original idea of creating ‘feelies’ (as in Aldous Huxley’s Brave New World), that is, media experiences in which haptics is an essential element, has been a challenge for the haptic ecosystem to realize. There are many reasons for this, but key among them is the need for a virtuous cycle of touch-enabled content. End users struggle to imagine how touch could enhance their media experiences, in part because they have no direct knowledge of such experiences. Content creators struggle to imagine how to incorporate touch into their experiences because of a lack of creation tools and direct experience with the possibilities.

At that discussion in 2019, the consensus was that enabling creatives to make haptic experiences that can be played back on many possible hardware devices is the unlocking move to set up this virtuous cycle. Facilitating a design-once, playback-many market structure benefits users by enabling more choice, benefits creatives by opening a larger market, and benefits the ecosystem by enabling competition with an expectation of consistency. Ideally, haptic content created for an Android phone should be playable on a high-end VR headset or on a haptic chair without loss of creative integrity.

The Value of Walled Gardens for Haptic Content

There are already compelling examples of how successful this virtuous cycle can be. Both the PlayStation 5/DualSense and the Apple iPhone (6s and later) have high-performance haptics and provide a guarantee of consistency to content creators. It’s not surprising that the haptic capabilities of these devices are recognized and valued by end users and creators alike.

The PlayStation 5/DualSense console, in addition to being enormously successful, set a new standard for haptics in consumer gaming. The launch title ‘Astro’s Playroom’ is a masterclass in effectively integrating the advanced kinesthetic triggers and wideband voice coil motors into an immersive and compelling gaming experience. Its design ideas have been incorporated into many other leading titles on the PS5 platform, to rave reviews. I would argue that the success of this platform as a haptic system is built on tooling, consistent performance, and easily copied examples.

  • Tooling – Sony developed and made available rich APIs and creative tooling specifically for PS5 developers and ideally suited for the DualSense capabilities.
  • Consistent Performance – On the PS5 system, all connected gamepads have identical motors and control electronics, and all behave identically. For content creators, there is an implicit guarantee that all users will have the same haptic experience.
  • Examples – The creative and design goals of haptics are, in many cases, distinct from those of audio and video. Instead of leaving game developers to figure these out on their own, Sony provided Astro’s Playroom as a ‘worked example’ of where, how, and why to use the haptic functionality in the DualSense.

Apple’s iPhone/iOS platform, in many ways, provided the template for haptics on the PlayStation 5. Since the introduction of the Taptic Engine in 2015, Apple has continued to develop and provide app developers with a complete solution that enables rapid creation of compelling haptics.

  • Tooling – Apple created AHAP, a JSON-based haptic effect schema that offers an easy-to-understand yet highly expressive language for creating haptic effects.
  • Consistent Performance – iOS and the hardware team at Apple ensure that AHAP content created for one iPhone feels the same for users across all devices. This means that developers only need to create their AHAP content once and do not have to worry about content aging as new devices are released.
  • Examples – CoreHaptics is a well-documented framework and numerous examples are provided for developers, along with style guides for consistent implementation.
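To give a concrete sense of the AHAP schema, the sketch below assembles a minimal pattern of two transient "tap" events using the key names Apple documents for Core Haptics (Version, Pattern, Event, EventType, EventParameters). The `transient` helper and the two-tap pattern are my own illustration, not Apple sample code:

```python
import json

def transient(time, intensity, sharpness):
    """One AHAP transient event: a short 'tap' at a point on the timeline."""
    return {"Event": {
        "Time": time,
        "EventType": "HapticTransient",
        "EventParameters": [
            {"ParameterID": "HapticIntensity", "ParameterValue": intensity},
            {"ParameterID": "HapticSharpness", "ParameterValue": sharpness},
        ]}}

# A two-tap "heartbeat" pattern: a strong tap, then a softer echo 250 ms later.
ahap = {
    "Version": 1.0,
    "Pattern": [
        transient(0.00, 1.0, 0.6),
        transient(0.25, 0.5, 0.3),
    ],
}

print(json.dumps(ahap, indent=2))
```

The appeal for creators is that a pattern like this is plain data: it can be authored in any tool, versioned, and handed to the OS, which guarantees it feels the same on every supported iPhone.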

Both Apple and Sony have the advantage of a completely vertically integrated business model (i.e., a walled garden) that allows direct control over hardware, software, and user experience. For haptics, this control has proven to be the distinguishing factor that separates a rich content-driven ecosystem from one that struggles to go beyond basic haptic functionality.

The Value of MPEG Standardization

For the haptics ecosystem to truly realize the potential of the medium, it needs to go beyond a few walled gardens, no matter how successful; it needs tooling and consistent performance that match creative expectations regardless of the haptic hardware. This is the next milestone for the industry and the goal against which we are making real progress in ISO/MPEG.

In early 2020, Immersion initiated a broad MPEG initiative with the specific goal of establishing a broadly adoptable coded representation for haptics. MPEG has done this before with MP3, with HEVC, and with many other media representations. MPEG is the worldwide standards body for media coding, and there is no better venue in which to realize haptic media coding.

The effort consists of four components:

  • ISOBMFF (ISO base media file format): The first step is to establish haptics as a first-order media type, alongside audio and video, within the base media file format. We are happy to report that this is now an International Standard, ISO/IEC 14496-12 (7th Edition), which was approved on Nov. 5, 2021.
  • MPEG-I Haptics Phase 1: Working with InterDigital, Interhaptics, Apple, Lofelt, Foster, and many others, a working group was established in mid-2020 to outline the requirements for a coded representation of timed haptic media. Timed haptic media is any haptic effect played on a timeline (this covers most vibrotactile effects). As of MPEG136 (Oct. 2021), a reference technology has been chosen by MPEG and is now undergoing refinement. Once complete, this reference will become part of the MPEG-I suite of immersive media standards (ISO/IEC 23090).
  • MPEG-I Haptics Phase 2a: This standard will extend the work done in Haptics Phase 1 by adding a spatial component to the coded representation. In this way, timed haptic effects can be located in space and associated with an immersive media experience in XR. Work has begun in establishing the requirements for this phase and we expect to run a competitive proposal process in 2022.
  • MPEG-I Haptics Phase 2b: This standard will extend the work done in Haptics Phases 1 and 2a by incorporating interactivity into the coded representation. Once complete, this standard will enable high-degree-of-freedom, immersive XR experiences to be coded using MPEG technologies – truly an enabling standard for the metaverse.
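To make "timed haptic media" and the design-once, playback-many goal concrete, here is a hedged sketch, not the actual MPEG-I syntax: a device-independent effect described as intensity keyframes on a timeline, which each device resamples at its own update rate. The keyframe list and the `render` helper are illustrative assumptions of my own:

```python
# A hypothetical device-independent timed effect: (time_s, intensity 0..1) keyframes.
# One description, many device renderers - the essence of design-once, playback-many.
EFFECT = [(0.00, 0.0), (0.05, 1.0), (0.20, 0.4), (0.50, 0.0)]

def render(effect, sample_rate_hz):
    """Resample intensity keyframes at a device's update rate (linear interpolation)."""
    end_time = effect[-1][0]
    samples = []
    for i in range(int(end_time * sample_rate_hz) + 1):
        t = i / sample_rate_hz
        # Find the pair of keyframes surrounding t and interpolate between them.
        for (t0, v0), (t1, v1) in zip(effect, effect[1:]):
            if t0 <= t <= t1:
                frac = (t - t0) / (t1 - t0) if t1 > t0 else 0.0
                samples.append(v0 + frac * (v1 - v0))
                break
    return samples

phone = render(EFFECT, 200)  # e.g. a phone actuator updated 200 times/s
chair = render(EFFECT, 50)   # e.g. a haptic chair updated 50 times/s
```

The point of a standardized coded representation is exactly this separation: the creator authors the timeline once, and each compliant device is responsible for rendering it faithfully within its own capabilities.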

At each stage in this standardization process, the ecosystem is empowered to implement these standards and thus provide an expectation of experiential consistency to content creators. This, in turn, will allow tools vendors such as Adobe, Unity, and Unreal to implement support for the standard and unlock the creative potential of haptics for their creative communities.

How to Contribute

Standardization at this level requires commitment and shared vision from the entire ecosystem. The payoff for this commitment is a much larger market and many new opportunities for commercial activity relative to haptics. All members of the ecosystem can benefit from participating, contributing, and shaping these standards, which will define the haptics community in much the same way as MP3 shaped the audio entertainment industry.

It’s quite easy to get involved in the MPEG process because MPEG rules require that all working meetings between the official MPEG meetings be open to the public. All discussions and email archives are freely available to the public and can be accessed at the link below.

MPEG Haptics Ad-hoc Group Mailing List: https://lists.aau.at/mailman/listinfo/mpeg-haptics

In addition to the MPEG work, the Haptics Industry Forum (HIF) has established working groups that discuss and develop commercially driven recommended practices and position papers that feed into the standards process. The Haptics Industry Forum is a fee-based consortium found here: https://hapticsif.org/.

The Conclusion – For Now

Since that lunch in 2019, the industry has made substantial progress towards maturing haptics as a media type and allowing it to reach its true creative potential. It is only through the collective effort of many experts and the commitment from their organizations that this has been possible. With the ratification of haptics in ISOBMFF and the reference technology for MPEG, we have successfully executed the first big step, but there is still much to do.

Please join us in creating the future of touch and establishing a new haptics ecosystem that realizes the potential of the medium and which has a vibrant creative community constantly pushing us to the future of immersive experience.