{"id":25532,"date":"2020-05-13T02:09:00","date_gmt":"2020-05-13T02:09:00","guid":{"rendered":"https:\/\/www.immersion.com\/?p=25532"},"modified":"2020-07-14T23:18:44","modified_gmt":"2020-07-14T23:18:44","slug":"the-haptic-stack-software-layer","status":"publish","type":"post","link":"https:\/\/www.immersion.com\/ja\/the-haptic-stack-software-layer\/","title":{"rendered":"The Haptic Stack \u2013 Software Layer"},"content":{"rendered":"\n<p>By Chris Ullrich, CTO at Immersion<\/p>\n\n\n\n<p>Over the last two months,\nI introduced the haptic stack \u2013 a conceptual framework that we use at Immersion\nto think about the key technology components of an&nbsp;engaging haptic\nexperience \u2013 and described the hardware layer. As a quick refresher, the stack\nconsists of three key&nbsp;layers: design, software,&nbsp;and hardware\n(see&nbsp;<a href=\"https:\/\/www.immersion.com\/the-haptic-stack\/\">this post<\/a>&nbsp;for more detail).\nThese layers work together in a highly interdependent way to create a tactile\nexperience that is in harmony with the overall UX of a product or\nexperience.&nbsp;Product designers need&nbsp;to think carefully about the\ntrade-offs in all three layers when designing a product, to deliver a valuable\nand delightful end-user experience.<\/p>\n\n\n\n<p>This month we\u2019re going\nto delve into the middle layer \u2013 the software layer. We\u2019ll primarily focus on\nsoftware that is used to build products and applications and leave application-level\nsoftware for a later post. 
<\/p>\n\n\n\n<figure class=\"wp-block-image\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"420\" src=\"https:\/\/www.immersion.com\/wp-content\/uploads\/2020\/05\/Article_stack-software-1024x420.png\" alt=\"\" class=\"wp-image-25533\" srcset=\"https:\/\/www.immersion.com\/wp-content\/uploads\/2020\/05\/Article_stack-software-1024x420.png 1024w, https:\/\/www.immersion.com\/wp-content\/uploads\/2020\/05\/Article_stack-software-300x123.png 300w, https:\/\/www.immersion.com\/wp-content\/uploads\/2020\/05\/Article_stack-software-768x315.png 768w, https:\/\/www.immersion.com\/wp-content\/uploads\/2020\/05\/Article_stack-software.png 1201w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n\n\n\n<p>The software layer interfaces\nwith the driver electronics (below) as well as with applications (above) that\nutilize haptics. <\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Effect\nRepresentations<\/h2>\n\n\n\n<p>The software layer is responsible for orchestrating the\nhaptic hardware by generating signals that can be rendered on that hardware in\nresponse to application state or user interactions. 
It is important to understand how these signals are represented, since the representation determines the choices and capabilities available to system-level software.<\/p>\n\n\n\n<figure class=\"wp-block-image\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"242\" src=\"https:\/\/www.immersion.com\/wp-content\/uploads\/2020\/05\/Article_software-diagram-1024x242.png\" alt=\"\" class=\"wp-image-25534\" srcset=\"https:\/\/www.immersion.com\/wp-content\/uploads\/2020\/05\/Article_software-diagram-1024x242.png 1024w, https:\/\/www.immersion.com\/wp-content\/uploads\/2020\/05\/Article_software-diagram-300x71.png 300w, https:\/\/www.immersion.com\/wp-content\/uploads\/2020\/05\/Article_software-diagram-768x181.png 768w, https:\/\/www.immersion.com\/wp-content\/uploads\/2020\/05\/Article_software-diagram.png 1216w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n\n\n\n<p>At the lowest level, it is common for haptic driver ICs to provide some type of buffer or command sequence in on-chip storage that can be triggered to generate a drive voltage. The data stored in these buffers is normally very close to the signal that is sent to the amplifier and is always driver IC-specific. This capability is typically used for haptic products that don\u2019t have an operating system since the buffer can be triggered directly with an I2C signal or other low-level commands. Normally these are used for button-like effects because this configuration has very low latency and limited dependence on software state. The effects themselves are also specific to the actuator and driver present in the system. While in this context the capabilities of haptic effects are highly dependent on the hardware choice, upgrading to better haptic hardware doesn\u2019t necessarily equate to more capabilities. 
With a <a href=\"https:\/\/go.immersion.com\/hd-actuator-selection-and-testing-guide\">higher-grade actuator and higher-performing driver IC<\/a>, the software layer plays a greater role in leveraging the capabilities of the haptic hardware.<\/p>\n\n\n\n<p>A more general representation of haptic effects is a timed series of amplitude values with a regular sampling rate. Operating systems such as Android and iOS can digest buffers of such signal data, typically with a 1ms sampling rate, and render them to the available haptic hardware. This format is low-level, but it is easy to generate using audio tools like Audacity. A key challenge with this effect encoding is that the resulting haptic experience will vary widely in practice due to mechanical variance at the motor level. As I noted in last month\u2019s blog, every haptic actuator has distinct performance characteristics. With a basic signal-level representation, these variances must be taken into account when the effects are created. For products that all use the same motor, this can be acceptable. Still, for application software that is intended to run on different devices (e.g., different Android models), the resulting experience will vary from \u2018<em>exactly as intended<\/em>\u2019 to \u2018<em>noisy and confusing.<\/em>\u2019<\/p>\n\n\n\n<p>Another approach to haptic effect representation is to capture the design intent of the effect using an abstract signal representation. Abstract signal representations are conceptual by design and must be interpreted by the OS software and then used to synthesize a signal-level effect (usually at runtime). The synthesis step can be accomplished by transforming the conceptual effect description using a motor model that represents the actual hardware on the device. 
This approach effectively decouples effect design from the underlying hardware and is an important component of a general haptic software solution, particularly for markets that have applications that will run on different hardware.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Some Important APIs<\/h2>\n\n\n\n<p>There are a few important in-market haptic software APIs to consider, but, notably, there is a wide variance in approach and sophistication. This variance likely represents a key barrier for application developers who want to make use of haptics since they will normally need to reimplement\/test their application on each platform and hardware model to ensure consistency of experience.<\/p>\n\n\n\n<figure class=\"wp-block-image is-resized\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.immersion.com\/wp-content\/uploads\/2020\/05\/Article_logo-apple_1.jpg\" alt=\"\" class=\"wp-image-25575\" width=\"38\" height=\"45\"\/><\/figure>\n\n\n\n<h4 class=\"wp-block-heading\">Core Haptics (iOS)<\/h4>\n\n\n\n<p>Core Haptics was released in mid-2019 and is one of the most sophisticated in-market examples of a haptic API. Core Haptics uses an abstraction model that enables effects to be specified using ADSR envelopes (see <a rel=\"noreferrer noopener\" aria-label=\" (opens in a new tab)\" href=\"https:\/\/en.wikipedia.org\/wiki\/Envelope_(music)\" target=\"_blank\">Wikipedia<\/a>). These envelopes are parsed at runtime within iOS and used to synthesize a motor-specific signal that is sent to the amplifier and ultimately to the actuator. 
Core Haptics is a rich and expressive API, and I highly recommend reading this blog post at <a href=\"https:\/\/medium.com\/@Lofelt_Gmbh\/everything-you-ever-wanted-to-know-about-core-haptics-840739fda61d\" target=\"_blank\" rel=\"noreferrer noopener\" aria-label=\" (opens in a new tab)\">Lofelt<\/a>, which provides a great technical overview.<\/p>\n\n\n\n<figure class=\"wp-block-image is-resized\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.immersion.com\/wp-content\/uploads\/2020\/05\/Article_logo-google_1.jpg\" alt=\"\" class=\"wp-image-25576\" width=\"75\" height=\"26\"\/><\/figure>\n\n\n\n<h4 class=\"wp-block-heading\">Android Vibrate (Google)<\/h4>\n\n\n\n<p>Android has supported vibration control since at least version 2.0. The API is documented on <a href=\"https:\/\/developer.android.com\/reference\/android\/os\/Vibrator\" target=\"_blank\" rel=\"noreferrer noopener\" aria-label=\" (opens in a new tab)\">this website<\/a>. Android doesn\u2019t support effect abstraction and instead provides only a low-level signal interface, which is represented as a buffer of 8-bit amplitudes that are sampled at 1ms. Although this API is very capable, it doesn\u2019t provide any hardware abstraction to developers and thus suffers from the issues described above.<\/p>\n\n\n\n<figure class=\"wp-block-image is-resized\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.immersion.com\/wp-content\/uploads\/2020\/05\/Article_logo-khronos_1.jpg\" alt=\"\" class=\"wp-image-25578\" width=\"75\" height=\"19\"\/><\/figure>\n\n\n\n<h4 class=\"wp-block-heading\">OpenXR 1.0 (Khronos)<\/h4>\n\n\n\n<p>In 2019, the Khronos Group released OpenXR 1.0 (<a href=\"https:\/\/www.khronos.org\/registry\/OpenXR\/specs\/1.0\/html\/xrspec.html\">link<\/a>), which includes support for haptic feedback devices. This API is targeted at XR and gaming use cases and is intended to provide consistent, low-level hardware abstraction that enables high-performance applications. 
The effect encoding in OpenXR is a list of individual vibration effects, each of which has a fixed duration, frequency, and amplitude. Conceptually, this API is a bit more sophisticated than the signal-level API in Android. Still, it doesn\u2019t meaningfully abstract the tight hardware dependence that is a hallmark of haptic feedback and will likely suffer from the same inconsistencies present in Android if the same effect lists are played on devices with different actuators.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Software Use Cases<\/h2>\n\n\n\n<p>Given that background, let\u2019s look at some use cases of these haptic APIs and discuss the capabilities and shortcomings of each.<\/p>\n\n\n\n<h4 class=\"wp-block-heading\">Mobile and Automotive Confirmation Effects<\/h4>\n\n\n\n<p>Short, high-amplitude effects are the hallmark of button replacement use cases. Recall the quality of the Apple haptic home button when it was first released. Many users were convinced that it was a real button. This type of use case is normally supported as a driver IC-level effect buffer. The advantage of this is that the system has very low latency. 
Since buttons are normally static, it is acceptable to have limited software control over the effect itself.<\/p>\n\n\n\n<div class=\"wp-block-image\"><figure class=\"aligncenter is-resized\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.immersion.com\/wp-content\/uploads\/2020\/05\/Article_mechanical-switches-effect.jpg\" alt=\"\" class=\"wp-image-25564\" width=\"450\" height=\"265\" srcset=\"https:\/\/www.immersion.com\/wp-content\/uploads\/2020\/05\/Article_mechanical-switches-effect.jpg 900w, https:\/\/www.immersion.com\/wp-content\/uploads\/2020\/05\/Article_mechanical-switches-effect-300x176.jpg 300w, https:\/\/www.immersion.com\/wp-content\/uploads\/2020\/05\/Article_mechanical-switches-effect-768x451.jpg 768w\" sizes=\"auto, (max-width: 450px) 100vw, 450px\" \/><\/figure><\/div>\n\n\n\n<h4 class=\"wp-block-heading\">Mobile Gaming Effects<\/h4>\n\n\n\n<p>Mobile games have traditionally not made extensive use of the built-in haptic capabilities of mobile devices. Now that Apple has enabled developers to create haptic effects in their apps, we expect this to change. The ADSR abstraction introduced in Core Haptics, along with tight integration with audio playback, is ideal for haptic game effect design. This API enables developers to create haptic and audio effects together and manage them as similar assets. It would be nice to see similar functionality come to Android in Android 11.<\/p>\n\n\n\n<div class=\"wp-block-image\"><figure class=\"aligncenter is-resized\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.immersion.com\/wp-content\/uploads\/2020\/05\/Article_Game-Controller-Effects.jpg\" alt=\"\" class=\"wp-image-25558\" width=\"500\" height=\"88\"\/><figcaption> <br><br>A. A short double pulse with a brief gap between the pulses. The second pulse is much lower in magnitude than the first. <br>B. A quick repeating pattern of pulses that ends with a long decay on the last pulse<br>C. 
Four separate pulses with alternating strong and moderate magnitudes  <br><br><\/figcaption><\/figure><\/div>\n\n\n\n<p><\/p>\n\n\n\n<h4 class=\"wp-block-heading\">Game Controller Effects<\/h4>\n\n\n\n<p>The new PS5 DualSense controller will have both HD haptics and active trigger feedback. Both of these capabilities are sophisticated and have a lot of potential for game experiences. However, it is not clear that the OpenXR API is sufficient to enable this platform, and Sony has not (publicly) released any developer tools. It would be ideal if an abstraction-level effect encoding and API existed to facilitate widespread and cross-platform usage of advanced haptics in the next generation of console and PC gaming.<\/p>\n\n\n\n<div class=\"wp-block-image\"><figure class=\"aligncenter is-resized\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.immersion.com\/wp-content\/uploads\/2020\/05\/Article_Gunshots.jpg\" alt=\"\" class=\"wp-image-25560\" width=\"401\" height=\"233\" srcset=\"https:\/\/www.immersion.com\/wp-content\/uploads\/2020\/05\/Article_Gunshots.jpg 550w, https:\/\/www.immersion.com\/wp-content\/uploads\/2020\/05\/Article_Gunshots-300x175.jpg 300w\" sizes=\"auto, (max-width: 401px) 100vw, 401px\" \/><figcaption> <br>An explosion followed by gunshots <\/figcaption><\/figure><\/div>\n\n\n\n<h2 class=\"wp-block-heading\">Summary<\/h2>\n\n\n\n<p>This month we reviewed the software layer of the haptic stack. As with the hardware layer, there are a lot of disparate choices and approaches, and each one impacts the ability of a product to fulfill its haptic features and requirements. This is also the layer that could benefit most from industry standardization, both in terms of APIs and effect encodings. Next month, we\u2019ll look at the top of the stack and examine how haptic experiences are designed and how design ties together the entire stack. 
We\u2019ll see you then!<\/p>\n\n\n\n<p>Related articles: <\/p>\n\n\n\n<p><a href=\"https:\/\/www.immersion.com\/the-haptic-stack\/\">The Haptic Stack<\/a><\/p>\n\n\n\n<p><a href=\"https:\/\/www.immersion.com\/the-haptic-stack-hardware-layer\/\">The Hardware Layer<\/a><\/p>\n\n\n\n<p><a href=\"https:\/\/www.immersion.com\/the-haptic-stack-design-layer\/\">The Design Layer<\/a><\/p>","protected":false},"excerpt":{"rendered":"<p>By Chris Ullrich, CTO at Immersion Over the last two months, I introduced the haptic stack \u2013 a conceptual framework that we use at Immersion to think about the key technology components of an&nbsp;engaging haptic experience \u2013 and described the hardware layer. As a quick refresher, the stack consists of three key&nbsp;layers: design, software,&nbsp;and hardware [&hellip;]<\/p>\n","protected":false},"author":6,"featured_media":25595,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[17],"tags":[],"class_list":["post-25532","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-haptic-learning"],"translation":{"provider":"WPGlobus","version":"3.0.0","language":"ja","enabled_languages":["en","ja","zh"],"languages":{"en":{"title":true,"content":true,"excerpt":false},"ja":{"title":false,"content":false,"excerpt":false},"zh":{"title":false,"content":false,"excerpt":false}}},"acf":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v26.4 - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<title>The Haptic Stack \u2013 Software Layer - \u30a4\u30de\u30fc\u30b8\u30e7\u30f3<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/www.immersion.com\/the-haptic-stack-software-layer\/\" \/>\n<meta property=\"og:locale\" content=\"ja_JP\" \/>\n<meta 
property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"The Haptic Stack \u2013 Software Layer - \u30a4\u30de\u30fc\u30b8\u30e7\u30f3\" \/>\n<meta property=\"og:url\" content=\"https:\/\/www.immersion.com\/the-haptic-stack-software-layer\/\" \/>\n<meta property=\"og:site_name\" content=\"\u30a4\u30de\u30fc\u30b8\u30e7\u30f3\" \/>\n<meta property=\"article:published_time\" content=\"2020-05-13T02:09:00+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2020-07-14T23:18:44+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/www.immersion.com\/wp-content\/uploads\/2020\/05\/Article_software-diagram_feature.jpg\" \/>\n\t<meta property=\"og:image:width\" content=\"800\" \/>\n\t<meta property=\"og:image:height\" content=\"492\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/jpeg\" \/>\n<meta name=\"author\" content=\"Chris Ullrich\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Chris Ullrich\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. 
reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"7 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"WebPage\",\"@id\":\"https:\/\/www.immersion.com\/the-haptic-stack-software-layer\/\",\"url\":\"https:\/\/www.immersion.com\/the-haptic-stack-software-layer\/\",\"name\":\"The Haptic Stack \u2013 Software Layer - \u30a4\u30de\u30fc\u30b8\u30e7\u30f3\",\"isPartOf\":{\"@id\":\"https:\/\/www.immersion.com\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/www.immersion.com\/the-haptic-stack-software-layer\/#primaryimage\"},\"image\":{\"@id\":\"https:\/\/www.immersion.com\/the-haptic-stack-software-layer\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/www.immersion.com\/wp-content\/uploads\/2020\/05\/Article_software-diagram_feature.jpg\",\"datePublished\":\"2020-05-13T02:09:00+00:00\",\"dateModified\":\"2020-07-14T23:18:44+00:00\",\"author\":{\"@id\":\"https:\/\/www.immersion.com\/#\/schema\/person\/468b748fc6f539444d1157a6a3b62503\"},\"breadcrumb\":{\"@id\":\"https:\/\/www.immersion.com\/the-haptic-stack-software-layer\/#breadcrumb\"},\"inLanguage\":\"ja\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/www.immersion.com\/the-haptic-stack-software-layer\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"ja\",\"@id\":\"https:\/\/www.immersion.com\/the-haptic-stack-software-layer\/#primaryimage\",\"url\":\"https:\/\/www.immersion.com\/wp-content\/uploads\/2020\/05\/Article_software-diagram_feature.jpg\",\"contentUrl\":\"https:\/\/www.immersion.com\/wp-content\/uploads\/2020\/05\/Article_software-diagram_feature.jpg\",\"width\":800,\"height\":492},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/www.immersion.com\/the-haptic-stack-software-layer\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/www.immersion.com\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"The 
Haptic Stack \u2013 Software Layer\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/www.immersion.com\/#website\",\"url\":\"https:\/\/www.immersion.com\/\",\"name\":\"\u30a4\u30de\u30fc\u30b8\u30e7\u30f3\",\"description\":\"Experts in haptic technology building touch experiences in the digital world\",\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/www.immersion.com\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"ja\"},{\"@type\":\"Person\",\"@id\":\"https:\/\/www.immersion.com\/#\/schema\/person\/468b748fc6f539444d1157a6a3b62503\",\"name\":\"Chris Ullrich\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"ja\",\"@id\":\"https:\/\/www.immersion.com\/#\/schema\/person\/image\/\",\"url\":\"https:\/\/secure.gravatar.com\/avatar\/bfac21365e5403cc15475ad30c848d076af39ecfc457645ad484dd7daf304ef1?s=96&d=mm&r=g\",\"contentUrl\":\"https:\/\/secure.gravatar.com\/avatar\/bfac21365e5403cc15475ad30c848d076af39ecfc457645ad484dd7daf304ef1?s=96&d=mm&r=g\",\"caption\":\"Chris Ullrich\"},\"url\":\"https:\/\/www.immersion.com\/ja\/author\/cullrich\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. 
-->","yoast_head_json":{"title":"The Haptic Stack \u2013 Software Layer - \u30a4\u30de\u30fc\u30b8\u30e7\u30f3","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/www.immersion.com\/the-haptic-stack-software-layer\/","og_locale":"ja_JP","og_type":"article","og_title":"The Haptic Stack \u2013 Software Layer - \u30a4\u30de\u30fc\u30b8\u30e7\u30f3","og_url":"https:\/\/www.immersion.com\/the-haptic-stack-software-layer\/","og_site_name":"\u30a4\u30de\u30fc\u30b8\u30e7\u30f3","article_published_time":"2020-05-13T02:09:00+00:00","article_modified_time":"2020-07-14T23:18:44+00:00","og_image":[{"width":800,"height":492,"url":"https:\/\/www.immersion.com\/wp-content\/uploads\/2020\/05\/Article_software-diagram_feature.jpg","type":"image\/jpeg"}],"author":"Chris Ullrich","twitter_card":"summary_large_image","twitter_misc":{"Written by":"Chris Ullrich","Est. reading time":"7 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"WebPage","@id":"https:\/\/www.immersion.com\/the-haptic-stack-software-layer\/","url":"https:\/\/www.immersion.com\/the-haptic-stack-software-layer\/","name":"The Haptic Stack \u2013 Software Layer - 
\u30a4\u30de\u30fc\u30b8\u30e7\u30f3","isPartOf":{"@id":"https:\/\/www.immersion.com\/#website"},"primaryImageOfPage":{"@id":"https:\/\/www.immersion.com\/the-haptic-stack-software-layer\/#primaryimage"},"image":{"@id":"https:\/\/www.immersion.com\/the-haptic-stack-software-layer\/#primaryimage"},"thumbnailUrl":"https:\/\/www.immersion.com\/wp-content\/uploads\/2020\/05\/Article_software-diagram_feature.jpg","datePublished":"2020-05-13T02:09:00+00:00","dateModified":"2020-07-14T23:18:44+00:00","author":{"@id":"https:\/\/www.immersion.com\/#\/schema\/person\/468b748fc6f539444d1157a6a3b62503"},"breadcrumb":{"@id":"https:\/\/www.immersion.com\/the-haptic-stack-software-layer\/#breadcrumb"},"inLanguage":"ja","potentialAction":[{"@type":"ReadAction","target":["https:\/\/www.immersion.com\/the-haptic-stack-software-layer\/"]}]},{"@type":"ImageObject","inLanguage":"ja","@id":"https:\/\/www.immersion.com\/the-haptic-stack-software-layer\/#primaryimage","url":"https:\/\/www.immersion.com\/wp-content\/uploads\/2020\/05\/Article_software-diagram_feature.jpg","contentUrl":"https:\/\/www.immersion.com\/wp-content\/uploads\/2020\/05\/Article_software-diagram_feature.jpg","width":800,"height":492},{"@type":"BreadcrumbList","@id":"https:\/\/www.immersion.com\/the-haptic-stack-software-layer\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/www.immersion.com\/"},{"@type":"ListItem","position":2,"name":"The Haptic Stack \u2013 Software Layer"}]},{"@type":"WebSite","@id":"https:\/\/www.immersion.com\/#website","url":"https:\/\/www.immersion.com\/","name":"\u30a4\u30de\u30fc\u30b8\u30e7\u30f3","description":"Experts in haptic technology building touch experiences in the digital 
world","potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/www.immersion.com\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"ja"},{"@type":"Person","@id":"https:\/\/www.immersion.com\/#\/schema\/person\/468b748fc6f539444d1157a6a3b62503","name":"Chris Ullrich","image":{"@type":"ImageObject","inLanguage":"ja","@id":"https:\/\/www.immersion.com\/#\/schema\/person\/image\/","url":"https:\/\/secure.gravatar.com\/avatar\/bfac21365e5403cc15475ad30c848d076af39ecfc457645ad484dd7daf304ef1?s=96&d=mm&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/bfac21365e5403cc15475ad30c848d076af39ecfc457645ad484dd7daf304ef1?s=96&d=mm&r=g","caption":"Chris Ullrich"},"url":"https:\/\/www.immersion.com\/ja\/author\/cullrich\/"}]}},"_links":{"self":[{"href":"https:\/\/www.immersion.com\/ja\/wp-json\/wp\/v2\/posts\/25532","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.immersion.com\/ja\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.immersion.com\/ja\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.immersion.com\/ja\/wp-json\/wp\/v2\/users\/6"}],"replies":[{"embeddable":true,"href":"https:\/\/www.immersion.com\/ja\/wp-json\/wp\/v2\/comments?post=25532"}],"version-history":[{"count":21,"href":"https:\/\/www.immersion.com\/ja\/wp-json\/wp\/v2\/posts\/25532\/revisions"}],"predecessor-version":[{"id":25854,"href":"https:\/\/www.immersion.com\/ja\/wp-json\/wp\/v2\/posts\/25532\/revisions\/25854"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.immersion.com\/ja\/wp-json\/wp\/v2\/media\/25595"}],"wp:attachment":[{"href":"https:\/\/www.immersion.com\/ja\/wp-json\/wp\/v2\/media?parent=25532"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.immersion.com\/ja\/wp-json\/wp\/v2\/categories?post=25532"},{"taxonomy":"post_tag","embeddable":true
,"href":"https:\/\/www.immersion.com\/ja\/wp-json\/wp\/v2\/tags?post=25532"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}