Leveraging Virtual Connections with Haptic Technology

Eryn Pierce
Jan 15, 2021

To build deeper connections between hearing users and deaf and/or hard of hearing (HofH) users, designers should incorporate haptic technology into all video conferencing tools and software.

As early as the 1970s, the gaming industry embraced haptic technology. Later, beginning in the mid-nineties, telephone companies capitalized on these innovations in their mobile devices. Haptics refers to the tactile sensations one experiences while touching and interacting with an object. The vibration of a mobile interface in response to a button press, or the sustained rumble felt through a game controller during a simulated drive down a bumpy road, exemplifies the sensations of haptic technology.

Products that use haptic technology either display a reaction to an interaction from a user, which is haptic feedback, or move against or push towards an interaction, which is force feedback (MacLean et al., 2017). As this technology progresses, it continues to be an exciting frontier for product design. Users desire more unexpected experiences, and designers seek solutions that stand out in today’s product-saturated world (Palù et al., 2018). In addition to meeting these objectives, the incorporation of haptic feedback and/or force feedback tools into our products has increased their accessibility. As haptics continue to give rise to novel and accessible ways of experiencing products, users will come to expect their presence in their everyday lives.

Today, designers are shifting their focus from video games and mobile interfaces to communication technology, aiming to enhance connections currently facilitated by sight and sound alone. Incorporating haptic technology into these systems activates the sense of touch, which in turn deepens emotional connections. Touch, as a form of communication, is the first sense developed in the womb and one of the most intimate of the five senses. By implementing haptic technology, we can potentially tap into sensations of temperature, texture, pressure, vibration, pain, and movement within the body: all physical, embodied ways of gathering information.

This activation of our senses at multiple levels, beyond sight and sound, has clear benefits to the way one navigates and operates in the world. More specifically, the sense of touch can improve communication, help form social bonds, and promote emotional well-being (Portnova et al., 2020). The question then becomes where can the sense of touch serve us best within the current communication technology landscape and how can designers leverage touch in ways that have not previously been considered? Video conferencing could be the ideal platform for these innovations.

As a communication tool, video conferencing has surged in popularity due to the coronavirus pandemic. Platforms such as Zoom and Google Hangouts support schoolwork, jobs, and social needs from the comfort of home. Yet even with the recent embrace of these platforms, parts of the experience remain far from ideal. Since the move from in-person to online spaces, the limitations of remote communication tools have become apparent, with physical presence being the most obvious loss. Two-dimensional displays inadequately support communication needs, especially for those who identify as deaf and/or hard of hearing (HofH).

Currently, the readily available platforms make communication within the deaf and/or HofH community difficult to achieve. Poor captioning software and the absence of spatial video rendering place an additional burden on this population. Misinterpretation of physical, spatially oriented languages such as American Sign Language (ASL) becomes common on restrictive 2D displays, as subtleties in hand gestures and facial cues lose their impact (Ferreira et al., 2018). With these barriers in mind, questions have emerged around what constitutes an accessible communication tool.

Currently, assistive technologies are filling in those accessibility gaps within digital and video communication. However, no mainstream solution has fully embraced a haptic or force feedback direction. The most popular assistive technology exists separately from products and communication technology geared towards hearing users. This technology includes visual alerting devices, devices to enhance listening, products that convert spoken language to text, and technologies to support telecommunication (Clerc Center, 2014). Video conferencing integrates captioning features, but these additions do little to improve communication, especially between hearing users and deaf and/or HofH users.

Not surprisingly, when looking critically at current communication devices, certain communication styles are prioritized over others. For example, the everyday smartphone appears accessible, but it still perpetuates normative views around communication. Deaf and HofH users often default to voice-based options even when alternatives such as video calls and instant messaging are available (Bitman and John, 2019). This finding highlights the importance of design in the creation of superior communication tools. Offering accessible features is simply not enough. Designers must boldly dismantle ableist communication norms and commit to inclusive tools.

Thankfully, over the past 30 years, a movement has grown to address accessibility through design. After the Americans with Disabilities Act, which requires facilities and public buildings to accommodate individuals with disabilities, passed in 1990, accessible design became a key focus for designers nationwide (Irish, 2020). Accessible design was the first movement associated with the passing of these laws, and designers sought to create accessible features that met the basic requirements of the law. Most of these solutions took the form of assistive technology.

However, in the last 15 years, there has been increased awareness of the importance of products and services that extend their use and function beyond one audience type and include features that are universally usable (North Carolina State University, 2006). As this movement, also known as the Universal Design Movement, has progressed, so has the available technology, expanding the possibilities of what a designer can do. Specifically, these advancements have allowed designers to incorporate more cost-effective multi-sensory features into their products and services.

Research by Dal Palù et al. (2018) underscores the importance of multi-sensory design in creating quality user experiences and interactions. Theories that once held sight to be the most important sense in product perception have been re-evaluated with the realization that every sense plays a role. By manipulating the sensory aspects of an environment (e.g., light, sound, scent, temperature), levels of comfort can be elevated and desired behaviors elicited (Schreuder et al., 2016). Designers can weave interactions among multiple sensory modalities into products to create more holistic experiences and open opportunities for engagement by people whose dominant senses differ. By satisfying the needs of both able-bodied users and those with differing abilities, multi-sensory design becomes a key component of the Universal Design mission.

However, even with these sensing adaptations, the main barrier to communication for the deaf and/or HofH remains. Whether in healthcare, academic, or workplace environments, lack of consideration by the hearing population is the leading cause of diminished communication and support for the non-hearing (Newton and Shah, 2013). Deaf individuals often feel uncomfortable initiating conversations with the hearing community. Additionally, they often find it difficult to interact with those who are unfamiliar with ASL. Unfortunately, both factors sustain the social isolation that the deaf and/or HofH experience when interacting with the hearing majority (Ferreira et al., 2019). So, the question becomes: how can consideration of the deaf and HofH community support more effective communication? One solution is to incorporate features within technology that compel the consideration of others — technology that not only promotes non-ableist communication practices but also builds emotional connections between the hearing and the deaf and/or HofH communities.

Haptic technology is the most practical technology to carry out this goal. In addition to the social bonding benefits of touch, haptic technology can encourage and guide certain behavior. For example, haptic cueing can initiate and facilitate the learning of ASL and the more physical practices of automatic mimicry — the unconscious and automatic imitation of speech, movements, gestures, facial expressions, and eye gaze. In nonverbal communication, imitation becomes extremely important as even the smallest attempt at ASL by a hearing individual could result in a deaf individual feeling understood.

Tactile sensations could guide both hearing and non-hearing learners through the complex movements of ASL and reinforce gestures shared among communicators. Both affordances build upon the emotional bonds necessary for healthy social development (Prochazkova and Kret, 2017). Overall, ASL is a kinetic language, and touch administered through haptic technology can be a powerful tool in assisting learners with these embodied movements, especially in an online environment. In addition, touch can be instrumental in building relationships among those involved.

Distributing the burden of communication and nurturing relationships across hearing and non-hearing communities in a video conferencing environment is a lofty goal, but with recent advances in haptic technology, this goal is closer to realization than one might guess. In the last few years, haptic sensations have moved from requiring direct interaction with a device to a more seamless mid-air experience. In 2018, Ultraleap released a platform that combines gesture recognition with mid-air haptic feedback, allowing users to control and immerse themselves in virtual or augmented environments. The sensation of mid-air touch without additional wearables makes complex gestures less cumbersome: an array of ultrasonic speakers directs high-frequency sound waves toward specific points on the hands, activating the sense of touch. Imagine how helpful such a tool could be in assisting the learning of ASL. Unobtrusive cues could help correct one’s production of ASL and simultaneously improve one’s interpretation of it, all within the video conferencing platform.
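The focusing principle behind mid-air ultrasonic haptics can be illustrated with a little geometry: each speaker in the array is phase-shifted so that its wave arrives at the chosen point on the hand in step with all the others, concentrating pressure there. The sketch below is illustrative only — the array layout and operating frequency are assumptions, not details of any particular commercial product.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air
FREQ = 40_000.0         # 40 kHz, a common ultrasonic haptics frequency (assumed)

def phase_delays(transducers, focal_point):
    """Phase offset (radians) for each transducer so that its wave
    arrives at the focal point in phase with all the others."""
    dists = [math.dist(t, focal_point) for t in transducers]
    ref = max(dists)  # the farthest transducer needs no extra delay
    # Convert each path-length difference into a phase offset.
    return [2 * math.pi * FREQ * (ref - d) / SPEED_OF_SOUND % (2 * math.pi)
            for d in dists]

# A hypothetical 4-element line array (1 cm spacing) focusing on a
# point 20 cm above its centre, roughly where a hand might hover.
array = [(x * 0.01, 0.0, 0.0) for x in range(4)]
delays = phase_delays(array, (0.015, 0.0, 0.20))
```

Because the focal point sits on the array's axis of symmetry here, the two outer and the two inner transducers get matching delays; steering the focus across the hand is just a matter of recomputing these offsets each frame.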

Another key advancement that has changed what designers can do with haptic technology is image recognition — specifically, vision-based sign language recognition. This technology is composed of three main building blocks: (i) hand segmentation and/or tracking, (ii) feature extraction, and (iii) sign recognition (Ferreira et al., 2018). Combined, these components interpret complex gestures and translate language into touch sensations. The communication gap, once widened by the video conferencing environment, is narrowing. Reliance on obtrusive text-based assistive technology will no longer be the norm. Rather, tactile experience, our most primal way of sensing the world, will move front and center as stronger connections are built to support fluid communication among a multitude of users.
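The three building blocks above compose into a straightforward pipeline: each video frame is segmented, reduced to features, and matched against known signs. The sketch below shows only that structure; every function body and the toy nearest-neighbour "codebook" classifier are hypothetical placeholders, where a real system would use trained segmentation, feature-extraction, and recognition models at each stage.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    pixels: list  # stand-in for raw image data

def segment_hands(frame):
    """Stage (i): isolate the hand region. Placeholder: returns the
    frame unchanged, standing in for a segmentation/tracking model."""
    return frame

def extract_features(hand_region):
    """Stage (ii): reduce the region to a feature vector. Placeholder:
    a single mean-intensity value instead of, say, hand keypoints."""
    return [sum(hand_region.pixels) / max(len(hand_region.pixels), 1)]

def recognise_sign(features, codebook):
    """Stage (iii): nearest-neighbour match against known signs."""
    return min(codebook, key=lambda sign: abs(codebook[sign][0] - features[0]))

# Toy codebook of two signs and one incoming frame.
codebook = {"HELLO": [0.9], "THANK-YOU": [0.2]}
frame = Frame(pixels=[0.8, 1.0, 0.9])
sign = recognise_sign(extract_features(segment_hands(frame)), codebook)
```

The recognised sign is exactly the hand-off point the article envisions: once stage (iii) produces a label, a haptic layer could map it to a tactile cue rather than, or alongside, a caption.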

Although these are fitting examples that showcase the importance of haptic technology as the future of video conferencing, there are limitations to what it can do. One argument against embracing haptics within design is the fact that not all individuals experience haptic stimuli in the same way. Touch feedback is especially difficult for the elderly to interpret, making haptic technology troublesome for this population (MacLean et al., 2017). However, with touch, just as with any other sense, there will always be some level of variability in how one experiences the world. So, the issue is not whether someone can or cannot sense touch, but how sensitive the platform is at recognizing those differences and responding to them.

The role of the designer is not to find one solution that fits all, but to seek solutions that adapt to the user’s needs. Qualities of the Ultraleap platform make the sense of touch a practical addition to technology. But only when haptic technology is used in conjunction with the appropriate visual and auditory components does video conferencing have the potential to become truly universal. The lesson is that no single sense should be used in isolation; a broad spectrum of senses should be represented. Touch, simulated through haptics, is one sense that should be leveraged.

Looking towards the future where haptic technology is a central feature within all communication, the potential to shift our perception around communication is in sight. Enhancing our audio/visual communication channels is only the beginning of what this technology can accomplish. Through notifications, guidance, and entertainment venues, haptic technology can support learning, expand emotional pathways, and establish new communication norms. As consumers demand more inclusive products and experiences, haptic technology has a unique role to play. However, without a drastic shift in our societal norms around communication and the technology we use, this dream to incorporate haptics in all video conferencing systems might not come to fruition. Thus, the onus falls on the designer to advocate for this technology and tap into the full potential that a multi-sensory system has to offer.

References

Bitman, N., & John, N. A. (2019). Deaf and Hard of Hearing Smartphone Users: Intersectionality and the Penetration of Ableist Communication Norms. Journal of Computer-Mediated Communication, 24(2), 56–72. https://doi.org/10.1093/jcmc/zmy024

Camarillo-Abad, H. M. (2019). Transferring motor skills through tactile communication technology. Proceedings of the IX Latin American Conference on Human Computer Interaction.

Clerc Center. (2014, October). Assistive Technologies for Individuals Who are Deaf or Hard of Hearing. Gallaudet University. https://www3.gallaudet.edu/clerc-center/info-to-go/assistive-technology/assistive-technologies.html.

Ferreira, P. M., Cardoso, J. S., & Rebelo, A. (2018). On the role of multimodal learning in the recognition of sign language. Multimedia Tools and Applications, 78(8), 10035–10056.

Irish, J. E. N. (2020). Increasing participation: Using the principles of universal design to create accessible conferences. Journal of Convention & Event Tourism, 1–23.

Lawless, C. (2019, October 24). Multimodal Learning: Engaging Your Learner’s Senses. LearnUpon. https://www.learnupon.com/blog/multimodal-learning/.

Lupton, E. (2018, April 3). Why sensory design?: Cooper Hewitt, Smithsonian Design Museum. Cooper Hewitt Smithsonian Design Museum. https://www.cooperhewitt.org/2018/04/03/why-sensory-design/.

MacLean, K. E., Schneider, O. S., & Seifi, H. (2017). Multisensory haptic interactions: understanding the sense and designing for it. The Handbook of Multimodal-Multisensor Interfaces: Foundations, User Modeling, and Common Modality Combinations — Volume 1, 97–142.

Newton, V. E., & Shah, S. R. (2013). Improving communication with patients with a hearing impairment. Community eye health, 26(81), 6–7.

Palù, D. D., Giorgi, C. D., Lerma, B., & Buiatti, E. (2018). Multisensory Design: Case Studies, Tools and Methods to Support Designers. Frontiers of Sound in Design Springer Briefs in Applied Sciences and Technology, 31–46.

Portnova, G. V., Proskurnina, E. V., Sokolova, S. V., Skorokhodov, I. V., & Varlamov, A. A. (2020). Perceived pleasantness of gentle touch in healthy individuals is related to salivary oxytocin response and EEG markers of arousal. Experimental Brain Research, 238(10), 2257–2268.

Prochazkova, E., & Kret, M. E. (2017). Connecting minds and sharing emotions through mimicry: A neurocognitive model of emotional contagion. Neuroscience & Biobehavioral Reviews, 80, 99–114.

Rastgoo, R., Kiani, K., & Escalera, S. (2020). Hand sign language recognition using multi-view hand skeleton. Expert Systems with Applications, 150, 113336.

Schreuder, E., Erp, J. V., Toet, A., & Kallen, V. L. (2016). Emotional Responses to Multisensory Environmental Stimuli. SAGE Open, 6(1), 215824401663059.

SmartSign. GVU Center. (2018, May 25). https://gvu.gatech.edu/research/projects/smartsign.

Ucar, E. (2015). Multisensory Met: Touch, Smell, and Hear Art. https://www.metmuseum.org/blogs/digital-underground/2015/multisensory-met.

Ultraleap. Haptics. Ultraleap. https://www.ultraleap.com/haptics/.

Ultraleap. What is Haptic Feedback? Ultraleap. https://www.ultraleap.com/company/news/blog/what-is-haptic-feedback/.

U.S. Department of Justice Civil Rights Division Disability Rights Section. (2003, October). ADA Business Brief: Communicating with People Who Are Deaf or Hard of Hearing in Hospital Settings. https://www.ada.gov/hospcombr.htm.

Weaver, K. W., & Starner, T. (2012). SMARTSign: A Different Flavor of Accessibility. Pervasive-Frontiers in Accessibility Workshop, ACM.

Xu, M., Wang, D., Zhang, Y., Song, J., & Wu, D. (2015). Performance of simultaneous motion and respiration control under guidance of audio-haptic cues. 2015 IEEE World Haptics Conference (WHC).


Eryn Pierce is a designer, researcher, and teacher in Lander, WY.