Captioning and sign language help brands reach the Deaf community

The coronavirus outbreak has created unprecedented challenges for deaf and hard-of-hearing people in the US. Not only have social distancing protocols and stay-at-home orders proved isolating, but mask mandates have made speech reading difficult and stymied communication in American Sign Language (ASL), a visual language that relies on signing with hands and on facial expressions to convey tone, meaning, and nuance. As a result, many of the country’s 48 million adults with hearing difficulties cannot access potentially lifesaving information.

In some instances, the Deaf community has literally had to fight for access. While ASL interpreters are now common at most state and local COVID-19 briefings, the National Association of the Deaf (NAD) and five deaf consumers recently won a lawsuit that forced the White House to use an interpreter during presidential briefings related to the coronavirus. The omission of interpreters, they argued, violated federal law by leaving Deaf people whose primary language is ASL rather than English unable to fully understand the information being presented.

“COVID-19 updates occur on a daily, if not hourly, basis. For Deaf people who use American Sign Language as their primary language, these updates are often inaccessible,” said Craig Radford, vice president of strategy and business development at Communication Service for the Deaf (CSD), an organization that provides technology, resources, and services for the deaf and hard of hearing. Since the crisis began, his organization’s COVID-19 hotline—which enables callers to communicate directly via video with representatives in ASL—has had regular requests for information and assistance.

Diverse communications preferences

Though the Americans with Disabilities Act (ADA) was written to protect all deaf and hard-of-hearing people from discrimination, the deaf population is not monolithic. It includes people who were born deaf, people with limited hearing (who may or may not use hearing aids, cochlear implants, and other assistive devices), and those who’ve experienced hearing loss later in life (also known as late-deafened). As a result, communications preferences and abilities vary widely. Some people use sign language, some read speech or read and write in English (or another spoken language), and others do all or none of these things.

Within the deaf population, there’s also a subset of people who share a distinct Deaf culture. United by their primary use of ASL, they identify as a linguistic minority. “Being Deaf is unique because it comes along with a culture and a language,” Radford said.

As a result of this diversity, it’s often challenging for organizations to meaningfully engage with deaf consumers. Most public and private entities comply with ADA requirements by providing sign-language interpreters, audio induction loops for hearing aid users, video relay services, captioning, or transcription, but the requirements, and the workarounds offered, are often ambiguous or inadequate. For example, while captioning is required by law for TV programming and films, most online videos, including those on YouTube, aren’t subject to the same regulation. Instead, these and other videos on social media often rely on machine-generated captions produced by speech-recognition software, with spotty and unreliable results.

Top brands step up

Now, some marketers are seeking better options. Many brands offer more accurate and complete captioning in their videos, TV commercials, and online assets, and plan to boost their captioning and live chat efforts over time. “All of the larger companies support captioning, but others don’t have it and there’s no excuse,” said Christian Vogler, director of the Technology Access Program at Gallaudet University. “To add captioning to 1 minute of video costs $1.25. If I see a company advertising a product and they don’t bother captioning their video, I’ve already decided I don’t want to interact with that company,” he said.

For its “2020 State of Captioning” report, transcription provider 3Play Media surveyed professionals across industries and found that 64% reported captioning all or most of their online video content, up from 58% in 2019. It also found that the share of respondents confident in their understanding of captioning compliance requirements rose from 22% in 2019 to 32% in 2020. Likewise, a March 2020 survey by Rev, another audio-to-text transcription service, found that many professionals worldwide expected to increase their use of transcription, captions, and subtitles over the next several years.

But captioning is not a panacea. Too much text can cause viewers who aren’t native English speakers to disengage, and machine-generated captioning, which many companies hope will save them money, remains highly inaccurate. More than half (56%) of respondents in the 3Play Media survey said “no” when asked whether pre-recorded videos with autogenerated captions provide a truly accessible experience. The report also noted that without appropriate quality control, machine-generated captioning can be anywhere from 10% to 50% inaccurate and must be manually edited to be adequately accessible. So while automatic speech-recognition technology is steadily improving, brands shouldn’t rely on it as a substitute for manually produced live captioning or ASL interpretation.

Direct ASL communication provides native-language experience

While some deaf consumers do concede that captioning is “better than nothing,” it’s not always the best choice. For one, it’s useless—and potentially harmful—when it’s unintelligible or incorrect. It’s also unhelpful to people who can’t read or understand English. “Those who were late-deafened will be familiar with English and with hearing culture,” Radford said. “But English is not always a deaf person’s primary language. For many, it’s a second language.”

Radford said that some estimates put the number of ASL users in the US and Canada as high as 2 million to 3 million people. Some 93% of the 480,000 deaf and hard-of-hearing individuals surveyed by CSD between 2010 and 2013 said it was important to communicate in their native language of ASL.

To better engage with these consumers, brands are turning to direct video calling (DVC) systems that enable them to communicate directly with Deaf consumers in ASL. Google and Comcast were among the first big brands to partner with Connect Direct, a CSD subsidiary that offers real-time, one-to-one DVC customer support in ASL. The Connect Direct service enables customers to video chat with Deaf or hard-of-hearing support representatives who are fluent in ASL without having to use third-party interpreters or video relay services—which can prove frustrating and inconsistent. “When a deaf customer who uses ASL goes to the website and sees the ‘ASL Now’ button, they get directly connected to a representative who uses sign language and is trained in the product or service being asked about,” Radford said.

Organizations that have introduced ASL DVC have seen positive results, including shorter call times and an increase in the number of deaf callers. According to Radford, Comcast saw a 33% reduction in call times after introducing direct calling in ASL, while Google, for which DVC represents 16% of total US call volume, has seen an 83% reduction in “average handle time” with direct ASL communication compared with phone support, as well as a 93% customer satisfaction rating.

"Behind the Numbers" Podcast