Why are 20 somethings using closed captions? And what does it mean for the rest of us?

I wear hearing aids because uncorrected hearing loss worsens faster than corrected hearing loss, and it has been linked to at least a 9% increase in dementia risk. I support public services aimed at those with hearing loss by making it a point to attend open-captioned movie screenings.

But this article surprised me. Bravo to the young who may accidentally help us older folks.


Why Do All These 20-Somethings Have Closed Captions Turned On?

As automatic captioning on TikTok and creative audio descriptions on Netflix go mainstream, so does accessibility

By Cordilia James

Sept. 17, 2022 7:00 am ET

Closed captions are cool now. Just ask anyone under 40.

More viewers, especially younger ones, are using tools that transcribe dialogue in the content they’re watching online, from Netflix movies to TikTok videos. This isn’t just about watching “Squid Game” drama in Korean with English subtitles.

Closed captions—which display text in the same language as the original audio—have been crucial for a long time for many people with hearing loss. They’re now a must-have for plenty of people without hearing loss, too, helping them better understand the audio or allowing them to multitask.

Recent surveys suggest that younger generations are viewing content with captions more than older generations, despite reporting fewer hearing problems.

In a May survey of about 1,200 Americans, 70% of adult Gen Z respondents (ages 18 to 25) and 53% of millennial respondents (up to age 41) said they watch content with text most of the time. That’s compared with slightly more than a third of older respondents, according to the report commissioned by language-teaching app Preply.

“I can’t think of a time in the past couple of months or years that I haven’t had subtitles or captions on,” says 23-year-old Ayem Kpenkaan, who also creates his own comedy videos. While he doesn’t have any hearing issues, he says it helps him focus on what’s happening on-screen, even with the sound on.

In recent years, Apple, Google and other tech companies expanded on-device auto-captioning options, while Netflix found creative ways to describe audio (not just dialogue) to viewers who are deaf and hard of hearing. The innovations—as well as the rising popularity of captions on social media—have helped eliminate some of the stigma associated with hearing loss, advocates say.

“People are hesitant to ask for accommodations in the workplace because they don’t want to stand out or make waves,” says Barbara Kelley, executive director of nonprofit Hearing Loss Association of America. As more people adopt captions, she adds, it becomes easier to ask for those aids.

Caption Popularity

Netflix now provides more colorful play-by-plays. Its new vampire slayer film “Day Shift” added colorful subtitles at certain parts of the movie. In the latest season of “Stranger Things,” subtitles amused viewers with rich descriptions such as “tentacles squelching wetly.” The number of people accessing captions and subtitles has more than doubled since 2017, a Netflix spokeswoman says.

People turn on subtitles and captions for many reasons—to learn a language, perhaps, or decipher a heavy accent or muttered dialogue. A lot of people complain about background music making it harder to hear dialogue. Captions can also facilitate multitasking and allow people to watch content in shared spaces without disturbing others.

Rachael Knoth, a 23-year-old artist in Dothan, Ala., says she has used captions for as long as she can remember. She says she hasn’t been diagnosed with hearing loss. Still, she finds it so hard to view anything without captions that if a video doesn’t have them, she won’t watch it.

“In class, when they play videos and they don’t have the captions on, I have to pay really close attention,” Ms. Knoth says. If she doesn’t, it’s common for her to misunderstand the speakers for a minute or two, she adds.

Improving Accessibility

The National Captioning Institute, a nonprofit that provides captioning services, introduced the first prerecorded closed captions in 1980. A decoder box was needed to view the captions until the 1990s, when the U.S. government required electronics companies to build the technology into their TVs. Since then, efforts by people who are deaf and hard of hearing have led to the passage of legislation that ensures captions are available for videos online.

Initially, people had to manually transcribe a video’s audio. More recently, artificial intelligence has helped put automatic captions in apps such as YouTube and Facebook. TikTok launched its auto-generated captions last year, while Instagram followed earlier this year.
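Whether typed by hand or generated automatically, caption tracks are commonly distributed in simple text formats such as SubRip (SRT): numbered blocks, each with a start/end timestamp and the caption text. A minimal sketch of producing one (the segment timings and lines are invented examples, not from any real video):

```python
from datetime import timedelta

def srt_timestamp(seconds: float) -> str:
    """Format a time offset as an SRT timestamp: HH:MM:SS,mmm."""
    total_ms = int(timedelta(seconds=seconds).total_seconds() * 1000)
    h, rem = divmod(total_ms, 3_600_000)
    m, rem = divmod(rem, 60_000)
    s, ms = divmod(rem, 1_000)
    return f"{h:02}:{m:02}:{s:02},{ms:03}"

def to_srt(segments):
    """Turn (start_sec, end_sec, text) tuples into an SRT caption track."""
    blocks = []
    for i, (start, end, text) in enumerate(segments, start=1):
        blocks.append(f"{i}\n{srt_timestamp(start)} --> {srt_timestamp(end)}\n{text}\n")
    return "\n".join(blocks)

# Hypothetical segments; note non-dialogue sound descriptions go in brackets.
segments = [
    (0.0, 2.5, "[tentacles squelching wetly]"),
    (2.5, 5.0, "Hello? Is anyone there?"),
]
print(to_srt(segments))
```

Automatic captioning pipelines do the hard part, speech recognition with timestamps, but the output ultimately lands in a plain-text track like this that players can toggle on or off.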

The first TikTok video Ayem Kpenkaan made with captions remains his most popular. PHOTO: CORDILIA JAMES/THE WALL STREET JOURNAL

Scarlet May, a deaf content creator with 6.5 million followers on TikTok, says when she first joined, she could only watch content from creators who used sign language. Now, captions have exposed her to a whole new world of content.

“I can enjoy the app like everyone else,” says Ms. May, 21.

Many creators filled the accessibility gap by adding their own captions manually, and Mr. Kpenkaan is among those who still do. These are “open captions”—they can’t be turned off. He sees inclusivity as a way to reach more viewers, and believes the open captions help more people get his jokes.

Mr. Kpenkaan plays around with placement, emojis and other features to add humor to some of his videos and engage more viewers. “Captioning is just another medium to be creative,” he says. The first TikTok he made with captions—a funny clip of him and a friend on a romantic swan-boat ride—remains his most popular TikTok video with more than 36.6 million views. 

Turning On Captions

For those looking for captions to help in everyday life, such as when they’re having trouble hearing a device in a noisy environment, one of the latest tools comes from Apple.

Its Live Captions feature, available with MacOS Ventura and iOS 16 on the iPhone 11 and newer, lets users turn on a live transcript for any audio, whether it’s during FaceTime calls, in a streaming-video app or just picked up by the device’s microphone. Live Captions uses machine learning and keeps everything on your device, rather than sending it to Apple’s servers for processing. You can find it under Settings > Accessibility.

Apple’s Live Captions feature lets users turn on an auto-generated live transcript. PHOTO: APPLE

Google has a similar app for its Pixel phones, and this year’s Samsung TVs can automatically place captions on the screen in locations that won’t disrupt the view.

Social-media apps such as Instagram generate captions on uploaded videos by default, and make them available to turn on within the videos. (Creators can choose not to have captions, or to add their own open captions instead.) Snapchat users can turn on auto-generated subtitles for the app’s Discover page, and as of last year they can also use auto captions in their own recorded snaps.


Write to Cordilia James at cordilia.james@wsj.com
