Apple’s New AirPods Offer Impressive Language Translation
The technology is one of the strongest examples yet of how artificial intelligence can be used in a seamless, practical way to improve people’s lives.
by Brian X. Chen (https://www.nytimes.com/by/brian-x-chen) · NY Times

This week, I saw an old friend, and he caught me up on what he’d been up to over the summer. He and his girlfriend had visited family in Arizona. His niece dragged him to a screening of “Lilo & Stitch.” He was working hard at a new start-up. He said all of this in Spanish, a language I have never learned, but I followed every word.
I understood him because I was wearing the new Apple earbuds arriving in stores on Friday. The $250 AirPods Pro 3 use artificial intelligence to do real-time translations, their most significant new feature. (The earphones, which have slightly better noise cancellation, are otherwise not that different from the last iteration.) As my friend spoke, Apple’s virtual assistant, Siri, acted as an interpreter, speaking in a robotic voice that immediately converted the Spanish words into English in my ears.
Later, I reviewed a transcript of the conversation produced on my iPhone to confirm the accuracy of the translation. With the exception of a few mistakes where Siri mixed up pronouns (referring to my friend’s girlfriend as a “he” instead of a “she”), it was solid.
I was impressed. This was the strongest example I had seen of A.I. technology working in a seamless, practical way that could be beneficial for lots of people. Children of immigrants who prefer to speak their native tongue may have an easier time communicating. Travelers visiting foreign countries may better understand cabdrivers, hotel staff and airline employees.
It would also help me in my day-to-day life, like when a contractor or pest control worker who doesn’t speak English is trying to explain what he found under my house.
And frankly, I was also surprised. Apple’s foray into generative A.I., the technology driving chatbots like OpenAI’s ChatGPT and Google’s Gemini, has been rocky, to say the least. The company never finished releasing some of the A.I. features it promised for last year’s iPhone 16 because the technology didn’t work well. And Apple’s A.I. tools that are available for photo editing and summarizing articles have been disappointing compared with similar tools from Google.
The robust translation technology in the AirPods is a sign that Apple is still in the A.I. race, despite its early stumbles. Digital language translators are not new, but Apple’s execution of the feature with the AirPods, a product that perfectly fits in your ears, should make a profound difference in how often people use the technology.
For more than a decade, consumers have fumbled with language translation apps on their phones that were awkward to use, like Google Translate and Microsoft Translator. They required users to hold their phone’s microphone up to a person speaking a foreign language and wait for a translation to be shown on a screen or played through the phone’s tiny speakers. The translations were often inaccurate.
In contrast, AirPods users need only make a gesture to activate the digital interpreter. About a second after someone speaks, the translation plays in the wearer’s preferred language through the earbuds.
Here’s what you need to know about how to use the translator, how the technology works and why it is likely to be better than past translation apps.
Getting Started
Setting up the AirPods Pro was simple. I opened the case next to my iPhone and tapped a button to pair the earphones. To use the translation software, I had to update to the latest operating system, iOS 26, and activate Apple Intelligence, Apple’s A.I. software.
Then I had to open Apple’s new Translate app and download the languages I wanted to translate. Spanish, French, German, Portuguese and English are available right now, and more are coming soon. I selected the language the other person was speaking (in this case, Spanish) and the language I wanted to hear it in.
There are a few shortcuts to activate the interpreter, but the simplest way is to hold down on both stems of the AirPods for a few seconds, which will play a sound. From there, both people can start speaking, and a transcription shows up in the Translate app while a voice reads the translated words out loud.
Owners of the AirPods Pro 2 from 2022 and last year’s AirPods 4 with noise cancellation can also get the translation technology through a software update. A recent iPhone, such as the iPhone 15 Pro or a device from the 16 series, is also required to use Apple Intelligence to do the translations.
For a fluid conversation to be translated in both directions, it is best if both people are wearing AirPods. Given how popular Apple’s earbuds already are, with hundreds of millions sold worldwide, that is not an unlikely scenario.
Yet there are times when this tech will be useful even with only one person wearing AirPods. Plenty of immigrants I interact with, including my nanny and mother-in-law, are comfortable speaking only in their native tongue, but can understand my responses in English, so my being able to understand them, too, would go a long way.
Why Translations Are Getting Better
The AirPods’ reliance on large language models, the technology that uses complex statistics to guess which words belong together, should make translations more accurate than past technologies, said Dimitra Vergyri, a director of speech technology at SRI, the research lab behind the initial version of Siri before the assistant was acquired by Apple.
Some words carry different meanings depending on the context, and large language models are capable of analyzing the full scope of a conversation to correctly interpret what people are saying. Older translation technologies translated piecemeal, one sentence at a time, which could result in big mistakes because they lacked context, Dr. Vergyri said.
Yet the A.I. technology in the AirPods might still have blind spots that could lead to awkward social situations, she added. Words alone don’t account for other types of context, like emotions and cultural nuances. In Morocco, for example, it may be impolite to jump into a conversation without a proper greeting, which often involves asking about a person’s family and his or her health.
“The gap still exists for real communication,” Dr. Vergyri said. She added, however, that translation technology would become increasingly important as corporate workers become more global and need to communicate across cultures.