Apple Adds Live Translation to Messages, FaceTime, and Phone Calls

Today at WWDC, Apple announced a highly requested feature for anyone who’s ever been frustrated by not knowing every language on Earth (like me). Live Translation will translate between many languages, and it will work directly in Messages, FaceTime, and even during your calls in the Phone app.
This feature is similar to what Google already offers on its Pixel phones, which is great because breaking a language barrier doesn’t have to be platform-specific. Live Translation can also translate messages you type into the recipient’s language as you write them, and their responses will be translated into your language on the way back. Technically, you could translate messages before, but you had to long-press on them first.
In FaceTime calls, these translations appear as live captions that you can read on the screen while listening to the speaker. In regular phone calls, the translations are spoken out loud. Those may be a little harder to follow, since you’ll also hear the other person’s raw audio, so we’ll have to wait and see how Apple balances the volume between the caller and the voiceover. On the other hand, translations in phone calls will still appear on the screen, so you’ll be able to read them if you’re on speakerphone and don’t have to hold the phone to your face.
Like most of Apple’s AI-powered features, Live Translation is advertised as running entirely on your device, so your conversations don’t have to be routed through some random server.