How to Make Siri and Alexa Understand What You Are Saying

If speaking out loud to Alexa (or your digital assistant of choice) feels unnatural, you are not alone. I’ve had Siri for as long as she’s existed, but I can count on my fingers how many times I’ve actually talked to her. It has always seemed easier to open an app on my iPhone or type a Google query and get exactly what I’m looking for than to ask spoken questions that may or may not lead to the answer I want.

It’s also hard to shake the weirdness of talking to an inanimate object. While at least one in six U.S. adults uses a voice assistant to order takeout, set morning alarms, control the thermostat, or read out the weather forecast, many of us still struggle with the idea of treating our voice assistants as if they were real, living creatures capable of human dialogue, especially because they exist only in the form of small black boxes (figuratively and sometimes literally). In fact, nearly 60 percent of Americans say they change the way they speak when talking to a robot.

With that in mind, we spoke with experts in speech recognition, language processing, and machine learning to figure out exactly how we should talk to our voice assistants – Alexa, Google Home, and Siri – to get the information we need and avoid the dreaded “Sorry, I can’t help with that.” With practice, my own questions started to feel more like one side of a conversation and less like a guessing game. Here’s what you need to know:

Wait a moment while your request is processed

The machines first pick up the sounds of our speech and transcribe them into words, like dictation. But they can’t actually do anything with those words unless they understand the transcript. Humans can write down the sounds we hear in unfamiliar languages with reasonable accuracy, too, but that doesn’t mean we understand what they mean.

This is what makes voice assistants seem “smart” – they process and understand natural human speech and respond accordingly. But what they do is actually a fairly simple task, says Candy Sidner, professor of computer science at Worcester Polytechnic Institute.

“[Voice assistants] are essentially programmed to perform certain actions, so they break up the statements presented to them and then search the Internet,” she says.

Sidner adds that there is always a gap between the end of a question and the machine’s response to account for processing time, especially when it has to understand speech – typing a query directly into Google doesn’t require that extra step. Make your questions as specific as possible for the best outcome, and let your assistant retrieve and deliver an answer before following up or assuming Siri misheard you.
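To make that pipeline concrete, here’s a minimal, purely illustrative Python sketch of the steps Sidner describes – transcribe, understand, then act. Nothing in it reflects how Alexa, Siri, or Google Assistant are actually built; every function and intent name here is hypothetical.

```python
# Toy illustration only -- not how any real assistant works.
# Step 1 (speech-to-text) is assumed to have already produced `transcript`.

INTENTS = {               # hypothetical mapping of keywords to actions
    "weather": "get_weather",
    "alarm": "set_alarm",
    "play": "play_media",
}

def parse_intent(transcript: str) -> str:
    """Step 2: 'understanding' -- break up the statement and map it to a known action."""
    words = transcript.lower().split()
    for keyword, action in INTENTS.items():
        if keyword in words:
            return action
    return "web_search"  # no match: fall back to searching the internet

# Step 3, the lookup or web search that follows, is the extra processing
# that causes the short pause before the assistant answers.
print(parse_intent("what is the weather tomorrow"))  # -> get_weather
print(parse_intent("who won the game last night"))   # -> web_search
```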

Talk to Alexa as if she were your friend

Voice assistants are trained on samples of real human speech. This means that talking loudly or slowly, over-enunciating words, or oversimplifying your questions will actually make your queries less successful, not more so. Imagine that Alexa, Siri, and Google Assistant are people sitting next to you, rather than voices inside inanimate devices, and they’ll be more likely to handle your requests correctly.

“When the system doesn’t understand, people tend to slip into robot-speak and become louder and clearer, which is funny, because the data is built on real, natural human speech,” said Cathy Pearl, head of conversation design at Google. “The model really works best when you speak as naturally as possible, rather than shouting and over-enunciating.”

Don’t try to hide your accent

Experts say voice assistants are surprisingly receptive to user accents, as long as they’ve been trained on human speech appropriate to a given language or region.

“The reason speech recognition works as well as it does today is because we have years and years of anonymized, real-life speech utterances – things people have actually said,” Pearl says. “We have to think about the different ways people talk and interact with the world when we localize for different countries.”

It can be harder for a voice assistant to make sense of a non-native or non-American English speaker than to distinguish a New York user from an Alabama one, but all three assistants offer settings for multiple English accents. If your device has a setting for your accent, such as British English, switching to it can make processing more accurate. In general, though, even without special tuning, you’ll get the best results by speaking naturally.

Alexa, Siri, and Google Assistant can also understand different languages if you set them up that way, though the supported languages vary by assistant and are fairly limited. For example, Alexa speaks English with five accents, as well as Japanese and German. Google Assistant has several language options available on supported phones and tablets, and Google expects more than 30 languages to be available by the end of this year. Siri supports 20 languages, with a number of additional dialects for some of them.

You can change a voice assistant’s accent or language in your device’s settings. Here’s how to do it for Siri, Alexa, and Google Assistant.

Be prepared to rephrase or repeat

It’s easy to get annoyed when a voice assistant doesn’t understand your question the first time you ask it, but humans aren’t always great at that, either.

“One thing the system can do if it doesn’t understand is say, ‘I didn’t catch that. Can you say it differently?’” says Alexander Rudnicky, professor emeritus at Carnegie Mellon University’s Language Technologies Institute. “If you’re a human, that’s a smart way to work the system. Just say it a different way.”

Where a person might answer “yes,” “what?” or just give you a blank look, your assistant will at least acknowledge your request and apologize when it needs more information, didn’t understand you, can’t come up with an answer, or hasn’t been trained on certain phrases or types of questions.

While voice assistants don’t require users to stick to a script, they can misinterpret a request or take the wrong action because of how the user phrases their question. For example, if you tell Google Assistant, “Play Jason Derulo’s new song ‘Colors,’” it might recognize the artist first rather than the song and say, “Okay, here’s Jason Derulo on Spotify” – which isn’t quite what you asked for. If you rephrase the request as “Play ‘Colors’ by Jason Derulo,” the answer becomes “‘Colors’ by Jason Derulo, sure. Playing on Spotify.”

Voice assistants usually respond best to simple, direct, and specific requests, so if you find that your device isn’t doing what you ask, try rephrasing your request.
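To see why word order can trip up a parser, consider this hypothetical sketch of a naive matcher that simply grabs the first entity it recognizes, in the spirit of the Derulo example above. Real assistants use far more sophisticated language models; this only illustrates the idea, and the entity lists are invented.

```python
# Hypothetical illustration of why phrasing order changes what a naive
# parser latches onto. Not any real assistant's logic.

KNOWN_ARTISTS = {"jason derulo"}
KNOWN_SONGS = {"colors"}

def first_match(transcript: str):
    """Return the first known entity found, scanning left to right."""
    text = transcript.lower()
    candidates = [(name, "artist") for name in KNOWN_ARTISTS] + \
                 [(name, "song") for name in KNOWN_SONGS]
    best = None
    for name, kind in candidates:
        pos = text.find(name)
        if pos != -1 and (best is None or pos < best[0]):
            best = (pos, kind, name)
    return (best[1], best[2]) if best else None

print(first_match("play jason derulo's new song colors"))  # ('artist', 'jason derulo')
print(first_match("play colors by jason derulo"))          # ('song', 'colors')
```

Because this toy matcher scans left to right, putting the song title first steers it toward the intended interpretation – which is roughly what rephrasing a request does for a real assistant.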

Don’t expect complex or nuanced answers

Experts agree that while voice assistants are pretty good at answering simple questions and learning basic user preferences, they lack the ability to understand context the way humans can. For example, Pearl explains, if you ask your best friend, “Who was at the party last night?” they would give you a different answer than someone who doesn’t know you as well would.

When a voice assistant can’t grasp the context, it usually can’t respond properly. If you ask Google Assistant, “Is Paddington 2 on Netflix yet?” it will say, “My apologies … I don’t understand.” In this case, the word “on” has several possible meanings, Pearl says. If the user instead asks for a specific action – “Can I stream Paddington 2 on Netflix?” – the context is clear, and the assistant responds: “I searched for Paddington 2 on Netflix, but it’s either unavailable or can’t be played right now.”

While voice assistants can control our smart home devices, play music, report the weather, and call us an Uber, they still have a lot to learn about human communication.

“In some ways, these assistants are really smart,” Pearl says. “They know a lot of facts. But in other ways, they’re very dumb. They have no common sense about how the world works.”
