

A second study showed how the assistants would frequently mangle interpretations when people read news headlines out loud. Speakers with American accents didn't always get accurate results, but even the slightest whiff of a non-American accent (say, British) would lead to bizarre reconstructions of what people said.

The companies are aware of these issues and promised in statements that they were improving. Amazon noted that Alexa gets better the more it hears "certain speech patterns" and "certain accents." Google, meanwhile, said it would "continue to improve" voice recognition as its database grows larger.

Problems with accents and voice recognition are far from new; they're the stuff of comedy routines. And it's important to stress that the tests didn't cover a full range of accents, or other assistants like Siri, Bixby and Cortana. The formal studies help quantify the problem with accents, though, and also suggest that a lack of diversity is a serious problem in voice assistant testing. That drop in accuracy for pronounced accents could effectively rule out smart speakers and other voice-aware devices for many people whose only 'mistake' was not growing up in the States (or even a particular region of the States). If voice assistants are going to become ubiquitous, they can't just account for different languages; they have to account for different backgrounds.


