A recent survey by the Life Science Centre in Newcastle claimed that about 79% of visitors to its “Robots – Then and Now” exhibition had to alter the way they spoke in order to be understood by voice assistants such as Alexa, Google Assistant, Siri, and Cortana.
These visitors were also reported to have regional British accents and to need to shift their accent in order to communicate with the voice assistants effectively.
This points to a broader issue: speech technology seems to favor standard accents, so people with non-standard accents have a harder time being understood when interacting with smart assistant devices.
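One common way this favoritism is quantified is word error rate (WER): the fraction of words a recognizer gets wrong compared with a reference transcript. If a system consistently shows a higher WER for regional speakers than for standard-accent speakers, it is exhibiting exactly the bias described above. A minimal sketch of the metric, computed via word-level edit distance (the transcripts below are hypothetical examples, not data from the survey):

```python
def word_error_rate(reference: str, hypothesis: str) -> float:
    """Fraction of word-level edits (substitutions, insertions,
    deletions) needed to turn the hypothesis into the reference."""
    ref = reference.split()
    hyp = hypothesis.split()
    # Standard dynamic-programming edit distance over words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,      # deletion
                          d[i][j - 1] + 1,      # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[len(ref)][len(hyp)] / len(ref)

# 1 substitution out of 4 reference words -> WER of 0.25
print(word_error_rate("turn on the lights", "turn on the light"))
```

In an accent-bias study, the same spoken command would be transcribed by the assistant for speakers of different accents, and their WERs compared.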
Even as humans we sometimes struggle to recognize where people are from based on their accents alone, so it is no surprise that virtual assistant devices will also take time to gain that familiarity with regional accents.
People are worried that these virtual assistants will take away their regional accents, but we have to be realistic here. Interactions between humans and robots are far less frequent than human-to-human interactions. Since our everyday talk is designed for communicating with other humans, small changes in pronunciation when speaking to a device will not alter our accents drastically. The most we are likely to find ourselves doing while interacting with virtual devices is raising or lowering our voices or enunciating more.
So there is no need to fret. Looking at the journey virtual assistants have made in processing language, they have come a long way already, and they will only improve going forward, quite possibly becoming able to understand regional accents.