The number of voice assistant systems around the world has roughly doubled within two years. Alongside its many advantages, this AI technology also has downsides that many users are unaware of.
Useful and convenient, voice control is winning more and more fans. An estimated 200 million devices had been sold worldwide by the end of 2020, making smart speakers the fastest-growing technology on the consumer market.
The majority of the market is dominated by three US companies and their systems: Amazon, Google and Apple. However, the artificial intelligence behind the devices requires extensive training, which is often carried out by people in low-wage countries under undignified conditions. An estimated 100 million "clickworkers" perform this labour.
Data protection problems also keep making headlines when it comes to smart speakers. What is less well known is that daily use of a voice assistant can also affect one's own social behaviour, and often negatively, as psychological research shows. Disparaging behaviour towards women can be reinforced as well, since the systems that accept orders without objection typically have female voices.
Role stereotypes and racism through assistance systems
To keep voice assistants from (re)producing gender stereotypes or racism, they would actually need to be programmed with more assertiveness and a willingness to stand up for ethical principles such as equality. But a smart speaker that contradicts its owner would be bad for business.
And voice assistants are supposed to do even more: they can – apparently – bring the dead back to life and make them available as "intelligent" conversation partners. This touches on fundamental questions of human dignity, which our podcast also explores.