Beware of these words that could trigger your Apple Siri, Google Assistant, Amazon Alexa, or Microsoft Cortana

To all those using virtual assistants such as Apple Siri, Google Assistant, Amazon Alexa, and Microsoft Cortana, here’s a piece of news that might grab your interest.

A team of researchers in Germany has discovered that even dialogue from a television program playing near a virtual assistant can activate it, causing it to record the conversation that follows. The researchers identified nearly 1,000 words and phrases that could trigger a digital assistant without its user’s knowledge.

What’s more concerning is that these words can activate the AI-based digital assistants to such an extent that they keep recording the conversations taking place in their vicinity for hours, only to transmit them to centralized servers for quality assurance and analytics.

The team that discovered these alarming privacy implications is a group of security researchers from Ruhr-Universität Bochum and the Max Planck Institute for Security and Privacy in Germany.

Some of the words and phrases identified and shared by the university researchers include "election," "tobacco," "unacceptable," "letter," "OK, cool," "chill," "who is reading," "Hey Jerry," "a city," "Montana," "duet," "love," "pose," "what the," "are you," "silly," "boy," "train," and the names of several metropolitan cities.

Reacting to the news, a developer working on Amazon Alexa said that all digital assistants are deliberately made sensitive to their set of wake words so as not to frustrate users; as a result, they would rather trigger too often whenever they hear those words or phrases, regardless of whether they are spoken by a human or by a computer or TV.
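
In other words, wake-word spotting is a threshold tradeoff: the on-device spotter assigns incoming speech a confidence score, and vendors tune the trigger threshold low enough that genuine commands are rarely missed, accepting occasional false activations in return. The Python sketch below is a toy illustration of that tradeoff only, not any vendor’s implementation: it substitutes string similarity for a real acoustic model, and the wake word, threshold value, and test phrases are all illustrative assumptions.

    # Toy illustration of the wake-word sensitivity tradeoff.
    # Real assistants score raw audio with small on-device neural keyword
    # spotters; string similarity over a transcript is used here purely to
    # show how a permissive threshold lets sound-alike phrases through.
    from difflib import SequenceMatcher

    WAKE_WORD = "cortana"   # illustrative choice of assistant
    WAKE_THRESHOLD = 0.6    # assumption: tuned low so real commands are rarely missed


    def wake_score(phrase: str) -> float:
        """Stand-in for a keyword spotter: similarity to the wake word, 0..1."""
        return SequenceMatcher(None, WAKE_WORD, phrase.lower()).ratio()


    def should_wake(phrase: str) -> bool:
        # The assistant wakes, and starts recording, whenever the score
        # crosses the threshold, regardless of who or what said the phrase.
        return wake_score(phrase) >= WAKE_THRESHOLD


    for phrase in ["Cortana", "Montana", "curtain", "election"]:
        print(f"{phrase!r}: score={wake_score(phrase):.2f}, wake={should_wake(phrase)}")

In this toy model, "Montana" scores above the threshold and falsely wakes the assistant, just as the researchers observed with the real Cortana. Raising the threshold would suppress that false trigger, but at the cost of sometimes ignoring the genuine wake word, which is exactly the tradeoff the developer describes.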

That’s seriously concerning… isn’t it?

Naveen Goud is a writer at Cybersecurity Insiders covering topics such as Mergers & Acquisitions, Startups, Cyber Attacks, Cloud Security and Mobile Security
