
People could be listening to the conversations you have around your Google Assistant device

By Koh Wanzi - on 12 Jul 2019, 2:40pm

Google Home and Home Mini smart speakers.

You probably already know that Google records what you say after its Assistant hears the "Hey Google" trigger phrase. You can opt out by turning off the "Voice & Audio Activity" setting in your Google account, or delete individual recordings, but neither option is particularly prominent or easy to find. Recordings aside, what you may not know is that human workers could actually be privy to your conversations.

Belgian public broadcaster VRT recently gained access to over 1,000 recordings from a Google contractor. This contractor was part of a global workforce that helps Google review select audio recordings picked up by the Assistant from devices such as phones, smart speakers, and even security cameras. While these recordings aren't tied to your user account, they often contain personal information like addresses and names that could then be linked to your identity.

The biggest problem arises when Google Assistant mistakes a similar-sounding phrase for its trigger phrase, then wakes up and starts recording. In situations like that, it could pick up just about anything and capture conversations it was never meant to hear. In fact, VRT says this was the case in about 150 of the recordings it reviewed.
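The false-trigger mechanism described above can be sketched in a few lines. Everything here is hypothetical for illustration: the threshold, the scores, and the phrases are made up, and this is not Google's actual detection model.

```python
# Minimal sketch of how a keyword spotter can false-trigger. The detector
# emits a similarity score for each utterance, and anything above the
# threshold wakes the device -- including phrases that merely sound like
# the trigger phrase. All values below are invented for illustration.

WAKE_THRESHOLD = 0.8  # hypothetical operating point

def should_wake(score: float) -> bool:
    """The device wakes and starts recording once the detector's
    similarity score clears the threshold."""
    return score >= WAKE_THRESHOLD

# Hypothetical detector scores for different utterances:
utterances = {
    "Hey Google": 0.97,   # intended trigger phrase
    "OK cool": 0.83,      # sound-alike phrase -> false accept
    "dinner chat": 0.12,  # unrelated speech, ignored
}

recorded = [phrase for phrase, score in utterances.items() if should_wake(score)]
print(recorded)  # the sound-alike phrase starts a recording too
```

Lowering the threshold catches more genuine triggers but also records more sound-alikes, which is the trade-off behind the roughly 150 accidental recordings VRT found.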

The recordings the broadcaster got access to also included bits of phone calls and private conversations. Someone's love life was discussed, as was the rate at which another person's child was growing. One even contained a couple's address and information suggesting that they were grandparents. 

For its part, Google says that transcribing conversations is a critical part of building speech technology that can eventually understand all languages, accents, and dialects. The company also claims it instructs contractors to transcribe only audio directed at Google, excluding background conversations and, presumably, anything recorded by mistake. Furthermore, only 0.2 per cent of all recordings are reportedly reviewed by humans.
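A back-of-envelope calculation shows why a 0.2 per cent review rate can still mean a lot of clips. The daily recording volume below is a made-up assumption for illustration only; the 0.2 per cent rate is the only figure that comes from the article.

```python
# Scale check on Google's stated 0.2 per cent human-review rate.
REVIEW_RATE = 0.002                    # stated figure: 0.2% of recordings
assumed_daily_recordings = 10_000_000  # hypothetical volume, not a real figure

reviewed_per_day = int(assumed_daily_recordings * REVIEW_RATE)
print(reviewed_per_day)  # 20000 clips a day under this assumption
```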

That said, the practice is still concerning. After all, you don't want your private conversations to be left up to the discretion of some random worker. As it turns out, the worker who shared the recordings with VRT felt uncomfortable listening in, which prompted him to leak the recordings in the hopes of calling more attention to the issue. 

Google isn't the only one doing this. Amazon also has workers who review and transcribe audio collected by its Echo devices after they hear their wake word. As with Google, though, Amazon's policy on this is vague. Both companies say no data leaves the device until the wake word is detected, but neither makes clear that a device can mishear its wake word and inadvertently start recording anyway.
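The "nothing leaves the device until the wake word" behaviour both companies describe can be modelled as a simple gate. The class, detector, and uploader below are hypothetical stand-ins for illustration, not either company's actual code; the point is that a single false detection opens the stream for everything that follows.

```python
# Toy model of wake-word gating: audio is discarded on-device until the
# (possibly mistaken) detector fires, after which every frame is sent.

class WakeGatedMic:
    def __init__(self, detector, uploader):
        self.detector = detector  # callable: audio frame -> bool
        self.uploader = uploader  # callable: audio frame -> None ("send to cloud")
        self.streaming = False

    def on_frame(self, frame):
        if not self.streaming:
            # Nothing leaves the device until the detector fires.
            if self.detector(frame):
                self.streaming = True
                self.uploader(frame)
        else:
            # Once awake, everything that follows is sent -- including
            # speech the device was never meant to hear.
            self.uploader(frame)

# Usage: a detector that also fires on a sound-alike phrase.
sent = []
mic = WakeGatedMic(lambda f: "hey google" in f or "ok cool" in f, sent.append)
for frame in ["private chat", "ok cool", "more private chat"]:
    mic.on_frame(frame)
print(sent)  # the false trigger opens the stream for everything after it
```

Note that the first frame of private conversation is never sent, which is the companies' claim; the problem is that a mishearing flips the gate open for all subsequent audio.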

Unfortunately, it doesn't seem like this practice will change any time soon. In response, Google has simply said that it will continue to improve how it explains its privacy settings and controls, and that it will look for ways to further clarify how audio data is used to improve its speech technology.

Source: VRT