Digital assistants are like an electronic butler that can turn up the heat, lower the lights or cue up your favourite playlist, and even make and decline calls like a boss, all without you lifting a finger.
Those little speakers are so helpful because they connect you and your home to the vast stores of knowledge in the clouds of Google, Apple and Amazon. But some privacy experts say that's precisely why consumers should read the fine print and ask questions about what happens to the private information they share with their new helper.
In fact, former Ontario privacy commissioner Ann Cavoukian says Canadians should think twice before bringing such a device into their home.
"I think it's a really bad idea," said Cavoukian, who is currently leading the Privacy by Design Centre of Excellence at Ryerson University in Toronto. "It's precisely the element of the unknown that should be causing you concern. You don't know how your information might be used, to whom it might be disclosed."
Apple, Google and Amazon say their customers' privacy is very important to them, and they don't sell user information to third parties.
But Cavoukian still advises caution, because she says history has shown new technology is prone to privacy issues.
"A few years ago, the Federal Trade Commission in the [U.S.] did a study of 12 mobile health and fitness apps, like Fitbit," Cavoukian said. "They found that the information was flowing out to 76 different third parties that the users didn't know anything about."
In 2013, the FTC quoted a Privacy Rights Clearinghouse study of 43 free and paid fitness apps that found many had no privacy policies, and up to 40 per cent disclosed information to third parties without telling customers they were doing so.
"It can come back to haunt you — trust me. This can be very devastating," she said.
But at the University of Ontario Institute of Technology, Andrea Slane, associate dean of strategic research and development, doesn't see digital assistants as necessarily a greater threat to privacy than, say, a smartphone.
"They can do positive and potentially negative things if we don't guard against [unexpected uses of our data]," she said. "This is not just digital assistants; this is anything that involves data analytics at this point."
One difference with a newer product like a digital assistant is users might be less aware of what personal information they're actually giving up.
"Social media, cellphones, there's a certain level of choice," Slane said.
"You still don't know what exactly is happening with your data but at least you have some idea of, 'Okay, what I'm giving up now is my location' or 'What I'm giving up now is my preference on things I like based on what I click on.'"
How they work
Digital assistants build on existing technology and voice-activation software similar to that in cellphones and other electronics.
"What they do is they record your voice and they send it back to their servers and they use what's called machine learning to create a neural network, and they scan these responses to return to you the most relevant responses," said Daniel Blair, an award-winning technology researcher and CEO of a virtual reality startup in Winnipeg.
"They dig through all that data and they scrub it and they try to use it to give you the most relevant response based on what you asked for, and those servers could be anywhere in the world."
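The round trip Blair describes can be sketched in a few lines. This is a deliberately simplified, hypothetical illustration, not any vendor's actual pipeline: the word-overlap scoring below stands in for the ranking a trained neural network on the company's servers would perform.

```python
# Hypothetical sketch: a server receives the transcribed query and
# ranks candidate responses, returning the most relevant one.
# Word overlap here is a toy stand-in for a neural ranking model.

def rank_responses(query, candidates):
    """Return the candidate response sharing the most words with the query."""
    query_words = set(query.lower().split())
    scored = [(len(query_words & set(c.lower().split())), c) for c in candidates]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return scored[0][1]  # highest-overlap candidate

best = rank_responses(
    "what is the weather today",
    ["Today's weather is sunny and 21 degrees.",
     "Your playlist is now playing.",
     "The lights are off."],
)
print(best)  # → Today's weather is sunny and 21 degrees.
```

The point of the sketch is the division of labour: the device mostly captures and forwards audio, while the relevance ranking happens on servers that, as Blair notes, could be anywhere in the world.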
Companies value privacy
Amazon's Echo, Google Home and Apple's HomePod are all activated by a "wake word" or "hot word." The companies say that unless activated, their devices aren't listening to or streaming information from your home to the cloud.
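That gating behaviour can be illustrated with a short sketch. Everything here is hypothetical (the wake word, the chunked transcripts, the one-query-per-activation rule); it simply shows the principle the companies describe: audio is checked locally, and only speech following a wake-word match is forwarded.

```python
# Hypothetical sketch of wake-word gating: audio stays on the device
# until the wake word is matched locally; only the speech that follows
# is forwarded to the cloud.

WAKE_WORD = "hey assistant"  # stand-in for "Hey Siri", "Alexa", etc.

def forwarded_speech(transcribed_chunks):
    """Yield only the chunks that follow a local wake-word match."""
    streaming = False
    for chunk in transcribed_chunks:
        if streaming:
            yield chunk        # this chunk is sent to the server
            streaming = False  # one query per activation
        elif WAKE_WORD in chunk.lower():
            streaming = True   # wake word heard: forward the next chunk

chunks = ["private dinner conversation",
          "hey assistant",
          "turn down the lights",
          "more private chatter"]
print(list(forwarded_speech(chunks)))  # → ['turn down the lights']
```

Note that everything before and after the activation stays local in this model, which is the design property Apple's HomePod page points to when it says recognition of "Hey Siri" happens on the device.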
In emails to CBC, both Amazon and Google said the customer has control over the information collected by the devices and can delete that information at any time by logging in to their Amazon or Google account.
Apple didn't respond to questions, but it did direct CBC News to its HomePod site, which says: "Security and privacy are fundamental to the design of Apple hardware, software and services. With HomePod, only after 'Hey Siri' is recognized locally on the device will any information be sent to Apple servers, encrypted and sent using an anonymous Siri identifier."
Adjust device settings
Blair says if you do decide to jump on the digital home assistant bandwagon but have privacy concerns, you can modify settings and use mute functions to limit how often the device is active in your home.
But, as he points out, a digital assistant could become less useful if you delete the information it collects and limit the ways you use the device.
"What you're doing by allowing them to have access to all this data, is you're allowing them to use this data to better cater content for you."
Blair recommends doing some research before buying a device.
"Be aware as a consumer what they're doing with your data. And when you sign up for accounts, and when you actually research these devices, you should look into where are the servers located and what is the company's intention to do with your data?" he said.
"Learn about what happens to your information, and what, if any, consents are required," Cavoukian said.
She suggests customers write a letter to the manufacturer that says, "You do not have my consent to disclose my information to any third parties."
"Whether or not they'll do it, who knows?" she said. "But at least you'll be making your instructions known very clearly."