Sales of smart home-related products are growing at double-digit rates in many countries across the globe (particularly in the United States and Western Europe), as highlighted at the IFA Global Press Conference in Spain last month. For many consumers, the allure of the smart home is difficult to resist. What consumer doesn’t like the idea of using simple voice commands to raise their window shades in the morning, change the home temperature, or turn on the coffee maker without getting out of bed?
Before you eagerly embrace the smart home, though, it’s important that you understand the privacy risks. That’s not to say there aren’t advantages; smart home technologies have enormous potential to save time, increase personal productivity, and provide a level of convenience that would have been unimaginable just five years ago. Even so, it’s important to proceed with your eyes open.
Without taking sides in the ongoing debate over which digital assistant solution is best, it’s likely that a smart speaker solution with embedded Google Assistant or Amazon Alexa functionality will be your starting point. The good news is that the most popular appliances, lighting solutions, security products, and door locks (just to name a few of the product categories) can easily be managed and controlled by Google Assistant or Amazon Alexa. The bad news is that these smart home offerings present privacy risks.
These concerns start with the digital assistant products themselves. These devices continuously listen for “activation” words or phrases; once triggered, your speech is sent to connected corporate servers for processing. This is not a trivial risk to privacy; you basically have a microphone in your home that is always listening to your conversations. While companies like Google and Amazon like to state that they go to great lengths to put privacy protections in place, that hasn’t stopped embarrassing — and scary — situations from occurring. Last December, a report emerged that an Amazon consumer in Germany requested data about his personal activities and was given access to 1,700 audio recordings he didn’t even know existed. Amazon is not alone — eerily similar stories have been reported with Google Assistant products, as well.
If that doesn’t scare you (and it should), these solutions create recordings on servers that could reemerge in lawsuits or even law enforcement investigations. It doesn’t take a wild imagination to conjure up a scenario where digital records get called up during a messy divorce or child custody battle. Also, keep in mind that these records aren’t being stored by humanitarian organizations; they are typically used to provide personalized advertising and marketing offers. Both Amazon and Google let you review and delete your voice history; it would be advisable to do so on a regular basis.
Many inexpensive smart home devices, especially IoT products that are manufactured in Asia and sold under off-brand labels, have a bad reputation for being easy to hack. Even Google, which prides itself on its second-to-none security reputation, recently suffered an embarrassing intrusion. The company’s Nest Cam was effectively hacked because the software app didn’t employ the appropriate precautions. This sort of breach is the stuff of nightmares — many parents use this product as a baby monitor.
Should these issues frighten you away from the smart home? I’m a huge proponent of the smart home and the opportunities it presents, but the thing to remember is that trading privacy for convenience is a bad swap. If you take reasonable security steps with your home network and are vigilant about what tabs your digital assistant keeps on you, you should be able to implement a smart home without sacrificing your family’s privacy.
Disclosure: Moor Insights & Strategy, like all research and analyst firms, provides or has provided research, analysis, advising, and/or consulting to many high-tech companies in the industry, including Google. The author does not have any investment positions in any of the companies named in this article.