
Convenient or Creepy: The Smart Speakers Are Listening

In a previous blog post, we talked about privacy as it relates to fitness wearable devices. To continue in the privacy vein, let's examine smart speakers and voice assistants like Google Home, Alexa, and Siri. If you use any of these devices, you know the convenience of voice commands. Need the time? Alexa has it. Want to know some obscure movie fact? Ask Siri. Need to buy paper towels? Tell Google Home. Easy peasy, right? In today's busy world of work, events, and to-do lists, these devices lend a helpful hand, but at the price of consumer privacy.

Recently, both Google and Amazon filed patents that would allow even broader abilities to listen in on consumers' lives. Rather than waiting for a wake command as the speakers do now, the device would always be on and listening for keywords. But what is it doing with that information? According to both companies, the information would be used to target advertising based on the collected data. Say you're sitting around the dinner table discussing your next family vacation, and then Alexa starts reciting travel deals for your target destination. How about Google Home listening to you discuss a medical issue with a friend and then offering related prescription ads? What about recognizing your mood, or an oncoming cold based on a sneeze, or an argument with your spouse? Google Home could even suggest parenting tactics based on its monitoring of your family interactions. You should spend more time with Susie, Google says. Do you want all of that in the cloud for advertisers to sift through? Is this level of technology cool or creepy?

While these assistants can be a source of convenience, we'd be remiss not to consider the potential consequences. Could information collected from smart speakers be used in a court of law? Are the recordings discoverable? Could anything obtained be used for a conviction? Technology often moves faster than the law, but the law is catching up. Consider the Arkansas murder case in which investigators wanted to use information gathered by the suspect's Amazon Echo smart speaker. Amazon jumped in, citing First Amendment protections, an important step in setting a precedent for how this technology could be handled in the future. The suspect ended up granting permission for the data to be collected, but in future cases, consent and rights will need heavy consideration. For an innocent person accused of a crime, or for the unfortunate victim, the smoking gun could be a home assistant device. But take a moment to consider the flip side. Could a conversation taken out of context lead to suspicion when something goes awry, making it difficult to defend oneself? Do you really want your casual conversations used against you in a court of law?

As with much of current technology, consumers need to weigh the pros and cons of adopting smart devices. You must ask yourself: is giving a device full access to your life, habits, interests, and vices worth the convenience? Where is the line, and once we cross it, can we ever go back?

I always feel like somebody's watching me. Hint: it's your fitness tracker.

Cheesy '80s song reference aside, privacy in the age of smart EVERYTHING is a serious concern. While it's convenient, helpful, and even seductive to have all this technology at our fingertips, or wrists as it were, we would be remiss not to consider what we're giving up in exchange.

Consider the recent situation in which a user discovered her entire jogging route had been made public for strangers to view at their leisure. For a single woman who often ran in the early morning or evening hours in an urban setting, that is beyond alarming. Or the unintentional national security risk unleashed when Strava published a global heat map that identified secret military outposts. Not good. Many fitness trackers have a social aspect that encourages users to interact with others, which is great for motivation and camaraderie. The tradeoff? You're giving up your location information, and your profile can also reveal your full name and picture. It wouldn't be a tough leap from there to figure out another user's patterns, routes, workplace, and residence. That's pretty high on the creepy scale.

Another trend popping up is insurance and wellness programs encouraging the use of fitness wearables, often offering discounts or points for completing specific activities. Many people are excited to sign up for these programs as a way to motivate themselves to live healthier lives. But what if the insurance company uses that data at a later date to determine you are not as healthy as it would like and raises your premiums? Or what if the seemingly helpful programs and tips meant to help you meet your goals turn into penalties when those goals are not met? What if you find your car insurance rates going up based on driving habits derived from a fitness wearable? Is this still a good tradeoff for clocking your daily steps?

The law is paying attention too. Fitness trackers are being used to prove injury after an accident in the form of reduced activity, or in some cases, to prove insurance fraud when the tracker shows activity that doesn't match the person's injury claims. Tracker data can also identify your whereabouts when a crime was committed, a potential alibi or smoking gun, as the case may be. In one situation, a woman was charged with making a false crime report after her Fitbit contradicted her timeline. With unclear guidelines on what is and isn't protected, it seems the data from your wearable can and will be used against you in a court of law.

In response to these privacy concerns, some companies have made improvements. Fitbit, for example, voluntarily complied with HIPAA in order to partner with corporate wellness programs. Many companies have also changed their default sharing settings from opt-out to opt-in. These are positive changes in a world where personal privacy is at a premium.

It's clear wearables and other technologies such as Amazon Echo and Google Home are only increasing in popularity, so what can you do to protect yourself? Start with one of the most obvious but often avoided tasks: read the fine print. Understand the privacy policy of the tech you've adopted and what the company can and cannot do with your data. Participating in a workplace wellness program is great for culture, health benefits, and friendly competition, but before handing over your wearable data, learn what the policy is and what it means for you. Understand where your information goes and what you can do if there is an adverse impact. Check your privacy settings on wearables with social components to ensure you're only sharing information you're comfortable with, and do your research before buying so you understand what you're getting.

Once you’ve done all that, there’s only one thing left to do. Get up and kill that daily step goal.