You have the right to remain silent. Anything you say can and will be used against you.
We are all familiar with these lines, yet we fail to see how they apply to our everyday lives. In many countries, individuals have rights that protect them against unlawful recording or the interception of their communications. Yet we are gleefully bugging our own homes. Even the most conservative estimates suggest people are now outnumbered by smart devices, many of them internet-connected and housing always-on microphones.
Smart speakers and virtual assistants pose a unique challenge: personal conversations within the safety of one’s own home tend to contain far more sensitive information than anything one would post publicly on social media. Do you ever say things in the privacy of your own home that could be embarrassing, damaging, or unsafe if made publicly available? For most people, the answer is a resounding “yes”. According to Edison Research, 12% of users even keep their primary smart speaker in their master bedroom.
“For quality and training purposes, this call may be recorded.” In some jurisdictions, businesses are required to inform you that calls are recorded before speaking to you; otherwise they’re infringing upon your rights. Because of this, it has become an industry best practice to advise all callers of any recording. But when it comes to smart devices, legislation hasn’t kept up with technology, which has created an uncomfortable grey area. For instance: manufacturers withhold the fact that human subcontractors are listening in, hide undisclosed microphones in products that aren’t listed in the tech specs, and record users inadvertently up to 19 times a day – recordings they admit are stored forever. But, when those same recordings are requested to help solve a murder, they refuse: suddenly pretending to care about user privacy. Understandably, these relentless headlines are making users deeply uncomfortable.
But tech-savvy users have made it clear that they don’t want to sacrifice convenience for privacy, and smart speaker sales continue to soar. So, the founders of our company wondered: is there a way to have your tech and mute it, too? At Paranoid, our entire business model revolves around preventing consumers from being unknowingly recorded in their own homes. We’re so serious about privacy that our website doesn’t even use cookies – a rare omission, even though it puts us at a marketing disadvantage. As a subsidiary of renowned tech company Pleasant Solutions, whose clientele includes NASA and Buckingham Palace, rigorous security and privacy come standard in everything we develop.
Below, we have compiled some of the most startling headlines and research – sources we believe should be mandatory reading for anyone with an internet-connected device.
For tips on how to be the advocate of your own privacy, see “Quick Tips.”
Get to Know What You’re Bringing into Your Home
A report from Bloomberg first brought to light the fact that thousands of staff around the globe are listening to and transcribing Amazon’s voice recordings. Sickeningly, two of those staff members allegedly confessed to having overheard what they believe was a sexual assault, and to sharing some recordings with other listeners as “a way of relieving stress.” Not long after, Belgium’s VRT NWS exposed that Google contractors were also listening in. They described similar scenarios, including hearing a woman “who was in definite distress,” but admitted that there were no guidelines regarding what to do in those instances. VRT NWS claims that their reporters listened to over 1,000 recordings, 153 of which, they insist, “should never have been recorded” because they did not follow the wake word ‘OK, Google’.
Research from Northeastern University found that smart speakers record users inadvertently up to 19 times a day. A separate study identified over 1,000 words and phrases that can incorrectly trigger smart speakers, including (but not limited to): Alexa responding to the words “unacceptable” and “election” (yikes), Google to “OK, cool”, Siri to “a city”, and Cortana getting duped by “Montana”.
YouGov, an international research data and analytics group, published some alarming findings from a 2019 survey of 1,000 smart speaker owners:
- 1 in 3 respondents weren’t aware that their voice recordings are sent to the companies at all.
- Of those who were aware their voice commands were saved, only 58% believe it is possible to subsequently access those recordings, and 52% “don’t know” if they have the power to have them deleted.
- 57% believe someone who obtained their voice recordings would not be able to identify them.
According to YouGov’s findings: “While Amazon, Google, and Apple (which between them account for more than 90% of smart speaker users) seemingly allow you to delete your recording through your account, an examination of the terms and conditions of these services suggests that this simply deletes them from your view, and that data is in fact retained by the companies for an unspecified amount of time.” To date, many identifications have been made from various forms of data – a process some reporters have described as quite easy. It’s even possible for contractors or potential hackers to uncover a user’s home address.
In addition to the obvious privacy concerns presented by the manufacturers themselves, smart devices are also hackable by third parties. The last few years have been a constant barrage of headlines about smart home devices, security cameras, and even baby monitors being hacked. White hat hackers from Germany’s Security Research Labs developed eight different smart speaker apps (referred to as Alexa “skills” or Google Home “actions”) – all of which passed their respective company’s security vetting – that were able to eavesdrop on users’ homes and phish for passwords. Even without human intervention, malfunctioning smart speakers can do damage all on their own: in Oregon, an Amazon Echo recorded a family’s conversation and sent the recording to a random person in their contacts – one of the husband’s employees.
Unexpectedly, the FBI published an official warning to consumers about the security risks of Smart TVs, firmly stating: “Beyond the risk that your TV manufacturer and app developers may be listening and watching you, that television can also be a gateway for hackers to come into your home. A bad cyber actor may not be able to access your locked-down computer directly, but it is possible that your unsecured TV can give him or her an easy way in the backdoor through your router (…) At the low end of the risk spectrum, they can change channels, play with the volume, and show your kids inappropriate videos. In a worst-case scenario, they can turn on your bedroom TV’s camera and microphone and silently cyberstalk you.”
Keep Work Away From Voice-Activated Devices
Since the COVID-19 pandemic has thrust us into a brave new world of remote work from home, our always-listening gadgets have stirred up debate about smart devices in home offices, and the risks associated with discussing trade secrets and/or privileged info near them. Just mentioning confidential information (such as health or legal advice) within the vicinity of an always-on microphone, whose recordings are stored in the cloud, could violate one of the many rigid privacy regulations currently in effect globally (such as HIPAA, PIPEDA, GDPR, FOIP, COPPA, and countless others). If that data is stored on a server in another country, even more issues arise.
This has prompted businesses and professionals to speak out about the serious privacy concerns associated with having smart devices near workspaces. Mishcon de Reya LLP – the famed corporate law firm that also advised Princess Diana on her divorce – officially requested that their staff shut off any listening devices while working from home, especially when discussing client matters. This is sound advice (pun intended) since smart speakers can hear you from across a noisy room. By that same logic, it’s probably not a good idea to discuss trade secrets near them either, particularly if you are bound by any non-disclosure agreements (NDAs).
Read the Fine Print
In 2009, UK gaming retailer GameStation.co.uk changed their terms to state: “By placing an order via this web site […] you agree to grant us a non-transferable option to claim, for now and for ever more, your immortal soul. Should we wish to exercise this option, you agree to surrender your immortal soul, and any claim you may have on it, within 5 (five) working days of receiving written notification from gamestation.co.uk or one of its duly authorised minions.” Technically, they reaped thousands of immortal souls. It was a joke, but it proved an excellent point: how many of us really read and understand all the legalese we’re agreeing to?
CBC’s Marketplace produced an excellent, eye-opening piece of investigative journalism called “Privacy and smartphone apps: What data your phone may be giving away”. In the special, Marketplace’s team developed their own app with the help of Appthority, a leader in mobile security. Domingo Guerra, Appthority’s president and co-founder, referred to apps as being, in some cases, “the perfect spy tool”, and warned: “In general, we see that free apps are not really free … we’re paying with our data.” Marketplace’s team kept the cameras rolling and took to the streets of Toronto to ask passersby to try their free horoscope app. Nearly all zoomed past the terms and conditions and eagerly hit accept. In doing so, the users unwittingly agreed to allow app administrators to access their microphones, cameras, and virtually everything stored on their phones. More chillingly, these terms are similar to those in use by today’s most popular apps. When later confronted on-camera, the app’s test subjects were asked to share their feelings about what Marketplace had uncovered – including their location, pictures of them they didn’t know were taken, and transcripts of their private communications. “I feel kind of violated,” said Shahbaz, who also referred to the app permissions as “disturbing”. “I should have read those terms and conditions.”
By 2014, the average Canadian already had between 18 and 30 apps on their mobile device, according to research by Catalyst Canada. It’s a stark and unwelcome reminder of our own vulnerabilities. That private text you just sent could theoretically be stored indefinitely on 18 or more different servers, in the data centers of for-profit corporations. And it was all in the terms and conditions…
Understand Your Rights
With the amount of technology and knowledge at our fingertips, there’s no excuse not to know your rights. In the U.S., where much of Paranoid’s consumer base resides, the laws differ from state to state. In a number of controversial and fascinating cases, undercover agents/informants have created false scenarios to lure suspected criminals across state lines to confess, in order for the recording to be court admissible. In Canada, where our company is headquartered, the laws are more uniform across the country. The law generally allows for what is referred to as “one-party consent”, but not the interception of private communications. In layman’s terms: if you are one of the parties actively participating in a conversation, you may record that conversation without having to overtly disclose it. If you are not one of the participating parties (if you’re listening in, or somehow intercepting the call), you cannot record it. However, there have been some interesting legal challenges revolving around what constitutes a reasonable expectation of privacy.
So how is it that a contractor can be allowed to secretly listen in on your personal conversations, ones you didn’t knowingly consent to provide access to? Particularly when the manufacturers of these devices do not mention it in their terms and conditions, and have explicitly denied users were being eavesdropped on?
Demand More from Your Representatives
According to the documentary Terms and Conditions May Apply (a New York Times “Critic’s Pick”), over a dozen bills introduced in Congress to protect online privacy were killed or abandoned after 9/11. Through the Patriot Act, the government was able to increase surveillance while the country was in shock and mourning. Since then, personal privacy protection has never fully recovered in the U.S.
“They say that Facebook sent an army of lawyers so that the final privacy legislation that emerged in 2011 was watered down significantly in a way that wouldn’t affect Facebook’s business model,” alleged Rainey Reitman, of the Electronic Frontier Foundation, in an interview for Terms and Conditions May Apply. The following year, TechCrunch reported that in 2012 Google had tripled the amount they spent lobbying politicians, and Facebook had more than doubled their spend, relative to the same quarter of the prior year.
And when tech giants get slapped with record-breaking privacy violation penalties, the costs are negligible relative to their profits. For example, YouTube and its parent company Google reached a record-breaking $170 million settlement with the Federal Trade Commission (FTC) for violating the Children’s Online Privacy Protection Act (COPPA). It was not even Google’s first penalty for breaching privacy; they received a $22.5 million FTC fine in 2012. The $170 million settlement equaled only 0.44% of the company’s quarterly revenue, according to its quarterly report. The minuscule impact would do little to discourage future violations, and the children whose privacy was violated will never receive a dime. For context: it’s the mathematical equivalent of giving someone who earns $100,000 a year ($25,000 per quarter) a $110 speeding ticket and expecting it to be an effective deterrent. Even worse, it may signal to other companies that it’s easier and more profitable to just do whatever they want and pay the occasional fine.
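The proportions above can be checked with a few lines of arithmetic. (A quick sketch; the quarterly revenue figure here is back-calculated from the stated 0.44%, not an official number.)

```python
# Back-of-the-envelope check of the fine-to-revenue comparison.
fine = 170_000_000           # FTC settlement, in dollars
revenue = 38_600_000_000     # approx. quarterly revenue implied by the 0.44% figure

fine_pct = fine / revenue * 100
print(f"Fine as share of revenue: {fine_pct:.2f}%")   # ~0.44%

# The speeding-ticket analogy: the same percentage of a $25,000 quarter.
ticket = 25_000 * fine_pct / 100
print(f"Equivalent ticket for a $100k/year earner: ${ticket:.0f}")   # ~$110
```

The analogy holds: 0.44% of a $25,000 quarter is almost exactly a $110 ticket.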
(If you’re reading this and are giving them the benefit of the doubt that they did not realize they were violating children’s privacy, remember that this is the same company that wants to put chips in our brains. They can definitely afford to hire privacy experts.)
Reconsider Your Approach to Tech
Today’s tech giants wield their power with reckless abandon, without any consideration of whose privacy they violate. For example: in 2009, Facebook silently changed their Privacy Policy overnight, automatically changing the default for a user’s posts from friends to public. Over 1 million people joined a fan page that called to have the policy reversed. Mark Zuckerberg, when asked about it, nonchalantly replied: “Doing a privacy change for 350 million users is not the kind of thing that a lot of companies would do… We decided that these would be the social norms now and we just went for it.” Also in 2009, Google’s Executive Chairman Eric Schmidt referred to the cross-collection of data from all searches on a shared computer as a ‘worst case’ scenario, and noted: “we don’t do that, and we’re not likely to do that”. Just two years later, Google did exactly what its then-executive had called the worst-case scenario.
There’s an old saying amongst the tech savvy: if you’re not paying for the service, you are the product. Data processing, storage, moderation – these things are all very expensive. Giving away a free service doesn’t pay the bills, so that’s clearly not their business model. Collecting users’ data is. That data becomes their property, to sell to the highest bidder or to use internally to target users with further promotions. The exact cost to produce smart speaker models is not available, but many have theorized that manufacturers regularly sell them below cost. Amazon has sold their Echo Dot for as low as 99 cents. Notably, there is also no monthly subscription fee, the kind you might expect from similar software products. When billion-dollar companies sell goods at a loss, alarm bells should immediately go off.
Despite the fact that very few people have even heard of them, companies like Acxiom are amassing vast commercial databases of consumer information. Executives of Acxiom have claimed its database contains “information about 500 million active consumers worldwide, with about 1,500 data points per person” which includes the majority of US adults. Rainey Reitman of the Electronic Frontier Foundation, in an interview for the documentary Terms and Conditions May Apply, asserted: “These are the types of companies that a potential employer would go to, to try and run a background check on somebody before they hired them. They are able to connect the fact that you went to site A, and then later to site B, and then eventually to site C, and create this detailed history of what sites you visit online.”
Consider Paying for Privacy
Hear us out: paying for privacy and security is nothing new. (And it’s becoming increasingly necessary.) Unlisted phone numbers, private domains, antivirus software, and even the curtains on our windows have always come at a price. Your instant gut reaction might be to ask yourself: Why should I have to pay for privacy? But we encourage you to instead consider: Why doesn’t privacy come standard? And how much is my privacy worth? Looking toward the future, it seems a small price to pay to prevent your future employers from knowing everything you’ve ever searched for on the internet.
For context: the price of the most popular smart speaker models (typically between $29 and $59) plus $39 for a Paranoid add-on is still less than a quarter of what the original iPod cost in 2001, never mind adjusting for inflation or the fact that iPods had significantly fewer capabilities.
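That comparison is easy to verify. (A quick sketch; the original iPod’s $399 launch price is supplied here, since the article itself doesn’t state it.)

```python
# Check the price comparison against the 2001 iPod.
speaker_high = 59      # top of the quoted smart-speaker price range
paranoid_addon = 39    # Paranoid add-on
ipod_2001 = 399        # original iPod launch price, not adjusted for inflation

total = speaker_high + paranoid_addon
print(total, total < ipod_2001 / 4)   # 98 True
```

Even at the top of the range, $98 comes in just under a quarter of the iPod’s $399 launch price.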
Quick Tips
Question Everything: Why does that app need microphone permission? What are the future implications of my data? What is my privacy worth to me?
Take Security Seriously: Never use default passwords for any device – including routers, anything connected to the internet, and especially devices with microphones and/or cameras. Change passwords often, and never reuse the same password for multiple accounts. Enable two-factor authentication (2FA) on any account that offers it.
Reduce Your Data Footprint: Data is forever. The only way to truly protect your privacy is to prevent unwanted data from being recorded (and, therefore, stored indefinitely) in the first place. It’s almost impossible to have things permanently deleted or deindexed after the fact. Don’t post anything online that you wouldn’t be comfortable having displayed as a top result when you search your name.
Keep Your Work Away From Always-On Microphones: Discussing topics such as health advice, legal cases, or trade secrets near voice-activated devices could be a violation of privacy regulations and/or any NDAs in effect.
Report All Violations: Don’t be afraid to report businesses and file privacy complaints. In order for governing bodies to take action, someone needs to bring the issues to light. A “someone else will probably do something about it” mindset helps no one.
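As a small illustration of the password tip above, here is a minimal sketch (in Python, using the standard `secrets` module) of generating a unique random password for each account instead of reusing one; for most people, a dedicated password manager is the more practical route:

```python
import secrets
import string

def random_password(length: int = 16) -> str:
    """Generate a random password from letters, digits, and a few symbols."""
    alphabet = string.ascii_letters + string.digits + "!@#$%^&*"
    return "".join(secrets.choice(alphabet) for _ in range(length))

# Each call produces a fresh, unpredictable password.
print(random_password())
```

`secrets` draws from the operating system’s cryptographically secure randomness source, unlike the `random` module, which is not suitable for passwords.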
At Paranoid, we think a healthy dose of paranoia is often warranted. This inspired us to create affordable devices that can help people take back their power. It is our firm belief that you should be able to enjoy modern conveniences without having to sacrifice your privacy or jeopardize your right to a future unhindered by a trail of private data.
You can choose convenience and privacy.
You have the right to remain silent.
Get Paranoid.
Further Information
A Guide for Individuals – Protecting Your Privacy: An Overview of the Office of the Privacy Commissioner of Canada and Federal Privacy Legislation