
Should Smart Personal Assistants Ever Report Your Conversations?

The rise of smart home devices and personal assistants such as Amazon’s Echo, Google’s Home, and Apple’s Siri has led to increased wariness about these devices listening in on private conversations. These fears are not unfounded: according to the Echo’s privacy policy, the device “processes and retains audio, interactions, and other data in the cloud.” Consumers can find warnings all over the Internet about the privacy implications of the Echo they unwrapped on Christmas morning.

Privacy horror stories about smart home device recordings are everywhere, and both patent filings and real-world incidents are illustrative. Amazon owns a patent that would let it recommend cough drops and soup to users who sound sick while speaking to their Echo. Recordings from a criminal defendant’s Echo have been used in his prosecution. And one Echo device secretly recorded a couple’s conversation and sent the clip to someone in their contacts.

These examples are enough to make anyone concerned about what the device they bring into their home might be hearing and what that information might be used for. Users worry not only that companies retain the data their devices collect in order to market their own products more effectively, but also that the data is being sold to third parties. The threat of government access to this information adds a further dystopian, Big Brother element.

However, the overwhelming backlash to the sharing of data from smart personal assistants comes into conflict with the reaction to a recent news story. Earlier this month, a 13-year-old Indiana boy was detained by police after he posted a screenshot of a response from Apple’s Siri. The boy told Siri “I am going to shoot up a school,” to which Siri helpfully responded with a list of schools in the area. Though the boy claimed the post was a joke and the police found that he did not have access to weapons, the thought that a smart personal assistant was ready and willing to help with this request is obviously troubling.

While most people have an aversion to the thought of their private conversations being recorded and transmitted, some might also say that they would want Siri to step in if asked about a school shooting. As school shootings have become commonplace in America, some might argue that we should do whatever we can to stop these events and other forms of crime and violence.

It is unlikely that most people who are serious about committing a mass shooting will choose to ask their smart device about it. But it is certainly possible that a smart home device might record a conversation planning such a shooting. Though most people would probably determine that, in the end, the risks of allowing smart devices to report conversations that people have in their homes outweigh the benefits that could be gained, the possibility of preventing a devastating public shooting raises interesting questions about when consumers might actually be okay with having their private lives recorded.*

*Rachel Foster is an associate editor on the Michigan Technology Law Review. She can be reached at rcfost@umich.edu.
