
Privacy Policies in a World Without Privacy. Is There Such a Thing as a Good Privacy Policy?

Questions about data privacy and what happens to our personal data once companies collect it have risen to the forefront of public discussion in recent years. As the data that companies collect grows more personal, where that data is stored and how it is used have become increasingly important questions. This is especially true for companies that collect and store biometric data – that is, data that measures biological features, such as fingerprints and facial features.

One such company is Wireless Lab OOO, the developer of FaceApp, an application that uses AI to transform pictures that users upload through the mobile application, allowing users to see what they might look like older, younger, as a different gender, and so on. FaceApp was created by Yaroslav Goncharov, a Russian software developer who previously worked for Microsoft and later founded his own start-up. Goncharov currently leads the twelve-person team behind FaceApp, an application he envisioned as an “automated photo editor” for which people would pay a subscription fee and which would essentially replace Photoshop with AI. The app had been downloaded over twelve million times as of July 10, 2019, following an explosion of exposure on social media driven by celebrities such as LeBron James and Drake. With this exposure came scrutiny and controversy, particularly over the terms of use and the control they gave the app’s developers over user data.

The clause in the terms of use that has received the most public scrutiny is the portion of the user-content section that reads:

“You grant FaceApp a perpetual, irrevocable, nonexclusive, royalty-free, worldwide, fully-paid, transferable sub-licensable license to use, reproduce, modify, adapt, publish, translate, create derivative works from, distribute, publicly perform and display your User Content and any name, username or likeness provided in connection with your User Content in all media formats and channels now known or later developed, without compensation to you. When you post or otherwise share User Content on or through our Services, you understand that your User Content and any associated information (such as your [username], location or profile photo) will be visible to the public.”

Regulators worry that users do not realize the full extent of the rights they are granting FaceApp, or where FaceApp is storing their data. FaceApp stores user information in the cloud, so deleting the app does not delete user data. The possibility that such sensitive biometric data as facial scans could be sold frightens many consumers, particularly when the company in question is a Russian company that has been accused of racism in the past. Given the current political climate, a Russian company that keeps and sells users’ biometric data alarms many American consumers and lawmakers.

While the FaceApp terms of use may be more distressing than those of other technology companies like Facebook, the most concerning thing about them might be that they are not as different as many people would like to believe. The perpetual and irrevocable language of the FaceApp policy is the primary feature separating it from the Facebook terms of use, for example. Facebook’s terms of use state: “you grant us a non-exclusive, transferable, sub-licensable, royalty-free, and worldwide license to host, use, distribute, modify, run, copy, publicly perform or display, translate, and create derivative works of your content (consistent with your privacy and application settings)”. Even though the license in the Facebook policy is not perpetual, the terms do state: “You should know that, for technical reasons, content you delete may persist for a limited period of time in backup copies (though it will not be visible to other users). In addition, content you delete may continue to appear if you have shared it with others and they have not deleted it.”

Security experts such as Alex Holden of Hold Security LLC have pointed out that what happens to biometric data after it is collected is important “because institutions are increasingly relying on facial identification and automation to secure cryptocurrencies, bank accounts, electronic documents and international borders”. In the wake of FaceApp’s meteoric rise to prominence, legislators, like US Senator Chuck Schumer of New York, have grown more concerned about where the personal data that companies collect ends up, and have called on the FBI and FTC to investigate FaceApp. The FTC recently closed a year-long investigation into Facebook following the Cambridge Analytica scandal. It found that Facebook “repeatedly used deceptive disclosures and settings to undermine users’ privacy preferences,” which warranted a five-billion-dollar fine. “The fine is the largest the Federal Trade Commission has levied on a tech company,” and it shows that following the current status quo of data privacy and terms-of-use agreements is not enough to protect a company that collects personal data, nor to ensure compliance with federal regulations.

Companies that collect such personal data will have to think carefully about what they put into their terms-of-use agreements and about where that data goes. The recent controversy over FaceApp’s terms of use, and how much data they allow FaceApp to gather, shows that consumers are growing more aware of, and more concerned with, where their data ends up. FaceApp’s revenue relies on a premium subscription model, but the high number of downloads has not translated into subscriptions, with “just 1% signing up for a single month’s premium use at $3.99”. FaceApp is still profitable, and privacy concerns are certainly not the only factor affecting subscription rates, but consumers are starting to care more and more about where their data goes. Moving forward, it is in the best interests of companies that collect personal data to have terms of use and privacy policies that are readable and not overreaching, both to keep consumers happy and to ensure compliance with regulations.

The General Data Protection Regulation, implemented in the European Union in 2018, includes a clause requiring privacy policies to be delivered in a “concise, transparent and intelligible form, using clear and plain language”. The majority of technology privacy policies do not meet this standard: most are so dense that they demand a higher reading level than the average college graduate possesses, making them understandable to a relatively small percentage of consumers. The privacy policies of many companies, such as Airbnb, Hulu and even CNN, require a reading level greater than that of the average professional, meaning some policies are hard for even lawyers to comprehend. Yaroslav Goncharov indicated that the volume of calls and inquiries FaceApp received from consumers worried about the privacy policy made it hard to do more work on the app itself, since his team had to devote so much time to responding to those concerns. He further added that “People got scared because they think everything we say in this policy we do, which of course is not the case at all.”
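Reading-level claims like these are typically backed by standard readability formulas, the best known being the Flesch-Kincaid grade level, which estimates the U.S. school grade needed to understand a text from average sentence length and average syllables per word. The sketch below is purely illustrative (the function names and the crude syllable heuristic are my own, not drawn from the article or from any study it cites), but it shows why long, clause-heavy license language scores far above plain disclosure language:

```python
import re

def count_syllables(word):
    # Crude heuristic: count vowel groups, discounting a trailing silent 'e'.
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and count > 1:
        count -= 1
    return max(count, 1)

def flesch_kincaid_grade(text):
    # Flesch-Kincaid grade level:
    #   0.39 * (words per sentence) + 11.8 * (syllables per word) - 15.59
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * (len(words) / len(sentences))
            + 11.8 * (syllables / len(words))
            - 15.59)

# A plain-language disclosure versus a single sentence of license legalese.
plain = "We collect your photos. We use them to edit images."
legalese = ("You grant FaceApp a perpetual, irrevocable, nonexclusive, "
            "royalty-free, worldwide, fully-paid, transferable sub-licensable "
            "license to use, reproduce, modify, adapt, publish, translate, "
            "create derivative works from, distribute, publicly perform and "
            "display your User Content in all media formats and channels now "
            "known or later developed, without compensation to you.")

print(f"plain:    grade {flesch_kincaid_grade(plain):.1f}")
print(f"legalese: grade {flesch_kincaid_grade(legalese):.1f}")
```

Even with this rough syllable counter, the one-sentence license clause scores well past a college reading level while the plain version lands in elementary-school range, which is the gap the GDPR's "clear and plain language" requirement is aimed at.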

Companies navigating the data privacy space would be wise to avoid these problems and disruptions by having a privacy policy tailored to the data they actually collect and what they use it for, rather than simply adopting the flawed industry-standard agreements that are currently under attack. Tailored privacy policies would not only allay concerns about overbroad data collection but also make it easier for companies to comply with new data regulations.

 

* Jason Zaccaro is an Associate Editor on the Michigan Technology Law Review.
