In February 2019, the Federal Trade Commission (FTC) collected the largest civil penalty to date for a violation of the Children’s Online Privacy Protection Act (COPPA). The target: TikTok, a popular video-sharing app. TikTok paid $5.7 million in a settlement after being accused of a host of privacy violations, including failing to obtain “verifiable parental consent prior to collecting, using and/or disclosing personal information from children,” failing to delete personal information after parents requested its deletion, and otherwise “retaining personal information longer than reasonably necessary.”
COPPA was enacted by Congress in 1998 and is enforced by the FTC. It aims to protect children’s privacy by regulating how websites collect, use, and disclose the personal information of children, defined by COPPA as persons under the age of 13. The law targets websites or online services that are directed toward children, or that have actual knowledge that they are collecting or maintaining personal information from children. The latter provision is supposedly what tripped up TikTok, which does not formally allow children under 13 to create their own accounts.
On top of paying the monetary penalty, TikTok had to introduce several changes to the app to bring itself into compliance with the settlement order, which also subjected the company to extensive compliance reporting and recordkeeping. In a statement, TikTok announced that it would launch a “separate app experience” for its younger users. It also beefed up its privacy notifications, including a new account (@tiktoktips) that posts cutesy videos explaining how users can control the content they see and promoting general “positive vibes” on the app. The video below demonstrates how to use the “digital wellbeing” features to block unwanted content:
TikTok has launched the careers of a few “lucky” adolescents, but not everyone is a fan of the attention, which often morphs into psychological or sexual harassment. Parents are rightly concerned about the content their children may come across on the app; but at the same time, there has been a rise in the number of parents exposing private information about their children online, from making Instagram accounts for yet-to-be-born children to profiting off of sponsorship deals that put their children front and center online.
So what’s a nation to do? Senator Edward Markey, the author of COPPA, was not completely satisfied with the result of the TikTok settlement. Markey noted the “historic” nature of the penalty, but he believes that the fine “is not high enough for the harm that is done to children and to deter violations of the law in the future by other companies.” Facebook has also opened itself up to accusations of violating COPPA in light of the recent controversy over “duping children” into making in-app purchases through a variety of apps.
It’s clear that TikTok is not the only platform parents need to be concerned about, and privacy isn’t the only concern to arise from children’s use of the internet. Sure, some parents might be easily convinced of viral hoaxes about the dangers of the internet; but the occasional overreaction should not deter anyone from taking the issues of bullying, strangers, and exposure to suggestive and harmful content seriously.
Research conducted in 2013 confirmed what most people know anecdotally today: an increasingly large number of young children are using the internet. Remember the emotional reaction you had the first time you saw a toddler navigate Snapchat filters better than they could speak in full sentences? Preliminary research has demonstrated concerning effects of technology on children, but Congress is pushing for more. Proposed legislation would give the National Institutes of Health $95 million to further study the effects of technology on children over the course of several years.
The fact is that the internet is an important resource for children, despite the risks. Additionally, social media platforms and the public alike have been wary of more stringent regulation of the internet. The TikTok settlement demonstrates that we have at least one tool at our disposal to hold platforms accountable for making the internet less safe for children—that is, if we are willing to use it.
However, COPPA has also been described as an “‘opt-in’ privacy band-aid” that doesn’t begin to reach the level of protection provided by, for example, the European Union’s earliest internet regulations, as well as its more recent General Data Protection Regulation. While some commentators argue that congressional action will not come soon enough, Markey has not given up on the fight. In March 2019, he and Senator Hawley introduced a bill that proposes expanding COPPA’s protections, including provisions prohibiting the collection of personal data for the targeted marketing of children and empowering minors to request the removal of public personal information “to the extent technologically feasible.” If the bill passes, COPPA could prove to be an even more powerful tool for protecting the privacy of children online.
As it stands, TikTok still has some kinks to work out with its new policy, and whether the changes it has made to the app are any better at protecting children’s privacy than before remains to be seen. Despite the uncertainty, Markey remains adamant that the solution lies, in part, in bipartisan support of “COPPA 2.0,” stating, “if we can agree on anything, it should be that children deserve strong and effective protections online.”*
*Solana Gillis is an Article Editor on the Michigan Technology Law Review. She can be reached at firstname.lastname@example.org.