Online privacy advocates and operators were shocked when, in 2019, the company ByteDance (owner of Musical.ly) was slapped with a record $5.7 million fine by the Federal Trade Commission (FTC) for knowingly violating COPPA, the Children’s Online Privacy Protection Act. It was a signal that regulators were going to take children’s online privacy seriously.
But what’s more interesting is how the legal landscape and the available technology solutions have co-evolved since the law took effect in 2000. Operators, for example, may end up liable for information simply by receiving reports that indicate their users are under the age of 13, or by using third-party algorithms that could uncover this information. This means that COPPA violations could become more widespread, and that much more rigorous age verification methods will be required.
As it turns out, TikTok’s 2019 COPPA violation case might well have been the proverbial canary in the coal mine when it comes to verifying the digital identity and preserving the digital privacy of minors.
COPPA, which became law in 1998 and took effect in 2000, aims to protect the personal information of children in online environments. The act applies to “operators of commercial websites and online services (including mobile apps and IoT devices, such as smart toys) directed to children under 13 that collect, use, or disclose personal information from children, or on whose behalf such information is collected or maintained (such as when personal information is collected by an ad network to serve targeted advertising).” This includes foreign-based websites and online services, as long as those sites are directed to, or collect personal information from, children in the United States.
To comply with COPPA, operators must:

- post a clear and comprehensive online privacy policy describing their practices around personal information collected from children;
- provide direct notice to parents and obtain verifiable parental consent, with limited exceptions, before collecting personal information from children;
- give parents the choice of consenting to the collection and internal use of a child’s information while prohibiting disclosure to third parties;
- provide parents access to their child’s personal information for review and deletion;
- give parents the opportunity to prevent further use or online collection of a child’s personal information;
- maintain the confidentiality, security, and integrity of information collected from children; and
- retain that information only as long as necessary to fulfill the purpose for which it was collected, then delete it securely.
For those looking for specific provisions, the FTC provides general guidelines with regard to COPPA, along with answers to frequently asked questions. And COPPA’s application in the case against ByteDance (owner of TikTok) reveals just how far the law extends.
One of the largest civil penalties ever paid for non-compliance with COPPA involves a company then called Musical.ly, now known by its more popular rebranding: TikTok.
According to the FTC, the Musical.ly app met COPPA’s definition of a site “directed to children,” based on audience composition statistics, subject matter, visual content, music, and the appearance of celebrities who specifically appeal to children.
The complaint also alleged that Musical.ly had actual knowledge that it was collecting personal information from children: user profiles on the app showed users sharing their date of birth and/or grade in school, for example. Yet the company did not seek parental consent before collecting names, email addresses, and other personal information.
In February 2019, the FTC issued a $5.7 million fine to ByteDance, which agreed to pay it. ByteDance also agreed to add a “kids-only” mode to TikTok (into which Musical.ly had in the meantime been merged and re-branded).
In addition to the fine, a 2019 class-action suit against ByteDance led to a further $1.1 million settlement. Since the TikTok case, similar suits have been brought against YouTube (Google) and Yelp.
Just last year, two senators put forward the Children and Teens’ Online Privacy Protection Act (CTOPPA), a piece of legislation meant to strengthen COPPA. If passed, the act would extend COPPA protections to children under the age of 16 (meaning that children ages 13 to 15 would now be covered).
But perhaps the most significant part of the bill is the “constructive knowledge” standard. As it stands today, COPPA requires that an operator have actual knowledge of children’s private data on its networks: for example, a child sharing their age or birthday when making an account, or a website that obviously targets a younger demographic. Both were the case with ByteDance.
It has been argued that applying a standard of “actual knowledge” simply pushes operators toward more passive forms of data collection while “looking the other way” when it comes to things like age verification.
If CTOPPA passes, that standard will be changed to one of constructive knowledge, where an operator can still be liable when it “directly or indirectly collects, uses, profiles, buys, sells, classifies or analyzes (using an algorithm or other form of data analytics) data” that belongs to children. For example, constructive knowledge creates liability if an operator gets reports on users that indicate their ages or birthdays, or if the operator receives complaints from parents of children covered by the law.
In short, most businesses will not be able to “claim ignorance” when it comes to the ages of their users or the personal data that has been collected on those users. If the data show that users are under a certain age, operators will be liable whether or not they have explicitly and intentionally collected age data. This has the potential to dramatically increase the number of companies paying fines, as the constructive knowledge standard is easier to prove than the actual knowledge standard.
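What would acting on constructive knowledge look like in practice? Here is a minimal sketch in Python, assuming a simple user-record shape; every field, marker, and function name here is hypothetical. The point is that signals the operator already receives (self-reported birthdays, school-grade language in profiles, parental complaints) get surfaced and routed to a compliance workflow instead of being ignored.

```python
from dataclasses import dataclass
from datetime import date

COPPA_AGE_LIMIT = 13  # CTOPPA, if passed, would extend protections to under-16s

@dataclass
class UserRecord:
    user_id: str
    birth_date: date | None = None   # self-reported at signup; may be absent
    profile_text: str = ""           # free-text bio ("6th grade", etc.)
    parental_complaints: int = 0     # complaints received about this account

def age_signals(user: UserRecord, today: date) -> list[str]:
    """Gather the signals that could give an operator 'constructive knowledge'
    that this user falls in the protected age range."""
    signals: list[str] = []
    if user.birth_date is not None:
        age = today.year - user.birth_date.year - (
            (today.month, today.day) < (user.birth_date.month, user.birth_date.day)
        )
        if age < COPPA_AGE_LIMIT:
            signals.append(f"self-reported age {age}")
    if any(marker in user.profile_text.lower()
           for marker in ("grade", "elementary school", "middle school")):
        signals.append("school-grade language in profile")
    if user.parental_complaints > 0:
        signals.append(f"{user.parental_complaints} parental complaint(s)")
    return signals

def review_queue(users: list[UserRecord], today: date) -> list[tuple[str, list[str]]]:
    """Return accounts that should be routed into a consent-or-deletion workflow."""
    return [(u.user_id, found) for u in users if (found := age_signals(u, today))]
```

The markers themselves are placeholders; what matters is that once signals like these are being collected at all, “we never asked for ages” stops being a defense.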
And as fines increase, you can bet that online operators will be beating a path to the providers of the most up-to-date age assurance technologies.
Whether or not operators want to continue marketing their services to children, most will need some form of age assurance included in any process where personal data is gathered from users.
Suppose a website does want to market to children under the age of 13. Its operator would need to verify whether a user is under that age. And if it wanted to store or use any data for such a user, the operator would also need to notify the user’s parent or legal guardian, and then verify the guardian’s age.
What’s more, it is clear that simply asking for a user’s age, or the age of a parent or legal guardian (so-called “self-declaration”), will not be sufficient for these purposes. Children will often fake their age when prompted in order to gain access to a site. But if parents lodge a complaint, or if CTOPPA passes and age information resides in analytics data, an operator might still be liable even though it has technically “verified” age.
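To make that flow concrete, here is a minimal sketch in Python of the gating logic described above; the status values and function names are hypothetical. An under-13 signup is held in a pending state until a verifiable parental-consent step completes, and self-declared age alone never unlocks the account.

```python
from enum import Enum, auto

COPPA_AGE_LIMIT = 13

class SignupStatus(Enum):
    ACTIVE = auto()
    PENDING_PARENTAL_CONSENT = auto()
    REJECTED = auto()

def handle_signup(declared_age: int, parent_contact: str | None) -> SignupStatus:
    """Self-declared age chooses the *path*; it never grants access by itself."""
    if declared_age >= COPPA_AGE_LIMIT:
        # Under a constructive-knowledge standard, the operator should keep
        # watching for contrary signals even after an "adult" self-declaration.
        return SignupStatus.ACTIVE
    if parent_contact is None:
        # No way to notify a parent or guardian: collect nothing, create nothing.
        return SignupStatus.REJECTED
    request_parental_consent(parent_contact)
    return SignupStatus.PENDING_PARENTAL_CONSENT

def request_parental_consent(parent_contact: str) -> None:
    # Placeholder: in practice this would start one of the accepted
    # verifiable-consent flows listed below, and no personal data would be
    # stored or used until that flow completes.
    print(f"Consent request sent to {parent_contact}")
```

If consent never arrives, the pending record would simply be deleted.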
That said, updates to COPPA and similar regulations do recognize several accepted forms of age assurance, including:

- a consent form signed by the parent and returned by mail, fax, or electronic scan;
- a credit or debit card transaction that notifies the account holder;
- a call to a toll-free number staffed by trained personnel, or a video call;
- checking a form of government-issued identification against a database (and deleting the ID afterward);
- knowledge-based authentication questions that a child could not reasonably answer;
- matching a photo of the parent against verified photo identification; and
- the “email plus” method, in which consent given by email is confirmed through a follow-up step.
It is likely that the accepted age assurance techniques will change over the years, as technology advances and as older forms of verification prove to be unreliable (anyone can receive an email these days) or become obsolete (faxing a form? in 2022?). Chances are good that more advanced forms of biometrics with liveness detection will become part of COPPA compliance in the coming decade, especially if CTOPPA passes.
For the industry, this means that strong demand for privacy-preserving age-assurance technologies is right around the corner. Businesses will gravitate to whichever processes create the least friction for their users. If product managers want to stay even with the competition, this needs to be part of their roadmap NOW, before the regulatory hammer comes down.
What is significant here is that neither Musical.ly nor TikTok says anywhere that it is a site for kids; its creators intended it for all audiences. Even at first blush, the site does not appear to be targeted at kids. But the FTC considered it “directed to children” based on a plethora of other factors, much the same way regulators found that e-cigarette makers were targeting children and teens with their advertising.
And the fact that ByteDance was a foreign-owned-and-operated company did not matter, either: It was liable because its users were children in the U.S.