The U.S. Department of Justice, in conjunction with the Federal Trade Commission, has taken legal action against the popular social media platform TikTok, accusing it of flagrantly disregarding children’s privacy protections enshrined in US law.
The company was accused of allowing children to create TikTok accounts and to share short-form videos, messages, and other content with adults and other users on the platform.
The company was also accused of illegally collecting and storing various types of personal information from children without informing or obtaining consent from their parents, violating the Children’s Online Privacy Protection Act (COPPA).
TikTok’s practices also allegedly violate its agreement with the US government, which requires it to notify parents before collecting children’s data and to remove content posted by users under the age of 13.
The Children’s Online Privacy Protection Act (COPPA) mandates that online platforms obtain parental consent before collecting, using, or disclosing personal information from children under the age of 13. The legislation further requires that companies promptly delete any stored information upon a parent’s formal request.
The Department of Justice alleged that even for accounts created in “Kids Mode,” a pared-back experience designed for children under 13, the defendants illegally collected and retained children’s email addresses and other forms of personal data.
According to the complaint, when parents discovered their children’s accounts and demanded that the defendants delete the accounts and associated data, the defendants frequently failed to comply with these requests.
Thousands of minors under 13 were allegedly subjected to extensive data collection by the Beijing-based company, enabling targeted advertising, facilitating interactions with adults, and exposing them to adult content.
TikTok was also criticized for lax checks during account creation: children could circumvent the age gate intended to screen out users under 13 by registering through third-party services such as Google and Instagram, and these accounts were classified as having “unknown” ages.
The Federal Trade Commission (FTC) claims that TikTok’s human reviewers spent an average of only five to seven seconds per account when determining whether it belonged to a child. The agency has warned that it will act to protect children’s privacy from companies that employ “sophisticated digital tools” to monitor kids and profit from their data.
TikTok boasts over 170 million active users in the United States alone. Despite the company’s denials, this latest development adds to the growing scrutiny of the video platform, which already faces a potential forced sale or outright ban by early 2025 amid nationwide safety concerns. TikTok is separately challenging that ban in federal court.
TikTok disputes the allegations, saying that many of them relate to past events and practices that are factually inaccurate or have already been addressed. “We offer age-appropriate experiences with stringent safeguards, proactively remove suspected underage users, and have voluntarily launched features such as default screen-time limits, Family Pairing, and additional privacy protections for minors,” the company said.
The social media platform has also faced scrutiny over child safety globally. The European Union’s regulatory authorities imposed a €345 million fine on TikTok in September 2023, citing violations of data protection regulations in the platform’s handling of children’s personal information. In April 2023, TikTok was fined £12.7 million by the UK’s Information Commissioner’s Office (ICO), which accused the platform of illegally processing the personal data of approximately 1.4 million children under the age of 13 who were using the service without their parents’ consent.
More recently, the ICO issued legal notices to 11 media and video-sharing platforms, demanding they strengthen their children’s privacy practices or face potential enforcement action. The companies involved were not named.
A further 34 online platforms have been asked to clarify their default privacy settings, geolocation practices, and age verification procedures to ensure their approaches align with relevant guidelines. “We’re also speaking to some of the largest platforms about targeted advertising to set out our expectations for changes, ensuring that practices align with both regulatory requirements and industry standards,” the regulator said.