Utah Attorney General Alleges TikTok Knew Minors Were Being Groomed on 'Live'
TikTok is fighting battles on several fronts. It heads to the Supreme Court next week as it faces scrutiny from the federal government, and meanwhile Utah's Attorney General is keeping a close eye on the platform. Bloomberg obtained a redacted copy of a lawsuit filed by the state's top prosecutor, which accuses TikTok of knowing that its Live streaming feature encouraged illegal content and harmful behavior, including the grooming of children.
The lawsuit describes two internal investigations conducted by TikTok. The first, Project Meramec, found that underage users were performing sexually explicit acts on livestreams in exchange for virtual gifts from viewers. At the time, TikTok's policy barred users under 16 from broadcasting on Live and prevented users under 18 from sending or receiving virtual gifts that could be converted into money. Enforcement was weak, however: TikTok's own review found that 112,000 underage users hosted livestreams during a single month in 2022. Compounding the problem, TikTok's algorithm was boosting sexualized content, potentially surfacing underage streamers to more viewers. The incentive was clear: TikTok took a cut of every virtual gift purchase, so users who received more gifts generated more revenue for the platform.
Project Jupiter, the second investigation, focused on money laundering conducted through TikTok's livestreaming service. It found that some criminal operations were using TikTok Live to transfer funds, while others sold illegal goods and services in exchange for virtual gifts. Internal communications also showed employees discussing how Live might have been used to fund terrorist organizations such as the Islamic State.
A Forbes investigation had earlier found that older men were coaxing young women into performing sexual acts on TikTok Live in exchange for gifts. Leah Plunkett, an assistant dean at Harvard Law School, described the situation as "the digital equivalent of going down the street to a strip club filled with 15-year-olds."
TikTok's lax moderation, particularly of content involving minors, has landed the company in hot water before. In 2022, the US Department of Homeland Security opened an investigation into TikTok's handling of child sexual abuse material. Earlier this year, the FTC and the Department of Justice sued the company for violating the Children's Online Privacy Protection Act, alleging that TikTok knowingly allowed underage users to create accounts and interact with adults on the platform.
Other social platforms are grappling with child predator problems of their own. Last year, the Wall Street Journal reported that Meta was struggling to remove pedophiles from Facebook and Instagram, and that its algorithms were promoting and steering users toward child exploitation content. Under Elon Musk, Twitter disbanded the moderation team responsible for monitoring child sexual abuse; networks trading child sexual abuse material subsequently emerged on the platform, and accounts that had posted child exploitation content were reinstated.
None of these platforms, it seems, has entirely clean hands.
In the face of mounting scrutiny, the future of companies like TikTok may hinge on whether they can curb the misuse of their own features. The revelations about TikTok Live's weak moderation and the harmful behavior it enabled could help shape the trajectory of tech regulation over the next decade.