Leaked documents from inside TikTok suggest the social media giant intentionally endangers kids to benefit its bottom line.
A group of 13 states and the District of Columbia filed individual suits against the Chinese-owned company earlier this month.
The suits claim TikTok’s addictive features, like autoplay and 24-hour push notifications, as well as its marketing strategies, like promoting beauty filters, harm children’s mental and physical health. Evidence uncovered in the states’ two-year investigation into the platform suggests the company knew about these problems and allowed them to continue.
TikTok’s internal documents and communications were supposed to remain sealed. However, faulty redactions in South Carolina’s and Kentucky’s filings prematurely revealed some of those secret details.
It doesn’t look good for TikTok.
On the Apple App Store, TikTok claims its content is suitable for children ages 12 and up. But Apple challenged that age rating in 2022, according to evidence in South Carolina’s case against TikTok.
According to The Washington Post, TikTok refused to give up its kid-friendly rating. Instead, the company claimed it used “aggressive strategies” to filter and remove the kinds of content Apple flagged. South Carolina’s suit says those strategies didn’t work. TikTok still shows children inappropriate and vulgar content, the case alleges, but the company doesn’t want to cop to it.
New York Attorney General Letitia James, who helped lead the charge against TikTok, says the company has a financial incentive to keep its age rating low. A press release explaining the case claims approximately 35% of TikTok’s American ad revenue comes from children and teens.
Leaked information from Kentucky’s case, reviewed and published by NPR, supports James’ assertion.
TikTok knows many of its most dedicated users are minors. One internal study found 95% of American smartphone users under 17 years old use the app. Another study of TikTok’s engagement statistics notes, “As expected, across most engagement metrics, the younger the user, the better the performance.”
The company knows it must keep young users engaged. In one employee chat regarding a TikTok tool meant to decrease the time minors spend on the app, a project manager admitted, “Our goal is not to reduce the time spent [on TikTok].” Another employee added, “[The goal is] to contribute to daily active users and retention [of other users].”
The tool in question allows parents to impose an hour-long daily TikTok time limit, a feature that would undercut that goal if it worked. But it doesn’t. By TikTok’s own estimate, the limit reduces usage time by an average of just 1.5 minutes.
Another internal document suggests the screen-time limit was never built to work well. TikTok evaluated the feature’s success solely by whether it “improved public trust in the TikTok platform via media coverage.”
TikTok apparently instructs its content moderators to perform a similarly shoddy job. A document referring to “younger users/U13” tells moderators to leave suspected underage accounts alone unless the account explicitly identifies its owner as under 13 years old.
TikTok doesn’t just turn a blind eye to minors on its platform — it recruits them.
The platform discovered users must watch just 260 videos to form a TikTok habit. Because videos can run as short as eight seconds, Kentucky’s lawsuit notes, an average user can cross that threshold in under 35 minutes.
TikTok knows its most engaging features cause young people to compulsively open its app. It also knows what problems these compulsions cause: the company’s internal research links compulsive use to negative mental health effects, including increased anxiety and diminished analytical skills.
One of the platform’s most addictive features is its algorithm, which learns and feeds users the kinds of videos they like. But ingesting too much of the same content can quickly skew the way users view the world.
A good example comes from one of TikTok’s employees, who participated in an internal study of “filter bubbles”: the self-reinforcing loops that form when a social media algorithm keeps serving users more of the content they already engage with.
The employee described falling into a feed of TikTok accounts featuring exclusively sad stories and comments from people in pain. Ostensibly designed to support those going through a hard time, this kind of content, consumed in excess, makes the world seem like a perpetually dark place.
To help users escape filter bubbles, TikTok offers a “Refresh” feature that supposedly resets the algorithm. James’ press release says the feature “does not work as TikTok claims.”
So TikTok knows minors generate big profits, creates addictive features to keep them on the app, turns a blind eye to underage users and allows them to binge on inappropriate content. Yikes.
Unfortunately for everyone involved, it gets worse.
One incident documented in Kentucky’s case involves TikTok Live, a feature that allows users to broadcast live videos of themselves. In 2022, TikTok discovered a “significant” number of adults had started paying minors to strip on live video.
You read that right. In one month alone, adults sent more than 1 million “gifts” (real money converted into the app’s digital currency) to kids for “transactional” behavior.
D.C. Attorney General Brian Schwalb, speaking to the Post, called this an “unlicensed payment system” that incentivizes minors to prostitute themselves. Internal communications show TikTok officials discussed the problem with coworkers in far less urgent terms.
The more we learn about social media, the more obvious it seems that children shouldn’t get within ten feet of it. To learn more about what you can do to keep your child screen-free, take a look at the articles linked below.
If going screen-free isn’t realistic, at least commit to enforcing strong boundaries around technology. Some ideas include requiring your children to scroll social media in public areas of your home, regularly checking their social media feeds for inappropriate content and educating them about sextortion and other predatory online behaviors.
Remember that companies like TikTok and Meta will not reliably protect your child from exploitation and inappropriate content. It’s up to parents to keep kids safe online.
Additional Articles and Resources
Four Ways to Protect Your Kids from Bad Tech, From Social Psychologist Jonathan Haidt
Parent-Run Groups Help Stop Childhood Smartphone Use
Survey Finds Teens Use Social Media More Than Four Hours Per Day — Here’s What Parents Can Do
Teen Boys Falling Prey to Financial Sextortion — Here’s What Parents Can Do
Instagram’s Sextortion Safety Measures — Too Little, Too Late?
Kids Online Safety Act — What It Is and Why It’s a Big Deal
Instagram Content Restrictions Don’t Work, Tests Show
Zuckerberg Implicated in Meta’s Failures to Protect Children
Surgeon General Recommends Warning on Social Media Platforms
‘The Dirty Dozen List’ — Corporations Enable and Profit from Sexual Exploitation