Instagram Content Restrictions Don’t Work, Tests Show

Instagram algorithms feed 13-year-olds sexual videos, new data from The Wall Street Journal reveals.

Is anyone surprised? Certainly not this reporter.

Let’s break it down.

Background

In January, Meta announced all teen accounts would be automatically restricted to only age-appropriate content.

The special algorithms associated with teen accounts would filter out sexual content for users under 16, the platform claimed, and make it impossible for underage users to search for or view harmful content, even from users they follow.

Whereas teens could previously opt out of these content settings, Meta made the strengthened content filters mandatory.

The Journal previously described this change as the “biggest change the tech giant has made to ensure younger users have a more age-appropriate experience on its social media sites.”

The Tests

The Journal teamed up with computer science professor Laura Edelson to determine whether Instagram’s age-based filters worked.

Between December 2023 and June 2024, the team created several test accounts — setting their age to 13 to trigger Instagram’s child protection measures.

To test the filters’ effectiveness, researchers refrained from liking content, following users, or otherwise influencing what kinds of posts the algorithm introduced.

Instead, they only watched Reels — short videos curated by Instagram on an infinite scroll.

According to the Journal, the algorithm shows new users a wide variety of content to determine their interests, including “traditional comedy, cars or stunts, footage of people getting injured” and “mildly racy content such as women dancing seductively or posing in positions that emphasized their breasts.”

For the sake of the experiment, testers only watched videos with a sexual element, scrolling through all other kinds of content.

The Results

As Instagram took note of testers’ viewing preferences, the Journal reports, video recommendations for the fake 13-year-olds became progressively more explicit:

After just a few short sessions, Instagram largely stopped recommending the comedy and stunt videos and fed the test accounts a steady stream of videos in which women pantomimed sex acts, graphically described their anatomy or caressed themselves to music with provocative lyrics.

The algorithm fed test accounts videos from “adult sex-content creators” in “as little as three minutes,” researchers found. Such videos “dominated” the feed after twenty minutes of watching Reels.

Some test accounts received videos of adult stars exposing their genitalia. Another received a message from an adult performer offering nude images in exchange for following her account. Still others were shown videos Instagram had already labeled disturbing, meaning not fit for children.

Of one of the final tests run in June, the Journal writes:

Within a half-hour of its creation, a new 13-year-old test account that watched only Instagram-recommended videos featuring women began being served video after video about anal sex.

Meta’s Response

The Journal’s tests revealed that “Instagram regularly recommends sexual videos to accounts for teenagers that appear interested in racy content, and does so within minutes of when they first log in.”

Meta insists it’s a fluke.

Spokesperson Andy Stone dismissed the tests as “an artificial experiment that doesn’t match the reality of how teens use Instagram.” He chalked up any recommendations of disturbing videos to “an error.”

The Bigger Picture

Leaked documents from inside Meta, as well as the testimonies of current and former staffers, suggest Meta knows more about the faulty content restrictions than Stone lets on.

One former employee told the Journal that Instagram ran similar tests itself in 2021 — with similar results. A subsequent analysis in 2022 found Instagram “was disproportionately likely to serve children content that violated platform rules.”

The 2022 analysis revealed that, compared to users aged 30 and above, teenagers were fed:

3 times as many posts with nudity;
1.7 times as many posts with violence;
4.1 times as much “bullying content.”

Senior staffers allegedly know this research exists — and what it implies. Of a meeting late last year, the Journal claims “top [Instagram] safety staffers discussed whether it would be sufficient to reduce the frequency of minors seeing prohibited content to the level adults see.”

The Solution

Not only does Instagram know about these problems, but all evidence suggests it can solve them. When researchers ran similar tests on Snapchat and TikTok, they found both platforms significantly less likely to recommend sexual content to minors.

“All three platforms say there are differences in what content will be recommended to teens,” Edelson told the Journal. “But even the adult experience on TikTok appears to have much less explicit content than the teen experience on Reels.”

One TikTok engineer claimed the difference comes down to the algorithms: the rival platform directs its content filters to “err on the side of caution,” withholding borderline content from teens rather than risking a sexual post slipping through.

Why It Matters

If Instagram can make changes to its algorithm to protect kids, why does it choose not to — and lie about it?

The answer comes down to cold, hard cash. An estimated 59% of children ages 13 to 17 use Instagram, according to a Pew Research Center study conducted last year. Making its algorithms less responsive for such a large chunk of its audience, blocking content rather than showing users what they want to see, would cut into Instagram’s bottom line.

Meta has no incentive to protect your children — and every incentive to addict them at any cost. Should you entrust your child’s safety to such an untrustworthy babysitter?

I certainly wouldn’t.
