
Meta Accused of Hiding Research Showing Social Media Hurts Mental Health

Why this matters more than ever for young adults

By Socialode

If you’ve ever taken a break from Instagram or Facebook and suddenly felt lighter, calmer, or a little more human again… you’re not alone. And now, a new lawsuit claims Meta knew exactly why that happens — and allegedly chose not to tell us.

A class-action lawsuit filed by multiple U.S. school districts is accusing Meta of hiding internal research suggesting its platforms may be harming users’ mental health, particularly teens and young adults. The case also includes Google, TikTok, and Snapchat, but Meta is at the center of this storm.

So what’s really happening? And why does it feel like we’re living through social media’s tobacco-industry moment?

Let’s break it down.

The Study Meta Didn’t Want You to See

The allegations revolve around a Meta research project called Project Mercury. This wasn’t some casual survey — Meta partnered with Nielsen to study what happens when people deactivate Facebook and Instagram for just one week.

The findings were eye-opening:

  • People felt less depressed
  • People reported lower anxiety
  • Participants experienced less loneliness
  • Social comparison, that constant “everyone’s life looks better than mine” feeling, went down noticeably

Imagine a major tech company discovering that its platforms are making people feel worse… and then quietly turning off the lab lights.

According to internal records cited in the lawsuit, that’s exactly what happened. The project wasn’t released. It wasn’t shared publicly. It was simply shut down.

Meta allegedly dismissed the findings as “tainted by media narratives,” but internal messages tell a different story. Multiple employees reportedly told leadership that the research was valid and clearly showed a causal link between social media use and negative mental health outcomes.

One staff member even compared hiding the results to “the tobacco industry doing research and knowing cigarettes were harmful, but keeping that info to themselves.”

That’s a heavy statement, especially coming from inside the company.

Meta’s Mixed Messages

Here’s where it gets even more complicated:

Around the same time this research was being quietly shelved, Meta told Congress that it couldn’t determine whether its platforms harmed teen girls.

According to the lawsuit, they already had evidence suggesting otherwise.

Meta spokesperson Andy Stone has disputed the accusations, saying the research methodology was flawed and that Meta has spent years improving safety tools and protections for teens. He claims the allegations rely on “cherry-picked quotes” and misinterpretations.

But the lawsuit points to more than just hidden research.

Beyond Mental Health: A Bigger Pattern?

The filing also accuses Meta and other platforms of:

  • Allowing, even indirectly encouraging, kids under 13 to use their apps
  • Failing to act strongly on harmful or abusive content
  • Increasing outreach to teens in school environments
  • Offering payments to youth-focused organizations to publicly support the company’s safety claims

If even a portion of these allegations hold up, it suggests tech companies may be prioritizing engagement over protection.

And for a generation that grew up online, that hits close to home.

Why This Story Matters for 18- to 35-Year-Olds

If you’re Gen Z or a younger millennial, social media is baked into your everyday life. It’s where you connect with people, showcase your work, build communities, date, discover opportunities, and express yourself.

But what happens when the very platforms we depend on are designed in a way that chips away at our emotional well-being, and the people building them know this?

This isn’t about telling people to quit social media. Realistically, most of us aren’t doing that, and we shouldn’t have to. Platforms should be safe by design, not safe only after something goes wrong.

This moment is about transparency.

It’s about accountability.

It's about deciding whether the tech giants shaping our digital lives should be held responsible when their products affect our mental health.

What Happens Next?

A hearing is scheduled for January 26th, where more details may become public. For now, Meta strongly denies all allegations.

But regardless of the outcome, the conversation has already shifted.

More and more people, especially younger users, are asking harder questions:

  • Is this platform helping me or hurting me?
  • What’s the emotional cost of staying connected 24/7?
  • What responsibility do these companies have to protect us?

And loud or not, those questions aren’t going away anytime soon.


About the Creator

Socialode

We are a mobile app team that has spent the past year building a platform that lets users connect with people while protecting their privacy. Our goal is to fix the world of social media.

www.socialode.com
