The Panopticon Classroom Pt3

The Watchers

By MJ Carson · Published 9 months ago · 3 min read
The surveillance state didn’t start in prisons or airports. It started in classrooms. (AI-generated image)

Part 3 exposes the corporate and political machine behind school surveillance: the profits, the lawsuits, the lobbying, the predictive tech, and the long-term risks. This is the series climax, where the mask comes off.

The Watchers

It was never just about safety. It was about control, and the money that came with it.

“I was told I’d be reviewing threats,” said one former Gaggle contractor. “But mostly, I was reading kids’ diary entries. It felt like spying on therapy.”

Securly was caught selling student location data to advertisers. GoGuardian became a billion-dollar unicorn. Gaggle pulled in millions scanning children's thoughts. Behind every flag, every alert, every traumatic “safety” intervention was someone quietly cashing a check.

By the time parents started asking questions, the system had already scaled. The surveillance wasn’t a glitch in the education system; it was a feature of a new industry.

The Billion-Dollar Eyes

GoGuardian is now valued at over $1 billion. Securly and Lightspeed Systems are backed by private equity giants. Bark runs a "freemium" model, which means the product isn’t the software. It’s the data.

These companies make their money on mass implementation. The more accounts, the more scans, the more alerts, the more justification for expanding contracts.

During COVID, billions in relief funding were funneled into “remote learning tools.” Much of it ended up in surveillance tech. These weren’t upgrades. They were investments in compliance.

In multiple states, GoGuardian’s contracts were auto-renewed without public review. One school board member admitted, “We didn’t even know we were still paying them.”

Profits Over Privacy

Many of the contracts were signed under emergency waivers, with no bidding, no public review, no opt-outs.

A handful of companies cornered the market, using fear-based marketing to push schools into buying:

  • “Do you want to be the district that missed the next shooter?”
  • “We flagged 120,000 potential suicide risks last year.”
  • “Your insurance premiums could drop with our monitoring tools.”

It worked. The tools spread. And the students never had a say.

Legal Landmines and Loopholes

The federal Family Educational Rights and Privacy Act (FERPA) and the Children’s Online Privacy Protection Act (COPPA) were written long before real-time AI monitoring was possible.

Companies exploit vague wording. Schools claim they’re "internal tools," not third-party data sharing. Parents who file FOIA requests often get stonewalled or told the tech is proprietary.

Lawsuits are building:

  • Securly faces accusations of selling student browsing habits and GPS pings to marketing firms.
  • Whistleblowers allege GoGuardian’s data retention violates state privacy laws.
  • Lightspeed's emotion detection is under scrutiny for flagging Black students at higher rates.
Predictive Punishment

The new frontier isn’t just surveillance. It’s prediction.

Some systems now assign threat scores. Others analyze tone, pacing, or facial expression to detect potential aggression. These risk profiles can follow a student for years, invisible black marks no one can contest or correct.

One student was flagged for violent language in a history assignment that quoted a speech. Another was flagged for “depression indicators” based on passive voice in an essay.

One proposed system in development would assign each student a "resilience index," essentially a mental health credit score, used to determine whether they should receive extra counselor attention or be placed on a behavioral watchlist.

These aren’t outliers. They’re the future of educational AI.

Culture of Obedience

What happens when kids grow up knowing everything they say is monitored? When teachers avoid controversy to protect their jobs? When school feels less like a place to learn and more like a training ground for corporate surveillance?

The ed-tech firms know. One Gaggle executive famously said:

“We’re preparing kids to be watched by employers.”

They’re not hiding it. They’re selling it.

Cracks in the System

Parents are pushing back. The ACLU and the Center for Democracy and Technology have filed challenges. State lawmakers in California and Illinois are proposing legislation to ban certain surveillance practices.

Students are fighting back too, publishing articles, organizing, refusing to log in on school platforms.

But the machine is big. And the money is bigger.

Final Thought

They said it was to save lives. But it was always about control, and someone got rich off every flagged file.

They built a machine to measure trauma and gave bonuses when it spiked.

This ends the series. But for the millions of kids still being watched, it never stopped. The question isn’t whether we trust the system. It’s whether we ever truly understood who built it, and why.

About the Creator

MJ Carson

Midwest-based writer rebuilding after a platform wipe. I cover internet trends, creator culture, and the digital noise that actually matters. This is Plugged In—where the signal cuts through the static.