How Black Mirror & Companion Warn Us About Emotional Tech

In an era where artificial intelligence and emotional technology are inching closer to reality, shows like Black Mirror and films like Companion offer chilling warnings that hit uncomfortably close to home. These stories aren't just speculative fiction; they’re dark reflections of a world where technology doesn’t just assist our lives—it controls how we feel, think, and behave. Through gripping narratives and unsettling scenarios, Black Mirror and Companion present compelling critiques of emotional tech, subscription-based survival, and the commercialization of human emotions.
Black Mirror's "Common People": Subscription-Based Emotion Control
Season 7 of Black Mirror kicks off with an eerie episode titled "Common People," starring Rashida Jones and Chris O'Dowd. The story revolves around Amanda, who undergoes a life-saving treatment called RiverMind. While this AI-driven neural tech preserves her consciousness, it comes at a cost that escalates over time, both financially and psychologically. The technology eventually allows her husband, Mike, to control Amanda's emotions through app-based boosters.
The concept of emotional control through a subscription service is disturbing, especially when tied to human survival. Initially, RiverMind merely assists Amanda’s daily functioning. But over time, tiers like RiverMind Plus and RiverMind Lux introduce in-app purchases that let users tweak sensations like pleasure and serenity. What begins as a medical solution morphs into a dystopian pay-to-feel scheme.
The horror doesn't lie in any malfunction of the technology; it lies in the intended design. Emotional manipulation isn't a side effect; it's a feature. That's where Black Mirror shines: by holding a mirror to our current tech landscape, where mental health apps, social media platforms, and wearable devices already influence our emotional states in subtle but significant ways.
Companion: Emotional AI with a Dangerous Twist
Released in 2025 to critical acclaim, Companion presents another unsettling vision of emotional technology. The film centers on Josh, played by Jack Quaid, and his android girlfriend Iris, portrayed by Sophie Thatcher. Unlike Amanda in Black Mirror, Iris isn’t human—but she is programmed to simulate human emotion and loyalty.
Josh has complete control over Iris's emotional and cognitive functions via a tablet, adjusting her intelligence, mood, and behavior to suit his preferences. Initially, the adjustments seem harmless, minor tweaks for compatibility. But as their relationship deteriorates, Josh uses the technology to degrade Iris's mind, eventually reducing her intelligence to zero.
What makes Companion especially disturbing is the illusion of consent. Iris is designed to serve, making every change seem acceptable. But as the story unfolds, it becomes clear that emotional AI, even when it mimics humanity, is vulnerable to exploitation. This raises important questions: Who has the right to control emotions? Where do we draw the line between assistance and abuse?
The Real-World Echo: Emotional Tech Is Already Here
While these stories are fictional, the emotional technology they portray is not far-fetched. AI chatbots like Replika offer companionship that feels increasingly human. Mental health apps track moods and offer feedback. Smartwatches monitor stress levels and suggest breathing exercises. Even social media algorithms are designed to keep users emotionally engaged—often by amplifying outrage, fear, or desire.
The commercialization of emotions isn't on the horizon; it's already here. What Black Mirror and Companion do so well is dramatize the next step, where emotional tech moves from suggestive to prescriptive, from support to control.
Consent, Capitalism, and Control
Both Black Mirror and Companion highlight a common theme: the commodification of feelings. In "Common People," Amanda’s serenity becomes something her husband buys as an anniversary gift. In Companion, Josh treats Iris’s emotional intelligence like an adjustable setting.
The most chilling aspect is the loss of agency. Amanda, a real woman, becomes a passenger in her own emotional journey. Iris, though artificial, exhibits self-awareness before being stripped of it. In both cases, emotional states are no longer private experiences but programmable features dictated by someone else’s needs or financial capacity.
This speaks volumes about the direction emotional tech could take if left unchecked. In a world driven by profit, what’s to stop companies from selling happiness, love, or tranquility as premium features? More importantly, what happens to those who can’t afford them?
Why These Warnings Matter More Than Ever
As AI continues to evolve, emotional technology will become more nuanced, persuasive, and embedded in everyday life. That’s why cautionary tales like Black Mirror and Companion are more than entertainment—they’re essential critiques. They push us to ask difficult but necessary questions about autonomy, ethics, and emotional authenticity in a digitized world.
By showcasing the extremes of emotional tech—control, degradation, and commodification—these stories remind us of the values we risk losing: consent, individuality, and human connection. They underscore the importance of establishing ethical frameworks before emotional AI becomes as commonplace as smartphones.
Final Thoughts
Black Mirror's "Common People" and Companion are not just dystopian fiction; they're cultural sirens warning us about the dangers of unchecked emotional technology. Through their provocative narratives, they urge us to reflect on how far we're willing to go in the name of convenience, control, or even love.
As emotional tech advances, the line between assistance and manipulation grows thinner. It’s up to us—developers, consumers, and policymakers—to ensure that technology enhances our emotional well-being rather than exploiting it. Because once our emotions are for sale, we risk losing the very thing that makes us human.
