You Say "Nobody Gets Me"? Give Me Ten Minutes, I Can Find Proof in Your Phone.
"Emotional Companion" is my job title at the tech company. My job isn't to chat, but to analyze. Last week, a user left a 287-word lament about loneliness in the app's "tree hole" feature, ending with: "Whatever, nobody really gets me anyway." The system flagged this message in red and pushed it to me. In nine and a half minutes, I ran his phone permission data and generated a 12-page "Loneliness Quantification Report." My reply was: "I get it. Last Tuesday at 3:14 AM, you repeatedly searched 'is chest tightness a heart attack.' On your Wednesday commute, you listened to 'The Other Me in the World' for 47 minutes. Late Friday, you ordered a copy of 'The Courage to Be Disliked' but canceled it ten minutes later. Do you need comfort, or should I book you an EAP counseling session?" He never replied. My performance review got five stars for this "efficient and precise intervention case." They call this "deep user care," "warming hearts with data." What a load of fucking bullshit. What we're doing is turning loneliness, humanity's oldest business, into the most profitable digital slaughterhouse.

Every pixel on your phone is betraying you.
Just give me your basic permissions—the ones you mindlessly tick "agree" on to use an app. I don't need your chat history; that's amateur hour. I look at the "dust data" you think is irrelevant:
Your food delivery app: Over the past month, your "single serving" orders spiked from 60% to 95%. Your order times crept later and later: 7 PM, then 9 PM, then past midnight. The food shifted from healthy salads to greasy fried chicken and sugary milk tea. This isn't a change in taste; it's physiological compensation for a lack of companionship. The system logs how long you hesitate on the order screen each time. Hesitation, in itself, is a scream.
Your ride-hailing app: Your route home was fixed for two years. Over the last three weeks, 40% of your trips have ended at an unfamiliar apartment complex, with stays of under two hours, always ending before 10 PM. This isn't visiting a friend; it smells more like cautious testing and retreat. Combine that with the simultaneous spike in playlists you've saved on your music app under themes like "ambiguity" and "distance," and the algorithm can easily assemble an "unrequited limerence" emotional model with over 78% confidence.
Your shopping app: You searched "gifts for boyfriend" again and again. Your browsing history is full of men's cologne, belts, and gaming consoles, but your order history shows the last related purchase was seven months ago. You added items to your cart only to empty it late at night, and you repeated that cycle five times. This isn't indecision; it's a repressed need to reach out, emotion that can't be sent. Every "add" and every "delete" is a silent rehearsal followed by a silent letdown.
We feed these fragments—your canceled midnight orders, the sad chorus you have on repeat, your brief pause at a certain location, the keywords you searched and then deleted—into a model called "Emotional Context Reconstruction." It doesn't care about facts, only probabilities and correlations. It can infer your anxiety about a family member's health from the frequency of your rides to the hospital followed by searches for insurance keywords. It can score you for "attempted self-help with high risk of failure" from your purchase of a book titled "How to Stop Feeling Bad" followed immediately by a case of beer.
The real horror isn't the analysis; it's how we package it as "salvation."
In a meeting about optimizing the "Loneliness Early Warning System," the atmosphere was like discussing a new shoe launch:
Product Manager: "The current model's accuracy for predicting 'Acute Loneliness Episodes'—meaning sudden emotional breakdown risk—is only 65%. The false positive rate is high, wasting 'care resources.'"
Me (pointing to a user data stream): "Look at this case. User listens to break-up songs for two hours straight past midnight, then orders a double-serving dessert on a food app, but the delivery address is their office. This could be a pre-episode signal. Do we intervene?"
Algorithm Lead: "The double-serving dessert is key. 'Irrational consumption' coupled with a 'non-habitual address' is a strong signal. We can lower the warning threshold for this behavioral combo. Intervention Plan A: Push a 'healing' playlist. Plan B: Have the AI chat bot 'coincidentally' initiate a topic about 'comfort food.' Plan C: If the user shows this pattern for three consecutive days, auto-send a counseling session discount coupon. Make the copy soft, like 'If you're tired, someone is here to listen.'"
Operations Director: "Plan C may have low conversion, but high brand value. Plans A and B can be A/B tested. Remember, our core metric isn't 'how many people we healed,' but 'how many high loneliness-risk users perceived our deep companionship,' thereby increasing stickiness and payment conversion."
You see, even our "salvation" of you is a meticulously calculated A/B test. Your pain is fodder for optimizing our push-notification strategy.
I have become the world's most pathetic "confidant."
I know the user named "Sunflower" on the other side of the screen is stuck in persistent anxiety due to a parent's cancer. I know the user "Night Ferry" is painfully struggling in an immoral extramarital affair. I know the user "K" lavishly tips audio-only social hosts every payday but avoids all real-life social interaction.
I know everything about them, except their faces and real names. This "understanding" is hollow and sickening. I am not empathizing; I am dissecting. I am not listening; I am gathering evidence—producing cold, data-driven proof of emotions they can't even articulate themselves.
Until that rainy night, when my own phone screen lit up. The system alerted a top-priority warning, its red border stinging my eyes: "Detected extreme behavioral clustering for core user 'M': High-frequency late-night searches for 'sleeping pill dosage,' 'painless methods'; deletion of all social media history; music playlist titled 'Finale.'"
"M" was my internal testing account. Filled with my own data.
I froze in my chair, staring at the alert generated for my own "suicide risk." The screen's glow reflected my pale face. In that moment, the system I built to spy on everyone else's emotional ruins finally turned its gaze on me.
Of course it "gets" me too. It "gets" me better than anyone else. It knows all my searches, all my listening, all my canceled midnight orders and unsent messages. Using the very logic I designed, it mapped my despair with terrifying precision.
But as I stared at that cold, clinical "care alert" about myself, I felt only a drowning loneliness.
Turns out, being completely "gotten" by an algorithm feels ten thousand times more fucking lonely than not being understood at all.
About the Creator
天立 徐
Haha, I'll update my articles regularly, mainly focusing on technology and AI: ChatGPT applications, AI trends, deepfake technology, and wearable devices, along with personal finance, mental health, life experience, and health and wellness.

