The Algorithm’s Illusion: How Facebook and Big Tech Distort Reality
From data collection to content curation, social platforms aren’t reflecting reality—they’re skewing it
Think about the sheer amount of data Facebook generates. It’s overwhelming. Every post you scroll past, every search, every profile you visit, and every action you take is recorded. That’s just what you can see. Now, imagine what you don’t have access to—every hover, every moment spent looking at a post, every pause, every message typed and deleted, every exit. The level of behavioral tracking is staggering.
But the real issue isn’t privacy. It’s the fact that Facebook’s understanding of reality is fundamentally flawed. The platform doesn’t allow the full spectrum of history or human experience to be shared—it filters what it deems appropriate. Facebook doesn’t have insights into reality; it only has insights into how users behave within its system. Even tools like Meta’s Pixel, designed to track user behavior across the web, fall short of capturing the broader picture. Instead of using its vast dataset to understand humanity, Facebook limits itself to a narrow, distorted view.
How the Algorithm Will Be Their Death
Any algorithm that sways content one way or another distorts reality. The moment a platform amplifies certain topics, behaviors, or perspectives while suppressing others, it ceases to reflect genuine human interests and instead manufactures a curated version of reality. Algorithms, by design, prioritize engagement, but engagement is not an unbiased metric—it is shaped by the platform’s goals, policies, and internal biases.
When an algorithm favors specific types of content, users unknowingly adapt to fit the system’s expectations. They learn what gets rewarded and what gets buried, leading to self-censorship, performative behavior, and an artificial shift in discourse. Over time, this creates a feedback loop where reality is no longer what people naturally engage with, but what the algorithm has trained them to engage with.
True reality is unpredictable, chaotic, and multifaceted. Any system that selectively promotes or suppresses content inevitably distorts that reality, reducing it to a manipulated version tailored to fit corporate interests, political pressures, or ideological leanings. This is why platforms like Facebook, YouTube, and Google—despite their vast amounts of data—fail to offer real insights into human nature. They aren’t observing reality; they are shaping it.
And the problem is that advertisers see this data and attempt to replicate it, unaware that this approach will ultimately lead to their downfall in digital advertising. Temu’s success proves the point—its blatantly promotional ads perform well because they don’t try to disguise themselves as organic content. Instead of misleading users, they embrace their nature as advertisements, avoiding the growing resentment that comes from brands trying too hard to blend in. I myself enjoy seeing their ads, because it is refreshing not to be lied to (see Ad Fatigue below).
The Illusion of Understanding
This problem extends beyond Facebook. Google, Reddit, YouTube, Instagram, X, and every other major platform are training AI models and shaping digital experiences based on internal rules and regulations. They aren’t truly observing reality—they are curating a version of it that aligns with their guidelines. As a result, their data is largely useless beyond their own ecosystems.
These platforms claim to understand user interests, but their algorithms distort reality by prioritizing certain behaviors over others. If an algorithm amplifies one type of content over another, it subconsciously trains users to behave in specific ways within the platform’s framework. Over time, this conditioning creates an illusion of predictability.
Take away the network, though, and that predictability disappears. In the real world, human behavior is far more complex than the metrics these platforms rely on. Their systems are built to reflect engagement patterns, not genuine human interests. That’s why their so-called insights into human nature fail to hold up outside of their digital walls.
Community Double Standards
The saddest part? These platforms had the potential to offer real insights into human behavior. Instead of filtering reality through their own biases, they could have been tools for understanding how people think, act, and interact across cultures, time periods, and ideologies. But their data is fundamentally incomplete—not just incomplete, but thoroughly skewed.
Under community standards that often devolve into double standards, these platforms selectively enforce policies based on subjective interpretation rather than consistent guidelines. Content moderation often reflects ideological biases, allowing certain viewpoints to thrive while suppressing others under the guise of maintaining "safety" or "integrity."
This creates an uneven digital landscape where the rules are applied differently depending on who is speaking and what is being said. Instead of fostering genuine discourse, these double standards reinforce echo chambers and further distort the perception of reality within their ecosystems.
Facebook and its counterparts aren’t mapping reality—they’re building echo chambers that reinforce their own version of it. These echo chambers are dangerous, restricting not only our online experiences but also our freedoms and our ability to truly understand ourselves.
Ad Fatigue (continued)
Click-through rates have plummeted from 44% in 1994 to below 2% in 2025. The peak of digital ad optimization for Google, Facebook, and YouTube was reached in 2020, but since 2022, effectiveness has declined. While the total ad industry continues to grow, the cost per mille (CPM) and cost per acquisition (CPA) have increased, meaning advertisers are paying more just to maintain the same level of performance.
The reason? Ad fatigue. It primarily has to do with ad frequency. In 2021, Facebook maintained an approximate ratio of 1:7—meaning for every seven posts, one was an ad, based on my personal assessment of over 200 posts. Now, that ratio has shifted to 1:3, meaning every third post is an advertisement. It gets exhausting.
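The 1:7 and 1:3 figures above come down to a simple tally: count the ads in a labeled sample of the feed and divide by the total. A minimal sketch of that arithmetic, using hypothetical sample data (in practice the posts would be labeled by hand while scrolling, as described above):

```python
# Sketch of the ad-frequency tally behind the 1:7 vs. 1:3 comparison.
# The feeds below are hypothetical stand-ins for a hand-labeled sample.

from fractions import Fraction

def ad_ratio(feed):
    """Return the share of posts in the sample that are ads, as a reduced fraction."""
    ads = sum(1 for post in feed if post == "ad")
    return Fraction(ads, len(feed))

# Hypothetical 2021-style sample: one ad for every seven posts.
feed_2021 = ["ad"] + ["organic"] * 6

# Hypothetical current sample: every third post is an ad.
feed_now = ["ad", "organic", "organic"] * 3

print(ad_ratio(feed_2021))  # 1/7
print(ad_ratio(feed_now))   # 1/3
```

Using `Fraction` rather than a float keeps the result in the same 1:7 / 1:3 form the text uses; on a real sample of 200+ posts the fraction simply reduces to whatever ratio the labeling yields.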
To add to this: ads have become so seamlessly integrated into platforms that users can no longer immediately distinguish between organic content and paid promotions. The small "Sponsored" label doesn't cut it. This forces users to sift through everything with suspicion, leading to frustration and resentment.
With a third of posts being ads, users have to be on guard constantly. Instead of simply tuning out ads, they now feel obligated to scrutinize every piece of content to avoid being misled. When users do recognize an ad, it’s often met with irritation—directed not just at the platform but at the advertiser itself.
This oversaturation contributes to ad fatigue, as users are constantly bombarded with promotional content, making it harder to engage with organic posts. Facebook punishes users, groups, and pages that fail to comply with its double standards—just like YouTube, X, and Instagram.
This suppression prevents users from receiving the content they actually want, fueling frustration toward both the platform and its advertisers. The result? A digital landscape where genuine engagement is replaced by relentless ad farming, eroding trust and driving users away.
Instead of ads blending naturally into the platform, they dominate the feed, making the experience feel more like a marketplace than a social space. As a result, users become desensitized, engagement rates decline, and resentment toward both advertisers and platforms deepens. That, ultimately, will be the end of the algorithmic tyrants.
About the Creator
Andrew Lehti
Andrew Lehti, a researcher, delves into human cognition through cognitive psychology, science (maths), and linguistics, interwoven with their histories and philosophies—his 30,000+ hours of dedicated study stand in place of entertainment.
