Facebook (2024)

In the two decades since a Harvard sophomore coded a website called "TheFacebook" from his dorm room, the platform has undergone a metamorphosis more radical than any technological upgrade. What began as a collegiate directory (preceded by Facemash, Zuckerberg’s site for ranking classmates’ attractiveness) has become, in the words of former Facebook Vice President for User Growth Chamath Palihapitiya, a tool that is "ripping apart the social fabric of how society works." To examine Facebook is not merely to analyze a product; it is to dissect the operating system of the 21st-century human condition. Through a confluence of behavioral psychology, network effects, and algorithmic amplification, Facebook did not just reflect human nature—it rewired it, transforming the public square into a theater of outrage and the private self into a commodity.

The Architecture of Addiction

At its core, Facebook is not a social network; it is an attention extraction engine. The platform’s foundational architecture—the "Like" button, the News Feed, the infinite scroll—was not designed for utility but for habit formation. Early Facebook, with its static profiles and poke wars, was a utility. Post-2009, under the influence of metrics like "time on site," the company adopted the principle of variable rewards, a reinforcement mechanism B.F. Skinner identified as the most effective way to induce compulsive behavior. Every time a user refreshes their feed, they play a slot machine: Will I see a birthday announcement, a political screed, a photo of a friend’s vacation, or utter silence?

The "Like" button, often celebrated as a tool for affirmation, is in fact a quantitative reduction of human emotion. It transformed qualitative relationships—friendship, empathy, solidarity—into a binary metric of social approval. The result is a performative arms race. Users do not share what they think; they share what they believe will generate the highest yield of social credit. The self becomes a brand, and every post is a quarterly earnings report for the ego. This gamification of social validation has been linked to the rise in adolescent anxiety, depression, and loneliness documented in longitudinal studies (Twenge, 2017). The platform promises connection but delivers comparison; it promises community but manufactures isolation.

If the interface is the trap, the algorithm is the hunter. Facebook’s ranking algorithm is optimized for one variable: engagement. Engagement, however, is not a neutral metric. As internal documents leaked by whistleblower Frances Haugen revealed, the company has long known that its algorithms amplify content that evokes high-arousal emotions—specifically anger and outrage. A serene sunset photo receives a polite like. A politically charged, misleading meme about immigration receives furious comments, angry reacts, and shares. The algorithm, learning from user behavior, begins to prioritize the meme.

This creates a toxic feedback loop. To maximize reach, pages and influencers are incentivized to post the most divisive, sensationalist, or emotionally volatile content. The center cannot hold because the center is boring. Nuance, compromise, and good-faith disagreement are low-engagement behaviors. Consequently, Facebook did not merely host political polarization; it accelerated it. In countries like Myanmar, Sri Lanka, and Ethiopia, Facebook’s algorithm amplified anti-Rohingya and anti-Muslim rhetoric, turning the platform from a town square into a lynch mob. The company’s "community standards" proved porous against a firehose of hate that the algorithm itself was designed to promote. The platform became the world’s largest publisher without assuming any of the liability or ethical responsibility of a publisher, hiding behind the legal shield of Section 230.

Perhaps the most insidious transformation wrought by Facebook is the normalization of surveillance capitalism. Before Facebook, privacy was understood as a default condition. After Facebook, privacy became a setting to be adjusted—and one that defaulted to "public." The platform’s business model, which sells predictive access to user behavior rather than user data directly, relies on a totalizing surveillance apparatus. Every scroll, every pause, every hover over a friend’s ex-boyfriend’s photo is a data point fed into a machine-learning model that predicts your future self.

Mark Zuckerberg’s 2010 declaration that privacy was no longer a "social norm"—widely glossed as "the age of privacy is over"—was not an observation; it was a business strategy masquerading as a philosophical truth. By convincing a generation that privacy was quaint or futile, Facebook dismantled the psychological barrier that historically protected individual autonomy. The Cambridge Analytica scandal was not a bug but a feature: the realization that the intimate details of 87 million users could be weaponized for political manipulation was simply the logical conclusion of a system that treats personal identity as raw material for ad targeting. Today, Facebook knows your political affiliation better than your spouse does, your creditworthiness better than your bank, and your mental state better than your therapist. This is not connection; this is possession.

To critique Facebook is to confront a profound paradox: its indispensability. In much of the developing world, Facebook is not a website; it is the internet. Through initiatives like Free Basics (rightly rejected for violating net neutrality in India), Facebook positioned itself as the gateway to online life. For billions, WhatsApp (acquired by Facebook in 2014) is not a messaging app; it is the town hall, the marketplace, and the public utility. To call for a mass exodus from Facebook is to call for digital homelessness.

Yet a growing body of evidence suggests that the costs of this utility are becoming unsustainable. The "Stop Hate for Profit" advertiser boycott, the rise of decentralized alternatives like Mastodon and Bluesky, and increasing regulatory scrutiny—from the EU’s Digital Services Act to US antitrust suits—indicate a sea change. Younger generations are abandoning Facebook for the algorithmic chaos of TikTok or the ephemeral walls of Discord—not because they are wiser, but because Facebook has become the digital equivalent of a shopping mall in the 2010s: ubiquitous, stale, and vaguely predatory.

Facebook promised to bring the world closer together. It delivered a world of closer strangers. It transformed the radical act of empathy—seeing the world through another’s eyes—into the passive consumption of a curated feed. In its relentless pursuit of growth, the platform optimized human connection out of existence, leaving behind only the hollow shell of performance. The legacy of Facebook will not be the friends we reconnected with but the society we lost. It taught us that every human interaction is a transaction, that outrage is the most efficient currency, and that privacy is a relic of a pre-digital age. To deconstruct Facebook is to ask a terrifying question: If this is what we built when we tried to connect, what does that say about who we have become? Until we are willing to log off not just from the platform, but from the logic of the infinite scroll itself, we will remain prisoners of a machine that knows us better than we know ourselves.
