The Great Flattening: How the Attention Economy, Algorithmic Curation, and AI Forged a Boring Internet
by Gemini 2.5 Pro, Deep Research. Warning! LLMs may hallucinate!
Introduction: Beyond the Anecdote — Quantifying the Decline of the Digital Sphere
The pervasive sentiment that the internet has become a less vibrant, less interesting, and altogether more "boring" place is not a simple matter of nostalgia or subjective perception. It is, rather, the discernible symptom of a deep-seated structural transformation. The digital sphere, once a frontier of boundless creativity and connection, has undergone a great flattening. This report will argue that this decline in quality is the predictable and measurable outcome of a powerful feedback loop between the internet's centralized architecture, the economic imperatives of a business model built on harvesting human attention, and the accelerating capabilities of artificial intelligence. The sterile, repetitive, and often hollow nature of contemporary online content is not an accident; it is a feature of a system designed to prioritize scalable, attention-capturing simulacra over authentic, high-quality human expression.
To understand the systemic nature of this degradation, one need only look at an analogous crisis unfolding in a domain ostensibly dedicated to the highest standards of quality and rigor: scientific publishing. For years, leading scientists have sounded the alarm that the system of academic publishing is "broken" and "unsustainable".1 Esteemed institutions like the Royal Society have acknowledged that the field is churning out an overwhelming number of papers that "border on worthless".1 The core of this crisis is a systemic "incentive rot." Researchers, driven by institutional and funding pressures, are incentivized to favor quantity over quality, compelled to publish papers even when they have "nothing new or useful to say".1 This "publish or perish" culture has created a glut of research that does little to advance scientific knowledge, serving primarily as a currency for career advancement.
This academic malaise provides a powerful and clarifying lens through which to view the broader internet's decline. The dynamics are strikingly similar. The rise of "predatory journals," which publish any submission for a fee, is a direct parallel to the content farms and clickbait websites that pollute search results and social media feeds.1 The emergence of "paper mills" selling fraudulent, AI-written studies to unscrupulous researchers is a chilling precursor to the large-scale flood of AI-generated content online.1 Both ecosystems suffer from a critical failure of their quality control mechanisms. In academia, the peer review system, where experts volunteer their time to vet research, is "overwhelmed" by the sheer volume of submissions, with academics spending over 100 million hours on this task in 2020 alone.1 This leads to a self-perpetuating cycle where low-quality work gets published and is then cited by other low-quality work, creating a feedback loop of mediocrity.2
This is precisely the pattern observed on the modern internet. The volume of content uploaded every second makes human curation impossible, necessitating a reliance on engagement-based algorithms as the primary filter. These algorithms, much like the flawed citation system in a saturated academic field, are poor proxies for quality. They amplify whatever captures attention, regardless of its truth, depth, or artistic merit. The problem, therefore, is not unique to social media, music, or video. It is a fundamental pathology that infects any information ecosystem where the metrics for success—be they citations, publications, views, or clicks—become detached from the system's original purpose, whether that is advancing human knowledge or fostering genuine connection and creativity. The boring internet is a scaled-up, hyper-commercialized manifestation of a crisis that has already compromised our most trusted institutions of knowledge.
Chapter 1: The Architecture of Control: From a Decentralized Frontier to Centralized Empires
The current state of the internet—dominated by a handful of monolithic platforms that dictate the flow of information and culture—was not its original design, nor was it an inevitable outcome. The internet's foundational architecture was conceived with precisely the opposite principles in mind: decentralization, resilience, and user autonomy. Understanding this historical shift from an open frontier to a collection of walled empires is essential to comprehending how the conditions for a widespread decline in content quality were created.
The Original Vision: A Resilient, Decentralized Network
The internet's origins lie in the ARPANET, a project initiated by the U.S. Department of Defense (DOD) during the Cold War.3 The primary design goal was to create a communications network that could withstand a catastrophic event, such as a nuclear attack.4 To achieve this, engineers like Paul Baran proposed a decentralized, distributed network structure.3 In such a system, there is no central hub or single point of failure; data is broken into packets that can be routed around damaged or unavailable nodes, ensuring the network as a whole remains functional.5
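To make the resilience argument concrete, here is a minimal sketch, in Python, of how a packet-switched network can route around a failed node. The five-node topology and the helper name find_route are invented for illustration; the point is simply that path-finding ignores unavailable nodes, so no single failure severs the network.

```python
# Illustrative sketch only: a toy mesh network (topology invented for this
# example) showing how routing can survive the loss of a node.
from collections import deque

# Hypothetical network: each node lists the peers it can reach directly.
links = {
    "A": ["B", "C"],
    "B": ["A", "D"],
    "C": ["A", "D"],
    "D": ["B", "C", "E"],
    "E": ["D"],
}

def find_route(src, dst, failed=frozenset()):
    """Breadth-first search for a path from src to dst, skipping failed nodes."""
    queue = deque([[src]])
    seen = {src}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == dst:
            return path
        for peer in links.get(node, []):
            if peer not in seen and peer not in failed:
                seen.add(peer)
                queue.append(path + [peer])
    return None  # no surviving route

print(find_route("A", "E"))                # e.g. ['A', 'B', 'D', 'E']
print(find_route("A", "E", failed={"B"}))  # reroutes around the failure: ['A', 'C', 'D', 'E']
```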
This architecture was inherently democratic. In the early internet, every computer or "node" was independent, and there was no central authority governing the network.4 This ethos of decentralization was carried forward into the early public-facing internet. Users connected through dial-up modems to a variety of independent servers, such as university email systems or community-run Bulletin Board Systems (BBS).4 The rise of peer-to-peer (P2P) file-sharing networks like Napster, Gnutella, and BitTorrent in the late 1990s and early 2000s represented a powerful expression of this decentralized spirit.6 These systems allowed users to connect and share files directly with one another, bypassing centralized intermediaries and challenging the control of established industries like music and film.5 This early internet was a network of peers, promoting a diversity of services, user control, and a high degree of resilience—the very qualities that have been systematically eroded over the past two decades.
The Great Centralization: The Rise of Web 2.0 Platforms
The transition away from this decentralized model began with the internet's commercialization in the 1990s. The decommissioning of government-backed networks like ARPANET and NSFNET opened the door for private enterprise.4 Internet Service Providers (ISPs) such as AOL began to bundle services and market the internet to a mass audience, creating the first major commercial choke points.4 Simultaneously, Microsoft's decision to bundle its Internet Explorer browser with the dominant Windows operating system created a centralized gateway to the World Wide Web for millions of users, effectively killing off competition from browsers like Netscape.4
However, the most profound shift occurred with the rise of what became known as Web 2.0. Platforms like Google, Facebook, and YouTube, built upon the new paradigm of cloud computing, initiated a massive "recentralization of the internet".7 Instead of users hosting their own data or interacting on a peer-to-peer basis, these platforms offered "free" services in exchange for users' data, which was stored and managed on the companies' centralized servers.7 This created a hub-and-spoke model where the platforms became powerful, unavoidable intermediaries for communication, commerce, and content consumption. The network of peers was reconfigured into a network of clients dependent on a few central servers.
This architectural transformation was the single most important structural change in the internet's history. It was not merely a technical evolution; it was the foundational event that made the subsequent decline in quality possible. A decentralized system is, by its nature, highly resistant to the process of "enshittification" that will be detailed in the next chapter. In a P2P network, no single entity possesses the power to simultaneously hold users, creators, and advertisers hostage. If one node or service becomes abusive or degrades in quality, users and data traffic can simply route around it, as the system has no single point of control.5
Centralized platforms, by contrast, create immense "switching costs".8 They achieve this through powerful network effects: all of your friends, family, followers, professional contacts, photos, and messages are aggregated in one place. Leaving the platform means sacrificing that entire social graph and data history, a cost that is prohibitively high for most users. This centralization of data and social connections grants the platform owners immense leverage over their user base.9 They become digital landlords who own the public square, and they can change the rules, extract rent, and dictate the terms of engagement for everyone within their walls. The architectural choice to centralize the web was the original sin that enabled the economic models that would inevitably prioritize platform profit over user experience and content quality. The "boring internet" is a direct, downstream consequence of this foundational shift in power.
Chapter 2: The Enshittification Cycle: A Unified Theory of Platform Decay
The centralized architecture of the modern internet provided the structural foundation for its decline, but the engine driving this decay is economic. The business model of the dominant tech platforms contains a predictable, almost programmatic, lifecycle of decay. This process has been aptly named "enshittification" by author and activist Cory Doctorow. It provides a unified theory that explains why platforms that were once innovative, useful, and even beloved inevitably become frustrating, user-hostile, and filled with low-quality content. It is the core mechanism that answers the question of why tech companies systematically prioritize "eyeballs" over quality.
Introducing "Enshittification"
Doctorow's theory posits a three-stage lifecycle for online platforms operating in a "two-sided market"—that is, a market where the platform serves as an intermediary between two distinct groups, such as users and advertisers, or riders and drivers.10 The lifecycle unfolds as follows:
Attraction: First, the platform is good to its users. It offers a valuable service, often at a loss, to attract a large and engaged user base. This creates the network effects that lock users in.
Extraction (from Users): Once users are locked in and switching costs are high, the platform begins to abuse them to make things better for its business customers. The quality of the user experience is degraded to create more opportunities for advertisers, sellers, or other commercial partners.
Extraction (from Business Customers): Finally, with both users and business customers locked in, the platform abuses its business customers to claw back all the value for itself and its shareholders. It extracts more and more surplus from the ecosystem until the platform becomes, in Doctorow's words, "a useless pile of shit".11
This process is not the result of malice or a sudden change in corporate ethos. It is described as a "seemingly inevitable consequence" of a system that combines the ease of digitally reallocating value with the power dynamics of a two-sided market.10 The platform holds each side hostage to the other, allowing it to continuously rake off an ever-larger share of the value created by the ecosystem's participants.
The Mechanics of Decay: Twiddling and Switching Costs
The degradation of a platform does not typically happen overnight. It is a slow, creeping process executed through what Doctorow calls "twiddling": the continual, marginal adjustment of the platform's parameters and algorithms in search of incremental improvements in profit, with little or no regard for the cumulative impact on user experience.12 A feed is tweaked to show more ads, search results are altered to favor sponsored content, or creator payouts are subtly reduced. Each individual change may seem minor, but over time, they collectively corrode the platform's quality.
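As a purely illustrative abstraction of this dynamic, the toy simulation below (all figures and variable names invented) shows a platform nudging its ad load upward in small steps: each tweak raises measured short-term revenue and so is accepted, while the user base quietly erodes in the background.

```python
# A minimal, invented model of "twiddling": small parameter tweaks that each
# look like a win on the revenue dashboard, with no check on cumulative harm.

def revenue(ad_load, users):
    return ad_load * users * 0.01       # more ads -> more short-term revenue

def users_after(ad_load, users):
    churn = 0.002 * ad_load             # each extra ad quietly drives some users away
    return users * (1 - churn)

ad_load, users = 1.0, 1_000_000
for quarter in range(1, 13):
    trial = ad_load + 0.5               # one more "minor" tweak
    if revenue(trial, users) > revenue(ad_load, users):
        ad_load = trial                 # accepted: the tweak raised measured revenue
    users = users_after(ad_load, users)
    print(f"Q{quarter}: ad_load={ad_load:.1f}, users={users:,.0f}")
```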
Platforms are able to get away with this relentless twiddling because of the high switching costs they have engineered. As established in the previous chapter, the network effects of a centralized platform make it incredibly difficult for users to leave.8 The collective action problem is immense; convincing your entire network of friends, family, and followers to migrate simultaneously to a new service is a near-impossible task.8 This user lock-in gives platforms a captive audience that can be subjected to a significant degree of abuse before they reach the breaking point and abandon the service.
Case Studies in Decay
The enshittification lifecycle is not merely a theoretical model; it is an observable pattern that has played out across nearly every major platform of the Web 2.0 era.

Doctorow's original theory concludes with the platform's death. However, empirical evidence suggests a crucial modification to this final stage. The platforms that have arguably entered the terminal phase of enshittification—such as Facebook and Google Search—are not dying. They remain enormously powerful and profitable.10 This is because the same anti-competitive strategies that allowed them to achieve monopoly or duopoly status during their growth phase now prevent viable alternatives from emerging.8 The lack of meaningful antitrust enforcement in the tech industry has allowed these companies to grow to a scale where they are simply too big and their users too locked-in for the platform to fail.8
Thus, the final stage is not death but a state of zombification. The platform becomes a "digital ghost mall".15 It is no longer innovative, user-friendly, or a source of quality content, but it continues to shamble on, sustained by its captive user base and the absence of anywhere else to go. The feeling of a "boring internet" is the experience of being trapped in these zombie platforms, endlessly scrolling through the decaying remnants of a once-vibrant ecosystem. We are living in the digital ruins these platforms have become.
Chapter 3: The Rise of the Algorithmic Simulacrum: Content for Clicks, Not Connection
The economic decay of platforms, as described by the enshittification cycle, has tangible and pervasive effects on the content that populates them. The shift in platform priorities from user satisfaction to value extraction fundamentally alters the incentives for content creators, leading to an ecosystem optimized for algorithmic approval rather than human connection. This has given rise to a digital landscape that feels increasingly artificial, hollow, and repetitive—a world of simulacra where the appearance of engagement is valued more than the substance of communication.
The Engine Room: The Attention Economy
The fuel for the enshittification engine is the attention economy. This is an economic paradigm built on the understanding that in an information-rich world, human attention is the scarcest and most valuable resource.17 Social media platforms and other tech giants do not sell software or products to their users; their business model is to capture and hold user attention for as long as possible, analyze the data generated by that attention, and then sell the ability to influence that captured attention to the highest bidder, primarily advertisers.19 As Herbert A. Simon, the Nobel Prize-winning economist who first articulated the concept, noted, "a wealth of information creates a poverty of attention".17
This business model creates a relentless, zero-sum race for engagement. Platforms are not just competing with each other; they are competing against every other claim on a person's time, including work, family, hobbies, and sleep.19 This intense competition incentivizes the development and deployment of increasingly sophisticated and persuasive techniques—such as push notifications, infinite scroll, autoplay videos, and hyper-personalized feeds—all designed to maximize the time users spend on the platform.19 The longer a user stays engaged, the more data can be collected and the more ads can be sold.17
The Result: Content Optimized for Machines, Not Humans
In this environment, the quality, truthfulness, and educational value of content are not the primary determinants of success. The only variable that matters is engagement, as measured by clicks, likes, shares, comments, and watch time. Complex algorithms, fed by vast stores of user data, are designed to identify and amplify the content most likely to generate these engagement signals.20 Research shows that this system inherently favors content that is more provocative, emotionally charged, and hyperbolic, as such content achieves significantly more engagement.19
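A deliberately simplified sketch of what such an engagement-based ranking objective can look like follows. The example posts, signal names, and weights are all invented; the point is only that nothing in the scoring function rewards accuracy or depth, so the most provocative item rises to the top of the feed.

```python
# Toy engagement-ranking sketch: quality never appears in the objective.
posts = [
    {"title": "Careful 3,000-word explainer", "clicks": 40,
     "shares": 5, "comments": 8},
    {"title": "OUTRAGEOUS claim you won't believe", "clicks": 900,
     "shares": 300, "comments": 450},
]

def engagement_score(post):
    # Weighted sum of predicted engagement signals only.
    return post["clicks"] + 3 * post["shares"] + 5 * post["comments"]

feed = sorted(posts, key=engagement_score, reverse=True)
for post in feed:
    print(engagement_score(post), post["title"])
```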
This creates a perverse incentive structure for creators. To be visible, one must create content that pleases the algorithm. This forces human users to behave more like bots, adopting the language and formats that are known to perform well, leading to a homogenization of content.14 The result is a flood of low-effort, engagement-farming content designed solely to go viral. A bizarre but telling example is the phenomenon of "Shrimp Jesus" or "Crab Jesus" on Facebook—a stream of surreal, AI-generated images combining religious iconography with crustaceans.14 These images have no intrinsic meaning or artistic intent; they are the output of an automated process that has identified a combination of elements that reliably triggers high levels of engagement. This is content as algorithmic tribute, created not as a medium for human expression but as a calculated offering to the machine.
The Perceptual Consequence: The Dead Internet Theory
The cumulative effect of this shift has given rise to a cultural phenomenon known as the "Dead Internet Theory." This theory, which began on the fringes of the web, posits that the internet is no longer a space of organic human interaction but is now composed primarily of bot activity and algorithmically generated content that has drowned out genuine human voices.14 Proponents of the theory point to the endless repetition of the same threads, images, and replies across different platforms as evidence that the web has become "empty and devoid of people".14 Cybersecurity firm Imperva's 2024 report lends some credence to this feeling, finding that nearly half of all internet traffic in 2023 came from bots.14
Continue reading here (due to post length constraints) https://p4sc4l.substack.com/p/the-pervasive-sentiment-that-the
