They called it the Typing—the slow, quiet remaking of the world into patterns.
When Mina was born, people still talked about class in textbooks and museum exhibits: who owned land, factories, capital. By the time she was old enough to read the news herself, another grammar had taken hold. Wealth no longer lived only in bank vaults and factories; it lived in trails — the clicks and pauses, the half-typed searches, the times people lingered on a recipe or a political thread. That trail, when stitched together across millions of lives, became something with value: information capital.
At first the change was described in metaphors — “data is the new oil,” “the attention economy” — phrases journalists used to make a strange economy feel familiar. But the engineers and the platforms had a more precise name for what they were harvesting: behavioral surplus. Silicon Valley firms collected the spare pieces of human behavior not because anyone asked for them, but because those pieces could be transformed into predictions and influence. What followed was the slow marketization of experience itself — feeds tuned to hypnotize attention, ads that felt like answers, and recommendations that recast curiosity into consumption. (See Shoshana Zuboff’s account of surveillance capitalism and The Economist’s analysis of data’s new economic role.)
Mina grew up inside a string of “types.” Social platforms categorized her early: indie-foodie, late-night gamer, climate volunteer. The labels were not purely descriptive. Algorithms learned the textures of each type and traded on their predictability. The types aggregated into political tastes, shopping cohorts, and even medical-prognosis clusters. Where once political mobilization required meetings and pamphlets, now the alignment of millions of micro-preferences produced a pressurized public, prone to sudden coherence — and sudden fracture. Eli Pariser’s “filter bubble” had been a cautionary essay; in Mina’s world it was an infrastructure: people lived inside calibrated mirrors that amplified what would keep them clicking.
Power, in that epoch, shifted. Managing capital had once required ledgers and factories; managing information capital required models, data pipelines, and low latency. The real ruling class became the architectures that could store, index, and optimize information flows: massive foundation models, global recommender graphs, and the distributed serving systems that decided what a billion people saw in the next thirty seconds. Mina noticed it in small ways: city transit screens that showed different adverts depending on who scanned them, supermarket aisles whose lighting subtly shifted to highlight products your phone had shown you earlier, and local governments that used predictive engagement scores to prioritize which neighborhoods received public messaging. The algorithms were not merely tools; they were the governors of attention. (This is the essence of modern attention economics.)
But dominance breeds reaction. Regulators in Europe moved first to recast the rules of the Typing. The EU’s Artificial Intelligence Act — which entered into force in 2024 and began phasing in rules on high-risk uses, transparency, and bans on manipulative and exploitative practices — imposed new legal constraints on how platforms could weaponize information and how employers and states could deploy predictive models. The law’s arrival was not an instant cure, but it marked the first time a bloc of governments recognized that an algorithmic ruling class required legal boundaries. Implementation timelines left many details to member states, but the language shifted public imagination: information capital would now be governed, not simply owned.
Mina’s story pivots on two moments — one intimate, one systemic.
The intimate one happened during a local campaign to redesign the waterfront. A charismatic influencer-type emerged, her posts tailored perfectly to the tastes of Mina’s “urban-radical” cohort. Attendance at the campaign events ballooned — not because her arguments were novel but because the platform’s microtargeting assembled a crowd that shared the same emotional response. At the public meeting, the influencer’s words landed like familiar refrains: people nodded in sync, a community feeling that had been algorithmically brewed. Later that night Mina found herself wondering whether that civic energy had been earned or engineered.
The systemic moment came when a coalition of privacy engineers, municipal planners, and civic hackers published an open toolkit called CommonLens — a set of audited, privacy-first recommender primitives that ran on local servers and used federated learning and differential-privacy techniques to reduce central data hoarding. It didn’t reverse the Typing entirely, but it gave neighborhoods the means to curate their own information environments — slower, messier, more plural. Technologies like federated learning and differential privacy didn’t erase information capital; they changed its ownership contours. Civil society’s tools began to look less like blockades and more like new kinds of capital formation: civic information capital. (Recent years have also seen active policy pushes on data brokers and national privacy advances, especially in the U.S. states and EU, that aimed to curb opaque data markets.)
Mina lived through the paradox: typification made people legible to each other — easing some forms of cooperation — while making publics brittle and easier to manipulate. In markets and in municipal planning alike, AI systems optimized for coherence and engagement. When coherence produced solidarity, communities thrived; when it produced conformity, dissent withered.
By the time Mina had children, the civic vocabulary had condensed into three words: curate, audit, and diversify. Citizens learned to audit the algorithms shaping them; public agencies learned to diversify the information ecosystems they sponsored. The ruling class was no longer an immutable nexus of capitalists; it was a contested set of infrastructures and legal regimes. AI could organize information capital efficiently, but its legitimacy depended on rules people could see and challenge.
In the end, Mina realized that “typified society” was neither utopia nor dystopia. It was a new topology for power: nodes of influence stitched together by information flows. The battle for that topology — whether the maps would be closed and proprietary, or open and governed — would determine whether the next societies resembled polished markets or patchwork commons.
Mina taught her child to do one simple thing: when you notice your feed echoing your moods back to you, step outside the echo. Look for sources that disagree. Talk to someone who isn’t your type. The act was small, deliberate, and ancient: a civic practice for an age where information, not just gold or land, had become capital.
All names of people and organizations appearing in this story are pseudonyms.