The College Chronicle

The story beneath the noise.

Wednesday, February 4, 2026

Opinion

The Affective Economy of Non-Human Attention

An analysis of how digital platforms and algorithms monetize human emotions, turning affect into a new form of capital and reshaping politics, society, and the self.

There was a time when machines served to lighten our labor. Now they quietly reorganize our emotions. The digital world hums with unseen choreographies of feeling: sadness, curiosity, envy, delight, all carefully sequenced to keep us scrolling, buying, reacting. The question is no longer whether machines can feel, but whether they have learned to manage how we do.

We now inhabit an economy built not on the extraction of coal, labor, or even data alone, but on the extraction of affect, the energetic pulse of emotion itself. Marx once imagined the worker bending beneath the weight of industrial machinery; today, the worker bends over a glowing screen. Our gestures, our pauses, our bursts of anger in a comment section have all become units of value. In the logic of this age, attention is the new currency, and the human nervous system is its mint.

This is not the first time capital has colonized what seemed unownable. The nineteenth century turned skill into wage; the twentieth turned image into spectacle. Now, the twenty-first has turned feeling itself into capital. As Guy Debord foresaw, we live in a society where the image devours experience, and as Jonathan Beller argued, every act of looking has become a form of labor. We no longer work for the algorithm; we feel for it. Every flicker of emotion is an unwitting entry in the ledger of engagement. The self has become a mine, and its ore is affect.

Consider the Facebook–Cambridge Analytica scandal of 2018, when millions of personal profiles were harvested to manipulate voters through emotional targeting. Or the rise of TikTok’s “For You” feed, whose uncanny precision in predicting what we want or fear has redefined the global attention span. These moments are not accidents of technology; they are symptoms of an economy built on emotional extraction.

Michel Foucault once described power through the Panopticon, a watchtower of surveillance that renders people visible, compliant, predictable. That tower has dissolved into code. Our screens no longer watch us through fear of exposure; they learn us through intimacy. The algorithm is the new sovereign—not because it commands, but because it predicts. Gilles Deleuze warned that control would no longer confine but modulate us, and today that modulation is emotional: joy amplified, fear redirected, outrage recycled.

You can see it each time an outrage trend erupts online: the quick cycles of anger that platforms like X (formerly Twitter) convert into profit. A tragedy or scandal becomes a brief emotional marketplace; compassion, rage, and grief are monetized through impressions and ad revenue. Shoshana Zuboff called this surveillance capitalism, but what is monitored now is not just behavior; it is feeling itself.

The feed is not neutral; it is a system of affective governance. Its purpose is not to inform but to ensure we keep returning. We do not simply choose what to feel; we are subtly chosen by it. What once had an exchange value, an object's worth measured by labor, has transformed into engagement value, measured by the intensity of emotion it provokes. An image that enrages is worth more than one that enlightens; a tear, more than a thought.

This logic has reshaped the public sphere. Political campaigns are now tested for “emotional resonance” using real-time biometric data. The influencer economy thrives on intimacy, where the line between sincerity and performance is erased. When Instagram’s algorithm briefly prioritized Reels over photos in 2022, entire communities of artists and photographers revolted not because their work was censored, but because their emotions were being formatted for a new attention regime.

Here, Eva Illouz’s insight feels prophetic: emotion is no longer spontaneous; it is manufactured, measured, and marketed. Byung-Chul Han calls this the age of psychopolitics—an empire of self-exploitation rather than repression. We perform our emotions for invisible audiences, mistaking surveillance for authenticity. We are not silenced; we are exhausted.

Philosophers once treated attention as a uniquely human act, the directed beam of consciousness toward meaning. That beam is now refracted. Machines anticipate our impulses before they fully form, learning our patterns faster than we can perceive them. TikTok’s recommendation system, Spotify’s mood-based playlists, and Netflix’s emotion-driven thumbnails all participate in a new kind of cultural engineering, one that decides what deserves to exist by what provokes a measurable feeling.

Maurice Merleau-Ponty saw attention as embodied, rooted in perception’s slow unfolding. Bernard Stiegler later warned that technology would become the prosthesis of consciousness, externalizing memory and desire. Today our feeds are not tools; they are extensions of awareness, shaping what it means to perceive. The 2020s have made this vivid: from AI art generators that mimic our aesthetic impulses to emotion-recognition software in workplaces, the line between perception and production has collapsed.

Politics, too, has migrated into this circuitry. The 2016 and 2020 election cycles in the United States, the Brexit campaign, and countless protest movements across the world, from Hong Kong to Tehran, showed that the digital crowd coheres not through ideology but through synchronized emotion. Chantal Mouffe argued that democracy depends on transforming antagonism into agonism. But what happens when that emotion is orchestrated by systems indifferent to truth or justice?

Hannah Arendt wrote that freedom resides in the capacity to act unpredictably. Predictability, then, is the quiet death of politics. If our emotional responses are prefigured by non-human attention, democracy becomes theatre: citizens performing emotional scripts authored by unseen statistical models. The new authoritarianism does not burn books or silence speech; it simply ensures that every word, every gesture, every tear fulfills an algorithmic function. We are governed not by ideology, but by affective infrastructure.

What might resistance look like in such a world? Perhaps Simone Weil offers a clue. She saw attention as a spiritual discipline, a stillness before truth. To attend, in Weil’s sense, is to resist distraction, to refuse the conversion of emotion into commodity. Hartmut Rosa reminds us that speed itself is political: the faster the rhythm, the thinner the reflection. The machine’s tempo is instantaneous; the human must remain slow. Slowness is not laziness; it is rebellion.

Byung-Chul Han calls for “contemplative delay,” the courage to let things be. To scroll less is not asceticism but philosophy: an attempt to reclaim the possibility of authentic emotion in a world where affect has been industrialized. Machines do not need to feel; they only need to organize the world so that we feel endlessly, predictably, profitably. Yet beneath the data’s hum persists a fragile pulse of resistance: the human capacity to withhold attention.

Perhaps the final act of freedom is not to speak, not to post, not to react, but simply to look away, to reclaim the silence between signals, to remember that emotion, before it became a metric, was once the language of being alive.