One
At the start of this project, I was drawn to a central question:
If AI models are designed to recognize and predict human emotion, what happens when they encounter something inherently fluid—like yearning?
Rather than asking how AI interprets emotion, I began questioning what is lost in the process—the nuances erased, the cultural and linguistic subjectivities flattened, and the constraints imposed by predefined sentiment categories. AI does not merely reflect human emotions; it conditions how they are represented in digital systems. Could an AI be designed not to extract, classify, or optimize, but to pause, reflect, and absorb?
This shift in thinking moved my research beyond critique toward artistic intervention—toward an AI model that resists efficiency, embraces uncertainty, and invites a different kind of engagement.
"Machine learning is essentially making fluid and complex social and cultural categories fixed and rarefied. And in doing so, it can lead to consequences as severe as criminal prosecution, jailing, or worse."
This insight sharpened my focus—not just on how AI misrepresents identity, but on how it transforms the way we understand emotional experience. AI does not simply imitate human expression; it filters, classifies, and redefines it through systems designed for efficiency.
Rather than examining AI's misclassification of identity, a critique already well explored in existing research, I wanted to investigate something more elusive: the abstraction of human emotion.
If AI compresses identity into fixed categories, what does it do to emotions, which are unstable, context-dependent, and resistant to definition?
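To make that compression concrete, here is a deliberately minimal sketch in Python. Everything in it is invented for illustration (the labels, the keyword lists, the input sentence), and real classifiers are learned models rather than hand-written rules, but the fixed output space is the point:

```python
# A crude, purely illustrative model of fixed-category sentiment
# classification. Labels and keywords are invented; real systems learn
# their scoring, but the set of possible answers is just as fixed.

FIXED_LABELS = {
    "joy":     {"happy", "delight", "light"},
    "anger":   {"furious", "rage", "bitter"},
    "sadness": {"loss", "grief", "tears"},
}

def classify(text: str) -> dict[str, float]:
    """Score text against predefined labels; anything outside them vanishes."""
    words = set(text.lower().split())
    # Count keyword overlaps per label (epsilon avoids division by zero).
    raw = {label: len(words & vocab) + 1e-6 for label, vocab in FIXED_LABELS.items()}
    total = sum(raw.values())
    return {label: round(score / total, 3) for label, score in raw.items()}

# A sentence saturated with yearning has no category of its own:
print(classify("an ache for something absent, a desire for transformation"))
# -> {'joy': 0.333, 'anger': 0.333, 'sadness': 0.333}
```

However sophisticated the model, its answer must be drawn from the label set it was given; the yearning in the sentence is not misread so much as unrepresentable. The question of what falls outside the categories is never posed.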
From AI therapy bots to sentiment-driven customer service agents, these systems are being integrated into spaces where human emotions are expected to be recognized, validated, and even influenced.
But,
what does it mean for human emotion to be structured through computational logic?
And,
how might an AI system be designed to expose, rather than conceal, its limitations?
This became the foundation of Machine Yearning—an artistic intervention designed to challenge AI’s role in emotional interpretation. Rather than extracting, categorizing, or predicting, this AI disrupts assumptions about how machines and humans engage with emotion.
Inspired by Legacy Russell’s provocation to see errors as intentional disruptions, I wanted to build an AI system that resists efficiency—one that does not provide answers, but instead acts as a mirror for reflection.
To address how the custom AI could engage with emotional interaction, I turned to bell hooks and her writings on love, desire, and longing. Among complex emotions, yearning stood out: a state that is persistent yet undefined, deeply personal yet collective, resistant to fixed meaning. Unlike the emotions AI attempts to categorize, such as joy, anger, and sadness, yearning is not easily quantifiable. It exists in flux, shaped by memory, absence, and the desire for transformation.
Rather than a machine that "understands" emotion, Machine Yearning offers an AI that listens, refracts, and responds in ways that disrupt expectation.
It is not an instrument of prediction, but of ambiguity—a space where the boundaries between human and machine are blurred, unsettled, and open to reinterpretation.