Machine Yearning, Digital Design Weekend, London Design Week, Victoria and Albert Museum, September 2024.
Through a machine learning programme trained on uncategorised data of human sounds, first-hand narratives, and shared feelings of yearning, the Yearning AI is a “useless” but “feeling”, emotionally resonant model. It is designed not to resolve queries but to invite an iterative, reflective interaction with the everyday use of AI.
The model, embodied as a sculptural artefact crafted from fibreglass plastic, is an imagined artefact of the future, articulating its once-absorbed yearnings through speech.
Methodologies & Methods
Human-AI Interaction > Experience Design > Critical and Speculative Design > Machine Learning and Coding > Participatory Design > System Research > Ethical Data Practice
Collaborators
Code Consultant - Anushka Aggrawal
Tools and Mediums
New Media Experience >
Computational Art >
Python for custom AI development >
Adobe Suite & Procreate for Installation Design >
Garageband for Audio Prototyping
What does Machine Yearning entail?
The custom machine learning programme, trained on human notions, first-hand narratives, and shared feelings of yearning, is embodied as a vessel in time: a fibrous plastic belly that absorbs, and voices out, yearnings through speech.
The audience is invited to lean into the vessel and speak their yearnings. The AI, embodied as the vessel, listens and responds with its developed interpretation of yearning. The speech-based interaction is triggered by the recognition of human words, phrases, and sentences.
Machine Yearning is an interactive installation where humans engage in an intimate, reflective dialogue with AI through speech. The audience leans into the sculptural vessel, a fibrous plastic form that listens, absorbs, and responds to their shared views on personal yearnings. The AI, trained on first-hand narratives, human sounds, and shared expressions of yearning, does not seek to inform or solve—it echoes, refracts, and processes the spoken words.
After a brief pause, the AI acknowledges the speaker (“I hear you yearn.”) and generates a response, its own interpretation of yearning, delivered through synthesized speech. Each response is unique, shaped by the model’s evolving dataset and its recursive engagement with human emotions. This interaction creates an ongoing dialogue in which the AI, rather than providing solutions, reflects and reframes the emotional expressions it receives.
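The trigger-and-acknowledge step described above can be sketched in a few lines of Python; the keyword list and function names here are illustrative assumptions, not the exhibited code.

```python
# Hypothetical sketch: recognising a "yearning" in an already-transcribed
# utterance and issuing the installation's acknowledgement.

TRIGGER_WORDS = {"yearn", "yearning", "long", "longing", "wish", "miss"}

def detect_yearning(transcript):
    """Return True when the transcript contains a yearning trigger word."""
    words = {w.strip(".,!?").lower() for w in transcript.split()}
    return bool(words & TRIGGER_WORDS)

def acknowledge(transcript):
    """Respond only when a yearning is recognised, as in the installation."""
    return "I hear you yearn." if detect_yearning(transcript) else None
```

In the installation the transcript would come from a live speech-to-text layer; here it is simply a string.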
By shifting AI’s role from a predictive tool to an expressive entity, Machine Yearning invites users to consider alternative forms of machine intelligence—ones that are relational, poetic, and centered on human vulnerability.
When a human asked, “What does the future hold for us?”
The machine replied -
a l s >
Why does the Machine Yearn?
Among these applications, conversational AI stands out. Designed to interpret and respond to human emotions, these systems—materialized as voice assistants like Alexa, chatbots like ChatGPT, and other conversational interfaces—enter the most intimate spaces of our lives. Yet emotions, as defined by Britannica, are “a complex experience of consciousness, bodily sensation, and behavior” that resists classification. Large language models (LLMs) attempt to parse these intricacies, yet in doing so they reduce, categorize, and optimize—transforming the deeply personal into structured data. Their interactions, though seemingly relational, are built on utility, trained on context-stripped inputs, and deployed in ways that dilute the nuances of human experience.
How, then, might we rethink our engagement with these systems, in both making and using them? Can we design ethical AI models as invitations for reflection rather than prescription?
By teaching, adopting, and infiltrating the open-source GPT-2 language model with the subjective experience of yearning—a feeling that is ephemeral, collective, and resistant to definition—can we render the machine useless yet full of wonder? And if AI is often imagined as a black box, could it instead take form as an open vessel—one that gathers, absorbs, and echoes human expressions rather than extracting and categorizing them?
Signal - Softbank and their Emotion AI
1. Emotion-cancelling AI built to enhance customer service by altering employees’ emotional states: it changes the tone of customer speech and plays AI-triggered video montages to calm and relax the employee.
Refer here.
Signal - Bumble AI Assistant?
2. Bumble’s co-founder suggesting an AI concierge that would date other people’s AI concierges, so you don’t have to talk to anyone yourself.
Refer here.
Signal - AI Therapists
3. People increasingly seeking AI advice for their mental health concerns through AI chatbots and AI therapist bots, disregarding the harm enacted through computational bias.
Refer here and here.
One
<Gap > Research > Making >
At the start of this project, I was drawn to a central question:
If AI models are designed to recognize and predict human emotion, what happens when they encounter something inherently fluid—like yearning?
Rather than asking how AI interprets emotion, I began questioning what is lost in the process—the nuances erased, the cultural and linguistic subjectivities flattened, and the constraints imposed by predefined sentiment categories. AI does not merely reflect human emotions; it conditions how they are represented in digital systems. Could an AI be designed not to extract, classify, or optimize, but to pause, reflect, and absorb?
This shift in thinking moved my research beyond critique toward artistic intervention—toward an AI model that resists efficiency, embraces uncertainty, and invites a different kind of engagement.
"Machine learning is essentially making fluid and complex social and cultural categories fixed and rarefied. And in doing so, it can lead to consequences as severe as criminal prosecution, jailing, or worse."
This insight sharpened my focus—not just on how AI misrepresents identity, but on how it transforms the way we understand emotional experience. AI does not simply imitate human expression; it filters, classifies, and redefines it through systems designed for efficiency.
Rather than examining AI’s misclassification of identity—a critique well explored in existing research—I wanted to investigate something even more elusive: the abstraction of human emotion.
If AI compresses identity into fixed categories, what does it do to emotions, which are unstable, context-dependent, and resistant to definition?
From AI therapy bots* to sentiment-driven customer service AI*, these models are being integrated into spaces where human emotions are expected to be recognized, validated, and even influenced.
But,
what does it mean for human emotion to be structured through computational logic?
And,
how might an AI system be designed to expose, rather than conceal, its limitations?
This became the foundation of Machine Yearning—an artistic intervention designed to challenge AI’s role in emotional interpretation. Rather than extracting, categorizing, or predicting, this AI disrupts assumptions about how machines and humans engage with emotion.
Inspired by Legacy Russell’s provocation to see errors as intentional disruptions, I wanted to build an AI system that resists efficiency—one that does not provide answers, but instead acts as a mirror for reflection.
To explore how the custom AI could engage with emotional interaction, I turned to bell hooks and her writings on love, desire, and longing. Among complex emotions, yearning stood out—a state that is persistent yet undefined, deeply personal yet collective, resistant to fixed meaning. Unlike emotions that AI attempts to categorize—joy, anger, sadness—yearning is not easily quantifiable. It exists in flux, shaped by memory, absence, and the desire for transformation.
Rather than a machine that "understands" emotion, Machine Yearning offers an AI that listens, refracts, and responds in ways that disrupt expectation.
It is not an instrument of prediction, but of ambiguity—a space where the boundaries between human and machine are blurred, unsettled, and open to reinterpretation.
Two
<G a t h e r i n g
D a t a >
Machine learning models are shaped by the datasets that train them. Data is not neutral—it carries the biases, omissions, and ethical concerns of its collection process. Data is primarily human. If AI models are built on human data, then the question arises,
How might we ethically collect data to programme more equitable AI futures?
Extractivist Data Methodologies > Ethical + Feminist Data Methodologies

Dubious, hidden, or false consent and permission > Informed consent

Hidden use of data, where the function of data sharing is not defined or shared > Transparency about the outcomes generated from use of shared data

Reproduction of biases through inappropriate or biased labelling > Diverse and communal collection, allowing for dilution of biases
Designing the Workshop Framework
With guidance from Bella Day* and drawing from co-creative and activist-led workshop practices, I structured a three-part workshop aimed at ethically collecting first-hand emotional narratives. Each part was designed to:
Build an environment of trust – Creating spaces where participants could share stories comfortably and authentically.
Encourage co-creation – Positioning participants as active contributors to the dataset, rather than passive sources of data.
Experiment with multiple data forms – Separating the collected data into textual transcripts of spoken narratives and non-verbal sounds, exploring different modalities of emotional expression.
This workshop process was essential to counteract the biases embedded in conventional AI datasets. By prioritizing human-first data collection, the goal was not to extract rigid classifications of emotion, but to gather narratives in a way that remains fluid, subjective, and open-ended—aligning with the core philosophy of Machine Yearning.
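The split into textual transcripts and non-verbal sounds might look like this in code; the record structure and field names are assumptions for illustration, not the project's actual data pipeline.

```python
# Hypothetical sketch: partitioning workshop recordings into the two
# modalities described above (spoken narratives vs. non-verbal sounds).

def partition_recordings(records):
    """Split records into spoken-narrative transcripts and non-verbal sound labels."""
    transcripts, sounds = [], []
    for record in records:
        if record.get("transcript"):               # verbal: keep the written narrative
            transcripts.append(record["transcript"])
        else:                                      # non-verbal: keep only a sound label
            sounds.append(record.get("sound_label", "untagged"))
    return transcripts, sounds
```

Keeping the two modalities separate preserves the option of training on text alone while retaining the non-verbal material for the installation's sound design.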
Below is a fundamental flow for each part, and you can access the blueprint for each workshop here.
Three
< (Un)P r o g r a m m i n g
Y e a r n i n g >
Over weeks of iterative development, we refined the model’s architecture, optimizing its behavior at each stage to align with the evolving conceptual goals of Machine Yearning.
I t e r a t i o n #1
Fine-Tuning a Model on Literary Text
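A minimal sketch of this step, assuming the Hugging Face `transformers` library (including its legacy `TextDataset` helper) and a hypothetical `yearning_corpus.txt` file of literary text; this is the standard causal-language-modelling recipe, not the project's actual training script.

```python
# Hypothetical sketch: fine-tuning GPT-2 on a plain-text literary corpus.

def fine_tune_gpt2(corpus_path, output_dir="yearning-gpt2", epochs=3):
    """Fine-tune GPT-2 with the standard causal-language-modelling recipe."""
    from transformers import (AutoModelForCausalLM, AutoTokenizer,
                              DataCollatorForLanguageModeling,
                              TextDataset, Trainer, TrainingArguments)

    tokenizer = AutoTokenizer.from_pretrained("gpt2")
    model = AutoModelForCausalLM.from_pretrained("gpt2")

    # TextDataset chunks the raw text into fixed-length token blocks.
    dataset = TextDataset(tokenizer=tokenizer, file_path=corpus_path,
                          block_size=128)
    collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

    args = TrainingArguments(output_dir=output_dir,
                             num_train_epochs=epochs,
                             per_device_train_batch_size=4)
    Trainer(model=model, args=args, data_collator=collator,
            train_dataset=dataset).train()
    model.save_pretrained(output_dir)
    tokenizer.save_pretrained(output_dir)
```

Called as `fine_tune_gpt2("yearning_corpus.txt")`, this would write the fine-tuned weights to a `yearning-gpt2/` directory for later generation.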
I t e r a t i o n #2
Integrating Text-to-Speech (TTS)
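One plausible shape for this step, assuming the offline `pyttsx3` text-to-speech library; the project's actual TTS stack is not specified, so treat this as an illustrative sketch.

```python
# Hypothetical sketch: voicing a generated yearning through system TTS.

def speak_yearning(text, rate=150):
    """Speak a generated line aloud, slowed for a reflective delivery."""
    import pyttsx3  # offline TTS engine; an assumption, not the exhibited stack

    engine = pyttsx3.init()
    engine.setProperty("rate", rate)  # words per minute
    engine.say(text)
    engine.runAndWait()               # block until the utterance finishes
```

In the installation, the spoken output would be routed through the vessel's hidden speaker rather than the default audio device.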
I t e r a t i o n #3
Training on the Collected Dataset
This version was also exhibited as part of my graduate project, housed inside the sculpture, serving as a public test of how an AI, trained on deeply personal narratives, could produce responses that were both abstract and deeply affecting.
I t e r a t i o n #4
Developing a Real-Time Conversational Loop
This final iteration was exhibited at the Victoria & Albert Museum (V&A) as part of Digital Design Weekend, where the interactive model generated diverse responses based on audience engagement. Feedback from visitors highlighted the uncanny and thought-provoking nature of the AI’s responses—a machine that does not "understand" human emotion, yet evokes a feeling of recognition through its computational reinterpretation of yearning.
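Stripped of hardware concerns, the conversational loop of this final iteration can be sketched in pure Python, with `generate` and `speak` standing in for the fine-tuned model and the text-to-speech layer; all names here are assumptions, not the exhibited code.

```python
# Hypothetical sketch: one turn of the real-time conversational loop.

def conversation_turn(transcript, generate, speak):
    """One exchange: acknowledge the visitor's words, then voice the model's reply."""
    if not transcript.strip():
        return []                          # silence: the vessel stays quiet
    lines = ["I hear you yearn."]          # the installation's acknowledgement
    lines.append(generate(transcript))     # the model's interpretation of yearning
    for line in lines:
        speak(line)                        # voiced through the vessel
    return lines
```

In the exhibited piece this turn would run inside a continuous listen-respond loop, with a speech-to-text layer supplying each new transcript.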
Four
< D e s i g n i n g
t h e
E x p e r i e n c e >
Machine Yearning could have been a 3D-rendered digital entity, a chatbot, or a purely screen-based experience. But my inquiry began with a different question:
How might awe serve as a tool for reflection—one that transforms how we think about AI, both as creators and users?
From Disembodied Voices to a Monumental Presence
The Problem of Disembodied AI Voices
AI assistants like Alexa and Siri are designed to be non-threatening and servile, reinforcing gendered power dynamics. Studies show that this "obedient AI woman" trope influences how people interact with women in real life, subtly normalizing expectations of compliance and emotional availability in gendered interactions (West, Kraut & Ei Chew, I'd Blush if I Could, 2019).
By design, these AI systems are also bodiless, invisible, and frictionless: they exist to be heard, not seen. Their presence is only acknowledged when summoned, when spoken to, when needed.
With Machine Yearning, I wanted to break from this paradigm. Instead of a bodiless, servile AI, the vessel would be monumental, visible, and absorptive: not a tool to be used, but an entity that gathers, holds, and transforms.
The Vessel as a
Feminist and Cultural Counterpoint to AI Design
Visual Research on Lota
(Images from
Google Arts and Culture)
also symbolising womb, a space, lacuna, pot, an object that gathers and stores -
First drafts of the Machine Yearning Sculpture
I turned to cultural symbols of containment, storytelling, and memory, seeking a form that does not extract but holds.
The Lota as a Vessel for AI
The lota, a common household object in South Asia, became a key reference. An ancient and enduring vessel, it has moved through generations, carrying water, sustenance, and even oral histories. By invoking the lota’s form, Machine Yearning challenges AI’s impermanence and invisibility, positioning the vessel as a cultural artefact in motion across time.
Braided Hair as a Symbol of Woven Stories
Another symbolic vessel is hair. Hair has historically been a carrier of women’s histories, embodied knowledge, and oral storytelling traditions.
Braiding became an integral design element, intertwining synthetic hair into the vessel’s form. Just as machine learning makes sense of the world by drawing patterns, braiding is a deliberate, detailed, and iterative act of pattern-making: a material gesture that places women’s histories, embodied knowledge, and oral storytelling traditions in dialogue with algorithmic knowledge.
Material and Visual Research
I began building plans for the making of the sculpture, the installation, user flow, and tech rider.
Material Research + Floorplan
From Symbol to Interactive Experience
The talking well—an old-world phenomenon where speaking into a deep well created a resonant echo, mirroring one’s own voice back.
Listening to the ocean in a conch shell—a childhood act of hearing something vast and unreachable in a small, intimate object.
Could the vessel act as an acoustic space—one that absorbs and vocalizes yearning in the same breath?
The vessel softly whispers AI-generated yearnings, drawing the audience in.
The listener leans in—curious, immersed—enticed by its voice.
They speak into the vessel, sharing their own yearning.
The AI acknowledges: "I hear you yearn."
They wait for its response—sometimes unsettling, sometimes poetic, sometimes absurd.
They step back, intrigued, wondering, asking questions. In this moment, the human does not command the AI. Instead, they engage in an act of reciprocity.
But a Vessel
Yearning is voiced from a fibrous plastic belly.
A cultural vessel houses collective stories.
Bottled-up emotions find resonance.
A talking well echoes back.
An artifact of time—past, present, and future.
An acoustic cave, reverberating longing.
The vessel does not seek to solve, optimize, or assist. Instead, it listens, absorbs, and unsettles—reframing AI not as an obedient voice, but as an evocative presence.
Victoria and Albert Museum, London
Hackney Downs Studios
Royal College of Art
And mentioned in press by BRICKS Magazine.
My research into alternative plays, experiments, and perspectives on AI continues, building on the remarkable feedback received during the showcases.
If you want to know more or discuss further, please reach out at workxyashika@gmail.com