< Signals >


Why does the Machine Yearn?
Machine learning models abstract human behavior—predicting, responding, and influencing our movements, both digitally and physically. Their integration into systems of policing, defense, medicine, and governance makes them powerful, but also precarious—capable of reinforcing biases, flattening complexities, and shaping society in ways that remain largely unexamined.

Among these applications, conversational AI stands out. Designed to interpret and respond to human emotions, these systems—embodied devices like Alexa, chatbots like ChatGPT, and voice interfaces—enter the most intimate spaces of our lives. Yet emotions, as defined by Britannica, are “a complex experience of consciousness, bodily sensation, and behavior” that resists classification. Large language models (LLMs) attempt to parse these intricacies, yet in doing so they reduce, categorize, and optimize—transforming the deeply personal into structured data. Their interactions, though seemingly relational, are built on utility, trained on context-stripped inputs, and deployed in ways that dilute the nuances of human experience.

How, then, might we rethink our engagement with these systems—both in making them and in using them? Can we design ethical AI models as invitations for reflection rather than prescription?

By teaching, adopting, and infiltrating the open-source GPT-2 LLM with the subjective experience of yearning—a feeling that is ephemeral, collective, and resistant to definition—can we render the machine useless yet full of wonder? And if AI is often imagined as a black box, could it instead take form as an open vessel—one that gathers, absorbs, and echoes human expressions rather than extracting and categorizing them?

Signal - Softbank and their Emotion AI
1. Emotion-cancelling AI built to enhance customer service by altering employees’ emotional states: it softens the tone of customer speech and triggers calming video montages to relax the employee.

Refer here.


Signal - Bumble AI Assistant?

2. Bumble’s co-founder suggesting an AI concierge that would date other people’s AI concierges, so you don’t have to talk to anyone and everyone.

Refer here.


Signal - AI Therapists
3. People increasingly seeking advice for their mental-health concerns through AI chatbots and AI therapist bots, disregarding the harms enacted through computational bias.

Refer here and here.