Two
< (Un)Programming Yearning >
Over weeks of iterative development, we refined the model’s architecture, optimizing its behavior at each stage to align with the evolving conceptual goals of Machine Yearning.
Iteration #1
Fine-Tuning a Model on Literary Text
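As a rough indication of what this stage involves, the sketch below shows a fine-tuning pass over a plain-text literary corpus using a small causal language model. It assumes the Hugging Face transformers and datasets libraries and a GPT-2 base model; the corpus filename, output directory, and hyperparameters are illustrative placeholders rather than the project's actual configuration.

```python
# Minimal sketch: fine-tune a causal language model on a literary corpus.
# Assumes Hugging Face transformers + datasets; paths and settings are illustrative.
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token          # GPT-2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained("gpt2")

# Load the raw text corpus and tokenize it into fixed-length examples.
corpus = load_dataset("text", data_files={"train": "literary_corpus.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = corpus["train"].map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="yearning-v1",
                           num_train_epochs=3,
                           per_device_train_batch_size=2),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```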
Iteration #2
Integrating Text-to-Speech (TTS)
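The sketch below illustrates the kind of glue code this stage requires: routing a generated response through a speech synthesiser so the model can be heard rather than read. It uses the offline pyttsx3 engine purely as a stand-in; the voice, speaking rate, and TTS backend used in the installation may have been different.

```python
# Minimal sketch: speak a generated response aloud via an offline TTS engine.
# pyttsx3 is used as an illustrative stand-in for the installation's actual backend.
import pyttsx3

def speak(text: str, rate: int = 150) -> None:
    """Read a generated response aloud."""
    engine = pyttsx3.init()
    engine.setProperty("rate", rate)   # speaking rate in words per minute
    engine.say(text)
    engine.runAndWait()                # block until playback finishes

speak("I am made of other people's longing.")
```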
Iteration #3
Training on the Collected Dataset
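One way to picture this stage is the data-preparation step: flattening the collected narratives into a single corpus that the same fine-tuning loop from Iteration #1 can consume. The sketch assumes the entries were gathered as JSON records with hypothetical "prompt" and "response" fields; the filenames, field names, and separators are all illustrative.

```python
# Minimal sketch: flatten collected prompt/response pairs into one training corpus.
# File and field names ("collected_yearnings.json", "prompt", "response") are hypothetical.
import json

def build_training_text(path: str = "collected_yearnings.json") -> str:
    """Turn collected exchanges into plain text for language-model fine-tuning."""
    with open(path, encoding="utf-8") as f:
        entries = json.load(f)
    lines = []
    for entry in entries:
        lines.append(f"QUESTION: {entry['prompt']}")
        lines.append(f"YEARNING: {entry['response']}")
        lines.append("")  # blank line separates exchanges
    return "\n".join(lines)

with open("collected_corpus.txt", "w", encoding="utf-8") as out:
    out.write(build_training_text())
```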
This version was also exhibited as part of my graduate project, housed inside the sculpture; it served as a public test of whether an AI trained on deeply personal narratives could produce responses that were both abstract and emotionally affecting.
Iteration #4
Developing a Real-Time Conversational Loop
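The sketch below gives a sense of the loop's shape: capture a visitor's words, generate a continuation with the fine-tuned model, and speak the result aloud. Typed input stands in for whatever capture method the installation actually used, and the checkpoint path "yearning-v1" and generation settings are illustrative assumptions.

```python
# Minimal sketch: capture input, generate a response, speak it, repeat.
# The checkpoint path and generation settings are illustrative placeholders.
import pyttsx3
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("yearning-v1")   # fine-tuned checkpoint (illustrative)
model = AutoModelForCausalLM.from_pretrained("yearning-v1")
tts = pyttsx3.init()

def respond(prompt: str, max_new_tokens: int = 60) -> str:
    """Generate a continuation of the visitor's words."""
    inputs = tokenizer(prompt, return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=max_new_tokens,
                            do_sample=True, temperature=0.9,
                            pad_token_id=tokenizer.eos_token_id)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(output[0][inputs["input_ids"].shape[-1]:],
                            skip_special_tokens=True)

while True:
    visitor = input("> ")            # typed input stands in for live audience capture
    if not visitor:
        break
    reply = respond(visitor)
    print(reply)
    tts.say(reply)                   # speak the response aloud
    tts.runAndWait()
```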
This final iteration was exhibited at the Victoria & Albert Museum (V&A) as part of Digital Design Weekend, where the interactive model generated diverse responses based on audience engagement. Feedback from visitors highlighted the uncanny and thought-provoking nature of the AI’s responses—a machine that does not "understand" human emotion, yet evokes a feeling of recognition through its computational reinterpretation of yearning.