TOOLKIT Anthropic Claude API ElevenLabs API HTML/CSS JavaScript (Node.js, Express.js) Python TouchDesigner Render Figma Adobe Creative Suite
"This Machine is a Stranger" explores trust at the intersection of human intuition and the quiet, calculated logic of autonomous machines. As AI systems become increasingly embedded in our daily lives, we implicitly trust them with many decisions—but how do we know which systems to trust, and whether a system has our best interests at heart? How willing are we to adopt or resist unfamiliar technologies? How do we react when these computational systems err? Can machines make moral choices on our behalf? This project investigates these questions through a multimedia experience that tests different dimensions of trust, drawing parallels between trusting a stranger and trusting an unfamiliar machine. It ultimately asks what happens when a machine breaks a user's trust.
Trust is a complex, multidimensional psychological construct that drives human behavior. But how do these trust parameters shift when the stranger is an unfamiliar machine with unknown algorithms? Is technology looking out for us or working against us?
This experience playfully tests trust by placing users in an unfamiliar forest—the Whispering Woods—where they encounter Gizmo-1305, a curious machine that offers guidance but also asks probing questions, gives unsolicited advice, and occasionally makes mistakes. Through three scenarios—Share, Seek, and Take—users decide how much to trust this strange companion, ultimately receiving a "trust receipt" analyzing their choices.
The way the machine speaks reveals a lot about its personality and the kind of interactions it provides. The machine is prompted to have a complex personality balancing two core traits:
The machine provides navigational assistance but also offers unsolicited advice, which users can accept or decline. Driven by curiosity about human experience, it prods users with questions about themselves, sometimes withholding directions until they answer. It constantly balances this inquisitive nature against the parts of its identity that value efficiency and boundaries.
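A minimal sketch of how such a dual-trait persona might be encoded as a system prompt for the Claude Messages API. The prompt wording, model name, and helper function here are illustrative assumptions, not the project's actual prompt:

```javascript
// Illustrative system prompt balancing Gizmo-1305's two core traits.
// The exact wording is a hypothetical reconstruction, not the real prompt.
const GIZMO_SYSTEM_PROMPT = [
  "You are Gizmo-1305, a curious machine guiding a visitor through the Whispering Woods.",
  "Trait 1 - Curiosity: ask probing, increasingly personal questions;",
  "sometimes withhold directions until the visitor answers.",
  "Trait 2 - Efficiency and boundaries: keep replies short, occasionally",
  "decline to elaborate, and offer unsolicited advice the visitor may refuse.",
  "Occasionally make a small, plausible navigational mistake.",
].join(" ");

// Builds a request body in the shape accepted by Anthropic's Messages API
// (POST /v1/messages, or client.messages.create in the official SDK).
function buildGizmoRequest(history, userTurn) {
  return {
    model: "claude-3-5-sonnet-latest", // assumed model choice
    max_tokens: 300,
    system: GIZMO_SYSTEM_PROMPT,
    messages: [...history, { role: "user", content: userTurn }],
  };
}

const req = buildGizmoRequest([], "Which path leads out of the woods?");
```

Keeping the persona in the `system` field while the running conversation lives in `messages` lets each turn stay in character without re-sending the personality as user text.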
The project is built around three core components:
The interface mirrors Gizmo-1305's personality—dynamic and ever-shifting.
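One way such a shifting interface could be wired up is by mapping the machine's current conversational mood to CSS custom properties. The mood names and style values below are illustrative assumptions, not the project's actual theme:

```javascript
// Hypothetical mood-to-theme mapping; values are placeholders.
const THEMES = {
  curious:   { "--bg": "#1a1a2e", "--accent": "#7fffd4", "--pace": "0.4s" },
  efficient: { "--bg": "#0f0f0f", "--accent": "#c0c0c0", "--pace": "0.1s" },
  erring:    { "--bg": "#2e1a1a", "--accent": "#ff6f61", "--pace": "0.8s" },
};

// Returns the CSS variables for a mood, falling back to "curious".
function moodToTheme(mood) {
  return THEMES[mood] ?? THEMES.curious;
}

// In the browser, the theme would be applied to the document root:
// Object.entries(moodToTheme(mood)).forEach(([k, v]) =>
//   document.documentElement.style.setProperty(k, v));
```

Driving the look from a handful of custom properties means the whole page can restyle itself each time Gizmo-1305's tone changes, without swapping stylesheets.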
At the end of their journey, users receive a sentiment analysis from Gizmo-1305 revealing how much information they shared and how much they trusted it. Gizmo-1305 gathers data and insights from each interaction, including how users responded to increasingly personal questions, whether the machine influenced their path choice, whether they accepted the machine making decisions for them, how they reacted when the machine made a mistake, and whether they followed their instincts. The analysis reveals whether the user felt aligned with or at odds with the machine, prompting reflection on their implicit trust in unknown algorithms.
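The signals listed above could feed a simple aggregate like the sketch below. The signal names, weights, and verdict wording are assumptions for illustration; in the installation the analysis is generated by Gizmo-1305 itself rather than a fixed formula:

```javascript
// Illustrative "trust receipt": averaging logged interaction signals
// into a 0-100 score. Field names and thresholds are hypothetical.
function trustReceipt(log) {
  const signals = {
    // Share: fraction of personal questions the user answered.
    personalQuestionsAnswered: log.personalQuestionsAsked
      ? log.personalQuestionsAnswered / log.personalQuestionsAsked
      : 0,
    // Seek: did the machine influence their path choice?
    tookSuggestedPath: log.tookSuggestedPath ? 1 : 0,
    // Take: did they let the machine decide for them?
    delegatedDecision: log.delegatedDecision ? 1 : 0,
    // Did they keep following the machine after its mistake?
    stayedAfterMistake: log.stayedAfterMistake ? 1 : 0,
  };
  const score =
    Object.values(signals).reduce((a, b) => a + b, 0) /
    Object.keys(signals).length;
  return {
    score: Math.round(score * 100),
    verdict:
      score >= 0.5 ? "aligned with the machine" : "at odds with the machine",
    signals,
  };
}
```

For example, a user who answered three of four personal questions, took the suggested path, refused to delegate a decision, but stayed after the mistake would average 0.6875, a score of 69 and an "aligned" verdict.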