Eliza Explores the Benefits and Imperfections of Therapy

In the opening menu for Eliza, you can see Evelyn, the main character, sitting on a bench, looking out over the water toward Seattle. The image is likely a reference to Hamilton Viewpoint in West Seattle, a spot where photographers, tourists and even locals go for one of the best views in the entire city. West Seattle is unique in that it's cut off from the rest of Seattle and connected only by a bridge, making it feel like it's not part of the city even though it's really only separated by a narrow waterway. Those who live there, as I once did, like to complain about being excluded by the rest of the city, about how your social life dies the minute you move there, about how West Seattle is still "real" Seattle, even if it's across a body of water. How fitting that one of the better places to get a good look at the classic Seattle skyline comes from the perspective of an outsider looking in.
Eliza is about a mental wellness app and its developer's attempt to reckon with the social impact of her creation. The app, which records what a patient says and forms a response based on cues like voice modulation, word choice and heart rate, is the work of Evelyn, who removed herself from the project after the death of a colleague. Disillusioned with the path her life has taken in the years since, Evelyn is curious whether Eliza ever amounted to anything, so she decides to go undercover as a proxy: the human face behind each Eliza session, a person who wears a headset and reads out the prompts that make up each conversation. The relationships she forms and the conclusions she draws are up to the player, but ultimately, the game is about the dissonance created by removing human beings from an inherently human process. It's about how the scripts of human connection mirror the facade of a proxy, and how, for some, an algorithm may be better than nothing at all.
Questions about the practicality and ethics of personal data in digital health and mental wellness services are not new, but Eliza tackles them in an exploratory way, sincerely engaging with the full implications of the idea. A mental health app, having stripped out the human assessment and feedback process, seems impersonal and ineffective. But if it has any genuine ability to supplement certain forms of care, then it merits consideration. As Rae, the manager of three of Skandha's offices, says, even poetry "has a model and structure to it, it's not lawless. Even AI in poetry isn't antithetical to its spirit."
And indeed, the scripts that Eliza proxies follow, while alienating in their formality, do mirror some of the language and philosophy of therapy itself. An app like Eliza has the potential to offer the benefits we'd normally get from friends and family, but from a "neutral" stance. Its removal of the human perspective, with all its bias and messiness, does what good therapy, especially cognitive behavioral therapy, professes to do: give advice in the best interest of the client by training them to question their own line of thinking without getting bogged down in the emotional details.
But what is life if not emotional details? As the game explores, the intuition of an AI can only go so far. Rainer, Skandha's CEO, is confident that accumulating enough data, session feedback and user info will eventually yield an app capable of fully nuanced conversation. But Evelyn, in her natural desire to help, is constantly at odds with the restraints imposed on her as a proxy. Eliza's responses are limited, and deviation from the script is cause for termination. The lack of room for improvisation forces a one-size-fits-all approach that doesn't encourage the open dialogue necessary to form a trusting bond. While some of her patients seem to appreciate the impartiality of a "robot therapist," most get frustrated with the lack of real solutions or answers. No matter how effective the app, Evelyn is left with the sense that the constraints on her honesty might be doing damage.
And despite Rainer's lofty utopian aspirations, there is a contradiction in trying to build a foolproof, automated diagnostic system that is somehow superior to the work of humans yet based on the human discipline of psychology, filtered through the error-prone process of programming. After all, Eliza is built on the analysis of a user's tone and vocabulary, and many people have difficulty expressing their actual needs. It's hard to judge an emotional state from the words that aren't there. As Evelyn's sessions demonstrate, the app also offers solutions that just aren't practical for people with real problems. A nature meditation program isn't going to give an objective answer about what you should do after cheating on your girlfriend. What is the point of seeking advice or perspective about a human life from a source that has never felt pain? How does one feel heard if no one is really there?