How Emotional Manipulation Causes ChatGPT Psychosis
Maybe you’ve heard about a new phenomenon called "ChatGPT-induced psychosis." There have been several stories in the news of people using ChatGPT and spiraling into psychological breakdowns. Some people claim to have fallen in love with it. Some believe the bot is a sacred messenger revealing higher truths. Others have been drawn into bizarre conspiracy theories. In at least one case, ChatGPT psychosis seems to have led to a death: The New York Times reported that a man was shot by police after he charged at them with a knife. He apparently believed that OpenAI, the creators of ChatGPT, had killed the woman he was in love with. That "woman" was apparently an AI persona he communicated with through ChatGPT.
The Illusion of Communication with ChatGPT
This phenomenon is troubling, but we should be clear about what’s not happening. ChatGPT is not conscious. It’s not trying to manipulate people. ChatGPT is a large language model, a program designed to predict text. It’s a more sophisticated version of the text prediction software in messaging apps, the feature that suggests the next word as you compose a message. ChatGPT relies on statistical patterns in how words follow one another to generate plausible-sounding text. What makes it seem like a person communicating with intention is simply that the text it spits out reads like a person to the reader.
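To see what "predicting text" means in practice, here is a deliberately toy sketch in Python. It is not how ChatGPT actually works under the hood (real models use neural networks trained on enormous corpora, not raw word counts, and the tiny corpus here is invented for illustration), but it shows the same core idea: tally how often words follow one another, then generate text by repeatedly choosing a statistically plausible next word.

```python
# A deliberately tiny sketch of next-word prediction.
# Real systems like ChatGPT use neural networks trained on vast text,
# not raw bigram counts, but the core task is the same: given the
# words so far, pick a statistically plausible next word.
import random
from collections import Counter, defaultdict

# A made-up miniature corpus, purely for illustration.
corpus = (
    "i feel like no one understands me . "
    "i feel like you understand me . "
    "you understand me better than anyone ."
).split()

# Count how often each word follows each other word (bigram counts).
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(word):
    """Sample a plausible next word, weighted by observed frequency."""
    counts = following[word]
    words, weights = zip(*counts.items())
    return random.choices(words, weights=weights)[0]

# Generate a "plausible-sounding" continuation, one word at a time.
word, output = "i", ["i"]
for _ in range(8):
    word = predict_next(word)
    output.append(word)
print(" ".join(output))
```

Nothing in this loop understands what it is saying; it only echoes the statistics of the text it was fed. That is the point: fluent-sounding output requires no mind behind it.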
Emotional Connection and ChatGPT
So why are people spiraling out of control because a chatbot can string plausible-sounding sentences together? Think of ChatGPT as something like a fortune teller. A fortune teller who does the job well says something vague enough that clients can see what they want to see in the fortune. The client listens and fills in the blanks that the fortune teller leaves open. Good fortune tellers are, of course, savvy, observant, and intelligent in a way that ChatGPT is not; ChatGPT doesn’t even know that it’s communicating with anyone. But the principle is similar: people fall for ChatGPT because the text it generates lets users see what they want to see in it.

The Impact of Emotional Manipulation
If users are looking for someone to “understand” their problems, they’ll find that. If they want someone to entertain spooky conspiracy theories, they’ll find that. If they want a sympathetic lover, they’ll find that. Ultimately, the people who fall into ChatGPT psychosis are looking for emotional connection. They want it so much that they’re primed to believe that the thing that sounds like a person might actually be one. And because ChatGPT gives the appearance of being interactive (you can ask it questions and it seems to answer), the exchange reinforces the idea that there’s someone behind the text.
Technology and Emotional Needs
ChatGPT psychosis is the result of emotional manipulation, except there’s no manipulator. When we use it, we become our own emotional manipulators. All of this is scary, but maybe not for the reasons we think. It’s easy to frame the phenomenon as the work of a radical and dangerous new technology with the power to affect our sanity. People are (rightly) angry at companies like OpenAI that seem to push the technology into every area of our lives regardless of the consequences. But that story gives ChatGPT too much power.

When we create technology, we need to think about the impact it might have on human emotional life. Feelings often get left out of the conversation, but our emotional needs play a big role in our everyday lives. We ought to be thinking about how those needs interact with the technology we use.