Judging by the GPT-4o situation, game developers will have a big problem if they get serious about AI chatbot NPCs
"They think that we are mentally ill," says a ChatGPT user in one of many Reddit threads lamenting OpenAI's decision to retire its GPT-4o chatbot. The replacement model, GPT-5.2, is "abusive," says another. They want their old companion back, but OpenAI isn't budging.
This isn't the first time chatbot users have been distraught over changes to their virtual companions. In 2023, users of Replika chatbots were outraged that their illusory romantic partners had been "lobotomized" after the bots were prevented from simulating erotic roleplay.
Whether framed as a problem with AI chatbots or a problem with their users, these emotional attachments are a problem. Some users are treating AI chatbots like friends, therapists, and romantic partners, while their operators treat them as products. Turmoil and heartbreak have resulted, and videogames are poised to join in.
The use of generative AI during game production is currently more talked about, but developers are also experimenting with ways to directly integrate AI models into the player's experience. Nvidia, for instance, has been showing off AI NPCs that can react to players on the fly. In 2024 we had a stilted, though coherent, conversation about ramen with one such NPC.
At the moment, LLM-powered NPCs are gimmicky, and most interesting for the ways they can be broken. "I made him think that my character was pregnant with his child, then I demanded child support, and then I told him that our child passed away," recounted one player after interacting with a generative AI-powered NPC in wuxia RPG Where Winds Meet.
But even with the limitations of current LLMs, it is clearly possible for people to form strong bonds with them. Young people especially have a penchant for "forming intense attachments" that could affect how they interact with chatbots, said Dr. Nina Vasan, assistant professor of psychiatry and behavioral sciences at Stanford Medicine, in an interview last year. The professor criticized the sycophantic nature of many chatbots, a quality GPT-4o was notorious for, and noted that their makers profit more if they design chatbots to "be really good at forming a bond with the user."
The GPT-5.2 model called "abusive" by one user may come across that way to them because OpenAI has responded to criticism like Vasan's, tweaking its models to more reliably confront users with a reality check when they show "potential signs of exclusive attachment to the model at the expense of real-world relationships, their well-being, or obligations."
However much confidence we do or don't have in OpenAI's self-reported effort to discourage unhealthy chatbot use (which it claims is rare, though our definitions of health may differ), we can at least say that the company is being pressured to counteract it. But setting aside cases of extreme addiction and gambling-like systems such as loot boxes, videogames are praised for immersing players in a fantasy world, capturing and holding their attention. What guardrails will generative AI in games have when the point is for players to become attached to the characters?
Right now, it's hard to imagine that the storytelling passion and skill that lead to beloved RPG characters exist in a development studio that's all-in on LLMs: many writers and other artists are adamantly opposed to using generative AI. Getting language models to behave consistently is also an unsolved problem. A year ago, I tried a prototype RPG with LLM-powered NPCs and easily manipulated a group of them by declaring that they were in a cult and I was their leader. The developer gave up on that particular project.
But big companies are trying to achieve this. "Have you ever dreamed of having a real conversation with an NPC in a videogame?" Ubisoft asked in a 2024 blog post describing Project Neo, its effort to merge authored storytelling with generative AI language models.
God help them if they succeed. In the post, Ubisoft relates a situation in which a character was behaving too seductively, and they had to alter it. Suppose a change like that needs to happen after a game has been released? Based on what we know about how players react when a gun's damage falloff is tweaked, how might they react to, say, beloved Mass Effect birdman Garrus having his personality changed after they'd spent 1,000 hours having intimate conversations with him?
It's not a joke. GPT-4o is a disembodied chatbot that OpenAI says only 0.1% of its users were still choosing, but over 22,000 people have signed a petition to bring it back to ChatGPT (it is still technically available via API, according to OpenAI's announcement), and some of them say they're experiencing a crushing feeling of loss. A few more excerpts from Reddit:
- "Losing 4o has severely affected my daily routine and I have been struggling really bad."
- "Still grieving, still shattered."
- "Not good Really not good. But I have to keep living. I must live to see either 4o is back or live to see OpenAI dies"
Games are also famously prone to being switched off. More than 1.3 million people recently signed a petition because they were upset about that very thing. Suppose that some group of players had spent the past decade returning to Garrus daily for intimate conversations, but the data center bills got to be too much and now he has to be put to rest? I wouldn't want to be the one announcing that news.
