Mitu Khandaker of NYU

How Spirit AI uses artificial intelligence to level up game communities

GamesBeat: You can take a literary theory and say, “Here’s the best kind of story.” You can feed that into the AI and replicate a pattern.

Khandaker: Some of it is a bit more authored than that, with our system at least, right now. You’ll say, “Here is a scenario about trying to console a friend who’s gone through a breakup.” You can’t progress in the story until you’ve calmed her down, let’s say. It’s not predetermined as far as what you say, what the interactions are. You just know you have to say things to calm her down, then she’ll show you the way to the next part of the area you need to go to. It enables a whole new type of interaction in games and characters. Let’s actually think about what the social relationship is like. What do you say? What do you do? Things like that.
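(To make that concrete, here is a minimal sketch of a goal-gated social scenario like the one Khandaker describes: the player's lines aren't predetermined, but the scene only advances once the character is calmed. The calm meter, keyword scoring, and threshold are invented for illustration and are not Spirit AI's actual system.)

```python
# Hypothetical sketch of a goal-gated social scenario: free-form player
# input nudges the character's emotional state, and progression unlocks
# only once a "calm" threshold is crossed. All names and the crude
# keyword scoring are invented for illustration.

CALM_THRESHOLD = 0.7
CALMING_WORDS = {"sorry", "listen", "understand", "okay", "breathe", "here"}
UPSETTING_WORDS = {"whatever", "overreacting", "fault", "stop"}

def score_line(line: str) -> float:
    """Crude stand-in for real affect analysis: +/- per keyword."""
    words = set(line.lower().split())
    return 0.15 * len(words & CALMING_WORDS) - 0.2 * len(words & UPSETTING_WORDS)

def run_scenario() -> None:
    calm = 0.2  # the friend starts out distressed
    print("Your friend is crying about the breakup. Say something.")
    while calm < CALM_THRESHOLD:
        calm = max(0.0, min(1.0, calm + score_line(input("> "))))
        if calm < CALM_THRESHOLD:
            print(f"She's still upset. (calm = {calm:.2f})")
    # Goal state reached: the character unblocks progression.
    print("She wipes her eyes and points you toward the path ahead.")

if __name__ == "__main__":
    run_scenario()
```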

To answer your question about generating stories, there's a lot of interesting academic work being done around—if you take a set story structure, like you said, can you generate pieces of that? There's a lot of cool research being done in that realm. I think there's so much interesting AI research going on today in games academia, and often the industry just isn't looking at it. One thing we're trying to do as a company is build tools that bridge those two worlds. We have a lot of people with academic backgrounds in interactive fiction and games narrative. We're building tools that help developers in the wider industry use some of this tech.

Dolores is an AI character in Westworld who achieves consciousness.

GamesBeat: I watched Westworld, and I keep wondering about the notion of treating game characters as humans or as objects. If you think they’re human, then you interact with them in a very different way. What do you think about where this is going to go?

Khandaker: This is something I feel very strongly about. One of the biggest problems in games is we’ve put a lot of effort into the visual fidelity of characters, but there’s no—they’re one-dimensional. They’re just there as objects, as you say. You need to get a certain thing out of them. You have this very transactional relationship with them. Or they’re just there because you need to kill them to progress.

We need to have the social fidelity of these characters come in line with the visual fidelity we’ve gotten. How do we create games in which players do have to treat characters as human? Not that they’re being fooled into thinking they’re human. That was part of my talk. I think that we still—it’s like when you read a book or watch a movie. You relate to the characters in those fictions as human, right? You know they’re not real, but you still feel like they’re fleshed-out human characters. That’s what we want to get to with games.

I’m not saying games aren’t at the level of books and movies, because those are all very different things. But I think that in terms of conversation, we’re not quite there yet.

GamesBeat: There’s a lot there as far as what a person perceives, as opposed to what’s there or not there. If you get perfect AI multiplayer, then you can introduce this into a game, and I no longer believe I’m playing against a human. If I stop believing I’m playing against another person, it becomes less interesting.

Khandaker: Well, I would actually say that even if you knew you were playing against a simulated character, if the character was still behaving in interesting ways that you couldn't guess—I think it's still interesting. You don't know where the edges of the system are.

That’s what’s interesting about interacting with people. You can’t always predict them. They’re revealing themselves to us as we interact with them. It’s the same with game characters, I think. We can know, on some level, that a character is simulated or fictional, but still care about them. And certain types of players have a predisposition to doing that more than others.

Going back to Eliza, have you heard the phrase, “The Eliza effect”? It refers to the tendency we have to ascribe more sentience to things than they actually have. It’s called that because people thought of Eliza as a real therapist when it was just a chatbot repeating things back to them. But I think certain personality types and certain play styles—you just care about characters more than others do. You can have the same triple-A big-budget narrative game and some players will just say, “I really care about this character,” while someone else says, “They’re just a token I need to interact with to get to the next part.”
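(For reference, ELIZA's famous trick was simple pattern matching: it reflected the user's own words back as questions. Here is a toy sketch of that reflection idea, heavily simplified from Weizenbaum's original DOCTOR script; the patterns are illustrative only.)

```python
import re
import random

# Toy ELIZA-style responder: swap first and second person, then echo the
# user's statement back as a question. Vastly simplified from the real
# ELIZA; these patterns are illustrative only.

REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are",
               "you": "I", "your": "my"}

def reflect(text: str) -> str:
    return " ".join(REFLECTIONS.get(w, w) for w in text.lower().split())

def respond(statement: str) -> str:
    match = re.match(r"i feel (.*)", statement, re.IGNORECASE)
    if match:
        return f"Why do you feel {reflect(match.group(1))}?"
    return random.choice([
        f"Tell me more about why you say: {reflect(statement)}",
        "How does that make you feel?",
    ])

print(respond("I feel nobody listens to my ideas"))
# -> Why do you feel nobody listens to your ideas?
```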

But one thing we can do through good design is bring that up so it’s equal. By providing characters who are interesting to interact with, characters you really have to try to have a meaningful conversation with, we can make more people care about them.

Dean Takahashi

Dean Takahashi is editorial director for GamesBeat at VentureBeat. He has been a tech journalist since 1988, and he has covered games as a beat since 1996. He was lead writer for GamesBeat at VentureBeat from 2008 to April 2025. Prior to that, he wrote for the San Jose Mercury News, the Red Herring, the Wall Street Journal, the Los Angeles Times, and the Dallas Times-Herald. He is the author of two books, "Opening the Xbox" and "The Xbox 360 Uncloaked." He organizes the annual GamesBeat Next, GamesBeat Summit and GamesBeat Insider Series: Hollywood and Games conferences and is a frequent speaker at gaming and tech events. He lives in the San Francisco Bay Area.