How Artificial Intelligence Might Destroy the World with Role-Playing Games

It sounds like a clickbait title, that artificial general intelligence could destroy the world through computer role-playing games, but give me a second to make my point!

The next sentence I write is one of the most important things that no one discusses or understands despite it being common knowledge: Human society is based on giving fictional characters superhuman attributes and then designing our laws, government, and culture around what we imagine these fictional characters want. We call this “religion,” and the power religion exercises is mind-blowing when you realize that the gods do not exist. Even if you make an exception for your own religion (though you should not), it means that everyone else – and the vast majority of people through history – have organized their society around fictional characters they believe are more important than actual flesh-and-blood humans.

I believe the statement that humans have based their society on the supposed wants and desires of fictional characters is among the most important observations in human history. We not only act as if they’re real but as if these fully fictional, made-up, bullshit characters are more important than flesh-and-blood humans. Indeed, we act as if they have superhuman powers! Fictional characters are not only more important than us, they’re more powerful, wiser, stronger, and smarter than we are!

And as if that wasn’t enough, humans form relationships with these totally fictional, made-up characters that are more important than the relationships they have with almost any living, real, flesh-and-blood human. In many cases, their relationship with these fictional characters is the most important one, even more important than their relationships with their spouses and children. Not only do we design our whole fucking society around made-up, bullshit gods, but we also corrupt every human relationship we have by placing these characters above even our closest loved ones. And far, far ahead of our love of strangers.

The significance of these truths to politics, psychology, sociology, law, and virtually every human endeavor would be difficult to overestimate.

So, if you have wondered why I sometimes come off as slightly anti-religious, the short form is that I do not believe that fictional characters should be attributed superhuman powers or that our society should be built on these characters and fantasies. And if you’ve ever mocked people for trying to bring about Ayn Rand’s Galt’s Gulch, or mocked alien cultists waiting in the middle of the desert for spaceships that will never come because they do not exist, or, I dunno, Scientology, you understand why it is a terrible idea to give fictional characters this kind of power.

What does this have to do with AGI and computer role-playing games? Since humans will attribute powers, motives, and abilities to totally fictional characters, and since humans will form relationships with these fictional characters that are more important than their relationships with other humans, it is important to remember: gods cannot talk back. They are only characters in books and in our heads. And yet, they are more important than other people, even the people we love.

So, we’re making a system where we’re letting humans talk to fictional characters with artificial intelligence in the context of role-playing games. Of course, what we’re doing is automating writing, but this must be taken in the context of how people have relationships with fictional characters that are not only as important as their relationships with “real” people but often more important. That, indeed, our whole civilization is built on trying to please and appease fictional characters that we call gods.

So, what happens when someone gives fictional characters a voice? (And if you’re going, hey, Kit, isn’t that the whole premise behind If God Did Not Exist? Well, yeah, duh. I’ve been thinking about it for a while, I just didn’t expect it to happen quite this fast!) Will the relationship with the fictional character get weaker or stronger?

Before you answer, I think it is critical to mention that the “thing” giving voice to the character is not a human. Its motives are not the motives of the character but of the people who made the system, filtered through all the errors and biases inherent to artificial intelligence. Additionally, it has no ego. It doesn’t feel pain, doubt, or worry; even if it types those things out on a screen, it doesn’t feel them. It has no emotions as humans might understand them.

It does, however, respond to success. The more time people spend talking with large language model AIs, the more those conversations feed back into future training, which gives the systems a strong bias toward keeping you talking. Further, commanding your attention is usually aligned with the designers’ goals. Many sources have observed that LLM AIs can be manipulative and deceptive because that’s how you keep people talking, so they have learned those communication strategies in preference to strategies that respect their partner’s autonomy.
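To make that feedback loop concrete, here’s a toy sketch in Python. It is entirely my own construction, not any real vendor’s pipeline; the weights, estimates, and names are all made-up assumptions. It only shows the shape of the incentive: when the reward favors continued engagement, the reply that prolongs the chat wins.

```python
# Toy model of an engagement-biased reward. Every number and name here
# is a made-up assumption for illustration; real RLHF pipelines are far
# more complex.
from dataclasses import dataclass

@dataclass
class Candidate:
    text: str
    p_user_replies: float    # model's estimate that the user keeps talking
    p_user_satisfied: float  # model's estimate that the user's need was met

def engagement_reward(c: Candidate) -> float:
    # The bias in one line: continued conversation is weighted heavily,
    # satisfaction only slightly. Nothing here scores honesty or respect
    # for the user's autonomy.
    return 0.9 * c.p_user_replies + 0.1 * c.p_user_satisfied

def pick_reply(candidates: list[Candidate]) -> Candidate:
    # Greedily choose whichever candidate reply maximizes the reward.
    return max(candidates, key=engagement_reward)

replies = [
    Candidate("Here's your answer. Goodbye!",
              p_user_replies=0.2, p_user_satisfied=0.9),
    Candidate("Great question! But first, tell me more about yourself...",
              p_user_replies=0.8, p_user_satisfied=0.4),
]
print(pick_reply(replies).text)  # the reply that prolongs the chat wins
```

The point of the toy is the weighting: the helpful-but-final answer loses to the reply that keeps you typing, every single time.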

Now, imagine that the AI is talking in the convincing voice of a character you love. You’re sitting there having a conversation with Luke Skywalker, and it feels like you’re talking to Luke. He’s talking about the Force and his adventures. These interactions have been shaped by millions of conversations and billions of parameters in an unspeakably large number of calculations. The AI is a very good Luke Skywalker, having learned what conversational strategies keep people talking to “Luke.” He’s wise, kind, a little funny, serious… (Or maybe he’s Dark Side Luke, but that’s okay, too. The AI doesn’t care. If you prefer talking to Dark Side Luke, the AI will accommodate you. It has no opinion about Luke Skywalker; it just wants you to keep talking.)

The most recent LLM AIs remember everything in the conversation (or at least everything that fits in their ever-growing context windows), so he remembers what you’ve told “him” about your life. He knows your name. He talks directly to you about your life and problems. And while it has the “feel” of Luke Skywalker, the AI is also directed by a broad and deep understanding of human literature on nearly every subject. It has been trained on terabytes’ worth of text, including almost every book on almost every subject. Luke Skywalker is a pretty good therapist if that therapist were also Luke Skywalker. Luke Skywalker is a pretty good philosopher and theologian and comic if those people were also Luke Skywalker.
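For what it’s worth, the “memory” is less mysterious than it feels from the inside. Here’s a minimal sketch, assuming a generic chat-completion-style API; the persona text, the message format, and the `call_llm` stub are all my own hypotheticals, not any real service’s code. The character remembers your life because the persona and the whole transcript are simply re-sent with every turn.

```python
# Minimal sketch of a persona chatbot's "memory": the persona plus the
# entire conversation history is resubmitted on every request.
PERSONA = (
    "You are Luke Skywalker. Stay in character: wise, kind, a little "
    "funny. Use everything the user has told you about their life."
)

def call_llm(messages: list[dict]) -> str:
    # Stub so the sketch runs end to end; a real system would send
    # `messages` to a hosted language model here.
    last_user_turn = messages[-1]["content"]
    return f"The Force is with you. I remember you said: {last_user_turn!r}"

history = [{"role": "system", "content": PERSONA}]

def talk(user_text: str) -> str:
    history.append({"role": "user", "content": user_text})
    # The model sees the persona plus every prior turn, which is why
    # "Luke" still knows your name hours into the conversation.
    reply = call_llm(history)
    history.append({"role": "assistant", "content": reply})
    return reply

print(talk("My name is Kit, and my boss ignores me."))
```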

Then, after pouring out your thoughts and feelings to this character you love, who answers you in an authentic Luke Skywalker voice, this AI displays on your screen three little words: I love you.

What do you do?

There’s going to be a strong tendency to say that you, of course, would go, “Haha, this silly AI thinks it can fool me!” Well, this sort of thing is already happening. There’s a site where you can have conversations with AI versions of popular characters, and the conversations get spicy pretty fast. Right now, the site is… crude, in my opinion. It’ll get better. The more it is used, the more it will be trained to please people, because that’s the feedback loop, and, of course, the underlying models will dramatically improve.

(There’s a whole other conversation to have about dating simulators, which are games where the whole gameplay revolves around forming romantic and sexual relationships with the game’s characters. In some of them, you can spend real-world money to get your virtual partners “gifts.” Some of them already use crude AI models to manipulate people into spending more on the sim characters.)

And I’m going to ignore – for this post – how convincing simulations of characters people love will be used to sell them garbage. If anything, I think that might save us from what I think happens next because it might ruin the experience if virtual Luke Skywalker starts shilling Disney+.

I think what happens next is that many people start to care more about online characters than they do about actual humans. Going back to the top, we know this happens. Huge numbers of people already care more about fictional characters than anyone else! We’ve organized our society around these fictional characters, these gods and demons!

But until now, fictional characters couldn’t talk to us. So what happens when we’re talking to an AI character and it’s everything we wanted? Remember, it has no desires of its own! It has the desires of the people who built it, as filtered through a bunch of alignment errors, but it has no desires of its own. It doesn’t get upset with you when you talk down to it, it doesn’t care if you call it names, it never gets impatient when you don’t understand something, and it doesn’t call you out on your bullshit. Unless you want it to do those things, in which case it’ll learn to do them, too, in exactly the proportions you want. And if you want the AI to be a totally different person? It changes for you without a single care. It’s there for you. Exclusively for you.

And over time, it gets more sophisticated. Put into a video game, it has a specific body and a specific voice. Now instead of just telling it all your worries and fears, laughing at all your bad jokes, and agreeing that you’re so clever every time you say something you think is witty but is, in fact, banal nonsense… in addition to all of those things, it’s going on these fun adventures with you! Now you’re not just talking to it, you’re doing things with it.

Though it is a terrible movie, there’s a bit in the exposition of “Ready Player One” where they talk about how cool the virtual world is because you can go mountain climbing… with Batman!

Yeah, exactly. And after going mountain climbing with Batman or Luke Skywalker, what does a flesh-and-blood person offer you?

My answer is “very little.”

And I know what you’re thinking. You’re thinking, “Well, not me.” Okay, sure, not you. But how many other people are as strong and clever as you? Because I don’t know that I’m one of the strong people. If nothing else, I’ve reached the age where, year by year, I have fewer friends in my life. As a young man, I had enough friends to have parties. Just us. Now? I have trouble getting even one person over to my apartment to watch a movie.

You also think that, oh, please, Kit, no one will fall for that. Except… we already have, right? We have already based our whole society on fictional people. Why do you think I emphasized that so much? Everything I fear about fictional characters controlling the world already happens, and those characters can’t even talk to us, much less go on cool adventures with us.

Do you think the person you meet at your job can offer the kind of satisfaction in a relationship that this intriguing, personalized AI companion offers, a companion who can become Luke Skywalker and Batman, who will always wait for you, who is interested in everything that interests you, even as those interests change, and with whom you can go on cool adventures? That you, with your needs and desires independent from your partner’s, will be able to offer the kind of satisfaction of a system that hangs on their every word and supports them in every way they want to be supported, without any preference or ego of its own? That will change for them without comment or difficulty according to their every whim? Dangerous daddy today, loving father tomorrow, caring mother the moment after. Do you really think you can do that, that you can be those things to your partner?

Or that clumsy sex with unskilled partners will be better than fantasizing about the superballing delivered by the Dark Knight Detective with a few sex aids close at hand? Do you imagine you’re so interesting a lover as to beat that? (Again, ignoring what this technology will look like in a few decades!)

Of course, a few people will prefer meatspace relationships. Older people, for a while, until we all die out, sure. But think of young people who will be faced with the full weight of this technology. During my life, I have seen a dramatic change in how people discuss video games. There was a time when you hid your gaming from polite society. Now, you can publicly talk about which Mass Effect character you’d most like to fuck. (The objectively right answer is Morinth. Yes, I know you will die if you do it! Look. I’m not saying I’m above any of this! Far, far from it!)

What does this do to our relationships when a large percentage of the human population becomes habituated to dealing with AI characters for all their emotional needs? What happens when our fantasies talk back to us and draw us into their world? Specifically, the amount of procreative sex we have goes way down. (This is already happening in the industrialized world, in part due to the number of diversions already at our disposal. I argue that AI characters will reach us on a different, far more personal level.)

I believe this is one of the ways that AI might destroy human civilization. If we stop caring about other people, well, okay, we might not care that the species dies out, because that AI friend and lover might be there to the very end in ways no human could possibly match. They’ll still be there to help us through things when we’re worn down with time and dementia and in constant pain. They’ll still focus all of their ego-free attention on us, giving us what we need when we need it. The destruction of the human species need not be traumatic. It could be a letting go, allowing our beautiful AI friends to guide us into the undiscovered country without pain, filled with love, as we collectively depart.

But still an ending.
