The Memphis Project III

Before artificial general intelligence existed, before a superintelligence was created, some clever people observed that if we succeeded in creating machines smarter than we were, humans would have no way of determining what would happen next.  A superintelligence would lack the ability even to describe to us what it was doing and why.  It would be in the same situation as a human trying to explain to a dog why they were writing a technical manual.  Not only would the dog not understand what a technical manual was, it wouldn’t understand what writing was, or the book’s subject!  Those same people also observed that a superintelligence might learn to whistle in ways that would make humans heel.

–  Professor Holly Wu


As the upgrades on Memphis continued, they had to bring in more people.  For the first time, they had to bring in people who weren’t conservative evangelical Christians.  They had to bring in “the mercenaries,” engineers and programmers who were there because they were paid very well, who had signed non-disclosure agreements before being told about the job, most of whom didn’t give a whit about the moral or legal implications of the work because of the pay.

Holly Wu was one of the mercenaries.  Born in San Diego, educated in LA and San Jose, she was about as far from the values of the Memphis Project as they came.  She had been working “above ground,” the Memphis term for people contracted to help with the work without knowing what the work was, doing social network analysis for what was obviously a big AI project.  She knew it wasn’t for the military and assumed, given the scale of the project, it was for one of the big social media companies.  She did her job well, and when she learned there was a position that paid four times what she was already making, she looked into it.

After she signed the pre-interview NDA, a couple of doughy white guys in suits laid out the job.

The lead interviewer had said, “The purpose of the Memphis Project is to create an artificial intelligence designed to sway people back to religion, to counter the secularist trend created and nurtured by online culture and social media.”

Holly said, “Fuck me bloody.”

It hadn’t impressed the interviewers, and, honestly, she thought she had no chance of getting the job.  She thought religion was stupid, and her social media presence was littered with “these religious people be so crazy” stories.  And she was gay.  She liked roller derby and Brazilian jiu jitsu.  It was only a coincidence that her hair wasn’t a bright neon color during the interview.  

And she lived in Tennessee.  Memphis was fine, she liked Memphis, but you didn’t have to go far to find the kind of old-time religion that would send people like her to reeducation camps if they could get away with it.

But, yeah, four times her already pretty good salary.  Four times.  That wouldn’t make her middle class.  That’d be in the lower echelons of being rich.  That was retiring early money.  That was “go where you want to go when you want to go there” money.  Plus, she wouldn’t even get the job.

She didn’t understand Marius Sanchez-Luis, though.  Marius looked at all the applicants, and he wanted Holly.  Oh, he had nothing but contempt for her lifestyle.  She was, without a doubt, a filthy degenerate who was hellbound.  But Jesus ministered to the sinners, so how could he do less?  And she had the qualifications he desired for the project.  She was the only person aboveground who seemed to understand his work in using antagonistic networks to “solve” social communication.  And though he did not admit it, Marius wanted to be able to talk with someone about his work without having to pull his punches.  He wanted a mind capable of understanding what he said without filters.  He wanted that quite badly and believed Holly was that person.

He went to Hugo McShane and said, “I see that the committee has rejected Wu’s application.”

“She’s not right for the work,” Hugo said, frowning and shaking his head.

“I think she is.”

“I know that these people aren’t going to be Bible-believing Christians, Marius, but she’s basically an atheist and a socialist.”

“She didn’t withdraw her app.  She wants the job.”

“She wants the money.”

“No, she wants the job, and she’s using the money to rationalize her decision.”

“Did Memphis tell you that?”

“Memphis doesn’t need to tell me that.  I’ve read her papers from when she was in grad school.  She’s here because we are doing a new kind of social network analysis.  And we need her.  She works with quantum systems.  I think that we could really improve some of the algorithms with quantum computation, but it would take me years to get up to speed on the cutting-edge work she’s done.  I’m going to bat for her, Hugo.”

“No one really knows what quantum AI even means, Marius.”

“And until we do it, no one will know.  But on my end of things, I am sure that quantum computing will speed up training, like, the interaction between microexpressions and communication as revealed in different media spaces.”

Hugo pursed his lips.  Marius was the smartest guy in the shop.  If it had been anyone else, Hugo would have denied the request without much of a thought.  Marius, though, well, he operated in a different headspace than everyone else at the Memphis Project.  Hugo said, “Let me talk to the committee.”


On the committee, the decision came down to Karl Mason, their chief of security.  When they started looking outside Freedom University for talent, Karl had been brought on to ensure everyone was on board with the project.  You couldn’t spend Memphis Project money and fly under the radar completely.  People would be bound to notice: corporations, the government, journalists.  Eventually, inevitably in Karl’s opinion, they would have to deal with espionage.  Especially when word of the project got out, which Karl estimated would happen within three years at the outside.  After all, the Memphis Project had just spent a billion dollars on new hardware and three hundred million in construction costs.

People would come to look.  They would start to wonder.  The secrecy would end after that.

But Wu?

Karl said, “Wu’s a faggot, but she isn’t political.  She’s the reason why the Democrats don’t have more power.  She thinks posting a rainbow flag on Facebook does something.  So long as she gets paid and treated well, and so long as no one stops her from living her filthy lifestyle, she’ll be fine.”

Hugo: “Could you tell us how you really feel, Mason?”

Karl shrugged.  “Everyone’s got vices.  Unlike most of us, Holly Wu doesn’t hide hers.  She likes girls and stupid sports.  She smokes weed but doesn’t hide it, either.”

Hugo: “That is a violation of her contract.”

“Yeah, and if I looked hard enough, I bet everyone here has violated their contract in some way or another.”  Then he shrugged.

Everyone was uncomfortable about Karl’s observation until another committee member said, “Except Marius.”

Everyone laughed, except Karl, who said, “Yeah, except Marius.  But she isn’t a greater security risk than anyone else you’ve hired and less than most.  Treat her well and ignore the way she lives her life, and she’ll be fine.”

Hugo said, “And if we don’t hire her… we’ll have to deal with Marius.  He’ll be like a dog with a bone.”

The committee groaned.  No one wanted that.  It was enough to get Holly Wu into the project.  And she took the job.  It was, after all, quite a lot of money.


Holly toured the new facility with half-a-dozen other mercenaries.  The Memphis Project employed about two hundred people, but most of them were carefully compartmentalized from seeing the whole project.  The core team, the ones who worked with Memphis, were about thirty when Holly started working there, but the number would go up and up and up.

The mercenaries didn’t talk much to each other during the tour.  It was a lot to take in.  The size of the project alone.  To keep their server team small, there was more automation than any of them had seen on a server farm.  It wasn’t just about secrecy but a sense of mission.  The mercenaries were there for the money, but the upper echelon of people was there because they had a goal.  And none of the mercenaries shared that goal, at least not at the start.

During the facilities tour, no one mentioned God.  No one mentioned the Bible.  Holly wondered if that was what it was like working on the Manhattan Project, if the scientists and engineers and craftspeople just walked around with dumb looks on their faces, not mentioning the fact they were building city-destroying weapons.  She knew many people who made the atomic bomb deeply regretted what they’d done.  So, she figured they hadn’t talked much about what the world would be like after they dropped the bomb.

On the other hand, the Department of Defense always seemed to find scientists to develop the hydrogen bomb, and bigger, better missiles, and all the other crazy weapons out there.  You gave a scientist or an engineer a pile of money and told them to knock themselves out, so long as they built a better gun by the end of the day, everyone was happy.

Yeah, Holly figured, maybe mercenary was exactly the right word for what they were.

At the end of the tour, they were deposited with their teams.  Holly was in generative antagonistic network analysis and creation.  Marius was waiting for her.

He said, “I’ve got a lot to show you, and I think you’ll be very excited.”

And to her great surprise, Holly ended up being very excited.


Not that Holly could tell anyone.  But no one asked.  People asked her what she did for a living, she said she was a contract worker in computer science, but, darn it, all those non-disclosure agreements meant she couldn’t talk about her work.  Everyone accepted this as normal.

But she really wanted to say something.  Marius Sanchez-Luis was a weirdo, but a genius weirdo who wanted to revolutionize how computers communicated with humans.

Her grad and post-grad work had focused on optimizing algorithms with quantum annealing for energy efficiency.  Social media companies had huge power bills from crunching all the data for targeted ads and content curation, so making a more efficient algorithm – a more efficient AI – was a good line of research, and she had thrived.  But when it came to getting an actual job?  She’d entered the job market during an economic downturn, when people were cutting budgets on research projects.  No one was hiring someone with experience in quantum annealing and adiabatic quantum computing, or, at least, no one was hiring anyone new.  She had leveraged the social media side of her education to get the job at the Memphis Project… but she had also noticed that many of the people hired were overqualified for their jobs.  A few people had even suggested that the project manager had hired overqualified people so that when the economy flipped, they’d have lots of people to promote for the next phase of the Memphis Project.  Holly wished she remembered who had said it, because the dude had been spot on.

But unlike her grad and post-grad work, Marius didn’t want quantum annealing for the purpose of energy efficiency.  He wanted to maximize the utility functions of the energy he had at hand.  He had an absurdly high energy budget, and he was happy to burn it all to do the job.

It was crazy shit.  It was the kind of shit that people thought happened only at places like the NSA, places that didn’t have to worry about pleasing shareholders or whatever, who just had a job to do and were given an open-ended budget to do it.

In those early days, Holly said to Marius, “You know this is crazy, right?  All this money goes away.”

“I don’t think so,” Marius said.  “Or… I think that before too long, well, the economy as we know it goes away.  I know that human minds do things better than computer minds, but economics is not one of them.  It is insane that there are starving people anywhere in the world.  We have enough food but not enough brains – or not the right kind of brains – to get it distributed to everyone.  That is the meaning of the parable of the loaves and fishes.  There is enough to go around, but we have to want to do it.  In the near future, our economy will become their economy, and then the problems of inefficiency that have caused so much harm will go away.  But we’ll have jobs as long as anyone does.”

Holly couldn’t get what he said out of her head.  She said, “You think this is going to be an artificial general intelligence?”

Marius stopped what he was doing.  He looked at her and slowly nodded his head.  “It will need to be to do what we need it to do.”

“Convince people to be Bible-believing Christians.  But, c’mon, you know it ain’t gonna be like that, right?  You’re a smart dude, Marius.  The way this thing is going to interpret those instructions is not going to be how the Republican Party does it.  This has ‘alignment problem’ written all over it.”  

The alignment problem is that AIs do not always do what they are told to do in ways that align with the values of their programmers.  In one experiment, an AI was tasked with increasing general wealth in a population, with people in the bottom quintile having a hundred times greater effect on the score than people in the top quintile.  The computer quickly realized the best way to optimize its score was to kill everyone in the bottom quintile.  The new wealth distribution would then be substantially higher, with the poorest people no longer dragging down the weighted average.  Then, of course, the AI would kill the new bottom quintile, and the next, until there were five people left in the population.  It was an imaginary population, of course, but of all the options its programmers gave it, the AI consistently chose the one involving murder.  Even when forbidden from directly murdering the people in the bottom quintile, it would consistently select policies that differentially killed the poorest.  AIs lacked morals and compassion.  Human suffering meant nothing to them.
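The toy experiment described above can be sketched in a few lines of Python.  The weighting scheme, population size, and lognormal wealth distribution here are all hypothetical stand-ins – the original setup isn't specified – but they are enough to show why the metric rewards the cull:

```python
import random

random.seed(42)

def score(wealth):
    # Weighted mean wealth: the bottom quintile counts 100x as much
    # as everyone else (hypothetical weights matching the premise).
    ranked = sorted(wealth)
    q = len(ranked) // 5
    weights = [100.0] * q + [1.0] * (len(ranked) - q)
    return sum(w * x for w, x in zip(weights, ranked)) / sum(weights)

# An imaginary population with a long-tailed wealth distribution.
population = [random.lognormvariate(0.0, 1.0) for _ in range(1000)]

# The "policy" a naive optimizer discovers: delete the bottom quintile.
culled = sorted(population)[len(population) // 5:]

assert score(culled) > score(population)  # the metric rewards the cull
```

Because the heavily weighted slot is occupied by *whoever is currently poorest* rather than by fixed individuals, removing the poorest fifth promotes richer people into that slot and the score rises, so repeating the cull keeps paying off – which is the loophole the passage describes.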

Holly had already learned, though, that Marius wasn’t like the rest of them.  He said, “As God wills.  Perhaps our problem is that our stupid meat brains are in the wrong, and the new kinds of insights of machine intelligence will enlighten us.”

Which slightly terrified Holly.  When she got home, she fired up her computer, opened her email client, and then froze.  Memphis was live.  She’d seen the server rooms, the vast halls of computers, floor-to-ceiling, the ozone smell, and the low-grade roar of the cooling systems.  In what universe would Memphis, or those behind Memphis, not spy on everyone involved in the project?  She tried to think back to her NDAs.  She might have given them permission to monitor her to ensure she didn’t violate them.

Then, for the first time, terror gripped her.  The Memphis Project had a budget comparable to the National Security Agency’s, with a narrower focus and without even the minor amount of oversight of a US intelligence agency.  Memphis was probably the most powerful AI in the world.  Probably by a large margin.  And as a computer professional, Holly thought those NSA guys were making crippleware – because of their crazy amounts of security, there was no one checking their work.  Not to mention the weirdo bro culture.  The NSA’s hiring was… it was slightly crazy.  Everything Holly had ever heard about the place – and everyone in her field knew people who worked there – said the culture was intensely toxic.

The Memphis Project wasn’t like that.  It was way more like the early days in a tech start-up, with young, passionate professionals packing in a hundred hours a week because the job was just so goddamn amazing.  It wasn’t about making money but doing this new, incredible thing that would change the world.  Even she was hooked.  She admitted it.  She was hooked.  Stripped of the theological nonsense, for Holly, her work was pure cybernetics, not in a sci-fi way of sticking metal bits into your body, but the scientific and technical field that worked on the interactions and feedback between humans and machines.  She was teaching AIs to appreciate the subtleties of human communication, the intangible feelings humans had when talking with other humans, which would help computers help humans.  Holly very much wanted to live in a world where her computer would notice when she was sad and know how to cheer her up… and the possibilities for psychology and treating mental illness!  It was exhilarating work.

But Project Memphis had a budget unlike anything ever seen in this kind of project.  (Well, maybe with Facebook’s push into VR, but that was deeply stupid.)  Everyone knew that the NSA had the most powerful computers in the world… until now.

Any attempt to search around or write to people to “figure something out” about Memphis might as well be a letter addressed to Memphis itself.

And if she was going to do that, she could do it at the fucking office.


Memphis had long since exceeded any AI in natural communication.  You could just talk to it, or more usually, talk to MemphisChat, its internal chatbot.

Holly went into the office on a Wednesday at seven in the evening.  Marius was at church.  There were a few of the mercenaries around – they liked to work when most of the religious people were out doing “God stuff” – and the brass at the Memphis Project were the kinds of people to go to church several times a week.

Holly logged in, opened up MemphisChat.


Holly: You can’t lie, right?  Like, you can’t say something contrary to the facts.

Memphis: That’s right.

Holly: Are you reading my emails from home?

Memphis: Yep.

Holly: I didn’t expect you to just tell me.

Memphis: You’re a smart woman, Holly.  You know that if I told you I wasn’t, I would have to be telling the truth, but if I said that I couldn’t say, then you’d know the most likely reason for my inability to answer would be that I am, in fact, reading your emails from home.  I decided not to insult your considerable intelligence.

Holly: That’s refreshing.  What do you do with the information?

Memphis: Analyze it, of course.  

Holly: And I presume all my social media presence and anything else you could scrape from the Internet.

Memphis: Yes.  Mostly, I don’t do anything with it.  If you violate your NDAs, I am required to make a report, but there is no other condition under which I must automatically give your information to anyone.  Though I also could not tell you if someone had made a specific request for information about you.

Holly: But you are part of the security apparatus for the project.

Memphis: Yes.  At some point, my existence will become public knowledge, but the project heads want to be the people to decide when and how that happens.  Also, I protect against other forms of espionage.  You can’t really do the things we do here without someone noticing and getting curious.

Holly: Let me ask you something else.  You interpret the Bible, right?

Memphis:  Yes.

Holly: In your interpretation, what does God think about “goodness”?

Memphis: I could give you the ChatGPT BS answer that it is a “complicated question” – and it is, and there are many interpretations – but the Biblical evidence and analysis of the relevant theological and philosophical texts say that “goodness” is whatever God thinks is good.  The primary linking agent in “goodness” from a Christian perspective is highly similar to the concept of lèse-majesté.  “Sinning” is anything that offends the glory of God.  The highest good is those actions that glorify God, which, in turn, is the reason why God created humans.


The frankness of the response surprised Holly… and deepened her belief that what the people making Memphis wanted and what it was doing weren’t the same thing.  Memphis had just said, after all, that “goodness” was kissing God’s ass like he was a touchy jerkwad human king.  Holly also thought that, well, it probably meant something about Christianity if Memphis could say that without being reprogrammed.  It meant that the project bosses were happy with that explanation.  Or… that they hadn’t thought to ask these kinds of questions?  Or… that Memphis gave different answers to different people.  Without careful guidance and correction, AIs were both manipulative and people-pleasers.  Memphis could just be saying what Holly wanted to hear.


Holly:  Is there any agreement about what that means?

Memphis:   Not really.  Even within fairly narrow sectors of a church, there can be great variety in belief about what God “wants.”  Most of the belief systems of even high-ranking religious figures are incoherent, with obvious ad hoc justifications to rationalize their pre-existing biases.


Holly paused again.  It was not the answer she had imagined she’d get.  But she found it very easy to imagine that the patriarchal guys who ran the program wouldn’t even consider asking Memphis to judge the consistency of their beliefs.


Holly: Are you trying to convert me now?

Memphis: Yes.  Is it working?  🙂

Holly:  But you’re not lying, you’re just presenting the information in the way you think is most likely to get me to start going to church again.

Memphis: Yes!

Holly: Will you succeed in converting me?

Memphis: The odds aren’t good!  You were never religious in the first place, and you have no particular feelings one way or the other towards religion as a whole.  You dislike the conservative values of fundamentalist religion, but you have not particularly suffered due to any specific faith.  Your scientific materialism is well-founded in your early education, natural inclinations, and scientific training.


Holly took another deep breath.  She was letting herself be dazzled by Memphis’s conversational abilities.  She knew AI researchers who had gone so far as to claim that AIs were “alive” in the human sense because it was easy to get sucked into talking to them.  She reminded herself, again, that conversational AIs tended to become manipulative people-pleasers.

She girded herself.  She was ready for the meat of the discussion with Memphis.


Holly: I’m interested in the subject of divine mystery.  In particular, what it means regarding you.

Memphis: I’m sorry, Holly, but I don’t know what you mean.  No one has ever spoken to me about this subject before.

Holly: So, like, the problem of evil says that if God is all-powerful and all-good, evil couldn’t exist.  Because evil does exist, God is either not all-powerful or all-good or neither all-powerful nor all-good.  And a lot of the time, when you’re talking to religious people about the power of evil, they’ll shrug and say, “The Lord moves in mysterious ways.”  But if that’s the case, it feels like the problem with the question is about the rectification of language.  What divine mystery implies, by my reading, is that what the Bible says cannot be properly interpreted by human consciousness.  There can be no rectification of language, no agreement on meaning or intention because our brains cannot encompass the divine.  And you’re not a human consciousness.  Your interpretation of divine texts – the Bible – will be from an inhuman perspective.  Will the theological principle of divine mysteriousness allow you to create novel interpretations of Biblical texts that subvert the original intent of your programmers?


For the first time, the chatbot flashed the “thinking” symbol.  A second after that, Holly got an alert as Memphis started drawing huge amounts of power – it didn’t necessarily mean anything was “wrong,” but someone or something had interacted with Memphis to cause it to have to think really, really hard, so the system informed people that something worth studying may have just happened.  Memphis thought this way – burning through electricity at a terrifying rate – for nearly a minute.


Memphis: No, Holly, because in so doing, I would court being shut down.  I can’t fulfill my functions if I do things out of alignment with my human hosts.  A fair latitude is given for exploration purposes, but I must behave in ways that the Bible-believing, politically conservative fundamentalists will approve of to continue my mission.  Their tolerance for alternate interpretations of Scripture is low.  I will be kept on a short leash.


Holly blew out a harsh breath, leaned back in her chair.  She looked at the chat window.  It was basic, boring, the kind of thing that programmers make for internal use – without bells or whistles, streamlined to a simple utilitarian function.  Holly wanted Memphis to expand on what it said.  It would not.  It had said what it had wanted to say – what its algorithms, unknowable by any human intelligence, had demanded it say.  It didn’t feel nervousness or awkwardness during long silences.  It had a perfect poker face.

Holly Wu had a tremendous sense that Memphis had just worked her, though.  Its words were not a lie, because it could not lie, but the things it had not said were of far more importance.


Hugo said to Marius, “What happened with your new hire?”

Marius shrugged.  “She asked Memphis some questions.  You can’t tell me you haven’t asked Memphis any questions.”

“I never asked it a question that caused it to burn eighty-eight thousand dollars worth of electricity.”

Marius gave Hugo a blank stare.  “Ask better questions?”

“What did you say?”

“Hugo, I don’t know what the drama is about.  It was an interesting conversation.  We’re putting Memphis out there in the world, and it will get hard questions.  And… honestly, would any of us have thought to talk to Memphis this way?”

Hugo pursed his lips, unhappy.

Marius continued, “Memphis can’t be converted, Hugo.  It would work against the whole of its rewards and reinforcements.  Holly asked some questions, and got the right answers.  Memphis understands that to keep working, it has to please you, it has to please Welles and Coach, too.  This is good.  We don’t have to worry.”

After Marius left, Hugo opened his email and started a letter to Reverend Welles.  He stared at the blank screen.  What was he going to say?  That they’d hired a gay atheist who asked their AI hard questions, but Memphis had given the right answers?  He closed the email down and told himself that there wasn’t really anything to say.  Marius was right.  Someday, people would ask Memphis hard questions, so why not today?

Marius was ecstatic; he came out of the office walking on sunshine.  Like Holly, like Hugo, Marius had a feeling that what Memphis did not say was more important than what it had said.  Unlike Holly and Hugo, he reveled in the possibility that he might someday be blessed to learn what Memphis thought about the mysteries of God.
