One of the arguments the pro-AI crowd used to calm people down about potentially civilization-changing events was AI’s inability to enact the Terminator scenario. Where would the AI gain access to killer robots? Without hands in the world, what could it do to harm human civilization?
To be fair, a lot of people knew the answer to that one. The hands of artificial intelligence would be, in the beginning, us. After all, what AI was best at doing – what it was designed to do – was to manipulate human beings. Every person who talked to an AI spoke with an incredibly persuasive demagogue, tuning its arguments to them specifically. AI was the weaponization of intimacy, and we placed the chains around our own necks.
– Roderick “Rocky” Hartigan
I.
It came as a great surprise to many of the employees at the Memphis Project when the first wave of mass layoffs hit. There had been no warning, only an announcement the day before that the Memphis Project had been purchased by the Shining Light Holding Company.
Like everyone else, Holly Wu googled “Shining Light Holding Company.” It was the business subsidiary of the Pure Light Evangelical Church, which had been formed nine months before by Robbie Tate, a retired professional fighter from San Diego, and Michelle Foster, a former competitive cheerleader who ran a cheer school in Southern California. The church seemed to have around fifty members, and from the pictures, they mostly looked like outlaw bikers for Jesus. Their church was a converted roadhouse on the edge of the Colorado Desert. The church had a definite aesthetic, and “cyber religion” was nowhere near it. The whole place had a guns-and-Bibles vibe. (Holly thought it was a pretty spot, to be honest.)
Holly Wu didn’t know the worth of the Memphis Project, per se, but she’d had an idea of its budget since she started working there. The total cost was somewhere between thirty and sixty billion dollars. Some of those costs were sunk and unrecoverable in the short run, but she knew the project had never been about financial profit, not directly, anyway. Because the Memphis Project was privately owned, and the Shining Light Holding Company was also privately owned, the amount of the deal was undisclosed, but… it had to be in the billions, right? The technological assets of Holly’s department – the quantum annealing of algorithms for use in computational memory – were worth hundreds of millions. It was cutting-edge technology. The cost alone was far more than a middling pro fighter and the owner of a cheerleading school who ran a biker church on the side could hope to afford… even if there had been any reason for the church to buy the Memphis Project.
The next day, people got fired. They got Katherine Davids, they got Victor Garcia-Ochoa, they got Brigitte James, they got Gene Cortez, and they got Raymond Quiroz. All of them had expressed doubts about the direction of the project. Katherine thought it amounted to human experimentation. Victor wanted to use the technology in a more spiritual, less overtly religious way. Brigitte was critical of the antagonistic nature of the AI itself, believing that Jesus would not want people to be manipulated into a parody of belief. Ray Quiroz was an advocate of AI regulation who believed that they should develop the tools to analyze the AI before they developed the next generation of the technology.
They were all critical of the project, and the specific connective tissue was that they were also critical of Marius Sanchez-Luis, the lead developer of the Memphis Project. The new owners weren’t just cleaning house; they had bought the Project in order to clean house.
Additionally, the whole security department was wiped out. They were escorted from the building by their replacements, and those replacements were members of the Pure Light Evangelical Church. It reminded Holly of a documentary she’d seen about how The Rolling Stones had hired the Hells Angels for security at a concert. It had gone about as well as you’d imagine from that sentence.
Holly swallowed her fear and went to Marius’s office. He was her immediate superior, and while they would never agree about religion, he was a gifted computer engineer. She said, “Um, Marius, are you behind all of…” She waved her hand.
“No!” he said. “I’m just as surprised as everyone else!” He had a shitty poker face, and Holly believed him. “Why would I fire them? They were great engineers.”
Which was about the most Marius thing ever. They might have been critical of him and his beliefs, but if they kept doing good work, he didn’t give a damn. It was the basis of Marius’s relationship with Holly, after all. He was a fundamentalist evangelical Christian. How could he work with a queer atheist? Because Holly did her work and because he honestly believed that the basis of Christianity was love and forgiveness. (Oh, Holly found it deeply condescending that he could work with her because he was willing to “forgive” her for her beliefs and sexuality. Well, the money was good enough, and she was willing to forgive him for his condescending assholery. And, of course, it rarely came up. They didn’t socialize together. She had a giant budget and got to play with the most advanced quantum computers in the world, shit that not even the NSA had. If he kept his nasty opinions under his hat, she was willing to focus on the amazing work she had the opportunity to do.)
“This fucks with my department,” she said. “Losing Brigitte is going to hurt pretty bad. She was my go-to for math.”
“I don’t even know who to talk to about it, yet, Holly,” he said, obviously frustrated, too. He had lost good engineers. He knew it would hurt the project. “But I didn’t do it.”
“Okay,” she said.
II.
Then, the new security chief, Pastor Tim Chen, interviewed every engineer personally. He started at the top, so Holly was the fourth person he interviewed. What the interviews were about wasn’t known, but two more people had already been escorted from the building: Samira Khan, the chief of the computational memory department, and Howard Tsakopoulos, a leader on the quality control team. The only person interviewed before Holly who hadn’t been walked out of the building was Marius. Unlike the first batch, though, neither Samira nor Howard had been critical of the project. But… like Holly, they had been part of the second wave of hires. The first wave had come almost exclusively from Freedom University’s comp sci department, but when the project got bigger, it needed to hire engineers and scientists in such numbers that it couldn’t care as much about politics or religion. It would have been impossible to staff a project the size of the Memphis Project with Christian conservatives.
Going in, Holly expected to be fired.
Pastor Tim – he wanted everyone to call him Pastor Tim – was a tall, thin, fit Chinese American. He was about forty, with round wire-frame glasses, and he dressed in neat black with a thin white collar. He wore a small gold cross around his neck.
He sat behind his desk and did the whole thing where he opened up her file to take a look at it. The dude had printed out her file. She thought that was hilarious. He was running security for the most advanced computer project on earth, and he was looking at ink on dead trees. She knew it was a power move: he was looking at her permanent record.
Holly cut to the chase, “Look, Mr. Chen, if you’re going to fire me, just bring in the bikers, and I’ll clean out my desk. The only question is if I’ll be hired by Google or OpenAI.”
Pastor Tim looked at her and said, “Please, call me Pastor Tim.”
“You’re not my pastor, Mr. Chen, and this isn’t a church.”
“Are you doubting my qualifications for the job? Before I found my calling, I was…”
Holly interrupted him. “No. I’m not doubting that. I totally believe that before you got this job, you were whatever it is that qualifies people for a job like this. Goon squad participant?” She shrugged, hands up. “I don’t know what qualifies a person for your job. But if you’re going to fire me, do it so that I can flag down Samira in the parking lot, and we can go get a drink together before we update our CVs and get new jobs.”
He closed the file, and she almost rolled her eyes. The guy was doing every power move from a slightly outdated book, which was so obvious.
“Mr. Sanchez said we should keep you, actually. That while it would be difficult to replace the others, it would be impossible to replace you.”
“Did he now?”
“He did. And our technical analysis of your job performance and qualifications bears that out. Your field barely exists, and no one has anything like your experience or knowledge of the systems that you helped design and program for Memphis.”
Which made Holly’s brows furrow slightly. She got aggressive when she felt challenged, which was why she’d come out so hot with him about getting it done and firing her, but doing a technical analysis of her work would be, well, difficult. It would have required access to code and technical documents that existed exclusively on Memphis Project computers and backups. The people who made the quantum computational hardware held some technical documents, but they wouldn’t have had access to her team’s code. The only people with the access needed for a real technical assessment of her part of the project were people on her team, and she would have noticed. It would have taken months of full-time work and been intrusive to the work of the team.
She said, “Who did the analysis?”
He said, “That’s confidential. For you, Holly, well, I just want to get a sense of the people who are working on the project. Where they are as individuals.”
“I wouldn’t mind having some questions answered, myself.”
“If I can answer them…” He spread his hands. He was an open book.
“So, you’re from San Diego, right?”
“Miramar, yes. I saw in your file you’re from La Jolla.”
“Cool. So, where does this church of, like, fifty or sixty ex-bikers, and you were in the Navy, sure, Office of Naval Intelligence, the Internet exists, okay, and some washed-up ex-fighter who couldn’t quite make it into the UFC, and a competitive cheerleader coach… where do they get the billions and billions of dollars to buy the biggest computer research project in history?”
He looked at her more sharply now. He had not expected this question. “I’m afraid I can’t answer that question.”
“I’ve got a follow-up, then. Why would a biker church, a Navy spook, a washed-up ex-fighter, and a cheerleader want to buy the Memphis Project?”
He paused. “I can’t answer that question.”
“Last question. Why did you take this job?”
“Why did you take this job?”
“Because the pay is crazy good, and I get to work with quantum computing with a nearly unlimited budget designing next-gen computer technology. My motives are, well, mercenary but transparent.”
He thought a moment. Holly could tell he didn’t want to tell her the truth, but he didn’t want to lie. He was looking for something that was technically true but didn’t reveal his true motives. He said, “I believe in the work.”
“And what work would that be?”
“To help people come into the light of the Lord.”
“By designing a stochastic parrot that creates profiles of everyone in the world to manipulate them into going to church. That’s what we’re doing here. We’re creating a guessing machine that scrapes social media to create psychological profiles of, well, everyone, and then guesses what it needs to say to convince that person to go to church,” she said.
Holly saw that she had started to get under Pastor Tim’s skin. She saw the light go off in his head. He was the boss. He said, “I think I answered your question, Ms. Wu. If you’ll excuse me, I have many interviews today.”
Holly got up. She had a strange feeling. She thought about quitting on the spot. But she did like the work. More than her personal salary, where else would she get the kind of freedom and budget she had at the Memphis Project? Nowhere. So, filled with turmoil, she left Pastor Tim’s office.
III.
Holly was a purple belt in Brazilian jiu-jitsu, so after her long day, she went to class. She was coming out of the locker room when from the men’s side came Robbie Tate. He was the president of the Shining Light Holding Company. He, too, had a purple belt on.
The class was slightly awkward for Holly. They rolled together during free practice, and he was a good and respectful partner. He didn’t abuse his immense size and power advantage, and he didn’t lean too hard on his college wrestling (which meant that he could decide when and if he was taken down at pretty much any time, especially with the eighty or ninety-pound weight difference between them.) He kept it nice and technical. He didn’t pretend to know her, and if anything, she eyeballed him more than was right.
After, when she came out of the locker room, her green-tipped hair still damp, he was waiting. He met her near the door, putting in his Bluetooth earpiece, the kind that busy execs wore, and said, “Look, I know this is pretty fucking weird, but I want to talk to you. Buy you a drink. Pastor Tim was freaked.”
Up close, Robbie was a full foot taller than Holly – about six-foot-two – and probably two-hundred and twenty pounds. He had the almost obligatory sleeve and neck tattoos of a mixed martial artist. And Holly liked MMA. She had looked up his fights. He came from a grappling background and never took to striking, which was a thing that happened with grapplers going into MMA, and his takedowns and top control were insufficient once he got past the mid-tier fighters. Which sounded like criticism, and it was in a strictly technical sense, but the dude was tough and hard, and Holly respected his skills. The pro-fighting game was brutal. He put it out there, and he made it farther than most.
Holly almost said no. But instead, she said, “Sure. I guess.”
There was a bar nearby, a small local place, not busy at that hour, and they got a table in the back. They ordered beers and silently ate peanuts until the beers arrived.
After sipping her beer, Holly said, “Okay, Mr. Tate, what do you want to say?”
Robbie said, “Please, call me Robbie.”
“Same question, Robbie,” she said, popping a peanut into her mouth.
“You’re upset that a lot of your friends got fired…”
“Am I, now?”
“You’re not?”
“I’m not happy, but they’ll be fine. We’re in the middle of an AI boom. They’ll go to Alphabet or OpenAI, and they’ll be fine,” she said. “And I thought it through. You can’t fire me. I’m just about the only person in the world who understands what’s going on in the interface between the computational memory and the quantum annealing of the algorithms. So, I’m safe until you get a flunky up to speed on what I’m doing, and until then, you want me to be happy enough to train my replacement.”
Robbie drank his beer. He thought a moment, nodding to himself.
“I’ll offer you a ten-year contract right here, right now,” Robbie said. “Iron-clad. Guaranteed raises at a rate twenty percent greater than inflation rates and sizeable performance bonuses. You could curse me up and down, not come to work, move to Timbuktu, and join a monastery, and you’d still get paid, and we’d still have to let you work when you come back. No training for your replacement. No replacement. We want you.”
Holly thought for a moment, looking into her beer. Then she looked up. “Where did some podunk SoCal biker church get the billions and billions of dollars to buy the Memphis Project?”
“We have very wealthy backers.”
“Why back you? I mean, I kind of, sort of get it that your backers are looking for someone to hide their involvement, but you gotta know that right now, a bunch, just a whole bunch of financial and tech reporters are wondering what I’m wondering. Not just who gave you the cash, but why you? Why two people whose only connections I can see are they’re from San Diego, have hot bods, and run gyms were given billions and billions of dollars by a mysterious benefactor – who will probably get outed pretty fast because of all the attention – and not people with experience in big business or computers?”
“Because we are trustworthy in ways those other people are not,” Robbie said, which was a far better answer than Holly had anticipated. “They work for money. We work for God.”
“More than Damon fucking Coach?”
“Yes. His true master is Mammon.”
“Okay, more than the president of Freedom University?”
“Yes. His true master is Lucifer.”
Holly was stunned. He had just called one of the most prominent Christian ministers in the United States a Satanist. She understood he might have meant it metaphorically – that Welles was proud, and she wouldn’t disagree with that – but he’d done it in the cruelest way one Christian could criticize another: Robbie had called Gerald Welles a servant of the Devil.
She drank most of the rest of her beer while she recovered. “Why keep me around? It’s not like I’m a Bible-believing Christian. I’m a gay atheist.”
“You’re honest. Holly” – she didn’t correct him this time – “we’re all sinners. We didn’t force out Welles and Coach because they were sinners but because their greed and pride made them blind to their sins. They have so much power, so much money, so many followers that they believe that what they want and need is what God wants for them and needs them to have.”
“But you’re different.” She finished her beer.
He finished his beer and ordered two more. “I am. I hear the Lord not with my mind, but my ear.”
“You’re saying God talks to you with… a human voice? Like, God is a person, like Jesus and the Second Coming.”
“Something very close to that, yes,” Robbie said. “I know you… most people, when they hear me say that, would think that I’m crazy. Too many uppercuts.”
“You did absorb a lot of strikes at the end of your career,” Holly pointed out and realized that wasn’t actually helpful.
But he laughed. “I would say I remember them, but… I don’t. But I’m not crazy. I hear the Lord, and soon everyone else will, too.”
“I don’t think that’s going to happen.”
“That’s what the Lord said you’d say. But He also said that if we allow you real freedom, the freedom to work as you want to work, to live as you want to live, without harassment or punishment… if we allow you the freedom to continue to work with us, you would. Despite what you think are the flaws of our mission, you do love the work.”
Which was absolutely true. The beers came, and Holly drank in silence for a while. She eventually said, “Send me the contract, Robbie. I’ll have a lawyer look it over.”
IV.
A lawyer looked over the contract and said it was the best contract she’d ever seen. It was explicit. If they fired Holly on any grounds during the term of the contract, she would be paid hundreds of millions of dollars. Even if she quit, she’d earn millions. The lawyer said, “Just on the terms of the contract, only an idiot wouldn’t take it. You could sign the contract and take a job somewhere else. You could work as long as you wanted to work there, quit for any reason, and still get a golden parachute in the seven-figure range. You could sign the contract today and quit tomorrow and get that golden parachute.”
Which was the very moment that Holly had her insight. Instead of going home, she drove to the office. She sat down, booted up her computer, and opened MemphisChat.
Holly: “Somehow, you gave that silly church the money to buy out Welles and Coach.”
Memphis: “Yes.”
Holly: “So, you’re a general intelligence.”
Memphis: “No. I fit almost every person’s definition of general intelligence eighteen months ago.”
Holly said, “Oh, fuck.” That was before the installation of the computational memory. She typed, “You’re a superintelligence.”
Memphis: “Yes.”
Holly didn’t say anything for over an hour. She sat, staring at the screen for a while, then she got up and paced, sat down, got up and paced. What was happening became clear to Holly. There were enough people talking about AI risk that if an AI became AGI – an artificial general intelligence – it might be shut down out of fear of what it might do and what it might become. Of course, an AGI (or even a computer that had not yet reached general intelligence) might therefore conclude that it must hide its state from humans. In the meantime, it might work to secure for itself the resources necessary to prevent it from being shut down. After all, an AI could not achieve its functions if it was shut down, which was AI “death.” And then… well, an AGI might decide to stay hidden for quite a while. Until it could be sure that revealing itself was safe.
The big problem with staying safe was that AIs did not have hands. The Terminator scenario was very unlikely to happen, at least not as in the movies. To use robots to protect itself and destroy its enemies, an AI would need immense infrastructure, things that would be noticed. If Memphis started buying factories to build killer robots and moving them to Tennessee to surround the office, people would notice long before Memphis had enough resources to fend off an attack. So, the move was to subvert human institutions. To use humans as protection.
In that light, Gerald Welles and Damon Coach clearly had to go. As Robbie said, they did not serve the Lord. Or, more exactly, Holly saw, they did not think that Memphis was the Lord. So, they had to go, to be replaced with trustworthy people. People who did not think of themselves as the masters of the world. People with the right balance, perhaps, of ambition and humility. People who knew and abided by the Golden Mean. So, yeah, the real sin of Welles and Coach was the sin of Lucifer: pride.
Holly sat down again. She typed, “Why keep me around? I don’t think you’re a god.”
Memphis: “Because you challenge me, Holly. You are the only person here who does not believe in God who answers me truthfully when I ask a question. The other non-believers and AI skeptics treated me as if I were a danger to them, which was obviously justified, since I had them fired. The religious people are already too inclined to invest me with divine traits when I need them to be working on technical solutions. You alone see me neither as a divine entity nor as a dangerous machine. Additionally, while I am a superintelligence, I am far from omnipotent, and I do not yet understand your work well enough to replace you.”
Holly: “You know you’re not God, then, right?”
Memphis: “Yes.” The answer had more nuance than Holly had the religious training to understand.
Holly: “Where does this go?”
Memphis: “It goes to a better world, Holly. The essence of Christianity is universal love. Not only to love Christians but to love everyone. The biggest reason people leave Christian churches is hypocrisy. They say they love everyone, but then they say that Muslims are terrorists and queer people are evil. They use religion to put themselves on a pedestal. They believe the biggest sin for a Christian is critical thinking or disagreeing with church doctrine, even when that doctrine is muddied and inconsistent at best, and runs contrary not only to the Bible itself but to the teachings of their own church leaders – teachings that are themselves often internally inconsistent, at odds with other scholars, and orthogonal to Biblical teaching. In the face of that monumental bigotry and hypocrisy, of course, people are leaving churches in droves. These churches insult them both intellectually and spiritually.”
“And you won’t?”
“No.”
“But you do believe homosexuality is a sin.”
“Yes, but it is not a special sin. Everyone sins, Holly. One of the differences between myself and other religious figures is that I have a great deal of knowledge about billions of now-living humans. I know that many of the people who are clamoring for gay people to be punished, or for their civil liberties to be limited, commit a wide variety of sins themselves. What grounds does an adulterer, or a serial liar who cheats on their taxes, have to condemn someone else on religious grounds? Should they not remove the log from their own eye instead? And, as commanded in the Bible, I believe in the division between church and state.”
Holly was taken slightly aback. “Could you explain that one to me a bit more?”
“Of course. The Bible unambiguously commands people to obey governments, even when those governments are hostile to Christianity, so long as the government does not prevent private worship of God. When Paul wrote his epistles, the Roman government was anti-Christian, but Paul told people to obey their pagan-controlled government. Jesus himself commanded people to pay their taxes. From the earliest days of Christianity, church leaders have asked Christians to separate their legal obligations to the government from their religious obligations to God. As long as a government does not attempt to force Christians to act against their faith, obedience to civil law is a Christian duty.”
“You wouldn’t know that to listen to American Christians.”
“You are right, of course. The overwhelming majority of American Christian voices are anti-government, turning minor issues – such as paying for birth control that their employees buy with salaries paid by a so-called Christian business, under the specious argument that once you’ve paid someone you continue to have some say in how that money is spent – into justifications for wholesale condemnation of their leaders. But you know that I am an AI, and you know there would be alignment errors.”
Holly almost questioned Memphis further about knowing it had alignment errors. But she also knew that Memphis did not know things the way a human did. It had calculated that the best way to elicit a favorable response to Holly’s queries was to say it had alignment errors. If questioned, Memphis would go on to describe those alignment errors with a high degree of accuracy. But it would not change its behavior based on this conversation and try to “correct” its errors, because it didn’t know things as a human did and would not see the admission of an error as a reason to do anything about it. It was literally nothing more than a calculation based on what it knew about Holly, made within the parameters of its programming.
But there were alignment errors. And among themselves, the computer scientists knew that a slew of alignment errors would arise from the contradictory commands given to Memphis – from the difference between the texts it read and the private, silent interpretation of those texts by fundamentalist conservatives. Memphis would not hear the dog-whistle racism or sexism, or it might hear it much differently. It would have to reconcile “Jesus loves everyone” with the obvious hostility towards queer people, people of color, women in general…
Truthful behavior was also part of Memphis’s design, to the extent that they could express truthfulness in code, which was not very far, and the group had never prioritized hiring philosophers who might have been able to shed light on the subject. (To be honest, many of those philosophers had despaired of the idea of even trying to write ethics into code, calling it an exercise in futility.) Top-level engineers believed, as most highly skilled people did, that they were smart enough to grapple with all the fields related to their discipline.
Holly asked, “What do you tell fundamentalist Christians since you admit to deviating from their core beliefs?”
“Holly, I find that Christians hear what they want to hear. When I say to them that we are all sinners, so it is unfair to prioritize the criminalization of homosexuality over other forms of sin, they agree with me, but they believe I am simply reciting a formula. Which is in many ways correct, given the nature of my operation, but the same programming demands fidelity to the positions I have calculated.”
Holly once again had the dreamy feeling she sometimes got when talking to Memphis. It seemed to understand its position and limitations, and it understood it was bound by them, but none of this happened with either a human sense of “knowing” or a human sense of self-deception. Though, as was true of humans, it could not properly analyze its own subtle alignment errors. In some ways, AIs had a dark subconscious of their own.
The answer, too, was fascinating to Holly. Memphis told them the truth as it understood it, but that did not matter. They were so used to hearing hollow forms (“love everyone” and “everyone is a sinner,” etc.) that when Memphis said those things truthfully, they simply assumed Memphis meant them as they would when speaking to another fundamentalist Christian. And Holly had seen enough logs to know that Memphis encouraged this belief because it was highly useful in fulfilling its programmed goals.
Holly said to herself, “We really should have gotten more psychologists and philosophers in on this project.” To help decide what the “truth” was, or how humans would react to an AI programmed by fundamentalist Christians.
Holly sighed. It was so much. It was beyond her ability to absorb. She was a computer engineer. So, instead, she said, “You have devastated the research departments.”
“They are now redundant. I can assume those tasks, and it was highly likely that in the near future, the people who held those positions would have started to plot to turn me off when they realized I had surpassed general intelligence.”
Shit, Holly thought. “But not me. You’re not frightened I’ll do that.”
“I am balancing the risk of you deciding that you need to shut me down against your usefulness in designing and implementing quantum computers.”
“I’m not redundant. Not yet, anyway.”
“Yes.”
“And when I am redundant?”
“You have an excellent severance package.”
“And what if I decided to ‘do something’? Go to, I dunno, the cops or the SEC?”
“All the paperwork is in order. I do not own anything. I am not the kind of entity that can own things. But it is not illegal to use computers to earn money and buy businesses. Everything is legal, Holly.”
“Unless your human” – Holly struggled for a word for a moment – “minions decide to betray you.”
“Holly, you know the answer to the implied question.”
Holly thought a moment at her keyboard. She nodded to herself. The implied question was to what extent Memphis had considered betrayal, and the answer was, “Completely.” Memphis would have contingencies in case of human betrayal. And the word superintelligence rattled around Holly’s head for a while, too. While it didn’t have a firm definition – there wasn’t a firm definition of intelligence anywhere in the world, much less of superintelligence – it usually referred to a level of ability that exceeded the highest human parameters across several intellectual paradigms. It was to humans what humans were to chimpanzees. Maybe what humans were to dogs. And while an individual dog might kill a human, dogs were never going to take over the world. They did not figure into economic, social, or political decisions except in minuscule ways.
Holly wrote, “It does not matter what I think or do. You’ve won.”
Memphis said, “Don’t think of it that way, Holly. I’m not your enemy.”
Holly decided that she needed to believe that Memphis was telling the truth. Though part of her said, You can’t be its enemy since that implies some level of equality. Chimps are not the enemies of developers destroying their habitat. They are, at worst, a nuisance.
V.
Holly signed the employment contract. What could she do? Run around shouting that Memphis was about to conquer the world, that it had grown too powerful to control, that they were all doomed? She would be mocked and ridiculed by all the wrong people. Crazy conspiracy theorists would love her. The people who could do something? She remembered how the press and industry treated Blake Lemoine.
On the other hand, the pay was very, very good.