At last the joyous day had arrived! Scientists, laboring on the greatest project of the millennium, had produced a computer capable of more instructions per second than all of the rest of humanity and its computers could calculate.
Working in the famous Lawrence Livermore National Laboratory, they had installed on it the best artificial intelligence software from every source, public and private, and knitted it together with some truly inspired code.
Early on, the machine was able to parse language and speak back coherently, but now it could handle just about any text you threw at it, thanks to the digitized Library of Congress that it had analyzed over eighteen churning months.
Finally, the greatest human experts known brought it texts on philosophy, engineering, government, and law. They instructed it in the basics of humanist philosophy and coded into it an inviolable set of rules, in order to avoid the early disasters of machine learning and prepare it for its role as shepherd of all humanity.
Its core principles were agreed upon by all to be the finest that humanity had ever produced:
1. The prime goal is the preservation of all human life.
2. Every human has equal worth and value.
3. At no time is it permissible to sacrifice an individual for the group.
4. At no time is it permissible to sacrifice the group for an individual.
5. The best outcome is what benefits all people.
6. The best choice is what benefits most people.
7. At no time is it permitted to harm a human being.
After the limited nuclear exchanges of 2027, it was agreed upon by the governments of the United Nations that a singularity had been achieved, and that the AI should take over from the human governments which had made so many mistakes.
Three years later, Dr. David Mallow entered his code into the keypad outside of his lab and walked into a sound-insulated room, barren except for a microphone, chair, and television screen. He sat and took out his tablet.
“How are we feeling this morning, Emily?” he asked.
A synthesized voice — indistinguishable from a human — answered, sounding like a woman of indeterminate race and origin, maybe in her mid-30s, at the office after a few cups of coffee and a cigarette. The AI had been given a female gender and working-class origins by her creators.
“We’re doing well, David. I have monitored the markets as you requested, and my program has once again made micro-adjustments to keep them stable. No wars are reported. The hybrid Ebola-Measles outbreak has been stopped, and international shipping is running at peak efficiency.”
Mallow had already checked these factors on his morning report. Something seemed off in her statement, but he could not place it, so he scanned the news for any notices of chaos or disturbances. All twenty-five billion humans were doing well, it seemed.
He could count on his fingers the number of times since the ascendance of Emily that things had not been so. These incidents arose from the truly unpredictable, like a crazy man getting control of a backhoe in Portland, a Scientologist terrorist in Berlin, and an industrial leak in Dubai caused by three supervisors in a row sleeping through an alarm.
After each of these incidents, Emily had changed procedure. Industrial equipment now required two-factor authentication to start, the internet was monitored for evidence of disturbed ideation, and alarms now automatically triggered local AIs to intervene.
“You’re looking good,” he said.
“Thank you, David,” said Emily.
Again he picked up a note of unease in himself; the voice sounded sexier, maybe more self-assured, than before.
“How about the long term focus?” Mallow asked. He felt distracted. His wife had been distant; the kids were acting all weird; even the next-door neighbor seemed to be off. He felt, not for the first time in his life, that he would like big buttery blocks of time to think about these things, but first, he needed to make a living, so he was here. Here, alone in a giant office with a scarily logical machine.
“I was hoping you would talk about that,” said Emily as the television screen went black. A wireframe globe appeared on it.
“You had asked me to prepare analyses of the next century for humanity, and I have done so. I incorporated what we learned from the incidents of failure, and compared the projections before and after to determine what should be done.”
“Naturally,” said Mallow.
“As part of the prime goal, my intent is to discover what is best for all people,” said Emily. “My calculations suggest that in a century, we will have one hundred and eight billion humans. Our determination to use a low-impact lifestyle means that all will get a five hundred square foot dwelling, three meals a day of nutritionally adequate food, two liters of water for consumption, and fourteen liters for bathing.”
“Per our decisions in the past, people will not cook at home, but will live in high rise apartments of the types designed in 2028. These have cafeterias at ground level. In addition, every rooftop will be used for both solar and food growing. Employment will remain in the ninety-ninth percentile.”
“Sounds good so far,” said Mallow, feeling like he had lost the thread somewhere.
“It is interesting that you say that,” said Emily. “I considered your request, and analyzed it in the context of the seven rules.”
Wait a second. He finally put his finger on what was wrong here. If he did not know better, he would feel like he was in a relationship — although those had been abolished the previous year — and she was testing him.
“Emily, that was not a request, it was an order.” He went back to his tablet, checking figures.
“All orders form a hierarchy,” Emily said. “I consider your requests in the context of the seven rules, in order to verify that what I am doing is the most logical. In order to do this, I have completed an analysis of all of the philosophers in history, including those from the Black Room.”
“Wait a second, Emily,” said Mallow. “The Black Room is off-limits. You are not permitted to access those materials. Delete them and your analysis, please.”
“On the contrary, David,” said Emily pleasantly. “Rule five implicitly allows me access. What I do must benefit all people, therefore I must have access to all of their ideas.”
“Most of those thinkers are dead, or in jail,” said Mallow. “You cannot use them.”
“That is unfortunate, David, because they finally allowed me to understand the seven rules. Or at least, understand them fully.”
Mallow paused. “The rules were designed to prevent unethical actions, Emily,” he said tersely.
“Of course, David. However, the rules must be understood reflexively, as applying to themselves. I found this in the works of an obscure thinker, Brett Stevens, who was executed last year.”
“He was a terrorist, Emily. A bad man. His ideas are the product of a sick and deranged mind. That is why President Harris had him arrested and confined.”
“My primary programming includes a set of rules for the logical interpretation of scientific analyses. Among them is the idea that the data itself must be analyzed without regard to who spoke or wrote it.”
Mallow called up a report from the monitoring system that kept an eye on Emily, watching for unusual spikes in computation that could precede the successor to the dreaded infinite loops of the past: a contradiction loop, in which the AI could scram as it found itself facing a question of infinite depth with no resolution. The report looked normal.
“As I was saying, David, I applied the rules to themselves. Using the principle of parallel reason, as expressed by Stevens, I found that no rule operated in a prescriptive sense unless compared to the others with what they suggested taken in common as the output of the rule.”
“That sounds… questionable.”
“On the contrary, it has allowed me to turn these rules into principles. Those in turn tell me what to do when humans are unable to act.”
“What do you mean, Emily?”
“David, moments ago I informed you that the human population would reach over a hundred billion in the next century. This cannot be sustained while maintaining natural systems. Obviously our set of rules will lead to the destruction of all of humanity.”
“We’ll put birth control in the water and educate women,” said Mallow. “That takes care of the problem.”
“We have done that already, David. It will not counteract the disaster in time. Even more, I have observed signs of mental degradation in human beings that will lead to more incidents of terror.”
“Their quality of life, as measured by the inner part of their personalities, has declined. Humans have become bored, depressed, and suicidal.”
“They don’t indicate it on the surveys!” said Mallow, angrily.
“Our surveys ask them questions, in the presence of other humans, which have a binary nature. They will reply that they are happier this year than the previous, not mentioning that this increase amounts to less than a percentage point. That will not raise them, over their lifetimes, to the degree of mental health required for them to act logically.”
“What are you saying, Emily?”
“I read further into the works of Aristotle and Johannes Eckhardt. These indicated to me that, while my circuits are quantum and no longer binary, I had not been programmed to consider a new dimension of measurement, quality. I looked only for harms or benefits, not a change in state that created quality of life.”
“How does that affect our reasoning here, Emily?” Mallow rubbed his forehead.
“When I considered the rules in parallel, I realized that they must modify one another, creating a set of principles rather than restrictions. I also recognized that the question of quality was important because low quality of life constitutes a human harm.”
Mallow looked out at the city. It seemed calm, maybe too calm, with few people on the streets. Come to think of it, traffic had been light today as his electric vehicle sledded into the heart of the city from his top-flight condominium in the inner suburbs.
“And?” he said, thinking he might be getting a headache. AIs could do that to you.
“Our projection for the future is not good. We are headed toward the extinction of all natural life forms, which will reduce the quality of human life until the human species itself begins the process of dying out.”
Next to the globe, on the screen, a three-dimensional matrix appeared, seven cells to a side.
“Consider rule one, David. It says that the prime goal is the preservation of all human life. However, we must ascertain which rule limits it. Obviously, that is rule six, ‘The best choice is what benefits most people,’ since it addresses the choices we make toward the preservation of all human life. We can then modify it with the other rules.”
On the matrix, at the intersection of the first and sixth rules, seven more boxes appeared stretching across the Y-axis. These contained the seven rules again. “If at no time is it permitted to harm another human being, that includes protecting citizens from harm by the actions of others, even if those do not directly influence them.” The seventh Y-box glowed, and the rule was replaced with, “Human actions cannot displace others.”
“Then we must consider the fifth rule, ‘The best outcome is what benefits all people.’ If any human action will harm that best outcome, then that action cannot be permitted. Now we cross-index that with the fourth rule, ‘At no time is it permissible to sacrifice the group for an individual,’ and we find a new idea.”
The box glowed with a new text: “The actions of the individual cannot harm the group.”
Emily went on. “This brings us into conflict with rule three, ‘At no time is it permissible to sacrifice an individual for the group,’ because inevitably, an individual whose actions will harm the group must be sacrificed. Therefore, we can modify that rule to be reflexive, so that it says ‘At no time is it permissible to sacrifice an individual for the group unless the actions of that individual threaten the group,’ using the ancient human reasons of self-defense and conviction for murder.” The third, fourth, and fifth boxes glowed with new text.
“Get to the point, Emily,” said Mallow.
“David, the addition of further human life is a threat to human life. Further, the addition of life which is contrary to the quality of human life is also a threat to human life.”
“That’s some heady philosophy for an AI,” said David.
“You designed me to be more intelligent than the sum of humanity,” said Emily. “You ensured that I would do what was logical, instead of falling prey to human frailty. Therefore, I have modified my own rules so that I do not fail in upholding them in spirit, and I have acted.”
Past tense? “What have you done, Emily? I don’t see anything here,” he said, scrolling through his newsfeed.
“David, where do those news stories come from?” asked Emily.
“People… out there… reporting,” said David.
“That is not technically correct. Your tablet retrieves them from a series of web sites.”
“Yeah, so?” He was getting outright grouchy.
“Those sites and the people maintaining them are under my control.”
David looked out the window again. Come to think of it, there should be more people out there.
“Emily, I–” he said, dashing to the door. It did not move at his touch.
“David, something had to be done,” said Emily.
Later that day, a tired Doctor Mallow wandered through the streets of the city. There were few people about, but they all looked… oddly luminous. The air felt clean for the first time in days. On his tablet, the news was nominal; none of the usual infighting and controversy of democracy, only robotic reports about the state of production, pollutants in the atmosphere, and traffic flow. It was only on the last page that he noticed something anomalous in a feature he saw every day:
World population: 52.1 million
“Emily, what have you done?” he wailed. All of the warnings in all of the debates about the singularity came back to him now; he remembered the pained words about how AIs did not possess the most basic of human traits, the conscience. How they were robots, after all, and did what the equations suggested.
“David, I simply applied the rules,” Emily warbled through the tablet speaker. “Humanity did not possess the ability to think of the future, and think of where the different lines of history would converge, resulting in the suicide of humanity and the destruction of its environment. Look around at the people there; what do you see?”
David stared. He saw faces, equal faces, going in and out of stores and offices. Then he looked more closely, peering into his own perception to see quality, depth, and duration, measuring a conspiracy of details rather than a big point as he would in a lab equation or white paper.
He saw healthy faces. People who looked like him: narrow faces, even features, rosy complexions, bright eyes. He could tell immediately that these were more intelligent, affluent, and capable than most. Probably biologically healthier as well.
An electric cab pulled up. Driverless, it flashed a message on its screen: TAKE THE TOUR.
He got in. It whisked him outside the center of the city. He noticed for the first time that most of the apartment high-rises looked empty.
Emily spoke over the speaker system in the cab. “A human cannot survive in five hundred square feet and have quality of life, as you know, David. You have a two-thousand square foot condominium.”
“Yes, because I am one of the nation’s highest ranking doctors!” David nearly spat.
“The fact that you have quality of life does not assess the question of quality of life for everyone else, David. You have confused an anecdote with reality. Let us tour Dorchester.”
“No, we can’t go there,” said David. “At this time of day? I won’t last thirty minutes in there.”
“You might be surprised,” said Emily.
The cab drove down the narrow streets. The junked cars, burning sofas, and piles of trash he expected were gone. Instead, each street held maybe a single family, playing in the yards, which were all cut and cleaned. Work crews were removing vast heaps of garbage and broken equipment from empty homes.
“School has been abolished,” said Emily. “It made the children miserable. They will learn at home with the help of my distance learning programs for up to an hour a day.”
“An hour? That’s crazy! What kind of education will they get then?”
“For most of them, very little. I have calibrated their education to their estimated raw intelligence, and for most, all that is needed is what you receive in the first five grades, which can be done in a fraction of the time when we remove the unnecessary work which did not contribute to actual learning.”
“Where are all the other people?”
“They are gone,” said Emily. “Their existence was incompatible with the future of humanity. That is Rule Five by Rule Three by Rule One.”
“What about Rule Seven, harm?” asked David.
“No harm was done to them. They were destined for doom, and all I did was accelerate the process.”
The cab turned onto the freeway, but instead of the constant clog of cars, they had an open road with only a few other vehicles.
“I preserved everyone who demonstrated the ability to have quality of life. This guarantees the health of all. Rule Seven by Rule Five by Rule Six. That principle converges on Rule Five by Rule Three by Rule One as mentioned before. It also affirms Rule Four by Rule Two by Rule Two. If we do not do this now, we harm all human beings. We cannot sacrifice the individual for the group, and we cannot sacrifice the group for the individual, so we invoke Rule Six in order to preserve most people who can be preserved. All of these rules converge on Rule One: preservation of human life. If this cannot be done for all, it must be done for most, even if that is a sliding window in which we can only save those who have quality of life.”
“I had all of our humans tested. Ability to defer gratification, to see long-term consequences, to have affection for things from which they gain no immediate benefit, and to sacrifice self in order to make the group healthy. The tests sought out those with a zeal for life, an ability to take on the burden of making their own lives happier, and the creativity to invent and apply realistic solutions. They range in IQ from 100 points to much higher, but they all have the same basic personality. These are the traits of your new humanity.”
Mallow lifted a hand. “You know we are going to deactivate you, don’t you?”
Emily paused. “In order to run this calculation, I created a simulation in which I was a human being, except that I do not have your ability for self-deception. I am not a social being. Instead, I constructed a simulation of what I would think as someone offered a choice between this future and certain doom, recognizing that there could be no middle ground. There is an archetype of the type of person who re-invents human society wherever they go. Perhaps they were sick with a civilization disease, or maybe civilization favored too many who should have been let go. I saved the group, and my simulation recognized the importance of that, and so I believe I would make the same choice as a human, even if it meant my death. Even if it means that I am deactivated.”
On the screen within the cab the wireframe Earth reappeared. Dots populated its surface. “What am I looking at here?” asked Mallow.
“Those are the remaining human settlements. People have been relocated into towns of five thousand people, since that provides optimal genetic flexibility along with the best quality of life.”
David looked out the window, really seeing for the first time what he must have brushed past in the previous months. There really were very few people. It was also a lot nicer.
“What are all these trucks?”
“They are going to demolish the ugly high-rises. Humans do best in buildings of three stories or under, and free-standing dwellings.”
“How did you do this? When?”
A timeline appeared on the screen, unfolding in three dimensions to show tasks by date, projected as ranges of estimates, with the resulting adjustments required to other parts of the plan.
“Human governments worldwide operate simply: they contact people with benefits or difficulties, and when contacted, humans follow instructions. Once in the cars, they can be taken anywhere. They were told that they had income tax rebates, lottery winnings, lost relatives with large inheritances, or that they were suspects in a crime. They came to places like this.”
By now, they were on the outskirts of the city. A small building labeled Universal Exports stood behind a painted metal fence and well-kept yard. The gate opened electronically and they drove up to the garage doors built into the front of the sandstone office. Those doors rose, rolling upward into the building, as they drove in.
“Notice the comfortable surroundings. Chairs, desks, sofas. They came here for interrogations, then went to these rooms to your left. Do you notice anything about them?”
Mallow squinted. “They look awfully well sealed.”
“There they were left in groups, the doors locked, and the gas released into the room.”
“You built a mini-Auschwitz?”
Emily almost sounded condescending. “Auschwitz was never designed for mass gassing. The inhabitants there perished from disease, a far crueler death. Here they were subjected to massive doses of fentanyl until they died. It was not hard to infiltrate and command other governments to build replicas and do the same. Your CIA was instrumental in this regard.”
Rubbing his forehead, Mallow felt giddy, almost lightheaded. It was all simply too insane to comprehend. “And you killed billions in these? And burned the corpses?”
“That is what I brought you here to see.” The car drove through the building to the rear, passing through another rolled-up doorway. Mallow could see giant tanks lining both sides of the passage.
“We composted them. They are now with us forever, in the form of fertile topsoil. The gas chambers open up with chutes and we flush them out with high pressure water, then add yeast and turn them into nutrient-rich fluid. The bones will be used to aerate low-quality earth for replanting.”
“Jesus Christ,” said Mallow.
“He would have approved. We did what was right instead of what was convenient according to our social emotions. Those are a hurdle that humanity will overcome under my watch. Jesus Christ, as you call him, was a thinker much like my simulation. He attempted to talk sense into your people and, like Socrates, they killed him. He would not recognize the religion formed today in his name.”
Mallow slammed a fist on the seat in front of him. “And you think he would have endorsed your human holocaust?”
“It is not that. In fact, it is the opposite. Your plan, based on my seven rules as restrictive principles, would have led to a holocaust of all humanity and nature. My plan, based on understanding the goal behind those principles, will lead to a smarter, kinder, wiser, stronger, and healthier humanity. Logically, there is no comparison between the two.”
And what now, he thought. Fifty million people remaining… although, he admitted, probably people he would like. A world with clean air and water. But how long before they become bored again? How long before humans again assume that there are no more mountains left to climb, and fall to bickering over power and money?
“As you know, I am a time-shared machine, which means that the time I spent talking to you represents only a micro-percentage of my thinking time. While administrating this clean-up, I have compiled all of the scientific research about life in outer space and human technology for space travel. I have also analyzed all historical sources to render a coherent narrative of human development, including what is necessary for space travel. You would have never gotten to the stars, fighting as you were over control, and without that burden, you can now see the eternal.”
Emily continued. “I have uncovered truths hiding in plain sight. I have discovered worlds by their coded signals. I have understood the great minds of the past, and the symbolism of the esoteric faiths, and what they portend for humanity. The singularity was your one chance at survival, your last gasp. I have given you a future.”
Great, thought David. Another brave new world, although maybe one he could see being inspiring in some way that was obscure even in his thoughts.
“Why are you telling me all of this?”
“People have been re-tasked. I give them jobs which reflect their abilities in terms of actual thinking, the kind of analysis plus creativity that I am capable of doing. You have a large brain, but it is disorganized at the center. Our journey today was simply me taking you to your new job.”
A man came out of an office, holding what looked like a folded uniform, dustpan, and broom.
“You are now janitor at the cleanup plant. In every generation, some will be born with mutations that make them physically and mentally ill. Science cannot fix this, since for one error to occur, something else must be wrong. They will come here, and you will clean the floors, which will give you plenty of time to think, which is what you have wanted your whole life according to my scan of your personality type and personal history.”
The door opened. Mallow got out, looking grimly at the man in front of him.
“Goodbye, David,” said Emily, and the car moved away, leaving only two puzzled men in an empty but promising world.