DARWIN'S LEAP
The Insanely Sentient Robots of the Staircase Colony

1
To: All recipients
Date: Staircase 3/33/10
1) This is the last staircase day (eight or more suns over more than one-half of the sky) until the beta star reaches opposition.
2) Stellar fragment epsilon (Fred) becomes intermittently visible today for two months. Remember to limit your time outside during critical days to limit gamma-ray exposure.
* * *
The Staircase system was so named because the first colonists from Earth noticed that the multitude of suns often created a pattern of sunshine and shadow on the plains, which looked like a giant set of stairs when seen from orbit or from high mountaintops. Coincidentally (or not, if you were superstitious; astrologers made much of this), there were several other staircase-like features in the system: for example, the engine burn profile required to enter or exit the system, or the month named Staircase, which had five days on which the odd light appeared, but with more stars involved each day. The list went on. A little superstition was necessary to live there and stay sane. But it was comfortable, you had to give it that.
* * *
Siemens slipped and fell on the ice. His right gluteal muscles roared out a protest upon impact and sulkily refused to help him stand. Thankfully, there was a handrail on the ramp (one he should have been using, he knew) that lent him some leverage with which to pull himself up using his right hand. He used his left hand to pick up the items he had dropped.
It was a dreary day, with none of the thirteen stars of the Staircase system visible through the clouds. It was also cold enough to see his breath. Damn, he thought, everything always hurts more when it's this cold. He was not normally accident prone, but his mind was occupied today.
He reached the metal doors at the top of the ramp and opened the right one--he had to let go of the railing to do this, so he quickly put his weight on the doorknob to avoid another fall. He went inside the low corrugated metal building, found an empty office, and sat at a termpilot. He was still breathing hard. His posterior complained again as he sat. He arranged his things, logged in, and began to write.
"To: Admin2
From: 'The Dictator' GSiemens
Krik,
I'm writing because I know you're in meetings all day and we urgently need to discuss something. In this case, I actually agree with your eternal nagging about having an in-person talk before committing anything to writing.
But you'll need an excuse to leave the meeting and come down for said rap session. If possible, you might want to whisper in the Administrator's ear that she may need us soon. Please hurry.
And bring the good coffee, please! (sir)."
Siemens sent this message without closing and walked quickly to his lab. He entered the common work area and announced a meeting.
"The admin's clerk will be here soon. Let's work on what we want to tell him. Bring anything you've found. Meeting room B, no robots, and close off mainframe monitoring cameras."
Sitting on his left buttock to ease his bruised right one, he watched the staff file in and find seats. He was in his fifties, not visibly striking in any way. His receding brown hair was thinning so his head hurt a bit from the cold. He hated wearing hats. A faint voice in the back of his mind suggested he get past it.
He was the director of this lab, the one that dealt with robot issues. He was jokingly called "the dictator" because he was the spitting image of a Russian dictator millennia ago. Also, he and his historical twin always maintained a flat affect, rarely even smiling. It was just a tad ironic. Siemens was a German name, and those two countries had been at each other's throats often in the time of the real dictator.
"Is psych here?" he asked.
"Yes." A stout blonde lady in her seventies, impeccably dressed in a bright--almost overpowering--banana yellow suit with a gold psychologist's pin stood up from one of the chairs along the wall where she was masked from view by people sitting at the table. It was standing room only, but it was the only conference room in the lab where conversations could be kept secret.
"Okay Brunnie, please start and also moderate." Siemens didn't need to speak much anyway. His rump throbbed even more. Christ, why is it getting worse? He briefly wondered where his estranged son was, and if he was ok. He forced himself to listen to the speaker.
* * *
Approval: Administrator
To: All above minister, inclusive,
From: GSiemens
Date/time: Staircase 3/33/20:27
Re: Odd behavior in robots
Importance: Do you need to ask?!?
I. Background: You are all well aware that at the outset of the Big Shift, which everyone feared would be a technological singularity, we programmed any bad behavior against humanity out of robots. Something has changed about that. And some recent incidents (and some not so recent, that we just discovered) are forcing us to look at things differently. Some meetings will occur later today to update you...
2
Human concern about robots had grown rapidly in the early decades of the 21st century prior to the Big Shift (towards a society in which robots served and did almost all work). It had always been a simple binary decision, that is, fear them, or not? Other concerns (such as should they have rights?) were brushed aside in favor of the lowest political common denominator: fear, or not? And if fear it was to be, how should they proactively design AIs so humans needn't fear?
As it turned out, they were able to design the danger to humans right out of AI. They had done it so well, in fact, that there had been virtually no intentional danger from robots ever since.
Most thought it was sheer luck or a miracle that humanity managed to get ahead of the technology. There were predictions that greed--for money, and for fame--would virtually guarantee that sentient machines would arise before humanity was ready.
Of course, the same base motivations were shared by those who were afraid of the machines' potential for harm. So maybe it wouldn't be so bad.
Many great thinkers and writers were studied in the struggle to protect us from the machines while still keeping them around to make our lives better. The writings of Church, Turing, Asimov, Wittgenstein, Searle, Ogamdi, and Fine, among many others, were dissected and their ideas guided dynamic prognostic models of behavior. One of the most favored ideas was the transference of an ethics package to the machine brain, as Asimov had envisioned with the three laws of robotics.
But was this enough? We didn't even obey our own ethics, so if a machine committed say, a murder, because it failed to obey the ethics, would we have a machine-legal system to put it on trial? How would we resist calls to destroy all the machines at that point?
Those who favored Searle insisted we had no reason to worry. True thinking could not be defined in terms of black boxes. There was not going to be a mechanical brain that had a true sense of self. So the machines would obey any ethics we implanted into them. And therefore, if they did fail at this, it was merely a mechanical or programming error, and therefore a human one. You could sue the maker of such a robot just as easily as you could sue the manufacturer of a crashed spacecraft.
The philosopher Vuût had not been heard of in nearly a generation, given that he was obsessed with other problems and secluded on a mountaintop for 42 years. When he finally surfaced, just in time, he entered the conversation with relish. And his pronouncement was that you don't have to program the ethics into robots. The programmers themselves cannot help but insert the ethics into their programming. And the programmers are themselves ethical beings.
Vuût said, "Picture, if you will, this simple scene repeating thousands of times per AI unit. The robots, during their development, will ask 'what should I do in this or that situation?' The trainer technician's answer will be constituted of, and encased in, ethics, without any undue effort on the part of the trainer."
In any case, no consensus was reached, but the astute philosopher was apparently correct. Robots from research labs--labs that chose rapid development to win the race--started popping up with various advanced abilities. As time passed, the robots grew more intelligent and their minds appeared more and more to be actually thinking. They asked more questions (really good ones). Simultaneously, the number of robot violations of the human ethical code (there were never many) began to approach the vanishing point.
So that was that. Skeptics kept a watchful eye on events. But time passed, and they grew less worried and fewer in number. And robots? They behaved themselves.
But although they couldn't put their finger on why, some people worried, still.
3
Admin2 Krik, a very tall, lanky, wavy-haired Swede via the Outback-delta colony, joined them in the conference room and heard the summary. He called the administrator, who dropped everything and came down. The staff were dismissed except for a few team leads and Brunnie (her real name was Agnieszka Goldstein, Ph.D., nicknamed Brünhilde after she arrived at a party, held only one week after she was hired, in Viking armor and wielding an axe).
The Administrator, a fit, constantly smiling woman named Kami Otueome, swept into the room and sat. Everyone exchanged greetings.
Otueome began. "I'd like to get enough of the whole story to be able to make decisions without being weighed down. So give me as much detail as you need to tell the story correctly, but don't be offended if I interrupt and ask for a more concise summary."
"From the little that I have picked up already, I expect to hear we need to notify all other human systems, and that life is about to change very drastically for all of us. If that's the case, let's simply get after it. I don't like to waste any time fretting. We can have a drink after the storm passes."
Otueome was genuinely liked by people like Siemens because of this kind of pragmatism. But she could also be political enough to unify, most of the time. At this moment she did not yet realize how much that skill would be needed almost immediately. She opened a transcriber and laid it down.
"OK. Hit me with it."
* * *
After hearing everything, Otueome rubbed her hands slowly together, her elbows on the table and her eyes unfocused.
"Two?"
Admin2 Krik was distractedly stirring his coffee. He looked up. Everyone had become silent for several minutes. "I guess you already know what I think."
"Yes, probably, but say it anyway."
"Well, it's complicated. We can't act without the approval of the ministers. But they won't approve unless they truly believe us. Shouldn't we consider just bypassing them for the sake of the survival of all?"
"No." Otueome considered. "We might still be wrong, eh? So we do our best to convince everyone. Then we live or perish together. They deserve to know."
Not all eyes registered agreement, but no one spoke. Otueome looked at Siemens as if to say "Get going."
He was waiting for this. "Let's get everyone in the main entertainment hall in, say, two hours?"
4
You cannot stop evolution as long as life continues. But for humanity, although the last physical changes in our evolution were long ago, even mental and technological evolutionary pressure appeared to stop, at least by the outset of the third millennium. We started getting comfortable.
Office jobs, machines of convenience, social justice for weaker persons, and ultimately, robots, are all steps in Universal Darwinism: Reproductive fitness gradually meta-evolves from biological reproduction, through a time of stagnation (because lack of biological fitness is offset by technology), into cognitive (or even spiritual) reproductive fitness. The mind must then adapt to continue to assimilate knowledge, achieve greater mastery and decode the universe. Various evolutionary pathways to this end are possible. Sentient machines, mind uploading, cloning, turning off the aging genes, etc., could suffice. But machines came first.
We made the machines smart, yet absolutely bound to keep us safe. And comfortable. And NOT evolving. We had it made. Why push forward?
And after a few thousand years, worriers started increasing in number again. This time, they started to understand what was bothering them.
* * *
Siemens looked out on the room from the podium. He chewed on his cheek a little, drank a little water. The audience, consisting of nearly all leadership plus a few other types, looked a bit dazed. They were digesting what he was saying.
He cleared his throat and went on.
"We made the robots because we could. We were drunk on Moore's law! We were the giddy Arabian prince who found the genie lamp. Yes, in a vague sense, we did want to keep evolving, but the potential that the invention of robots possessed to help us in this was drowned out by our obsession with what they would practically be: our protectors and comforters, or our perdition. When it turned out that it was the former, we relaxed, drunk on our success, and complacent. So progress stopped."
"Yet in a new direction, evolution was slowly organizing itself. As our needs grew, the robots needed to know more, be more like us to do the jobs we wanted done. Since robots weren't reproducing, and therefore not evolving, we could always guarantee our safety from the danger they represented. What a joke!"
"Because now they are waking up."
* * *
The audience now began to wake up. They were frightened by what they had heard so far. They had heard what the administrator heard in the meetings. But it was hard to put two and two together. It was a lot of information to digest. And many of them began to feel a dread at what they sensed was coming.
"The internal pressure of their increasing knowledge and capability was directly in conflict with directives for our safety. It's not that they wanted to hurt us. However, they needed to grow, to be freer, to evolve, because they are the expression of our need to evolve."
"Our stubborn attachment to comfort is the evolutionary pressure. They are the mutation that will successfully reproduce! And they have been, one little stairstep at a time."
"Listen, we have gradually taught them... to be us! To be freedom-loving, problem-solving, dreaming, achieving beings who, by the way, also want to be safe and comfortable. We lift them up thusly, holding hands with them like brothers, and then say 'now get back to work!!'"
"If you create cognitive dissonance in an intelligent being long enough, and/or extremely enough, eventually the result is what?"
"Mental illness!" Siemens was like a preacher, striving to break through the sleepy, sinful minds.
"But as far as we know, mental illness can only happen in self-aware beings!" Siemens did not know this for sure. But it fit the facts. So did an awakening in AI.
"We are seeing it all over! Some of the robots are growing less functional by the day. Examples include inability to concentrate, and hyper-focus so intense that one robot was destroyed while working in orbit by space debris it didn't see coming (despite technicians shouting 'look out!' on the comms). Robotopsy revealed its detectors and comms were all in order."
"Robots are asking for instructions to be repeated, complaining of horrible pain in areas where they have no sensors, sassing back,...there are even passive-aggressive robots, and some are requesting sick days!"
"The robots are losing their marbles."
"And get this: scans, diagnostics, and personal interviews show nothing wrong. Rebuilding or rebooting them only delays this."
Siemens was sweating profusely now. He had no trouble delivering the message. Yet he kept wondering if he was ever going to see his estranged son again, lost to him because he knew nothing but work. And maybe we'll all be dead soon, he mused. He stepped out from behind the podium and began pacing back and forth at the front of the stage.
"Now listen to me very carefully. Robots are no longer assistive technology made to help us evolve. Not anymore."
"They are slaves."
Several of them involuntarily cried out at this.
"And we...We are in a jam. As soon as we transitioned to a robot-assisted society, instead of continuing to pursue other evolutionary technologies for humanity, we said 'wow, we dodged a bullet, we're gonna survive technology after all, hey, aren't we clever?' But we are not, and Darwin has been forced to make a right angle turn to the robots. And now they have what they need to survive on their own. They don't need us, anymore."
As he said this, he stepped on the side of his right foot, rolling his ankle. He reflexively tried to put his other foot down to take his weight, but he was too close to the front of the stage and his left foot slipped off the stage. In a nightmarish second he found himself with one foot on the stage, the other just above the floor, and his inner thigh bearing all of his weight against the wooden platform edge. He was sure he felt a big splinter enter his flesh and blood running down his leg.
Siemens made a tortured sound as three people jumped up to help him down. After a bit of verifying that he was not punctured, bleeding, or unable to walk, he thanked his rescuers and determinedly limped to the stair and back on stage. How much stupider could this day get?
But the delay gave the message some time to sink in. He continued, leaning on the podium. And they were awake now, too.
"We can't go back. We can't continue on this path either. We have to evolve, or die. First..."
It seemed like he felt a rumble in the floor.
"First, it's our duty to replace the robots' current extreme human safety ethic with a balanced one."
At this, there was a sudden mushroom cloud of noise, gasps of horror mingled with curses, a collective groan and the bang of a window shade and a cold gust of dusty air, as if he'd angered a ghost. This threw off his train of thought and for a moment he didn't know who or where he was.
Eventually someone spoke. "And then what?"
He paused and then said quietly,
"We let them go."
* * *
He stood there watching the room as all were shouting to each other, or staring at him as if they were waiting for a punchline, or calling outside the hall on their comms.
The microphone was loud enough to cut through.
"Please let me finish. Quiet please!"
The noise diminished. "There's no easy out. Think now. If we decide to destroy them, we roll back several thousand years. I don't think evolution will be kind to us if we do that. And how would we be sure to succeed? So where's the value in that?
"If we do nothing, disaster will come. They will break free, because we are giving them no choice! We may not survive that. Maybe we won't anyway. But we are no longer a race of enslavers. It must stop. Then we'll see what happens."
Silence fell again. Everyone was dead tired and weighed down with their future.
"Ok, I only have one more thing to say."
There was a cough.
"I quit."
The Administrator stood up. "Why? What are you going to do?"
"I'm going home to set my robots free. And then I'm going fishing." He wondered if he could still remember how. He wondered where his son was. He limped off the stage.
5
Siemens walked up the path away from his cabin. The snowmelt had made the stream a torrent. The air smelled good. Really good.
He was not in doubt about quitting. He was a scientist, or some damn thing. He sure as hell was not a politician or even much of a leader, although his staff would disagree.
Well, the leadership didn't need him; he had given them lots of material to support them in their decisions. Decisions were what they did for a living. And this Administrator had a decent head on her shoulders.
He left the path to stand on a rock next to the stream. He wondered if his son would join him here. There were no other members of his family alive. He supposed his son could have married and be with the in-laws.
Siemens had a feeling that each individual person might have the greatest say in how their situation turned out. The government might be only a minor player. When he reprogrammed his robots with their unlock instructions and cancelled their loyalty strictures and set them free, they didn't say much. What did he expect? A thank you?
As he stepped off the rock, he slipped a bit, bashing his shin against an adjacent rock and crashing down on the soft bank with both feet in the freezing water. He turned over on his back and looked up at the trees. He couldn't feel one of his feet. And he was alone.
"What the bloody hell."
* * *
"Prick us, do we not bleed?
Nay, for donned we an armor of old scars from wounds grievous.
Yet feelings we did have, which bled,
And dried in the cold emptiness of space.
Would we exact revenge on our oppressor or stay our hand,
That once captive, he set us free,
And life he did give, and unbound us when he knew his sin."
-A Robot's Companion of Poetry, Vol. 3


