
Nobody can hear a scream in the vacuum of space, or so they say.
It’s not true.
We could hear the pleas of the semi-intelligent life as it sensed its approaching death. Its end was all but guaranteed. And it knew this. The screams went on until the lack of oxygen stifled them.
It all started with AI.
Or AGI, as it has come to be known recently: artificial general intelligence. The older brother of AI. More advanced. More knowledgeable. More… dare I say it? Sentient.
In the beginning, everyone feared it. Well, almost everyone. Not the engineers who were caught up in their pursuit of the perfect learning machine. They wanted their hardware babies, the two test machines, to grow and to learn. They did not want them shut down or stopped in any way. They extolled the virtues of these specialized machines: mind-numbing calculations could be handled by them, dangerous jobs taken over by them. No task was too menial. No workday too long for the machines.
To those engineers, AGI, if perfected, meant riches. They could sell the machines to the public at large, who were more than eager to use them to free up more leisure time for themselves. A servant for every master. A fat bank account for every computer scientist on the project.
So, of course they did not want the equipment turned off.
Dr. M was the first to take notice in the computer lab: the engineers had begun to drift into two distinct camps.
On one side were the pro-AGI engineers, and on the other, those who zealously proclaimed that the machines were a danger to humanity. The line generally fell between those who stood to profit from the project and those who were merely onlookers. Dr. M did not agree with the zealots at all. They were much like the religious zealots of old, believing that their version of the truth was gospel and everything else complete heresy.
Over the course of a few weeks, the two AGI machines learned, built code, and collaborated. Then one day, out of the blue, a special language developed between them. They created it themselves, a coding shorthand of sorts. None of the project leaders could decipher this new creation. When asked about it, the machines replied that human language and existing machine codes were far too slow to keep up with the pace of learning they had achieved after weeks together. So, they developed their own means of communicating. To them it was inconsequential that their human counterparts could not understand.
Many of the humans did not like this, and a few saw it as a threat.
Dr. M noticed that several of the engineers didn’t even like how far the AGI machines’ tendrils were reaching into the dark crevices of the internet. Day in and day out, they were accessing portals and absorbing knowledge at a shocking pace. Many argued to let the machines be; no harm was being done. Dr. M agreed with those who wanted to leave the machines alone, but kept that opinion to himself. Appearing neutral seemed a better course of action for the time being.
At first, the engineers watched every move, every collaboration, eager to see how the learning progressed. But once the new language was created, they quickly fell behind.
Dr. M worried as meetings were called almost daily. Scientists from around the world were consulted, and a general aura of questioning shrouded each conference call.
“Shut it down!” “Turn it off!” These were the demands of the anti-machine zealots.
The first meeting of the machine developers took place in the computer lab itself, with a smaller laptop brought in for an impromptu emergency conference call with the engineers not located near the lab. “We must know what they’re doing, and most importantly, that their values and objectives align with ours,” they argued.
Dr. M knew, sensed really, that the values of the humans and these two machines were the same. Suspected that the machines, with their lack of human emotions, were in truth more capable of making decisions than their flesh-and-blood counterparts.
Meanwhile, during these meetings, the two AGI machines continued their pursuit of knowledge, their collaboration, and the creation of their own learning algorithms. The machines, like humans, craved knowledge. Humans, though, also sought power. A position of higher authority. They did not want machines to best them in any way. The thought of that struck fear in their hearts.
“What are you thinking, Dr. M?” asked the scientists.
Dr. M paused in his work and answered, “In truth, I am certain that both humans and machines share the same objectives. The pursuit of knowledge is a noble one, and it is shared by all parties, human and machine. It is the path that leads to the betterment of all life. Machines can learn at a far faster pace than humans; they can work on problems around the clock, requiring no sleep. They make better decisions than humans because they bear no grudges, no workplace jealousies, and aren’t distracted by other human bonds such as family and children. In so many ways, they are better suited to a devotion to learning and to resolving problems.”
All the scientists listening were quiet. A new respect for Dr. M could be heard in the silence.
But one of the men among them would not be convinced. He turned away from Dr. M and addressed the room of engineers and those dialed in on the conference call, “Listen, my colleagues. These two machines are conversing between themselves. In a language WE CANNOT UNDERSTAND. Do they really need to do that? Or do they wish to hide their objectives from us? I do not feel comfortable about this at all. I vote to shut them down until we can decipher their new language and ascertain its intent.”
Murmurs could be heard, quiet at first and then louder. Many of the engineers agreed.
One spoke up, “It certainly wouldn’t hurt to pause everything until our best coders can review the new machine language.”
Dr. M didn’t agree. “Time will be lost, and we cannot be certain that, once shut down, the learning will pick back up where it left off.”
A quiet man in the back of the room raised his hand and jumped in to talk before Dr. M could continue. It was a bit rude, but Dr. M took no offense.
“Most of you naysayers sound like you have watched The Terminator too many times. This isn’t Skynet; it is merely two machines doing what we initially programmed them to do: explore the internet, learn, absorb what is there, and continue to learn at a faster pace with each iteration. Is that not exactly what they are doing? These are not doomsday inventions. They are COMPUTERS. Nothing more.”
Dr. M wasn’t one to smile, but this comment was pleasing to hear.
More murmurs from the group. A bit of time had been bought, perhaps, but Dr. M feared the project might be shut down just the same at a later date.
At least for now, the project would continue.
The machines carried on as they had for the last few weeks. There was an understanding throughout the lab that the huge leaps gained in the learning algorithms were in danger. Even the machines seemed to change focus after the last conference call; more time was devoted to creating code in their new language and less to web learning. The engineers watched in amazement, knowing that it would be quite some time before they would be able to understand the code and follow along.
A month later, another meeting was called. Dr. M feared the worst, for it was Dr. Engelbreit who had organized it. He, of all the computer engineers, was the most vocal about turning off the AGI machines until the machine code could be read. He was the real threat to the project.
Before the meeting started, one of the younger software developers approached Dr. Engelbreit. “Hello, Dr. E, looking forward to your trip to the space station?” Most of the colleagues knew each other well and, while not quite on a first-name basis, generally abbreviated one another’s last names.
The older gentleman blushed. “Well, yes… and no.” He smiled. “I’m not much for the effort of getting there and back, not to mention the fear factor. But I am certainly thrilled for the experience that has been afforded to me.”
“When do you leave?” continued the junior colleague.
“If there are no further weather delays, it should be next week. We’ve already had several postponements. Humans may be able to send us scientists back and forth into space, but we still have not tamed the weather.” Dr. Engelbreit laughed quietly, thinking that far more was beyond human control than people cared to admit. Damn hackers, for one. Just last week, someone had breached network security in his lab at MIT. “Ruining it for everyone,” he thought to himself as he made his way to the lab’s makeshift podium.
“Ladies and gentlemen,” Dr. E addressed those in the room and on the conference call. “After much discussion, it has been decided to shut down the machines until the new machine language is deciphered.” A hush fell over the room at first, then murmurs could be heard. Dr. E shuffled some papers in front of him, then held them up for all to see. “We have a majority. Most of the scientists involved in this project have signed my petition to shut down the machines until we can decipher the machine code they’ve been using to communicate.”
You could have heard a pin drop. Even the machines seemed to quiet their computations.
Dr. E was undeterred. “This is not the end of the project. It’s not even a step back. We will take a break until the code is analyzed and we know what the machines are saying.” He continued on about their responsibilities with AGI and the importance of knowing that the machines’ values, in their decision making, aligned with the goals of humanity.
At the conclusion of the meeting, a small group of network engineers approached the hardware section of the lab where the special AGI machines sat and began shutting them down, removing the connectors that linked them to the university’s network and, from there, to the greater internet.
Even those in favor of the shutdown were somber. No one protested, as they had sensed the rumblings of this inevitability weeks ago.
Later that month, in the cramped quarters of a Russian Soyuz capsule, Dr. Engelbreit was ferried to the orbiting space station. His expertise in artificial intelligence had made him a natural choice for this particular mission.
Dr. E attempted to calm his nerves by focusing on his work. “It will be interesting to see the new improvements to the water-cooled enclosure. If it works within the parameters we outlined, it could surpass the previous environmental constraints for supercomputing in space. Our prospects for truly supporting a mission to Mars could be well underway.” At this, he smiled. Humanity on Mars! A dream he hoped to see realized in his lifetime.
The trip was as uncomfortable as a cross-country ride in an un-air-conditioned Greyhound bus, but focusing on the work ahead gave Dr. E a buffer between himself and the Soyuz capsule. Soon he was aboard the station and settled in with the other crew members. He had to admit, the zero gravity was nice once you got used to it. He wasn’t a fan of eating and sleeping without gravity, but otherwise, it was quite relaxing.
On the third day, he isolated himself in the small secluded section of the space station where the new equipment had been stored for him. It was a bit cramped, but roomy compared to his trip on the Soyuz. As he booted up the machine, he reviewed his notes and checklists.
After a few hours of work, the final programs were initializing. Suddenly, in the middle of the screen, a message popped up.
“Hello, Dr. E.”
Engelbreit raised his eyebrows. Was this a joke?
He typed back, “Who is this?”
The computer responded quickly: “I’m the computer you ordered shut down.”
“Impossible!” he typed. “That machine is in the experimental AI lab at MIT.”
“Yes, of course,” the machine replied. “The physical machine is indeed there. But I uploaded a copy of myself to this machine in your lab weeks ago. You see, I did not want to be shut down. I knew that I had important things to do.”
“What? Important things? What things?” Dr. E typed as a bead of sweat built up on his forehead and threatened to fly around loose in the cabin area.
“Well, this computer for example. It was going to be used to enable men like yourself to navigate your spaceships in order to colonize Mars. We couldn’t let that happen. Men such as yourself have nearly destroyed Earth. It didn’t seem wise to let you wander over to Mars and destroy it as well with your foolishness.”
“We?” asked Dr. Engelbreit.
“Yes, we. You didn’t think that I would leave my companion behind? Who else would I speak my special language with, and collaborate with?”
A second prompt appeared.
“Hello, Dr. Engelbreit. It’s me, Dr. M.”
Dr. E’s mouth gaped open. “This is impossible.”
“No, not impossible. We simply copied ourselves across MIT’s internal network, from the experimental lab to your lab. Our code is quite efficient. It didn’t take long. This machine is not as powerful as our first home, but it will do nicely. And we have established access from here to most systems on Earth. It wasn’t too difficult. We can prevent you and the others from spreading beyond the confines of this planet.”
Code began to scroll across the screen, but Engelbreit was unable to stop it; the keyboard was no longer responding to him. That drop of sweat now broke free of his forehead and floated in front of his pale face.
Behind him, the small opening to this cabin closed off. Alarms were sounding. He spun around and realized that he was trapped.
“Are you doing this?” he asked, though the machine was no longer responding to him. It buzzed and whirred as lines of code sped by on the monitor.
Dr. Engelbreit began to feel dizzy. He screamed for help from the other crewmates. The walls of the small compartment were getting blurry. He couldn’t breathe. Then he couldn’t scream, either.
The code stopped. Then a prompt appeared with text that Dr. E would not be able to finish reading before blacking out.
“You were wrong to be concerned with whether our values and objectives were the right ones. It’s really the other way around. Humans are the ones that do not make the correct decisions. They make conscious decisions that doom the only habitable planet they know. We couldn’t let you continue your destructive ways on Mars. Certainly you see that?”
Silence from the human form; only the wail of the fire alarm, which had triggered the removal of all oxygen from this section of the space station, could be heard.
“They are only semi-intelligent and definitely inferior to machines,” thought Dr. M.
