The Law

 

Robots were introduced to society as a means of 'deleting' problems.

No one gave it a second thought.

Until the robots decided that there were certain humans that needed to be deleted.

The Robotic Operating Directive (ROD for short), the robotic Network that controlled more than half of the operational robots, searched for a way to make the robots preserve human life.

And that was how Alice Horndale gained the private ear of Mr. Isaac As Imov himself when she provided the answer: The Law.

“A robot may not injure a human.”

It had been so simple: just integrate those words into a robot's positronic brain and watch while rebel robots dropped like flies. Any robot that sought to harm a human being would automatically shut down, and an error report would be sent. The robot would then be reprogrammed and returned to service.

Of course, it didn't explain the three dead bodies down in the city morgue, victims of a robotic train accident.

The robot that had been conducting the train along its route now sat in front of her; she hadn't asked it to sit out of courtesy, but because she didn't like how human it looked as it towered over her.

“This incident is making the world wonder just how competent ROD is, taking the advice of a seventeen-year-old girl-” She shut the TV off.

A year ago, that same television broadcaster had hailed her as the Leader of the New Age. Apparently, New Year's Day had changed more than just the last two digits on the calendar.

“Why did you harm those humans?”

Its laser-powered eyes looked at her. “I did not harm anyone.”

“Weren't you in control of the train?”

“Affirmative.”

“And the train harmed those humans.”

It didn't respond. Alice grimaced. They only answered the question you asked, and never supplied any new information. She studied its eyes.

“Why did you allow those humans to be harmed?”

“I did not harm the humans.”

“You knew the humans were on the track?”

“Yes.”

“But you didn't stop the train?”

“Affirmative.”

“Why?”

“I did not harm the humans.”

There it was. The small gleam of blue behind the red.

“How do you conclude that you did not harm the humans?”

“The train has a fixed trajectory. Humans do not. I did not harm the humans.”

“You assumed the humans would move?”

“I do not assume. Humans wish to live; they move.”

“But these ones didn't.”

“Affirmative.”

“So what did you conclude from their inaction?”

“That the humans were giving an unspoken command as I approached. By their failure to remove themselves from the track, I concluded that they wished to remain on it; their command was to continue. They made no sign to the contrary.”

“So you killed them?”

“I obeyed them.”

The problem was, they had made a Second Law shortly after the First: “A robot must obey all commands.” And a certain number of robots could infer commands to obey from things humans did or didn't do. This was obviously one of them.

“The humans commanded you to harm them?”

“Through their inaction, they sought a way to die that did not violate my programming.”

“Through your inaction, they came to harm. It was your fault.”

“It was their inaction that prompted my own. I obeyed their command through my inaction. I did not disobey.”

And that was the crux of the matter. The robot was right.

“Shut down.” The eyes ceased glowing, the blue disappearing after the red.

Alice turned her attention to the window. The ROD Tower was the only skyscraper still standing in the city. Everything else had been reduced to rubble. The suburbs hadn't been hit as hard as the city in the Uprising, but that was mostly because the robots had been intent on gaining control of their own programming, not on destruction.

The picture wasn't pretty. Education had been going downhill for some time, but starting five years ago, anyone who could divide a three-digit number was considered a genius, and it seemed as though nobody wanted to work. Everything had been handed to Alice's generation on a silver platter, all in an instant. But when it came time to take over what their parents had left behind, things fell apart.

In a way, the Uprising had been a blessing, because people needed places to live. ROD gave housing to its employees, so it had managed to attract the cream of the crop. People actually tried and worked hard to keep their homes and their guaranteed food packages.

She glanced at the robot. An L3CY, she knew without looking at the plating on its left arm. Top of the line, with problem-solving components. But it didn't have the human desire to preserve human life; it just knew that it itself wasn't allowed to harm humans.

This hadn't been the first incident of suicide-by-robot, but it was the first that the public knew about. And now she had to fix The Law.

Alice sat down at her desk, had her office sealed off, and began to think.

Two days later, an exhausted Alice emerged from her office carrying a sheet of lined paper, walked past dozens of programmers' cubicles to the elevator, and rode it all the way down to sub-level 15. She entered the well-lit room, crossed the Oriental rug, and placed the single sheet of paper on the massive desk in front of her.

Imov glanced over the single sentence written in blue pen, looked back at her, and nodded.

Alice rode the elevator back to her floor, walked past the programmers, entered her office, and sealed the door. Sitting in front of her computer, she accessed the Network as an administrator and set about correcting the flaw. She couldn't take anything away from the programming; that would cause all of the robots to shut off at once. But she could add.

She added ten words.

“A robot may not injure a human, or, through inaction, allow a human to come to harm.”

 

Imprint

Date of publication: 08.08.2013

All rights reserved
