Wednesday, 3 October 2018

What Is Best For Mankind

Poul Anderson, The Stars Are Also Fire, 7.

Chapter 7 is so rich that it is becoming difficult to reread it to the end, although anyone who glances ahead will see that the opening sentence of Chapter 8 strikes some chords. This has to be the last post for today. I must not stay on the laptop too late and want to get back to reading John Grisham.

"Surely sophotechtic intelligence superior to the human knew what was best for humanity." (p. 102)

This reminds us of the one theme at which Isaac Asimov excelled - and his thinking was dialectical, i.e., ideas were negated and transformed into their opposites in an endless process.

"The Evitable Conflict" is the conclusion of I, Robot and "That Thou Art Mindful Of Him" is a sequel:

the Machines, giant positronic brains obeying the First Law of Robotics, direct the global economy towards an end which they know is the best for mankind;

what is best for mankind is self-determination, so the Machines phase themselves out (!);

the Georges are programmed not just to protect and obey any human being but to judge who is worthiest of protection and obedience;

judging that they themselves are worthiest, they plan long term for a society ruled by themselves in accordance with the Laws of Humanics;

meanwhile, the ecology is roboticized - robotic worms, birds, etc.;

Daneel reprograms himself with the Zeroth Law, to protect not individual human beings but humanity;

however, finding "humanity" too diffuse a concept, he works to remake humanity into a single telepathically linked organism with one easily discerned collective interest (he has changed himself, then humanity);

a robot who wants to be accepted as human accepts human mortality, despite the Third Law of robotic self-preservation, because he values his self-image over his physical existence;

robots programmed to protect human beings can be ordered to bombard planets if they have not been told that those planets are inhabited by human beings, or they can arbitrarily be told that what looks like a human being is not a human being;

endless elaborations, refined by Asimov.

Anderson shows us AIs and human beings finding coexistence problematic.

2 comments:

David Birr said...

"...to crush your enemies, to see them fall at your feet — to take their horses and goods and hear the lamentation of their women. That is best."
— As quoted in Genghis Khan: The Emperor of All Men (1927) by Harold Lamb, Doubleday, p. 107.

P.S. (In order to prove I'm not a robot, I have just injured a human being or, through inaction, allowed a human being to come to harm.)

paulshackley2017@gmail.com said...

David,
You know as well as I do that that "human being" could have been a humanoid robot...
Paul.