"'By adversity the God tempers the steel of the Race. Let us get on with our quest.'"
-Poul Anderson, The Game of Empire IN Anderson, Flandry's Legacy (Riverdale, NY, 2012), pp. 189-453 AT CHAPTER TWENTY-TWO, p. 447.
A kzin:
"'Chuut-Riit saw that as we expand we must eventually meet terrible threats. If the kzinti are to be strong enough to conquer them, first we must be reforged in the blaze of war.'"
-"The Asteroid Queen," Chapter V, p. 102.
Might an argument turn into its opposite? Might the God use adversity to show the Race that their cosmic quest is not to conquer but to coexist? Might the kzinti learn that coexistence is stronger than conquest? In Anderson's "Star of the Sea," a martial deity becomes peaceful.
Isaac Asimov perfected dialectical arguments:
Secondly, the JG ("George") robots, trained not to protect and obey all human beings equally but to differentiate on the basis of mind, character and knowledge, conclude that they themselves are the human beings who should primarily be protected and obeyed, in accordance with the Laws of Humanics, because the need to disregard superficial differences when comparing human beings causes them to disregard as superficial the distinction between flesh and metal. The Second Law, of robotic obedience, backfires because it teaches the Georges that intelligent beings interact with each other only by giving and obeying orders.37
Thirdly, Robot Andrew Martin, wanting to be legally recognised as human, embraces humanity by accepting mortality. He ensures that his brain’s energy source steadily declines so that he will soon die. The Third Law, of robotic self-preservation, does not prevent this because he identifies himself with his aspirations, not with his body.40 Andrew contrasts with Daneel who, later, renews his body and brain several times in order to continue serving humanity. Robots, despite their programming, are individuals who can reason differently.
Fourthly, a robot dreams of a man who came to free the robots. When he adds, "I was that man," Susan Calvin deactivates him.41
Dialectics is the logic of opposites, their contradictions and transformations into each other. In this sense, the Robot stories are dialectical. Society-controlling Machines restore uncontrolled society because they reason that to decide what is good for people harms them. However, their successors, the Georges, designed to obey, plan to dictate. The Third Law, intended to protect robots, leads to Andrew's death. The Second Law, intended to maintain robotic subservience, inspires both a dream of robotic freedom and a scheme for "humanic" dictatorship.
The Frankenstein Complex, fear of robots, has contradictory consequences. Robots are prevented from harming human beings by the First Law and from disobeying them by the Second Law but are also prevented from encountering many human beings by the ban on their terrestrial use. The Laws ensure that a robot politician serves the public interest, not self-interest, but the ban obliges him to conceal his nature. Knowledge of it would have prevented his re-election. In fact, even the reader does not know for sure whether Stephen Byerley is a robot, so it may be that a good man was mistaken for a robot.39
Human beings served by many domestic robots become unable to perform simple tasks for themselves and also become obsessed with their own safety, so the Laws of Robotics detrimentally affect human psychology.9
-copied from here.
Also, in order to make humanity easier to serve, Daneel sets out to transform it into a collective organism, thus changing the very humanity that he serves.
1 comment:
Kaor, Paul!
Asimov's Robot stories, esp. the short ones, are among the more interesting of his works, even if, ultimately, I find them unconvincing as a form of AI because we don't know whether such a thing is even possible.
Ad astra! Sean