(Right on, Turing.)
Poul Anderson, Harvest of Stars, 36.
The computer of Kyra's ship Kestrel is unconscious and speaks in Kyra's voice. Most pilots prefer a voice that sounds like a person of the opposite sex, but Kyra feels that this would suggest the presence of a real person. I agree with Kyra.
We are told that this computer:
steers and crews the ship;
observes;
reasons;
warns;
advises;
proposes;
learns;
adapts;
provides music, spectacle, text and virtual reality;
creates original audiovisual abstractions;
is a good albeit unsurprising conversationalist;
can reasonably be called the ship's "...brain." (p. 348)
Comments
"Observation," "reasoning," "learning," "conversation" and "brain" usually entail consciousness. It must be understood that, in this case, these terms are being used in ways that do not.
To what extent is an unconscious entity able to simulate conversation? Anderson is careful to add that, in this case, the element of surprise is lacking.
The ability to conduct a conversation indistinguishable from that of a conscious being is the criterion of the Turing test. If an entity that unequivocally passed this test to everyone's satisfaction were then shown to be an analog computer whose inner workings could be fully described without ascribing any consciousness to it, then I would have one massive philosophical problem: it would mean that the appearance of consciousness in other human beings is no proof that they are in fact conscious.
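For concreteness, here is a minimal sketch in Python of the imitation-game protocol behind the Turing test. The candidate functions and the judge are hypothetical stand-ins for this illustration, not real conversational systems.

    import random

    # Minimal sketch of the Turing test protocol: a judge questions a hidden
    # respondent, which is secretly either a human or a machine, and then
    # guesses which it was. Both candidates here are hypothetical stand-ins.

    def machine_candidate(question):
        return "An interesting question. Let me think."

    def human_candidate(question):
        return "Hmm, I'd have to think about that."

    def run_trial(questions, judge_guess):
        hidden_is_machine = random.random() < 0.5
        respond = machine_candidate if hidden_is_machine else human_candidate
        transcript = [(q, respond(q)) for q in questions]
        # The machine "passes" when the judge cannot tell it apart, i.e. when
        # guesses like this one are no better than chance across many trials.
        return judge_guess(transcript) == hidden_is_machine

Across many trials, passing means the judge's accuracy stays near fifty per cent; the test says nothing about what, if anything, is going on inside the hidden respondent, which is exactly the philosophical problem raised above.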
3 comments:
I think the Turing test is now obsolete as a concept. I saw a video recently of an AI assistant booking an appointment and having a conversation with the human secretary at the other end of the line, and the secretary was completely unaware that she was conversing with a machine.
Yet the machine is, unquestionably, -not- a conscious entity. It just fakes one fairly well in certain circumstances.
Mr Stirling,
But not in every circumstance?
Think of the ways of responding to a joke:
not understanding the joke;
understanding it but not thinking it's funny;
smiling;
laughing;
telling another one;
taking offense;
threatening legal action or physical violence because of an offensive joke;
something else I haven't thought of.
A Turing test graduate would have to do one of these (we can't predict which) and respond equally authentically to every other conversational input. Not easy for a machine that is merely faking it.
Paul.
Not easy, but not impossible. The algorithm just has to recognize "humor" and pick from a menu of responses.
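As a rough illustration of what that could look like, here is a short Python sketch, assuming a keyword-based humor recognizer and a fixed menu drawn from Paul's list of responses. Both are illustrative assumptions, not any real chatbot's design.

    import random

    # Illustrative sketch only: a crude keyword "humor" recognizer plus a
    # canned menu of responses mirroring the list of reactions above.

    JOKE_MARKERS = ("knock knock", "walks into a bar", "why did", "what do you call")

    RESPONSE_MENU = (
        "I don't get it.",                       # not understanding the joke
        "I see the point, but it isn't funny.",  # understanding, not amused
        ":)",                                    # smiling
        "Ha! Good one.",                         # laughing
        "That reminds me of one...",             # telling another one
        "I find that offensive.",                # taking offense
    )

    def looks_like_joke(utterance):
        lowered = utterance.lower()
        return any(marker in lowered for marker in JOKE_MARKERS)

    def respond(utterance):
        if looks_like_joke(utterance):
            return random.choice(RESPONSE_MENU)
        return "Go on."  # fallback for anything not recognized as a joke

Of course, as Paul notes, the hard part is not picking from the menu but responding equally authentically to every other conversational input; a keyword recognizer of this kind fails as soon as a joke avoids the stock phrases.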