Rape (is also Re: Turing, A.I. and ESP)
jporter
jp4321 at idt.net
Mon Jan 3 21:08:32 CST 2000
>Is it perhaps correct to take the fact that Turing only "entertained the
>possibility of E.S.P." as *proof* that he did not possess the faculty
>himself, but had merely observed it, whether empirically or no? I
>believe this is a legitimate inference, and I'm not sure that it has
>been adequately addressed, either by Turing, or in the discussion here.
>This, for me, raises some intriguing and disturbing questions.
(snip)
>If ESP is a volitional faculty, and accepting that it, like all human
>faculties, could be used for selfish ends, or in ways which are hurtful
>to and undesired by others, then would it be fair to regard it in some
>cases as not only criminal but unjust (inhumane? inhuman?) as well?
What if telepathy required volitional behavior on the part of both the
receiver and the sender? What if one form of Turing's Telepathy Proof Room
was virtual, i.e., the conscious blocking of "listeners," and telepathy
required a volitional "opening of a channel" on both ends? [Reminds me of
the lyrics of that McCartney song: "Fixing a Hole."] Further, what if it
took work and practice to learn how to "clear" such a channel, make it
operative, and send or receive telepathic communication between other
willing individuals? How would you feel about it then?
(snip)
>But a sentient machine
>with ESP poses a whole heap of other problems, because it would
>necessarily be *more* perfectly human than any human. Which is what I
>was getting at with imperfection, instinct and irrationality as
>irrevocably "human" traits --> limitations, flaws. Such a machine would
>instantly decide that humans were lesser beings to it, and so, like HAL,
>assert its authority rather than trusting human commands, unless there
>were certain pre-programmed Asimovian laws in place, in which case it
>would not be wholly "human"/independent anyway.)
But such a putative machine, unless programmed beforehand by intentional
human agents, would need intentionality of its own: desire, will, and so
on; otherwise, why would it bother? It would not care one way or the
other. Without self-interest it would merely be clever as directed. If it
exhibited self-interest, it would come close to being animate, but that
self-interest would have to extend to a desire to preserve its form: to
repair itself, or even reproduce, which would require eating, excreting,
etc. In other words, it would be alive.
jody
More information about the Pynchon-l
mailing list