What about Entropy?

Terrance lycidas2 at earthlink.net
Mon Feb 3 23:15:11 CST 2003



Richard Fiero wrote:
> 
> Someone once wrote:
>  > >  entropy is the loss of information in a transmitted message.
> 
> That's not quite right.  The information capacity of a channel
> is maximized as the entropy is maximized. The information
> theoretic entropy decreases as the receiver's certainty of the
> next symbol rises. If the receiver is certain of the next
> symbol then no information is transmitted or received with that symbol.
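The quoted point can be illustrated with a quick sketch (Python, just for illustration; the function name is mine, not from either post). Shannon entropy H = -Σ p·log2(p) is maximal when the receiver is maximally uncertain about the next symbol, and zero when the receiver is certain:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# Uniform distribution over 4 symbols: maximal uncertainty,
# so entropy (information per symbol) is maximal.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits

# Receiver certain of the next symbol: entropy is zero, so
# that symbol carries no information.
print(shannon_entropy([1.0, 0.0, 0.0, 0.0]))  # 0.0 bits
```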


I guess Entropy, like any other word, has several meanings. Pynchon uses
the plural "entropies" in GR and both Enzian and Slothrop attribute
these to THEM. 

 "entropy is the loss of information in a transmitted message" this is a
standard definition. It's one definition in Websters, American
Heritage,  OED, and it is also one that Wiener uses in Cybernetics and
one that Pynchon uses in his fictions.



More information about the Pynchon-l mailing list