What about Entropy?
Richard Fiero
rfiero at pophost.com
Mon Feb 3 18:19:14 CST 2003
Someone once wrote:
> > entropy is the loss of information in a transmitted message.
That's not quite right. The information capacity of a channel
is maximized when the entropy of the source is maximized. The
information-theoretic entropy decreases as the receiver's
certainty about the next symbol rises; if the receiver is
certain of the next symbol, then that symbol transmits no
information at all.
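To make that concrete, here is a small plain-Python sketch (my
own illustration, not anything from the original thread) of
Shannon entropy, H = -sum(p * log2(p)) over the symbol
probabilities. Note that the degenerate distribution, where the
receiver is certain of the next symbol, has zero entropy:

    import math

    def shannon_entropy(probs):
        """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero p."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # Uniform over 4 symbols: maximal uncertainty, H = 2 bits.
    print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0

    # Skewed: the receiver is fairly sure of the next symbol, so H < 2.
    print(shannon_entropy([0.9, 0.05, 0.03, 0.02]))   # ~0.62

    # Certain: the next symbol carries no information, H = 0 bits.
    print(shannon_entropy([1.0, 0.0, 0.0, 0.0]))      # 0.0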