[Fis] Answer to the comments made by Joseph

Robert E. Ulanowicz ulan at umces.edu
Mon Jul 27 21:25:08 CEST 2015


Folks,

I know there is a long legacy of equating information with entropy, and
dimensionally, they are the same. Qualitatively, however, they are
antithetical. From the point of view of statistical mechanics, information
is a *decrease* in entropy, i.e., they are negatives of each other.
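
In symbols (just a sketch of the usual negentropy convention; H_ref and
H_obs are my labels, not notation from the papers linked below): with
H_ref the entropy of the reference distribution and H_obs that of the
observed one,

    I \;=\; H_{\mathrm{ref}} - H_{\mathrm{obs}}, \qquad \Delta I = -\,\Delta H ,

against a fixed reference.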

This all comes back to the requirement that *both* entropy and
information be measured against a reference state (cf. the third law of
thermodynamics). Once a reference distribution has been identified, one
can quantify both entropy and information. It actually turns out that,
against any reference state, entropy can be parsed into two components:
mutual information and conditional (or residual) entropy. Change the
reference state and the decomposition changes.
<http://people.clas.ufl.edu/ulan/files/FISPAP.pdf> (See also Chapter 5 in
<http://people.clas.ufl.edu/ulan/publications/ecosystems/gand/>.)
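
To make the decomposition concrete, here is a minimal numerical sketch
(my own illustration, not code from the linked papers): treat the
reference as a second variable B, so that the Shannon entropy H(A)
splits as H(A) = I(A;B) + H(A|B); the joint distribution below is
invented purely for the example.

    import numpy as np

    # Made-up joint distribution p(a, b): rows are states of the system
    # variable A, columns are states of a reference variable B.
    p_ab = np.array([[0.30, 0.10],
                     [0.05, 0.25],
                     [0.10, 0.20]])

    p_a = p_ab.sum(axis=1)   # marginal distribution of A
    p_b = p_ab.sum(axis=0)   # marginal distribution of B (the reference)

    def H(p):
        """Shannon entropy in bits, skipping zero-probability cells."""
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    # Mutual information I(A;B) = sum_{a,b} p(a,b) log2[ p(a,b) / (p(a) p(b)) ]
    I_ab = sum(p_ab[i, j] * np.log2(p_ab[i, j] / (p_a[i] * p_b[j]))
               for i in range(p_ab.shape[0])
               for j in range(p_ab.shape[1])
               if p_ab[i, j] > 0)

    # Conditional (residual) entropy H(A|B) = H(A,B) - H(B)
    H_cond = H(p_ab.ravel()) - H(p_b)

    print(f"H(A)          = {H(p_a):.4f} bits")
    print(f"I(A;B)        = {I_ab:.4f} bits")
    print(f"H(A|B)        = {H_cond:.4f} bits")
    print(f"I(A;B)+H(A|B) = {I_ab + H_cond:.4f} bits")   # matches H(A)

Pick a different reference (a different B, or a joint with different
correlations) and the same H(A) splits differently between the two
terms, which is the sense in which the decomposition depends on the
reference state.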

Cheers to all,
Bob

> Folks,
>
> Doing dimensional analysis, entropy is heat difference divided by
> temperature. Heat is energy, and temperature is energy per degree of
> freedom. Dividing, we get units of inverse degrees of freedom. I submit
> that information has the same fundamental measure (this is a consequence
> of Scott Muller's asymmetry principle of information). So fundamentally we
> are talking about the same basic thing with information and entropy.
>
> I agree, though, that it is viewed from different perspectives and they
> have differing conventions for measurement.
>
> I agree with Loet's other points.
>
> John




