[Fis] The Same and Not the Same

Joseph Brenner joe.brenner at bluewin.ch
Tue Jul 28 09:06:59 CEST 2015


Dear Bob and All,

I have found many useful things in the recent postings, especially Bob U.'s 
point about the parsing of entropy into two components, mutual information 
and residual entropy; /qualitatively/, information and entropy are 
(epistemologically) antithetical, and, I might add, ontologically 
contradictorial. Also useful was John's point about loops not being 
computable, as one might expect if they reflect the evolution of real 
processes.
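
(For readers who want the identity written out: assuming Bob means the usual 
Shannon decomposition for discrete variables X and Y, it reads

    H(X) = I(X;Y) + H(X|Y),

with

    I(X;Y) = \sum_{x,y} p(x,y) \log [ p(x,y) / ( p(x) p(y) ) ]
    H(X|Y) = - \sum_{x,y} p(x,y) \log p(x|y),

so that changing the reference, i.e. the conditioning variable and its 
marginals, changes how H(X) splits into the two parts.)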

But what about mutual information itself? Mutual information is defined, I 
believe, as a measure of the mutual dependence of two random variables. But 
suppose the variables or process elements are not random, yet there is still 
mutual dependence between them. What is the information content in that case?
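
As a small illustration of where the question bites: the usual calculation 
of mutual information needs nothing more than a joint distribution (or joint 
frequency table); it is silent on whether the underlying coupling is "random" 
in any deeper sense. A minimal sketch in Python (the function name and the 
example table are mine, purely illustrative):

    import numpy as np

    def mutual_information(p_xy):
        # I(X;Y) in bits from a joint probability table p_xy (rows: x, columns: y)
        p_xy = np.asarray(p_xy, dtype=float)
        p_x = p_xy.sum(axis=1, keepdims=True)   # marginal p(x), shape (nx, 1)
        p_y = p_xy.sum(axis=0, keepdims=True)   # marginal p(y), shape (1, ny)
        indep = p_x @ p_y                       # product of marginals p(x)p(y)
        mask = p_xy > 0                         # skip zero cells (0 log 0 := 0)
        return float(np.sum(p_xy[mask] * np.log2(p_xy[mask] / indep[mask])))

    # Two perfectly coupled binary variables: I(X;Y) = 1 bit
    joint = [[0.5, 0.0],
             [0.0, 0.5]]
    print(mutual_information(joint))            # -> 1.0

The calculation returns 1 bit for the perfectly coupled table above whether 
one regards the variables as genuinely stochastic or as deterministic process 
elements described by their joint frequencies; the formalism itself does not 
distinguish the two cases, which is precisely what the question is about.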

Perhaps in this context, I can ask again whether it makes sense to 'parse' 
/information/ itself into two interactive components that differ in their 
dimensionality, with meaning associated with the emergent component of 
higher dimensionality.

Curious,

Joseph


----- Original Message ----- 
From: "Robert E. Ulanowicz" <ulan at umces.edu>
To: "John Collier" <Collierj at ukzn.ac.za>
Cc: <loet at leydesdorff.net>; "'Joseph Brenner'" <joe.brenner at bluewin.ch>; 
"'Fernando Flores'" <fernando.flores at kultur.lu.se>; <fis at listas.unizar.es>
Sent: Monday, July 27, 2015 9:25 PM
Subject: Re: [Fis] Answer to the comments made by Joseph


> Folks
>
> I know there is a long legacy of equating information with entropy, and
> dimensionally, they are the same. Qualitatively, however, they are
> antithetical. From the point of view of statistical mechanics, information
> is a *decrease* in entropy, i.e., they are negatives of each other.
>
> This all devolves back upon the requirement that *both* entropy and
> information require a reference state. (The third law of thermodynamics.)
> Once a reference distribution has been identified, one can then quantify
> both entropy and information. It actually turns out that against any
> reference state, entropy can be parsed into two components, mutual
> information and conditional (or residual) entropy. Change the reference
> state and the decomposition changes.
> <http://people.clas.ufl.edu/ulan/files/FISPAP.pdf> (See also Chapter 5 in
> <http://people.clas.ufl.edu/ulan/publications/ecosystems/gand/>.)
>
> Cheers to all,
> Bob
>
>> Folks,
>>
>> Doing dimensional analysis, entropy is heat difference divided by
>> temperature. Heat is energy, and temperature is energy per degree of
>> freedom. Dividing, we get units of inverse degrees of freedom. I submit
>> that information has the same fundamental measure (this is a consequence
>> of Scott Muller's asymmetry principle of information). So fundamentally we
>> are talking about the same basic thing with information and entropy.
>>
>> I agree, though, that it is viewed from different perspectives and they
>> have differing conventions for measurement.
>>
>> I agree with Loet's other points.
>>
>> John
>
> 



