[Fis] The Same and Not the Same

Loet Leydesdorff loet at leydesdorff.net
Tue Jul 28 11:24:08 CEST 2015


Dear Joe, 

The semantic aspects are external to the Shannon perspective (by
definition). However, Weaver (1949) gives openings. (I discuss this
extensively in the paper at http://arxiv.org/abs/1507.05251.) "Mutual
redundancy" can be defined as a measure of the imprint of meaning processing
on the communication of information. Meaning itself cannot be measured, but
hypothesizing meaning processing enables one to specify expectations about
higher-order loops. 

The dimensionality remains in bits of information. The probability
distribution, however, becomes multivariate when next-order loops are
added: instead of p(i), for example, p(ijk...).
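
As a minimal numerical sketch of this point (in Python; the 2 x 2 x 2
distribution p(ijk) below is invented purely for illustration, and the
higher-order term shown is the standard interaction information, not
necessarily identical to the mutual-redundancy measure of the arXiv paper):

import numpy as np

def H(p):
    # Shannon entropy in bits of a probability array of any shape.
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Hypothetical three-dimensional distribution p(ijk), summing to 1.
pijk = np.array([[[0.10, 0.05], [0.05, 0.20]],
                 [[0.20, 0.05], [0.05, 0.30]]])

pi = pijk.sum(axis=(1, 2))   # marginal p(i)
pj = pijk.sum(axis=(0, 2))   # marginal p(j)
pk = pijk.sum(axis=(0, 1))   # marginal p(k)
pij = pijk.sum(axis=2)
pik = pijk.sum(axis=1)
pjk = pijk.sum(axis=0)

print("H(i)     =", H(pi), "bits")    # univariate case
print("H(i,j,k) =", H(pijk), "bits")  # multivariate case: still in bits

# A next-order (three-dimensional) term, also in bits; it can be negative.
T = H(pi) + H(pj) + H(pk) - H(pij) - H(pik) - H(pjk) + H(pijk)
print("T(i:j:k) =", T, "bits")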

Best,
Loet


Loet Leydesdorff 
Emeritus University of Amsterdam
Amsterdam School of Communication Research (ASCoR)
loet at leydesdorff.net ; http://www.leydesdorff.net/ 
Honorary Professor, SPRU, University of Sussex; 
Guest Professor Zhejiang Univ., Hangzhou; Visiting Professor, ISTIC,
Beijing;
Visiting Professor, Birkbeck, University of London; 
http://scholar.google.com/citations?user=ych9gNYAAAAJ&hl=en

-----Original Message-----
From: Joseph Brenner [mailto:joe.brenner at bluewin.ch] 
Sent: Tuesday, July 28, 2015 9:07 AM
To: ulan at umces.edu; John Collier
Cc: loet at leydesdorff.net; 'Fernando Flores'; fis at listas.unizar.es
Subject: The Same and Not the Same

Dear Bob and All,

I have found many useful things in the recent postings, especially Bob U.'s
point about the parsing of entropy into two components, mutual information
and residual entropy: /qualitatively/, information and entropy are
(epistemologically) antithetical and, I might add, ontologically
contradictorial. Also valuable was John's point about loops not being
computable, as one might expect if they reflect the evolution of real
processes.

But what about mutual information itself? Mutual information is defined, I
believe, as a measure of the mutual dependence of random variables. But
suppose the variables or process elements are not random, yet there is still
mutual dependence. What is the information content in that case?
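
For concreteness, the textbook calculation I have in mind, as a small Python
sketch (the joint table is invented for illustration; the last lines simply
check the parsing that Bob describes):

import numpy as np

# Invented joint distribution p(x, y) over two binary variables.
pxy = np.array([[0.30, 0.10],
                [0.15, 0.45]])
px = pxy.sum(axis=1)   # marginal p(x)
py = pxy.sum(axis=0)   # marginal p(y)

def H(p):
    # Shannon entropy in bits.
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Mutual information in bits: I(X;Y) = H(X) + H(Y) - H(X,Y).
I = H(px) + H(py) - H(pxy)

# Bob's parsing: H(X) = I(X;Y) + H(X|Y), with H(X|Y) the residual
# (conditional) entropy.
H_cond = H(pxy) - H(py)            # H(X|Y)
print(I + H_cond, "==", H(px))     # the two sides agree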

Perhaps in this context I can ask again whether it makes sense to 'parse'
/information/ itself into two interactive components that differ in their
dimensionality, with meaning associated with the emergent component of
higher dimensionality.

Curious,

Joseph


----- Original Message -----
From: "Robert E. Ulanowicz" <ulan at umces.edu>
To: "John Collier" <Collierj at ukzn.ac.za>
Cc: <loet at leydesdorff.net>; "'Joseph Brenner'" <joe.brenner at bluewin.ch>;
"'Fernando Flores'" <fernando.flores at kultur.lu.se>; <fis at listas.unizar.es>
Sent: Monday, July 27, 2015 9:25 PM
Subject: Re: [Fis] Answer to the comments made by Joseph


> Folks
>
> I know there is a long legacy of equating information with entropy, 
> and dimensionally, they are the same. Qualitatively, however, they are 
> antithetical. From the point of view of statistical mechanics, 
> information is a *decrease* in entropy, i.e., they are negatives of each
other.
>
> This all devolves back upon the requirement that *both* entropy and 
> information require a reference state. (The third law of 
> thermodynamics.) Once a reference distribution has been identified, 
> one can then quantify both entropy and information. It actually turns 
> out that against any reference state, entropy can be parsed into two 
> components, mutual information and conditional (or residual) entropy. 
> Change the reference state and the decomposition changes.
> <http://people.clas.ufl.edu/ulan/files/FISPAP.pdf> (See also Chapter 5 
> in
> <http://people.clas.ufl.edu/ulan/publications/ecosystems/gand/>.)
>
> Cheers to all,
> Bob
>
>> Folks,
>>
>> Doing dimensional analysis, entropy is heat difference divided by 
>> temperature. Heat is energy, and temperature is energy per degree of 
>> freedom. Dividing, we get units of inverse degrees of freedom. I 
>> submit that information has the same fundamental measure (this is a 
>> consequence of Scott Muller's asymmetry principle of information). So 
>> fundamentally we are talking about the same basic thing with 
>> information and entropy.
>>
>> I agree, though, that it is viewed from different perspectives and 
>> they have differing conventions for measurement.
>>
>> I agree with Loet's other points.
>>
>> John
>
> 




