[Fis] The information-entropy relation clarified: The New Jerseyator

Sungchul Ji sji at pharmacy.rutgers.edu
Sun Jun 3 06:13:11 CEST 2018


Hi FISers,


One simple (and maybe too simple) way to distinguish between information and entropy is as follows:


(i) Define information (I) as in Eq. (1):


                  I = -log_2(m/n) = -log_2(m) + log_2(n)                                         (1)


where n is the number of all possible choices (also called the variety) and m is the number of choices actually made or selected.


(ii) Define the negative binary logarithm of n, i.e., -log_2(n), as the measure of the 'variety' of all possible choices, and hence as identical with the Shannon entropy H, as suggested by Wicken [1]. Eq. (1) can then be rewritten as Eq. (2):


                   I = -log_2(m) - H                                                             (2)


(iii) It is evident that when m = 1 (i.e., when only one choice is selected out of the variety available), Eq. (2) reduces to Eq. (3):


                    I = -H                                                                       (3)


(iv) As is well known, Eq. (3) is the basis for the so-called "negentropy principle of information," first advocated by Schrödinger and later developed by Brillouin and others. But Eq. (3) is clearly not a principle; it is a special case of Eq. (2) with m = 1.


(v) In conclusion, I claim that information and negative entropy are the same neither qualitatively nor quantitatively (except when m = 1 in Eq. (2)); rather, they represent two opposite nodes of a fundamental triad [2], as depicted in Figure 1 below.
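
Before the figure, here is a quick numerical check of Eqs. (1)-(3) (a minimal Python sketch of my own; the function names and the example values n = 8 and m = 1, 2, 4 are illustrative, not from the text):

import math

def info_eq1(m, n):
    # Eq. (1): I = -log_2(m/n) = -log_2(m) + log_2(n)
    return -math.log2(m / n)

def H_wicken(n):
    # The sign convention adopted in (ii), after Wicken [1]: H = -log_2(n)
    return -math.log2(n)

def info_eq2(m, n):
    # Eq. (2): I = -log_2(m) - H
    return -math.log2(m) - H_wicken(n)

n = 8                                        # illustrative variety: 8 possible choices
for m in (1, 2, 4):                          # a few selections of m choices out of n
    assert info_eq1(m, n) == info_eq2(m, n)  # Eqs. (1) and (2) agree for any m, n
assert info_eq1(1, n) == -H_wicken(n)        # with m = 1, Eq. (2) reduces to Eq. (3): I = -H
print(info_eq1(1, n))                        # 3.0 bits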






                                          Selection
              H --------------------------------------------------------------> I
   (uncertainty before selection)                     (uncertainty after selection)





Figure 1.  The New Jerseyator model of information (NMI) [3].  Since selection requires free energy dissipation, NMI implicates both information and energy.  That is, without energy dissipation there is no information, and hence NMI may be viewed as a self-organizing process (also called a dissipative structure), or an '-ator'.  NMI is also consistent with the "uncertainty reduction model of information."
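
Read as an uncertainty-reduction process, Figure 1 can also be paraphrased numerically (again a sketch of my own, with illustrative values). Note that this standard reading measures the pre-selection uncertainty by the positive logarithm log_2(n), whereas Eq. (2) above adopts Wicken's sign convention H = -log_2(n); the information gained by the selection comes out the same either way:

import math

n, m = 8, 2                         # illustrative: selection narrows 8 choices down to 2
before = math.log2(n)               # uncertainty before selection: 3.0 bits
after = math.log2(m)                # uncertainty remaining after selection: 1.0 bit
gained = before - after             # uncertainty reduction: 2.0 bits
assert gained == -math.log2(m / n)  # coincides with I of Eq. (1)

In this reading, the 'Selection' arrow of Figure 1 is precisely the step that dissipates free energy while cutting the number of open choices from n to m.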



With all the best.


Sung


P.S.  There is experimental evidence that information and entropy are orthogonal, giving rise to the Planck-Shannon plane, which has been shown to distinguish between cancer and healthy cell mRNA levels.  I will discuss this in a later post.



References:

   [1] Wicken, J. S. (1987). Entropy and information: suggestions for common language. Phil. Sci. 54: 176-193.
   [2] Burgin, M. (2010). Theory of Information: Fundamentality, Diversity, and Unification. World Scientific Publishing, New Jersey.
   [3] Ji, S. (2018). The Cell Language Theory: Connecting Mind and Matter. World Scientific Publishing, New Jersey. Figure 10.24.

