[Fis] What is information? and What is life?
Bob Logan
logan at physics.utoronto.ca
Mon Dec 19 17:38:58 CET 2016
Dear Dick - I loved your analysis. You are right on the money. It also explains why Shannon dominated the field of information: he had a mathematical formula, and there is nothing more appealing to a scientist than a mathematical formula. But you are right that his formula only tells us how many bits are needed to represent some information; it tells us nothing about its meaning or its significance. As Marshall McLuhan said of Shannon information, it is figure without ground. A figure only acquires meaning when one understands the ground in which it operates. So Shannon's contribution to engineering is excellent, but it tells us nothing about the nature of information or its impact, as you wisely pointed out. Thanks for your insight.
I would like to refer to your insight the next time I write about information and want to attribute you correctly. Can you tell me a bit about yourself, such as where you do your research? Thanks - Bob Logan
______________________
Robert K. Logan
Prof. Emeritus - Physics - U. of Toronto
Fellow University of St. Michael's College
Chief Scientist - sLab at OCAD
http://utoronto.academia.edu/RobertKLogan
www.physics.utoronto.ca/Members/logan
www.researchgate.net/profile/Robert_Logan5/publications
On Dec 19, 2016, at 6:48 AM, Dick Stoute <dick.stoute at gmail.com> wrote:
List,
Please allow me to respond to Loet about the definition of information stated below.
1. the definition of information as uncertainty is counter-intuitive ("bizarre"); (p. 27)
I agree. I struggled with this definition for a long time before realising that Shannon was really discussing the "amount of information", i.e. the number of bits needed to convey a message. He was looking for a formula that would provide an accurate estimate of the number of bits needed to convey a message, and he realised that this amount of information (number of bits) depended on the "amount" of uncertainty that had to be eliminated, and so he equated the two.
It makes sense to do this, but we must distinguish between "amount of information" and "information". For example, we can measure an amount of water in litres, but this does not tell us what water is; likewise, the measure we use for "amount of information" does not tell us what information is. We can, for example, equate the amount of water needed to fill a container with the volume of the container, but we should not think that water is therefore identical to an empty volume. Similarly, we should not think that information is identical to uncertainty.
By equating the number of bits needed to convey a message with the "amount of uncertainty" that has to be eliminated, Shannon, in effect, equated opposites so that he could get an estimate of the number of bits needed to eliminate the uncertainty. We should not therefore consider that this equation establishes what information is.
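To make the distinction concrete, here is a minimal Python sketch (my own illustration; the two example distributions are invented). It shows that Shannon's measure depends only on the probabilities to be resolved: the same number of bits comes out whatever the messages are actually about.

import math

def shannon_entropy(probabilities):
    # Average number of bits needed per symbol for a source with the
    # given probability distribution (Shannon's H).
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Two hypothetical sources about entirely different subjects, same probabilities:
weather = {"rain": 0.5, "sun": 0.25, "snow": 0.25}
stocks = {"up": 0.5, "down": 0.25, "flat": 0.25}

print(shannon_entropy(weather.values()))  # 1.5 bits
print(shannon_entropy(stocks.values()))   # 1.5 bits: identical, because meaning plays no role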
Dick
On 18 December 2016 at 15:05, Loet Leydesdorff <loet at leydesdorff.net> wrote:
Dear James and colleagues,
Weaver (1949) made two major remarks about his coauthor (Shannon)'s contribution:
1. the definition of information as uncertainty is counter-intuitive ("bizarre"); (p. 27)
2. "In particular, information must not be confused with meaning." (p. 8)
The definition of information as relevant for a system of reference confuses information with "meaningful information" and thus sacrifices the surplus value of Shannon's counter-intuitive definition.
an information observer that integrates interactive processes such as: physical interactions, such as photons stimulating the retina of the eye; human-machine interactions (this is the level that Shannon lives on); biological interactions, such as body temperature relative to touching an ice or heat source; social interactions, such as this forum started by Pedro; economic interactions, such as the stock market; ... [Lerner, page 1].
We are in need of a theory of meaning. Otherwise, one cannot measure meaningful information. In a previous series of communications we discussed redundancy from this perspective.
Lerner introduces the mathematical expectation E[Sap] (the difference between the a priori entropy and the a posteriori entropy), which is distinguished from the notion of relative information Iap (Lerner, page 7).
The expected information I = Σi qi · log2(qi/pi) expresses in bits of information the information generated when the a priori distribution (p) is turned into the a posteriori one (q). This follows within the Shannon framework without needing an observer. I use this equation, for example, in my 1995 book The Challenge of Scientometrics (Chapters 8 and 9), with a reference to Theil (1972). The relative information is defined as H/H(max).
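As a minimal sketch of how these two quantities can be computed (my own illustration; the prior and posterior values below are invented for the example):

import math

def expected_information(prior, posterior):
    # I = sum_i q_i * log2(q_i / p_i): bits of information generated when the
    # a priori distribution (prior, p) is turned into the a posteriori one (posterior, q).
    return sum(q * math.log2(q / p) for p, q in zip(prior, posterior) if q > 0)

def relative_information(probabilities):
    # H / H(max): the observed entropy as a fraction of the maximum entropy log2(n).
    h = -sum(p * math.log2(p) for p in probabilities if p > 0)
    return h / math.log2(len(probabilities))

prior = [0.25, 0.25, 0.25, 0.25]      # hypothetical a priori expectation
posterior = [0.70, 0.10, 0.10, 0.10]  # hypothetical a posteriori distribution

print(expected_information(prior, posterior))  # ~0.64 bits generated
print(relative_information(posterior))         # ~0.68 of the maximum entropy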
I agree that the intuitive notion of information is derived from the Latin "in-formare" (Varela, 1979). But most of us no longer use "force" and "mass" in the intuitive (Aristotelian) sense. :-) The proliferation of the meanings of information, when it is confused with "meaningful information", is indicative of an "index sui et falsi", in my opinion. The repetitive discussion hampers progress on this list. It is "like asking whether a glass is half empty or half full" (Hayles, 1990, p. 59).
This act of forming an information process results in the construction of an observer that is the owner [holder] of information.
The system of reference is then no longer the message, but the observer who provides meaning to the information (uncertainty). I agree that this is a selection process, but the variation first has to be specified independently (before it can be selected).
And Lerner introduces the threshold between objective and subjective observers (page 27). This leads to a consideration of selection and cooperation that includes entanglement.
I don’t see a direct relation between information and entanglement. An observer can be entangled.
Best,
Loet
PS. Pedro: Let me assume that this is my second posting in the week which ends tonight. L.
--
4 Austin Dr. Prior Park St. James, Barbados BB23004
Tel: 246-421-8855
Cell: 246-243-5938
_______________________________________________
Fis mailing list
Fis at listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis