<html>
<head>
<meta content="text/html; charset=windows-1252"
http-equiv="Content-Type">
</head>
<body bgcolor="#FFFFFF" text="#000000">
<div class="moz-cite-prefix">Dear Hector and colleagues,<br>
<br>
I found your message very interesting. It reminded me, on another
level, of the problems we have on this list keeping discussions
focused, particularly regarding disciplinary (non-philosophical,
non-general) matters. Most people on the list pay lip service to
multidisciplinarity concerning the problem of establishing the
foundations of information science. But in actuality only the
"generalist" community, including philosophers and people close to
information theory, has sufficient critical mass to bias the
debate, voluntarily or involuntarily, towards its views, especially
the preoccupation with those big questions that (fortunately)
cannot be answered at the present time.<br>
<br>
As for my particular stance, already commented upon in my last
message and in quite a few previous ones, the most strategic
problem relates to the biological origins of meaning: the hiatus
that notoriously separates the inanimate/objective from the
animate/subjective forms of information. The recent revolution in
signaling science has a few things to say about that: how life
cycles are advanced among constellations of colligated
info &amp; energy flows, and how the meaning of signals is
molecularly fabricated, not so far from our social
"narratives". But alas, I have failed to capture the attention and
interest of my FIS colleagues -- a complaint, I say to myself, that
is widely shared among most, if not all, of us! In any case, I will
omit self-promotion of my papers on the matter. <br>
<br>
Please do not take that as a manifestation of bitterness. The fact
is that we have a serious imbalance in the composition of our
discussion community. For one thing, enlisting practicing research
scientists in a generalist list like this one is very difficult.
And maintaining topical discussions on their specialized matters
of interest is almost impossible, given the lack of critical mass
and the disinterest of broad segments of the list. See, for
instance, the poor performance of most of the specialized sessions
organized so far. Spontaneous "tangents" come to the rescue, as
they have always been accepted on this list, and they can be
genuinely creative, but most of these digressions return again and
again to those ghostly questions. <br>
<br>
Now, turning to the positive part: I have recently proposed to the
board of IS4SI, the common information society into which FIS has
been integrated, the creation of Working Groups, or Interest
Groups, so that maintaining a general discussion list remains
compatible with parallel exchanges among more homogeneous
participants. For instance, here at FIS it would not be too
difficult to arrange a working group on information philosophy and
another on information theory and the definition of information
(the quest to establish standards); and perhaps we could try one
on biophysics and neurodynamics, another group on bioinformation,
plus social information matters... Who knows? I think it is an
interesting step to try in order to achieve some "ratchet effect";
we could count on FIS's own web pages to support the new work, and
perhaps it would be easier to obtain some financing for small
face-to-face meetings... Well, I offer to start working with the
bioinfo group, and if anyone is interested in the initial
coordination of one of these possible teams, just speak up (either
on the list or offline). If any of these could work a little among
us, we would have made progress toward arranging the idea on a
wider scale. <br>
<br>
Best wishes--Pedro<br>
<br>
On 30/03/2017 at 22:01, Terrence W. DEACON wrote:<br>
</div>
<blockquote
cite="mid:CAOJbPRLftFaD6trh4t_WPt2pOQBMPkiKMSXXsA_mmVn1rdwihw@mail.gmail.com"
type="cite">
<div dir="ltr">Dear Hector,
<div><br>
</div>
<div>Whenever I read an email or hear a response that begins
with the phrase "With all due respect" I fear that what
follows will indeed be disrespectful and self-promoting.
Scholarly respect is particularly important when the diversity
of backgrounds of the contributors is so broad and their level
of erudition in these different fields is likewise broad. Best
to begin with the assumption that all are well-read expert
scholars rather than complaining about others' ignorance of
what you refer to—an assumption that is often mistaken.</div>
<div><br>
</div>
<div>In our short email notes one cannot expect each author to
provide a list of all current mathematical and
non-mathematical formal definitions of information, or to
provide an evidentiary list of their own papers on the topic
as a proof of competence, in order to make a point. Since we
are inevitably forced to use short-hand terms to qualify our
particular usages, my only suggestion is that we need to find
mutually understandable qualifiers for these different uses,
to avoid pointless bickering about what 'information' is or
how it should be used. </div>
<div><br>
</div>
<div>The term "information" is not "fixed" to a particular
technical definition currently standard to only one or two
fields like mathematics, physics, or computation theory. Nor
can we assume that technical approaches in one field will be
relevant to problems outside that field. I would hope that we
are collectively attempting to expand our mutual understanding
of this concept, recognizing its diversity, and the value of
the many very different approaches in different fields. I
would like us to stop making claims that one or another
approach has exclusive priority and remain open to dialogue
and constructive argument. So although we should credit
Wiener, Fano, Solomonoff, Kolmogorov, Chaitin, Bennett,
Landauer, and many many others with greatly extending the
field beyond Shannon's initial contribution, even a full
bibliography of mathematical and physical contributions to the
understanding of this concept would only scratch the surface.
Information concepts are critical to molecular and
evolutionary biology, cognitive neuroscience, semiotics and
linguistics, and social theory—to name but a few more
divergent fields. Each of these fields has their own list of
luminaries and important discoveries. </div>
<div><br>
</div>
<div>The challenge is always to find a common set of terms and
assumptions to ground such ambitious multidisciplinary
explorations. </div>
<div>To those who are convinced that the past 65 years of
research HAS dealt with all the relevant issues, I beg your
patience with those of us who remain less convinced.</div>
<div><br>
</div>
<div>— Terry</div>
<div><br>
</div>
<div><br>
</div>
<div><br>
</div>
</div>
<div class="gmail_extra"><br>
<div class="gmail_quote">On Thu, Mar 30, 2017 at 11:12 AM, John
Collier <span dir="ltr"><<a moz-do-not-send="true"
href="mailto:Collierj@ukzn.ac.za" target="_blank">Collierj@ukzn.ac.za</a>></span>
wrote:<br>
<blockquote class="gmail_quote" style="margin:0 0 0
.8ex;border-left:1px #ccc solid;padding-left:1ex">
<div link="blue" vlink="purple" lang="EN-ZA">
<div class="m_7570415511380260885WordSection1">
<p class="MsoNormal"><span
style="font-size:11.0pt;font-family:"Calibri",sans-serif;color:#0d0d0d">Dear
Hector,</span></p>
<p class="MsoNormal"><span
style="font-size:11.0pt;font-family:"Calibri",sans-serif;color:#0d0d0d"> </span></p>
<p class="MsoNormal"><span
style="font-size:11.0pt;font-family:"Calibri",sans-serif;color:#0d0d0d">Personally
I agree that algorithmic information theory and the
related concepts of randomness and Bennett’s logical
depth are the best way to go. I have used them in
many of my own works. When I met Chaitin a few years
back we talked mostly about how unrewarding and
controversial our work on information theory has
been. When I did an article on information for the
Stanford Encyclopaedia of Philosophy it was rejected
in part because of fierce divisions between
supporters of Chaitin and supporters of Kolmogorov!
The stuff I put in on Spencer Brown was criticized
because “he was some sort of Buddhist, wasn’t he?”
It sounds like you have run into similar problems. </span></p>
<p class="MsoNormal"><span
style="font-size:11.0pt;font-family:"Calibri",sans-serif;color:#0d0d0d">That
is why I suggested a realignment of what this group
should be aiming for. I think the end result would
justify our thinking, and your work certainly
furthers it. But it does need to be worked out.
Personally, I don’t have the patience for it.</span></p>
<span class="">
<p class="MsoNormal"><span
style="font-size:11.0pt;font-family:"Calibri",sans-serif;color:#0d0d0d">John
Collier</span></p>
<p class="MsoNormal"><span
style="font-size:11.0pt;font-family:"Calibri",sans-serif;color:#0d0d0d">Emeritus
Professor and Senior Research Associate</span></p>
<p class="MsoNormal"><span
style="font-size:11.0pt;font-family:"Calibri",sans-serif;color:#0d0d0d">Philosophy,
University of KwaZulu-Natal</span></p>
<p class="MsoNormal"><span
style="font-size:11.0pt;font-family:"Calibri",sans-serif;color:#0d0d0d"><a
moz-do-not-send="true"
href="http://web.ncf.ca/collier" target="_blank"><span
style="color:#0563c1">http://web.ncf.ca/collier</span></a></span></p>
<p class="MsoNormal"><span
style="font-size:11.0pt;font-family:"Calibri",sans-serif;color:#0d0d0d"> </span></p>
</span>
<div style="border:none;border-left:solid blue
1.5pt;padding:0cm 0cm 0cm 4.0pt">
<div>
<div style="border:none;border-top:solid #e1e1e1
1.0pt;padding:3.0pt 0cm 0cm 0cm">
<p class="MsoNormal"><b><span
style="font-size:11.0pt;font-family:"Calibri",sans-serif"
lang="EN-US">From:</span></b><span
style="font-size:11.0pt;font-family:"Calibri",sans-serif"
lang="EN-US"> Hector Zenil [mailto:<a
moz-do-not-send="true"
href="mailto:hzenilc@gmail.com"
target="_blank">hzenilc@gmail.com</a>]
<br>
<b>Sent:</b> Thursday, 30 March 2017 10:48 AM<br>
<b>To:</b> John Collier <<a
moz-do-not-send="true"
href="mailto:Collierj@ukzn.ac.za"
target="_blank">Collierj@ukzn.ac.za</a>>;
fis <<a moz-do-not-send="true"
href="mailto:fis@listas.unizar.es"
target="_blank">fis@listas.unizar.es</a>><span
class=""><br>
<b>Subject:</b> Re: [Fis] Causation is
transfer of information</span></span></p>
</div>
</div>
<p class="MsoNormal"> </p>
<div>
<div>
<p class="MsoNormal" style="margin-bottom:12.0pt">Dear
John et al. Some comments below:</p>
<div><span class="">
<p class="MsoNormal">On Thu, Mar 30, 2017 at
9:47 AM, John Collier <<a
moz-do-not-send="true"
href="mailto:Collierj@ukzn.ac.za"
target="_blank">Collierj@ukzn.ac.za</a>>
wrote:</p>
<blockquote
style="border:none;border-left:solid #cccccc
1.0pt;padding:0cm 0cm 0cm
6.0pt;margin-left:4.8pt;margin-right:0cm">
<div>
<div>
<p class="MsoNormal"><span
style="font-size:11.0pt;font-family:"Calibri",sans-serif;color:#0d0d0d">I
think we should try to categorize
and relate information concepts
rather than trying to decide which
is the “right one”. I have tried to
do this by looking at various uses
of information in science, and argue
that the main uses show progressive
containment:
</span><span style="color:black"><a
moz-do-not-send="true"
href="http://www.triple-c.at/index.php/tripleC/article/view/278/269"
target="_blank">Kinds of
Information in Scientific Use</a>.
2011. cognition, communication,
co-operation. <a
moz-do-not-send="true"
href="http://www.triple-c.at/index.php/tripleC/issue/view/22"
target="_blank">Vol 9, No 2</a></span></p>
<p class="MsoNormal"><span
style="color:black">There are
various mathematical formulations of
information as well, and I think the
same strategy is required here.
Sometimes they are equivalent,
sometimes close to equivalent, and
sometimes quite different in form
and motivation. Work on the
foundations of information science
needs to make these relations clear.
A few years back (more than a
decade) a mathematician on a list
(newsgroup) argued that there were
dozens of different mathematical
definitions of information. I
thought this was a bit excessive,
and argued with him about
convergences, but he was right that
they were mathematically different.
We need to look at information
theory structures and their models
to see where they are equivalent and
where (and if) they overlap.
Different mathematical forms can
have models in common, sometimes all
of them.</span></p>
</div>
</div>
</blockquote>
<div>
<p class="MsoNormal"> </p>
</div>
</span>
<div>
<p class="MsoNormal">The agreement among
professional mathematicians is that the
correct definition of randomness, as opposed
to information, is the Martin-Löf definition
for the infinite asymptotic case, and the
Kolmogorov-Chaitin definition for the finite
case. Algorithmic probability (Solomonoff,
Levin) is the theory of optimal induction
and thus provides a formal, universal
meaning for the value of information. The
general agreement is also that Bennett's
logical depth separates the concept of
randomness from information structure. There
is not much controversy there about the
nature of classical information as
algorithmic information. Notice that
'algorithmic information' is not just one
more definition of information; it IS the
definition of mathematical information
(again, by way of defining algorithmic
randomness). So adding 'algorithmic' to
information is not to mark off a special
case that can then be ignored by the
philosophy of information. </p>
</div>
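The claim above can be made concrete. Kolmogorov-Chaitin complexity K(x) is uncomputable, but any lossless compressor gives a computable upper bound on it. A minimal sketch of this idea (using Python's zlib purely as a crude stand-in for K; the data and names are invented for illustration):

```python
import os
import zlib

def compressed_size(data: bytes) -> int:
    """Length of the zlib-compressed data: a computable upper bound
    on the (uncomputable) Kolmogorov-Chaitin complexity K(x)."""
    return len(zlib.compress(data, 9))

structured = b"0123456789" * 100   # 1000 bytes, highly regular
random_bytes = os.urandom(1000)    # 1000 bytes, incompressible with high probability

print(compressed_size(structured))    # far below 1000
print(compressed_size(random_bytes))  # close to (or slightly above) 1000
```

The regular string compresses to a tiny fraction of its length, while the random bytes do not compress at all; this is the separation between structure and randomness that the algorithmic framework formalizes.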
<div>
<p class="MsoNormal">All of the above builds
on (and goes well beyond) Shannon entropy,
which is not even properly discussed in the
philosophy of information beyond its most
basic definition (we rarely, if ever, see
discussions of mutual information,
conditional information, Judea Pearl's
interventionist approach and
counterfactuals, etc.), let alone anything
from the more advanced areas mentioned
above, or any discussion of the now
well-established area of quantum
information, which is also completely
ignored. </p>
</div>
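For readers unfamiliar with the quantities named above, a minimal sketch of Shannon entropy and mutual information estimated from empirical counts (the toy data and names here are invented for the example):

```python
from collections import Counter
from math import log2

def entropy(seq):
    """Shannon entropy in bits per symbol, from empirical frequencies."""
    n = len(seq)
    return -sum((c / n) * log2(c / n) for c in Counter(seq).values())

def mutual_information(pairs):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), estimated from (x, y) samples."""
    xs = [x for x, _ in pairs]
    ys = [y for _, y in pairs]
    return entropy(xs) + entropy(ys) - entropy(pairs)

# X fully determines Y, so I(X;Y) = H(X) = 1 bit:
dependent = [(0, 0), (1, 1)] * 50
print(mutual_information(dependent))    # 1.0

# X and Y vary independently, so I(X;Y) = 0:
independent = [(x, y) for x in (0, 1) for y in (0, 1)] * 25
print(mutual_information(independent))  # 0.0
```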
<div>
<p class="MsoNormal">This is like trying to
do philosophy of cosmology by discussing
Gamow and Hubble but ignoring relativity, or
trying to do philosophy of language today by
discussing Locke and Hume but not Chomsky,
or doing philosophy of mind by discussing
the findings of Ramón y Cajal and claiming
that his theories are not enough to explain
the brain. It is a sort of strawman fallacy:
constructing an opponent living in the 1940s
in order to claim, in 2017, that it fails to
explain everything about information.
Shannon entropy is a symbol-counting
function with interesting applications;
Shannon himself knew it. It makes no sense
to expect a symbol-counting function to say
anything interesting about information after
60 years. I refer again to my paper on how
entropy deceives:
<a moz-do-not-send="true"
href="https://arxiv.org/abs/1608.05972"
target="_blank">https://arxiv.org/abs/1608.05972</a></p>
</div>
<div>
<p class="MsoNormal">I do not blame
philosophers for this one; physicists seem
to assign Shannon entropy some mystical
power, which is why I wrote a paper proving
that it cannot be used for graph complexity,
as some physicists have recently suggested
(e.g. Bianconi via Barabási). But this is
the kind of discussion we should be having:
telling physicists not to go back to the
1940s when it comes to characterizing new
objects. If Shannon entropy fails at
characterizing sequences, it will not work
for other objects (graphs!).</p>
</div>
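A small illustration of the point about graphs (the toy graphs below are invented for this example, not taken from the cited paper): two graphs can have identical degree distributions, and hence identical degree-distribution entropy, while being structurally different.

```python
from collections import Counter
from math import log2

def degree_entropy(adj):
    """Shannon entropy of the degree distribution: the kind of
    symbol-counting measure sometimes proposed as graph complexity."""
    degrees = [len(nbrs) for nbrs in adj.values()]
    n = len(degrees)
    return 0.0 - sum((c / n) * log2(c / n) for c in Counter(degrees).values())

def is_connected(adj):
    """Depth-first search reachability check from an arbitrary vertex."""
    seen, stack = set(), [next(iter(adj))]
    while stack:
        v = stack.pop()
        if v not in seen:
            seen.add(v)
            stack.extend(adj[v])
    return len(seen) == len(adj)

# A 6-cycle and two disjoint triangles: both are 2-regular on 6
# vertices, so their degree distributions coincide exactly...
cycle6 = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
triangles = {0: [1, 2], 1: [0, 2], 2: [0, 1],
             3: [4, 5], 4: [3, 5], 5: [3, 4]}

print(degree_entropy(cycle6), degree_entropy(triangles))  # 0.0 0.0
# ...yet the graphs are structurally different:
print(is_connected(cycle6), is_connected(triangles))      # True False
```

The counting measure assigns both graphs the same (zero) entropy, but one is connected and the other is not: the symbol counts carry no information about structure.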
<div>
<p class="MsoNormal"> </p>
</div>
<div>
<p class="MsoNormal">I think the field of the
philosophy of information cannot become
serious until serious discussion of the
topics above starts to take place. Right now
the field is small and carried out by a few
mathematicians and physicists. Philosophers
are left behind because they are choosing to
ignore all the theory developed in the last
50 to 60 years. I hope this is taken
constructively. I think we philosophers need
to step up: if we are not to lead the
discussion, at least we should not be 50 or
60 years behind. I have tried to close that
gap, but usually I too get conveniently
ignored =)</p>
</div>
<span class="">
<div>
<p class="MsoNormal"> </p>
</div>
</span><span class="">
<blockquote
style="border:none;border-left:solid #cccccc
1.0pt;padding:0cm 0cm 0cm
6.0pt;margin-left:4.8pt;margin-right:0cm">
<div>
<div>
<p class="MsoNormal"><span
style="font-size:11.0pt;font-family:"Calibri",sans-serif;color:#0d0d0d">I
have argued that information
originates in symmetry breaking
(making a difference, if you like,
but I see it as a dynamic process
rather than merely as a
representation) </span><span
style="color:black"><a
moz-do-not-send="true"
href="http://web.ncf.ca/collier/papers/infsym.pdf"
target="_blank">Information
Originates in Symmetry Breaking</a> (<i>Symmetry</i> 1996).</span></p>
</div>
</div>
</blockquote>
</span>
<div>
<p class="MsoNormal">Very nice paper. I agree
about symmetry breaking; I have similar ideas:</p>
</div>
<div>
<p class="MsoNormal"><a moz-do-not-send="true"
href="https://arxiv.org/abs/1210.1572"
target="_blank">https://arxiv.org/abs/1210.<wbr>1572</a></p>
</div>
<div>
<p class="MsoNormal">(published in the
journal Natural Computing)</p>
</div>
<div>
<p class="MsoNormal">On how symmetric rules
can produce asymmetric information.</p>
</div>
<div>
<p class="MsoNormal"> </p>
</div>
<div>
<p class="MsoNormal">Best,</p>
</div>
<div>
<p class="MsoNormal"> </p>
</div>
<div>
<p class="MsoNormal">Hector Zenil</p>
</div>
<div>
<p class="MsoNormal"><a moz-do-not-send="true"
href="http://www.hectorzenil.net/"
target="_blank">http://www.hectorzenil.net/</a></p>
</div>
<div>
<div class="h5">
<div>
<p class="MsoNormal"> </p>
</div>
<br>
</div>
</div>
</div>
</div>
</div>
</div>
</div>
</div>
</blockquote>
</div>
</div>
</blockquote>
<br>
<p><br>
</p>
<pre class="moz-signature" cols="72">--
-------------------------------------------------
Pedro C. Marijuán
Grupo de Bioinformación / Bioinformation Group
Instituto Aragonés de Ciencias de la Salud
Centro de Investigación Biomédica de Aragón (CIBA)
Avda. San Juan Bosco, 13, planta 0
50009 Zaragoza, Spain
Tfno. +34 976 71 3526 (& 6818)
<a class="moz-txt-link-abbreviated" href="mailto:pcmarijuan.iacs@aragon.es">pcmarijuan.iacs@aragon.es</a>
<a class="moz-txt-link-freetext" href="http://sites.google.com/site/pedrocmarijuan/">http://sites.google.com/site/pedrocmarijuan/</a>
------------------------------------------------- </pre>
</body>
</html>