[Fis] Causation is transfer... and something else
Pedro C. Marijuan
pcmarijuan.iacs at aragon.es
Fri Mar 31 14:00:00 CEST 2017
Dear Hector and colleagues,
I found your message very interesting. It reminded me, on another
level, of the problems we have on this list keeping discussions
focused, particularly on disciplinary (non-philosophical, non-general)
matters. Most people on the list pay lip service to multidisciplinarity
concerning the problem of establishing the foundations of information
science. But in actuality only the "generalist" community, including
philosophers and people close to information theory, has sufficient
critical mass to bias the debate, voluntarily or involuntarily, towards
its own views, especially the preoccupation with those big questions
that (fortunately) cannot be answered for the time being.
In my own view, already stated in my last message and in quite a few
previous ones, the most strategic problem concerns the biological
origins of meaning: the hiatus that notoriously separates the
inanimate/objective from the animate/subjective forms of information.
The recent revolution in signaling science has a few things to say
about that: how life cycles advance amid constellations of colligated
info-and-energy flows, and how the meaning of signals is molecularly
fabricated, not so far from our social "narratives". But alas, I have
failed to capture the attention and interest of my FIS colleagues -- a
complaint, I tell myself, that is widely shared by most, if not all, of
us! In any case, I will refrain from advertising my own papers on the
matter.
Please do not take this as bitterness. The fact is that we have a
serious imbalance in the composition of our discussion community.
Enlisting practicing research scientists in a generalist list like this
one is very difficult, and maintaining topical discussions on their
specialized matters of interest is almost impossible given the lack of
critical mass and the disinterest of broad segments of the list. See,
for instance, the poor performance of most of the specialized sessions
organized so far. Spontaneous "tangents" come to the rescue, as they
have always been accepted on this list, and can be genuinely creative,
but most of these digressions return again and again to those ghostly
questions.
Now, on the positive side: I have recently proposed to the board of
IS4SI, the common information society into which FIS has been
integrated, the creation of Working Groups, or Interest Groups, so that
maintaining a general discussion list remains compatible with parallel
exchanges among more homogeneous participants. For instance, here at
FIS it would not be too difficult to arrange a working group on
information philosophy and another on information theory and the
definition of information (the quest for establishing standards);
perhaps we could also try one in biophysics and neurodynamics, another
in bioinformation, plus one on social information matters... Who knows?
I think it is an interesting step to try in order to achieve some
"ratchet effect". We could count on FIS's own web pages to support the
new work, and perhaps it would be easier to obtain some financing for
small face-to-face meetings... Well, I offer to start working with the
bioinformation group, and if anyone is interested in the initial
coordination of one of these possible teams, just speak up (either on
the list or offline). If any of these groups gets some traction among
us, we will have made progress towards arranging the idea on a wider
scale.
Best wishes--Pedro
On 30/03/2017 at 22:01, Terrence W. DEACON wrote:
> Dear Hector,
>
> Whenever I read an email or hear a response that begins with the
> phrase "With all due respect" I fear that what follows will indeed be
> disrespectful and self-promoting. Scholarly respect is particularly
> important when the diversity of backgrounds of the contributors is so
> broad and their level of erudition in these different fields is
> likewise broad. Best to begin by assuming that all are well-read,
> expert scholars rather than complaining about others' ignorance of
> what you refer to; the assumption of ignorance is often the mistaken one.
>
> In our short email notes one cannot expect each author to provide a
> list of all current mathematical and non-mathematical formal
> definitions of information, or to provide an evidentiary list of their
> own papers on the topic as a proof of competence, in order to make a
> point. Since we are inevitably forced to use short-hand terms to
> qualify our particular usages, my only suggestion is that we need to
> find mutually understandable qualifiers for these different uses, to
> avoid pointless bickering about what 'information' is or how it should
> be used.
>
> The term "information" is not "fixed" to a particular technical
> definition currently standard to only one or two fields like
> mathematics, physics, or computation theory. Nor can we assume that
> technical approaches in one field will be relevant to problems outside
> that field. I would hope that we are collectively attempting to expand
> our mutual understanding of this concept, recognizing its diversity,
> and the value of the many very different approaches in different
> fields. I would like us to stop making claims that one or another
> approach has exclusive priority and remain open to dialogue and
> constructive argument. So although we should credit Wiener, Fano,
> Solomonoff, Kolmogorov, Chaitin, Bennett, Landauer, and many, many
> others with greatly extending the field beyond Shannon's initial
> contribution, even a full bibliography of mathematical and physical
> contributions to the understanding of this concept would only scratch
> the surface. Information concepts are critical to molecular and
> evolutionary biology, cognitive neuroscience, semiotics and
> linguistics, and social theory—to name but a few more divergent
> fields. Each of these fields has its own list of luminaries and
> important discoveries.
>
> The challenge is always to find a common set of terms and assumptions
> to ground such ambitious multidisciplinary explorations.
> To those who are convinced that the past 65 years of research HAS
> dealt with all the relevant issues, I beg your patience with those of
> us who remain less convinced.
>
> — Terry
>
>
>
>
> On Thu, Mar 30, 2017 at 11:12 AM, John Collier
> <Collierj at ukzn.ac.za> wrote:
>
> Dear Hector,
>
> Personally I agree that algorithmic information theory and the
> related concepts of randomness and Bennett’s logical depth are the
> best way to go. I have used them in many of my own works. When I
> met Chaitin a few years back we talked mostly about how
> unrewarding and controversial our work on information theory has
> been. When I did an article on information for the Stanford
> Encyclopaedia of Philosophy it was rejected in part becausewe of
> fierce divisions between supporters of Chaitin and supporters of
> Kolmogorov! The stuff I put in on Spencer Brown was criticized
> because “he was some sort of Buddhist, wasn’t he?” It sounds like
> you have run into similar problems.
>
> That is why I suggested a realignment of what this group should be
> aiming for. I think the end result would justify our thinking, and
> your work certainly furthers it. But it does need to be worked
> out. Personally, I don’t have the patience for it.
>
> John Collier
>
> Emeritus Professor and Senior Research Associate
>
> Philosophy, University of KwaZulu-Natal
>
> http://web.ncf.ca/collier
>
> *From:* Hector Zenil [mailto:hzenilc at gmail.com]
> *Sent:* Thursday, 30 March 2017 10:48 AM
> *To:* John Collier <Collierj at ukzn.ac.za>; fis <fis at listas.unizar.es>
> *Subject:* Re: [Fis] Causation is transfer of information
>
> Dear John et al. Some comments below:
>
> On Thu, Mar 30, 2017 at 9:47 AM, John Collier
> <Collierj at ukzn.ac.za> wrote:
>
> I think we should try to categorize and relate information
> concepts rather than trying to decide which is the “right
> one”. I have tried to do this by looking at various uses of
> information in science, and argue that the main uses show
> progressive containment: Kinds of Information in Scientific Use
> <http://www.triple-c.at/index.php/tripleC/article/view/278/269>,
> tripleC: Cognition, Communication, Co-operation, Vol. 9, No. 2, 2011
> <http://www.triple-c.at/index.php/tripleC/issue/view/22>.
>
> There are various mathematical formulations of information as
> well, and I think the same strategy is required here.
> Sometimes they are equivalent, sometimes close to equivalent,
> and sometimes quite different in form and motivation. Work on
> the foundations of information science needs to make these
> relations clear. A few years back (more than a decade) a
> mathematician on a list (newsgroup) argued that there were
> dozens of different mathematical definitions of information. I
> thought this was a bit excessive, and argued with him about
> convergences, but he was right that they were mathematically
> different. We need to look at information theory structures
> and their models to see where they are equivalent and where
> (and if) they overlap. Different mathematical forms can have
> models in common, sometimes all of them.
>
> The agreement among professional mathematicians is that the correct
> definition of randomness, as opposed to information, is the
> Martin-Löf definition for the infinite asymptotic case and the
> Kolmogorov-Chaitin definition for the finite case. Algorithmic
> probability (Solomonoff, Levin) is the theory of optimal induction
> and thus gives a formal, universal meaning to the value of
> information. The general agreement is also that Bennett's logical
> depth separates the concept of randomness from that of information
> structure. There is not much controversy there about the nature of
> classical information as algorithmic information. Notice that
> 'algorithmic information' is not just one more definition of
> information; it IS the definition of mathematical information (again,
> by way of defining algorithmic randomness). So adding 'algorithmic'
> to 'information' does not mark out a special case that can then be
> ignored by the philosophy of information.
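>
> (A minimal sketch of this in Python, using only the standard library;
> K(s) itself is uncomputable, but any lossless compressor yields a
> computable upper bound on it, the usual practical proxy for
> algorithmic information:)
>
>     import random
>     import zlib
>
>     # K(s) <= len(zlib.compress(s)) + O(1): compressed size is a
>     # computable upper bound on algorithmic information content.
>     def compressed_size(s: bytes) -> int:
>         return len(zlib.compress(s, 9))
>
>     structured = b"01" * 500  # highly regular, hence low complexity
>     random.seed(0)
>     noise = bytes(random.getrandbits(8) for _ in range(1000))
>
>     print(compressed_size(structured))  # small: the pattern compresses
>     print(compressed_size(noise))       # near 1000: nothing to exploit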
>
> All of the above builds on (and goes well beyond) Shannon entropy,
> which is not even properly discussed in the philosophy of information
> beyond its most basic definition (we rarely, if ever, see discussions
> of mutual information, conditional entropy, Judea Pearl's
> interventionist approach and counterfactuals, etc.), let alone
> anything from the more advanced areas mentioned above, or any
> discussion of the now well-established area of quantum information,
> which is also completely ignored.
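>
> (A minimal sketch of two of those quantities in Python, computed from
> a small, hypothetical joint distribution p(x, y); the numbers are
> illustrative only:)
>
>     from math import log2
>
>     # hypothetical joint distribution over X in {0, 1}, Y in {0, 1}
>     p = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
>
>     # marginals p(x) and p(y)
>     px = {x: sum(v for (a, _), v in p.items() if a == x) for x in (0, 1)}
>     py = {y: sum(v for (_, b), v in p.items() if b == y) for y in (0, 1)}
>
>     def H(dist):  # Shannon entropy in bits
>         return -sum(v * log2(v) for v in dist.values() if v > 0)
>
>     # mutual information I(X;Y) = H(X) + H(Y) - H(X,Y)
>     I = H(px) + H(py) - H(p)
>     print(f"H(X) = {H(px):.3f} bits, I(X;Y) = {I:.3f} bits")
>     # -> H(X) = 1.000 bits, I(X;Y) = 0.278 bits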
>
> This is like trying to do philosophy of cosmology by discussing Gamow
> and Hubble while ignoring relativity, or philosophy of language today
> by discussing Locke and Hume but not Chomsky, or philosophy of mind
> by discussing the findings of Ramón y Cajal and claiming that his
> theories are not enough to explain the brain. It is a sort of
> strawman fallacy: constructing an opponent living in the 1940s in
> order to claim, in 2017, that it fails to explain everything about
> information. Shannon entropy is a symbol-counting function with
> interesting applications, as Shannon himself knew. It makes no sense
> to expect a symbol-counting function to say anything interesting
> about information after 60 years. I refer again to my paper on how
> Entropy deceives: https://arxiv.org/abs/1608.05972
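>
> (A minimal sketch of the symbol-counting point in Python: first-order
> Shannon entropy depends only on symbol frequencies, so a perfectly
> periodic string and a shuffled copy of it score exactly the same,
> even though their algorithmic complexities differ enormously:)
>
>     import random
>     from collections import Counter
>     from math import log2
>
>     def entropy(s: str) -> float:
>         # bits per symbol, computed from symbol frequencies alone
>         n = len(s)
>         return -sum(c / n * log2(c / n) for c in Counter(s).values())
>
>     periodic = "01" * 500      # fully predictable
>     cells = list(periodic)
>     random.seed(1)
>     random.shuffle(cells)      # same frequencies, structure destroyed
>     shuffled = "".join(cells)
>
>     print(entropy(periodic), entropy(shuffled))  # both exactly 1.0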
>
> I do not blame only philosophers on this one; physicists seem to
> assign Shannon entropy some mystical power, which is why I wrote a
> paper showing that it cannot be used for graph complexity as some
> physicists have recently suggested (e.g. Bianconi via Barabási). But
> this is the kind of discussion we should be having: telling
> physicists not to go back to the 1940s when it comes to
> characterizing new objects. If Shannon entropy fails at
> characterizing sequences, it will not work for other objects (graphs!).
>
> I think the field of philosophy of information cannot become serious
> until serious discussion of the topics above starts to take place.
> Right now the field is small and carried on by a few mathematicians
> and physicists. Philosophers are left behind because they are
> choosing to ignore all the theory developed over the last 50 to 60
> years. I hope this is taken constructively. I think we philosophers
> need to step up: if we are not to lead the discussion, at least we
> should not be 50 or 60 years behind it. I have tried to close that
> gap, but usually I too get conveniently ignored =)
>
> I have argued that information originates in symmetry breaking
> (making a difference, if you like, but I see it as a dynamic
> process rather than merely as a representation): Information
> Originates in Symmetry Breaking
> <http://web.ncf.ca/collier/papers/infsym.pdf> (/Symmetry/, 1996).
>
> Very nice paper. I agree on symmetry breaking; I have similar ideas
> on how symmetric rules can produce asymmetric information:
>
> https://arxiv.org/abs/1210.1572
>
> (published in the journal Natural Computing)
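>
> (A minimal dynamical sketch of this shared idea, in Python with
> hypothetical parameters: a symmetric law started from its symmetric
> point must still fall into one of two wells, and which well it falls
> into is a new distinction, one bit that the symmetric law itself does
> not contain:)
>
>     import random
>
>     def settle(seed: int, steps: int = 1000, dt: float = 0.01) -> int:
>         """Overdamped motion in the symmetric double well
>         V(x) = x**4/4 - x**2/2, from the symmetric point x = 0."""
>         random.seed(seed)
>         x = 0.0
>         for _ in range(steps):
>             x += (x - x**3) * dt + random.gauss(0, 0.01)  # tiny noise
>         return 0 if x < 0 else 1  # one bit, created by broken symmetry
>
>     print([settle(s) for s in range(10)])  # typically a mix of 0s and 1s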
>
> Best,
>
> Hector Zenil
>
> http://www.hectorzenil.net/
>
>
--
-------------------------------------------------
Pedro C. Marijuán
Grupo de Bioinformación / Bioinformation Group
Instituto Aragonés de Ciencias de la Salud
Centro de Investigación Biomédica de Aragón (CIBA)
Avda. San Juan Bosco, 13, planta 0
50009 Zaragoza, Spain
Tfno. +34 976 71 3526 (& 6818)
pcmarijuan.iacs at aragon.es
http://sites.google.com/site/pedrocmarijuan/
-------------------------------------------------