<div dir="ltr"><div dir="ltr"><div dir="ltr"> Greetings. I sent this message yesterday with the attachment (the paper Pedro asked me about), but it did not go through (attachments are not welcome), so if anyone is interested, let me know and I will send it to you. </div><div>JL</div><div dir="ltr"> ----------------</div><div dir="ltr"><br></div><div dir="ltr">I attach the copy they gave me; they call it the author's copy... And I was told it would be open access... oh well. Section 5.1 contains that digression on information processing applied to nervous systems.</div><div> As you said, the crucial matter is, when defining it, that of 'including the context in which the definition is established'. It is the same as with other measures, for instance complexity, which has dozens of notions, all of them fine provided we specify what kind of complexity we measure and from what type of variable. And similarly for entropy and various other man-made metrics.</div><div> And by the way... You information experts may want to consider what is going on with this coronavirus business/madness: apparently a normal flu, but perhaps due to "information", or the mis/dis-information provided by administrators, politicians and media, it has become in the mind of the populace an abnormal flu, the likes of Ebola.</div><div> And on that topic, salud a todos (health to all)!</div></div><div><br></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Mon, Mar 9, 2020 at 7:38 PM Pedro C. Marijuan <<a href="mailto:pcmarijuan.iacs@aragon.es" target="_blank">pcmarijuan.iacs@aragon.es</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">
<div>
<div>Dear Krassimir and FIS colleagues,</div>
<div><br>
</div>
<div>Most of the recent messages look in
sync. The consensus points, under different names, towards
information as secondary, relative, undefinable... and
particularly towards the need, when defining it, of including the
context in which the definition is established. Personally I am
more interested in the concerns about empirical information
matters in different disciplines: we have heard snippets from
computer science, maths, biology, neuroscience (Jose Luis, could
you please send me that very interesting paper, if it is not
open access?) ... My traditional position is that the
"informational coupling" of life forms with the abduced parts of
their environment (signals, communication, etc.) in the
advancement of their own living cycle should be considered the
proto-phenomenon of information. Agents, and agency, are an
abstraction in other disciplines (computer science, economics)
derived from this proto-phenomenon, unfortunately depriving
it of some of its most interesting qualities. It is rich in every
generative aspect: the Conatus principle (Spinoza), fundamental coding
(Rosen), genetic algorithms (Holland), self-constructing "machines"
(von Neumann), neural topodynamics (Friston)...<br>
</div>
<div><br>
</div>
<div>The above central "informational
coupling" appears valid all along the growth of complexity:
prokaryotes, eukaryotes, multicellulars, nervous systems, societies...
In each of these realms, and in multiple ways, we can point to
"info definitions" and contexts tailored to the particular
phenomenology, from cellular signalling systems to the bonding
structures of our societies, or to the adaptive role of emotions.
For instance, I have various research papers on the intriguing
info content of laughter and its enigmatic relationship to the
life cycle (it is omnipresent in our lives). Why? A neural
network can detect, with more than 90% success and from just a handful of
your laughs, whether or not you are falling into depression (Navarro
et al. 2014). The entropy of the frequencies involved is a major key.</div>
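(As a minimal illustration of the kind of "entropy of the frequencies" measure mentioned above, and not the actual feature set or classifier of Navarro et al. 2014: one common spectral-entropy feature is the Shannon entropy of a signal's normalized power spectrum. The function name and the toy signals below are my own assumptions for the sketch.)

```python
# Illustrative sketch (not the Navarro et al. 2014 method): Shannon entropy
# of the normalized power spectrum of an audio signal, a common "spectral
# entropy" feature for vocalizations such as laughter.
import numpy as np

def spectral_entropy(signal, eps=1e-12):
    """Shannon entropy (in bits) of the normalized power spectrum."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2   # power at each frequency bin
    p = spectrum / (spectrum.sum() + eps)         # normalize to a distribution
    p = p[p > 0]                                  # drop empty bins
    return float(-(p * np.log2(p)).sum())

# A pure tone concentrates power in one bin -> low entropy;
# white noise spreads power evenly -> high entropy.
t = np.linspace(0, 1, 8000, endpoint=False)
tone = np.sin(2 * np.pi * 440 * t)
noise = np.random.default_rng(0).standard_normal(8000)
print(spectral_entropy(tone) < spectral_entropy(noise))  # True
```

The intuition is simply that a "flat", noisy spectrum carries high entropy while a tonal one carries low entropy; features of this kind are what such classifiers typically consume.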
<div><br>
</div>
<div>Developing consistent empirical
research on information "caught in the act" is crucial.</div>
<div><br>
</div>
<div>Best--Pedro<br>
</div>
<div><br></div></div><br>
</blockquote></div></div>