[Fis] defining information - Goal, Methodology, Steps ...

jose luis perez velazquez jlpvjlpv at gmail.com
Wed Mar 11 10:47:31 CET 2020


 Greetings. Yesterday I sent this message with the attachment (the paper
Pedro asked me about), but it did not go through (attachments are not
accepted), so if anyone is interested, let me know and I will send it to you.
JL
 ----------------

I attach the copy they gave me; they call it the author's copy... And I was
told it would be open access... oh well. Section 5.1 contains that
digression on information processing applied to nervous systems.
    As you said, the crucial matter is the need, 'when defining it, of including the
context in which the definition is established'. It is the same as with
other measures, such as complexity, which has dozens of notions, all of
them fine provided we specify what kind of complexity we measure and from
what type of variable. The same goes for entropy and various other
man-made metrics.
    And by the way... you information experts may want to consider what is
going on with this coronavirus business/madness: apparently a normal flu,
but perhaps due to "information", or the mis/dis-information provided by
administrators, politicians and the media, it has become in the mind of the
populace an abnormal flu, the likes of Ebola.
  And on that topic, salud a todos (health to all)!


On Mon, Mar 9, 2020 at 7:38 PM Pedro C. Marijuan <pcmarijuan.iacs at aragon.es>
wrote:

> Dear Krassimir and FIS colleagues,
>
> Most of the recent messages look in sync. The consensus points, under
> different names, towards information as secondary, relative, undefinable...
> and particularly towards the need, when defining it, of including the
> context in which the definition is established. Personally I am more
> interested in the concerns about empirical information matters in different
> disciplines: we have heard snippets from computer science, maths, biology,
> neuroscience (Jose Luis--could you please send me that very interesting
> paper? It is not open access)... My traditional position is that the
> "informational coupling" of life forms with the abduced parts of their
> environment (signals, communication, etc.) in the advancement of their own
> living cycle should be considered as the proto-phenomenon of information.
> Agents, agency, are an abstraction in other disciplines (computer science,
> economics) derived from the former proto-phenomenon, unfortunately
> depriving it of some of its most interesting qualities. It is so rich in every
> generative aspect: Conatus principle (Spinoza), fundamental coding (Rosen),
> genetic algorithms (Holland), self-constructing "machines" (von Neumann),
> neural topodynamics (Friston)...
>
> The above central "informational coupling" appears valid all along the
> growth of complexity: prokaryotes, eukaryotes, multicellular organisms, nervous systems,
> societies... in each of these realms, and in multiple ways, we can point to
> "info definitions" and contexts tailored to the particular phenomenology.
> From cellular signalling systems to the bonding structures of our
> societies, or to the adaptive role of emotions. For instance, I have
> various research papers on the intriguing info content of laughter and its
> enigmatic relationship to the life cycle (it is omnipresent in our lives).
> Why?? A neural network can detect, from just a handful of your laughs and
> with more than 90% success, whether or not you are falling into depression
> (Navarro et al. 2014). The entropy of the frequencies involved is a major key.
>
> Developing consistent empirical research on information "caught in the
> act" is crucial.
>
> Best--Pedro
>
>
>
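
P.S. (on the entropy of laughter frequencies mentioned in the quoted message):
below is a minimal sketch of how a spectral-entropy feature could be computed
from a laughter recording. This is only an illustration under my own
assumptions (NumPy, a mono waveform already loaded into memory as an array);
it is not the actual procedure of Navarro et al. 2014.

    import numpy as np

    def spectral_entropy(signal, eps=1e-12):
        """Shannon entropy (bits) of the normalized power spectrum of a 1-D signal."""
        # Power spectrum via the real FFT; the input is assumed to be a mono waveform.
        power = np.abs(np.fft.rfft(signal)) ** 2
        # Normalize so the spectrum can be read as a probability distribution over bins.
        p = power / (power.sum() + eps)
        # Shannon entropy; eps guards against log(0).
        return float(-np.sum(p * np.log2(p + eps)))

    # Hypothetical usage: a pure tone concentrates power in one bin (low entropy),
    # while broadband noise spreads it across many bins (high entropy).
    sr = 16000                               # assumed sampling rate, Hz
    t = np.arange(0, 1.0, 1.0 / sr)
    tone = np.sin(2 * np.pi * 440.0 * t)
    noise = np.random.default_rng(0).normal(size=t.size)
    print("tone :", spectral_entropy(tone))
    print("noise:", spectral_entropy(noise))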