[Fis] A provocative issue

Francesco Rizzo 13francesco.rizzo at gmail.com
Mon Dec 12 10:40:02 CET 2016


Dear All,
things taken alone, one by one, have no meaning. This holds for all
things, above all for genes, which acquire meaning when they interact
with one another. Nor are words an exception in this respect, because
they are both cause and effect of unlimited semiotic processes
consisting of a succession of interpretants and interpreteds.
Differential calculus is a "hard" mathematics suited to Newtonian
physics, which tolerates no interdependence among the variables of a
function. Most systems are not integrable, for a question of resonance
that I have too little competence to treat here.
Yet the economists who carried out the marginalist revolution of
1871-74 (W. S. Jevons, K. Menger and L. Walras) defined, for example, a
utility function, an ordinal (not cardinal) magnitude on which the
neoclassical theory of value rests. They even managed to calculate
marginal utility by setting it equal to the partial derivative of the
utility function with respect to a given good, holding constant, or
parameterizing, the other goods in the same function.
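To make that operation concrete, here is a minimal sketch in Python,
assuming a hypothetical two-good Cobb-Douglas utility function (the
function and its exponents are illustrative, not taken from Jevons,
Menger or Walras):

```python
# Minimal sketch: marginal utility as a partial derivative (hypothetical utility function).
import sympy as sp

x1, x2 = sp.symbols("x1 x2", positive=True)          # quantities of two goods
U = x1**sp.Rational(1, 2) * x2**sp.Rational(1, 2)    # illustrative Cobb-Douglas utility

# Marginal utility of good 1: dU/dx1, with x2 held fixed (treated as a parameter).
MU1 = sp.diff(U, x1)
print(sp.simplify(MU1))   # -> sqrt(x2)/(2*sqrt(x1))
```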
Hence posing the problem of the measure(ment) of information and
deeming it unsolvable seems to me not very reasonable, for at least two
sets of reasons:
A.
- Natural or thermodynamic information is a direct function of
neg-entropy and an inverse function of entropy (a formal sketch follows
below);
- mathematical information is a function of bits of entropy;
- genetic information depends on DNA, that is, on the genomic function;
- semantic information is the result of superimposing an s-code on an
equiprobable information source so as to make it communicable through
the triad signification, information, communication.
Hence, one must be more cautious before ruling out that information can
be measured.
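One conventional way to make the first of these points formal is the
Brillouin/Schrödinger reading of neg-entropy, offered here only as an
illustrative sketch (not necessarily the formalization Rizzo has in
mind): the bound information of a state of entropy S, relative to the
maximum entropy S_max, is

```latex
I \;=\; \frac{S_{\max} - S}{k_B \ln 2}\ \text{bits},
```

so that information grows with neg-entropy (S_max - S) and shrinks as
entropy S grows.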
B.
With the "soft" mathematics of operations research and of
multi-criteria analysis it is possible to build interactive matrices
comprising quantitative (cardinal) variables, qualitative (ordinal)
variables, and quantitative-qualitative or qualitative-quantitative
ones.
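As an illustration of what such a mixed matrix might look like in
practice, here is a minimal sketch; the alternatives, criteria, scores
and weights are all invented for the example, and the weighted sum is
only one of the simplest multi-criteria aggregation rules:

```python
# Minimal sketch: a decision matrix mixing a cardinal and an ordinal criterion (invented data).
alternatives = ["A", "B", "C"]
cost = [120.0, 95.0, 150.0]                          # cardinal criterion (lower is better)
quality_scale = {"low": 1, "medium": 2, "high": 3}   # ordinal scale mapped to ranks
quality = [quality_scale["high"], quality_scale["medium"], quality_scale["low"]]
weights = {"cost": 0.6, "quality": 0.4}              # illustrative weights

def normalize(values, benefit=True):
    """Rescale to [0, 1]; invert the scale for cost-type criteria."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if benefit else (hi - v) / (hi - lo) for v in values]

cost_n = normalize(cost, benefit=False)   # cheaper -> closer to 1
quality_n = normalize(quality)            # higher ordinal rank -> closer to 1

for name, c, q in zip(alternatives, cost_n, quality_n):
    score = weights["cost"] * c + weights["quality"] * q
    print(f"{name}: {score:.2f}")
```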
In any case, I share and respect the perplexity or uncertainty
expressed by scholars in any disciplinary context. We are all aware
that we live in a society in which the only certainty is uncertainty,
but this must not drive us into intellectual or practical paralysis.
Thank you, also for the criticisms, which are more important and more
necessary than the agreements. We must educate ourselves to the harmony
of disagreement, or of asymmetry, the only moment in which "old things
pass away and new ones are born".
Francesco, a poor little exponential economist.

2016-12-12 8:19 GMT+01:00 Loet Leydesdorff <loet at leydesdorff.net>:

> Dear Bob,
>
>
>
> With all respect, I never understood the definition of information in this
> paper (on p. 28) as “first natural selection assembling the very
> constraints on the release of energy that then constitutes work and the
> propagation of organization.”
>
>
>
> 1. I tend to think of information as variation and not a selection
> mechanism;
>
> 2. Constraints can perhaps be modelled as the conditions in a conditional
> probability distribution. For example, when the probability distribution
> Σ_i q_i is constrained or conditioned by another probability distribution
> Σ_i p_i, the condition generates Σ_i q_i log(q_i / p_i) bits of
> information. In other words, this (Shannon-type) information is generated
> by the constraint.
>
>
>
> It seems to me that later in the paper, the meaning of “information” is
> strongly associated with organization. But let’s first focus on
> “information as selection environment assembling the very constraints…” How
> can one formalize this? Is it possible to use information theory or does
> one need another calculus? Did you model this? Or is it a philosophical
> contribution?
>
>
>
> A definition of information as “constraints on the release of energy”
> seems limited to systems which use energy and therefore generate
> entropy. Shannon’s H abstracts from the energy component and is expressed
> in dimensionless bits, whereas thermodynamic entropy is measured in joules
> per kelvin (J/K).
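(For reference, the two quantities being contrasted, in their usual
Shannon and Gibbs forms; the link S = k_B ln 2 · H holds when both are
computed over the same distribution.)

```latex
H = -\sum_i p_i \log_2 p_i \quad \text{[bits, dimensionless]}
\qquad\qquad
S = -k_B \sum_i p_i \ln p_i \quad \text{[J/K]}
```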
>
>
>
> A monetary system, for example, is not based on energy, but on symbolic
> value exchanges. Would information be irrelevant to such a system? Or would
> one focus on the energy involved, such as the transport costs of shipping
> from one place to another? Do we need (wish for?) an information concept
> which is constrained to physical and biological systems using energy?
>
>
>
> I would suggest defining information more abstractly, as yet dimensionless
> (following Shannon). The information calculus can then be used in specific
> systems and accordingly be provided with different meanings. In addition to
> the mathematical theory of communication, the domain of application allows
> for a special theory of communication. Maturana, for example, stated that a
> biology is generated when molecules are communicated. Similarly, when atoms
> are communicated, one would generate a chemistry and be able to develop a
> chemical theory of communication. The formal theorizing enables us to move
> between domains in terms of heuristics and translations.
>
>
>
> But I am looking forward to your answers.
>
>
>
> Best,
>
> Loet
>
>
> ------------------------------
>
> Loet Leydesdorff
>
> Professor, University of Amsterdam
> Amsterdam School of Communication Research (ASCoR)
>
> loet at leydesdorff.net ; http://www.leydesdorff.net/
> Associate Faculty, SPRU, <http://www.sussex.ac.uk/spru/>University of
> Sussex;
>
> Guest Professor Zhejiang Univ. <http://www.zju.edu.cn/english/>,
> Hangzhou; Visiting Professor, ISTIC,
> <http://www.istic.ac.cn/Eng/brief_en.html>Beijing;
>
> Visiting Professor, Birkbeck <http://www.bbk.ac.uk/>, University of
> London;
>
> http://scholar.google.com/citations?user=ych9gNYAAAAJ&hl=en
>
>
>
> *From:* Fis [mailto:fis-bounces at listas.unizar.es] *On Behalf Of *Bob Logan
> *Sent:* Sunday, December 11, 2016 9:21 PM
> *To:* tozziarturo at libero.it
> *Cc:* fis
> *Subject:* Re: [Fis] A provocative issue
>
>
>
> Bravo Arturo - I totally agree - in a paper I co-authored with Stuart
> Kauffman and others we talked about the relativity of information and the
> fact that information is not an absolute. Here is the abstract of the
> paper and an excerpt from the paper that discusses the relativity of
> information. The full paper is available at:
> https://www.academia.edu/783503/Propagating_organization_an_enquiry
>
>
>
> Best wishes - Bob Logan
>
>
>
> Kauffman, Stuart, Robert K. Logan, Robert Este, Randy Goebel, David Hobill
> and Ilya Shmulevich. 2007. Propagating Organization: An Enquiry. Biology and
> Philosophy 23: 27-45.
>
>                                     *Propagating Organization: An Enquiry* -
> Stuart Kauffman, Robert K. Logan, Robert Este, Randy Goebel, David Hobill
> and Ilya Shmulevich
>
> Institute for Systems Biology, Seattle Washington
>
>  *Abstract: *Our aim in this article is to attempt to discuss propagating
> organization of process, a poorly articulated union of matter, energy,
> work, constraints and that vexed concept, “information”, which unite in far
> from equilibrium living physical systems. Our hope is to stimulate
> discussions by philosophers of biology and biologists to further clarify
> the concepts we discuss here. We place our discussion in the broad context
> of a “general biology”, properties that might well be found in life
> anywhere in the cosmos, freed from the specific examples of terrestrial
> life after 3.8 billion years of evolution. By placing the discussion in
> this wider, if still hypothetical, context, we also try to place in context
> some of the extant discussion of information as intimately related to DNA,
> RNA and protein transcription and translation processes. While
> characteristic of current terrestrial life, there are no compelling grounds
> to suppose the same mechanisms would be involved in any life form able to
> evolve by heritable variation and natural selection. In turn, this allows
> us to discuss, at least briefly, the focus of much of the philosophy of
> biology on population genetics, which, of course, assumes DNA, RNA,
> proteins, and other features of terrestrial life. Presumably, evolution by
> natural selection – and perhaps self-organization - could occur on many
> worlds via different causal mechanisms.
>
> Here we seek a non-reductionist explanation for the synthesis,
> accumulation, and propagation of information, work, and constraint, which
> we hope will provide some insight into both the biotic and abiotic
> universe, in terms of both molecular self reproduction and the basic work
> energy cycle where work is the constrained release of energy into a few
> degrees of freedom. The typical requirement for work itself is to construct
> those very constraints on the release of energy that then constitute
> further work. Information creation, we argue, arises in two ways: first
> information as natural selection assembling the very constraints on the
> release of energy that then constitutes work and the propagation of
> organization. Second, information in a more extended sense is “semiotic”,
> that is *about* the world or internal state of the organism and requires
> appropriate response. The idea is to combine ideas from biology, physics,
> and computer science, to formulate explanatory hypotheses on how
> information can be captured and rendered in the expected physical
> manifestation, which can then participate in the propagation of the
> organization of process in the expected biological work cycles to create
> the diversity in our observable biosphere.
>
> Our conclusions, to date, of this enquiry suggest a foundation which views
> information as the construction of constraints, which, in their physical
> manifestation, partially underlie the processes of evolution to dynamically
> determine the fitness of organisms within the context of a biotic universe.
>
>
>
> *Section 4. The Relativity of Information*
>
>  In Section 2 we argued that the Shannon conception of information
> is not directly suited to describe the information of autonomous agents
> that propagate their organization. In Section 3 we defined a new form
> of information, instructional or biotic information, as the constraints that
> direct the flow of free energy to do work.
>
> The reader may legitimately ask the question “isn’t information just
> information?”, i.e., an invariant like the speed of light. Our response to
> this question is *no*, and to then clarify what seems arbitrary about the
> definition of information. Instructional or biotic information is a useful
> definition for biotic systems just as Shannon information was useful for
> telecommunication channel engineering, and Kolmogorov (Shiryayev 1993)
> information was useful for the study of information compression with
> respect to Turing machines.
>
> The definition of information is relative and depends on the context in
> which it is to be considered. There appears to be no such thing as absolute
> information that is an invariant that applies to all circumstances. Just as
> Shannon defined information in such a way as to understand the engineering
> of telecommunication channels, our definition of instructional or biotic
> information best describes the interaction and evolution of biological
> systems and the propagation of organization. Information is a tool and as
> such it comes in different forms. We therefore would like to suggest that
> information is not an invariant but rather a quantity that is relative to
> the environment in which it operates. It is also the case that the
> information in a system or structure is not an intrinsic property of that
> system or structure; rather it is sensitive to history and environment. To
> drive home this point we will now examine the historic context in which
> Shannon (1948) information emerged.
>
> Before delving into the origin of Shannon information we will first
> examine the relationship of information and materiality. Information is
> about material things and furthermore is instantiated in material things
> but is not material itself. Information is an abstraction we use to
> describe the behavior of material things and is often thought of as something
> that controls, in the cybernetic sense, material things. So what do we mean
> when we say the constraints are information and information is constraints,
> as we did in Section 3?
>
> “The constraints are information” is a way to describe the limits on the
> behavior of an autonomous agent who acts on its own behalf but is
> nevertheless constrained by the internal logic that allows it to propagate
> its organization. This is consistent with Hayles's (1999, p. 72) description
> of the way information is regarded by information science: “It constructs
> information as the site of mastery and control over the material world.”
> She claims, and we concur, that information science treats information as
> separate from the material base in which it is instantiated. This suggests
> that there is nothing intrinsic about information but rather it is merely a
> description of or a metaphor for the complex patterns of behavior of
> material things. In fact, the key is to what degree information is a
> completely vivid description of the objects in question.
>
> This understanding of the nature of information arises from Shannon’s
> (1948) original formulation of information, dating back to his original
> paper:
>
> The fundamental problem of communication is that of reproducing at one
> point either exactly or approximately a message selected at another point.
> Frequently the messages have meaning; that is they refer to or are
> correlated according to some system with certain physical or conceptual
> entities. These semantic aspects of communication are irrelevant to the
> engineering problem. The significant aspect is that the actual message is
> one selected from a set of possible messages. The system must be designed
> to operate for each possible selection, not just the one that will actually
> be chosen since this is unknown at the time of design. If the number of
> messages in the set is finite then this number or any monotonic function of
> this number can be regarded as a measure of the information produced when
> one message is chosen from the set, all choices being equally likely.
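(For reference, the monotonic function Shannon settles on in the same
paper is the logarithm: for N equally likely messages the measure is
H = log2 N bits, and in the general case the familiar entropy formula.)

```latex
H = \log_2 N \quad\text{(N equally likely messages)}, \qquad
H = -\sum_i p_i \log_2 p_i \quad\text{(general case)}.
```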
>
>  A number of problems for biology emerge from this view of information.
> The first is that the number of possible messages is not finite because we
> are not able to prestate all possible preadaptations from which a
> particular message can be selected and therefore the Shannon measure breaks
> down. Another problem is that for Shannon the semantics or meaning of the
> message does not matter, whereas in biology the opposite is true. Biotic
> agents have purpose and hence meaning.
>
> The third problem is that Shannon information is defined independent of
> the medium of its instantiation. This independence of the medium is at the
> heart of a strong AI approach in which it is claimed that human
> intelligence does not require a wet computer, the brain, to operate but can
> be instantiated onto a silicon-based computer. In the biosphere, however,
> one cannot separate the information from the material in which it is
> instantiated. The DNA is not a sign for something else; it is the actual
> thing in itself, which regulates other genes and generates messenger RNA,
> which in turn controls the production of proteins. Information on a computer
> or a telecommunication device can slide from one computer or device to
> another and then via a printer to paper and not really change, McLuhan’s
> “the medium is the message” aside. This is not true of living things. The
> same genotype does not always produce the same phenotype.
>
> According to the Shannon definition of information, a structured set of
> numbers like the set of even numbers has less information than a set of
> random numbers because one can predict the sequence of even numbers. By
> this argument, a random soup of organic chemicals has more information than
> a structured biotic agent. The biotic agent has more meaning than the soup,
> however. The living organism with more structure and more organization has
> less Shannon information. This is counterintuitive to a biologist’s
> understanding of a living organism. We therefore conclude that the use of
> Shannon information to describe a biotic system would not be valid. Shannon
> information for a biotic system is simply a category error.
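(A minimal sketch of the comparison made in the paragraph above,
assuming the relevant quantity is the empirical entropy of the next
symbol given the current one; the two sequences, the last digits of the
even numbers and uniform random digits, are invented for illustration.)

```python
# Minimal sketch: next-symbol entropy (bits) of a predictable vs a random sequence.
# Sequences and estimator are illustrative; estimates are empirical, not exact.
import math
import random
from collections import Counter, defaultdict

def conditional_entropy(seq):
    """Estimate H(next symbol | current symbol) in bits from bigram counts."""
    pairs = list(zip(seq, seq[1:]))
    by_context = defaultdict(Counter)
    for a, b in pairs:
        by_context[a][b] += 1
    h, n = 0.0, len(pairs)
    for nexts in by_context.values():
        total = sum(nexts.values())
        h_ctx = -sum((c / total) * math.log2(c / total) for c in nexts.values())
        h += (total / n) * h_ctx
    return h

even_digits = [(2 * k) % 10 for k in range(10_000)]            # fully predictable
random_digits = [random.randrange(10) for _ in range(10_000)]  # uniform random digits

print(f"predictable sequence: {conditional_entropy(even_digits):.3f} bits")    # ~0.0
print(f"random sequence:      {conditional_entropy(random_digits):.3f} bits")  # ~log2(10) ≈ 3.32
```

On this reading the structured sequence carries essentially no Shannon
information per symbol, which is exactly the tension with biological
intuition described above.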
>
> A living organism has meaning because it is an autonomous agent acting on
> its own behalf. A random soup of organic chemicals has no meaning and no
> organization. We may therefore conclude that a central feature of life is
> organization—organization that propagates.
>
>
>
> ______________________
>
>
>
> Robert K. Logan
>
> Prof. Emeritus - Physics - U. of Toronto
>
> Fellow University of St. Michael's College
>
> Chief Scientist - sLab at OCAD
>
> http://utoronto.academia.edu/RobertKLogan
>
> www.physics.utoronto.ca/Members/logan
>
> www.researchgate.net/profile/Robert_Logan5/publications
>
>
> On Dec 11, 2016, at 10:57 AM, tozziarturo at libero.it wrote:
>
>
>
>
>
> Dear FISers,
>
> I know that some of you are going to kill me, but there’s something that I
> must confess.
>
> I notice, from the nice issues raised by Francesco Rizzo, Joseph Brenner,
> John Collier, that the main concerns are always energetic/informational
> arguments and accounts.
>
> Indeed, the current tenets state that all is information, information
> being a real quantity that can be measured through informational entropies.
>
> But… I ask myself, is such a tenet true?
>
> When I cook pasta, I realize that, from my point of view, the cooked
> pasta encompasses more information than the uncooked one, because it
> acquires the role of something that I can eat in order to increase my
> chances of preserving myself in the hostile environment that wants to
> destroy me.  However, from the point of view of the bug that eats the
> uncooked pasta, my cooked pasta surely displays less information.
> Therefore, information is a very subjective measure that, apart from its
> relationship with the observer, does not mean very much…  Who can state
> that an event or a fact displays more information than another one?
>
> And, please, do not object that information is a quantifiable,
> objective reality because it can be measured through informational
> entropy… Informational entropy, in Shannon’s original formulation,
> assumes an ergodic process (page 8 of Shannon’s seminal 1948
> paper), i.e. every sequence produced by the process has the same
> statistical properties, or, in other words, a travelling particle
> eventually crosses all the points of its phase space.  However, in physics
> and biology, facts and events are never ergodic.  Statistical homogeneity
> is just a fiction, if we consider the world around us and our brain/mind.
>
> Therefore, the role of information may not be as fundamental as
> currently believed.
>
>
>
> P.S.: topology analyzes information from another point of view, but that is
> an issue for next time, I think…
>
>
>
>
>
>
>
> *Arturo Tozzi*
>
> AA Professor Physics, University North Texas
>
> Pediatrician ASL Na2Nord, Italy
>
> Comput Intell Lab, University Manitoba
>
> http://arturotozzi.webnode.it/
>
>
> _______________________________________________
> Fis mailing list
> Fis at listas.unizar.es
> http://listas.unizar.es/cgi-bin/mailman/listinfo/fis
>