[Fis] Fw: Krassimir's Information Quadruple and GIT. Quintuples?

Francesco Rizzo 13francesco.rizzo at gmail.com
Sun Aug 31 17:16:42 CEST 2014


Dear All and dear Krassimir, also out of regard for the quality of your style,
I realize the difficulties I cause you. It is certainly not out of lack
of respect that I use the Italian language. Even if I had a greater and
better knowledge of English, I would never manage to convey to you everything
that my thinking thought produces in a language other than my own.
I will continue to follow the discussion-debate you carry on, and I will
no longer oblige you to read verses in Italian. In greeting you all, I
thank you for what you have taught me and will yet teach me.
Francesco Rizzo.


2014-08-25 15:51 GMT+02:00 Stanley N Salthe <ssalthe at binghamton.edu>:

> Bob wrote:
>
> Recall that some thermodynamic variables, especially work functions like
> the Helmholtz and Gibbs free energies and exergy, are all tightly related to
> information measures. In statistical mechanical analogs, for example, the
> exergy becomes RT times the mutual information among the molecules.
>
> S: So, the more organized, the more potential available energy.
>
>
> I happen to be a radical who feels that the term "energy" is a construct
> with little ontological depth.
>
> S: I believe it has instead ontological breadth!
>
> It is a bookkeeping device (a nice one, of course, but bookkeeping
> nonetheless).
> It was devised to maintain the Platonic worldview. Messrs. Mayer & Joule simply
> gave us the conversion factors to make it look like energy is constant.
>
> S: It IS constant in the adiabatic boxes used to measure it.
>
>  *Real* energy is always in decline -- witness what happens to the work
> functions I just mentioned.
>
> S: In decline in the actual material world that we inhabit.  That is, the
> local world -- the world of input and dissipation.  I think the information
> problem may be advanced if we try to explain why the energy efficiency of
> any work is so poor, and gets worse the harder we work. This is the key
> local phenomenon that needs to be understood.
>
> STAN
>
>
> On Mon, Aug 25, 2014 at 4:40 AM, John Collier <collierj at ukzn.ac.za> wrote:
>
>> Nice post, Bob. I agree pretty much. Brooks and Wiley got slammed by
>> Morowitz for using the *Real* energy in their book; since the book is
>> about biology, that is the only sensible notion of energy.
>>
>> There is still a need for a clear dimensional analysis of the relation(s)
>> between information and energy. I work on that periodically, but only
>> minimal progress so far. Perhaps I can focus on it better now that I am
>> retired.
>>
>> John
>>
>> At 02:11 AM 2014-08-22, Robert E. Ulanowicz wrote:
>>
>>> Dear Joseph,
>>>
>>> Recall that some thermodynamic variables, especially work functions like
>>> the Helmholtz and Gibbs free energies and exergy, are all tightly related to
>>> information measures. In statistical mechanical analogs, for example, the
>>> exergy becomes RT times the mutual information among the molecules.
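The statistical-mechanical reading above can be made concrete: given a joint probability distribution over paired molecular states, the mutual information (in nats) multiplied by RT has units of energy per mole. The joint distribution below is invented purely for illustration:

```python
import math

R = 8.314  # molar gas constant, J/(mol*K)

def mutual_information_nats(joint):
    """Mutual information (nats) of a joint distribution given as {(x, y): p}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# A hypothetical correlated two-state system; independent states give I = 0,
# so an uncorrelated gas carries no exergy of this kind.
joint = {("a", "a"): 0.4, ("a", "b"): 0.1,
         ("b", "a"): 0.1, ("b", "b"): 0.4}
exergy_like = R * 300.0 * mutual_information_nats(joint)  # J/mol at 300 K
```

The point of the sketch is only the dimensional bookkeeping: mutual information is dimensionless, and RT supplies the energy scale.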
>>>
>>> I happen to be a radical who feels that the term "energy" is a construct
>>> with little ontological depth. It is a bookkeeping device (a nice one, of
>>> course, but bookkeeping nonetheless). It was devised to maintain the
>>> Platonic worldview. Messrs. Mayer & Joule simply gave us the conversion
>>> factors to make it look like energy is constant. *Real* energy is always
>>> in decline -- witness what happens to the work functions I just
>>> mentioned.
>>>
>>> Well, enough heresy for one night!
>>>
>>> Cheers,
>>> Bob U.
>>>
>>> > Dear Mark and All,
>>> >
>>> > I return belatedly to this short but key note of Mark's in which he
>>> > repeats his view, with which I agree, that  Energy is a kind of
>>> > information and information is a kind of energy.
>>> >
>>> > My suggestion is that it may be useful to expand this statement by looking
>>> > at both Information and Energy (mass-energy) as emergent properties of the
>>> > universe. Since we agree they are not identical, we may then look at how
>>> > the properties differ. Perhaps we can say that Energy is an extensive
>>> > property, measured primarily by quantity, and Information is an intensive
>>> > property. The difficulty is that both Energy and Information themselves
>>> > appear to have both intensive and extensive properties, measured by vector
>>> > and scalar quantities respectively. I am encouraged to say that this
>>> > approach might yield results that are compatible with advanced theories
>>> > based on the sophisticated mathematics to which Mark refers.
>>> >
>>> > I would say then that in our world it is not the question of which is more
>>> > fundamental that is essential, but that Energy and Information share
>>> > properties which are linked dynamically. In this dialectical
>>> > interpretation, the need for a 'demon' that accomplishes some function, as
>>> > in the paper referred to in John's note, is a formal exercise.
>>> >
>>> > Thank you and best wishes,
>>> >
>>> > Joseph
>>> >
>>> >
>>> > ----- Original Message -----
>>> > From: Burgin, Mark
>>> > To: Joseph Brenner
>>> > Sent: Friday, August 01, 2014 9:19 PM
>>> > Subject: Re: [Fis] Krassimir's Information Quadruple and GIT. Quintuples?
>>> >
>>> >
>>> > Dear Joseph and Colleagues,
>>> > An answer to "the perhaps badly posed question of whether information or
>>> > energy is more fundamental" is given in the book M. Burgin, Theory of
>>> > Information. The answer is a little bit unexpected:
>>> > Energy is a kind of information and information is a kind of energy.
>>> > It's a pity that very few researchers read books with advanced theories
>>> > based on sophisticated mathematics.
>>> >
>>> >  Sincerely,
>>> > Mark Burgin
>>> >
>>> >
>>> >
>>> >
>>> > On 7/31/2014 2:40 AM, Joseph Brenner wrote:
>>> >
>>> >   Dear Krassimir and Colleagues,
>>> >
>>> >   I have followed this discussion with interest but not total agreement.
>>> > As I have commented to Krassimir previously, I feel that his system,
>>> > based on symbols as outlined in his paper, is too static to capture the
>>> > dynamics of complex information processes and their value (valence). It
>>> > suffers from the same problems as that of Peirce and of set-theoretic
>>> > approaches, namely, a certain arbitrariness in the selection and number
>>> > of independent elements and their grounding in nature (or rather absence
>>> > of grounding).
>>> >
>>> >   If you will permit a naïve but well-intentioned question, why not have a
>>> > theory whose elements are quintuples? Would this not be a 'better', more
>>> > complete theory? This opens the possibility of an infinite regress, but
>>> > that is the point I am trying to make: the form of the theory is, to a
>>> > certain extent, defining its content.
>>> >
>>> >   The /development/ of any GIT should, from the beginning I think,
>>> > recognize the existence in real time, so to speak, of any new
>>> > suggestions in the context of other recent contributions of a different
>>> > form, such as those of Luhn, Hofkirchner, Marijuan, Deacon,
>>> > Dodig-Crnkovic, Wu and so on. Several of these already permit a more
>>> > directed discussion of the perhaps badly posed question of whether
>>> > information or energy is more fundamental. Otherwise, all that work will
>>> > need to be done at the end. In any case, the GIT itself, to the extent
>>> > that it could be desirable and useful, would also have to have some
>>> > dynamics capable of accepting theories of different forms. 20th Century
>>> > physics sought only identities throughout nature and the balance is now
>>> > being somewhat restored. I think keeping the diversity of theories of
>>> > information in mind is the most worthwhile strategy.
>>> >
>>> >   One of the values of Krassimir's approach is that it recognizes the
>>> > existence of some of these more complex questions that need to be
>>> > answered. I simply suggest that process language and a recognition of
>>> > dynamic interactions (e.g., between 'internal' and 'external') could be
>>> > part of the strategy.
>>> >
>>> >   Best wishes,
>>> >
>>> >   Joseph
>>> >
>>> >
>>> >
>>> >
>>> >     ----- Original Message -----
>>> >     From: Krassimir Markov
>>> >     To: Jerry LR Chandler ; FIS ; Pridi Siregar
>>> >     Sent: Saturday, July 26, 2014 10:42 AM
>>> >     Subject: [Fis] Information quadruple
>>> >
>>> >
>>> >     Dear Jerry, Pridi, and Colleagues,
>>> >
>>> >     Thank you for the nice comments!
>>> >
>>> >     To answer the questions I have to present the next step of the GIT
>>> > (General Information Theory) we are developing.
>>> >
>>> >     Let us recall in words (below, "Infos" is an abbreviation of
>>> > "Information Subject": an intelligent natural or artificial agent (system)):
>>> >
>>> >     Information is a quadruple (Source, Recipient, Evidence, Infos), or
>>> > formally i = (s, r, e, I)
>>> >
>>> >     The next step is to define the elements of the quadruple:
>>> >
>>> >     s and r are structured sets;
>>> >     e is a mapping from s into r which preserves (all or part of) the
>>> > structure of s and resolves any information expectation of I
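Krassimir's definitions might be rendered as a data structure; everything below (the pair encoding of "structured set", the names, the expectation test) is a hypothetical sketch by way of illustration, not part of the GIT itself:

```python
from dataclasses import dataclass
from typing import Callable, FrozenSet, Tuple

Element = Tuple[str, str]  # a labelled pair standing in for "structure"

@dataclass(frozen=True)
class InformationQuadruple:
    """Hypothetical rendering of i = (s, r, e, I)."""
    source: FrozenSet[Element]                         # s: structured set
    recipient: FrozenSet[Element]                      # r: structured set
    evidence: Callable[[Element], Element]             # e: mapping from s into r
    expectation: Callable[[FrozenSet[Element]], bool]  # the Infos' expectation test

    def is_information(self) -> bool:
        """A reflection counts as information only if e maps s into r
        and the mapped image resolves the subject's expectation."""
        image = frozenset(self.evidence(x) for x in self.source)
        return image <= self.recipient and self.expectation(image)
```

For example, an identity mapping whose image the subject expects yields `is_information() == True`, while an expectation the image fails to satisfy yields `False` -- mirroring the claim that not every reflection is information.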
>>> >
>>> >     I expect new questions:
>>> >     - what is an "intelligent agent"
>>> >     - what is "information expectation"
>>> >     - ...
>>> >
>>> >     If it is interesting, answers to these questions may be given in
>>> > further letters.
>>> >
>>> >     ***
>>> >
>>> >     Now I want to make some comments on the letters received (their full
>>> > texts are given below my answers).
>>> >
>>> >     Pridi: "information cannot be viewed in any absolute sense but as
>>> > internal representations of "external patterns""
>>> >     Kr.:  Yes, "reflection" is a property of Matter; "information" is
>>> > a reflection for which the information quadruple exists. But
>>> > information is not "internal representations of 'external patterns'".
>>> > It is the result of resolving the subjective information expectation,
>>> > which is a process of comparing internal and external patterns. I
>>> > know this will cause new questions.
>>> >
>>> >     Pridi: In this framework then, it seems that "information" cannot be
>>> > conceptualized without reference to both the "something out there" and
>>> > the "internal structures" of the receptor/cognitive system.
>>> >     Kr.: Yes.
>>> >
>>> >     Pridi: How can we really quantify meaningful (semantic) information
>>> > ... ?
>>> >     Kr.: By the distance between "external patterns" and "information
>>> > expectation" (sorry not to be clear, but it is a long text for further
>>> > letters).
>>> >
>>> >     Pridi: All "objective" measures (entropy, negentropy, ...) are actually
>>> > totally dependent on I1 and I2 and can never be considered as
>>> > "absolute".
>>> >     Kr.: Yes, but humanity as a whole is an Infos, and its information
>>> > expectations we assume as "absolute".
>>> >
>>> >     Pridi: ... some researchers posit that "information" may be more
>>> > fundamental than the fundamental physical (mass, time, space, amps).
>>> >     Kr.: Yes, there are other paradigms which are useful in some cases,
>>> > but in our paradigm "information" is not fundamental but "reflection"
>>> > is the fundamental.
>>> >
>>> >     Pridi: ... no "absolute truth" (whatever this means) is really gained.
>>> > "Only" a richer, more complete (subjective but coherent) world-view.
>>> >     Kr.: Yes.
>>> >
>>> >     Jerry: ... assertion of a quadruple of symbols is rather close to the
>>> > philosophy of C S Peirce (hereafter "CSP")
>>> >     Kr.: Our paradigm is not opposed to what science has explored till
>>> > now. All already-investigated information theories (Shannon, Peirce,
>>> > etc.) have to be a part or an intersection of a new GIT.
>>> >
>>> >     Jerry: ... moves these 'definitions' of individual symbols into the
>>> > subjective realm. (CSP's notion of "interpretation"?)
>>> >     Different researchers have the freedom to interpret the evidence as
>>> > they choose, including the relationships to engineering terms such as
>>> > "bandwidth".
>>> >     Kr.: Yes. But not only researchers; everybody has such freedom. Because
>>> > of this there exist advertising processes ... but for this we have to
>>> > talk in further letters.
>>> >
>>> >     Jerry: Pridi's post appropriately recognizes the tension between
>>> > objective scientific theories and subjective judgments about evidence
>>> > by different  individuals with different professional backgrounds and
>>> > different symbolic processing powers.
>>> >     Kr.: Yes, there will be tension if we assume the world is a flat
>>> > structure. But it is a hierarchical one, and what is assumed as
>>> > "subjective" at one level is assumed as "objective" for the lower levels.
>>> >
>>> >     Jerry: ... to show that these definitions of symbols motivate a
>>> > coherent symbol system that can be used to transfer information
>>> > contained in the signal from symbolic representations of entities. It
>>> > may work for engineering purposes, but is it extendable to life?
>>> >     Kr.: The goal of work on GIT is to create a coherent symbol system
>>> > which is equally valid for living creatures and artificial agents.
>>> >
>>> >     Jerry: ... this requires the use of multiple symbol systems and
>>> > multiple forms of logic in order to gain the functionality of transfer
>>> > of "in-form" between individuals or machines.
>>> >     Kr.: Yes, at least on three levels - Information, Infos, Inforaction
>>> > (Information interaction).
>>> >
>>> >     Jerry: Anybody have any suggestions on how this quadruple of symbols
>>> > can be formalized into a quantitative coherent form of communication?
>>> >     Kr.: A step toward this I give above, at the beginning of this letter,
>>> > but it is a very long journey ...
>>> >
>>> >     Thank you for creative discussion!
>>> >     Friendly regards
>>> >     Krassimir
>>> >
>>> >
>>> >
>>> >
>>> >
>>> >     -----Original Message-----
>>> >     From: Jerry LR Chandler
>>> >     Sent: Wednesday, July 23, 2014 8:57 PM
>>> >     To: FIS
>>> >     Cc: Krassimir Markov ; Pridi Siregar
>>> >     Subject: Re: [Fis] Re to Pridi: infinite bandwidth and finite
>>> > information content; CS Peirce and Chemical Nomenclature
>>> >
>>> >     Pridi, Krassimir,  List:
>>> >
>>> >     (In order to place this comment in context, and for reference, I have
>>> > copied Krassimir's "definition" of information below. My comments
>>> > follow the excellent post of Pridi.)
>>> >
>>> >     > In the physical world there exist only reflections, but not
>>> > information.
>>> >     >
>>> >     > Information " i " is the quadruple:
>>> >     > i = (s, r, e, I)
>>> >     > where
>>> >     > s is a source entity, which is reflected in r
>>> >     > r is the entity in which reflection of s exists
>>> >     > e is an evidence for the subject I which proves for him, and only for
>>> > him, that the reflection in r reflects just s, i.e. the evidence
>>> > proves for the subject what the reflection reflects.
>>> >     > I is information subject who has possibility to make decisions in
>>> > accordance with some goals - human, animal, bacteria, artificial
>>> > intelligent system, etc.
>>> >     >
>>> >     > In other words, information is a reflection, but not every
>>> > reflection is information - only reflections for which the quadruple
>>> > above exists are assumed as information by the corresponding subjects.
>>> >     >
>>> >     > For different I, information may be different because of subjects'
>>> > finite memory and reflection possibilities.
>>> >     > Because of this, a physical event with an infinite bandwidth may
>>> > have finite information content (for a concrete information subject).
>>> >     On Jul 23, 2014, at 6:45 AM, Pridi Siregar wrote:
>>> >
>>> >     > Dear Krassimir,
>>> >     >
>>> >     > Thank you for your explanation. It does give me a better
>>> > understanding of how information (beyond Shannon) can be formalized!
>>> > However, a closer look at the formalism and its semantic does raise
>>> > new questions:
>>> >     >
>>> >     > From the definition you have given, it appears that information
>>> > cannot be viewed in any absolute sense but as internal
>>> > representations of "external patterns" whose meaning depends on the
>>> > subject capturing/interpreting/storing the said patterns. In this
>>> > framework then, it seems that "information" cannot be conceptualized
>>> > without reference to the both "something out there" and the
>>> > "internal structures" of the receptor/cognitive system.
>>> >     >
>>> >     > In other words the concept of "information" lies within some
>>> > "subjective" (albeit rational) realm. I'm sure that I'm stating the
>>> > obvious for most FIS members, but a question arose upon reading
>>> > your formalism: How can we really quantify meaningful (semantic)
>>> > information beyond Shannon (that disregards semantics) and his
>>> > purely statistical framework? Or beyond Boltzmann's
>>> > entropy/Information based on micro-macro states ratios?
>>> >     >
>>> >     > When we formalize i = (s, r, e, I) there is  a "meta-level"
>>> > formalisation that is only apparent since even (s,r) reflect our own
>>> > (human) subjective world-view. We could actually write (I1(s),
>>> > I1(r), e, I2) where I1 and I2 are two distinct cognitive systems and
>>> > both of which lie at the OBJECT level of the formalizing agent which
>>> > is NEITHER I1 nor I2. All "objective" measures (entropy,
>>> > negentropy, ...) are actually totally dependent on I1 and I2 and can
>>> > never be considered as "absolute".
>>> >     >
>>> >     >
>>> >     > This leads me to a second question (sorry for the lengthy message):
>>> > there are some researchers that posit that "information" may be more
>>> > fundamental than the fundamental physical (mass, time, space, amps).
>>> > This appears (and perhaps only appears) to be at the opposite end of
>>> > the above-mentioned view. Indeed, in this framework some kind of
>>> > "universal" or "absolute" notions must be accepted as true.
>>> >     >
>>> >     > One apparent way out would be to demonstrate that information
>>> > somehow logically entails the fundamental physical entities while
>>> > accepting that we are still within a human-centered  world view. And
>>> > thus no "absolute truth" (whatever this means) is really gained.
>>> > "Only" a richer more complete (subjective but coherent) world-view .
>>> >     >
>>> >     > Am I making any sense? Any thoughts?
>>> >     >
>>> >     > Best
>>> >     >
>>> >     > Pridi
>>> >     >
>>> >
>>> >     Pridi's comments concur with many of my views w.r.t. the concept of
>>> > information.
>>> >
>>> >     Krassimir's assertion of a quadruple of symbols is rather close to the
>>> > philosophy of C S Peirce (hereafter "CSP") in one context.
>>> >
>>> >     S as symbol represents an external source of signal, that which is
>>> > independent of the individual mind and being.  This is analogous to
>>> > CSP's term "sinsign".
>>> >
>>> >     R is a thing itself.  That is, R generates S.
>>> >
>>> >     E as evidence is a vague term which implies an observer (2nd Order
>>> > Cybernetics?) that both receives and evaluates the signal (S) from the
>>> > thing (R).  CSP categorizes evidence as icon, index or symbol with
>>> > respect to the entity of observation.
>>> >
>>> >     I  as Krassimirian information is a personal judgment about the
>>> > evidence.  (Correspondence with CSP's notion of "argument" is
>>> > conceivable.)
>>> >
>>> >     Krassimir's assertion that:
>>> >     > For different I, information may be different because of subjects'
>>> > finite memory and reflection possibilities.
>>> >     > Because of this, a physical event with an infinite bandwidth may
>>> > have finite information content (for a concrete information subject).
>>> >
>>> >
>>> >     moves these 'definitions' of individual symbols into the subjective
>>> > realm. (CSP's notion of "interpretation"?)
>>> >     Different researchers have the freedom to interpret the evidence as
>>> > they choose, including the relationships to engineering terms such as
>>> > "bandwidth".
>>> >
>>> >
>>> >     Pridi's post appropriately recognizes the tension between objective
>>> > scientific theories and subjective judgments about evidence by
>>> > different  individuals with different professional backgrounds and
>>> > different symbolic processing powers.
>>> >
>>> >     The challenge for Krassimirian information, it appears to me, is to
>>> > show that these definitions of symbols motivate a coherent symbol
>>> > system that can be used to transfer information contained in the
>>> > signal from symbolic representations of entities. It may work for
>>> > engineering purposes, but is it extendable to life?
>>> >
>>> >     (For me, of course, this requires the use of multiple symbol systems
>>> > and multiple forms of logic in order to gain the functionality of
>>> > transfer of "in-form" between individuals or machines.)
>>> >
>>> >     Pridi writes:
>>> >     > How can we really quantify meaningful (semantic) information beyond
>>> > Shannon (that disregards semantics) and his purely statistical
>>> > framework?
>>> >
>>> >     One aspect of this conundrum was solved by chemists over the past
>>> > two centuries by developing a unique symbol system that is restricted
>>> > by physical constraints, yet functions as an exact mode of
>>> > communication.
>>> >
>>> >     Chemical notation, as a symbol system, along with mathematics and data,
>>> > achieves this end purpose (entelechy) of communication, for some
>>> > entities, such as the meaning of an "atomic number" as a relational
>>> > term and hence the meaning of a particular integer as both quantity
>>> > and quality.
>>> >
>>> >     This requires a dyadic mathematics and synductive logic for
>>> > sublations.
>>> >
>>> >
>>> >     Pridi writes:
>>> >
>>> >     > It does give me a better understanding of how information (beyond
>>> > Shannon) can be formalized!
>>> >
>>> >     Can you communicate how this "better understanding...   ...
>>> > formalized"  works?
>>> >
>>> >     It is not readily apparent to me how Krassimirian information can be
>>> > formalized.
>>> >
>>> >     Anybody have any suggestions on how this quadruple of symbols can be
>>> > formalized into a quantitative coherent form of communication?
>>> >
>>> >     Cheers
>>> >
>>> >     Jerry
>>> >
>>> >
>>> >
>>> >
>>> >
>>> >
>>> >
>>> >
>>> > ----------------------------------------------------------------------------
>>> >     _______________________________________________
>>> >     Fis mailing list
>>> >     Fis at listas.unizar.es
>>> >     http://listas.unizar.es/cgi-bin/mailman/listinfo/fis
>>> >
>>> >
>>> >
>>> >
>>> >
>>>
>>
>>
>> ----------
>> Professor John Collier
>> collierj at ukzn.ac.za
>> Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South
>> Africa
>> T: +27 (31) 260 3248 / 260 2292       F: +27 (31) 260 3031
>> Http://web.ncf.ca/collier
>>
>>
>>
>
>
>
>

