<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN">
<html>
<head>
</head>
<body bgcolor="#ffffff" text="#000000">
<br>
<u><big><b>Steps to a theory of reference & significance in
information<br>
</b></big></u><big><b>FIS discussion paper by Terrence W. Deacon (2015)</b></big><br>
<br>
This is the link to download the whole paper:
<a class="moz-txt-link-freetext"
href="https://www.dropbox.com/s/v5o8pwx3ggmmmnb/FIS%20Deacon%20on%20information%20v2.pdf?dl=0">https://www.dropbox.com/s/v5o8pwx3ggmmmnb/FIS%20Deacon%20on%20information%20v2.pdf?dl=0</a><br>
<br>
<i>"The mere fact that the same mathematical expression - Σ pi log pi
occurs both in statistical<br>
mechanics and in information theory does not in itself establish any
connection between these<br>
fields. This can be done only by finding new viewpoints from which
thermodynamic entropy and<br>
information-theory entropy appear as the same concept." </i>(Jaynes
1957, p. 621)<br>
<br>
<i>"What I have tried to do is to turn information theory upside down
to make what the<br>
engineers call 'redundancy' [coding syntax] but I call 'pattern' into
the primary<br>
phenomenon. . . ."</i> (Gregory Bateson, letter to John Lilly on his
dolphin research, 10/05/1968)<br>
<br>
<b>Introduction</b><br>
In common use and in its etymology the term ‘information’ has always
been associated with<br>
concepts of reference and significance—that is to say it is about
something for some use. But<br>
following the landmark paper by Claude Shannon in 1948 (and later
developments by Wiener,<br>
Kolmogorov, and others) the technical use of the term became almost
entirely restricted to refer<br>
to signal properties of a communication medium irrespective of
reference or use. In the<br>
introduction to this seminal report, Shannon points out that although
communications often have<br>
meaning, “These semantic aspects of communication are irrelevant to the
engineering problem”, which is to provide a<br>
precise engineering tool to assess the
computational and physical demands<br>
of the transmission, storage, and encryption of communications in all
forms.<br>
<br>
The theory provided a way to precisely measure these properties as well
as to determine<br>
limits on compression, encryption, and error correction. By a sort of
metonymic shorthand this<br>
quantity (measured in bits) came to be considered synonymous with the
meaning of<br>
‘information’ (both in the technical literature and in colloquial use
in the IT world) but at the cost<br>
of inconsistency with its most distinctive defining attributes.<br>
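<br>
As an aside, the quantity Shannon defined is straightforward to compute. A minimal Python sketch of source entropy in bits (illustrative only, not part of the paper):<br>

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)) of a source, in bits.

    Note that it measures only the signal's statistical properties,
    with no regard for reference or use.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit per toss; a biased coin carries less,
# because its outcomes are more predictable.
print(shannon_entropy([0.5, 0.5]))  # 1.0
print(shannon_entropy([0.9, 0.1]))  # ~0.47
```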
<br>
This definition was, however, consistent with a tacit metaphysical
principle assumed in the<br>
contemporary natural sciences: the assertion that only material and
energetic properties can be<br>
assigned causal power and that appeals to teleological explanations are
illegitimate. This<br>
methodological framework recognizes that teleological explanations
merely assign a locus of<br>
cause but fail to provide any mechanism, and so they effectively mark a
point where explanation<br>
ceases. But this stance does not also entail a denial of the reality of
teleological forms of<br>
causality nor does it require that they can be entirely reduced to
intrinsic material and energetic<br>
properties.<br>
<br>
Reference and significance are both implicitly teleological concepts in
the sense that they<br>
require an interpretive context (i.e. a point of view) and are not
intrinsic to any specific physical<br>
substrate (e.g. in the way that mass and charge are). By abstracting
the technical definition of<br>
information away from these extrinsic properties Shannon provided a
concept of information that<br>
could be used to measure a formal property that is inherent in all
physical phenomena: their<br>
organization. Because of its minimalism, this conception of information
became a precise and<br>
widely applicable analytic tool that has fueled advances in many
fields, from fundamental<br>
physics to genetics to computation. But this strength has also
undermined its usefulness in<br>
fields distinguished by the need to explain the non-intrinsic
properties associated with<br>
information. This has limited its value for organismal biology where
function is fundamental, for<br>
the cognitive sciences where representation is a central issue, and for
the social sciences where<br>
normative assessments seem unavoidable. So this technical redefinition
of information has been<br>
both a virtue and a limitation.<br>
<br>
The central goal of this essay is to demonstrate that the previously
set aside (and presumed<br>
nonphysical) properties of reference and significance (i.e.
normativity) can be re-incorporated<br>
into a rigorous formal analysis of information that is suitable for use
in both the physical (e.g.<br>
quantum theory, cosmology, computation theory) and semiotic sciences
(e.g. biology, cognitive<br>
science, economics). This analysis will build on Shannon’s
formalization of information, but will<br>
extend it to explicitly model its link to the statistical and
thermodynamic properties of its<br>
physical context and to the physical work of interpreting it. It is
argued that an accurate analysis<br>
of the non-intrinsic attributes that distinguish information from mere
physical differences is not<br>
only feasible, but necessary to account for its distinctive form of
causal efficacy.<br>
<br>
Initial qualitative and conceptual steps toward this augmentation of
information theory have<br>
been outlined in a number of recent works (Deacon 2007, 2008, 2010,
2012; Deacon &<br>
Koutroufinis, 2012; Deacon, Bacigaluppi &amp; Srivastava, 2014). In
these studies we hypothesize<br>
that both a determination of reference and a measure of significance or
functional value can be<br>
formulated in terms of how the extrinsic physical modification of an
information bearing<br>
medium affects the dynamics of an interpreting system that exhibits
intrinsically end-directed<br>
and self-preserving properties.<br>
<br>
[...]<br>
<br>
<big>A model system</big><br>
To test these principles and their relationship to reference and
significance, I and my<br>
colleagues have conceived of an empirically realizable and testable
thought experiment. As in<br>
most efforts to formalize basic physical properties it is useful to
begin with a simple model<br>
system in which all aspects of the process can be unambiguously
represented. For our purposes<br>
we describe a theoretical molecular system called an autogen, which
maintains itself against<br>
degradation by reconstituting damaged components and restoring
system integrity. This<br>
model system involves an empirically realizable molecular complex
described previously<br>
(Deacon 2012; also in Deacon & Cashman 2012; and also called an
autocell in Deacon 2006a,<br>
2007; 2009; and Deacon & Sherman 2008).<br>
<br>
[...]<br>
<br>
In this way we can use formal and simulated versions of autogenesis to
develop a measure of<br>
relative significance, in the form of “work saved.” I hypothesize that
this simple model system<br>
exemplifies the most basic dynamical system upon which a formal
analysis of informational<br>
interpretation and significance can be based.<br>
<br>
[...]<br>
<br>
In both forms, the autogenic process is provided with
information referring<br>
to its own preservation via boundary conditions (external or internal)
that are predictive of<br>
successful self-preservation. The significance of information of either
sort is assessed by the<br>
relative minimization of work per work cycle, and therefore the
decreased uncertainty of self-referential<br>
constraint preservation. In this way interpretation is analogous to the
decrease in<br>
uncertainty that is a measure of received information in Shannonian
theory, but at a teleodynamic<br>
system level.<br>
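<br>
The formal measure itself remains to be developed, but the analogy can be illustrated with a deliberately toy calculation (all probabilities hypothetical, purely for illustration): treat "significance" as the reduction in uncertainty about the system's own constraint preservation when a predictive boundary condition is interpreted.<br>

```python
import math

def entropy_bits(probs):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical per-cycle probabilities that the autogen's constraints
# survive, without and with an interpreted predictive sign.
p_blind = 0.60     # self-repair with no predictive information
p_informed = 0.95  # self-repair cued by a predictive boundary condition

def preservation_uncertainty(p):
    # Uncertainty about self-preservation, as a two-outcome entropy.
    return entropy_bits([p, 1 - p])

saved = preservation_uncertainty(p_blind) - preservation_uncertainty(p_informed)
print(f"uncertainty about constraint preservation falls by {saved:.2f} bits")
```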
<br>
Using these three variants of a simple model system I claim that we can
precisely analyze the<br>
relationships between information medium properties, intrinsically
end-directed work, and the<br>
way these enable system-extrinsic physical conditions to become
referential information<br>
significant to system ends. These relationships are not only simple
enough to formalize, but they<br>
can be simulated by computer algorithms at various levels of logical
and physical detail. I<br>
believe that creating and experimenting with these simulated autogenic
systems will enable us to<br>
reframe the mysteries of reference and significance as tractable
problems, susceptible to exact<br>
formal and empirical analysis. This is still a far cry from a theory of
information that is<br>
sufficiently developed to provide a basis for a scientific semiotic
theory, much less an<br>
explanation of how human brains interpret information, but it may offer
a rigorous physical<br>
foundation upon which these more complex theories can be developed.<br>
<br>
<div dir="ltr"><br>
<div><b>— Terry</b></div>
</div>
<div class="gmail_extra">Professor Terrence W. Deacon<br>
<div class="gmail_signature">University of California, Berkeley</div>
</div>
<br>
<pre class="moz-signature" cols="72">--
-------------------------------------------------
Pedro C. Marijuán
Grupo de Bioinformación / Bioinformation Group
Instituto Aragonés de Ciencias de la Salud
Centro de Investigación Biomédica de Aragón (CIBA)
Avda. San Juan Bosco, 13, planta X
50009 Zaragoza, Spain
Tfno. +34 976 71 3526 (& 6818)
<a class="moz-txt-link-abbreviated"
href="mailto:pcmarijuan.iacs@aragon.es">pcmarijuan.iacs@aragon.es</a>
<a class="moz-txt-link-freetext"
href="http://sites.google.com/site/pedrocmarijuan/">http://sites.google.com/site/pedrocmarijuan/</a>
-------------------------------------------------
</pre>
</body>
</html>