<div dir="ltr"><div dir="ltr">
<p class="gmail-p1"><span style="line-height:1.5">Pedro has written:</span><br></p><p class="gmail-p1"><span class="gmail-s1">Thanks to Arieh and Stan. I agree that given the sheer variety of cases --or systems-- where physical entropy may be meaningfully considered, and calculated, a common interpretation for all of them is problematic. </span></p>
<p class="gmail-p2"><span style="line-height:1.5">But thinking in the ordinary chemical reactions and molecules present in life, my proposal of "freedom" or "dynamic freedoms of the system" has the advantage that brings to light --and dovetails-- the other factor in Gibbs free energy: the enthalpy, the energy change in bonds. So on the one side we have the increased (or decreased) motions status (position & momenta, also including vibrations & oscillations), thus the degrees of freedoms. And on the other side the change in internal energy associated </span>to that<span style="line-height:1.5"> "making and breaking of bonds". Freedoms versus bonds. Entropy versus enthalpy. And depending on the respective signs they should be added, rested, etc. At least it has the advantage of being clearly </span>explainable<span style="line-height:1.5"> to students without the usual mystical-mystery-tour.</span><br><span class="gmail-s1"></span></p>
<p class="gmail-p1"><span class="gmail-s1"> S: For living systems, I think we can assume a relatively stable enthalpy, with which we can then see that energy changes created by system activity will result in a USUAL and continuing entropy production, which will institute a necessary search for more sources of energy by the system. In the best cases (the living system survives) that search is successful. Enthalpy considerations seem to me to delver no freedom from energy loss that must be replenished (by way of, at least temporarily, producing MORE loss (entropy)</span></p>
<p class="gmail-p2"><span style="line-height:1.5">PM: What Stan says is OK, but it has </span>implicit the proverbial<span style="line-height:1.5"> (scientific? informational?) observer. We have had plenty of discussions on that...</span><br><span class="gmail-s1"></span></p>
<p class="gmail-p1"><span class="gmail-s1"> S: It is observations (of one kind or other) that generate information, giving this discussion club much </span><span style="line-height:1.5">to consider!</span></p>
<p class="gmail-p2"><span class="gmail-s1"> PM: </span><span style="line-height:1.5">Joseph and Jerry made mention, if I don't remember it wrongly, to Pauli´s exclusion principle as counteracting entropy.</span></p>
<p class="gmail-p1"><span class="gmail-s1"> S: Since we here descend into the Quantum level, it’s not clear to me that the observable entropy of an ordinary system will be significantly changed by PALPABLE CHANGES at this level. Is there evidence for that changeably adding to the disorder of a system?</span></p>
<p class="gmail-p2"><span style="line-height:1.5">PM: By opening the possibility of bonding between heterogeneous atoms & molecules, there is evident entropy decreases, but at the same time the heterogeneity of the matter systems is remarkably increased, at least as "entropy of mixing."</span><br><span class="gmail-s1"></span></p>
<p class="gmail-p1"><span class="gmail-s1"> S: So here we have more entropy production! I don’t see how these considerations negate my championing *what I would call” the Realm of Entropy!</span></p>
<p class="gmail-p1"><span class="gmail-s1">STAN </span></p></div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Thu, Jan 21, 2021 at 7:45 AM Pedro C. Marijuan <<a href="mailto:pcmarijuan.iacs@aragon.es">pcmarijuan.iacs@aragon.es</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left-width:1px;border-left-color:rgb(204,204,204);border-left-style:solid;padding-left:1ex">
<div>
<div>Dear All,</div>
<div><br>
</div>
<div>Thanks to Arieh and Stan. I agree that given the sheer variety of cases --or systems-- where physical entropy may be meaningfully considered, and calculated, a common interpretation for all of them is problematic. But thinking of the ordinary chemical reactions and molecules present in life, my proposal of "freedom" or "dynamic freedoms of the system" has the advantage that it brings to light --and dovetails with-- the other factor in Gibbs free energy: the enthalpy, the energy change in bonds. So on one side we have the increased (or decreased) status of motions (positions & momenta, also including vibrations & oscillations), thus the degrees of freedom. And on the other side the change in internal energy associated with that "making and breaking of bonds". Freedoms versus bonds. Entropy versus enthalpy. And depending on their respective signs they should be added, subtracted, etc. At least it has the advantage of being clearly explainable to students without the usual mystical-mystery-tour.<br>
</div>
<div>What Stan says is OK, but it implicitly assumes the proverbial (scientific? informational?) observer. We have had plenty of discussions on that... <br>
</div>
<div>Joseph and Jerry made mention, if I remember correctly, of Pauli's exclusion principle as counteracting entropy. By opening the possibility of bonding between heterogeneous atoms & molecules, there are evident entropy decreases, but at the same time the heterogeneity of the matter systems is remarkably increased, at least as "entropy of mixing." Depending on the initial/final conditions considered, it may be tricky to establish the entropy balance. Shu-Kun worked years ago on the paradoxes associated with that entropy of mixing.<br>
</div>
<div><br>
</div>
<div>Best greetings to all!</div>
<div>--Pedro</div>
<div><br>
</div>
<div>On 18/01/2021 at 21:28, Stanley N Salthe wrote:<br>
</div>
<blockquote type="cite">
<div dir="ltr"> Pedro -- Regarding entropy definitions in respect
to information, I have found it useful to consider that what in
Information Theory
<div>is often/typically <span style="line-height:1.5">referred to
as 'information' ought, instead, to be referred to as
'information carrying </span>capacity<span style="line-height:1.5">', while a sample taken </span></div>
<div><span style="line-height:1.5">from this would then be
'information' neat. In other words, any system we encounter
will have a capacity to deliver information upon</span></div>
<div><span style="line-height:1.5">investigation. </span></div>
<div><span style="line-height:1.5">STAN</span></div>
</div>
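<div>A small numerical illustration of this capacity-versus-sample distinction (a sketch with an invented three-outcome source; the probabilities are assumptions, not anything from the thread):</div>
<pre>
import math

# A hypothetical three-outcome source (illustrative probabilities only).
source = {"a": 0.5, "b": 0.25, "c": 0.25}

# "Information-carrying capacity": the Shannon entropy of the source, in bits.
capacity = -sum(p * math.log2(p) for p in source.values())

# "Information" delivered by one sample: the surprisal of a particular
# observed outcome, e.g. observing "c".
sample = "c"
surprisal = -math.log2(source[sample])

print(f"carrying capacity of the source: {capacity:.2f} bits")          # 1.50 bits
print(f"one observation of '{sample}' delivers: {surprisal:.2f} bits")  # 2.00 bits
</pre>
<div>The capacity is a property of the whole source; the surprisal belongs to the single observation actually made.</div>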
<br>
<div class="gmail_quote">
<div dir="ltr" class="gmail_attr">On Mon, Jan 18, 2021 at 6:32
AM Arieh Ben-Naim <<a href="mailto:ariehbennaim@gmail.com" target="_blank">ariehbennaim@gmail.com</a>> wrote:<br>
</div>
<blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left-width:1px;border-left-color:rgb(204,204,204);border-left-style:solid;padding-left:1ex">Dear
Pedro,
<div>Sorry for the long delay in my answer.</div>
<div>I will answer briefly since a long answer would take too
much time and space.</div>
<div>1. I fully agree that entropy and SMI should be given different names. The fact that Shannon called his measure (the SMI) entropy was a grave mistake.</div>
<div>This has caused great confusion in both thermodynamics and information theory. (Confusing entropy with SMI, and SMI with "information", is very common.)</div>
<div>2. I agree that in some specific processes the entropy change may be associated with an increase in "freedom" or with "spreading." </div>
<div>This is also true for "disorder"; however, none of these interpretations can be derived from the definition of entropy. So, in general, they are not correct interpretations of entropy.</div>
<div>3. Of course the Gibbs energy is the quantity which is
more fundamental in a system at constant T and P. The
formulation of the second law in terms of entropy is valid
only for isolated systems.</div>
<div>Best wishes</div>
<div>Arieh</div>
<div><br>
On Wednesday, January 13, 2021, Pedro C. Marijuan <<a href="mailto:pcmarijuan.iacs@aragon.es" target="_blank">pcmarijuan.iacs@aragon.es</a>>
wrote:<br>
<blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left-width:1px;border-left-color:rgb(204,204,204);border-left-style:solid;padding-left:1ex">
<div>
<div>Dear Arieh and FIS Colleagues,</div>
<div><br>
</div>
<div>Again, regarding the interpretation of entropy as information or uncertainty (with respect to locations & momenta), my suggestion is that we should call the extensive magnitude we are measuring "something else". Otherwise, we easily give room to all those pseudo-interpretations you wonderfully describe and to the general confusion with information. When I was teaching a bio-discipline to engineering students, I was using the term "freedom". You are measuring the degrees of freedom of the system, I would tell them, how much they have changed, either increasing or decreasing, along the process. And that quantity of degrees of freedom, when multiplied by the temperature, gives the energy gained or lost via the entropy change. So what the application of SMI to the system tells us is a statistical measurement of the degrees of freedom. In short: the new freedom status of the system. It may not be a terrific theoretical interpretation, but my students loved it (I confess that previously I was using Lambert's "dispersal", until I saw your criticisms on the web).</div>
<div><br>
</div>
<div>All of this is well depicted in Gibbs' formula for the free energy of chemical reactions, with the part due to the ENTHALPY change (basically internal energy) and the part due to the ENTROPY change x Temp. Given that most biological reactions --all!-- occur at room temperature (or far lower), the resulting contribution of entropy is pretty moderate. Actually, I was checking chemical tables of many biological reactions, and the contribution of entropy was generally an order of magnitude lower than enthalpy--around 10% or less. I wonder --following Howard's comment days ago-- at all the obsession with entropy and the universal neglect of enthalpy. It is enthalpy (or internal energy, or free energy) that is the big concern of all biological systems, "the food of today", rather than the negentropy or low entropy you graciously comment on. <br>
</div>
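<div>As a rough order-of-magnitude check (illustrative numbers only, not taken from any of the tables mentioned above): at around room temperature an entropy change of 100 J/(mol K) is worth</div>
<pre>
T\,\Delta S \;\approx\; 298\ \mathrm{K} \times 100\ \mathrm{J\,mol^{-1}\,K^{-1}}
            \;\approx\; 30\ \mathrm{kJ\,mol^{-1}},
</pre>
<div>which can be set against typical bond enthalpies of several hundred kJ/mol when bonds are actually made or broken.</div>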
<div><br>
</div>
<div>I also had a comment regarding Joseph's and Jerry's points on the Pauli Exclusion Principle, but better for another day.</div>
<div><br>
</div>
<div>Best wishes to all--Pedro<br>
</div>
<div><br>
</div>
<div>On 08/01/2021 at 19:22, Arieh Ben-Naim wrote:<br>
</div>
<blockquote type="cite">
<div>
<div dir="auto">Dear Pedro and everyone else
interested.</div>
<div dir="auto"><br>
</div>
<div dir="auto">I will briefly answer your
questions, as I have discussed these in more
details in my previews publications.</div>
<div dir="auto">Regarding the interpretation of
entropy as a measure of information and as
uncertainty.</div>
<div dir="auto">In my view these are the only valid
interpretations of entropy, provided that one
specifies information about what, and uncertainty
with respect to what. Both of these are discussed
in my book on “Information Theory” and in
“Entropy: the Greatest Blunder in the History of
Science.”</div>
<div dir="auto"><br>
</div>
<div dir="auto">Regarding the question of dimensions
for entropy, I have also discussed that in my book
on “Farewell to Entropy.”</div>
<div dir="auto">In my view this is a result of
historical “accident” that temperature was given a
special dimensions that were not used before in
physics:”degrees” </div>
<div dir="auto">If temperature was given the
dimensions of energy, then entropy would become
dimensionless. This would ease the acceptance of
the only correct interpretation of entropy either
as a measure of information or a measure of
uncertainty (in both cases the information and
uncertainty with respect to the distribution of
locations and momenta of all particles of the
system at equilibrium)</div>
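<div dir="auto">A minimal sketch of the dimensional point, using the standard Boltzmann form of the entropy:</div>
<pre>
S \;=\; k_B \ln W, \qquad k_B \approx 1.38\times 10^{-23}\ \mathrm{J\,K^{-1}}

% If temperature is measured in energy units (equivalently, k_B is set to 1),
% then S = \ln W is a pure number, and T\,\Delta S still carries units of energy.
</pre>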
<div dir="auto"><br>
</div>
<div dir="auto">In my recent book I have criticized
both Shermer’s and Pinker’s misuses of the concept
of entropy in their books.</div>
<div dir="auto"><br>
</div>
<div dir="auto">I hope I have answered your
questions, though very briefly.</div>
<div dir="auto">Best regards</div>
<div dir="auto">Arieh</div>
<br>
<div class="gmail_quote">
<div>On Fri, Jan 8, 2021 at 15:40 Pedro C.
Marijuan <<a href="mailto:pcmarijuan.iacs@aragon.es" target="_blank">pcmarijuan.iacs@aragon.es</a>>
wrote:<br>
</div>
<blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left-width:1px;border-left-color:rgb(204,204,204);border-left-style:solid;padding-left:1ex">Dear
Arieh & FISers,<br>
<br>
My appreciation for your elegant and lively contribution. Reading about labeling food with its "negentropy content", or advising on low-entropy food (icy water!), was a most humorous way to criticize all the nonsense and "mystical mystery tours" around entropy. In particular, Atkins' (& Dawkins') quasi-religious views have always been unswallowable for me. Anyhow, there are two further aspects I would like to hear from you about.<br>
<br>
First, to put it most briefly: what basic, visual image would you convey for an intuitive understanding of entropy beyond the technicality of SMI? Uncertainty, chaos, disorder, diffusion, dispersal... so many terms around! I saw years ago some heated exchanges you had with Frank Lambert, the champion of entropy as energy dispersal. I also have a longish comment on how Steven Pinker approaches entropy-information-evolution-cognition as the four fundamental concepts in our scientific understanding of the world. He is wrong (IMO) in all of them! Although he is an extraordinary social psychologist, the basic scientific background he proposes is tainted. If we could give a sounder "tetrad", even with those same concepts, it would be a great outcome. Clarifying entropy in all the aspects we could advance would be a terrific step.<br>
<br>
And second (maybe to be discussed later on), do you accept "dimensions" for entropy? The units of joules per degree K -- aren't they derived from having introduced Boltzmann's constant? Why do we need separate units for temperature and energy at all?<br>
<br>
Thanks for all the exchanges so far!<br>
<br>
Best--Pedro<br>
<br>
-- <br>
-------------------------------------------------<br>
Pedro C. Marijuán<br>
Grupo de Bioinformación / Bioinformation Group<br>
<br>
<a href="mailto:pcmarijuan.iacs@aragon.es" target="_blank">pcmarijuan.iacs@aragon.es</a><br>
<a href="http://sites.google.com/site/pedrocmarijuan/" rel="noreferrer" target="_blank">http://sites.google.com/site/pedrocmarijuan/</a><br>
-------------------------------------------------<br>
<br>
<br>
<br>
</blockquote>
</div>
</div>
-- <br>
<div dir="ltr">Prof. Arieh Ben-Naim<br>
Department of Physical Chemistry<br>
The Hebrew University of Jerusalem<br>
Jerusalem, 91904<br>
Israel</div>
<br>
<fieldset></fieldset>
</blockquote>
<p><br>
</p>
<pre cols="72">--
-------------------------------------------------
Pedro C. Marijuán
Grupo de Bioinformación / Bioinformation Group
<a href="mailto:pcmarijuan.iacs@aragon.es" target="_blank">pcmarijuan.iacs@aragon.es</a>
<a href="http://sites.google.com/site/pedrocmarijuan/" target="_blank">http://sites.google.com/site/pedrocmarijuan/</a>
------------------------------------------------- </pre>
<div><br>
<table style="border-top-width:1px;border-top-style:solid;border-top-color:rgb(211,212,222)">
<tbody>
<tr>
<td style="width:55px;padding-top:18px"><a href="https://www.avast.com/sig-email?utm_medium=email&utm_source=link&utm_campaign=sig-email&utm_content=emailclient" target="_blank"><img src="https://ipmcdn.avast.com/images/icons/icon-envelope-tick-round-orange-animated-no-repeat-v1.gif" alt="" style="width: 46px; height: 29px;" width="46" height="29"></a></td>
<td style="width:470px;padding-top:17px;color:rgb(65,66,78);font-size:13px;font-family:Arial,Helvetica,sans-serif;line-height:18px">Libre
de virus. <a href="https://www.avast.com/sig-email?utm_medium=email&utm_source=link&utm_campaign=sig-email&utm_content=emailclient" style="color:rgb(68,83,234)" target="_blank">www.avast.com</a> </td>
</tr>
</tbody>
</table>
<a href="#m_8619682629624782441_m_7084316037359452892_m_-2797674480543180183_DAB4FAD8-2DD7-40BB-A1B8-4E2AA1F9FDF2" width="1" height="1"> </a></div>
</div>
</blockquote>
</div>
<br>
<br>
-- <br>
Prof. Arieh Ben-Naim<br>
Department of Physical Chemistry<br>
The Hebrew University of Jerusalem<br>
Jerusalem, 91904<br>
Israel<br>
</blockquote>
</div>
<br>
<fieldset></fieldset>
</blockquote>
<p><br>
</p>
<pre cols="72">--
-------------------------------------------------
Pedro C. Marijuán
Grupo de Bioinformación / Bioinformation Group
<a href="mailto:pcmarijuan.iacs@aragon.es" target="_blank">pcmarijuan.iacs@aragon.es</a>
<a href="http://sites.google.com/site/pedrocmarijuan/" target="_blank">http://sites.google.com/site/pedrocmarijuan/</a>
------------------------------------------------- </pre>
</div>
_______________________________________________<br>
Fis mailing list<br>
<a href="mailto:Fis@listas.unizar.es" target="_blank">Fis@listas.unizar.es</a><br>
<a href="http://listas.unizar.es/cgi-bin/mailman/listinfo/fis" rel="noreferrer" target="_blank">http://listas.unizar.es/cgi-bin/mailman/listinfo/fis</a><br>
</blockquote></div>