[Fis] Brenner 2020 New Year Lecture

Howard Bloom howlbloom at aol.com
Sat Jan 4 00:52:07 CET 2020


Excellent example, Michel.
One social group's truth is another's lie.

Different social groups see reality through different perceptual
lenses; they see through the lenses of different world views.
Again, one group's truth is another group's lie.
with warmth and oomph--howard




-----Original Message-----
From: Michel Petitjean <petitjean.chiral at gmail.com>
To: Mark Johnson <johnsonmwj1 at gmail.com>
Cc: fis <fis at listas.unizar.es>
Sent: Fri, Jan 3, 2020 7:25 am
Subject: Re: [Fis] Brenner 2020 New Year Lecture

Dear Joe, Jose, Stan, Mark, and All,

First let me congratulate Joe for his New Year Lecture.
Providing definitions of information, misinformation and
disinformation is extremely useful.
It is an excellent starting point.
- "Misinformation is false information": OK, but the true vs. false
status is an idealized view, because it cannot always be decided,
except perhaps in specific cases (e.g. a mathematical theorem with
clear conditions of application).
It can also be time-dependent: a scientific fact may be held true one
day and false the next, due to more accurate measurements or other
developments.
Some "facts" remain a matter of opinion as long as they cannot be
proved (only argued).
An optical illusion can lead us to classify as a fact something that is not one.
- Disinformation: the "bad vs. good" intentionality is extremely
difficult to establish (Jose made pertinent remarks about that).
Let me propose a polemical example: somebody hears the voice of God
telling him to kill somebody else (e.g. Abraham and Isaac, fanatical
believers of any religion, etc.), and nobody is in a position to stop him.
Is he right to perform the action that he is convinced he must do?
He may think that even if he is not sure he is right, at least he is
doing what God said, and in any case those who deserve to survive will
survive in paradise.
Discussing this situation is not the goal here; it is just a polemical example.
Possibly, classifying information into categories is as difficult as
classifying animals or plants, or classifying living vs. non-living
organisms (plants? viruses? etc.).
In any case, the first step is to propose definitions.
They can be criticized, but at least they exist and they are a basis
to move forward.
So, many thanks to you, Joe!

Best wishes and Happy New Year.

Michel.

Michel Petitjean
Université de Paris, BFA, CNRS UMR 8251, INSERM ERL U1133, F-75013 Paris, France
Phone: +331 5727 8434; Fax: +331 5727 8372
E-mail: petitjean.chiral at gmail.com (preferred),
        michel.petitjean at univ-paris-diderot.fr
http://petitjeanmichel.free.fr/itoweb.petitjean.html


On Fri, 3 Jan 2020 at 04:16, Mark Johnson <johnsonmwj1 at gmail.com> wrote:
>
> Dear Joseph and Stan,
>
> First of all, thanks to Joseph for this provocation about disinformation. I think it's an excellent assessment of our discussions on FIS, and draws attention to a very practical problem of truth.
>
> I'm interested in Stan's comments:
> 1) the idea that disinformation is a "report".
> 2) the idea that the "searcher determines the comprehensibility of information for the searcher"
>
> 1) raises a question about whether information is form or process. Is a report a "form" (albeit one that then contributes to a communication process)? Stan - do you mean to say that such a "report" is a kind of artefact? I'm sympathetic to this because one of the mistakes of "big data" (IMO) is that sight is lost of the analytical "document" which is the end result of data processing. It is given the "status" of truth by the agencies involved in its creation (much in the way Searle describes "status functions").
>
> 2) raises a question about meaning, and specifically about where and how the selection mechanism for determining meaning is constructed. Immediately we are back to the difference between (dis)information and meaning. If we put the emphasis on the agents involved (whether searcher or producer), I can't see how we end up with a theory of (dis)information. Rather, we get a theory of agency (or a theory of a transcendental subject). Isn't it reasonable to suggest that a theory of information must be trans-individual in the sense that someone like Simondon (or, more recently, Yuk Hui, whose "Recursivity and Contingency" spells this out most elegantly) envisages?
>
> Following this, I have some questions. Might a trans-individual perspective throw some light on disinformation as something which destroys the coherence of the "information environment"? (While I agree with Joseph's judgement on Floridi, I think Floridi's information ethics, which he sees as a variety of environmental ethics, is good). Secondly, is coherence in some way essential to the construction of a selection mechanism for determining meaning? If disinformation destroys the operation of that selection mechanism, then disinformation can be a very powerful weapon.
>
> Finally, I want to raise a question about the last point that Joseph makes. What is an institution? I'm not sure about this definition. What about families, churches, orchestras, universities, businesses? They're not reducible to their messages. They may be reducible to their capacity to conserve information in their operations. I think we have to be careful here: the idea of creating an "information police" is unattractive, or at least open to abuse (as the Conservative party in the UK showed in the recent election when they masqueraded as a "Fact Checking" service on Twitter!).
>
> Best wishes,
>
> Mark
>
>
> On Thu, 2 Jan 2020 at 20:49, Stanley N Salthe <ssalthe at binghamton.edu> wrote:
>>
>> Joseph -- Here I post a pdf version of your writing, which makes it easier to comment upon. In it I comment upon your introduction
>> only. The rest does not, in my view, involve concepts, merely examples.
>>
>> STAN
>>
>> On Thu, Jan 2, 2020 at 3:34 AM Joseph Brenner <joe.brenner at bluewin.ch> wrote:
>>>
>>> Dear FIS Friends and Colleagues,
>>>
>>>
>>>
>>> My best wishes for a healthy, happy and productive New Year!
>>>
>>>
>>>
>>> As requested by Pedro, following a dialogue with him on the subject of disinformation, I attach below a few pages that I have prepared on the subject. I have also attached the file, but the system may not accept it. If anyone needs a separate Word copy, please let me know.
>>>
>>>
>>>
>>> I look forward to your comments, criticisms and suggestions of examples. I will let the format for summaries ‘emerge’ from your responses and the subsequent discussion.
>>>
>>>
>>>
>>> Cheers,
>>>
>>>
>>>
>>> Joseph, a.k.a. Joe
>>>
>>>
>>>
>>> STRUCTURES OF INFORMATION AND DISINFORMATION
>>>
>>>
>>>
>>> Joseph Brenner
>>>
>>>
>>>
>>> These notes summarize some of my recent thoughts about disinformation as a valid subject of discussion within FIS. They have emerged in part from the massive amounts of disinformation produced by, among others, the current Administration of the United States and its most partisan supporters. The notes are not intended for publication as such, but, as usual, to generate exchanges. I certainly urge readers to provide their own examples of forms of disinformation to complement the few noted below.
>>>
>>>
>>>
>>>
>>>
>>> INTRODUCTION
>>>
>>>
>>>
>>> 1 The Structure of Information
>>>
>>>            Those of us who have been able to learn from the FIS discussions of the last twenty years will realize that they have not led to a fully agreed-upon definition of information. This is perhaps an indication that a single ‘clear’ definition is neither possible nor desirable, but even this meta-question has not resulted in a consensus.
>>>
>>>            A key related concept, only touched on in prior discussion, is the structure of information. In the comments on the topic “Revisiting the Fluctuon Model”, of which I was one of the two organizers, Loet Leydesdorff wrote (25 Sep 2010, in part): “In the Informational Structural Realism of Floridi, reality is an informational structure. The It-part (of the It-from-Bit model) is in the “structure” which assumes the specification of a system of reference. In evolutionary terms: structure is deterministic/selective; Shannon-type information measures only variation/uncertainty.” The immediate corollary is that the structure of information is both real and dynamic. It is a meaningful process, in my opinion insufficiently recognized (cognized) as such. The idea that structure is an ontological/dynamic process is to be found in Stéphane Lupasco’s work “Qu’est-ce qu’une structure?” In contrast, Floridi’s description is static and epistemological only. More familiar to most readers will be the work of Anthony Giddens, who captured the dynamic properties of processes with the terms ‘structuring’ or ‘structuration’, also used in French by Lupasco. Other key structural properties of information include breadth, a scalar measure applicable to categorization, and comprehension (or comprehensibility), presumably a higher-dimensional parameter.
>>>
>>>            In this period of 2011 and after, additional seminal ideas about the structural aspects of information were presented by Mark Burgin, Terrence Deacon and Stuart Kauffman and their colleagues, centered on the concept of information as a constraint on the evolution of processes. Deacon went further in relating information to absence rather than only to the uncertainty of Shannon’s original concept. I expanded this to the absence-presence duality. Today, I would ask: what can we say about the structure of information that is new, that we have learned in the last 9+ years?
>>>
>>>
>>>
>>> 2 The Structure of Disinformation
>>>
>>>            Some people have suggested that disinformation is radically different in kind from information. I believe that disinformation has a structure close to, if not identical to, that of information. The big differences lie in the intentionality behind it, its meaning content, and its consequences. For discussion, we may try to see if there are ‘signs’ of falsity and of the intent to deceive that are perceptible and hence may characterize disinformation. In any case, its consequences can be the same as those of misinformation, but the intentionality is clearly different, as indicated below.
>>>
>>>
>>>
>>>
>>>
>>> DEFINITIONS
>>>
>>>
>>>
>>> 1. Information
>>>
>>>              For the purposes of this exercise, I will give my own definition of information as a process of informing, a transfer of knowledge from one human being to another that is meaningful in the sense of having value for his/her survival or pleasure. It supervenes on the definition of information as data (Floridi). The theory of information includes its communication or messaging, Angeletics in Capurro’s term.
>>>
>>>
>>>
>>> 2. Misinformation
>>>
>>>              Misinformation is false information that has been generated and transferred by accident, without any intention on the part of the sender. Negative consequences, even if they are disastrous, do not imply negative intent, but the sender may still be held responsible for them. Negligence, at least in a somewhat decent society, cannot be allowed to go without a suitable reaction.
>>>
>>>
>>>
>>> 3. Disinformation
>>>
>>>              As I have just learned from Wikipedia, we have Joseph Stalin to thank for the invention (and use) of the term dezinformatsiya, which then entered French and English. Today, disinformation has become a major topic of concern at the level of the European Union, as evidenced by the March 2019 study “Regulating disinformation with artificial intelligence: Effects of disinformation initiatives on freedom of expression and media pluralism”, https://www.europarl.europa.eu/RegData/etudes/STUD/2019/624279/EPRS_STU(2019).
>>>
>>> For me, disinformation – disinforming – is an intentional process whose objective is to subvert information for criminal and/or selfish purposes. It is characterized by having no meaning, since there is no dialectical relation between message and intent; any meaning, for the disinformer, is subordinate to his/her underlying – lying – objective. In other words, disinformation is a lie, characterized by the logical properties of semantic, mathematical and visual paradoxes, namely the perceptible oscillation between limiting binary logical states: yes or no, truth or falsity, 0 and 1. In the social domain, disinformation is a tool, a method of attempting domination by any means, and is ipso facto immoral or unethical.
>>>
>>> My definition can be compared with that of the EU study: “false, inaccurate or misleading information designed, presented and promoted to intentionally cause public harm or for profit”. The difference from misinformation lies, as described above, in its intentionality.
>>>
>>>
>>>
>>>
>>>
>>> STRUCTURES OF DISINFORMATION
>>>
>>>
>>>
>>> 1. Forms
>>>
>>>              Typical forms of disinformation consist of messages that are incomplete and misleading as well as directly false. Disinformation in this sense is close to lying by omission, and in fact one could consider disinformation as describing lying in the social sphere. People who withhold information about their physical condition in connection with their employment are ‘engaging’ in this form of disinformation, and I point here to the utility of using the verb form instead of the noun.
>>>
>>>
>>>
>>> 2. Domains. Socio-politics of Disinformation
>>>
>>>              Disinformation in all walks of life is so prevalent that it becomes – almost – taken for granted. This is an increasingly great danger for society in view of the influence of social media, some of which can now only be described as anti-social media. In fact, the only question may be to what extent political and narrow economic objectives can be maintained without disinformation.
>>>
>>>            There is no obvious solution, as we are very close here to the domain of belief, from which science is excluded. There is no overlap or interaction possible between the information/disinformation content of the following two statements: “Climate change is an impending disaster that there is almost no time left to avoid,” and “Climate change is a hoax propagated by Communists to weaken the U.S. economy.”
>>>
>>>
>>>
>>> 3. Philosophy
>>>
>>>              Philosophy and the social sciences in general benefit from the vast capacities for identification of sources that are now available. On the other hand, these capacities are more than offset by the information explosion, such that finding all relevant references is still a difficult process. Disinformation can come down to a very specific, at least partly intentional, process of ignoring easily available references.
>>>
>>> Other methods include swamping of new results by overemphasis on classical sources of only historical value.
>>>
>>>
>>>
>>> 4. Scientific Literature
>>>
>>>            In general, in science, disinformation becomes roughly equivalent to fraud, the dissemination of data not obtained by actual experiments. However, for data with major social implications, such as data on climate change, its misuse is a clear example of disinformation with a major ideological component, as in Section 2 above.
>>>
>>>              In addition, false accusations of fraud or plagiarism are usually supported by a mass of disinformation which can become auto-catalytic.
>>>
>>>
>>>
>>> 5. Advertising. Gambling and Lotteries
>>>
>>>              In my opinion, there is a difference between making people aware of the availability of consumer goods and services and advertising them aggressively. The latter generally involves recourse to clearly unethical practices based on psychological tools known since antiquity, whose effectiveness is unfortunately enhanced by modern technology. ‘Creating demand’ is an accepted professional objective, despite probably being counterproductive for the common good.
>>>
>>>              Promotion of gambling and lotteries always overemphasizes the potential gains compared to their low probability in any specific instance. To be fair, some TV advertising for sports now includes the message “Bet Responsibly”, calling attention to possible, if not probable, losses which the bettor might not be able to afford. This opens up the entire domain of the ethics of producing and marketing goods that are not vital to existence. The authors of disinformation are watching the outcome of the related debate closely.
>>>
>>>
>>>
>>> 6. “The Informer”. Délation or Denouncement
>>>
>>>              As a different topic in these notes, I would like to mention the 1935 movie “The Informer”, starring Victor McLaglen. The main character provides a canonical example of a negative transfer of information that is true! What is involved is the treacherous transfer of correct information about one group to its controlling opposition, with disastrous results for the former, in this case during the ‘troubles’ in Ireland. The disinformation, of course, lies in the informer’s concealment of his intentions and actions. The French term délation (and native French speakers may wish to correct this) always has for me the implication that the denouncement carries disinformation.
>>>
>>>
>>>
>>> 7. Combating Disinformation
>>>
>>>              There are several levels on which disinformation can be combated: 1) on the personal level, by correcting false information in one’s personal network; 2) on the institutional level. Let me define an institution as a group that is present in the public domain with sufficient resources to ensure the reception of its messages by a wide audience. I separate this from individuals reaching masses of people through social media. Let us assume that the Foundations of Information Science initiative is such an institution. Then its members – we – must, can, and should report instances of disinformation to an organ within the institution that would ensure their dissemination.
>>>
>>>              I have no idea whether or not this would ‘work’, but I feel that it could do no harm for anyone with access to the FIS site to see a regularly updated section listing examples of disinformation which we have encountered. Many further details on regulatory and technological responses to disinformation are provided in the EU study, and some of them should be addressed in the forthcoming discussion.
>>>
>>>
>>>
>>>
>>>
>>
>
>
>
> --
> Dr. Mark William Johnson
> Institute of Learning and Teaching
> Faculty of Health and Life Sciences
> University of Liverpool
>
> Phone: 07786 064505
> Email: johnsonmwj1 at gmail.com
> Blog: http://dailyimprovisation.blogspot.com

_______________________________________________
Fis mailing list
Fis at listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis