Exclusive

October 18, 2023
EDMO – Preliminary analysis of the Israel/Hamas conflict-related disinformation

Hamas' attack and Israel's subsequent military response triggered a wave of disinformation on social media

Hamas' surprise attack on Israeli towns near the Gaza border on the morning of Saturday, 7 October 2023, together with Israel's subsequent military response, triggered a considerable wave of disinformation across all major social media platforms.

EDMO reviewed dozens of fact-checking articles published by member organizations of the EDMO network and the numerous links to disinformation content they contain, plus others found on different platforms through proactive research, as well as several analyses produced both by member organizations and by external ones. The following are the major findings of our preliminary analysis.

First of all, the disinformation that flooded social media conveyed several distinct disinformation narratives – understood as the clear message emerging from a consistent set of content items that can be demonstrated to be false using fact-checking methodology – together with a great deal of click-bait (false content that does not push any particular message). Narratives promoting messages favorable or hostile to each of the warring parties can be identified.

The main disinformation narratives identified are:

  • Justification of Hamas’ actions: false content about alleged war crimes committed by Israel was detected (e.g. the supposed bombing of a church in Gaza, or the use of phosphorus bombs), sending the message that Hamas’ actions were somehow justified by Israeli behavior.
  • Dehumanization of terrorists: many false news, videos and images concerned alleged episodes of extreme brutality by Hamas members. It is true that the Islamist terrorists did unspeakable things during the past week, but disinformation narratives exaggerating the cruelty of their actions seem ultimately aimed at portraying them as “not human”, thereby justifying any kind of action against them.
  • Exaggerated support for Hamas and its military achievements: false content spread with the goal of exaggerating the support that Hamas’ actions received in the rest of the world and its military victories against Israel.
  • Attacks against those who ask Israel to respect human rights: false content spread to accuse those who criticize the Israeli government’s decisions and/or provide help to Palestinian civilians of antisemitism and/or of supporting terrorists.

Moreover, conspiracy theories circulated on social media together with a significant number of false news items targeting Ukraine, in particular claiming that the weapons used by Hamas during its terrorist attacks on Israeli civilians had been supplied by Ukraine (Dmitry Medvedev, former prime minister of Russia, tweeted a message with this accusation on October 9).

Secondly, the systems put in place by social media platforms to counter disinformation worked only partially: as reported by organizations of the EDMO network (e.g. in Germany and in Ireland), those systems were not able to significantly prevent false and misleading content from spreading.

  • Labeling, downgrading and providing context for false or misleading content is currently the best practice from the fact-checkers’ point of view, but it is still not fast and widespread enough, in particular for the smaller languages of the EU;
  • Removals in accordance with the relevant platform policies are usually not fast enough to prevent the massive circulation of false content, and fact-checking organizations generally consider this practice to have various limits and issues (e.g. it is often not fully transparent and can attract accusations of arbitrary application or of censorship);
  • So-called “community notes” perform poorly in tackling disinformation about polarizing issues, such as the Israel/Palestine conflict, because of their questionable methodology. Often false news, videos or images are not accompanied by notes, and when notes do appear they can themselves be misleading and require further modifications and corrections (e.g. a video showing children in a cage, originally presented as Israeli kids kidnapped by Hamas, initially carried notes shown to Italian users claiming without evidence that they were Yazidi children kidnapped by Syrian rebels). This is a problem EDMO highlighted months ago.

The amount of disinformation analyzed by EDMO is already significant, but it is a very small percentage of the total in circulation. Looking at the links included in fact-checking articles can obviously introduce distortions: in particular, many fact-checking organizations have agreements with social media platforms (notably Meta and TikTok), so the disinformation they see and act upon on those platforms is more likely to be labeled or otherwise dealt with in accordance with the respective policies.

At the time of writing this analysis (October 16), the crisis in Israel/Palestine is still ongoing and related disinformation is still raging on social media platforms. EDMO will carry out additional investigations and monitoring of this topic in the near future, and a complete analysis can be expected soon.

Tommaso Canetta, deputy director of Pagella Politica/Facta News and coordinator of EDMO fact-checking activities