A disdainful and mocking tone, which is sometimes used when debunking conspiracy theories, can actually reinforce some people in their worldviews by creating an "us against them” divide, Marie Bohner told EURACTIV.ro in an interview.

Marie Bohner is Global Head of Partnerships at First Draft, an organisation whose mission is to protect "communities from harmful misinformation; (...) in this polluted information environment, it’s never been harder to know what to trust, and never easier to be misled". 

EURACTIV.ro: In 2017, citing Danah Boyd, Claire Wardle noted "we are at war” [Wardle, Claire, 2017, February 17, Fake News. It’s complicated, available online, as accessed 15.06.2020]. "An information war. We certainly should worry about people (including journalists) unwittingly sharing misinformation, but far more concerning are the systematic disinformation campaigns.” How would you comment on these systematic disinformation campaigns, especially during the COVID-19 crisis?

Marie Bohner: While the word "systematic” means regularly repeating over a sustained period of time, the term "disinformation campaigns” can be understood as a concerted effort made by a specific group of people to mount an operation against another group of people or an institution. In contrast to other organisations that do this in-depth work (for example DFRLab or Graphika), First Draft does not specifically focus on state-backed entities and sources in its global monitoring operation.

What we have seen is that the Covid-19 crisis has opened a realm of possibilities for disinformation actors. Four major coronavirus data voids emerged, and they still exist:

● its origin: where did it start?

● the way it spreads

● the symptoms and the treatments

● the responses of authorities and citizens.

There has been a lack of tangible answers to these basic questions, and because the crisis has hit so many people so hard, the coronavirus has become a kind of perfect playground for disinformation actors. People were looking to media organisations for answers, but those same organisations sometimes did not make the distinction between facts and speculation clear enough in their articles, because they themselves were still looking for answers.

Another problem has been the collection of data around the coronavirus: the pandemic hit different countries at different times and in different ways, and the countries themselves were producing figures and data using different methods.

This led to comparisons of figures that were not comparable, producing flawed analyses and misleading visualisations. Coming back to systematic disinformation campaigns, from our own monitoring we cannot say that we have identified systematic campaigns led by governments.

What we have seen, though, are coordinated activities around certain narratives, clearly designed to gain traction, engagement and thus influence. There have been millions of shares of conspiracy theories and narratives concerning Bill Gates and his contribution to vaccine research.

These narratives have circulated around the world in various languages and have been adapted to each local context. Most of them reinforced the idea that Bill Gates is a manifestation of a corrupt Western system.

This always comes back to one of our main questions in terms of disinformation: what is the motivation of the actors sharing this kind of disinformation?

Sometimes state actors such as Russia won’t explicitly relay disinformation but will subtly frame issues and narratives that are vulnerable to disinformation, to further fuel anti-Western, institutional-corruption narratives. These details then get filtered through the web by other sources in a more explicit manner.

EURACTIV.ro: Operation Infektion was revealed in a reference documentary by The New York Times as part of strategic malinformation. What did the First Draft News organisation notice during the pandemic, based on internal monitoring and/or credible reports? Is the strategy similar? Is the US blamed for biological weapons, as it was during Operation Infektion or more recent operations? [Ellick, B. Adam, Westbrook, Adam, 2018, "Operation Infektion - Russian Disinformation, From Cold War to Kanye” (The KGB Spies Who Invented Fake News, The Seven Commandments of Fake News, The Worldwide War on Truth), The New York Times 11.12.2018, available online, as accessed 15.06.2020]

Marie Bohner: I would not talk about "strategic malinformation” in the context of your question, but more of "strategic disinformation”. In our framework of definitions malinformation is when authentic information is shared to harm people or institutions.

Misinformation is when false information is shared with no intention to harm, and disinformation - which could be a more accurate word to use in this case - refers to false information shared with the intention to harm or deceive.

Actual malinformation is sometimes used in the context of systemic disinformation campaigns, as shown clearly in the Operation Infektion documentary. Graphika has documented a Russian campaign, but specifically during the coronavirus crisis we also saw Chinese officials blame the US for bringing the virus to Wuhan during the Military World Games.

Similarly, Trump and the US Secretary of State have blamed China for creating the virus and for lying about its origins. All of the above is official and well documented, but it has also found a tremendous echo in online chatter.

That is what we observed in our monitoring. 

EURACTIV.ro: "Agents of disinformation have learned that using genuine content — reframed in new and misleading ways — is less likely to get picked up by AI systems,” according to the cited article, Information disorder. Is the new form of disinformation a manifestation of the old dezinformatsiya or active measures strategy?

Marie Bohner: In our monitoring of the French social web, my colleague Seb Cubbon observed that a notable amount of fabricated content was used in misinformation tactics during the 2017 presidential election campaign, but that the amount used in 2019 was almost nonexistent.

Disinformation actors evolved along with monitoring techniques and platform algorithms, and they realised that the information needed a kernel of truth, something authentic, so that it could spread further and more efficiently. Old content, when falsely reframed in the present context, can resurface over and over again, jumping from one country and one platform to the next.
 

This is what we call zombie misinformation: it can sometimes return even after it has been debunked, because the debunking itself has amplified it.

EURACTIV.ro: Is debunking a solution? If so, who should have the mission to debunk: the media, NGOs, the education system, the government? Also, if not done with the utmost caution and professionalism, can the debunking process create a boomerang effect instead?

Marie Bohner: Debunking is key, taking into account that we also need to measure the tipping point: has the information been shared a lot? Has it traveled across platforms? Has it been shared by online influencers or politicians?

These are questions every debunker has to ask themselves before publishing something that might otherwise give more oxygen to a circulating rumour. And of course, if the tipping point has been reached, then we should debunk as much as we can. We have been working mainly with media organisations on these actions, as this type of work features regularly in the professional journalistic world. However, since the coronavirus crisis, we have been contacted more by other actors who also realise that they need to be able to debunk and identify online misinformation: NGOs and health authorities are notable examples.

In a way it is great that more people are willing to share this never-ending task. But you are right, it is essential to do it in a cautious way if we want to avoid backlashes. Fact-checking or debunking can create data voids. If you simply say that the information is false without sharing any supporting details, you double the risk:

- people might look for another explanation, even more false than the one you just debunked

- people might also feel offended in their worldviews if you label it as "false” without any further information.

A disdainful and mocking tone, which is sometimes used when debunking conspiracy theories, can actually reinforce some people in their own worldviews by creating an "us against them” divide.

This is why at First Draft we also talk about "prebunking”, that is, sharing accurate information with your close network and the surrounding communities ahead of an election, for example.

EURACTIV.ro: First Draft provides a set of tools for debunking. Are those tools conceived mainly for journalists, so they can check their sources and the material distributed over the internet, or are they useful for everyone?

Marie Bohner: Until the coronavirus outbreak, most of our work at First Draft was designed to support journalists. We provided them with useful tools and concepts to analyse and understand information disorder. We have created Essential Guides, which are currently available in English and which will soon be accessible in French, Spanish, German, Italian, Brazilian Portuguese and Hindi.

These guides help journalists find their way, offering tips and practices for online verification, newsgathering and monitoring, messaging apps, Facebook ads and closed groups, but also theoretical and ethical frameworks for understanding information disorder and aiming for more ethical reporting that slows down the spread of misinformation.

They are practical guides and have been customised to suit a journalist's specific needs. Since the coronavirus outbreak, to support the work of journalists in such crucial times, we also created webinars in several languages, and we continue to create new ones, for example on the monitoring of a trending app like TikTok or on mental wellness for journalists covering Covid-19.

We have created an online course for journalists reporting on the coronavirus, also available in 7 languages and designed in bite-size modules so that journalists can navigate it easily, whatever their level of skills and expertise. Another feature used a lot by journalists is our online toolkits, basic and advanced, which we update constantly. This really makes a difference.

The coronavirus outbreak, and the panic it generated along with the "infodemic” wave that came with it, convinced us to also create some tools for wider audiences. This is quite a new area for us, but it also made sense: nowadays social web users are content creators as much as consumers.

My colleague Ali Abbas Ahmadi wrote a piece about how to talk to your family and friends about the misinformation messages you can find on WhatsApp and elsewhere. We strongly believe that it is everyone’s responsibility to engage rather than ignore what we see online, especially when those close to us are sharing this misinformation. And we also know, from personal experience, that this is not an easy thing to do.

We thus created an online course for the wider public on navigating the infodemic, called "Too much information”. Then we realised that an even better way to reach wider audiences would be a simple 14-day course by text message or on WhatsApp. We ran a first version of it in the lead-up to the US elections, as a first experiment, and it is having tremendous success. To be continued!

EURACTIV.ro: In Romania, uniquely in the European Union to our knowledge, during the lockdown caused by the coronavirus crisis, the Romanian government shut down websites believed to distribute false information. After the lockdown ended, the websites [EURACTIV Network, Romania shuts down websites with fake COVID-19] became accessible again and continued their activities as before. How would you comment on such an extreme measure? Is it in line with freedom of expression? Is it effective or, on the contrary, a solution that a government should not take?

Marie Bohner: Access to accurate news and information is both a crucial need and a right for citizens in important times such as the coronavirus health crisis. We have seen how the scarcity and constant evolution of accurate information around the coronavirus has been difficult to deal with for journalists and citizens, but also for governments themselves in shaping their responses. Nevertheless, the conversation about regulating information disorder through state laws against disinformation did not start with this specific crisis.

As much as we at First Draft understand the will and the responsibility of governments to take action against disinformation - which is often designed to jeopardize the balance of a country and to fuel disputes and distrust - it is also clear that many of the laws passed so far against disinformation have also been used to silence political opponents, activists or journalists.

You can’t protect democracy by reducing it.

Also, as we have seen in other countries like India, for example, internet blackouts or shutdowns can sometimes have the opposite effect to the one intended: because the information, or the misinformation, is no longer available, it creates a void which can be filled by even worse conspiracy theories. Nature abhors a vacuum.

The first actors to tackle information disorder should be the actors of online information themselves: the platforms, the media, but also us, as individuals and communities. We need to find organic responses rather than censorship.

EURACTIV.ro: How would you comment on country-specific disinformation campaigns, i.e. the mapping of disinformation themes/strategies differentiated by country, especially in Eastern and Central Europe, in countries like Romania and Poland, as opposed to other countries?

Marie Bohner: It is difficult for us as an organisation to comment on country-specific disinformation campaigns in Eastern and Central Europe, as we don’t currently have the internal resources to monitor information disorder in these areas. This is something we regret and want to change in the near future, as we know how essential these countries are for Europe as a whole.

But we are also conscious that these countries all have different languages and cultures, which is what makes them rich and diverse but also calls for specific linguistic and cultural understanding to fully grasp what is happening in their online spaces. We are therefore now working on inviting more media organisations, for example from the Balkans and Romania, with which we have worked before, into our global collaborative work around mis- and disinformation narratives, the CrossCheck community.

This community and our own monitoring already showed us how some misinformation narratives around Coronavirus circulate from one country to another. We saw an image around PCR testing claiming that these tests are dangerous and could lead to brain leaks, circulating in various languages, from English, Spanish and French to Romanian.

This is where a good knowledge of the language becomes key. Sometimes messages are obviously translated automatically, and when such messages are designed to trigger a strong emotional reaction, that should make us skeptical for two reasons: because the use of the language is automated rather than genuine, and because the design is meant to shock.

EURACTIV.ro: Do you see any link between trust in the media, on the one hand, and the increasing amount of disinformation, on the other, and if so, why? The preliminary conclusions of the Digital News Report 2020 by the Reuters Institute at the University of Oxford will reveal the evolution of trust in the media. How would you comment on this evolution? [The DNR report was launched on 16 June, and its conclusions are expected to include significant references to the evolution of trust in the media during the pandemic; trust in the media fell significantly in divided societies.]

Marie Bohner: What we saw at the beginning of the coronavirus outbreak in the UK, for example, is a good sign for media organisations: there was an increase in media consumption.

People were looking to the media to find accurate information in a moment of crisis. But we also saw that the infodemic then created media fatigue.

Too much information, too difficult to navigate. The timing of the media is not at all the same as the timing of scientific research. We have to collectively learn how to handle this better, and be more educational about it. Another issue is the amplification of disinformation. Many media organisations are conscious of this now and are more cautious about the stories they produce and more transparent about their investigative steps.

The biggest issue around trust in the media is probably linked to the media economy and the power of clickbait. Shocking images and headlines are effective in the very short term: they generate the desired click immediately. But in the long run they also have the deadly effect of fear-mongering and undermining credible news. This erodes the trust that audiences have in the media and damages the media economy even more. Media organisations have to find a way out of that. It is a tough journey.

EURACTIV.ro: In addition to the previously sent questions, perhaps it is possible to have some comments on the statement made in London by US Secretary of State Mike Pompeo: "You can’t threaten countries, or bully them.” Pompeo accuses China of bullying its neighbours, saying the country has used the coronavirus pandemic "to further its own interests” (source: Channel 4). In Central and especially Eastern Europe, I don’t think we have journalists specialised in China, and it has become highly complicated to distinguish between information and the various forms of disinformation coming from China; while many people are resistant to Russian propaganda (because of the historical context), China is seen as far away, and very few people in Romania are able to even identify China’s propaganda activity or what Mr Pompeo calls "bullying”.

Marie Bohner: As explained at the beginning of this interview, we have seen how the coronavirus crisis has been used in the "blame game” between countries as part of their war of influence. We see how narratives on essential issues such as human rights or health can be used to fuel anger and deception.

The fact that Russia and China are clearly identifying disinformation as a part of their political strategy does not mean that the other countries are not using it as well. We try to distinguish facts from political agendas.

At First Draft we are at war against information disorder.

We are not politically partisan, and we need to remain very careful about that. Our work is designed to support journalists and help citizens navigate information better and make informed choices, for example when they vote. In that sense we have a profound attachment to democracy. We do not want to play the same game as bad actors and influence people’s votes. It is a thin line, but an essential one.