On February 4, 2014, an anonymous source posted a recording of a conversation between the US Assistant Secretary of State, Victoria Nuland, and the US Ambassador to Ukraine, Geoffrey Pyatt. Amidst all the political banter, one line stood out: "Fuck the EU."
The leaked audio was a crude effort to sow dissension between the US and its EU allies during a tense moment in Ukraine, a move that clearly served Russian interests. Back when I taught Cyberspace and National Security, Russia's invasion of Ukraine, its meddling in the US elections, and its dissemination of NotPetya made the nation an exemplar of hybrid information warfare. (The NotPetya malware first appeared in 2017, on the eve of Ukrainian Constitution Day. It affected thousands of systems in over 65 countries. Maersk, the Danish shipping company, lost $300 million in revenue and was forced to replace 4,000 servers. Even so, Ukraine was the malware's clear target, suffering 80% of all infections.)
The tactics have only become more refined and the tools easier to use. Non-state actors are increasingly effective.
Yesterday, I attended a presentation by Gabrielle Lim, a researcher at Harvard's Shorenstein Center. She presented Disinformation Strategies and Tactics at the Internet Freedom Festival's Community Knowledge Share.
Lim was candid about the difficulties facing researchers and policymakers. Even the terms used to study the phenomenon are loosely defined, fluid targets. They range from internet slang like trolling, copypasta, and spam to more formal concepts like media manipulation. Claire Wardle and Hossein Derakhshan offer a useful way to categorize this type of information (Claire Wardle and Hossein Derakhshan, Information Disorder: Toward an Interdisciplinary Framework for Research and Policy Making, Council of Europe, Strasbourg Cedex, 2017):
- Misinformation: information that is false, but not created with the intention of causing harm.
- Disinformation: information that is false and deliberately created to harm a person, social group, organisation or country.
- Malinformation: information that is based in reality, but is used to inflict harm on a person, organisation or country.
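Wardle and Derakhshan's three categories split along two axes: whether the information is false, and whether it was meant to harm. A toy Python sketch (the class and function names are my own illustration, not part of their framework) makes the distinction mechanical:

```python
from dataclasses import dataclass

@dataclass
class InfoItem:
    is_false: bool       # is the content factually false?
    intends_harm: bool   # was it created or shared to inflict harm?

def classify(item: InfoItem) -> str:
    """Map an item onto the mis/dis/malinformation taxonomy."""
    if item.is_false and item.intends_harm:
        return "disinformation"   # false, and built to damage
    if item.is_false:
        return "misinformation"   # false, but not malicious
    if item.intends_harm:
        return "malinformation"   # true, but weaponized
    return "information"

# A genuine leak weaponized against a target is malinformation:
print(classify(InfoItem(is_false=False, intends_harm=True)))
```

The two-axis framing is why the same campaign can land in different boxes depending on what you believe about its accuracy and its creators' intent.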
It's unclear if and when governing bodies should enforce rules. For example, Lim argued, just because there is no evidence of God does not mean we should restrict claims of God's existence.
The following examples show how the spread of mis/dis/malinformation could plausibly fall into more than one category:
- COVID-19 Copypasta: misinformation about the coronavirus pandemic shared in encrypted WhatsApp groups continues to be a problem. This is especially insidious because it is difficult for researchers and social scientists to monitor and combat its spread. Lim called this phenomenon an example of "hidden virality."
- Endless Mayfly (Iran): a disinformation honeypot attempting to bait activists and journalists into spreading polarizing stories and rumors about Saudi Arabia, the United States, and Israel.
- It's Okay To Be White (4chan): an American malinformation campaign conceived to exploit the political polarization exemplified by television pundits on profit-driven cable news channels.
I also found Lim's example of an ACLU "disinfographic" interesting. The original informs people of their rights if a US Immigration and Customs Enforcement (ICE) agent knocks on their door. The altered infographic provides potentially damaging advice while sounding legitimate and authoritative. (Compare the ACLU's original infographic with the false one in Disinformation Strategies and Tactics, slide 9.)
We talked a bit about terminology and tools at the end. Lim told me that Snopes and Wikipedia, while seemingly ineffective at combating the growth of mis/dis/malinformation on their own, are used as sources by some algorithms that flag dubious campaigns.
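Lim didn't describe how those algorithms work, but a crude, purely illustrative version might check the links in a post against a table of fact-check verdicts. Everything below, the verdict table, the URLs, and the function, is invented for the sketch; real systems draw on far richer signals:

```python
# Hypothetical verdict table, standing in for data scraped from a
# fact-checking source such as Snopes. These URLs are made up.
FACT_CHECK_VERDICTS = {
    "example-hoax.com/miracle-cure": "false",
    "example-news.com/policy-report": "true",
}

def flag_post(urls: list[str]) -> bool:
    """Flag a post if any shared URL has a known 'false' verdict."""
    return any(FACT_CHECK_VERDICTS.get(u) == "false" for u in urls)

print(flag_post(["example-hoax.com/miracle-cure"]))   # flagged
print(flag_post(["example-news.com/policy-report"]))  # not flagged
```

Even this toy version shows the core limitation Lim hinted at: the flagger is only as good as its source list, and most dubious content never appears in any fact-checker's database.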
I haven't heard from anyone who is particularly optimistic about automated efforts like the IFCN fact-checking organizations on WhatsApp (which Facebook also uses). But they will play a role whether we like it or not. The scope of the problem is too large, and the incentive structure of shareholder-owned social media platforms doesn't align with more robust solutions.