The weaponisation of information

The war between Russia and Ukraine has commanded the world’s attention for several reasons, from disrupting semiconductor supply chains to driving up the price of oil. But another important dimension of this conflict has been the information war being waged online, not just by Russians and Ukrainians but by people from all over the world. Lotem Finkelstein, head of threat intelligence at Check Point Software, put it aptly: “For the first time in history, anyone can join a war.”

Take this news report, for example. The Russian newspaper Pravda claimed to have “identified” three American soldiers who died in the war. The report, soon debunked by the US, offered specific details about the soldiers. Or consider the claims that the war was a false-flag operation, with Ukrainian actors putting on makeup to fake blood and injuries.

The Cyber Peace Institute and several other entities have curated timelines of the many cyberattacks and misinformation campaigns being run by both sides, and these timelines make one wonder at the sheer scale and reach of these attempts.

The reason behind this spread of disinformation is simple. It creates chaos and confusion and aims to sway public opinion. And winning in the theatre of public opinion has become as critical as winning in the actual theatre of war.

It isn’t just fake news articles anymore. One is subjected to a barrage of video, audio, and textual content. Deepfakes, videos that look extremely real and convincing, are a big part of new-age disinformation campaigns. Because of the proliferation of social media and the thousands of new platforms, content is everywhere, and it is difficult to tell the real from the made-up.

It will take years to fully grasp the repercussions of these ongoing misinformation campaigns. However, we can already draw lessons on how to cope with information overload and separate the wheat from the chaff.

Let’s take a step back

Misinformation, as a concept, is nothing new. It has existed as part of the media for centuries and has played a role in furthering propaganda during wars, elections, and social movements.

The added dimension now is digital and social media. Much of the Russia-Ukraine war-related misinformation is being spread through TikTok, Twitter, Facebook, and Instagram.

This is the first major war to begin in the age of pervasive social media, with content exploding across multiple sources, and platforms are struggling to work out how to align their existing policies with these new forms of misinformation.

Nothing exemplified the problem better than what we saw during the Covid-19 pandemic. From inhaling hot air from a hairdryer or drinking bleach to kill the coronavirus, to claims that vaccines alter human DNA, the spread of fake information was relentless, and global health authorities repeatedly warned against the “infodemic”.

Studies have shown that political beliefs played a significant role in how this misinformation spread and became mainstream. Online misinformation in elections and politics is now a recurring theme. The 2016 US presidential election saw Cambridge Analytica help the Donald Trump campaign place targeted political ads to influence voters. Closer home, India saw a deluge of fake news before the 2019 elections.

However, spreading misinformation or disinformation online is not a manual process. Much of it exploits the way platforms are designed, using their algorithms to amplify harmful content, and it spreads further by targeting people who already hold certain beliefs.

Breaking the cycle

A recent study found “that the internet may act as an ‘echo chamber’ for extremist beliefs; in other words, the internet may provide a greater opportunity than offline interactions to confirm existing beliefs”. This has led to an increase in radicalisation globally. 

There is a reason you keep seeing content related to things you like. Social media platforms have algorithms that are learning from you. If you watch or “like” a dance or cooking video, for instance, the algorithm will know you liked it and offer you more content choices catering to your preferences. This is the reason behind the rise of platforms like TikTok.

If it were just dance videos, it wouldn’t matter much. But people also watch political content, or get introduced to political ideas through content that interests them. Once the algorithm figures out that’s the content you like, it will show you similar stuff. The more you watch, the more it learns. The more you see, the more time you spend on the app. This creates an echo chamber, leaving little room for alternative or contrary points of view.
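To make that feedback loop concrete, here is a minimal, hypothetical sketch in Python of an engagement-driven recommender. It is not any platform’s actual algorithm; the categories, weights, and `recommend` function are illustrative assumptions, but it shows how favouring categories a user has already watched quickly narrows the feed.

```python
# Minimal, hypothetical sketch of an engagement-driven recommender.
# Not any real platform's algorithm; categories and weights are illustrative.
import random
from collections import Counter

CATALOGUE = {
    "dance":    ["dance clip A", "dance clip B", "dance clip C"],
    "cooking":  ["recipe A", "recipe B", "recipe C"],
    "politics": ["political clip A", "political clip B", "political clip C"],
}

def recommend(watch_history, n=5):
    """Pick the next n items, weighting categories the user has already watched."""
    counts = Counter(watch_history)
    # A small baseline keeps every category possible, but past engagement
    # makes a category far more likely to be served again.
    weights = {cat: 1 + 5 * counts[cat] for cat in CATALOGUE}
    picks = random.choices(list(weights), weights=list(weights.values()), k=n)
    return [(cat, random.choice(CATALOGUE[cat])) for cat in picks]

# A user who engaged with political content a few times...
history = ["dance", "politics", "politics", "politics"]
print(recommend(history))  # ...now gets a feed that skews heavily political
```

Each new watch feeds back into the history, so the skew compounds over time; that compounding is the echo-chamber dynamic described above.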

A breakdown in trust in mainstream media is also an oft-cited reason why people believe what they see and hear on social media platforms, further amplifying extreme views. The spread of misinformation through closed platforms like WhatsApp, Signal, or, more recently, Discord is another challenge, one that fact-checking organisations are tackling only at some level.

The world is only getting more online. As consumers of news, we need to build a culture of objectivity and ask the right questions. If a piece of news comes to you on WhatsApp, instead of accepting it as gospel, cross-check it against other news sources.

Apart from being sceptical of news sources, there are a few things you can do to not fall prey to disinformation campaigns: 

  1. Learn when to question certain content. If it looks outlandish and evokes a strong reaction, making you want to amplify it, it’s time to do a quick search for primary sources before hitting the share button. 
  2. Satire often gets amplified as news, especially on chat apps, so tracking down the original source of the text through a quick online search is another way to verify authenticity.
  3. Fact-checking organisations like Snopes, BOOM, and FactCheck.org provide tools to fact-check content. Some even have WhatsApp helpline numbers where you can share a forwarded message and get a near-instant fact check.
  4. Read beyond the headline, because headlines can often be misleading.
  5. If someone forwards you an article, make it a habit to check the date. Misinformation campaigns often amplify old news to create chaos in a new context. If there is no date at all, that’s another red flag. (One way to look up a page’s publication date programmatically is sketched just after this list.)
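As a companion to point 5, here is a small, hedged sketch of how you might look up a page’s publication date programmatically. It assumes the article exposes an `article:published_time` Open Graph meta tag, which many news sites do but some do not, and it uses the third-party `requests` and `beautifulsoup4` packages.

```python
# Hedged sketch for point 5: look up when an article says it was published.
# Assumes the page declares an "article:published_time" Open Graph meta tag
# (common on news sites, but not universal). Requires requests + beautifulsoup4.
import requests
from bs4 import BeautifulSoup

def published_date(url):
    """Return the article's self-declared publication timestamp, or None."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    tag = soup.find("meta", property="article:published_time")
    return tag["content"] if tag and tag.has_attr("content") else None

# Hypothetical usage: an old timestamp on a "breaking news" forward is a red flag.
# print(published_date("https://example.com/some-forwarded-article"))
```

If the tag is missing, fall back to reading the page itself; the point is simply not to trust a forward’s framing that something “just happened”.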

Amplifying a fake message makes us part of the chain of disinformation that can have catastrophic consequences. So getting to the source of a piece of content is an effort worth making. 

Our internal biases will always tempt us to believe information that confirms what we already think. This new digital world will require a significant amount of rewiring in how we approach what we see online. It is easy to surrender to our biases, but for the sake of the future, let’s not pick the easy way out.
