Disinformation Campaigns: The New Cyber Attack and Weapon of Mass Distraction
Disinformation campaigns. Cyber warfare. Nation-state attacks. Scary terminology – even if you aren’t sure what it means. Visuals often help us grasp abstract concepts, and over the years the creative minds of the arts and motion picture industries have crafted vivid imagery to bring such expressions to life. But what if those illustrations fail to capture the hidden danger? The result? Turmoil.

Disinformation: deliberately misleading or biased information; manipulated narrative or facts; propaganda.

Disinformation – the reality of the world in which we live today. What’s terrifying? These campaigns are the latest addition to an arsenal of nation-state tools wreaking havoc around the globe. Whether the goal is election interference, compromising a brand, or simply creating good old-fashioned chaos, disinformation campaigns have grown exponentially, and these “alternative facts” are cultivated and amplified through the explosive vehicle of social media. In fact, the World Economic Forum found that tweets containing disinformation consistently outperformed those containing truth.

Common Forms of Disinformation

Election Disinformation

According to a 2019 study by the University of Oxford, there is evidence of organized social media manipulation in at least 70 countries, where at least one political party or government agency is using social media to shape public opinion.

Disinformation Targeting Businesses

According to consulting firm Deloitte, state actors, criminal groups, aggrieved employees, and business rivals have begun using tools such as software bots and machine learning to create disruption in the form of fake reviews, phony stories about workplace discrimination, faux executive statements, and manipulated videos.

Deepfakes and Disinformation

Innovative AI and machine-learning tools are making it much easier to manipulate video, image, and audio content in nearly undetectable ways. Studies have shown that this type of manipulated media, or “deepfakes,” is becoming a pervasive method of discrediting and disrupting targets.

Public Emergency-Related Campaigns

The flood of news regarding the ongoing pandemic is the latest example of how social media platforms make it easy for bad actors to sow confusion and undermine public confidence during a health crisis or national emergency.

NewsGuard, a startup initiated by two former journalists that vets the internet for misinformation, has identified 217 websites in Europe and the United States that publish “materially false” information about COVID-19.

Disinformation Services

Services are becoming available on the dark web that allow anyone to launch a disinformation campaign against a target for a nominal fee. These services let bad actors discredit victims while offloading the dissemination of lies to third parties.

One of the reasons these defamatory campaigns are so damaging is that they are easy to launch and inexpensive to amplify. It takes only a small group of people working together to intentionally create a false narrative – no coding or hacking required, because there are other avenues to propagate the deception.

Disinformation attacks also cause an extreme loss of value – to brands, public perception, and human integrity. What sets disinformation apart from other cybersecurity threats, however, is how difficult it is to contain. No firewall can stop it. So what is the right approach to defending against this new strain of cyber attack, a weapon of mass distraction?

With the goal of causing significant harm to public perception, disinformation does not discriminate. The sheer number of bad actors, combined with the volume of content from hundreds of millions of users, makes eliminating every manifestation of disinformation astonishingly difficult.

There are steps we can take as individuals to stop the spread of disinformation. Through our online presence, advertisers and publications learn our likes and dislikes and serve content tailored to our biases. Do research before sharing an article, or limit sharing altogether. Sensationalized stories are far more tempting to share than news based on facts, but if we all took the time to arm ourselves with the truth and limit the transmission of false narratives, we would be taking “one small step for man.”