Cybersecurity News that Matters

State-sponsored actors escalate disinformation to full-fledged operations, expert warns

Franky Saegerman, a social media expert who spent 31 years at the North Atlantic Treaty Organization (NATO), is revealing the latest narrative attacks at the Black Hat 2024 Conference on August 7. Photo by Dain Oh, The Readable

by Dain Oh

Aug. 09, 2024
5:08 AM GMT+9

Las Vegas, NV―Black Hat 2024―A former NATO analyst specializing in social media has called for increased awareness of disinformation campaigns conducted by state-sponsored actors. These campaigns, particularly from China and Russia, are part of highly orchestrated efforts to achieve long-term strategic goals. Meanwhile, the younger generation is becoming increasingly dependent on social media for news consumption, making them more vulnerable to these operations.

On Wednesday, Franky Saegerman, a social media expert who spent 31 years at the North Atlantic Treaty Organization (NATO), took the stage at the Black Hat 2024 Conference to expose state-sponsored information operations designed to manipulate global narratives. During his 30-minute presentation, Saegerman dissected the tactics behind recent campaigns, offering real-world examples. The risk intelligence company Blackbird.AI refers to these threats as “narrative attacks.”

According to Saegerman, disinformation refers to deliberately distorted information that is secretly inserted into communication channels to deceive and manipulate its target. He also introduced the term ‘FIMI,’ an acronym for Foreign Information Manipulation and Interference. While FIMI operations are often not illegal, they exhibit deliberate patterns of behavior that threaten or potentially impact the values, procedures, and political processes of the targeted victims. Unlike disinformation campaigns, which may not always be coordinated, FIMI occurs in a manipulative, intentional, and organized manner.

These narrative attacks follow recurring patterns: exploiting societal vulnerabilities, creating a big lie, wrapping that lie around a kernel of truth, concealing the actors behind the disinformation, using ‘useful idiots’ to spread falsehoods, and when caught, denying, distracting, and distorting the facts—all while playing the long game. The ABC model, also known as the ABCDE model, proposed by James Pamment at Lund University in Sweden, provides analysts with a framework to identify disinformation campaigns by focusing on five key elements: actors, behavior, content, distribution, and effect.

Disinformation campaigns pose significant threats to public discourse, especially given the growing reliance on social media. “The younger generation is increasingly getting their news from platforms like TikTok, with many being exposed to disinformation,” said the former NATO analyst, citing statistics on news consumption by age groups. “This becomes a risk if young people no longer question which news is real and which isn’t.”

The expert shared several real-world examples. Doppelganger, a cyber threat group operated by Russia, mimics legitimate websites, including those operated by The Guardian and NATO, to spread pro-Russian content and discredit Ukraine. Another example is ‘Pink Slime’ websites, which are fake sites that mimic local news outlets to distribute disinformation. According to the expert, there are over 1,265 such websites targeting the United States, surpassing the number of genuine local newspapers, which stood at 1,213 as of mid-2024. Similarly, Operation Paperwall, orchestrated by China, disguises itself as local news outlets. This operation disseminates pro-Beijing narratives through at least 123 websites that appear to be local news sources in more than 30 countries, including South Korea, Japan, and Turkey.

Operation Overload aims to overwhelm fact-checkers with fake tips, draining their time and resources, while spreading pro-Kremlin narratives, according to Franky Saegerman’s presentation at the Black Hat Conference on August 7. Photo by Dain Oh, The Readable

Operation Overload, which spreads pro-Kremlin narratives and fuels societal divisions, aims to overwhelm fact-checkers with fake tips, draining their time and resources. By sending politically sensitive but false images through X (formerly Twitter) and emails, the operation forces fact-checkers to spend extensive time verifying these false materials, ultimately distracting them from more important tasks. The analyst noted that over 800 organizations have been targeted by this group so far.

Artificial intelligence technology, including AI-powered deepfakes, has been worsening the challenges society faces today. Among the various types of AI-generated content, the expert predicts that audio deepfakes will become a major concern. “Anyone can generate an audio deepfake,” said Saegerman. “They are easier and cheaper to create than video deepfakes. Additionally, they are simpler to spread on social media and messaging platforms.”

Hybrid threats, which combine traditional military tactics with information operations and now increasingly leverage disinformation, were also highlighted by the expert. In a statement, NATO noted that hybrid methods of warfare—such as propaganda, deception, sabotage, and other non-military tactics—have long been used to destabilize adversaries. However, their speed, scale, and intensity have been amplified by rapid technological advancements and global interconnectivity in recent years. “Hybrid threats are political exploits, typically targeting vulnerable nations,” the analyst explained. Hybrid warfare has been observed globally, including in Bosnia, Moldova, Georgia, and Estonia.

To address these threats, Saegerman proposed raising public awareness of these highly orchestrated operations. “To counter disinformation, we need more robust, coordinated international efforts,” the expert emphasized. He recommended developing systems to filter disinformation, reinforcing platform security, and giving security researchers greater access to social media platforms. “These measures will empower the public to better understand what is happening on these platforms,” he added.

