Know your risk level: At the entrance of many US National Parks you will find the iconic Smokey Bear sign displaying a fire warning level based on present conditions. A forest in drought conditions is high risk. When it comes to disinformation narratives, many organizations are unaware of the factors that place them in the path of a conspiracy theory. Leaders should take stock of how their organization’s interests sit on or near the fire lines of controversial financial, political and societal issues.
In a world of complex tensions, organizations that take a position on a controversial issue, criticize the policies of a foreign government, or are simply on one side of a domestic political dispute can end up being the subject of disinformation. The first task in a counter-disinformation playbook should be an audit of the vulnerabilities an organization has within the landscape of conspiracy narratives. A vulnerability audit can identify several potential sources of attack, including:
- Past controversies involving the organization—these could provide the basis for legitimate criticisms that in turn might lend credibility to new falsehoods.
- Social media controversies surrounding a sector—these could easily lead to false narratives being lobbed at your company (for example, anti-vaccination advocates attacking pharmaceutical companies manufacturing COVID-19 vaccines).
- Specific scenarios such as management reshuffling, sensitive financial situations, or contentious run-ins with nation-state interests—any company is at a higher risk of being harassed by fake news and runaway rumors during such transitions.
Prioritize high-risk narratives: Fighting forest fires requires choosing which fires to fight and which to let burn themselves out. Those that threaten key or sensitive areas are a priority. Likewise, the fake narratives that matter most are the ones that threaten your license to operate, while others are only worth monitoring. We recommend that a disinformation playbook include a risk assessment framework to decide which narratives can do the most damage. Determining high-risk narratives requires considering three key questions:
- What is the potential impact of the narrative? If it catches fire and people believe it, can it jeopardize your goals, operations and partnerships?
- Is the narrative credible to key audiences? While the information may be false, is it believable enough to a wide array of people?
- What is the penetration of the narrative? Has the storyline broken out of echo chambers into the wider public?
If the answer is yes to all three, you are likely looking at a high-risk narrative. If not, you may have a medium- or low-risk narrative, depending on the assessment. Having identified the high- and medium-risk narratives, you now know where to focus your attention. Actions to fight fires include steps such as: mobilizing validators, arming supporters with factual and compelling information to counteract false information, and using search engine optimization so people find accurate details online first.
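For teams that track narratives in a monitoring tool or spreadsheet, the three-question triage above can be expressed as a simple scoring rule. The sketch below is illustrative only: the function name is invented, and the cutoff between medium and low risk is an assumption, since the article leaves that call to the assessor.

```python
# Illustrative sketch of the three-question risk triage.
# Assumption: two "yes" answers -> medium risk, fewer -> low risk;
# the article says only that "yes to all three" means high risk.

def assess_narrative(high_impact: bool, credible: bool, widespread: bool) -> str:
    """Classify a disinformation narrative by the three key questions:
    potential impact, credibility to key audiences, and penetration
    beyond echo chambers into the wider public."""
    yes_count = sum([high_impact, credible, widespread])
    if yes_count == 3:
        return "high"    # yes to all three: focus attention here
    if yes_count == 2:
        return "medium"  # judgment call, per the assessment
    return "low"         # monitor only

# Example: a believable, high-impact story still confined to echo chambers
print(assess_narrative(high_impact=True, credible=True, widespread=False))
```

A rule this simple is not a substitute for the assessment itself, but encoding it keeps a monitoring team applying the same criteria to every narrative it logs.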
Build resilience: Building firebreaks—obstacles to slow or limit the spread of wildfires—is a critical mitigation technique. Smart action can also potentially contain the spread of disinformation. This requires understanding your audiences, identifying your advocates and those who are undecided but persuadable, and then optimizing your communications to ensure these audiences know your narrative. Organizations that take the time to learn how to address and mobilize these audiences can build resilience against false information.
At its heart, disinformation is aimed at eroding trust in organizations and individuals by assaulting motives and intentions. Disinformation narratives are often emotionally engaging. The counter to this is ensuring that key audiences understand your values and motivations—the “why” of what you do. These should be clearly communicated in a compelling and authentic fashion. Understanding the “why” builds reservoirs of good will and trust that are harder for fake narratives to penetrate.
Communications insight tools such as message testing, survey work and focus groups can help you understand what concerns people have about your organization and why they might be drawn to disinformation about you. These tools can help companies learn where people get their information, what validators they find to be most or least credible, and what messages resonate with them. For global organizations, these methods can also help you understand local variations in disinformation. What works in North America may not work in Africa, and vice versa. With this knowledge, companies can put in place communications that inoculate their key audiences against false narratives.
Fire extinguishers: The common thread through all of these considerations is preparation. Companies do not have to sit and wait to be a victim of disinformation. As with all corporate communications crises, it is better to have a plan of action in mind before an actual need arises—in this case, a disinformation playbook. Such a guide should identify critical team players for combatting disinformation and outline steps they should take to study the landscape, methodically assess the exposure to potential risk, cultivate a supportive stakeholder network and broadcast your organizational values.
A detailed disinformation playbook—even if it needs to be adjusted to handle unforeseeable events—can be a critical, calming guide for handling a fake news inferno. Ultimately, such preparation will also help clear away the combustible material, making you less appealing for the fire-starters of false information and conspiracy, and more resilient in the event your organization becomes a target.
--
Preston Golson is a Director with Brunswick specializing in cybersecurity, technology and disinformation. A former CIA analyst and spokesperson, he served as Chief of CIA’s Public Communication Branch in its Office of Public Affairs, and as Chief of Communications for the agency’s Directorate of Digital Innovation.
Illustration by David Plunkert.