Emergency agency communicators are our bulwark against this current threat. Are your comms teams ready?
The AI-generated video of an Australian Federal Police press conference about the Bondi shooting1 was a shock to us all, on top of a pile of disinformation swirling about every aspect of that terrible event. It brings the disinformation menace right into the emergency management sector's bailiwick, and I don't think we are ready for it.
It’s not as though there have been no warnings. More than a decade ago, east coast American communities affected by Superstorm Sandy were hit with fake images of sharks swimming in inland streets, photos altered to exaggerate storm damage, emergency instructions that were wrong, news that the New York Stock Exchange was under water2 (it wasn’t) and that the power company had disconnected electricity to Manhattan (it hadn’t). Superstorm Sandy turned out to be ground zero for natural hazard disinformation.
Hurricanes Harvey and Irma (2017) saw disinformation evolve exponentially when fake news was cross-pollinated with wild conspiracy theories and reposted by the US President and other politicians. For example, ‘illegal’ immigrants were said to be receiving all the disaster relief and the government was geo-engineering the weather. Then there were reports that Black Lives Matter was blocking disaster aid, that the Mayor of Houston was missing in action and that mosques were refusing shelter to non-Muslim Houston residents and hoarding aid. You can see a complete list on the Media Matters website.3
From a distance, it is ridiculous. But it has real effects on response and recovery operations and agencies. Their communication teams need to pay attention.
In Kerala, India, in 2018, widespread flooding killed nearly 500 people, and some of those deaths were thought to have resulted from the way people and agencies responded to disinformation spread on WhatsApp.4 Among the worst claims: a dam was about to burst; the Peechi Dam gates were about to be opened; Kerala ‘doesn’t need money’; most of the affected people are from rich or middle-class families and don’t need help; donations to the Chief Minister’s Disaster Relief Fund will be misused; and sundry scams encouraged people to donate to bogus relief organisations. Emergency operations were halted and worried householders clogged roads as they evacuated unnecessarily from what they thought would be the path of the dam water. False reports of road closures also hindered rescuers’ passage to the affected areas.
In Australia, there is a general feeling of being protected, but we regularly see cloud seeding,5 other weather manipulation and ‘chemtrails’ narratives pop up in online conversations about cyclones, especially if they are followed by a rain bomb. The old ‘plague of arsonists’ climate crisis denial narrative comes out every bushfire season.
The Australian Associated Press and other sites do a great job of identifying and debunking fake news, like December’s fake ‘breaking news’ of Australian storms, with video of a typhoon in the Philippines standing in for a storm in Brisbane.6 But politicians and agencies in many countries are strangely silent, even though there are important ways to get ahead of these narratives. This was shown to best effect by the Indian Government during the Kerala floods in 2018.
Dealing with disinformation in disaster
The disinformation fight should kick off before the impact phase of a disaster using ‘inoculation’ techniques that have been shown to work since the idea was first proposed in 1961. Inoculation is simple to integrate into existing communication approaches using 2 tools: prebunking and (if you have the resources) debunking. Information inoculation is literally a ‘vaccine against brainwash’.
Studies show it is successful against climate disinformation, propaganda and conspiracy theories. In a study by Traberg et al. (2022)7 testing inoculation against climate change fake news, a ‘passive’ communication effort (such as social media posts, media advisories and a spokesperson warning of what to expect in a video clip) decreased belief in and sharing of disinformation by 33%. In the same study, a more active method decreased belief and sharing by 75%.
Active methods include workshops, community engagement programs or a game that helps people identify when and how they are being misled. You then need platforms you can use to flag these scenarios and a creative way of implementing your prebunking efforts.
Once you become good at prebunking, debunking – that is, refuting disinformation that is already out there – becomes easier, because prebunking enlists an army of disinformation warriors among your followers who will pounce on disinformation wherever it arises.
Where does it come from?
A useful framework for understanding disinformation is the C5 interaction model.8 This 2025 model describes the context, causes and content of disinformation, the cycle of amplification and the consequences, and how these interact. The ‘causes’ component of the model helps us understand both the creators of disinformation and their motives.
Creators can be described using 3 categories: human/non-human (bots and AI swarms), individual/organisation and non-state/state actors. It’s easy to assume that all originators are human individuals, but Needham (2025)9 reported that the sowing of division by state actors is alive and well in Australia.
Motives are classified as ideological (political, religious or some other belief system, such as libertarian or sovereign citizen) or financial. Cloud seeding narratives are ideological, springing from a conspiracy theory; claims that only immigrants will get flood relief are based on political ideology; and scamming ‘donations for flood survivors’ is financially motivated.
What better time to sow political or ideological division, or to set up a scam, than an unfolding disaster, when people are under stress and, in many cases, looking for somewhere or someone to lay the blame?
Inoculation is a simple approach that folds easily into normal communication programs. However, it can be daunting, because disinformation, especially any bot-driven version, seems unbeatable, and we want to believe that it can’t happen during a disaster. But communities are intelligent and caring and, after inoculation efforts, can be empowered with information and confidence.