Key concepts for experimenting with AI to address the challenges of disasters and climate change

In November 2023, the Melbourne School of Population and Global Health hosted a workshop with disaster and climate change practitioners and researchers to explore the possible uses, risks, ethics and opportunities of AI in mitigating the mental health and wellbeing effects of disasters and climate change. This paper presents the main findings from the workshop.

Climate-related disasters are increasing rapidly at a time of surging Artificial Intelligence (AI) technologies. Simultaneously, there is growing recognition of the health risks of climate change and discussion of how these risks might be addressed through AI.1,2 For example, there is growing discourse on the risks and benefits of AI use within health care systems.3,4 However, specific uses of AI to manage the health and wellbeing effects of disasters (which are projected to increase in frequency and severity due to climate change1) remain understudied.

At the November 2023 workshop, key concepts were identified through practitioner experimentation with a Large Language Model (LLM), designed to address gaps in knowledge identified in a rapid literature review. The review found that speculation predominates in the literature on AI and climate change and that there is little on immediate AI applications for practitioners. In our experiment, we used GPT-4 and the AskYourPDF Plugin to prepare a grant application to support recovery and climate adaptation in a disaster-affected community that had experienced significant material loss and the deaths of children and a teacher. The grant opportunities were real,5,6 evidence-based resources were uploaded as guidance7,8 and the disaster-affected community was a fictional compilation of real cases.
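The workshop exercise itself was run through the ChatGPT interface with the AskYourPDF Plugin rather than through code. For practitioners or researchers wishing to reproduce a similar workflow programmatically, the Python sketch below shows one possible approach under stated assumptions: it assumes the OpenAI Python client and the pypdf library are available, and the file names and prompt wording are hypothetical stand-ins for the uploaded guidance documents7,8 and the workshop's actual prompts, which are not reproduced here.

# A minimal sketch, not the workshop's method: the workshop used the ChatGPT
# web interface with the AskYourPDF Plugin. Here the plugin is replaced with
# local PDF text extraction (pypdf) and a direct call to the OpenAI API.
from openai import OpenAI
from pypdf import PdfReader


def extract_text(path: str, max_chars: int = 8000) -> str:
    """Pull plain text from a guidance PDF so the model can draw on it."""
    reader = PdfReader(path)
    text = "\n".join(page.extract_text() or "" for page in reader.pages)
    return text[:max_chars]  # truncate to stay within the model's context window


# Hypothetical file names standing in for the uploaded evidence-based resources.
guidance = "\n\n".join(
    extract_text(path) for path in ("recap_guide.pdf", "climate_health_guidance.pdf")
)

prompt = (
    "Using only the guidance material below, draft a grant application to "
    "support recovery and climate adaptation in a rural community that has "
    "experienced major material loss and the deaths of children and a teacher "
    "in a recent disaster. Flag any points where community consultation is "
    "still needed rather than inventing community priorities.\n\n"
    "GUIDANCE MATERIAL:\n" + guidance
)

client = OpenAI()  # reads the OPENAI_API_KEY environment variable
response = client.chat.completions.create(
    model="gpt-4",
    messages=[
        {"role": "system", "content": "You assist a disaster recovery practitioner."},
        {"role": "user", "content": prompt},
    ],
)
print(response.choices[0].message.content)  # a draft for human review and community input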

The experiment highlighted both opportunities and concerns about LLMs. Participants indicated that GPT-4 was ‘good for summarising, brainstorming and getting things started’. Noted risks included concerns that using LLMs to construct grant ideas would ‘lead people to bypass asking the community what their primary concerns are [in disaster recovery]’. For some participants, it made them ‘feel dead inside’, in the sense of losing aspects of creativity and human-to-human interaction. Logistically, participants noted that LLMs are only as good as the questions asked and indicated a need for prompt-writing resources. Other concerns included how grant processes would need to adjust if use of GPT-4 becomes widespread and applications begin to look the same. Ethical concerns included the profit-driven setup of OpenAI, the documented perpetuation of racism and sexism by GPT-4 in health care9 and a potential ‘narrowing effect’ if certain ideas are given more precedence than others. For example, GPT-4 suggested solar panels to address climate change but did not recommend supporting a community’s grieving during anniversaries of the disaster.

It is critical to confront the practical and ethical complexities of AI use. The concepts described are important areas for continued critique and experimentation within emergency and disaster management, research and planetary health. 

This project was funded by the Melbourne School of Population and Global Health, Artificial Intelligence Grants (Stage 1).


Endnotes

1. IPCC (Intergovernmental Panel on Climate Change) (2021) Climate Change 2021 – The Physical Science Basis: Working Group I Contribution to the Sixth Assessment Report of the Intergovernmental Panel on Climate Change. Cambridge University Press.

2. Filho WL, Wall T, Mucova SAR, Nagy GJ, Balogun AL, Luetz JM, et al. (2022) ‘Deploying artificial intelligence for climate change adaptation’, Technological Forecasting and Social Change, 180:121662.

3. Bloomfield PS, Clutton-Brock P, Pencheon E, Magnusson J and Karpathakis K (2021) ‘Artificial Intelligence in the NHS: Climate and Emissions’, The Journal of Climate Change and Health, 4:100056. https://doi.org/10.1016/j.joclim.2021.100056

4. Bratan T, Heyen NB, Hüsing B, Marscheider-Weidemann F and Thomann J (2024) ‘Hypotheses on environmental impacts of AI use in healthcare’, The Journal of Climate Change and Health, 16:100299.

5. Foundation for Rural Regional Renewal (n.d.) Community Led Climate Solutions. Foundation for Rural Regional Renewal website https://frrr.org.au/funding/disaster-resilience-and-climate-solutions/community-led-climate-solutions/, accessed 10 August 2024.

6. Foundation for Rural Regional Renewal (n.d.) Strengthening Rural Communities: Prepare and Recover. Foundation for Rural Regional Renewal website https://frrr.org.au/funding/place/src-prepare-recover/, accessed 10 August 2024.

7. Quinn P, Gibbs L, Blake D, Campbell E, Johnston D and Ireton G (2021) Guide to Post-Disaster Recovery Capitals (ReCap). Bushfire and Natural Hazards Cooperative Research Centre, Melbourne, Australia. www.phoenixaustralia.org/disaster-hub/wp-content/uploads/2021/05/ReCap_2022.pdf

8. Victoria State Government Department of Health and Human Services (2020) Tackling climate change and its impacts on health through MPHWP - Guidance for local government 2020. Victoria State Government Department of Health website www.health.vic.gov.au/publications/tackling-climate-change-and-its-impacts-on-health-through-municipal-public-health-and, accessed 10 August 2024.

9. Zack T, Lehman E, Suzgun M, Rodriguez JA, Celi LA, Gichoya J, et al. (2024) ‘Assessing the potential of GPT-4 to perpetuate racial and gender biases in health care: a model evaluation study’, Lancet Digital Health, 6(1):e12−e22. https://doi.org/10.1016/S2589-7500(23)00225-X
