From research outcome to agency change: mapping a learning trajectory of opportunities and challenges

Dr Christine Owen1,4, Dr Noreen Krusel2, Dr Chris Bearman3,4 and Associate Professor Benjamin Brooks1,4

  1. University of Tasmania, Hobart, Tasmania.
  2. AFAC, Melbourne, Victoria.
  3. Central Queensland University, Appleton Institute, Adelaide, South Australia.
  4. Bushfire and Natural Hazards CRC, Melbourne, Victoria.

Peer-reviewed Article

Submitted: 22 May 2017. Accepted: 4 September 2017.


Abstract

A key theme within the Bushfire and Natural Hazards CRC Cognitive Tools and Decision Making project is to understand how practitioners learn from research outcomes and how they can use them. Translating research outcomes into practice is a complex process and can be beyond the control of the project team and end-user representatives. Using 'lessons' terminology, it is suggested that observations and insights can be identified from reviewing research outcomes. However, the lessons that are derived from insights are only 'learnt' when they instigate sustainable change (Commonwealth of Australia 2013). To understand how to create the best conditions for organisational learning, a literature review of learning lessons in emergency management was conducted. Practitioners were also interviewed to understand the contexts and challenges faced in implementing research insights and in facilitating change. This paper presents two studies that examine aspects of organisational learning. In the first study, the challenges to learning from action and experience and from reflection and planning are examined. In the second study, the systems for learning used in emergency services organisations are considered and a preliminary theory of research utilisation maturity is proposed. The initiatives reported help to maximise the value of research and support innovation through utilisation.

Article

Introduction

There is a well-established literature base pointing to the need to better understand how research outcomes and the knowledge learnt from them are embedded in practice (Atkinson, Crawford & Ward 2006, Elliott & Mihalic 2004, Eskerod & Skriver 2007, Milton 2010, Williams 2008). Learning from research projects is often hard (Atkinson, Crawford & Ward 2006, Duffield & Whitty 2016, Williams 2008). Williams (2008, p. 262) argues that there is a need for ‘... wider research into how lessons can be disseminated throughout an organisation and incorporated into organisational practice’. Emergency management is no exception (Donahue & Tuohy 2006).

Drupsteen and Guldenmund (2014) suggest that learning starts with the collection of information, followed by processing and storing. However, it is necessary to get beyond simply processing and storing ‘lessons’; that is, to move from identifying lessons to implementing them. While systematic approaches to managing identified lessons are important, identification alone is not sufficient to bring about improvements. Learning lessons from disasters and crises is important (Borell & Eriksson 2008, Brower et al. 2009). However, recording, storing and sharing identified lessons does not necessarily mean that anything has in fact been (or will subsequently be) learnt (Rostis 2007, Deverell & Hansén 2009). Learning cannot be said to have occurred unless there is change.

Given the limited understanding of how learning in organisations actually occurs, and the suggestion that such learning is difficult, it is timely to examine the processes of learning and to analyse the factors that enable and constrain learning within organisations. Doing so would support greater levels of research utilisation.

Owen and colleagues (2015) outlined the findings from an environmental scan identifying what organisations were doing to identify learning opportunities and the changes needed in practice. The report illustrated how much of that work is structured around what organisations characterised as ‘lessons learnt’. Therefore, this paper considers use of research outputs within a lessons learnt framework.

To understand the ways in which agencies might review, assess and learn from research outputs, interviews were conducted with end users to ascertain their views on the opportunities and threats that would need to be managed if research outputs are to be embedded into organisational learning (Study 1). These findings are discussed in relation to a parallel study (Study 2) of research utilisation practices employed by organisations in the fire and emergency services sectors.

Study 1

Method

A total of 18 interviews were conducted with personnel engaged in operational roles in emergency management and who have responsibility for lessons management processes. The median level of experience in emergency management was 20 years. Interview questions included:

  • How would you characterise how this agency learns?
  • What kind of processes do you have in place to facilitate organisational learning?
  • What do you believe enables and constrains learning and change?
  • What do you perceive will be the opportunities and threats to support implementation from the research?

Interviews lasted between 25 and 55 minutes and were recorded. The interviews were coded in a top-down, theory-driven manner based on learning cycles of action and experience, followed by reflection and planning (e.g. Kolb 2014, Duffield & Whitty 2016).
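To make this deductive step concrete, a minimal sketch of keyword-based a priori coding is given below. The code labels follow the learning-cycle framework described above, but the keyword lists and the excerpt are hypothetical illustrations, not the instrument used in the study.

```python
# A minimal, hypothetical sketch of deductive (theory-driven) coding of
# interview excerpts against the two a priori learning-cycle categories
# used in Study 1. Keyword lists and the excerpt are illustrative only.

A_PRIORI_CODES = {
    "action_and_experience": ["incident", "event", "deployment", "on the ground"],
    "reflection_and_planning": ["debrief", "review", "plan", "lesson"],
}

def code_excerpt(excerpt: str) -> list[str]:
    """Return every a priori code whose keywords appear in the excerpt."""
    text = excerpt.lower()
    return [code for code, keywords in A_PRIORI_CODES.items()
            if any(kw in text for kw in keywords)]

excerpt = "After the debrief we changed how crews plan the next deployment."
print(code_excerpt(excerpt))
# ['action_and_experience', 'reflection_and_planning']
```

In practice this coding was done by researchers reading full transcripts; the sketch only shows how a fixed, theory-derived code set is applied to the data rather than being induced from it.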

Results

Learning from action and experience

The participants spoke of how applying research tools in an emergency context can be challenging, in part, because of the unique characteristics presented in an event:

[We] Can’t always have a check sheet ‘this is how it will work’. Things are dynamic. Things will change. (Interviewee with 3-15 years’ experience)

In addition, while experiences may be similar, each event is based on real-time dynamics and the specifics of each incident. For learning opportunities from research tools to be generalised, the experience from individual cases needs to be systematically documented and the features reviewed to identify further applications.

Learning through reflection and planning

Given the unanticipated nature of managing emergencies it is perhaps not surprising that some personnel see the emergency services culture as a largely reactive one that presents challenges to learning. One end user indicated that:

We are such a reactive culture. If something doesn’t work the first time, we tend to just throw it out. We don’t ask why didn’t it work; just ‘get me a new one’! (Interviewee with 18-20 years’ experience)

The implication here is that research utilisation initiatives need to ensure there is attention to evidence-based change management. Trials must be carefully managed to avoid this kind of premature dismissal.

Planning based on reflection is influenced by the ways people make sense of their experiences so that generalisations can be made. One of the threats to making sense out of experience and reflection is that there is no universally accepted approach to the development or content of debriefs and reviews.

Some After Action Reviews are really comprehensive and useful. Others are hard to make out what they [the participants] are driving at. (Interviewee with 6-16 years’ experience)

Collective sense-making based on reflecting on experience and action requires systematic processes so that alternatives can be envisaged and the implications of other organisational procedures, policy and doctrine can be fully considered.

These qualitative findings on enablers of, and challenges to, learning are supported by Study 2, which reports on a survey of agency practices associated with research utilisation.

Study 2

The Bushfire and Natural Hazards CRC and AFAC have a continuing interest in enhancing research utilisation. Their stakeholders are regularly surveyed to assess how they use research, in order to gain maximum benefit from their investment. Surveys were conducted in 2010, 2012, 2014 and 2016 (Owen 2011, 2014; Owen, Krusel & Bethune 2016). The early surveys revealed opportunities to improve communication, engagement and collaboration, and subsequent research utilisation strategy focused on these areas at the individual and industry-wide levels. The 2016 research utilisation survey included an opportunity for respondents to comment on the plans agencies have in place to keep abreast of research. These comments form the basis of the study reported here.

Method

The January 2016 survey was distributed to 50 emergency services agencies. Agency contacts were requested to distribute the survey to 5-15 people, using the following stratified sample:

  • Senior management: the most senior person in the organisation responsible for the following areas:
    • communication
    • training and development
    • operations
    • community safety
    • knowledge management, innovation, research.
  • Five middle managers including regional operational and non-operational personnel (e.g. district managers).
  • Five people in operational or front-line service positions (e.g. volunteers, field operations personnel, community education officers, training instructors).

The purpose of this sampling method was to target personnel who could reasonably be expected to:

  • have an understanding of the strategic planning of the agency
  • have some awareness and involvement in Bushfire CRC and/or Bushfire and Natural Hazards CRC activities
  • be responsible for implementing any changes needed based on research evidence.

In the 2016 sample, 266 responses were received from 29 agencies, yielding a response rate of 53 per cent, which is appropriate for online surveys of this type (Baruch & Holtom 2008). Respondents had a median of 22 years of experience in the sector and 13 years in their current agency. Of the participants who answered the question about their position in the agency, 28 (15 per cent) were in senior management positions, 126 (66 per cent) were in middle-management roles and 37 (19 per cent) had front-line responsibilities.
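For readers tracing these figures, the arithmetic can be reproduced as below. The number of surveys distributed is not stated in the paper; it is back-calculated here from the reported response rate, so treat it as an inference rather than a published figure.

```python
# A sketch of the arithmetic behind the reported Study 2 figures. The number
# of surveys distributed is back-calculated from the 53 per cent response
# rate and 266 responses, so it is an inference, not a figure from the paper.

responses = 266
response_rate = 0.53
implied_distributed = responses / response_rate
print(f"Implied surveys distributed: {implied_distributed:.0f}")  # ~502

# Role breakdown among those who answered the position question.
roles = {"senior management": 28, "middle management": 126, "front line": 37}
answered = sum(roles.values())  # 191

for role, n in roles.items():
    print(f"{role}: {n} ({n / answered:.0%})")
# senior management: 28 (15%)
# middle management: 126 (66%)
# front line: 37 (19%)
```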

Results

A total of 168 participants provided comments on the processes agencies have in place to keep up to date with research. Initial thematic analysis of the data suggested that participants were reporting qualitatively different types of processes. A sample of the comments was coded and discussed between two of the authors, drawing on research utilisation practice and innovation in other sectors, for example health (Baernholdt & Lang 2007, Nutley & Davies 2016). Based on this sample, a series of codes was developed and applied to a further 30 comments. Once the coders achieved an inter-rater reliability of 88 per cent, the remaining comments were coded and all responses were reviewed and discussed.
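The article does not specify the agreement statistic used, so the sketch below assumes simple percentage agreement between the two coders. The code assignments are hypothetical, chosen only so the result matches the reported 88 per cent.

```python
# Hedged sketch of the inter-rater reliability check described above,
# assuming simple percentage agreement between two coders. The code
# assignments below are hypothetical; the study reports 88 per cent.

def percent_agreement(coder_a: list[int], coder_b: list[int]) -> float:
    """Proportion of items to which the two coders assigned the same code."""
    assert len(coder_a) == len(coder_b)
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return matches / len(coder_a)

# Illustrative maturity-level codes (1-4) for a small sample of comments.
coder_a = [1, 2, 2, 3, 4, 1, 2, 3]
coder_b = [1, 2, 2, 3, 4, 1, 3, 3]
print(f"Agreement: {percent_agreement(coder_a, coder_b):.0%}")  # Agreement: 88%
```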

Table 1 details the four codes that emerged from the data, together with examples from the comments. The total number of responses coded at each utilisation maturity level is shown with each code.

Table 1: Research utilisation maturity codes and examples.

Level 1 (N=39; 24%): Systems are ad hoc and unsystematic. Attempts to keep up to date with research depend on individual effort.

  • ‘Undefined, not clearly communicated within communications. Nil business unit assigned to research and development.’
  • ‘…the onus for keeping up to date is largely upon individuals maintaining an interest, or subscribing to emails.’

Level 2 (N=63; 39%): Some systems and processes are documented, which enables research to be disseminated. There is little or no evidence of analysis or impact assessment.

  • ‘We have two people that email CRC updates to staff.’
  • ‘Lots of material is distributed via our portal and email to keep staff and volunteers informed.’

Level 3 (N=35; 22%): There are established processes in place for reviewing research (e.g. dissemination and review either through job responsibilities or an internal research committee). There is no evidence of how the findings are translated or connected to operational activities.

  • ‘Developed a research committee.’
  • ‘SMEs appointed as capability custodians to ensure up to date best practice.’

Level 4 (N=23; 14%): There is evidence of active connections between research and operational activities. Operational and strategic decisions are informed by assessing research using formal research utilisation processes. These processes and systems are widely understood and embedded in multiple areas of practice.

  • ‘… a process of ensuring results are read by key specialist staff involved in program design and delivery, are interpreted and analysed for their implications and relevance and then used to inform decision-making and strategy through numerous internal fora.’
  • ‘Alignment of evidence-based decision-making in the planning phases of annual planning and the development of indicators around causal factors that inform emergent risk.’
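As a minimal sketch only, the rubric and the published counts can be represented in code to reproduce the percentages above. The one-line level summaries are abbreviations of the table's wording; note that the published counts sum to 160 of the 168 comments received.

```python
# Hedged sketch: Table 1's four-level rubric as a data structure, with the
# published counts tallied into the reported percentages. Level summaries
# are abbreviations of the table's descriptions, not the original wording.

MATURITY_LEVELS = {
    1: "Ad hoc; keeping up to date depends on individual effort",
    2: "Documented dissemination; little analysis or impact assessment",
    3: "Established review processes; findings not linked to operations",
    4: "Formal utilisation processes embedded across areas of practice",
}

counts = {1: 39, 2: 63, 3: 35, 4: 23}
coded_total = sum(counts.values())  # 160 of the 168 comments received

for level, n in counts.items():
    print(f"Level {level}: {n:>3} ({n / coded_total:.0%}) {MATURITY_LEVELS[level]}")
# Level 1:  39 (24%) ...
# Level 2:  63 (39%) ...
# Level 3:  35 (22%) ...
# Level 4:  23 (14%) ...
```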

These preliminary findings indicate that it may be possible to develop indicators of organisational maturity pertaining to research utilisation. These findings have been reviewed and discussed with practitioners through the AFAC Knowledge Innovation and Research Utilisation Network. Over the course of three meetings a working model for research utilisation maturity has been developed. A summary of the indicative types of items included is presented in Figure 1.

Figure 1 highlights five core organisational elements identified by stakeholders as important in enhancing utilisation practice. This framework can potentially be used to help end users assess the utilisation strategies for research outputs.

Conclusion

These findings suggest that more attention is required on how organisations learn, not just from their own experience but also from research outputs. Linking the insights gained from the interviews with the template for research utilisation maturity allows evaluation and review of the ways research outputs may be systematically embedded and used by organisations.

These findings suggest that future research should tease out the elements that comprise learning and innovation cultures and examine what skills, processes and structures are needed. Further work is required to identify how perceived barriers can be overcome in order to strengthen cultures of learning within agencies and across the industry.

The literature review and research interviews identified many suggestions for improving organisational learning. These included embedding roles and responsibilities for learning, review and follow-up; monitoring and measuring change; and linking learning and practice. They also suggest that crises can offer opportunities for learning by exploiting the political attention they attract, and that knowledge can be drawn from low-complexity, low-risk events. Another key idea is to invest in quality rather than quantity: fewer exercises, but better training well targeted at clear objectives.

Given the significant scrutiny placed on organisations in the emergency services sector, and the pressure to demonstrate an evidence base for practice, a strong learning culture would seem essential. As reported, work to enhance understanding of what enables and constrains the assimilation of research into practice is already underway. The next steps will be to trial and evaluate a framework for utilisation maturity so these insights may be generalised to other parts of the sector.

Acknowledgements

The research for Study 1 was supported with funds from the Bushfire and Natural Hazards CRC. Research for Study 2 was supported by AFAC.

References

Atkinson R, Crawford L & Ward S 2006, Fundamental uncertainties in projects and the scope of project management. International Journal of Project Management, vol. 24, pp. 687-698.

Baernholdt M & Lang NM 2007, Government chief nursing officers’ perceptions of barriers to using research on staffing, International Nursing Review, vol. 54, pp. 49-55.

Baruch Y & Holtom BC 2008, Survey response rate levels and trends in organizational research. Human Relations, vol. 61, no. 8, pp. 1139-1160.

Borell J & Eriksson K 2008, Improving emergency response capability: an approach for strengthening learning from emergency response evaluations, International Journal of Emergency Management, vol. 5, no. 3/4, pp. 324-337.

Brower R, Choi S, Jeong H & Dilling J 2009, Forms of inter-organisational learning in emergency management networks, Journal of Homeland Security and Emergency Management, vol. 6, no. 1, pp. 1-16.

Commonwealth of Australia 2013, Australian Emergency Management Handbook 8, Lessons Management.

Deverell E & Hansén D 2009, Learning from Crises and Major Accidents: From Post-Crisis Fantasy Documents to Actual Learning in the Heat of Crisis, Journal of Contingencies and Crisis Management, vol. 17, no. 1, pp. 143-145. doi: 10.1111/j.1468-5973.2009.00574.x

Donahue A & Tuohy R 2006, Lessons we don’t learn: A study of the lessons of disasters, why we repeat them, and how we can learn them. Homeland Security Affairs, vol. 2, no. 2.

Drupsteen L & Guldenmund FW 2014, What Is Learning? A Review of the Safety Literature to Define Learning from Incidents, Accidents and Disasters, Journal of Contingencies and Crisis Management, vol. 21, no. 1, pp. 81-96. doi: 10.1111/1468-5973.12039

Duffield SM & Whitty SJ 2016, Application of the Systemic Lessons Learnt Knowledge model for Organisational Learning through Projects. International Journal of Project Management, vol. 34, no. 7, pp. 1280-1293.

Elliott DS & Mihalic S 2004, Issues in disseminating and replicating effective prevention programs. Prevention Science, vol. 5, no. 1, pp. 47-53.

Eskerod P & Skriver HJ 2007, Organizational culture restraining in-house knowledge transfer between project managers—a case study. Project Management Journal, vol. 38, pp. 110-122.

Kolb DA 2014, Experiential learning: Experience as the source of learning and development. FT Press.

Milton N 2010, The Lessons Learnt Handbook: Practical approaches to learning from experience. Chandos Publishing, Oxford, UK.

Nutley S & Davies H 2016, Knowledge Mobilisation: creating, sharing and using knowledge in Orr K, Nutley S, Russell S, Bain R, Hacking B & Moran C. (eds). Knowledge and practice in Business and Organisations, Routledge, London.

Owen C 2011, Report on research utilisation consultation, prepared for the Bushfire Co-operative Research Centre, Melbourne.

Owen C 2014, Report on research utilisation consultation, prepared for the Bushfire Co-operative Research Centre, Melbourne.

Owen C, Bearman C, Brooks B, Curnin S, Fitzgerald K, Grunwald J & Rainbird S 2015, Decision making, team monitoring and organisational performance in emergency management, report for the Bushfire and Natural Hazards Co-operative Research Centre, Melbourne.

Owen C, Krusel N & Bethune L 2016, Report on research utilisation review, AFAC and Bushfire and Natural Hazards Co-operative Research Centre, Melbourne.

Rostis A 2007, Make no mistake: the effectiveness of the lessons-learnt approach to emergency management in Canada, International Journal of Emergency Management, vol. 4, no. 2, pp. 197-210.

Williams T 2008, How do organisations learn lessons from projects–and do they? IEEE Transactions on Engineering Management, vol. 55, no. 2, pp. 248-266.

About the authors

Dr Christine Owen is an organisational behaviour and learning researcher at the University of Tasmania.

Dr Noreen Krusel is the Director of Research and Utilisation at AFAC and was the Manager of Research Utilisation with the former Bushfire CRC. She has 20 years of emergency services experience.

Dr Chris Bearman is a researcher and project leader for the Bushfire and Natural Hazards CRC decision-making, team monitoring and organisational learning project.

Associate Professor Benjamin Brooks is a human factors researcher and Senior Research Fellow in the Australian Maritime College at the University of Tasmania.