UK government needs a chief risk officer to boost resilience, experts say

By Josh Lowe on 04/06/2021 | Updated on 04/06/2021
Resilience after COVID-19: the government must seize the opportunity to prepare for extreme risks. Credit: Matthew Prior/Wikimedia Commons

The UK government should adopt a “three lines of defence” model and establish a chief risk officer (CRO) to improve the country’s resilience, experts have suggested.

Future Proof, a new report from the Centre for Long Term Resilience, considers how the UK government can improve the way it manages extreme risks in the wake of the COVID-19 pandemic.

“While the scale of national tragedy is alive in our minds, the government must seize this opportunity, and ensure we are much better prepared for the next extreme risk event that will devastate lives and economies on a global scale,” the report says.

The authors focus on two specific areas of risk – biosecurity and artificial intelligence – before outlining cross-cutting policy recommendations to boost resilience at the system level, including ideas which could be valuable to officials around the world.

A new approach to risk

The authors suggest the government adopt a three lines of defence model for managing risk, as is widespread in the corporate sector. The first line would be risk ownership units embedded in individual departments.

The second line of defence would be an office of risk management led by a CRO with specialist expertise, who would hold overall responsibility for risk management in government.

The third line would be a “National Extreme Risks Institute”, acting as an independent advisory and audit function. The body would conduct “independent, evidence-based assessments of extreme risks” and would allow for “a greater focus on risks that are low-probability but highly destructive.”

One possible drawback of such a structure is that risk management could become siloed, the report concedes. “But without a new CRO, there is no single point of responsibility for risk management. This means that it tends to be deprioritised amidst the ‘tyranny of the urgent’,” it notes.

“Red teaming”, in which experts run scenario exercises to stress-test the UK’s resilience in the face of different challenges, should also be “normalised” across government, the report says.

Collaboration across departments and countries

Governments must sharpen their focus on resilience across departments, and must work together on the global stage, the report argues.

The UK cannot address transnational challenges alone, the authors note. “As countries begin to form their longer-term policy responses to COVID-19, there may never be a better moment to put extreme risks at the top of the international agenda and go beyond simply ‘preparing to fight the last war’ of pandemic preparedness,” the report says.

Recommendations for internationally coordinated action include encouraging governments to establish long-term extreme risks spending commitments, modelled on existing examples like NATO member states’ commitment to spend 2% of GDP on defence. The UK could also push for countries to fund a “Crisis Lookout” function.

Biosecurity and AI

While COVID-19 has drawn attention to biological threats, the report’s authors argue, policymakers must avoid preparing solely for further naturally occurring pandemics. Accidental and deliberate biological threats also pose serious risks.

The report targets several recommendations at the UK government. Among these are tasking a single body with preparing for the full range of possible biosecurity risks; launching a prize to incentivise the development of “clinical metagenomics”, which can identify new pathogens from just a few infected patients; and ensuring all DNA synthesis is screened for dangerous pathogens.

Meanwhile, the report argues, the risks from AI are significant. The technology could be misused by human actors; it could cause accidents when systems behave in unintended ways; and widespread, long-term deployment could drive harmful changes in society.

Recommendations for the government to mitigate these risks include establishing “capacity to anticipate and monitor AI progress and its implications for society”; bringing more relevant technical expertise into government; and keeping AI systems entirely out of nuclear weapons command, control and communications infrastructure.
