Extreme risks and the UK National Resilience Strategy

Out of the wreckage of the Second World War, the UK transformed itself. It rebuilt its shattered economy. It founded the NHS. It helped establish international institutions like the United Nations, so that the world would never endure a tragedy on this scale again.

Human progress doesn’t come in straight lines. Instead, there are rare moments where big leaps are possible. These moments provide opportunities to achieve decades’ worth of progress in a matter of months.

Such an opportunity for a big leap may now be upon us. As the UK begins to emerge from Covid-19, which cost tens of thousands of lives and over £300 billion in 2020 alone, we have a rare and valuable opportunity to do what we did in 1945 and come back stronger.

The best way to take advantage of this opportunity is to understand how to prepare not just for pandemics, but for all the threats to which we are most vulnerable: threats like climate change, risks from AI, and nuclear war. We need our society and our Government to give these risks the level of attention their seriousness warrants. In short, Britain needs an insurance policy for the most extreme risks we face.

By extreme risks, we mean high-impact threats with a global reach. The most serious kind of extreme risks are ‘existential risks’: those which could lead to the premature extinction of humanity. One rough estimate of the likelihood of the world experiencing such an existential catastrophe over the next one hundred years is one in six, the same odds as Russian roulette. If this is even roughly right, it is an unsustainably high level of risk. We must bring it down by boosting our resilience to these threats.

To its credit, the Government seems to have understood this. It was encouraging to hear Penny Mordaunt speak this week about the UK’s ambition to become the most resilient nation on earth, and the importance of improving the Government’s understanding of the full range of threats we face, including extreme and existential risks.

To ensure this happens, we need to ensure that extreme risks are systematically understood and proactively managed across Government. Our new report, Future Proof, written in collaboration with a range of extreme risk experts, sets out the steps that are needed.

First, we need to mitigate the risk of another global pandemic. And since the next one could well be human-made, one key priority is to ensure that all synthesised DNA is screened for dangerous pathogens, and that all DNA synthesis machines are regulated. Without these controls, bad actors could too easily get their hands on dangerous or novel pathogens.

The UK is already at the forefront of ‘biofoundries’, which enable the rapid design, construction, and testing of genetically reprogrammed organisms for biotechnology applications and research. This is therefore a natural area in which the UK can take the lead.

Second, we must urgently improve the Government’s current risk management processes. Extreme risks need to be a central component of the Government’s planned National Resilience Strategy, and be firmly embedded in the upcoming AI strategy and biosecurity review.

Third, to truly be safe from extreme risks, the UK must lead the way internationally. If we become one of the first countries in the world to properly manage extreme risks across the board, we can create and lead a global network of experts sharing best practice across the world. This network could explore why some countries were better prepared for Covid-19 than others, and ensure that all countries’ risk assessments and foresight programmes draw on global expertise.

As we emerge from Covid-19, we must seize this opportunity for transformational change, just as we did in 1945. The Government’s new-found attention to extreme risks provides grounds for optimism. What we need now is a concrete plan of action. We owe this to those who have come before us, those whose lives were lost to Covid-19, and to those who are yet to come.

Guest author: Toby Ord. Toby is a senior research fellow in philosophy at Oxford University, and author of The Precipice: Existential Risk and the Future of Humanity.

Sophie Dannreuther is a Director at the Centre for Long-Term Resilience, and a Research Affiliate at Cambridge University’s Centre for the Study of Existential Risk.
