Shape of a globe

Date: Apr 15, 2023

Why we started CLTR

Five years ago this month, Angus Mercer and Sophie Dannreuther began talking about career impact, policymaking, and some of the key emerging challenges that governments would face in the coming years. It was the start of a conversation that would result, a year later, in the establishment of the Centre for Long-Term Resilience.

In this blog, Angus and Sophie look back on their motivations for founding CLTR. Given the increasing levels of scrutiny that effective altruism and longtermism are receiving in the media, the blog particularly seeks to explain the degree to which these ideas served as an inspiration for creating CLTR. It also explores the reservations that CLTR’s founders have about certain aspects of these movements.

Angus Mercer, Chief Executive:

I was earning my civil service stripes at the UK Department for International Development (DFID) in late 2018 when Sophie and I first started to explore the idea of establishing a specialist think tank focused on extreme risks.

It certainly wasn’t the obvious next career move. I’d only been at DFID for just over a year. As a pretty new civil servant and an even newer dad, there were certainly more stable and conventional paths to take.

Five years on, it’s interesting to look back and reflect on why we started CLTR.

Effective altruism (EA) was clearly one important source of inspiration. The key idea that resonated with me was that of using data, evidence and sound reasoning to help have a positive impact on the world – and in particular the focus on important, neglected and solvable challenges.

I saw people in the EA community donating their time, a significant proportion of their earnings, and in some cases even their kidney in pursuit of helping others – often to help extremely vulnerable people who they would never meet in person. I found that pretty inspiring. I also saw first-hand during my time at DFID how rigorous use of data and evidence could help save the lives of some of the most vulnerable people in the world.

Back in 2018, extreme risks – and government risk management generally – felt to me like an extremely important policy area. It also seemed to be highly neglected – there was every incentive for governments to not give this area the attention it deserved, and instead to focus on issues that were more pressing in the short term. To my mind, starting a think tank to help governments think about how to better manage these risks could be extremely valuable.

The core EA idea of using data and evidence to make progress on important, neglected and solvable challenges was therefore an important part of CLTR’s origin story.

But as I look beyond this core idea and learn more about some of the broader aspects of culture and behaviour that have emerged within EA, there is quite a lot that I do not support or identify with.

Recent reports of women experiencing harassment and sexual assault within EA, and accusations that the community has fostered an environment in which such behaviour is tolerated, are deeply troubling and need to be taken very seriously. There also seems to be a tendency for some people within EA to strive for the absolute highest ‘expected value’ outcome in all circumstances, and to use this ambition to justify highly risky – or even downright immoral – behaviour. And the culture within parts of EA of working in a silo rather than seeking out wider expertise is something I have personally found problematic.

When it comes to longtermism, I agree with the core idea that future generations matter – and that we should see their interests as a key moral priority. I care deeply about what the world will look like for my daughters, for all those in their generation, and for their children and grandchildren.

But longtermism clearly doesn’t have a monopoly on caring about the welfare of future generations. And I think there are good theoretical and practical reasons to be wary of strong interpretations of longtermism, particularly in a policy context.

On the theoretical side, if longtermist ideas are taken to their extreme, they risk elevating concerns about future people, and hypothetical future risks to those people, above the very real and pressing concerns of people alive today. This is not a position that I support (and my understanding is that this is not a position that all longtermists support either).

On the more practical side, the further out we look in terms of our time horizon, the less convinced I am that we can say with any confidence what concrete policy steps we can take right now to try to improve the world or make it safer.

And at CLTR it is these concrete policy steps that we are most interested in.

When we talk about governments taking a longer-term view and boosting their resilience to extreme risks, we’re very much not talking about taking a thousand- or million-year view. We’re talking about the concrete, actionable steps that governments can take to ensure that over the next five, 10 or 20 years, we become much more resilient to the threats posed by extreme risks, like AI and biological risks. This may not sound like “the long term” to those involved in longtermism, but to those of us in and around the world of government policy, 10 or 20 years is very much a long-term time horizon.

To achieve this long-term resilience to extreme risks, we need to build a coalition of support in the UK and in the wider world.

EA was Sophie’s and my first exposure to the world of extreme risks, but these days we don’t see CLTR as an EA or a longtermist organisation.

Firstly, this is because we are trying to recruit people with a broad range of expertise, beliefs and worldviews, and CLTR is as much theirs as it is ours.

Secondly, it is because of the personal misgivings we have about important aspects of EA and longtermism, such as the concerns mentioned above around a culture of justifying risky or immoral behaviour in the name of impact, and reports of a hostile culture towards women.

And thirdly, it is because we want as diverse a group of people as possible working with us to boost resilience to extreme risks, to help ensure that our organisational goals, team values and policy ideas are as robust as possible.

This is really important to us – if we are going to achieve our mission of transforming global resilience to extreme risks, we need to be an organisation that reflects the expertise, ideas and values of everyone working in this space. This includes:

  • The academic world of global catastrophic and existential risk;
  • The private sector risk management community;
  • Social justice communities (in particular, communities who champion and give voice to those likely to suffer the most from extreme risks);
  • Those working in the areas of technology governance, civil service reform and global health security.

I’m proud of the steps we are taking to increase the range of people and organisations in our network. On the academic side, for instance, we’ve worked with experts at Oxford, Cambridge, Bath, Birmingham, King’s College London, Pennsylvania, Georgetown and MIT over the last year. We know we need to further expand and diversify the range of institutions we partner with, which we are actively doing. I’m also excited that we’re in the process of expanding and diversifying our board and our funding sources, to ensure they better reflect who we are as an organisation.

We’re in the early stages of a journey at CLTR but our destination is clear – we want to play our part in creating a safer world by helping governments be far better prepared for future extreme risk events.

And we want this to be a mission that people of many different backgrounds, experiences and worldviews will help shape, contribute to and rally behind. That means continuing to work proudly with extreme risk experts whose values we share, both inside and outside the EA community. At a time when the world is grappling with AI, pandemics and other extreme risks, this has rarely been as important as it is today.

Sophie Dannreuther, Director and Head of External Affairs:

Just as Gus has said, I’m proud to be at a think tank working hard to address the global problem of extreme risks, which affect everyone in the world. I’m immensely proud of our team, who are passionate about keeping people as safe as possible and about working across sectors to do so.

The EA movement has been part of how I personally got here. Through learning more about data and evidence, and through discovering a community where many people are committed to charitable work, I became more confident and better able to work on areas that I think are important.

By reading articles and listening to talks from experts, it’s also where I initially learned about the risks posed by AI and the range of biological risks that we face. I’ve also found it motivating to be around other people who are working hard on complicated problems, many of whom I have met through the EA movement.

That said, our sole focus at CLTR is transforming global resilience to extreme risks, and trying to bring more policy attention to them. This isn’t an EA-specific mission – it’s a mission that is shared by everyone who works in this space and everyone who wants a resilient world.

I’ve had less experience of longtermism but I do know that I don’t agree with ends-justify-the-means approaches to building a better future, or with a narrow set of people determining what a good future looks like.

In my view, whilst many positive social movements start with a small group of people, the next step has to be broader engagement with others. We need the future to be shaped by a wide range of people, not just those with the most power.

That said, I do think we shouldn’t ruin future generations’ chances of having good, meaningful lives, just as people alive today deserve the chance to live them.

I’ve been saddened to see major problems emerge within EA and longtermism. I hope that these movements will take these problems seriously, and listen and engage more with critics and experts from other fields.
