Date: Aug 30, 2023
Our work related to the UK AI Safety Summit
Topic/Area: Artificial Intelligence

The UK government has committed to holding an international summit on AI safety at Bletchley Park this November.
Over the past couple of months, I’ve been having various conversations inside and outside of the UK government to better understand what the summit might aim to achieve, to form my own views on what it should aim to achieve, and to identify ways we and others can help make the summit a success.
A lot of the details here are still in flux, but I wanted to start communicating more publicly about my thoughts around the summit, and the work we at CLTR are doing related to it.
I’m particularly excited about the idea that the UK AI Safety Summit could start to move AI governance discussions and commitments in the direction of greater enforcement and accountability: i.e. beyond companies making broad, voluntary commitments to safety and ethics, towards a world where governments can hold those companies meaningfully accountable for safe and responsible AI development.
This is a broad aim – some more concrete examples of what this could look like in terms of summit outcomes include:
To support this move towards enforcement in AI governance, it is particularly important that independent experts and civil society play a central role in the AI summit and preparations for it. I recently wrote a policy brief with recommendations in this direction, available here. I’m concerned that by default, the summit will centre discussions between the CEOs of AI companies and national leaders. But the event will have a better chance of resulting in substantive progress in AI governance if independent third parties, who can provide an important source of expertise and a counterweight to industry interests, are also centrally involved. I think this is as important in the lead-up to the summit as it is at the event itself: many of the substantive agreements and commitments will likely be reached beforehand, and independent scrutiny will be essential to getting these right.
I also think that, to establish real leadership internationally, the UK needs to be clear about how it plans to update its domestic regulatory approach.
We’re speaking to a range of teams across the Department for Science, Innovation and Technology (DSIT) to better understand how we can best support work related to the summit.
Some work we’re currently doing at CLTR includes:
If you’re working on or interested in something related to these topics, or anything else discussed in this post, we’d love to hear from you: you can get in touch at info@longtermresilience.org.