Date: Jan 21, 2025
Majority of the UK public say that the Government should be regulating AI technologies
A nationwide opinion poll reveals that the vast majority (78%) of the UK public believe that the government should be regulating AI technologies.
This comes in the week that the Government has released an ambitious “AI Opportunities Action Plan” to ensure AI benefits all of society. As the plan acknowledges, “well-designed and implemented regulation… can fuel fast, wide and safe development and adoption of AI”.
The poll, commissioned by the Centre for Long-Term Resilience (CLTR) and conducted by Public First, suggests that the public would welcome more details on how the UK government will “make sure this technology is safe”, as the Prime Minister promised this week. It found that the public are optimistic about AI contributing to economic growth, but are also concerned about how the Government will manage risks from AI:
- The vast majority (78%) of the UK public believe that the Government should be regulating AI technologies.
- Nearly half (47%) agree that AI tools will benefit the economy overall, versus a fifth (20%) who disagree.
- Nearly two thirds (64%) of the UK public say that addressing extreme risks from AI — including social biases, military escalation, and misuse by bad actors — should be one of the top priorities of the government.
- Nearly three-quarters (74%) of the UK public state that it’s important that boosting UK science and innovation is a policy priority for the UK, versus just 6% who say it’s unimportant.
- Respondents identified healthcare as the sector of the economy that can benefit most from regulated AI tools (chosen by 58% of respondents), followed by education and manufacturing.
- AI is now perceived to be one of the top three risks to have the largest impact on the world (34%), ranked only behind climate change (49%) and risk of war (48%) — rising from 10th place in 2022 when only 15% of the public placed it in their top three.
CLTR’s Head of AI Policy, Dr Jess Whittlestone, says:
“We’re excited to see so much ambition from the government to ensure AI is used to improve people’s lives across the country. This ambition will only be made possible if the technology is trusted and we’re able to manage the critical risks it poses.
This nationwide poll shows that the UK public understands that AI has potential to bring both huge benefits and substantial risks, and wants to be assured that the government is addressing critical risks. Well-targeted regulation of the most powerful AI models will therefore be key to the UK achieving its growth agenda and responding to the concerns of the UK public.”
The poll also looked more broadly at public perceptions of extreme risks — defined as high-impact, low-probability events with a global reach. It found that:
- Nearly two-thirds (65%) of the public say that the UK is facing extreme risks now, compared to 78% stating that the world is facing extreme risks.
- Three-quarters (75%) of the UK public respondents said they would hold the national government responsible for responding to extreme risks such as AI.
- Nearly half of respondents believe that there is a real possibility in the next fifty years that it will be easier to create new types of bio-engineered virus as a weapon (49%), and that there will be an increased risk of a laboratory leak of a new virus in scientific research (48%).
- Big majorities of the UK public think that the government should be spending significant resources on a range of extreme risks – namely, catastrophic climate change (69%), global pandemic (68%) and nuclear war (63%).
CLTR’s Chief Executive, Angus Mercer, says:
“The UK public clearly thinks that the country is facing extreme risks now, and it holds the government accountable for taking robust action. Such action would be a win-win: it would help keep the public safe, while thoughtful regulation will also help drive forward the government’s growth agenda.”
KEY AI POLLING RESULTS
- The vast majority of the UK public say that the government should be regulating AI technologies
78% of respondents say they think AI tools should be regulated by the government, compared to 12% who say AI tools should not be regulated by the government.
- Nearly two-thirds of the UK public say that regulating AI should be one of the top priorities of the government
64% of respondents say that regulating AI should be among the top priorities of the government.
- A clear majority of the UK public say that the government should be spending significant resources on preparing for uncontrolled AI – doubling since 2022
57% of respondents believe that the government should spend significant resources on this, up from a quarter in 2022.
- A majority of the public now thinks it’s realistic that an AI following human instructions exactly could unexpectedly harm millions.
54% of people said it’s realistic to imagine that an AI does exactly what a human tells it to do, but in a way that is highly unexpected and harms millions of people, up from 35% in 2022.
- AI is now ranked as one of the top three issues to have the largest impact on the world (34%), behind climate change (49%) and risk of war (48%).
Comparatively, the top three answers in 2022 were climate change (58%), economic inequality (26%), and risk of war (26%). The rise of artificial intelligence came 10th, with only 15% of the public placing it in their top three.
OTHER KEY FINDINGS
- The vast majority of the UK public have heard of certain types of extreme risks
Over 90% of respondents say they have heard of or understand phrases such as “climate change” and “artificial intelligence”.
- Nearly two-thirds of the public say that the UK is facing extreme risks now
65% of respondents state that the UK is facing extreme risks now, with nearly four-fifths of respondents (78%) stating that the world is facing extreme risks.
- A large majority of the UK public hold the government accountable for dealing with extreme risks
Three-quarters (75%) of respondents said they would hold the national government responsible for responding to extreme risks
- Big majorities of the UK public think that the government should be spending significant resources on a wide range of extreme risks
Substantial majorities of respondents state that the government should be spending significant resources to prepare for catastrophic climate change (69%), global pandemic (68%), nuclear war (63%), and uncontrolled AI (57%).
Notes to Editors:
Link to datasets
Media enquiries to Toby Orr: toby@shearwater.global / 07736 175311
Public First conducted an anonymous online survey of 2,011 UK adults from 28 October to 5 November 2024. All results are weighted using Iterative Proportional Fitting, or ‘raking’: results are weighted by interlocking age and gender, region, and social grade to nationally representative proportions. Public First is a member of the British Polling Council (BPC) and abides by its rules. For more information please contact the Public First polling team: polling@publicfirst.co.uk.
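For readers unfamiliar with the raking method mentioned above: it iteratively rescales each respondent’s weight so that the weighted sample matches known population margins on each weighting variable in turn. The sketch below is a minimal illustration only; the variables, categories, and target proportions are made up, and this is not Public First’s actual implementation.

```python
# Minimal sketch of Iterative Proportional Fitting ("raking").
# Each respondent's weight is repeatedly rescaled so that weighted
# category shares match target population margins, one variable at a time.
# All categories and targets below are illustrative, not from the poll.

def rake(units, margins, n_iter=200, tol=1e-12):
    """units: list of dicts mapping variable -> category for one respondent.
    margins: {variable: {category: target population proportion}}.
    Returns one weight per unit, rescaled so the average weight is 1."""
    n = len(units)
    weights = [1.0] * n
    for _ in range(n_iter):
        max_adj = 0.0
        for var, targets in margins.items():
            # current weighted total for each category of this variable
            totals = {cat: 0.0 for cat in targets}
            for w, u in zip(weights, units):
                totals[u[var]] += w
            wsum = sum(weights)
            # rescale every unit so weighted shares hit the targets
            for i, u in enumerate(units):
                factor = (targets[u[var]] * wsum) / totals[u[var]]
                weights[i] *= factor
                max_adj = max(max_adj, abs(factor - 1.0))
        if max_adj < tol:  # converged: all margins already match
            break
    scale = n / sum(weights)
    return [w * scale for w in weights]

# Toy sample that over-represents younger men; targets are 50/50 on both.
units = (
    [{"age": "18-34", "gender": "m"}] * 4
    + [{"age": "18-34", "gender": "f"}] * 2
    + [{"age": "35+", "gender": "m"}] * 2
    + [{"age": "35+", "gender": "f"}] * 2
)
margins = {
    "age": {"18-34": 0.5, "35+": 0.5},
    "gender": {"m": 0.5, "f": 0.5},
}
weights = rake(units, margins)
```

After raking, the weighted shares of each age band and gender match the 50/50 targets even though the raw sample did not. A real implementation (interlocked cells, trimming of extreme weights) is more involved.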