Trusted and safe

Artificial intelligence (AI) is rapidly becoming integral to how councils manage rising demand, deliver services and support communities. 

With stretched workforces, complex caseloads and growing expectations for personalised public services, councils are now exploring how AI can streamline customer contact, improve decision-making and free up staff for frontline work.

Local government’s breadth of services makes it one of the most fertile environments for innovation. However, because councils serve everyone, including people receiving social care, temporary accommodation or crisis support, the bar for AI use must be high. 

Efficiency alone is not enough. AI systems used in public services must be safe, transparent and trusted. 

The Horizon IT scandal affecting UK sub-postmasters and the Dutch childcare benefits case show how poorly governed automation can cause serious harm, particularly to our most vulnerable residents. 

As AI begins to influence decisions in social care, housing, planning and benefits, councils must embed responsible use and lead by example. 

Local government understands cumulative risk better than any other part of the public sector, and its democratic mandate gives it a unique responsibility to get this right.

In recognition of this crucial role, the LGA has been appointed to the UK Government Digital Service’s (GDS’s) Responsible AI Advisory Panel. 

Established by the GDS within the Department for Science, Innovation and Technology, the panel brings together expertise from the public sector, industry, academia and civil society to guide the responsible use of AI in government.

The LGA’s presence ensures the realities of local service delivery, community needs and democratic accountability are embedded in the national conversation from the outset.

Councils are already laying strong foundations. 

The LGA’s 2025 State of the Sector AI Survey shows local government is undergoing rapid change, with 95 per cent of participating councils using or exploring AI. 

Almost half (49 per cent) of current AI use sits within health and social care, while 38 per cent supports advice, customer contact and benefits-related services. While this demonstrates ambition, it also highlights exposure to sensitive environments in which mistakes can have serious consequences. 

Encouragingly, councils are responding with strengthened governance: nearly half are adapting existing data-protection frameworks to cover AI; 41 per cent have introduced AI-specific policies; and 38 per cent have appointed a senior responsible owner for AI to ensure oversight and accountability.

Democratic engagement is also emerging as a defining strength. The Liverpool City Region AI Charter is ensuring residents’ voices guide decision-making, while North Yorkshire and Dorset are embedding ethical AI principles directly into corporate governance, demonstrating what responsible innovation looks like in practice.

Despite this progress, responsible AI adoption remains uneven across councils, compounded by a fragmented national policy landscape. Councils need clearer expectations, practical tools and better coordination from government to ensure AI becomes a tool for inclusion rather than exclusion.

Over the coming months, the LGA will engage its membership to feed council perspectives into the panel’s work, ensuring recommendations are grounded in the complexity and diversity of public services at the local level. 

As the UK shapes the next generation of digital public services, the inclusion of local government at the heart of AI policy making will strengthen accountability, transparency and public confidence in this significant transformation.
