Why are UK schools being slow to adopt AI?
- Mark Hanley-Browne
- Feb 24
- 4 min read
During my visit to the Bett Show in London in January, I was impressed by how many teachers, pupils and governors were in attendance, showing huge interest and enthusiasm for what was on offer.
So why, given all this enthusiasm, are schools across the country being slow to adopt AI?
Having discussed this with a few colleagues in the education sector, and having read some interesting recent reports on the subject, I have identified the following themes.
1. Uneven access to training and expertise
Private schools, in general, seem to be moving faster on AI adoption than many schools in the state sector, largely because their staff are receiving more AI training. In fact, a recent study by the Sutton Trust suggests that in 2025 teachers in private schools were more than twice as likely to have had formal AI training as state-school teachers (45% vs 21%). Without training, teachers are understandably hesitant to integrate tools into their teaching that they don’t fully understand.
2. A widening AI divide between schools within the state sector
The Sutton Trust reports highlight a growing gap between schools within the state sector, particularly affecting those in disadvantaged areas. This isn’t about teachers’ willingness to engage - it is about capacity. Some schools have fewer resources than others, which means they lack the devices, the infrastructure and the staff-development budget needed to empower their teachers to use AI meaningfully in their lessons.
3. Infrastructure issues
The Department for Education acknowledges that many schools still lack the right technology to use AI effectively. Reliable devices, smooth-running networks and a robust digital platform are prerequisites for AI adoption, but many schools simply don’t have these in place yet (see https://roadmap-for-modern-digital-government.campaign.gov.uk/ai/ai-in-education/).
4. Caution around safety, ethics, and governance
There are other factors operating here too. School leaders and governors are worried about data protection, safeguarding and the unreliability of some AI-generated content. I have heard several early adopters argue that the biggest risk here is doing nothing. But, unsurprisingly, many school leaders feel they need clearer guidance before acting.
5. Workload pressures and limited time to experiment
Ironically, AI can and should be reducing the workload on teachers. Early adopters are already using it for planning, communication and to streamline admin tasks. But teachers need time to learn how to use these tools before they can fully benefit from them. And the problem is that time is the one thing that teachers don’t have in abundance.
6. Lack of consistent national strategy in practice
The government is ambitious about AI in education. But translating policy into day‑to‑day classroom practice takes time. Schools want clearer evidence on what works, how to implement it safely and how to measure the impact. The DfE is now working on this with schools - but progress has been slow.
In my day-to-day work with The Team Lab I meet a lot of Edtech founders, and I can see their frustration first hand. This is a shame, because there are many excellent products on the market that will undoubtedly save teachers time, give students out-of-hours assistance (AI tutors such as Solvely.ai) and help level the playing field for SEND pupils. I have also seen some excellent language-learning solutions that could help reclaim ground lost in Modern Languages teaching over recent decades. However, until these wider issues are addressed, Edtech companies will continue to find it hard to gain traction with their AI solutions in schools.
So what can we do about this? As is often the case, we can learn a lot by seeing how other countries are dealing with this challenge.
Given its strong Edtech ecosystem and private-sector innovation, set alongside large-scale investment in personalised learning platforms, it is no surprise to see the USA leading the pack in AI adoption in schools, according to a recent report by ESS (https://essfeed.com/top-10-countries-adopting-ai-in-education/).
The US is followed closely by China – and, again, no surprises there.
India is also moving fast with AI adoption in schools, integrating AI from primary school onwards and treating it as a foundational skill rather than an optional add‑on. This early‑stage integration is backed by large-scale workforce development and a national push to prepare students for an AI‑enabled economy.
The lessons we can take from these three countries are:
1. A clear national strategy: India and China are treating AI literacy as a strategic priority (and, to be fair to them, the UK Government is trying to do the same).
2. Early curriculum integration: AI is being introduced at younger ages, normalising its use - though note that, here in the UK, there is some resistance to this, not only from school leaders but also from many parents.
3. Aggressive investment: infrastructure, teacher training and Edtech ecosystems are being funded at scale.
The UK is not doing badly. In fact, the ESS ranking places the UK 6th in its top 10 countries for AI adoption in education. But it also makes clear why we are not at the top of the table: not only is under-investment in IT infrastructure an issue here, but we also have a lower tolerance for experimentation with AI tools than the US, China or India. Those countries are adopting tools quickly, even before solid governance frameworks are in place.
Speaking as a Governor myself, I understand the caution which exists here in the UK.
I also understand that many parents are not yet on board with widespread AI integration into the curriculum for children aged 2–11.
But I do think we should push ahead with AI training for teachers and improve the infrastructure that will be necessary for AI adoption in our schools in the months and years ahead, while we catch up on the governance issues. And, speaking personally, I share the view of the early adopters that the biggest risk here is doing nothing.
