A bad workman always blames their tools?
The story about the West Midlands Police Chief Constable admitting that the reason for banning Maccabi Tel Aviv fans was an ‘AI error’ is troubling.
‘Measure twice, cut once’ has stood the test of time for many a carpenter.
Time for a new idiom regarding the use of AI: Read, verify, execute.
Large Language Models (LLMs) are predictive models built on pattern recognition in language.
An LLM doesn’t read left to right or right to left like humans. It processes tokens sequentially using attention mechanisms and predicts the next most likely token based on patterns in its training data.
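To make that concrete, here is a deliberately tiny sketch of the idea, not a real LLM: a bigram model that "predicts" the next token purely from frequency patterns in its training text. The toy corpus is invented for illustration; real models use attention over billions of parameters, but the core move is the same: output whatever the patterns make most likely, with no notion of truth.

```python
from collections import Counter, defaultdict

# Toy "training data" (invented for illustration).
corpus = "the fans were banned the fans were loud the match was banned".split()

# Count which token follows each token in the training text.
follows = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    follows[current][nxt] += 1

def predict_next(token):
    """Return the most frequently seen continuation of `token`."""
    counts = follows[token]
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("fans"))  # "were" -- the most common continuation
print(predict_next("was"))   # "banned"
```

Note what this model cannot do: it has no idea whether "banned" is accurate, fair or lawful. It only knows that the pattern occurred. Scale that up and you have a very capable pattern-matcher that still needs a trained human to read and verify its output.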
So who is at fault?
Microsoft Copilot? No. That’d be like blaming the hammer for hurting your thumb.
The individual who prompted Microsoft Copilot? Hmmm, no.
The organisation? Normally – in business: yes. But in this instance, whilst the West Midlands Police Chief Constable is the ‘CEO’ – they’re only really culpable for trusting their team to provide the correct intelligence. We have to go a stage further up.
Those responsible for funding the organisation? Partly. The UK Government funds policing, but operational decisions on technology adoption and training usually sit with individual forces and the College of Policing.
Ultimately, the buck stops with leadership and policy strategy regarding AI adoption and usage – if one even exists.
Do West Midlands Police have Microsoft 365 and Copilot tenancies for their officers, within a safeguarded and secure enterprise model? Unknown – there’s no public evidence either way.
Have West Midlands Police trained their force how to use AI? Again, unclear – no official statement confirms this.
Initial police training lasts 12–15 weeks, after which officers enter a two-year probationary period. The entire journey to becoming a police officer can take around 12 months.
After this, officers receive ongoing and specialist training: Personal Safety Training, Firearms and Specialist Weapons Training, Driving Courses, Cybercrime and Digital Forensics – you name it – there’s a training course.
Police using AI without training is like putting a chef’s uniform on and turning up to McDonald’s. Or sticking a kid’s police hat on and walking the beat.
Unleashing AI without training results in police officers trying to gather intelligence, badly. This is dangerous for a host of reasons.
Training and upskilling people to use AI is how you a) avoid embarrassment and b) help your workforce navigate the rapid pace of change facing all of us.
Without it, you will be replaced by a pattern recognition agent of the LLM variety.
West Midlands Police – drop me a DM. We can help train your force to use AI correctly.
---
I’m a curiously optimistic cynic regarding AI. We can’t explore humanity from behind a screen. But when we do go behind one, understanding how to work with the technology helps.


