After spending a year trying to empty our overcrowded prisons, the Labour government has now decided its best bet is to catch criminals before they strike. Police AI technology will use ‘predictive analytics’ to ‘identify and target the 1,000 most dangerous predatory men who pose the highest risk to women and girls in England and Wales’, reports the Telegraph. A Home Office white paper is set to announce a series of police reforms, including expanding the use of AI by forces across the country. Some £4 million has been earmarked to create an interactive, AI-driven map of crime hotspots across England and Wales by 2030, among other projects being trialled by police chiefs. Rather sinisterly, this follows Home Secretary Shabana Mahmood bragging last month to Tony Blair, now an AI evangelist, of her dreams ‘to achieve, by means of AI and technology, what Jeremy Bentham tried to do with his Panopticon. That is that the eyes of the state can be on you at all times.’ Labour certainly kept that one out of the manifesto.
Still, how transformative will Mahmood’s robocop dystopia really be? Such initiatives might sound slick and cutting-edge, but as Andrew Orlowski notes, AI-talk is very often employed as a cover for managerial stupidity. After all, it’s not as if Britain’s crime problem requires a supercomputer to comprehend – or to solve. Police might start with, say, not actively recruiting suspected child rapists in order to increase ‘diversity’. It also wouldn’t hurt if our prison service tried to avoid letting out convicted sex offenders early. Before busting out the gizmos, perhaps we could also consider not allowing hundreds of thousands of foreign young men from countries with backward views about women to break into our country, and not then housing them in market towns near schools. Indeed, the most famous use of police AI to date came when West Midlands Police let it hallucinate a phantom match report to justify banning Israeli football fans from sectarian Birmingham, a debacle over which Chief Constable Craig Guildford has now ignominiously resigned.
Whether couched in AI jargon or otherwise, the thrust of this plan is for the police to use data about where crimes are happening to inform their operational decisions. That is welcome and sensible. But such initiatives are nothing new, and traditionally the police have not been allowed to get away with them. The Blob hates transparency when it comes to ethnicity and crime. As long ago as the Brixton riots of 1981, the fact that the Metropolitan Police were devoting considerable attention to Lambeth’s Railton Road, ‘the capital’s capital of street robbery’, according to one historian, was held to be evidence of flagrant police racism and unfair treatment of the area’s black population. Since the Stephen Lawrence inquiry, forces have been endlessly warned not to engage in ‘stereotyping’, which, after all, is only the use of experience to make educated guesses.
So it was that over the course of the grooming gangs scandal, forces went out of their way not to acknowledge patterns of offending because of political sensibilities. Likewise, not long ago, a Metropolitan Police database of London gang crime had to be shut down because too many suspects on it were black. As for tech, in a report last year, Amnesty International complained that almost three-quarters of British police forces now predict crime by ‘racially profiling communities’ through automated systems. These are ‘built with discriminatory data and serve only to supercharge racism’, it argued, apparently putting police forces in ‘flagrant breach of the UK’s national and international human rights obligations’. Will this flip-flopping Labour government really hold its nerve on its ‘Minority Report’ plan after the activist class wakes up to its implications?
Even worse for the Home Secretary, there are specific government equalities commitments which will hamper the use of AI if it draws the wrong conclusions. As the ‘Inclusive Britain’ strategy introduced under the last government warned: ‘We do not yet fully understand how the use of AI will impact ethnic minorities, although we do know that bias can enter AI just as it can enter any process.’ To ‘enhance transparency and trust’, the EHRC would have to advise on necessary safeguards, including how the Public Sector Equality Duty applies when a public body uses AI. (Readers may be interested to know that the equalities minister presiding over all this was a certain anti-woke warrior by the name of Kemi Badenoch.) The latest EHRC guidance now warns that even data that might be a ‘proxy’ for protected characteristics is potentially suspect.
The reality is that there has always been data about crime at the police’s fingertips – it’s what they deal with every day, after all – but there is now a great firewall of political taboo against even mentioning many of the brute facts, let alone using them operationally. We saw a very public example of this last year, when, following a seven-month BBC sting operation, a Met officer was fired for gross misconduct merely for privately acknowledging his own experiences in dealing with ethnic-minority criminals. Whether it’s AI or just the evidence of one’s lying eyes, as long as forces are hemmed in by equalities impact assessments, ‘community policing’ and woke sensibilities, any attempt to pursue crime wherever the data leads will be a non-starter.












