One in three councils using algorithms to make welfare decisions

One in three councils are using computer algorithms to help make decisions about benefit claims and other welfare issues, despite evidence emerging that some of the systems are unreliable.
Companies including the US credit-rating businesses Experian and TransUnion, as well as the outsourcing specialist Capita and Palantir, a data-mining firm co-founded by the Trump-supporting billionaire Peter Thiel, are selling machine-learning packages to local authorities that are under pressure to save money.

A Guardian investigation has established that 140 of the 408 councils – more than double previous estimates – have invested in the software contracts, which can run into millions of pounds.

The systems are being deployed to provide automated guidance on benefit claims, prevent child abuse and allocate school places. But concerns have been raised about privacy and data security, the ability of council officials to understand how some of the systems work, and the difficulty for citizens in challenging automated decisions.

It has emerged North Tyneside council has dropped TransUnion, whose system it used to check housing and council tax benefit claims. Welfare payments to an unknown number of people were wrongly delayed when the computer’s “predictive analytics” erroneously identified low-risk claims as high risk.

Meanwhile, Hackney council in east London has dropped Xantura, another company, from a project to predict child abuse and intervene before it happens, saying it did not deliver the expected benefits. And Sunderland city council has not renewed a £4.5m data analytics contract for an “intelligence hub” provided by Palantir.

A spokesperson for the Local Government Association, which represents councils, said: “Good use of data can be hugely beneficial in helping councils make services more targeted and effective. But it is important to note that data is only ever used to inform decisions and not make decisions for councils.”

But Silkie Carlo, the director of the campaign group Big Brother Watch, said the increasing use of algorithms was leaving vulnerable people at the whim of “automated decisions … they have no idea about and can’t challenge”.

Gwilym Morris, a management consultant who works with IT providers to the public sector, said the complexity of the systems meant the leadership of local authorities “don’t really understand what is going on” and this raised questions about how citizens’ data was used.

North Tyneside stopped using TransUnion’s system last month. The software automatically processed data about housing and council tax benefit claimants to estimate the likelihood that a claim was fraudulent – a process known as “risk-based verification”. But most of the cases the software deemed high risk were in fact lower risk, and benefit claims were wrongly delayed.
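The article does not describe how TransUnion’s scoring actually works. In general, risk-based verification systems of this kind combine weighted indicators into a score for each claim and map that score to a verification tier that determines how much checking the claim receives before payment. The sketch below is purely illustrative: the indicator names, weights and thresholds are invented for the example and bear no relation to TransUnion’s or any council’s model.

```python
# Illustrative sketch of risk-based verification (RBV) for benefit claims.
# All indicators, weights and thresholds are invented for illustration;
# they do not reflect TransUnion's or any council's actual system.

def risk_score(claim: dict) -> float:
    """Combine weighted indicators into a single fraud-risk score."""
    weights = {
        "undeclared_income_flag": 0.5,   # hypothetical indicator
        "address_mismatch": 0.3,         # hypothetical indicator
        "prior_overpayment": 0.2,        # hypothetical indicator
    }
    return sum(weights[k] for k in weights if claim.get(k))

def verification_tier(claim: dict) -> str:
    """Map the score to a tier that decides how much checking a claim gets."""
    score = risk_score(claim)
    if score >= 0.6:
        return "high"    # extra documentary checks; payment may be delayed
    if score >= 0.3:
        return "medium"
    return "low"         # processed with standard checks only

if __name__ == "__main__":
    claim = {"address_mismatch": True}
    print(verification_tier(claim))  # -> "medium"
```

The complaint in the council report fits this pattern: a claim can land in the “high” tier, and so be held up for extra checks, without the authority being able to see which indicators pushed the score over the threshold.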

A council report concluded: “TransUnion provides no reason for a case meeting a high-risk category and it was found that in most cases, the reason for it being high risk could not be established. There was no reason for the payment to be withheld, but claims had been delayed.”

A spokesperson for TransUnion said the classification of risk groups was “ultimately a matter for the local authorities to decide”.

They added: “Each local authority also determines what extra checks are required for claimants falling into any particular category, and can monitor for accuracy so that they can adapt their criteria if necessary.

“The time spent on reviewing ‘high-risk’ claims will equally depend on each local authority’s own policy in terms of processing the additional checks.”

TransUnion said it checked benefit claims for fraud for about 70 local authorities in the UK, and Xantura serves the same number of councils. The combined figure does not include other examples of algorithms found by the Data Justice Lab at Cardiff University.

Sunderland council awarded a contract for a new “intelligence hub” to Palantir in 2014 as part of a plan to make efficiency savings. Last year, it was announced that the authority faced a budget gap of about £50m over the next three years.

The hub was used to analyse data to help with the Troubled Families programme, to bring together information on those at risk of sexual exploitation, and to help identify areas at risk of flooding. The council said it had not carried out a review of the project and did not know how much had been saved.

A council spokesperson said it had always been the authority’s intention not to renew the contract, and that Palantir had worked alongside staff to transfer knowledge so the council would become “self-sufficient”.

Hackney council said “issues of variable data quality meant that the system wasn’t able to provide sufficiently useful insights”.

The Xantura predictive model analyses warning signs about a household, such as a child being expelled from school or a report of domestic violence. The model’s prediction is then passed to a social worker for potential action.
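The article describes the model only at this high level: recorded warning signs feed a prediction, and a human social worker decides whether to act on the flag. The toy sketch below, with invented warning signs and an invented threshold, shows that flag-then-human-review pattern; it is not Xantura’s actual model.

```python
# Illustrative flag-then-review pattern for a household-risk model.
# Warning signs and the threshold are invented; this is not Xantura's model.

WARNING_SIGNS = {"school_exclusion", "domestic_violence_report", "housing_arrears"}

def household_risk(events: set[str]) -> float:
    """Toy model: fraction of tracked warning signs present for a household."""
    return len(events & WARNING_SIGNS) / len(WARNING_SIGNS)

def refer_to_social_worker(events: set[str], threshold: float = 0.5) -> bool:
    """The model only flags a case; a social worker decides whether to act."""
    return household_risk(events) >= threshold

print(refer_to_social_worker({"school_exclusion", "domestic_violence_report"}))  # True
```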

Wajid Shafiq, the chief executive of Xantura, said: “We’re improving the accuracy of our analytics and models continuously but we have never been unable to develop a reliable predictive model.

“There are a number in place right now, adding real value. Not being able to access regular updates of source data to drive the model is a bigger issue – if we don’t get regular feeds, we can’t provide an up-to-date picture of risk factors.”

Simon Burall, a senior associate with the public participation charity Involve, said: “There are never just benefits from these things but risks and harms, namely privacy and data security.

“But also potential wider unintended consequences, including the stigmatisation of communities and unwanted intrusion by particular services. Any benefits must be balanced against those potential risks.”

David Spiegelhalter, a former president of the Royal Statistical Society, said: “There is too much hype and mystery surrounding machine learning and algorithms. I feel that councils should demand trustworthy and transparent explanations of how any system works, why it comes to specific conclusions about individuals, whether it is fair, and whether it will actually help in practice.”