Remote face-scanning in probation: innovation or injustice?
Last week, a report from the Public Accounts Committee (PAC) raised the alarm about the strain on the Probation Service in England and Wales and its ability to protect the public and reduce reoffending, with the service meeting just seven of 27 performance targets in 2024-25.
This report is just one symptom of the wider crisis gripping the Probation Service. Revolving Doors is a leading voice on the topic, from our 2022 Lived Experience Probation Inquiry to our work on recall.
A functional Probation Service that supports both its staff and those it supervises is vital to breaking cycles of crisis and crime. In our latest blog, Policy Manager and former probation officer Kelly takes a look at recent government plans to introduce face-scanning for those on probation, and the impact this might have on those in the revolving door:
The government recently announced a new pilot to introduce remote face-scanning technology for people under probation supervision. The idea is simple on the surface: individuals will be required to check in via their phones, record short videos that confirm their identity and respond to a few basic questions. Artificial intelligence will then analyse these clips, flagging any failed identity matches or “concerning” responses to probation officers in real time. The stated goal is to reduce reoffending and free up probation staff to focus on the highest-risk cases.
At first glance, the benefits are clear. This could reduce the need for frequent in-person meetings, saving time and resources for a probation service consistently judged inadequate and unable to support those it supervises. Indeed, far too many probation appointments consist of a simple ‘hello, how are things going?’ followed by the probation officer doing lots of paperwork while the person being supervised wonders what the point of the meeting was.
But should we accept that, because probation is not performing as it should, removing human contact is the answer?
From a purely practical perspective, 5% of people in the UK do not own a smartphone, and people on probation are likely to be over-represented in that group. Others struggle to access Wi-Fi, or simply lose phones amid the chaos of their lives, including unstable housing and mental health issues.
There are also profound questions about privacy and rights. Biometric data is among the most sensitive information we have. Who will store these videos, how long will they be kept, and for what purposes might they be used in future? Once a surveillance tool is introduced, the temptation to expand its scope will likely be strong.
It should also be recognised that, for many people on probation, the prospect of handing over any of their data to government agencies holds real fear, rooted in past experiences of authority. Could refusal to use this technology because of such concerns be treated as non-compliance, leading to breach or recall?
We also need to remember that facial recognition technology is notoriously imperfect, and errors can have serious consequences. Lighting, camera quality, or simply a change in appearance could confuse the system. Even a small error rate matters at scale: if just one in a hundred check-ins were wrongly rejected, thousands of check-ins a day would generate a steady stream of false alerts. These could lead to unnecessary interventions or even breaches, undermining trust between probation officers and the people they supervise. More troubling still, facial recognition algorithms have been shown to perform unequally across demographic groups, with higher error rates for people of colour. Without careful oversight, such a system could end up disproportionately punishing those already most disadvantaged.
International comparisons
Internationally, there are lessons to be learned. In parts of the United States, probation services have experimented with smartphone apps requiring photo or facial recognition check-ins. While these tools have offered convenience and flexibility, they have also generated complaints about false positives, digital exclusion, and over-reliance on private companies running the systems. In Australia, electronic monitoring is more common, but controversies around privacy, disproportionate impacts on Indigenous communities, and technical reliability have made clear that technology alone cannot solve deep social challenges.
Even here in the UK, police use of live facial recognition at public events has already sparked legal challenges, with courts and regulators questioning whether it complies with human rights standards.
Supervision and the role of relationships
Remote check-ins might be more convenient for some people on probation, particularly those with work or family commitments. But online check-ins also mean the loss of human contact.
This runs counter to clear and consistent evidence that shows the quality of the relationship between probation officers and those they supervise is one of the most important factors in supporting desistance and achieving positive outcomes. Good supervision should not just be about monitoring and enforcement. It requires genuine relationships that demonstrate care for the individual, their future, and their capacity to change. Effective probation officers actively listen, involve the individuals they are supervising in setting meaningful goals, and persistently encourage problem-solving and motivation, even when setbacks occur. They recognise that lapses or breaches are part of a longer desistance journey and respond thoughtfully rather than punitively.
There is some evidence that people apply social rules to computers and will self-disclose to them in ways similar to how they disclose to other people. Studies comparing chatbot-based mental-health support with human therapy or control conditions also report increased self-disclosure. The difference, however, is that people enter those interactions voluntarily – rather than being compelled by court order.
Probation officers fulfil a near-unique role: it takes real skill to encourage someone to be open and honest while holding the power to breach or recall them.
When it works, this relationship is a vital rehabilitative tool. For people with histories of trauma, disclosing personal worries and past experiences is difficult, and it is genuine human connection that makes it possible. This is something artificial intelligence cannot replicate: AI lacks emotional depth, cannot read non-verbal cues, and cannot capture the nuances of human emotion. Supervision carried out, in effect, by ‘robot’ could further instil a sense of being unworthy of human contact in people with histories of feeling processed by organisations – particularly those who are care experienced or have been repeatedly failed by services.
So, whilst remote face scanning in probation may offer opportunities to modernise the justice system, its success will depend on careful design and strong safeguards. Clear limits on data retention, independent scrutiny of algorithmic bias, and robust mechanisms for appeal are essential to protect fairness and accountability.
Ultimately, the true measure of this approach will not be how effectively it catches people out, but how well it supports them to move forward—reducing reoffending while upholding dignity, rights and public trust.
If we are serious about breaking the revolving door of crisis, crime and punishment, we must be honest about what works. Technology should support rehabilitation — not replace relationships, deepen exclusion or expand surveillance in the lives of people already under intense scrutiny.
Revolving Doors urges government, probation leaders and technology providers to pause and rethink this approach. Any use of remote face-scanning in probation must be co-designed with people with lived experience, be genuinely optional and include strong safeguards on data use, algorithmic bias and routes to challenge errors — without the risk of punishment for non-use.
Above all, reform must prioritise what the evidence is clear on: skilled probation staff, manageable caseloads, and trusting, trauma-informed relationships that support desistance and address unmet health and social needs.
Innovation should not automate mistrust. If probation reform is to reduce reoffending and uphold public confidence, it must treat people as human beings — not data points — and strengthen, not weaken, the relational foundations of effective supervision.
