The UK Information Commissioner, Elizabeth Denham, has expressed concerns about the use of automatic facial recognition technology (FRT) by law enforcement authorities. While noting that 'there may be significant public safety benefits' from police use of FRT, Denham outlined a series of concerns stemming from such use, ranging from privacy considerations and a lack of transparency in how the technology is deployed, to issues of effectiveness and accuracy, and risks of bias and false positive matches. The commissioner is also worried about the absence of 'a comprehensive governance framework to oversee FRT deployment', and stated that authorities 'must have clear evidence to demonstrate that the use of FRT in public spaces is effective in resolving the problem that it aims to address'. Denham's statements come in the context of a report released by the Big Brother Watch organisation – 'Face Off: The lawless growth of facial recognition in UK policing' – which argued, among other things, that the use of FRT has led to a 'staggering' number of innocent people being inaccurately flagged as suspects. The report called on UK authorities to stop using automated facial recognition software with surveillance cameras, expressing concerns over the impact of such software on 'individuals' rights to private life and freedom of expression, and the risk of discriminatory impact'. In response, police authorities argued that their use of FRT is accompanied by safeguards to prevent action being taken against innocent people.