‘Big Brother’ refers to “a government, ruler, or person in authority that has complete power and tries to control people’s behaviour and thoughts and limit their freedom.” Now, it seems the London Metropolitan Police Service is ready to embrace what it means to be ‘Big Brother’ by using Live Facial Recognition (LFR) cameras to scan the general public for potential criminals, despite warnings from advocacy groups.
Despite the Metropolitan Police Service insisting that the technology will be used with good intentions, LFR has received its fair share of criticism over the last few years. In April 2019, the University of Essex tested the accuracy of the LFR system and found an inaccuracy rate of a whopping 81%. Worse yet, in 2018 the same technology, used by the South Wales Police force, mistakenly identified over 2,000 innocent people as potential criminals.
— Metropolitan Police (@metpoliceuk) January 24, 2020
The LFR technology was produced by the Japanese company NEC, and according to a tweet by the Met, “if someone passes through LFR and there is no match, the biometric data is automatically and immediately deleted.” The tweet also notes that LFR is an entirely closed system and does not interact with CCTV, speed cameras, or body cameras.
Despite the police force’s assurance that any images which don’t trigger a match will be immediately deleted, advocacy groups have warned against the system.
“This decision represents an enormous expansion of the surveillance state and a serious threat to civil liberties in the UK,” Big Brother Watch director Silkie Carlo said in a statement.