By embracing biometric surveillance, governments across the West are hurtling down a path that could lead us all to a very dark place.
The Chinese Communist Party’s widespread use of facial recognition programs and other forms of artificial intelligence (AI) technologies to track, monitor and, where possible, evaluate the behavior of members of the public has received ample attention from Western media in the past few years. Myriad reports, some overblown, have been published on China’s creeping introduction of a Social Credit System. By contrast, far less attention has been paid to the increasing use of many of the same highly intrusive technologies by so-called “liberal democracies” in the West.
In 2019, IHS Markit released a report showing that while China may be leading the way when it comes to using surveillance cameras to monitor its population, it is far from an outlier. According to the report, the United States had almost as many surveillance cameras as China, with roughly one for every 4.6 people (compared with China’s one for every 4.1). The UK was not far behind with one for every 6.5. The report also forecast that by 2021 there would be more than one billion surveillance cameras on the planet watching us.
EU Plays Catch Up
Now the EU is on the verge of building one of the largest facial recognition systems on planet Earth, as the recent Wired UK article “Europe Is Building a Huge, International Facial Recognition System” reports:
The expansion of facial recognition across Europe is included in wider plans to “modernize” policing across the continent, and it comes under the Prüm II data-sharing proposals. The details were first announced in December, but criticism from European data regulators has gotten louder in recent weeks, as the full impact of the plans has been understood.
“What you are creating is the most extensive biometric surveillance infrastructure that I think we will ever have seen in the world,” says Ella Jakubowska, a policy adviser at the civil rights NGO European Digital Rights (EDRi). Documents obtained by EDRi under freedom of information laws and shared with WIRED reveal how nations pushed for facial recognition to be included in the international policing agreement.
The first iteration of Prüm was signed by seven European countries—Belgium, Germany, Spain, France, Luxembourg, the Netherlands, and Austria—back in 2005 and allows nations to share data to tackle international crime. Since Prüm was introduced, take-up by Europe’s 27 countries has been mixed.
Prüm II plans to significantly expand the amount of information that can be shared, potentially including photos and information from driving licenses. The proposals from the European Commission also say police will have greater “automated” access to information that’s shared. Lawmakers say this means police across Europe will be able to cooperate closely, and the European law enforcement agency Europol will have a “stronger role.”
Targeting Children
The UK, while no longer part of the EU, is hurtling down a similar path. And it is targeting the most vulnerable and impressionable members of society: school children. As I reported in October, nine schools in the Scottish region of North Ayrshire started using facial recognition systems as a form of contactless payment in cashless canteens (cafeterias in the US), until a public outcry put paid to the pilot scheme.
But the Tory government is doubling down. According to a new report in the Daily Mail, almost 70 schools have signed up for a system that scans children’s faces to take contactless payments for canteen lunches while others are reportedly planning to use the controversial technology to monitor children in exam rooms. This time round, however, the government didn’t even bother informing the UK Biometrics and Surveillance Camera Commissioner Fraser Sampson of the plans being drafted by the Department for Education (DfE):
Professor Fraser Sampson, the independent Biometrics and Surveillance Camera Commissioner, said his office was unaware the DfE was drafting new surveillance advice in response to the growing trend for cameras in schools.
‘I find out completely by accident a couple of weeks ago by going to a meeting that the Department for Education has drafted a code of practice for surveillance in schools which they are about to put out to the world to consult,’ he told The Mail on Sunday.
‘And they [DfE] said, “What do you think of it?” And I say, “What code?” We had no idea about it. And having seen it, it would have benefited from some earlier sharing.’
Similar facial recognition systems have been used in the US, though usually as a security measure.
Privacy advocates have warned that the growing use of facial recognition systems in school settings could have a much more insidious goal: to condition children to the widespread use of facial recognition systems and other biometric technologies. In my book Scanned I cite Stephanie Hare, author of Technology Ethics, who argues that it is about normalizing children to understand their bodies “as something they use to transact. That’s how you condition an entire society to use facial recognition.”
The Battle for Privacy
As the EU prepares to pass its own long-awaited AI Bill, which will set out the rules for the development, commodification and use of AI-driven products, services and systems across the 27-member bloc, a battle has broken out between proponents of the technologies and privacy advocates. The former, which include intelligence agencies, law enforcement bodies and tech companies, argue that emerging biometric technologies such as facial recognition are necessary to catch criminals. The latter have called for an outright ban on the technologies due to the threat they pose to civil liberties.
They include Wojtek Wiewiorowski, who leads the EU’s in-house data protection agency, the EDPS, which is supposed to ensure the EU is complying with its own strict privacy rules. In November 2021 Wiewiorowski told Politico that European society is not ready for facial recognition technology: the use of the technology, he said, would “turn society, turn our citizens, turn the places we live, into places where we are permanently recognizable … I’m not sure if we are really as a society ready for that.”
In a paper published in March, the EDPS raised a number of concerns and objections about Prüm II:
- The Prüm II framework does not make clear what sort of circumstances will warrant the exchange of biometric data such as DNA or the “scope of subjects affected by the automatic exchange of data”.
- “The automated searching of DNA profiles and facial images should only be possible in the context of individual investigations of serious crimes and not of any criminal offense, as provided for in the proposal.”
- “The alignment of the Prüm framework with the interoperability framework of the EU information systems in the area of justice and home affairs” requires careful analysis of its implications for fundamental rights.
- “The EDPS considers that the necessity of the proposed automated searching and exchange of police records data is not sufficiently demonstrated.”
Flaws in the System
The problem is not just about privacy; it is about the inherent flaws within facial recognition systems. The systems are notoriously inaccurate on women and those with darker skin, and may also be inaccurate on children whose facial features change rapidly…
Continue reading on Naked Capitalism
Author: Nick Corbishley