Bad People
Can recent advances in technology solve some of our social problems? Wendy M. Grossman examines the question, from airport facial recognition software to the so-called next-generation "Old Bill" policeman.
Image: Birmingham Central Police Station by West Midlands Police via Flickr (CC BY-SA 2.0)
The MAGICAL airport of 2020 doesn't exist, and already I feel trapped in it like the nettlesome fly in the printer in the 1985 movie "Brazil". The vision, presented by Joe Flynn, the director of border and identity management services for Accenture Ireland, at this year's Biometrics Conference, probably won't sound bad to today's frustrated travelers. He explained MAGICAL - mobility, analytics, gamification, intelligence, collaboration, automation, low-touch - but we know the acronym wagged this dog.
MAGICAL goes like this. The data you enter into visa and passport systems, immigration data, flight manifests, advance passenger information, all are merged and matched by various analytics. At the airport, security and immigration are a single "automated space": you move through the scanner into a low-touch, constantly surveilling environment ("massive retail space") that knows who you are and what you're carrying and ensures no one is present who shouldn't be. Your boarding pass may be a biometric.
Later, when you step off the plane, you are assessed against expected arrivals and risk profiles. As you sprint down the people movers, elbowing slowpokes out of your way (doesn't everyone do this?), accelerated face-on-the-move recognition systems identify you. You cross an indicator line so you know you've entered another country, but as a known traveler you flow through seamlessly. Only 5 to 10 percent of travelers - the unwashed unknowns - are stopped to go through gates or be checked by an officer with a mobile device. Intervention is the exception, although you will still have to choose a customs channel.
My question was: what happens when it goes wrong?
"We don't replace the border guards and people," Flynn said reassuringly. Rasa Karbaukaite, a research officer from Frontex, the Polish-based organization that coordinates and develops integrated European border management, noted that "automated" is not "automatic": there will be human supervision at all times.
But I was worrying about the back end. What happens when some database makes a mistake and you get labeled bad news? In "Brazil" that meant the goon squad invaded your home and carted you off. In a modern-day airport, well…what?
This concern re-emerged when Simon Gordon and Brian Lovell, respectively the founder and research leader of Facewatch, outlined "cloud-based crime reporting", a marriage of social networking with advanced facial recognition systems to help businesses eliminate low-level crime. Say a handbag is stolen in a pub. The staff can upload the relevant still and moving CCTV images in a few minutes, along with a simple witness statement. Police can review the footage, immediately send back the reference number needed to claim on insurance, and perhaps identify the suspect from previous crimes.
Speeding up crime reporting and improving detection aren't contentious, nor are, in and of themselves, the technical advances that can perform facial recognition on the grainy, blurred footage from old CCTV cameras. The many proprietary systems behind CCTV cameras pose an expensive challenge to police; Facewatch overcomes this by scraping the screens so that all uploaded images are delivered in a single readable format.
But then Gordon said: "[The system] overcomes privacy issues by sharing within corporate and local groups." It's not illegal for Sainsbury's to share information across all its branches that would effectively blacklist someone. Shopwatch and Pubwatch groups can do the same - and already are. Do we want petty criminals to be systematically banned? For how long? What happens when the inevitable abuse of the system creeps in and small businesses start banning people who don't do anything illegal but annoy other customers or just aren't lucrative enough? Where does due process fit in?
A presentation from Mark Crego, global lead for border and identity management at Accenture Ireland, imagined "Biometrics Bill" - the next-generation "Old Bill" policeman. Here, passively collected multi-modal biometrics and ubiquitous wireless links allow an annoyingly stereotyped old lady who's been mugged to pick her attacker out of a line-up assembled at speed on an iPad from her description and local video feeds (ignoring the many problems with eyewitness testimony), and to submit a witness statement instantly. On-the-fly facial recognition allows the perpetrator to be spotted on a bus and picked up, shown the watertight case against him on screen, and jailed within a couple of hours. Case file integration alerts staff to his drug addiction problems and he gets help to go straight. Call me cynical, but it's all too perfect. Technology does not automatically solve social problems.
The systems may be fantasy, but the technology is not. As Joseph Atick, director of the International Biometrics and Identity Association, said, "The challenge is shifting from scalability and algorithm accuracy to responsible management of identity data." Like other systems in this era of "big data", identity systems are beginning to draw on external sources of data to flesh out individual profiles. We must think about how we want these technologies deployed.
"We don't want to let the bad people into our country," said Karbauskaite in explaining Frontex's work on creating automated border controls. Well, fair enough, it's your country, your shop, your neighborhood. But where are we going to put them? They have to go somewhere - and unfortunately we've run out of empty continents.
Wendy M. Grossman's website has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. Stories about the border wars between cyberspace and real life are posted occasionally during the week at the net.wars Pinboard, or follow on Twitter.