The biometrics industry says we are entering a global age of facial verification. Others warn the risk of racial profiling remains.
Christel Macdonald doesn’t forget a face. Her work examining up to 50 identities a week in a University of NSW lab has placed her abilities alongside the best of them.
“It is pretty handy,” the 25-year-old research assistant tells SBS News. “Though sometimes it’s annoying … if I recognise someone but I don’t know where from it can drive me crazy.”
It took Ms Macdonald a weekend to work out how she knew someone she recognised in a crowded bathroom at a Sia concert, for instance.
When Chinese authorities recently used technology to do something similar - scanning the faces of tens of thousands of people attending concerts in eastern China - it took them minutes to nab not just one wanted fugitive, but three.
This is the brave new world of facial recognition software. According to research presented to a conference in Sydney on Wednesday, the best algorithms are now on par with the best humans (like Ms Macdonald) when it comes to matching two different images of the same person.
But as organisations driving the multibillion-dollar industry gather to “share ideas and discuss the opportunities biometrics offer” - including plans for a nationwide facial-matching system in Australia - privacy groups and civil liberty advocates warn the technology’s limitations must not be overlooked.
“The reality is if you are from a particularly small ethnic group just your membership of the group increases the likelihood that you will be falsely identified,” the Australian Privacy Foundation's David Vaile said.
Facial recognition software is based on algorithms that scan faces to pick out individual identifiers - such as the distance between a person’s eyes, the shape of the nose and other features - and combine them into a unique biometric map.
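In software terms, that "biometric map" is typically a list of numbers - a feature vector - and matching two photos reduces to measuring how far apart their vectors are and comparing that distance with a threshold. The sketch below illustrates the idea only; the vectors and the 0.6 threshold are made-up values, not taken from any real system, which would use hundreds of dimensions learned from training data.

```python
import math

def euclidean_distance(a, b):
    """Distance between two face feature vectors (biometric maps)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def is_match(probe, candidate, threshold=0.6):
    """Declare a match when the two vectors are closer than the threshold.

    The threshold trades errors off against each other: loosen it and
    more innocent people are wrongly flagged; tighten it and more
    genuine matches are missed.
    """
    return euclidean_distance(probe, candidate) < threshold

# Illustrative values only.
enrolled = [0.12, 0.80, 0.33, 0.05]      # image on file
probe_same = [0.15, 0.78, 0.30, 0.07]    # new photo of the same person
probe_other = [0.90, 0.10, 0.70, 0.60]   # photo of someone else

print(is_match(probe_same, enrolled))    # close vectors -> True
print(is_match(probe_other, enrolled))   # distant vectors -> False
```

How well this works in practice depends entirely on how the vectors are produced - which is where the training data, and its biases, come in.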
“Biometrics are really exciting technology and it has come a long way,” Biometrics Institute chief executive Isabelle Moeller told SBS News. “But there are still some challenges around it.”
An MIT study published earlier this year found that while three of the biggest commercially available gender classification systems (based on facial analysis) were up to 99 per cent accurate for white men, error rates climbed to as much as 35 per cent for darker-skinned women.
They’re the kind of odds that concern Maki Issa, one of six African men involved in a landmark racial profiling case settled out of court by Victoria Police in 2013.
Mr Issa said young African men already felt targeted by law authorities, and more false positives would only erode relations with the community further.
“Some young African guy commits a crime and pretty much every African person they’ve come across in the last year or so is a suspect now until they find that person,” he said. “Some people get taken into a police station and get told, ‘mistake, you’re somebody else’.”
Mr Issa cited a UK police facial recognition trial last year in which 98 per cent of the matches flagged by the system turned out to be wrong.
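Headline figures like that reflect a base-rate problem: when a system scans large crowds in which wanted individuals are vanishingly rare, even a highly accurate algorithm produces mostly false matches. A rough illustration, using assumed rates rather than figures from the UK trial:

```python
def false_match_share(crowd_size, wanted, true_positive_rate, false_positive_rate):
    """Fraction of all alerts that point at the wrong person."""
    innocent = crowd_size - wanted
    true_alerts = wanted * true_positive_rate        # wanted people correctly flagged
    false_alerts = innocent * false_positive_rate    # innocent people wrongly flagged
    return false_alerts / (true_alerts + false_alerts)

# 100,000 people scanned, 10 of them genuinely wanted, with a system that
# catches 99% of wanted faces and wrongly flags only 1 in 1,000 innocent ones.
share = false_match_share(100_000, 10, 0.99, 0.001)
print(f"{share:.0%} of alerts are false matches")  # -> 91% of alerts are false matches
```

Even with those generous assumptions, roughly nine in ten alerts are wrong - and for groups the algorithm handles less accurately, the false-positive rate, and with it the share of wrong alerts, rises further.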
Patrick Grother from the US National Institute of Standards and Technology told SBS News it was widely accepted within the industry that the demographic data on which a system is trained affects how it performs. That required users to be aware of - and to counter - the idiosyncrasies of their algorithm.
“If an algorithm was built in Russia, it would be implicitly less exposed to dark-skinned individuals than an algorithm that was developed in the United States, and those data sets’ selection biases, or constraints, could lead to differentials in recognition performance,” Mr Grother said.
David White, a cognitive psychologist at the University of NSW, said it was also important that the weaknesses of a system were not compounded by the biases of the person using it. “If you don’t pay attention to that there is a dangerous level of false positives,” he said.
Error rates can also be high when the imagery is poor, he added. “Most CCTV imagery, for example, I would class as substandard.”
“I think that’s a challenging task and it’s not just for the Australian government but a challenging task in general. In building these systems so they are as accurate as possible and building these systems so they don’t have biases against certain segments of society.”
Coming to a town near you
The NSW government this week became the latest to back the technology, announcing that $12 million would be invested in facial matching services as part of a proposed national facial recognition system.
Based on an in-principle agreement struck last year, all levels of government would have real-time access to identity data like passport, visa and driver’s licence images in a bid to thwart crimes from terrorism to identity theft. Some private companies, such as banks, could also be granted indirect access to the data.
The Victorian government has since balked at the “expanded” scope of the legislation introduced into the federal parliament, but Queensland has opted not to wait. The state rushed through its own laws to ensure the technology was scanning Gold Coast crowds during the recent Commonwealth Games.
“Everywhere you look there is the same sort of interest in facial recognition technology,” said Terry Hartmann from Germany-based facial recognition company Cognitec, which has been fielding interest from cities in India to casinos in Macau.
Beyond law enforcement, the growing applications include use in retail to gauge marketing demographics of customers - including age, gender and ethnicity - and by stadiums to single out known hooligans.
“It’s all scalable, we have designed systems with a thousand cameras on one system for example,” said Mr Hartmann.
“Being able to make some judgement on what people’s emotion is when looking at something,” is the new frontier, he said, “particularly from a marketing perspective”.
Mr Hartmann said there was little to fear from the technology, noting that his company’s software was developed in Europe, where it had access to a wide range of ethnicities and skin tones.
“We are not creating a police state here but we are building enhanced security and safety based on facilitation,” he said.
The proposed Australian laws are currently being scrutinised by the joint parliamentary committee on intelligence and security, which has heard concerns from groups like the Australian Privacy Foundation.
Mr Vaile, the foundation’s chairman, said whole populations would potentially face being scanned as possible suspects for even trivial matters - with additional implications for minorities not well-represented in an algorithm’s training data.
“That small sample can lead to biases, [a] greater likelihood that you, your family and relatives will be falsely suspected,” he said.
A spokesperson from the Department of Home Affairs, which will oversee the proposed new system, told SBS News the department uses multiple algorithms and continually monitors the accuracy of its biometric technology.
It had separately told the committee the technology would not specifically collect racial or ethnic data, though “in some cases, this information may be inferred from other information such as a person’s name or facial image”.
“We have in place arrangements to continually tune and train the facial matching algorithms to its image datasets, which reflect the diversity of the Australian population and the Department’s client base,” the spokesperson said.
Mr Issa said he would rather the investment be spent on addressing issues with on-the-ground policing, adding that if he didn’t already have a passport or driver’s licence, he might have chosen not to get them.
“I don’t think it’s even a matter of if, it’s a matter of when I’ll be mistaken for somebody else.”