Racial profiling concerns over facial recognition technology

Transforming Australia's visa system could include collecting biometrics from applicants, such as fingerprints and eye scans. Source: Pixabay

Doubts are being cast over a new national biometric database that will use facial recognition technology to identify individuals.

The Federal Government is planning to roll out a national database of people's faces, using images collected from passports and driver's licences.

The ability of computers to track and recognise faces is at the forefront of new biometric technology.

But while the Government pushes ahead into the new world, others fear biases in some algorithms could lead to increased racial profiling.

Maki Issah knows what it is like to be picked out of a crowd.

He was one of six young African-Australians who settled a landmark case against Victoria Police over racial profiling in 2013.

“We have that problem right now. We're talking about the human, the police. Some young African guy commits a crime, and then pretty much every African person they’ve come across in the last year or so is a suspect now until they find that person,” says Mr Issah.

But with the golden age of computerised facial recognition underway, there are concerns young men like Mr Issah will increasingly end up getting unwanted attention from the authorities.

Facial-recognition software works by scanning a face and measuring features such as the distances between the eyes, nose and mouth to create a biometric template. That template is then compared against a database of images in search of a match.
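
In outline, that matching step amounts to a nearest-neighbour search over stored templates. The sketch below is a minimal Python illustration, assuming faces have already been reduced to fixed-length feature vectors; the names `enroll` and `identify`, the 128-dimension templates and the 0.6 similarity threshold are illustrative assumptions, not any vendor's actual system.

```python
import numpy as np

database = {}  # person_id -> stored biometric template

def enroll(person_id, template):
    """Store a person's template, normalised to unit length."""
    database[person_id] = template / np.linalg.norm(template)

def identify(probe, threshold=0.6):
    """Compare a probe template against every enrolled template.

    Returns the best-matching identity if its cosine similarity
    exceeds the threshold, otherwise None (no match found).
    """
    probe = probe / np.linalg.norm(probe)
    best_id, best_score = None, -1.0
    for person_id, template in database.items():
        score = float(np.dot(probe, template))  # cosine similarity
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id if best_score >= threshold else None

# Example: enrol two synthetic 128-dimensional templates, then query
# a slightly noisy version of the first one.
rng = np.random.default_rng(0)
a, b = rng.normal(size=128), rng.normal(size=128)
enroll("person_a", a)
enroll("person_b", b)
print(identify(a + 0.05 * rng.normal(size=128)))  # likely "person_a"
```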

The facial-recognition company Cognitech's vice president for the Asia-Pacific, Terry Hartmann, says there is huge interest in the technology across the region and beyond. "There's Singapore, Hong Kong, the casinos in Macau and the Philippines. There are stadiums (in) Australia. New Zealand is doing technology. There's law enforcement in all those regions. There's been work with (South) Korea around the Olympics. So, everywhere you look, there's different aspects of facial-recognition technology being applied today. And the same applies for the US and Europe,” Mr Hartmann says.

The Federal Government's proposal is for a national database accessible not only to police but also to third parties such as banks.

Australian Privacy Foundation chairman David Vaile says letting third parties access the data is a huge risk. “Allowing private-sector access to this sort of stuff is potentially quite scary, quite dangerous, and, even more to the point, you lose control of it when you do that,” says Mr Vaile.

Studies from the United States have also shown that some algorithms carry a racial bias.

A study published in February by MIT, the Massachusetts Institute of Technology, found that three of the biggest commercially available gender-classification systems were up to 99 per cent accurate for white men.

But it found that error rates rose to as much as 35 per cent for darker-skinned women.

The bias is partly a data problem: when a demographic group makes up only a small share of the training images, the algorithm has less data from which to learn that group's faces.
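
A toy experiment makes the mechanism concrete. The sketch below illustrates class imbalance in general, not the MIT study's method: one classifier is trained on 5,000 examples from a well-represented group and only 100 from an under-represented one. The data, group sizes and resulting accuracy gap are all contrived for demonstration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

def make_group(n, offset):
    """n samples of a two-class problem whose boundary sits at x0 = offset."""
    X = rng.normal(size=(n, 5))
    X[:, 0] += offset                       # shift the informative feature
    y = (X[:, 0] + 0.5 * rng.normal(size=n) > offset).astype(int)
    return X, y

X_a, y_a = make_group(5000, 0.0)            # well-represented group A
X_b, y_b = make_group(100, 2.0)             # under-represented group B

model = LogisticRegression().fit(np.vstack([X_a, X_b]),
                                 np.concatenate([y_a, y_b]))

# The model fits group A's decision boundary; group B's accuracy suffers.
Xa_t, ya_t = make_group(1000, 0.0)
Xb_t, yb_t = make_group(1000, 2.0)
print("accuracy on group A:", model.score(Xa_t, ya_t))
print("accuracy on group B:", model.score(Xb_t, yb_t))  # typically much lower
```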

Mr Vaile says that computerised bias can lead to greater problems, particularly in the area of law enforcement. “If you're from perhaps a particularly small ethnic group, just your membership in that group increases the likelihood that you'll be falsely identified,” he adds.
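
Back-of-the-envelope arithmetic shows why this matters at scale. In the hypothetical figures below (assumed for illustration, not taken from the article), even a tiny per-comparison false-match rate multiplied across a database of millions flags many innocent people per search, and a group whose rate is worse shoulders proportionally more of those false hits.

```python
# All numbers here are assumptions for illustration only.
database_size = 10_000_000        # hypothetical national photo database

rate_majority = 1e-6              # assumed false-match rate, majority group
rate_minority = 1e-5              # assumed ten times worse for a small group

print("false hits per search, majority:", rate_majority * database_size)  # about 10
print("false hits per search, minority:", rate_minority * database_size)  # about 100
```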

A cognitive psychologist at the University of New South Wales, David White, has been testing how well humans recognise faces compared with computer algorithms.

He says computer systems that performed poorly just a few years ago have caught up and now perform the task as well as the best humans.

But he says computer algorithms may be importing the same biases as humans, who are worse at identifying people of races other than their own. “I think that’s quite a challenging task, and it's not specific to the Australian government. It's a challenging task, in general, for building these systems such that they are as accurate as possible and also don't prejudice against specific segments of society,” Mr White says.
