Q&A with Ed Santow: Risks and benefits of Facial Recognition
Face Value – part of Vivid Sydney at UTS
Face Value is an interactive installation that lets you experience the double-edged applications of facial recognition technology – from security checks to house approvals.
Test it for yourself – will it help or harm communities?
Give it a try from 27 May to 18 June in the foyer of UTS Building 1, 15 Broadway, Ultimo.
Technology can be a powerful tool for social progress, but there are also profound risks and threats.
Facial recognition exemplifies the double-edged nature of new and emerging technologies.
During Vivid Sydney, UTS is inviting audiences to come and experience facial recognition technology in action and decide for themselves whether it should be considered ‘tech for good’. Many of us already unlock our phones with facial recognition – a use that carries almost no risk – but when applied to uses like surveillance, profiling, or gatekeeping, the consequences can be catastrophic.
We spoke to Edward Santow, Industry Professor – Responsible Technology, about the use of Facial Recognition Technology (FRT) in Australia, and the risks and benefits associated with it.
How widely is FRT used in Australia?
ES: FRT is widely used in smartphones and tablets as an alternative to a password or PIN. It is also increasingly used to verify people’s identity in some government settings, like SmartGates at Australia’s international airports, and for a range of other purposes, such as by police to identify criminal suspects.
How accurate is FRT in identifying people by reference to demographic factors such as age, gender and skin colour?
ES: FRT is reasonably accurate for one-to-one facial verification, like unlocking a smartphone, especially when the lighting is good and it’s a high-quality device. Accuracy problems tend to be most severe with one-to-many facial identification, such as police trying to identify an individual in a crowd by matching against a large database. Here the research shows there can be high rates of error in identifying people of colour, women, people with a physical disability, and people who are particularly young or old.
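To make the distinction concrete, here is a minimal, purely illustrative Python sketch, not any real FRT product: the embedding vectors, similarity threshold and function names are all assumptions. Verification compares one face embedding against a single enrolled template, while identification searches a whole gallery, so even a small per-comparison error rate compounds across many comparisons.

```python
import numpy as np

# Hypothetical sketch only: we assume face images have already been converted
# into fixed-length embedding vectors by some face-recognition model.

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings (1.0 = identical direction)."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(probe: np.ndarray, enrolled: np.ndarray, threshold: float = 0.6) -> bool:
    """One-to-one verification: does this face match the single enrolled face?"""
    return cosine_similarity(probe, enrolled) >= threshold

def identify(probe: np.ndarray, gallery: dict, threshold: float = 0.6):
    """One-to-many identification: who in the gallery best matches this face, if anyone?"""
    best_id, best_sim = None, -1.0
    for person_id, embedding in gallery.items():
        sim = cosine_similarity(probe, embedding)
        if sim > best_sim:
            best_id, best_sim = person_id, sim
    # With a large gallery, the chance of a wrong but above-threshold match grows,
    # which is one reason one-to-many identification is more error-prone in practice.
    return (best_id, best_sim) if best_sim >= threshold else (None, best_sim)
```

The threshold value is arbitrary here; in a real deployment, where it is set determines the trade-off between false matches and false non-matches, and that trade-off can differ across demographic groups.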
Can FRT correctly evaluate a person’s physiological or behavioural tendencies?
ES: Some FRT applications purport to offer this capability, but they tend to be very unreliable at assessing these sorts of characteristics. There is very little tested and verified science behind such applications.
Do companies have to be transparent about how and when they access your facial biometrics?
ES: There is no dedicated law on FRT or biometric technology more generally. Our current laws, including the Privacy Act, contain broad loopholes, which means that in practice companies generally aren’t required to be completely transparent about when and how they use FRT.
Are there laws or regulations in place to stop the misuse of FRT, particularly against marginalised groups?
ES: Australia does not have a dedicated law dealing with FRT. There are some general laws that address parts of the risk. For example, our privacy law provides limited protection against some, but not all, of the privacy risks associated with FRT. Australia should have an FRT law that encourages positive innovation for public benefit, while setting clear red lines that prohibit harmful use of FRT.
Visit the Facial recognition technology: Towards a model law website for more information on the work UTS is doing in this space.