A new study into AI’s capabilities has found that a computer algorithm can determine if someone is gay or straight based on photos of their faces.
The Stanford University report shows that the algorithm could distinguish gay and straight men 81 per cent of the time, and between gay and straight women 74 per cent of the time, The Guardian has reported.
Gay men and women tended to have “grooming styles”, features and expressions that were “gender-atypical”, meaning men would appear more feminine and women more masculine.
Certain trends emerged in face shapes, including narrower jaws, longer noses and larger foreheads among gay men, and larger jaws and smaller foreheads among lesbians.
The study’s methodology did not account for people who identify as bisexual, transgender, gender non-binary or intersex, nor did it include non-white people.
When human ‘gaydar’ was similarly tested, the results were much less successful. Humans identified orientation correctly 61 per cent of the time for men, and 54 per cent of the time for women.
The algorithm’s success rate rose further when it was tested with multiple images of each person, reaching up to 91 per cent.
The paper suggests that these differences may support the idea that sexuality is genetically determined.
Experts are already concerned that the technology could have troubling implications, including that it could be used to out or persecute LGBTI people en masse.
Nick Rule, an associate professor of psychology at the University of Toronto who has researched gaydar, told The Guardian, “It’s certainly unsettling. Like any new tool, if it gets into the wrong hands, it can be used for ill purposes.”
With facial recognition becoming an increasingly widely adopted technology, the ethics of such software are being hotly debated.
“AI can tell you anything about anyone with enough data. The question is as a society, do we want to know?” said Brian Brackeen, CEO of facial recognition software company Kairos.
Rule suggested that while it was important to test the technology, lawmakers needed to be proactive about regulating its use.
“What the authors have done here is to make a very bold statement about how powerful this can be,” he said.
“Now we know that we need protections.”