
Opening up Pandora’s box: Algorithms and LGBT rights

On July 15, 2019, the Bank of England revealed the new face of the £50 note: Alan Turing. The British mathematician and computer scientist not only helped crack German codes for the military during WWII, he also designed the Automatic Computing Engine and lends his name to the Turing test, which posed the question “Can machines think?”.

In 1952, after the war’s end, Alan Turing was chemically castrated. His crime was ‘gross indecency’ under the Labouchere Amendment, Section 11 of the Criminal Law Amendment Act 1885. In plainer terms, he was gay. Two years later, he died of cyanide poisoning, most likely self-inflicted.

Alan Turing was posthumously pardoned by the Queen in 2013. That same year, Brunei adopted the Syariah Penal Code Order 2013, which makes same-sex sexual acts punishable by stoning; its harshest provisions came into force in April 2019. According to CNN, 70 UN member states currently criminalize same-sex sexual relationships; seven of them impose the death penalty on consenting adults. Globally, human rights do not advance at the same pace, nor do they move linearly toward progressive politics.

While Turing and other gay people in the UK feared their government’s legal homophobia, LGBT people in 2019 face new, technology-driven threats. As technology progresses, so do the ways in which it can oppress us.

In 2017, Michal Kosinski and Yilun Wang published a peer-reviewed paper, “Deep neural networks are more accurate than humans at detecting sexual orientation from facial images”, in the Journal of Personality and Social Psychology. From the paper’s abstract:

“We used deep neural networks to extract features from 35,326 facial images. These features were entered into a logistic regression aimed at classifying sexual orientation. Given a single facial image, a classifier could correctly distinguish between gay and heterosexual men in 81% of cases, and in 74% of cases for women. Human judges achieved much lower accuracy: 61% for men and 54% for women. The accuracy of the algorithm increased to 91% and 83%, respectively, given five facial images per person.”

The research considers not only facial morphology but also makeup, hairstyle, grooming, and self-stylization. Of course, testing facial features for specific traits is reminiscent of physiognomy and phrenology. Physiognomy, the practice of inferring character from facial features, has been thoroughly debunked, and in many cases served simply as a means to justify racism and sexism.
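To make the quoted pipeline concrete, here is a minimal sketch in Python of the two-stage approach the abstract describes: a pretrained deep network turns each face image into a feature vector, and a plain logistic regression is fit on those vectors. Everything here is illustrative, not the authors’ actual code: they used VGG-Face features, so torchvision’s ResNet-18 is a stand-in, and the file paths and labels are placeholders.

    # Illustrative sketch only: a pretrained CNN (ResNet-18 standing in for
    # the VGG-Face network the authors used) extracts one feature vector per
    # face image, and a logistic regression is fit on those vectors.
    import torch
    from torchvision import models, transforms
    from PIL import Image
    from sklearn.linear_model import LogisticRegression

    cnn = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    cnn.fc = torch.nn.Identity()  # drop the classifier head, keep the features
    cnn.eval()

    preprocess = transforms.Compose([
        transforms.Resize(256),
        transforms.CenterCrop(224),
        transforms.ToTensor(),
        transforms.Normalize(mean=[0.485, 0.456, 0.406],
                             std=[0.229, 0.224, 0.225]),
    ])

    def extract_features(paths):
        # Returns one feature vector per image path.
        with torch.no_grad():
            batch = torch.stack([preprocess(Image.open(p).convert("RGB"))
                                 for p in paths])
            return cnn(batch).numpy()

    # Hypothetical placeholder data; the study used dating-site photos.
    train_paths, train_labels = ["face_a.jpg", "face_b.jpg"], [0, 1]
    clf = LogisticRegression(max_iter=1000)
    clf.fit(extract_features(train_paths), train_labels)
    print(clf.predict_proba(extract_features(["query.jpg"])))

The unsettling point of the sketch is how little custom machinery is involved: both components are off-the-shelf, which is precisely why the authors stress that they built nothing new.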

What are the ethical implications of this study and its claims? Might governments weaponize this kind of neural network to prosecute people whose features they deem associated with a non-heterosexual orientation?

An authors’ note on the paper, last updated in April 2019, discusses the human rights implications of this research and its findings. It reads: “We studied existing facial recognition technologies, already widely used by companies and governments, to see whether they can detect sexual orientation more accurately than humans. We were terrified to find that they do. This presents serious risks to the privacy of LGBTQ people.”

These researchers raise important questions about the intersection of technology and human rights, particularly for LGBT individuals. Technology that can be misused to harm LGBT people already exists, and it can be deployed by anyone willing to recreate the datasets and test the methods. As the researchers insist, they built nothing new; they merely used tools and data that already exist. According to the paper’s methodology section, images were collected from an unnamed dating site and from Facebook profile pictures.

Thus, it is not baseless to worry that governments or individuals with ill intent may use neural networks not only to invade privacy but also to mark people as targets for hate crimes or other forms of discrimination. The danger is not limited to governments with anti-LGBT laws: even LGBT-progressive countries contain homophobic individuals and organizations who would use a person’s sexual identity as a means to discriminate against them.

Technology as a means of targeting LGBT people reached mainstream news after Russian LGBT activist Yelena Grigoryeva, a member of the Alliance of Heterosexuals and LGBT for Equality, was killed in St. Petersburg. Grigoryeva had been targeted by a website that shared the personal information of LGBT people, including names and addresses, and encouraged “hunting” them. The website turned homophobic hate crimes into a game, promising rewards and prizes to attackers. What if groups such as this one start using facial recognition technology to extend their reach and further intrude on LGBT people’s privacy and right to security?

With advancing technology must also come discussion of its ethical effects. Facial recognition and surveillance are already used to track individuals. According to Forbes, Chinese facial recognition technology is used to track ethnic minorities: the software labels what it assumes a person’s ethnic heritage to be, marking individuals as either Han Chinese or Uyghur. In May 2019, Human Rights Watch reported on the relationship between a Chinese surveillance app, the Integrated Joint Operations Platform (IJOP), and the Xinjiang re-education camps.

The report states: “Analysis of the IJOP app reveals that authorities are collecting massive amounts of personal information—from the color of a person’s car to their height down to the precise centimeter—and feeding it into the IJOP central system, linking that data to the person’s national identification card number.”

In the name of human safety, we have a right to ask: have we opened up Pandora’s box?

Sarah Schlothauer

Sarah Schlothauer is an editor for Conditio Humana. She received her bachelor’s degree from Monmouth University and is currently enrolled at Goethe University in Frankfurt, Germany, where she is working on her master’s degree.
