Social justice "subjectivity" versus machine learning "objectivity"

#1
C C
https://theconversation.com/gender-is-pe...onal-90798

EXCERPT: . . . As digital technologies become more powerful and sophisticated, their designers are trying to use them to identify and categorize complex human characteristics, such as sexual orientation, gender and ethnicity. The idea is that with enough training on abundant user data, algorithms can learn to analyze people’s appearance and behavior – and perhaps one day characterize people as well as, or even better than, other humans do.

Gender is a hard topic for people to handle. It’s a complex concept with important roles both as a cultural construct and a core aspect of an individual’s identity. Researchers, scholars and activists are increasingly revealing the diverse, fluid and multifaceted aspects of gender. In the process, they find that ignoring this diversity can lead to both harmful experiences and social injustice. For example, according to the 2016 National Transgender Survey, 47 percent of transgender participants stated that they had experienced some form of discrimination at their workplace due to their gender identity. More than half of transgender people who were harassed, assaulted or expelled because of their gender identity had attempted suicide.

Many people have, at one time or another, been surprised, confused or even angered to find themselves mistaken for a person of another gender. When that happens to someone who is transgender – as an estimated 0.6 percent of Americans, or 1.4 million people, are – it can cause considerable stress and anxiety.

In our recent research, we interviewed 13 transgender and gender-nonconforming people about their general impressions of automatic gender recognition technology. We also asked them to describe their responses to imaginary future scenarios where they might encounter it. All 13 participants were worried about this technology and doubted whether it could offer their community any benefits.

Of particular concern was the prospect of being misgendered by it; in their experience, gender is largely an internal, subjective characteristic, not something that is necessarily or entirely expressed outwardly. Therefore, neither humans nor algorithms can accurately read gender through physical features, such as the face, body or voice.

They described how being misgendered by algorithms could potentially feel worse than being misgendered by humans. Technology is often perceived as objective and unbiased, so being wrongly categorized by an algorithm would reinforce the misconception that a transgender identity is inauthentic. One participant described how they would feel hurt if a “million-dollar piece of software developed by however many people” decided that they are not who they themselves believe they are....

MORE: https://theconversation.com/gender-is-pe...onal-90798