MIT Researcher Exposing Bias in Facial Recognition Tech Triggers Amazon’s Wrath

April 8, 2019

  • April 8, 2019 at 2:53 pm
    rob says:
    Well-loved.
    Thumb up 12
    Thumb down 2

    As an Uglo-American, this new technology offends me. I have attempted to use facial recognition software many times on my smartphone, only to have it burst into flames each time. Fingerprint recognition should be enough.

    • April 8, 2019 at 3:22 pm
      Rosenblatt says:
      Thumb up 7
      Thumb down 1

      I hear ya, Rob. I’ve got the same issue. That’s why I don’t let people take pictures of me anymore – their lenses kept breaking! :D

    • April 8, 2019 at 4:26 pm
      Perplexed says:
      Thumb up 6
      Thumb down 1

      Good one, Rob!

  • April 8, 2019 at 4:54 pm
    Craig Cornell says:
    Poorly-rated.
    Thumb up 4
    Thumb down 16

    Hidden due to low comment rating.

    • April 8, 2019 at 5:09 pm
      Captain Planet says:
      Hot debate.
      Thumb up 15
      Thumb down 7

      March 20, 2019 at 12:37 pm
      Craig Cornell says:
      Like or Dislike:
      Thumb up 1
      Thumb down 1

      Partisan . . . Hack. Do you even realize how boring you are? How predictable and lacking in any interesting thoughts? Anything even remotely amusing? So full of hate. Sad.

      • April 8, 2019 at 5:11 pm
        Craig Cornell says:
        Poorly-rated.
        Thumb up 3
        Thumb down 15

        Hidden due to low comment rating.

      • April 9, 2019 at 4:55 pm
        PouellerBeaReport Fail says:
        Thumb up 3
        Thumb down 5

        How boilerplate of you to reply with ‘Date/ Commenter Name/ Like or dislike’ as opening comment!

    • April 10, 2019 at 1:16 am
      Libby says:
      Thumb up 12
      Thumb down 3

      I do have to wonder why all of Craig’s posts involve race and/or party. A little sensitive, Craig? Not everything is black and white (no pun intended) or a partisan issue. You may need a new pair of blinders. Maybe some with peripheral vision?

      • April 10, 2019 at 11:21 am
        Craig Cornell says:
        Hot debate.
        Thumb up 7
        Thumb down 15

        Did you read the article? Hello! The ARTICLE tied race to facial recognition. In fact, that was the only point of the story.

        It is the Left that wants to find racism in everything, not me. If you knew me, you would laugh at how diverse (boring word) my friends and family are. And how tired everyone I know is of hearing race brought into every issue, in order to divide us further.

        The computer can’t recognize dark faces very well? Big deal. I am sure the software has a lot of limitations beyond skin color that will never make it into an Insurance Journal article.
        (And is this insurance related at all? No. More lefty propaganda from IJ).

        (“Hey look, that cup is racist! And that rock! What about that tree over there!”)

    • April 11, 2019 at 10:23 am
      ??? says:
      Well-loved.
      Thumb up 12
      Thumb down 2

      What does this article have to do with racism rather than the failings of the recognition system? It is supposed to be a facial recognition system, yet it fails on certain people at a high rate. It has been brought to the manufacturers’ attention and should be fixed.

      • April 11, 2019 at 12:49 pm
        Craig Cornell says:
        Thumb up 6
        Thumb down 13

        Please, please show me where the article points out flaws in the system OTHER than being unable to accurately read dark faces. That is the ONLY issue cited in the article, and then the article goes on to extrapolate the horrible racial consequences (law enforcement).

        The article didn’t point out any other limitations of the facial recognition software. Eye shape and color? Size of nose? Length of chin? Ear shape? Any other limitations in the software mentioned in the article?

        Nope. Dark skins. Welcome to America, 2019, where everything is racist, including your computer program.

        • April 11, 2019 at 2:28 pm
          ??? says:
          Thumb up 10
          Thumb down 4

          What are you talking about!?

          That is the exact flaw with the system: it misreads dark faces 34% of the time, white faces less than 1%. That’s an issue. What system would you want working for you that fails 34% of the time? “Oh, autonomous cars are dangerous and can lead to accidents because their system isn’t functioning correctly.” “Yeah, but what OTHER issues are there!? This country is mechanically prejudiced!”

          The problem can negatively affect a large number of people.

          There need to be additional problems before you care?

          You are such an ignorant racist. Pull your pants up; your bias is showing.

          • April 11, 2019 at 4:58 pm
            Craig Cornell says:
            Poorly-rated.
            Thumb up 2
            Thumb down 13

            Hidden due to low comment rating.

          • April 14, 2019 at 7:09 pm
            A Computer Scientist says:
            Thumb up 3
            Thumb down 3

            It’s probably not a flaw in the system at all but an error in the training data that was used. An AI trained to recognize only black cats is gonna have problems when you throw in white and striped ones.
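The training-data point above can be sketched with a toy one-dimensional "matcher" (purely illustrative; the score distributions and numbers are invented, not from the article): a single match threshold tuned on data dominated by one group ends up serving an under-represented group worse, even though no rule in the code mentions any group.

```python
import random

random.seed(42)

def scores(mean, n):
    # Similarity scores clipped to [0, 1].
    return [min(1.0, max(0.0, random.gauss(mean, 0.1))) for _ in range(n)]

# Genuine-match scores: 95% group A (centered at 0.8), 5% group B
# (centered at 0.6 -- e.g. lower image contrast). Impostor scores for
# both groups are centered at 0.3.
train_genuine = scores(0.8, 950) + scores(0.6, 50)
train_impostor = scores(0.3, 1000)

# Choose the single global threshold that minimizes overall training error.
best_t = min((t / 100 for t in range(101)),
             key=lambda t: sum(s < t for s in train_genuine)
                         + sum(s >= t for s in train_impostor))

# False non-match rate per group at that one threshold: group B's
# genuine matches fall below the threshold far more often.
fnr_a = sum(s < best_t for s in scores(0.8, 1000)) / 1000
fnr_b = sum(s < best_t for s in scores(0.6, 1000)) / 1000
```

Because group A dominates the training set, the overall-error-minimizing threshold sits comfortably below A's genuine scores but clips into B's, so fnr_b comes out much larger than fnr_a with no intent anywhere in the pipeline.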

  • April 9, 2019 at 10:33 am
    CL PM says:
    Well-loved.
    Thumb up 18
    Thumb down 4

    When my brother died a few years ago, as executor of his estate, I needed to get into his laptop. He had facial recognition software on it as part of the log in process. Even though I didn’t think we looked that much alike, we are both hair-challenged and the software recognized me as him and I was able to break into his laptop. Must have liked all that pale skin with our very high foreheads. I’ve been wary of the accuracy of this software ever since.

    • April 9, 2019 at 8:23 pm
      Craig Cornell says:
      Poorly-rated.
      Thumb up 5
      Thumb down 18

      Hidden due to low comment rating.

  • December 9, 2019 at 3:25 am
    Zeph Smith says:
    Thumb up 5
    Thumb down 1

    I think that “racially biased” is an inappropriate term for software that is less accurate for some groups of people. It’s quite appropriate to bring up any such flaws – just as it would be if the software were easily fooled by hats or changes in eyeglasses. But the term “racially biased” is emotionally charged and suggests a negative effect, even negative intent, in the public mind.

    Many stories note that AI systems are created by human programmers, and humans have implicit bias, so their creations will obviously reflect those biases. That narrative is misleading – as reported to date, none of these cases involve programmers sneaking their personal racial biases into the code, nor is that likely to have happened. But very few readers understand the technology or take the time to read and parse carefully – they just pick up that AI is biased against blacks too, because it’s created by whites.

    The flaws in facial recognition systems likely stem from a combination of lower brightness and contrast in some faces, and perhaps from the training sets having more people of some races/ethnicities than others. Yes, it’s good to fix those flaws, if possible; or at least reduce them. But that’s in no way intentional or due to unconscious human bias.

    And the potential link to damaging outcomes is always nebulous. If it’s a real-world danger, describe the incidents where it results in real harm. Is the problem that criminals of some races are less often personally identifiable in security footage at the scene of a crime, and so go uncaught more often than criminals of other races (failure to match)? Or being unable to enter secured areas they should be authorized for? Or are people being frequently detained, briefly or extensively, at airports or stadiums because of a false positive match with a banned person? Or are banned people of certain races going undetected more often? Or what?

    Some of those might be minor or major hassles for people with darker skin tones, and some might be advantages.

    I suspect that if the facial recognition software had been more accurate for darker skin tones, there would be cries of bias because it oversurveilled people of color compared to whites; or on the flip side, one could say that the facial recognition is racially biased against whites because its higher accuracy makes it more intrusive on their privacy (again, I have little doubt this would be argued in the other direction).

    Let’s skip all that misdirected politicization. Just report that the technology has a harder time with darker faces, producing more false positive matches, false negative matches, and failed searches for them; it’s strongly in the vendors’ interest to keep improving their offerings with regard to any source of error.
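The kind of per-group error reporting suggested above is easy to make concrete. A minimal sketch (the data layout and numbers are hypothetical, not taken from the article or any real benchmark):

```python
from collections import defaultdict

def error_rates(results):
    """results: iterable of (group, predicted_match, actual_match) tuples.
    Returns {group: (false_positive_rate, false_negative_rate)}."""
    counts = defaultdict(lambda: {"fp": 0, "fn": 0, "pos": 0, "neg": 0})
    for group, predicted, actual in results:
        c = counts[group]
        if actual:
            c["pos"] += 1          # genuine match
            c["fn"] += not predicted
        else:
            c["neg"] += 1          # impostor comparison
            c["fp"] += predicted
    return {g: (c["fp"] / max(c["neg"], 1), c["fn"] / max(c["pos"], 1))
            for g, c in counts.items()}

# Hypothetical results: group B's genuine matches are missed far more often.
demo = ([("A", True, True)] * 99 + [("A", False, True)] +
        [("B", True, True)] * 66 + [("B", False, True)] * 34)
rates = error_rates(demo)
```

Breaking errors out by type and group like this is more informative than a single headline number, since false positives and false negatives carry very different real-world consequences.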

    Let’s not make it a source of racial tension just to get headlines, by labeling it with the highly emotionally charged term “racial bias” when that’s not the most accurate description in the first place.

    Or else describe how and how often it causes actual harm; perhaps there really is a significant differential racial impact that hasn’t yet been reported, in which case I may reconsider. I have no sympathy for real racial discrimination; I just don’t want us to be misled by constantly bringing in racial concepts when they aren’t needed.

  • September 14, 2022 at 10:13 am
    Marilyn says:
    Thumb up 0
    Thumb down 2

    Honestly such a big issue! Would never expect a big corporation such as Amazon to mess up this badly. DO BETTER BEZOS!!!!


