Detroit woman sues city after being falsely arrested while pregnant due to facial recognition technology

A Detroit woman is suing the city and a police detective after she was falsely arrested because of facial recognition technology while she was eight months pregnant, according to court documents.

  • SatanicNotMessianic@lemmy.ml · 1 year ago

    According to a recent review, 100% of the people falsely arrested via facial recognition matches have been black.

    The technology needs to be legally banned from law enforcement applications, because law enforcement is not making a good faith effort to use the technology.

    • rockSlayer@lemmy.world · 1 year ago

      We should ban patrol automation software too. These tools use historical arrest data to automatically generate patrol routes. Guess which neighborhoods have a history of disproportionate policing.
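
      A minimal toy sketch of that feedback loop (purely illustrative; the neighborhood names, counts, and offense rate below are assumptions, not data from any real system): patrols get routed to wherever the data shows the most recorded arrests, but arrests can only be recorded where patrols actually go, so an initial disparity keeps reinforcing itself even if the underlying offense rates are identical.

      ```python
      import random

      # Assumed toy model, not any vendor's actual product: route patrols by
      # recorded arrest counts, and only record new arrests where patrols go.
      recorded = {"A": 120, "B": 100}   # historical arrest counts (initial disparity)
      TRUE_OFFENSE_RATE = 0.3           # identical underlying rate in both neighborhoods

      random.seed(0)
      for day in range(1000):
          target = max(recorded, key=recorded.get)   # patrol follows the data
          if random.random() < TRUE_OFFENSE_RATE:    # offenses occur everywhere equally,
              recorded[target] += 1                  # but are only recorded where police are

      print(recorded)   # "A" keeps growing while "B" stays frozen at 100
      ```

      The point of the toy model: the data never gets a chance to correct itself, because the neighborhood that starts behind is never observed.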

      • SatanicNotMessianic@lemmy.ml · 1 year ago

        The problems with the approaches that tend to get used should cause absolute outrage. They’re the kind of methods that would get anyone laughed off of any college campus.

        The problem is that they lend a semblance of scientific justification to confirm the biases of both police departments and many voters. Politicians look to statisticians and scientists to tell them why they’re right, not why they’re wrong.

        That’s why it’s so important for these kinds of issues to make the front pages.

        • brygphilomena@lemmy.world · 1 year ago

          It’s great how statistics can be used to support basically anything the author wants them to. Identifying initial biases in the data is just as important as verifying the statistics independently.

      • gelberhut@lemdro.id · 1 year ago

        I do not see a bias here. It did not assume that the criminal is black by default or anything like that; it simply works much worse for black people.

        There could be different reasons for that. For example, it could be bad at recognizing black faces in poor lighting conditions.

        • phillaholic@lemm.ee · 1 year ago

          This is a systemic bias; in this case, systemic racism.

          The outcome is that a product or service disproportionately targets Black people. It wasn’t designed to do that, so it’s not overt racism; it just worked out that way.

          Camera systems inherently have a harder time with dark skin. That’s a fact. However, it’s been found time and time again that these systems are predominantly created by and tested on light-skinned individuals, so the bias is built into the flawed creation. You can see this in Hollywood, where lighting has only recently been set up to highlight dark skin, in shows with majority-Black casts and showrunners like Atlanta and Insecure.

          • gelberhut@lemdro.id · 1 year ago

            Could you please point out where it disproportionately targets black people? Does it identify black people instead of white people? That would be racism.

            If it just matches black people to completely wrong faces, that is shoddy quality, which, by the way, works in favour of black criminals.

              • gelberhut@lemdro.id · 1 year ago

                Yes, I know. And I agree that when this happens under conditions where a human can do the work well, it is a bias. However, when it happens under conditions where a human cannot do the work either, it could just be physics, like dark objects being harder to see in the dark.

                And in the exact case being discussed, it is not clear what the reason was.

                It is a matter of interpretation as well: one could say “the system helps black criminals avoid being arrested.”

                For me, this false-recognition statistic is an alarming signal that the system works badly and that deeper analysis must be done; in the meantime, police must be more careful in how they handle its results.

                • phillaholic@lemm.ee · 1 year ago

                  The outcome of the bad technology and policing is disproportionately affecting dark-skinned people. That’s where it becomes systemic racism. No one decided to design a system to arrest more Black people; the outcome of various factors just ended up that way. Sometimes it’s a consequence of nature, but most of the time there are clear reasons, like a lack of representation in design and testing that would have caught the problems earlier.

    • fabian_drinks_milk@lemmy.fmhy.net · 1 year ago

      A similar thing happened here in the Netherlands. Algorithms were used to detect fraud but had a discriminatory bias and falsely accused thousands of parents of child benefits fraud. Those parents ran into huge financial problems because they had to pay back the allowances; many even had their children taken away and to this day haven’t gotten them back.

      The Third Rutte Cabinet did resign over this scandal, but many of those politicians came back in other positions, including Prime Minister Rutte, because that’s somehow allowed.

      Wikipedia (English): https://en.m.wikipedia.org/wiki/Dutch_childcare_benefits_scandal

    • frankblack@lemmy.world · 1 year ago

      Well, it does have its place. DoD and DHS have a human verify the match after the system certifies it; after that human “touch” is applied, mistakes like this do not occur. What needs to happen is to follow the process DoD and CBP have created for verifying these so-called matches, to reduce the impact on Black people.

      Source: I’m the former Identity Operations Manager for a major agency.

  • AutoTL;DR@lemmings.world (bot) · 1 year ago

    This is the best summary I could come up with:


    A Detroit woman is suing the city and a police detective after she was falsely arrested because of facial recognition technology while she was eight months pregnant, according to court documents.

    Porcha Woodruff, 32, was getting her two children ready for school on the morning of Feb. 16 when six police officers showed up at her doorstep and presented her with an arrest warrant alleging robbery and carjacking.

    “Ms. Woodruff later discovered that she was implicated as a suspect through a photo lineup shown to the victim of the robbery and carjacking, following an unreliable facial recognition match,” court documents say.

    When Oliver learned that a woman had returned the victim’s phone to the gas station, she ran facial technology on the video, which identified her as Woodruff, the lawsuit alleges.

    On the day Woodruff was arrested, she and her fiancé urged officers to check the warrant to confirm whether the woman who committed the crime was pregnant, which they refused to do, the lawsuit alleges.

    The office confirmed that facial recognition prompted police to include the plaintiff’s photo in a six-pack, or array of images of potential suspects in the warrant package.


    I’m a bot and I’m open source!

  • Alexstarfire@lemmy.world · 1 year ago

    I’m going to buck the trend here and say this is less about the facial recognition software. The police used an 8-year-old photo even though they had something more recent available. Then the victim identified the woman. The only thing the software did was put her in the lineup.

    I’m very much against facial recognition, even if it’s 100% accurate. It’s because it will get abused. Just like any other tech that reduces privacy.

    • pitninja@lemmy.pit.ninja · 1 year ago

      Eyewitnesses are notoriously unreliable at picking people out of a lineup as well. But I can kind of understand how if two unreliable systems point to the same person, that could be seen as enough for an arrest. It shouldn’t have taken nearly as long for her to be cleared of any charges, however.
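
      A quick back-of-the-envelope sketch of why that combination is weaker than it looks (all numbers below are assumptions for illustration, not figures from the article): a face search that is wrong for only one person in a thousand still produces hundreds of false hits against a large gallery, and a six-pack built from that hit means the eyewitness is shown the very face the algorithm already picked, so the two “independent” signals are largely the same signal.

      ```python
      # Illustrative, assumed numbers only.
      false_match_rate = 1 / 1000      # assumed per-person false-match probability
      gallery_size = 500_000           # assumed size of the mugshot gallery searched

      expected_false_hits = false_match_rate * gallery_size
      print(expected_false_hits)       # ~500 innocent people "match" the probe image

      # If the actual culprit is not in the gallery at all, the top hit is almost
      # certainly an innocent look-alike, and a witness picking that same look-alike
      # out of a lineup seeded by the hit mostly echoes the algorithm's mistake
      # rather than independently confirming it.
      ```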

    • phillaholic@lemm.ee · 1 year ago

      It’s sort of the “guns don’t kill people, people kill people” argument. It just gives a shitty cop cover to keep being shitty. The tools should be improved to eliminate that cover unless it’s far more accurate.

  • TIEPilot@lemmy.world · edited · 1 year ago

    And all of the facial recognition failures have been on black people… Is this a Family Guy episode?