• tal · 28 days ago

    In total, there were 118 false positives — a rate of 4.29%.

    Earlier this year, investors filed a class-action lawsuit, accusing company executives of overstating the devices’ capabilities and claiming that “Evolv does not reliably detect knives or guns.”

    I mean, in terms of performance, I’d be more concerned about the false positive rate than the false negative rate, given the context. If you miss a gun, whatever. That’s at worst just the status quo, which has been working; some money gets wasted on the machine. But if you’re incorrectly stopping more than 1 in 25 New Yorkers from getting on their train, and you apply that rate to all subway riders, that sounds like a monumental mess.
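
    The scaling argument above, as rough arithmetic. The 4.29% rate is from the article; the daily ridership figure is an assumed round number for illustration, not a figure from the source:

    ```python
    # Back-of-envelope sketch: scale the reported false-positive rate
    # to systemwide ridership.
    false_positive_rate = 0.0429   # from the article: 118 false positives, 4.29%
    daily_riders = 3_000_000       # assumption: ballpark NYC subway weekday ridership
    false_stops = daily_riders * false_positive_rate
    print(f"~{false_stops:,.0f} riders wrongly flagged per day")  # ~128,700
    ```

    Even with a deliberately conservative ridership number, the rate implies six figures of wrongful stops per day, which is the commenter's point.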

    • jettrscga@lemmy.world · 28 days ago

      With how trigger-happy police are, the false positives would lead to more deaths than they prevent. And police would claim it’s justified because the machine told them so.

      • Toribor@corndog.social · 27 days ago

        Facial recognition confirmed he was a criminal and the scanner confirmed he had a gun! Of course we opened fire instantly. How could we have known it was just some guy with a water bottle?