• 3 Posts
  • 30 Comments
Joined 1 year ago
Cake day: June 12th, 2023

  • Hmm. Are you asking in good faith, or to dogpile? Anyway, sure; I can explain why.

    The Gruesome - clickbait because “if it bleeds it leads.”
    Story - words like “story” are often plainly false when the article is a tiny blurb or fluff piece. Thankfully, this article is an actual story. But remember, it’s still bait.
    of How - clickbait because it asks a question it doesn’t answer, baiting the headline-reader to click.
    Neuralink’s Monkeys - oh, another offering at the Elon Musk altar. The press can’t get enough of Musk.
    Actually Died - more bleeding leading.

    Headlines can just be content, rather than a tease. This article title intentionally relays no new info.







  • Hmm. I’m new here. Why is this post getting downvotes (with no comments about why)?

    Edit: I originally phrased the question as being about “no-comment downvotes,” which was too easy to misunderstand, so I rephrased it. I do see downvotes, and I thought downvoting was for content that doesn’t fit the community, or for other objections where people would be expected to comment on their objection rather than silently downvote and move on.



  • anti-clickbait tldr: system uses facial recognition, complete with the expected false positives, false negatives, and bias.

    Key passage:

    A review of Clear’s methods determined its facial-recognition system to enroll new members was vulnerable to abuse, said people familiar with the review, who asked not to be identified discussing security-sensitive information.

    The computer-generated photos of prospective customers at times captured blurry images that only showed chins and foreheads, or faces obscured by surgical masks and hoodies.

    The process — which allowed Clear employees to manually verify prospective customers’ identities after its facial recognition system raised flags — created the potential for human error.

    Apparently last July “a man slipped through Clear’s screening lines at Reagan National Airport near Washington, before a government scan detected ammunition — which is banned in the cabin — in his possession.” And he’d “almost managed to board a flight under a false identity.” The TSA checkpoint found the ammunition, which is what it is supposed to do. This had nothing to do with his identity. There’s no suggestion that the passenger intended to do anything nefarious.


  • tldr: author is plainly dying, but can’t try risky new treatments because they might… harm his dying body (!?) and the poor widdle FDA might wook bad.

    We need to have a much stronger “right to try” presumption: “When Dying Patients Want Unproven Drugs,” we should let those patients try. I have weeks to months left; let’s try whatever there is to try, and advance medicine along the way. The “right to try” is part of fundamental freedom—and this is particularly true for palliative-stage patients without a route to a cure anyway. They are risking essentially nothing.