I did fake Bayesian math with some plausible numbers, and found that if I started out believing there was a 20% per decade chance of a lab leak pandemic, then if COVID was proven to be a lab leak, I should update to 27.5%, and if COVID was proven not to be a lab leak, I should stay around 19-20%

This is so confusing: why bother doing “fake” math? How does he justify these numbers? Let’s look at the footnote:

Assume that before COVID, you were considering two theories:

  1. Lab Leaks Common: There is a 33% chance of a lab-leak-caused pandemic per decade.
  2. Lab Leaks Rare: There is a 10% chance of a lab-leak-caused pandemic per decade.

And suppose before COVID you were 50-50 about which of these were true. If your first decade of observations includes a lab-leak-caused pandemic, you should update your probability over theories to 76-24, which changes your overall probability of pandemic per decade from 21% to 27.5%.

Oh, he doesn’t, he just made the numbers up! “I don’t have actual evidence to support my claims, so I’ll just make up data and call myself a ‘good Bayesian’ to look smart.” Seriously, how could a reasonable person have been expected to be concerned about lab leaks before COVID? It simply wasn’t something in the public consciousness. This looks like some serious hindsight bias to me.
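
For the record, the update itself is mechanical; a few lines of code reproduce every number in the footnote, which is rather the point: the only things doing any work are the invented inputs (the 33%, the 10%, and the 50-50 prior). A quick sketch:

```python
# Reproduce the footnote's two-theory update. Every input here is
# Scott's invention, not data.
p_common, p_rare = 0.33, 0.10  # per-decade pandemic chance under each theory
w = 0.5                        # 50-50 prior over the two theories

prior = w * p_common + (1 - w) * p_rare  # 21.5% per decade (quoted as 21%)

# Decade 1 contains a lab-leak pandemic:
w_leak = w * p_common / prior                        # ~0.767 (quoted as 76-24)
if_leak = w_leak * p_common + (1 - w_leak) * p_rare  # ~27.7% (quoted as 27.5%)

# Decade 1 contains no lab-leak pandemic:
w_none = w * (1 - p_common) / (w * (1 - p_common) + (1 - w) * (1 - p_rare))
if_none = w_none * p_common + (1 - w_none) * p_rare  # ~19.8% (quoted as 19-20%)

print(f"{prior:.1%} -> {if_leak:.1%} (leak) or {if_none:.1%} (no leak)")
```

Up to rounding, that’s the quoted 21% → 27.5% (leak proven) and 19-20% (leak disproven). The machinery is fine; it’s the inputs that are garbage in, garbage out.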

I don’t entirely accept this argument - I think whether or not it was a lab leak matters in order to convince stupid people, who don’t know how to use probabilities and don’t believe anything can go wrong until it’s gone wrong before. But in a world without stupid people, no, it wouldn’t matter.

Ah, no need to make the numbers make sense, because stupid people wouldn’t understand the argument anyway. Quite literally: “To be fair, you have to have a really high IQ to understand my shitty blog posts. The Bayesian math is extremely subtle…” And convince stupid people of what, exactly? He doesn’t say, so what was the point of all the fake probabilities? What a prick.

  • locallynonlinear@awful.systems · 10 months ago

    Ah, if only the world wasn’t so full of “stupid people” updating their bayesians based off things they see on the news, because you should already be worried about, and calculating your distributions for… inhales deeply terrorist nuclear attacks, mass shootings, lab leaks, famine, natural disasters, murder, sexual harassment, conmen, decay of society, copyright, taxes, spitting into the wind, your genealogy results, comets hitting the earth, UFOs, politics of any and every kind, and tripping on your shoe laces.

    What… insight did any of this provide? Seriously. Analytical statistics is a mathematically consistent means of being technically not wrong, while using a lot of words, in order to disagree on feelings, and yet saying nothing.

    Risk management is not, in fact, a statistical question. It’s an economics question about your opportunities. It’s why prepping is better seen as a hobby, a coping mechanism, and not as a viable means of surviving the apocalypse. It’s why even when an EA uses their superpowers of Bayesian rationality, the answer in the magic eight ball is always just “try to make money, stupid”.

  • Architeuthis@awful.systems · 10 months ago

    Hi, my name is Scott Alexander and here’s why it’s bad rationalism to think that widespread EA wrongdoing should reflect poorly on EA.

    The assertion that having semi-frequent sexual harassment incidents go public is actually an indication of health for a movement, since it’s evidence that there’s no systemic coverup going on (and besides, everyone’s doing it), is, uh, quite something.

    But surely of 1,000 sexual harassment incidents, the movement will fumble at least one of them (and often the fact that you hear about it at all means the movement is fumbling it less than other movements that would keep it quiet). You’re not going to convince me I should update much on one (or two, or maybe even three) harassment incidents, especially when it’s so easy to choose which communities’ dirty laundry to signal boost when every community has a thousand harassers in it.

    • titotal@awful.systems · 10 months ago

      ahh, I fucking haaaate this line of reasoning. Basically saying “If we’re no worse than average, therefore there’s no problem”, followed by some discussion of “base rates” of harassment or whatever.

      Except that the average rate of harassment and abuse, in pretty much every large group, is unacceptably high unless you take active steps to prevent it. You know what’s not a good way to prevent it? Downplaying reports of harassment and calling the people bringing attention to it biased liars, and explicitly trying to avoid kicking out harmful characters.

      Nothing like a so-called “effective altruist” crowing about having a C- passing grade on the sexual harassment test.

    • self@awful.systems (mod) · 10 months ago

      and often the fact that you hear about it at all means the movement is fumbling it less than other movements that would keep it quiet

      I just can’t get over how far this is from reality. like fuck, for a lot of these things the controversy is the community covering for the abuser, or evidence coming out that sexual harassment was covered up in the past. depressingly often in tech, the community doesn’t even try to keep it quiet; instead they just loudly endorse the abuser or talk about how there’s nothing they can do.

    • swlabr@awful.systems · 10 months ago

      Scott: “Hmm, the reputation of the EA community that I am part of and love for some reason is tanking, due to the bad actions of its luminaries. What can I do to help? I know, I’ll bring up 9/11”

      Empty room: “…”

      “And I’ll throw out some made up statistics about terrorist attacks and how statistically we were due for a 9/11 and we overreacted by having any response whatsoever. And then I’ll show how that’s the same as when someone big in EA does something bad.”

      “…”

      “Especially since it’s common for people to, after a big scandal, try and push their agenda to improve things. We definitely don’t want that.”

      “…”

      “Also, on average there’s less SA in STEM, and even though there is still plenty of SA, we don’t need to change anything, because averages.”

      “…”

      “Anyway, time for dexy no. 5”

      • hirudiniformes@awful.systems (OP) · 10 months ago

        Hmm, the reputation of the EA community that I am part of and love for some reason is tanking, due to the bad actions of its luminaries.

        “And it would be clear I’m full of shit if I put this at the start of the article, so I’ll bury the lede behind a wall of text”

  • swlabr@awful.systems · 10 months ago

    Scott is saying essentially that “one data point doesn’t influence the data as a whole that much” (usually true)… “so therefore you don’t need to change your opinions when something happens” which is just so profoundly stupid. Just so wrong on so many levels. It’s not even correct Bayesianism!

    (if it happens twice in a row, yeah, that’s weird, I would update some stuff)

    ??? Motherfucker have you heard of the paradox of the heap? What about all that other shit you just said?

    What is this really about, Scott???

    Do I sound defensive about this? I’m not. This next one is defensive. [line break] I’m part of the effective altruist movement.

    OH ok. I see now. I mean I’ve always seen, really, that you and your friends work really hard to come up with ad hoc mental models to excuse every bit of wrongdoing that pops up in any of the communities you’re in.

    You definitely don’t get this virtue by updating maximally hard in response to a single case of things going wrong. […] The solution is not to update much on single events, even if those events are really big deals.

    Again, this isn’t correct Bayesian updating. The formula is the formula. Biasing against recency is not in it. And that’s just within Bayesian reasoning!
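
    For reference, the rule has exactly two moving parts, a prior and a likelihood; nothing in it discounts an observation for being recent, dramatic, or emotionally salient:

    $$P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E)}$$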

    In a perfect world, people would predict distributions beforehand, update a few percent on a dramatic event, but otherwise continue pursuing the policy they had agreed upon long before.

    YEAH BECAUSE IT’S A PERFECT WORLD YOU DINGUS.

    • Tar_Alcaran@sh.itjust.works · 10 months ago

      Complete sidenote, but I hate how effective altruism has gone from “charities should spend more money on their charity and not on executive bonuses, here are the ones that don’t actually help anyone” to “I believe I will save infinity humans by colonizing Mars, so you can just starve to death today”.

      • swlabr@awful.systems · 10 months ago

        I suspect a large portion of people in EA leadership were already on the latter train and posturing as the former. The former is actually kinda problematic in its own way! If a problem was solvable purely by throwing money at it, then what is the need for a charity at all?

        • Clifton Royston@wandering.shop · 10 months ago

          @swlabr @Tar_alcaran

          Well, because (most) governments (mostly) *don’t* throw money at the problems that *could* be solved by throwing money at them.

          Look at the malaria prevention or guinea worm eradication programs, for instance. Ten years ago or so, my first encounter with EA was a website talking about how many lives you could save or improve by giving money to NGOs focused on those issues.

          Hell, look at homelessness in most “Western” countries, except Finland. Look at UBI. etc.

          • swlabr@awful.systems · 10 months ago

            Ok, so to be clear, I would (perhaps naively) prefer it if we didn’t have charities/NGOs and that governments would handle solving problems and helping people entirely. Of course, this is reductive; there are probably plenty of spaces where NGOs and charities are better suited for approaching some issues.

            That being said, while money (or a lack thereof) is the main issue in solving many problems, you still need all kinds of work to make it effective. In the case of malaria prevention, a cause EA deems to be cost-effective, you still need to pay staff to carry out logistics to deliver whatever nets or vaccines you buy with money. You wouldn’t want someone incompetent at the helm; that could cause your cost-effectiveness to go down. And how do you incentivize competent people to stay in leadership positions? There are plenty of ways, but executive bonuses will be at the top of that list.

            Anyway, my main issue with EA has gotta be how it launders false morality and money into morality. The false morality is the X-risk shit. The money is the money from working in tech.

            • Clifton Royston@wandering.shop · 10 months ago

              @swlabr

              Oh I’m in total agreement with you on all these points.

              I really detest the bizarre self-delusional stuff that masquerades as “Altruism” for the TREACLES people. (Yes that’s a deliberate mis-acronyming.)

              What I was trying to express, but not clearly enough, was that I’d really like someone to help with working out what’s a genuinely effective use of effort and/or money. It’s doubly frustrating that the people who looked like they might be trying to do that were just a crank AI cult.

  • Evinceo@awful.systems · 10 months ago

    Pay no attention to the man behind that curtain. The rate of men behind curtains is actually quite low. Do not doubt the great and powerful Oz.

  • gerikson@awful.systems · 10 months ago

    OK my knowledge of Bayes is rusty at best, but isn’t the idea that the occurrences should be relatively common, and/or not correlated?

    So far, there has been zero or one[1] lab leak that led to a world-wide pandemic. Before COVID, I doubt anyone was even thinking about the probabilities of a lab leak leading to a worldwide pandemic.

    Also, ideally, if there was a lab leak, then people running labs would take note and ensure that that particular failure mode doesn’t happen again. Thus the probability of an occurrence would be less than the first time it happened, because people actually take note of what has happened and change stuff.

    Scottyboy could have used something that has occurred multiple times, like a nuclear power plant accident, but his audience loves nuclear power, so that’s a non-starter. Also it’s a given that the mainstream press is the big bad in the fight against nuclear, just because serious accidents with widespread death and economic destruction happen again and again with nuclear power.

    Raising the lab leak “hypothesis” is just signalling to his base.


    [1] depending on where you stand in current US politics

    • mountainriver@awful.systems · 10 months ago

      Also, if you think either of these is true:

      Lab Leaks Common: There is a 33% chance of a lab-leak-caused pandemic per decade.
      Lab Leaks Rare: There is a 10% chance of a lab-leak-caused pandemic per decade.

      You should probably be campaigning to increase safety or shut down the labs you think would be responsible. 10% risk of pandemic per decade due to lab leaks (so in addition to viruses mutating on their own) isn’t rare or an acceptable risk.
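
      And the “rare” number compounds fast; a quick check (assuming “Lab Leaks Rare” is true and decades are independent, as in the toy model):

      ```python
      # Even "Lab Leaks Rare" (10% per decade) implies roughly a two-in-three
      # chance of at least one lab-leak pandemic per century.
      p_decade = 0.10
      p_century = 1 - (1 - p_decade) ** 10
      print(f"{p_century:.0%}")  # 65%
      ```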

    • locallynonlinear@awful.systems · 10 months ago

      So far, there has been zero or one[1] lab leak that led to a world-wide pandemic. Before COVID, I doubt anyone was even thinking about the probabilities of a lab leak leading to a worldwide pandemic.

      So, actually, many people were thinking about lab leaks, and the potential of a worldwide pandemic, despite Scott’s suggestion that stupid people weren’t. For years now, bioengineering has been concerned with accidental lab leaks because the understanding that risk existed was widespread.

      But the reality is that guessing at probabilities of this sort of thing still doesn’t change anything. It’s up to labs to pursue safety protocols, which happens at the economic edge of the opportunity vs the material and mental cost of being diligent. Lab leaks may not change probabilities, but the events of them occurring do cause trauma, which acts not as some Bayesian correction but as an emotional correction, so that people’s motivation to at least pay more attention increases for a short while.

      Other than that, the greatest rationalist on earth can’t do anything with their statistics about lab leaks.

      This is the best paradox. Not only is Scott wrong to suggest people shouldn’t be concerned about major events (the traumatic update to an individual’s memory IS valuable), but he’s wrong to suggest that anything he or anyone does after updating their probabilities could possibly help them prepare meaningfully.

      He’s the most hilarious kind of wrong.

      • locallynonlinear@awful.systems · 10 months ago

        If I could sum up everything that’s wrong with EA, it’d be:

        “We can use statistics to do better than emotions!” in reality means “We are dysregulated and we aren’t going to do anything about it!!!”

    • swlabr@awful.systems · 10 months ago

      Thanks for bringing up the dog whistles. We haven’t talked about the dog whistles enough here. My fave has gotta be him bringing up the school shooting one.

    • Soyweiser@awful.systems · 10 months ago

      Before COVID, I doubt anyone was even thinking about the probabilities of a lab leak leading to a worldwide pandemic.

      This makes me wonder: we know the Rationalists did worry about a global pandemic before COVID-19, we checked the waste water and the smug particles increased exponentially for a short time in February 2020. But did they also worry about a normal lab leak like the one which might have happened here? Or was it all either nature/terrorism/AGI stuff?

      • gerikson@awful.systems · 10 months ago

        For a while there, when it looked as if only the rich were gonna be able to source R95 masks and everyone else was gonna die, the SV elite were all aboard with this being the new Black Death. As soon as it became apparent that the only way to deal with it was through massive government support they did a 180 and started talking about how it wasn’t that bad after all.

        • Soyweiser@awful.systems · 10 months ago

          I myself am just annoyed that the other Scott (no, not the cartoonist, the other smart Scott) blamed sneerers for covid being worse (while sneerclub itself was agreeing with the Rationalists that people should be careful and that it wasn’t a non-event). And that this Scott above argued that people should stop smoking to help against covid (not that he had any proof for that, he just disliked that people smoke; yes, as good Bayesians we should now increase our ‘is the Rationalist thought leader lying to me’ priors). The rest I don’t really recall that much.

  • Coll@awful.systems · 10 months ago

    people who are my worst enemies - e/acc people, those guys who always talk about how charity is Problematic - […] weird anti-charity socialists

    Today I learned that ‘effective accelerationists’ like Y Combinator CEO Garry Tan, venture capitalist Marc Andreessen, and “Beff Jezos” are socialists. I was worried that those evil goals they wanted to achieve by simply trying to advance capitalism might reflect badly on it, but luckily they aren’t fellow capitalists after all; they turned out to be my enemies the socialists all along! Phew!

  • SamuraiBeandog@lemmy.world · 10 months ago

    I’m not a fanboy, nor do I necessarily agree with his argument, but you’re seriously missing the point of what he’s trying to say. He’s just talking about how big, mediapathic events can unduly influence people’s perception of probability and risk. He doesn’t need actual real world numbers to show how this works, he’s just demonstrating how the math works and how the numbers change. He isn’t trying to convince stupid people of anything, they aren’t his target audience and they will never think this way.

    • self@awful.systems (mod) · 10 months ago

      oh yeah, scott would never use bad math to force a monstrous point

      Take sexual harassment. Surveys suggest that about 5% of people admit to having sexually harassed someone at some point in their lives; given that it’s the kind of thing people have every reason to lie about, the real number is probably higher. Let’s say 10%.

      So if there’s a community of 10,000 people, probably 1,000 of them have sexually harassed someone. So when you hear on the news that someone in that community sexually harassed someone, it shouldn’t change your opinion of that community at all. You started off pretty sure there were about 1,000, and now you know that there is at least one. How is that an update?!

      Still, every few weeks there’s a story about someone committing sexual harassment in (let’s say) the model airplane building community, and then everyone spends a few days talking about how airplanes are sexist and they always knew the model builders were up to no good.
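
      The arithmetic in that passage is trivially checkable, and checking it shows the conclusion is baked into the premises: with an assumed 10% rate (Scott’s number, not data), “at least one harasser exists” was already a near-certainty before any news story, so by construction the observation carries no information. A quick sketch:

      ```python
      # Scott's back-of-envelope, taken at face value.
      n, rate = 10_000, 0.10        # community size, assumed harasser rate
      expected = n * rate           # 1,000 expected harassers
      # Prior probability that at least one harasser exists:
      p_at_least_one = 1 - (1 - rate) ** n  # so close to 1 that floats give exactly 1.0
      print(expected, p_at_least_one)       # 1000.0 1.0
      ```

      An event with prior probability 1.0 under every hypothesis moves no posterior anywhere; that’s not a discovery, it’s a restatement of the assumptions.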

      I mean this is just how people work! they hear about one case of sexual harassment, incorrectly update the probabilities in their heads, and then The Left convinces them that airplanes are sexist. these people are too stupid to have thoughts like “sexual harassment is happening way too often given the small size of the model airplane building community, and listening to the victims allowed me to figure out some of the systemic factors for why that’s the case for that community” and that’s why they fall into real, definitely not made up by scott to make the people he doesn’t like seem ridiculous, beliefs like airplanes being sexist. how dare these stupid people exist outside of Scott’s extremely mid imagination.

      come the fuck on. this isn’t our first scott alexander post.

      • David Gerard@awful.systems (mod) · 10 months ago

        Take sexual harassment.

        my Bayesian priors tell me this is what Scott’s post is actually about, and even more shit is a bit close to dropping

        though obv i’m just catastrophising on single events that keep on happening

        EDIT: oh, of course it’ll be yet more shit coming down the line from abusers in EA

      • maol@awful.systems · 10 months ago

        Why didn’t he calculate how many people are sexually harassed in a community? That seems a bit relevant, considering that most sexual harassers harass multiple people.

        • Soyweiser@awful.systems · 10 months ago

          Well, those people who were sexually harassed were actually mentally unstable and have a history of lying about being sexually harassed, so we shouldn’t take them seriously. (A thing Scott actually said, as a psychiatrist who should know that the behavior described actually puts people at higher risk of being assaulted, after (not 100% sure if after or during the accusations, btw, but that doesn’t matter that much; a wizard psych should know better!) somebody killed themselves over all this.)

          Scott defending or excusing the abuse in the wider LW community is very much a pattern now. (Eurgh, I’m noticing I’m getting a bit angry about this all over again; best to just not engage with this stuff and do other things (got rid of the posts here which might start further discussions, and stopped reading Scott’s beige probability-theory-and-sexual-abuse post).)

          • TinyTimmyTokyo@awful.systems · 10 months ago

            Exactly. It would be easier to take Scott’s argument seriously if it weren’t coming from the very same person who previously labeled as unstable, and thereby non-credible, a woman who accused his rationalist buddies of sexual harassment – a woman who, by the way, went on to die by suicide.

            So fuck him and his contrived rationalizations.

    • Architeuthis@awful.systems · 10 months ago

      you’re seriously missing the point of what he’s trying to say. He’s just talking about [extremely mundane and self evident motte argument]

      Nah, we’re just not giving him the benefit of the doubt, and we also have a lot of context to work with.

      Consider the fact that he explicitly writes that you are allowed to reconsider your assumptions on domestic terrorism if a second trans mass shooter incident “happens in a row”, but a few paragraphs later, when Effective Altruists blow up both FTX and OpenAI in the space of a year, the second incident is immediately laundered away as the unfortunate result of them overcorrecting in good faith against unchecked CEO power.

      In my opinion, this should stick out even to someone approaching this with a blank-slate perspective.

    • swlabr@awful.systems · 10 months ago

      Hey guys look it’s the Scott whisperer, Mr. Beandog. Let’s see what he’s got for us today:

      I’m not a fanboy

      sure

      nor do I necessarily agree with his argument

      surely then, you wouldn’t feel the need to 'splain it

      but you’re seriously missing the point of what he’s trying to say.

      oh ok

      He’s just talking about how big, mediapathic events can unduly influence people’s perception of probability and risk

      No, that isn’t what he is saying, actually.

      He doesn’t need actual real world numbers to show how this works, he’s just demonstrating how the math works and how the numbers change

      He does, actually. You can’t make fake mathematical statements about the real world and expect me to just buy your argument. He is demonstrating how the math hypothetically works in a scenario where he cooks the numbers. There is no reason why one should extrapolate that to the real world.

      He isn’t trying to convince stupid people of anything, they aren’t his target audience and they will never think this way.

      Oh ok. prior updated. Coulda sworn his target audience was morons.

    • Amoeba_Girl@awful.systems · 10 months ago

      He isn’t trying to convince stupid people of anything, they aren’t his target audience and they will never think this way.

      you think “stupid people” is a meaningful social category, opinion dismissed.

      • hirudiniformes@awful.systems (OP) · 10 months ago

        they would benefit from the support of the “stupid people” demographic, and stupid people only remember that something is possible for a space of a few days immediately after it happens, otherwise it’s “science fiction”

        I just hate how arrogant and quick to stereotype he is.