I did fake Bayesian math with some plausible numbers, and found that if I started out believing there was a 20% per decade chance of a lab leak pandemic, then if COVID was proven to be a lab leak, I should update to 27.5%, and if COVID was proven not to be a lab leak, I should stay around 19-20%.

This is so confusing: why bother doing “fake” math? How does he justify these numbers? Let’s look at the footnote:

Assume that before COVID, you were considering two theories:

  1. Lab Leaks Common: There is a 33% chance of a lab-leak-caused pandemic per decade.
  2. Lab Leaks Rare: There is a 10% chance of a lab-leak-caused pandemic per decade.

And suppose before COVID you were 50-50 about which of these were true. If your first decade of observations includes a lab-leak-caused pandemic, you should update your probability over theories to 76-24, which changes your overall probability of pandemic per decade from 21% to 27.5%.
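For what it's worth, the arithmetic in the footnote does check out as a textbook two-hypothesis Bayes update; it's the inputs that are invented. A minimal sketch reproducing his numbers (33%, 10%, 50-50 prior), purely to verify the quoted figures:

```python
# Two hypotheses about the per-decade rate of a lab-leak pandemic,
# a 50-50 prior over them, then one decade of observation.
P_COMMON = 0.33  # "Lab Leaks Common": per-decade pandemic probability
P_RARE = 0.10    # "Lab Leaks Rare"

def update(prior_common, saw_leak):
    """Posterior P(Lab Leaks Common) after observing one decade."""
    like_common = P_COMMON if saw_leak else 1 - P_COMMON
    like_rare = P_RARE if saw_leak else 1 - P_RARE
    evidence = prior_common * like_common + (1 - prior_common) * like_rare
    return prior_common * like_common / evidence

def mixture_rate(post_common):
    """Overall per-decade pandemic probability under a posterior."""
    return post_common * P_COMMON + (1 - post_common) * P_RARE

prior = 0.5
print(mixture_rate(prior))                 # ~0.215, his "21%" starting rate

post_leak = update(prior, saw_leak=True)
print(post_leak)                           # ~0.767, his "76-24"
print(mixture_rate(post_leak))             # ~0.277, his "27.5%"

post_none = update(prior, saw_leak=False)
print(mixture_rate(post_none))             # ~0.198, his "19-20%"
```

So the update itself is mechanical; the entire conclusion is carried by the made-up 33%/10%/50-50 inputs.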

Oh, he doesn’t, he just made the numbers up! “I don’t have actual evidence to support my claims, so I’ll just make up data and call myself a ‘good Bayesian’ to look smart.” Seriously, how could a reasonable person have been expected to be concerned about lab leaks before COVID? It simply wasn’t something in the public consciousness. This looks like some serious hindsight bias to me.

I don’t entirely accept this argument - I think whether or not it was a lab leak matters in order to convince stupid people, who don’t know how to use probabilities and don’t believe anything can go wrong until it’s gone wrong before. But in a world without stupid people, no, it wouldn’t matter.

Ah, no need to make the numbers make sense, because stupid people wouldn’t understand the argument anyway. Quite literally: “To be fair, you have to have a really high IQ to understand my shitty blog posts. The Bayesian math is extremely subtle…” And, convince stupid people of what, exactly? He doesn’t say, so what was the point of all the fake probabilities? What a prick.

  • Evinceo@awful.systems · 1 year ago

    Pay no attention to the man behind that curtain. The rate of men behind curtains is actually quite low. Do not doubt the great and powerful Oz.

  • Coll@awful.systems · 11 months ago

    people who are my worst enemies - e/acc people, those guys who always talk about how charity is Problematic - […] weird anti-charity socialists

    Today I learned that ‘effective accelerationists’ like Y Combinator CEO Garry Tan, venture capitalist Marc Andreessen and “Beff Jezos” are socialists. I was worried that those evil goals they wanted to achieve by simply trying to advance capitalism might reflect badly on it, but luckily they aren’t fellow capitalists after all, they turned out to be my enemies the socialists all along! Phew!

  • Architeuthis@awful.systems · edited · 1 year ago

    Hi, my name is Scott Alexander and here’s why it’s bad rationalism to think that widespread EA wrongdoing should reflect poorly on EA.

    The assertion that having semi-frequent sexual harassment incidents go public is actually an indication of health for a movement since it’s evidence that there’s no systemic coverup going on and besides everyone’s doing it is uh quite something.

    But surely of 1,000 sexual harassment incidents, the movement will fumble at least one of them (and often the fact that you hear about it at all means the movement is fumbling it less than other movements that would keep it quiet). You’re not going to convince me I should update much on one (or two, or maybe even three) harassment incidents, especially when it’s so easy to choose which communities’ dirty laundry to signal boost when every community has a thousand harassers in it.

    • swlabr@awful.systems · 1 year ago

      Scott: “Hmm, the reputation of the EA community that I am part of and love for some reason is tanking, due to the bad actions of its luminaries. What can I do to help? I know, I’ll bring up 9/11”

      Empty room: “…”

      “And I’ll throw out some made up statistics about terrorist attacks and how statistically we were due for a 9/11 and we overreacted by having any response whatsoever. And then I’ll show how that’s the same as when someone big in EA does something bad.”

      “…”

      “Especially since it’s common for people to, after a big scandal, try and push their agenda to improve things. We definitely don’t want that.”

      “…”

      “Also, on average there’s less SA in STEM, and even though there is still plenty of SA, we don’t need to change anything, because averages.”

      “…”

      “Anyway, time for dexy no. 5”

      • hirudiniformes@awful.systems (OP) · 1 year ago

        Hmm, the reputation of the EA community that I am part of and love for some reason is tanking, due to the bad actions of its luminaries.

        “And it would be clear I’m full of shit if I put this at the start of the article, so I’ll bury the lede behind a wall of text”