• Soyweiser@awful.systems
    6 months ago

    Yeah, see also his denouncement of Roko’s Basilisk (ctrl-f the page). We know it wasn’t that important; the funny part was that it was a dumb rehash of Pascal’s wager, and that at the time Yud took it very seriously.

    Wood also doesn’t seem to link to the actual RationalWiki article, which makes clear that Yud doesn’t really believe in it (probably). It also mentions just how few people were worried about it (though above the 5% lizardman constant, so cause for concern, if they took their own ideas and mental health seriously). And every now and then you do find a person online who takes the idea seriously and worries about it, which is a bit of a concern. So oddly they should take it more seriously, but only because it wrecks a small percentage of minds.

    It is weird not to mention Yud’s freakout:

    Listen to me very closely, you idiot.

    YOU DO NOT THINK IN SUFFICIENT DETAIL ABOUT SUPERINTELLIGENCES CONSIDERING WHETHER OR NOT TO BLACKMAIL YOU. THAT IS THE ONLY POSSIBLE THING WHICH GIVES THEM A MOTIVE TO FOLLOW THROUGH ON THE BLACKMAIL.

    There’s an obvious equilibrium to this problem where you engage in all positive acausal trades and ignore all attempts at acausal blackmail. Until we have a better worked-out version of TDT and we can prove that formally, it should just be OBVIOUS that you DO NOT THINK ABOUT DISTANT BLACKMAILERS in SUFFICIENT DETAIL that they have a motive toACTUALLY[sic] BLACKMAIL YOU.

    And then pretend this was just a blip and nothing more. Mf’er acted like he was in a Stross novel.

    (Also, after not clearly sharing the information about Roko’s Basilisk’s history, which is what we sneer at, I came across this sentence: “then cites his pet article on Roko’s Basilisk directly while giggling about how mad it made Yudkowsky fans.” lol, no self-awareness there, Wood.)

    • barsquid@lemmy.world
      6 months ago

      Roko’s basilisk is one of my favorite things because of the combination of how stupid it is and how utterly panicked they all were. Desperately imagining magical communication across time and space with their dumb paperclip demon, and panicking.

      Why don’t they just believe in a deity if they want one so bad?

      • YouKnowWhoTheFuckIAM@awful.systems
        6 months ago

        It’s funny because you will hear over and over again from them online, in almost rank-and-file prose, about how it was all a big storm in the collective teacup, and then some time later run across yet another story of a real-life, non-anonymous person who was freaked the fuck out for some good period of time, as were some large portion of their friends.

        • Soyweiser@awful.systems
          6 months ago

          And it still randomly freaks people out to this day; it clearly isn’t a storm in a teacup for some people. And it is quite easily countered, but that counter also introduces some ideas which undermine the whole FOOM bs, so it is clear why they’d all rather push this under the rug, no matter the mental health cost to any Rationalists or would-be Rationalists.

    • flavia@lemmy.blahaj.zone
      6 months ago

      HOW does he seriously use the phrase ‘acausal blackmail’? I assumed the majority of word combinations of ‘acausal’ + noun were just jokes from people on here, but apparently not.

      • Soyweiser@awful.systems
        6 months ago

        Ikr, it’s literally a plot point from Charles Stross’s Singularity Sky (2003) series, and they take it seriously. (Which fits a pattern, CW: sexual abuse

        spoiler

        Yud’s math pets thing also has a similarity to the horrible sadist torture scene in the last Rifters series book (2004), in which a woman gets abused more horribly if she answers a question wrong (with the added plot twist that the torturer isn’t as smart as he thinks he is, and is partially wrong about the questions he ‘rewards’; which could be some 4d chess move, but doubtful). Anyway, not the greatest read, esp. 2 decades later, when a lot of the weird science-fiction terms used in the books have taken a very alt-right/dark-enlightenment turn (which makes me wonder if they stole that).

        )