1. A post in !techtakes@awful.systems attacks the entire concept of AI safety as a made-up boogeyman.
2. I disagree and am attacked from all sides for “posting like an evangelist.”
3. I give citations for things I thought would be obvious, such as that AI technology in general has improved in capability compared to several years ago.
4. Instance ban, “promptfondling evangelist.”

This one I’m not as aggrieved about; it’s just weird. It’s reminiscent of the lemmy.ml type of echo chamber, where everyone’s convinced it’s one way because, in a self-fulfilling prophecy, anyone who isn’t convinced gets yelled at and banned.

Full context: https://ponder.cat/post/1030285 (Some of my replies were after the ban because I didn’t PT Barnum carefully enough, so I didn’t realize.)

  • surph_ninja@lemmy.world · 7 days ago

    Because this is new tech. Not a Ponzi scheme.

    Seems you’re still struggling to adequately assess emerging technology.

    • Susaga@sh.itjust.works · 7 days ago

      It’s the same people picking up new technology and telling everyone else to get on board or be left behind. People with a good understanding of technology and society point out the obvious flaws. Then everyone who jumped on the bandwagon starts calling everyone who didn’t jump with them a Luddite who is going to be left behind.

      Meanwhile, you have people stealing work from artists without compensation. You have rampant misuse of computing power to meet the needs of the new technology. You have features forced on people who want nothing to do with it. You have countless people using the technology for a cheap cash grab, then hopping on to do it again. You have people using the technology to commit outright crimes, counting on the slow pace of legal definitions to get away with it.

      This is nothing new. We’ve been here before. I’d like to move on.