Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post-Xitter web has spawned so many “esoteric” right-wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).

Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up, and if I can’t escape them, I would love to sneer at them.

  • BlueMonday1984@awful.systems · 4 months ago

    I also believe they have employees who are True Believers, who they cannot fire because they’d spread a hella lot of doomspeak if they did.

    Part of me suspects they probably also aren’t the sharpest knives in OpenAI’s drawer.

    • imadabouzu@awful.systems · 4 months ago

      It can be both. Like, OpenAI is probably kind of hoping that this story spreads far and is taken seriously, and has no problem suggesting, implicitly and explicitly, that their employees’ stock is tied to how scared everyone is.

      Remember when Altman almost got ousted and people got pressured not to walk? When they were told their options were at risk?

      Strange hysteria like this doesn’t need just one reason. It just needs an input dependency and ambiguity; the rest takes care of itself.