• 4 Posts
  • 1.66K Comments
Joined 1 year ago
Cake day: September 7th, 2023



  • you should always escalate every conflict to the maximum in order to dissuade hypothetical agents from blackmailing you.

    Ah, not understanding the escalation ladder (or knowing it exists and just going all in every time because they don’t get it) is a common issue, slightly more common among younger nerds it seems (made worse by Ender’s Game), but also among fascists (somebody shared a story on bsky where they had violently (fists etc.) gotten rid of some fascists at an event, who later returned to do a drive-by shooting).






  • They believe in race and IQ, that people who code javascript and blog a lot (no, a lot) are superior (and should have all the money and power), and that the time of democracy is over (and that some people should be reduced to biodiesel). They don’t believe the world has gotten less violent, they believe in corporal punishment, that leather jackets make you look cool no matter the situation, etc. etc.

    Oh, and that you can build Skynet using Turing machines.

    This is not to be confused with what the Zizians believe; they believe a bespoke set of other wild things. But I have never really looked much into them.



  • Technically he doesn’t have connections to Thiel directly. They have connections to the LessWrong Rationalists (note: RationalWiki is not associated with the LessWrong Rationalists, and they don’t like each other; this place is also on team RationalWiki). However, enough Rationalists are (or are friends with) neoreactionaries that the connection is real (for a long time LW was one of the few places where they were sometimes welcome, or even talked about). But this is one of the places where (through heavy layers of snark and derision) you could learn about these groups.

    However, as the lore is deep, very annoying, and quite bad (a lot of IQ-anon type people), I suggest you don’t start reading up on this. Ignorance once lost cannot be regained (there are also some very small mental health risks (small as in low risk, but high impact if you are one of the rare people affected) if you believe very specific things about Skynet-style AI and a few other silly beliefs). And you already got the important parts (Thiel and people close to him suck).







  • That’s basically the entire problem in a nutshell.

    I think a lot of these people are cunning, aka good at somewhat sociopathic short-term plans and thinking, and they confuse this ability (and their survivorship-biased success) with being good at actual planning (or they just think planning is worthless, after all, move fast and break things (and never think about what you just said)). You don’t have to actually have good plans if people think you have charisma/a magical money-making ability (which requires more and more rigging of the casino to keep the “make lots of risky bets and hope one big win pays for them all” machine running).

    Doesn’t help that some of them seem to either be on a lot of drugs or have undiagnosed ADHD. Unrelated: Musk wants to go into Fort Knox all of a sudden, because he saw a post on Twitter which has convinced him ‘they’ stole the gold (my point here is that there is no way he was thinking about Knox at all before he randomly came across the tweet; the plan is crayons).