
Posts 852 · Comments 2,448 · Joined 2 yr. ago

  • so there's a whole network of specifically Thiel-associated SF tech guys who are into particular churches

  • at least one matches, i forget the name but it's all over bsky

  • A subspecies of the tie nazi

    OBJECTION! Lanyard nazis include many a shove-in-a-locker nazi

  • this user is just too smart for the average awful systems poster to deal with, and has been sent on his way to a more intellectual lemmy

  • came here to post this!

    I loved this comment:

    =====

    [Former member of that world, roommates with one of Ziz's friends for a while, so I feel reasonably qualified to speak on this.]

    The problem with rationalists/EA as a group has never been the rationality, but the people practicing it and the cultural norms they endorse as a community.

    As relevant here:

1. While following logical threads to their conclusions is a useful exercise, each logical step often involves some degree of rounding or unknown-unknowns. A -> B and B -> C means A -> C in a formal sense, but A -almostcertainly-> B and B -almostcertainly-> C does not mean A -almostcertainly-> C. Rationalists, by tending toward overly formalist approaches, tend to lose the thread of the messiness of the real world and follow these lossy implications as though they are lossless. That leads to...
    2. Precision errors in utility calculations that are numerically-unstable. Any small chance of harm times infinity equals infinity. This framing shows up a lot in the context of AI risk, but it works in other settings too: infinity times a speck of dust in your eye >>> 1 times murder, so murder is "justified" to prevent a speck of dust in the eye of eternity. When the thing you're trying to create is infinitely good or the thing you're trying to prevent is infinitely bad, anything is justified to bring it about/prevent it respectively.
    3. Its leadership - or some of it, anyway - is extremely egotistical and borderline cult-like to begin with. I think even people who like e.g. Eliezer would agree that he is not a humble man by any stretch of the imagination (the guy makes Neil deGrasse Tyson look like a monk). They have, in the past, responded to criticism with statements to the effect of "anyone who would criticize us for any reason is a bad person who is lying to cause us harm". That kind of framing can't help but get culty.
    4. The nature of being a "freethinker" is that you're at the mercy of your own neural circuitry. If there is a feedback loop in your brain, you'll get stuck in it, because there's no external "drag" or forcing functions to pull you back to reality. That can lead you to be a genius who sees what others cannot. It can also lead you into schizophrenia really easily. So you've got a culty environment that is particularly susceptible to internally-consistent madness, and finally:
    5. It's a bunch of very weird people who have nowhere else they feel at home. I totally get this. I'd never felt like I was in a room with people so like me, and ripping myself away from that world was not easy. (There's some folks down the thread wondering why trans people are overrepresented in this particular group: well, take your standard weird nerd, and then make two-thirds of the world hate your guts more than anything else, you might be pretty vulnerable to whoever will give you the time of day, too.)

    TLDR: isolation, very strong in-group defenses, logical "doctrine" that is formally valid and leaks in hard-to-notice ways, apocalyptic utility-scale, and being a very appealing environment for the kind of person who goes super nuts -> pretty much perfect conditions for a cult. Or multiple cults, really. Ziz's group is only one of several.
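
    To make points 1 and 2 above concrete, here is a minimal numerical sketch (my own illustration, not part of the quoted comment; the step probabilities and utility figures are made-up stand-ins, not anyone's actual estimates):

        # Point 1: "almost certainly" does not survive being chained.
        # If each implication in A -> B -> C -> ... holds with probability 0.9,
        # confidence in the end-to-end conclusion decays geometrically.
        step_confidence = 0.9
        for steps in (1, 5, 10, 20):
            print(steps, "steps:", round(step_confidence ** steps, 3))
        # 1 step: 0.9, 5 steps: 0.59, 10 steps: 0.349, 20 steps: 0.122

        # Point 2: an astronomically large stake swamps the rest of the
        # expected-utility comparison, so even a tiny probability of the
        # "infinitely bad" outcome ends up outweighing a concrete, certain harm.
        p_catastrophe = 1e-12       # "any small chance"
        utility_at_stake = 1e30     # stand-in for "infinitely bad/good"
        harm_of_murder = 1e9        # large, but finite
        print(p_catastrophe * utility_at_stake > harm_of_murder)  # True

    The point of the sketch is only the shape of the failure: multiply enough "almost certainly"s together and you are nowhere near certain, and once one term in the ledger is allowed to be effectively unbounded, the arithmetic will endorse whatever is claimed to affect it.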

  • Is there anything written up anywhere about these AI book summaries? I know they were doing "AI" audiobooks.

  • when your skulls are both packed solid with wet cat food

  • College Hill is very good and on top of this shit

  • and imagine the ego to name yourself after the Simurgh from Worm

  • this is a nice writeup of the story of Ziz

  • TechTakes @awful.systems

    AI really is the new bitcoin: Google and Facebook extend coal burning in Omaha

    Buttcoin @awful.systems

    Pete Howson on futuristic "smart city" plans as crypto token scams

    TechTakes @awful.systems

    even DHH suggests Mullenweg might wanna cool it

    TechTakes @awful.systems

    LLMs can’t reason — they just crib reasoning-like steps from their training data

    TechTakes @awful.systems

    How ChatGPT nearly destroyed my wedding day

    TechTakes @awful.systems

    the Automattic cult acolyte onboarding guide, enjoy using all our systems whose names are puns on "Matt"

    TechTakes @awful.systems

    more of Mullenweg alienating his contributor base as hard as possible, for your amusement

    TechTakes @awful.systems

    Elon’s double nothingburger: robotaxis any year now, bro. And robots, bro. Trust us, bro.

    TechTakes @awful.systems

    translation: FYIFV

    TechTakes @awful.systems

    Zoom will let you deepfake yourself for meetings! Surely this cannot go badly wrong

    TechTakes @awful.systems

    The Nobel Prize in physics goes to Geoffrey Hinton for his work in computer science. What?

    TechTakes @awful.systems

    stack your demented relatives like cordwood in the inconsistent matrix simulation

    TechTakes @awful.systems

    The AI bill Newsom didn’t veto — AI devs must list models’ training data

    TechTakes @awful.systems

    Eric Schmidt: ‘We’re not going to hit the climate goals. I’d rather bet on AI solving the problem.’ With "alien intelligence"!

    TechTakes @awful.systems

    Grindr proposes ‘AI wingman’ to optimize away the gay sex bit

    TechTakes @awful.systems

    I went on Stilgherrian's podcast to talk about AI

    TechTakes @awful.systems

    Cash incinerator OpenAI secures its $6.6 billion lifeline — ‘in the spirit of a donation’

    Buttcoin @awful.systems

    oi have nah comment to th press at this torme and th rumours moi accent gave me away are scurrilous nonsense

    TechTakes @awful.systems

    Hello Matt this is your lawer speaking. I am advising you today to please keep posting this shit

    Buttcoin @awful.systems

    Daniel Larimer pivots from serial shitcoin founder to biblical apocalypse poster. The world ends on October 9.