
Posts: 6 · Comments: 105 · Joined: 4 mo. ago

  • I agree that it's gross to discuss a lot of this in public, and that underage sex is often an ethical grey area. I had no idea that the person who accused BD of pushing him into substance use and extreme BDSM scenarios is also the person who allegedly had sex underage with a MIRI staffer while living in a Rationalist group home.

  • The GMS (geeks, mops, and sociopaths) model fits the rise and fall of scientific skepticism pretty well. As the first generation of deeply nerdy leaders like Martin Gardner, L. Sprague de Camp, and James Randi aged and died, new leaders appeared who said that the movement should be bigger and address more important things like social justice. These leaders and the new party-style events brought more people in the door, but some of the leaders believed irrational things and wanted money and sex and were not fussy about how they got it (Shermer, Carrier)1, and some liked pushing people around and being tastemakers (Watson, Myers). My understanding is that the skeptics got rid of most of the big egos, but in doing so they shattered their movement. Most of the big names are still around with online followings, and various rump skeptic and atheist movements still exist, but the attempt to rally everyone around skepticism or Atheism Plus collapsed, and some basically decent and rational people like Hal Bidlack and Harriet Hall ended up in the wilderness for ideological crimes.

    I don't know what movements from the 20th century Chapman was thinking of, and it would be less polarizing to talk about things which were cool in the 1980s than things which were cool recently. I would bet at 50-50 that someone will be offended by the previous paragraph.

    1: Shermer and Carrier's belief that there is one objective morality which can be proven is a lot like Yudkowsky's belief that there is one objective morality which can be programmed into Friendly AI. The way 'sex-positivity' was used in the skeptical and atheist sphere also rhymes. I could write a whole essay about how LessWrong cut out the parts of skepticism which would help newbies to spot that the movement was cult-adjacent and irrational.

  • occultism

    Another common example for Americans is "milk before meat" among the Latter-day Saints. The paper by Gleiberman above lays out how, once you are committed to the idea that altruism should be as effective as possible and that your intuitions about what is effective are not trustworthy, the Longtermists pull you into a dark alley where their friend Pascal is waiting to mug you (although longtermist EA never received a majority of EA funding). It's all as sad as when I am trying to have a factual conversation with Americans online and they try to convert me to pseudoscientific racism.

  • I think one of the biggest flaws of our friends is that they want there to be one hierarchy of power and capability, with Electric Jesus at the top, then them, then their admirers, then the rest of us. Yudkowsky is brilliant at getting people to give him money, good at getting them to give him sex, but not a scientist or a skeptic (I am told he asked for special powers to delete LessWrong comments which explain what he got wrong or did not see).

    The "geeks, mops, and sociopaths" model does not encourage people to look at themselves and ask whether their community's problems are their own fault. It also does not encourage them to ask "I am a drama kid, you are a min-maxer, can we find a way to have a fun game of D&D together or should we find our own groups?"

    Alex Karp's Wikipedia page has a wild gap from "trying to raise enough money to be a Bohemian in Berlin in 2002" to "senior exec at Palantir with a Norwegian bodyguard and spicy takes on the Gaza war."

  • Saying that at age 46 you are proud of not reenacting tropes from fantasy novels you read when you were 9 is something special. "He's the greatest single mind since L. Ron Hubbard."

    His OkCupid profile also showed a weak grasp on the difference between fantasy and reality.

    Do we know when he transitioned from Margaret Weis and Lawrence Watt-Evans to filthy Japanese cartoons?

  • Chapman's advice seems pretty good for keeping an indie art scene small and for autistic introverts rather than big and for normies, but not for realizing that LessWrong and EA are cults founded by bad people with bad goals, with an exoteric doctrine out front and an esoteric doctrine once you are committed.

  • Was TPOT a Twitter thing? It seems like LessWrong was all over Tumblr and Twitter.

    Most of us are harmless and just want to explore our special interests. But I don't think any of our friends fits that description. I don't think it was just about power games either: Scott Alexander really cares about peddling racist lies, and Yudkowsky seems to build his whole worldview around the idea that he is a world-historical figure (and maybe he is, but a Grigori Rasputin, not an Albert Einstein). So neither the "clueless, losers, sociopaths" model nor the "geeks, mops, sociopaths" model explains what happened to LW or Effective Altruism.

  • SneerClub @awful.systems

    A Post-Mortem for Geeks, Mops, and Sociopaths

  • The Optimist: Sam Altman, OpenAI, and the Race to Invent the Future by Keach Hagey has potential: https://archive.is/22O9z

    Two members of the Extropian community, internet entrepreneurs Brian and Sabine Atkins—who met on an Extropian mailing list in 1998 and were married soon after—were so taken by this message that in 2000 they bankrolled a think tank for Yudkowsky, the Singularity Institute for Artificial Intelligence. At 21, Yudkowsky moved to Atlanta and began drawing a nonprofit salary of around $20,000 a year to preach his message of benevolent superintelligence. “I thought very smart things would automatically be good,” he said. Within eight months, however, he began to realize that he was wrong—way wrong. AI, he decided, could be a catastrophe.

    This excerpt on Wired slams down names and dates and social connections without getting distracted by all the things that are wrong with what it describes.

  • SneerClub @awful.systems

    Is there a prosopography of MIRI/SIAI/CFAR?

  • Asterisk, Vox Future Perfect, and Astral Codex Ten are all LessWrong projects and none of them liked Yud's new book either. Always remember that your political opponents are riven by factions too.

    The reviewer mentions the heavy Ritalin (prescription stimulant) use by someone in her old social circle.

  • Also the endless "it's not X—it's Y," an overheated but empty style, and a conclusion which promises "It documents specific historical connections between specific intellectual figures using publicly available sources" when there are no footnotes or links. Was ESR on the Extropians mailing list, or did a plausible-string generator emit that plausible string?

    Chatbots are good at generating writing in the style of LessWrong because wordy vagueness based on no concrete experience is their whole thing.

  • I have found some Medium and Substack blogs full of slop which sneers at the TESCREAL types. Hoist on their own petard. Example: https://neutralzone.substack.com/p/the-genealogy-of-authority-capture (I was looking for a list of posters on the Extropians mailing list up to 2001, which is actual precise work, so a chatbot can't do it the way it can extrude something essay-like).


    The Rolling Stone article is a bit odd (it appears to tell the story of the ex-employee who created Miricult twice, the first time without names and the second time naming the accuser), but I trust them that MIRI did pay the accuser. Rolling Stone is a serious news organization which can be sued.

  • The tabletop game Attack Vector: Tactical from 2004 also models the need for radiators and the risk of overheating.

    Science fiction from the 20th century tended to ignore cooling and cosmic rays (the one exception I can recall is Jerry Pournelle's superscience Langston Field), and we know these guys are not up to date even on pop culture or good at reading.

  • An aging population too. Young people commit more violence, use more substances, and get into more accidents. The Boomers still have massive influence on culture even though it's not cool to market to them.

    My understanding is that the declines in tobacco use, alcohol use, sex, and pregnancy among youth in the USA are well established (but how many young US people are sharing naughty texts or photos when before the smartphone they would have been making out?)

    And isn't the trans and nonbinary stuff a norm that teh yoofs are enthusiastically breaking while people over 50 fret?

  • SneerClub @awful.systems

    Stephen and Steven

    SneerClub @awful.systems

    Selfishness and Altruism

    SneerClub @awful.systems

    Has anyone with psychiatry training ever commented on Scott Alexander's ideas?

    SneerClub @awful.systems

    4Chan Unsong About NPCs