Jane Street have an 'old money' secretive culture but have employed SBF, Caroline Ellison, and Kelsey Piper's patron James McClabe. McClabe created a $37 million foundation to fund EA causes (although he spends more on campaign contributions than on Vox Future Perfect and GiveWell). Given the 80,000 Hours side of Effective Altruism, I suspect Jane Street have other friends of Yud who post pseudonymously if at all.
Pinkerite is interested in what RationalWiki calls the ThielSphere. I think it's likely that the two Scotts or some of the Jane Street people have connections to Thiel which they don't talk about on the Internet.
Reason (the American libertarian magazine) and Vox often introduce LW and EA people into US mainstream media.
Kelsey Piper has an up-to-date RationalWiki page, including how she connected with SBF (she was on the board of an Effective Altruism club with Caroline Ellison at Stanford).
There was a creepy time when the whole ex-ScienceBlogs / Atheism Plus / Skeptic circle of bloggers would post an angry post about the enemy of the day. That was not at all what I understood as skepticism or free thinking, but they had already discovered that original, independent, research-based posts are hard and repeating the party line about what someone said on the Internet is easy. So is beefing with a friend who had the wrong take about what someone said on the Internet.
I must have confused my memories of the really nasty era around 2010-2012 with my occasional checks on FreeThoughtBlogs afterwards. I have not really thought about that world in the COVID era.
That reminds me of Kelsey Piper randomly posting that she helped James Damore get his first job after Google and she would do it again gosh darn it! So much of social media is people in the Bay Area recruiting people for their petty feuds. One of the shibboleths as ScienceBlogs broke up was posting that Damore was a bad bad person and not just a very ordinary clueless wealthy young dude you never met.
The blogger Pinkerite has studied the people around Steven Pinker but focuses on public intellectuals over the kind of people who serve on boards and organize meetups. Ever since I learned about the face-to-face, Bay Area aspect of all of this I have been wondering how to rethink it. The people who post the most on the open web are not necessarily the most influential.
Extropia's Children shows that you can do scholarship with someone you disagree with (he is a chatbot fan but his timeline is reasonable).
The only people from those days who I met face to face were Randi and Shermer. I remember sitting at a table afterwards talking about how I wished Shermer would go back to writing skepticism and ditch the bad arguments for Libertarianism.
Myers was happy to have Carrier as one of his bully boys against anyone who refused to toe the constantly shifting party line. He jettisoned Carrier only after the latter became embarrassing (it became public that Carrier kept hitting on women who said they were not interested). IMHO that was like the Kray twins ordering a hit on an enforcer who went off the leash. Edit: The FreeThoughtBlogs take on their separation from Carrier begins with Myers and Carrier speaking at a two-speaker event where Carrier met a young woman.
Two things with echoes of our friends were Carrier's undisclosed sexual relationship with one of the people who hired him to speak, and that the term "polyamory" was used to cover behaviour which does not look good when you describe the specifics. A third was that Dawkins and friends were allergic to history and philosophy, but wanted to share their thoughts on history and philosophy.
Harriet Hall got into trouble for just-asking-questions transphobia.
Hall published a noncommittal review of a dodgy-sounding book. Scientific skepticism is a method of inquiry not a set of shibboleths. I suspect that her review was not good skepticism, but nobody is a good skeptic on every issue, and it did not seem worthy of retraction (maybe a note that the editors did not endorse it). Back to the original comment, this brings us to the difference between the thing (critical inquiry) and the symbolic representation of the thing (yelling that bigfoot is not real and homeopathy is sugar pills).
I agree that it's gross to discuss a lot of this in public, and that underage sex is often an ethical grey area. I had no idea that the person who accused BD of pushing him into substance use and extreme BDSM scenarios is also the person who allegedly, while underage, had sex with a MIRI staffer while living in a Rationalist group home.
The GMS (geeks, mops, and sociopaths) model fits the rise and fall of scientific skepticism pretty well. As the first generation of deeply nerdy leaders like Martin Gardner, L. Sprague de Camp, and James Randi aged and died, new leaders appeared who said that the movement should be bigger and address more important things like social justice. These leaders and the new party-style events brought more people in the door, but some of the leaders believed irrational things and wanted money and sex and were not fussy about how they got it (Shermer, Carrier)[1], and some liked pushing people around and being tastemakers (Watson, Myers). My understanding is that the skeptics got rid of most of the big egos, but in doing so they shattered their movement. Most of the big names are still around with online followings, and various rump skeptic and atheist movements still exist, but the attempt to rally everyone around skepticism or Atheism Plus collapsed, and some basically decent and rational people like Hal Bidlack and Harriet Hall ended up in the wilderness for ideological crimes.
I don't know what movements from the 20th century Chapman was thinking of, and it would be less polarizing to talk about things which were cool in the 1980s than things which were cool recently. I would bet at 50-50 that someone will be offended by the previous paragraph.
[1]: Shermer and Carrier's belief that there is one objective morality which can be proven is a lot like Yudkowsky's belief that there is one objective morality which can be programmed into Friendly AI. The way 'sex-positivity' was used in the skeptical and atheist sphere also rhymes. I could write a whole essay about how LessWrong cut out the parts of skepticism which would help newbies to spot that the movement was cult-adjacent and irrational.
Another common example for Americans is "milk before meat" among the Latter-day Saints. The paper by Gleiberman above lays out how, once you are committed to the idea that altruism should be as effective as possible and that your intuitions about what is effective are not trustworthy, the Longtermists pull you into a dark alley where their friend Pascal is waiting to mug you (although longtermist EA never received a majority of EA funding). It's all as sad as when I am trying to have a factual conversation with Americans online and they try to convert me to pseudoscientific racism.
I think one of the biggest flaws of our friends is that they want there to be one hierarchy of power and capability, with Electric Jesus at the top, then them, then their admirers, then the rest of us. Yudkowsky is brilliant at getting people to give him money, good at getting them to give him sex, but not a scientist or a skeptic (I am told he asked for special powers to delete LessWrong comments which explain what he got wrong or did not see).
The "geeks, mops, and sociopaths" model does not encourage people to look at themselves and ask whether their community's problems are their own fault. It also does not encourage them to ask "I am a drama kid, you are a min-maxer, can we find a way to have a fun game of D&D together or should we find our own groups?"
Alex Karp's Wikipedia page has a wild gap from "trying to raise enough money to be a Bohemian in Berlin in 2002" to "senior exec at Palantir with a Norwegian bodyguard and spicy takes on the Gaza war."
Chapman's advice seems pretty good for keeping an indie art scene small and for autistic introverts, not big and for normies, but not for realizing that LessWrong and EA are cults founded by bad people with bad goals, with an exoteric doctrine out front and an esoteric doctrine once you are committed.
Was TPOT a Twitter thing? It seems like LessWrong was all over Tumblr and Twitter.
Most of us are harmless and just want to explore our special interests. But I don't think any of our friends fits that description. I don't think it was just about power games either: Scott Alexander really cares about peddling racist lies, and Yudkowsky seems to build his whole worldview around the idea that he is a world-historical figure (and maybe he is, but a Grigori Rasputin, not an Albert Einstein). So neither the "clueless, losers, sociopaths" model nor the "geeks, mops, sociopaths" model explains what happened to LW or Effective Altruism.
The Optimist: Sam Altman, OpenAI, and the Race to Invent the Future by Keach Hagey has potential https://archive.is/22O9z
Two members of the Extropian community, internet entrepreneurs Brian and Sabine Atkins—who met on an Extropian mailing list in 1998 and were married soon after—were so taken by this message that in 2000 they bankrolled a think tank for Yudkowsky, the Singularity Institute for Artificial Intelligence. At 21, Yudkowsky moved to Atlanta and began drawing a nonprofit salary of around $20,000 a year to preach his message of benevolent superintelligence. “I thought very smart things would automatically be good,” he said. Within eight months, however, he began to realize that he was wrong—way wrong. AI, he decided, could be a catastrophe.
This excerpt on Wired slams down names and dates and social connections without getting distracted by all the things that are wrong with what it describes.
Asterisk, Vox Future Perfect, and Astral Codex Ten are all LessWrong projects and none of them liked Yud's new book either. Always remember that your political opponents are riven by factions too.
The reviewer mentions the heavy Ritalin (prescription stimulant) use by someone in her old social circle.
Also endless "it's not X, it's Y," an overheated but empty style, and a conclusion which promises "It documents specific historical connections between specific intellectual figures using publicly available sources" when there are no footnotes or links. Was ESR on the Extropians mailing list, or did a plausible-string generator emit that plausible string?
Chatbots are good at generating writing in the style of LessWrong because wordy vagueness based on no concrete experience is their whole thing.
I have found some Medium and Substack blogs full of slop which sneers at the TESCREAL types. Hoist by their own petard. Example: https://neutralzone.substack.com/p/the-genealogy-of-authority-capture (I was looking for a list of posters on the Extropians mailing list up to 2001, which is actual precise work, so a chatbot can't do it the way it can extrude something essay-like).
The Rolling Stone article is a bit odd (it appears to tell the story of the ex-employee who created Miricult twice, the first time without names and the second time naming the accuser), but I trust them that MIRI did pay the accuser. Rolling Stone are a serious news organization which can be sued.
The tabletop game Attack Vector: Tactical from 2004 also models the need for radiators and risk of overheating.
Science fiction from the 20th century tended to ignore cooling and cosmic rays (the one exception I can recall is Jerry Pournelle's superscience Langston Field) and we know these guys are not up to date even on pop culture or good at reading.