Stubsack: weekly thread for sneers not worth an entire post, week ending 27th July 2025
  • It's "general intelligence", the eugenicist wet dream of a supposedly quantitative measure of how the better class of humans do brain good.

  • Stubsack: weekly thread for sneers not worth an entire post, week ending 27th July 2025
  • From Yud's remarks on Xitter:

    As much as people might like to joke about how little skill it takes to found a $2B investment fund, it isn't actually true that you can just saunter in as a psychotic IQ 80 person and do that.

    Well, not with that attitude.

    You must be skilled at persuasion, at wearing masks, at fitting in, at knowing what is expected of you;

    If "wearing masks" really is a skill they need, then they are all susceptible to going insane and hiding it from their coworkers. Really makes you think (TM).

    you must outperform other people also trying to do that, who'd like that $2B for themselves. Winning that competition requires g-factor and conscientious effort over a period.

    zoom and enhance

    g-factor

    <Kill Bill sirens.gif>

  • Stubsack: weekly thread for sneers not worth an entire post, week ending 27th July 2025
  • Yud continues to bluecheck:

    "This is not good news about which sort of humans ChatGPT can eat," mused Yudkowsky. "Yes yes, I'm sure the guy was atypically susceptible for a $2 billion fund manager," he continued. "It is nonetheless a small iota of bad news about how good ChatGPT is at producing ChatGPT psychosis; it contradicts the narrative where this only happens to people sufficiently low-status that AI companies should be allowed to break them."

    Is this "narrative" in the room with us right now?

    It's reassuring to know that times change, but Yud will always be impressed by the virtues of the rich.

  • Stubsack: weekly thread for sneers not worth an entire post, week ending 27th July 2025
  • Here's their page of instructions, written as usual by the children who really liked programming the family VCR:

    https://en.wikipedia.org/wiki/Wikipedia:Database_download

  • So You Think You've Awoken ChatGPT: LWer tries to talk fellow LWers down
  • Let's see who he reads. Vox Day (who is now using ChatGPT to "disprove" evolution), Christopher Rufo, Curtis Yarvin, Emil Kirkegaard, Mars Review model Bimbo Ubermensch.... It's a real Who's Who of Why The Fuck Do I Know Who These People Are?!

  • Stubsack: weekly thread for sneers not worth an entire post, week ending 27th July 2025
  • Want to feel depressed? Over 2,000 Wikipedia articles, on topics from Morocco to Natalie Portman to Sinn Féin, are corrupted by ChatGPT. And that's just the obvious ones.

    https://en.wikipedia.org/w/index.php?search=insource%3A"utm_source%3Dchatgpt.com"&title=Special%3ASearch&profile=advanced&fulltext=1&ns0=1&searchToken=8ops8b9qb8qmw8by39k248jyp

  • Stubsack: weekly thread for sneers not worth an entire post, week ending 20th July 2025
  • https://xcancel.com/jasonlk/status/1946069562723897802

    Vibe Coding Day 8,

    I'm not even out of bed yet and I'm already planning my day on @Replit.

    Today is AI Day, to really add AI to our algo.

    [...]

    If @Replit deleted my database between my last session and now there will be hell to pay

  • So You Think You've Awoken ChatGPT: LWer tries to talk fellow LWers down
  • Seems overly generous both to Christopher Hitchens and to Julia Galef.

  • So You Think You've Awoken ChatGPT: LWer tries to talk fellow LWers down
  • (putting on an N95 before I enter the grocery store) dun dun DUN DUN dun dun DUN DUN deedle dee deedle dee DUN DUN

  • So You Think You've Awoken ChatGPT: LWer tries to talk fellow LWers down
  • The more expertise you have, the more you can use ChatGPT as an idea collaborator, and use your own discernment on the validity of the ideas.

    Good grief. Just take drugs, people.

  • So You Think You've Awoken ChatGPT: LWer tries to talk fellow LWers down
  • Don't worry; this post is not going to be cynical or demeaning to you or your AI companion.

    If you're worried that your "AI companion" can be demeaned by pointing out the basic truth about it, then you deserve to be demeaned yourself.

  • Stubsack: weekly thread for sneers not worth an entire post, week ending 20th July 2025
  • Evan Urquhart:

    I had to attend a presentation from one of these guys, trying to tell a room full of journalists that LLMs could replace us & we needed to adapt by using it and I couldn't stop thinking that an LLM could never be a trans journalist, but it could probably replace the guy giving the presentation.

  • The shocking rise of one of the tech right’s favorite posters (www.motherjones.com)

    After he leaked Zohran Mamdani’s Columbia application data to the New York Times, critics called Jordan Lasker a “eugenicist.” A Mother Jones report shows there’s much more to his backstory.

    Mother Jones has a new report about Jordan Lasker:

    > A Reddit account named Faliceer, which posted highly specific biographical details that overlapped with Lasker’s offline life and which a childhood friend of Lasker’s believes he was behind, wrote in 2016, “I actually am a Jewish White Supremacist Nazi.” The Reddit comment, which has not been previously reported, is one of thousands of now-deleted posts from the Faliceer account obtained by Mother Jones in February. In other posts written between 2014 and 2016, Faliceer endorses Nazism, eugenics, and racism. He wishes happy birthday to Adolf Hitler, says that “I support eugenics,” and uses a racial slur when saying those who are attracted to Black people should kill themselves.

    Stubsack: weekly thread for sneers not worth an entire post, week ending 20th July 2025
  • After understanding a lot of things it’s clear that it didn’t. And it fooled me for two weeks.

    I have learned my lesson and now I am using it to generate one page at a time.

    qu1j0t3 replies:

    that's, uh, not really the ideal takeaway from this lesson

  • Stubsack: weekly thread for sneers not worth an entire post, week ending 20th July 2025
  • We—yes, even you—are using some version of AI, or some tools that have LLMs or machine learning in them in some way shape or form already

    Fucking ghastly equivocation. Not just between "LLMs" and "machine learning", but between opening a website that has a chatbot icon I never click and actually wasting my time asking questions to the slop machine.

  • Stubsack: weekly thread for sneers not worth an entire post, week ending 20th July 2025
  • I like how quoting Grimes lyrics makes the banality of these people thuddingly clear.

  • Stubsack: weekly thread for sneers not worth an entire post, week ending 20th July 2025
  • Yud:

    ChatGPT has already broken marriages, and hot AI girls are on track to remove a lot of men from the mating pool.

    And suddenly I realized that I never want to hear a Rationalist say the words "mating pool".

    (I fired up xcancel to see if any of the usual suspects were saying anything eminently sneerable. Yudkowsky is re-xitting Hanania and some random guy who believes in g. Maybe he should see if the Pioneer Fund will bankroll publicity for his new book....)

  • Stubsack: weekly thread for sneers not worth an entire post, week ending 20th July 2025
  • Previously sneered:

    The context for this essay is serious, high-stakes communication: papers, technical blog posts, and tweet threads.

    More recently, in the comments:

    After reading your comments and @Jiro 's below, and discussing with LLMs on various settings, I think I was too strong in saying....

    It's like watching people volunteer for a lobotomy.

  • Stubsack: weekly thread for sneers not worth an entire post, week ending 20th July 2025
    Stubsack: weekly thread for sneers not worth an entire post, week ending 13th July 2025


    Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

    Any awful.systems sub may be subsneered in this subthread, techtakes or no.

    If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

    > The post Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be)
    >
    > Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

    (Credit and/or blame to David Gerard for starting this.)

    Veo fails week 4: the final faildown
  • Kaneda just scooting to the side at the 14:05 mark like a Looney Tunes character caught with their pants down is comedy gold. I want to loop it with a MIDI rendition of Joplin's "The Entertainer".

  • Wake up babe, new "in this moment I am enlightened" copypasta just dropped

    "TheFutureIsDesigned" bluechecks thusly:

    > You: takes 2 hours to read 1 book
    >
    > Me: take 2 minutes to think of precisely the information I need, write a well-structured query, tell my agent AI to distribute it to the 17 models I've selected to help me with research, who then traverse approximately 1 million books, extract 17 different versions of the information I'm looking for, which my overseer agent then reviews, eliminates duplicate points, highlights purely conflicting ones for my review, and creates a 3-level summary.
    >
    > And then I drink coffee for 58 minutes.
    >
    > We are not the same.

    For bonus points:

    > I want to live in the world of Hyperion, Ringworld, Foundation, and Dune.

    You know, Dune.

    (Via)

    Credulous coverage of AI slop on Wikipedia

    Everybody loves Wikipedia, the surprisingly serious encyclopedia and the last gasp of Old Internet idealism!

    (90 seconds later)

    We regret to inform you that people write credulous shit about "AI" on Wikipedia as if that is morally OK.

    • https://en.wikipedia.org/wiki/Prompt_engineering
    • https://en.wikipedia.org/wiki/Vibe_coding

    Both of these are somewhat less bad than they were when I first noticed them, but they're still pretty bad. I am puzzled at how the latter even exists. I had thought that there were rules against just making a whole page about a neologism, but either I'm wrong about that or the "rules" aren't enforced very strongly.

    Stubsack: weekly thread for sneers not worth an entire post, week ending 9 March 2025

    Stubsack: weekly thread for sneers not worth an entire post, week ending 23 February 2025

    DeepSeek might not be such good news for energy after all (www.technologyreview.com)

    New figures show that if the model’s energy-intensive “chain of thought” reasoning gets added to everything, the promise of efficiency gets murky.

    > In the week since a Chinese AI model called DeepSeek became a household name, a dizzying number of narratives have gained steam, with varying degrees of accuracy [...] perhaps most notably, that DeepSeek’s new, more efficient approach means AI might not need to guzzle the massive amounts of energy that it currently does.
    >
    > The latter notion is misleading, and new numbers shared with MIT Technology Review help show why. These early figures—based on the performance of one of DeepSeek’s smaller models on a small number of prompts—suggest it could be more energy intensive when generating responses than the equivalent-size model from Meta. The issue might be that the energy it saves in training is offset by its more intensive techniques for answering questions, and by the long answers they produce.
    >
    > Add the fact that other tech firms, inspired by DeepSeek’s approach, may now start building their own similar low-cost reasoning models, and the outlook for energy consumption is already looking a lot less rosy.

    Random Positivity Thread: Happy Book Memories

    In the spirit of our earlier "happy computer memories" thread, I'll open one for happy book memories. What's a book you read that occupies a warm-and-fuzzy spot in your memory? What book calls you back to the first time you read it, the way the smell of a bakery brings back a conversation with a friend?

    As a child, I was into mystery stories and Ancient Egypt both (not to mention dinosaurs and deep-sea animals and...). So, for a gift one year I got an omnibus set of the first three Amelia Peabody novels. Then I read the rest of the series, and then new ones kept coming out. I was off at science camp one summer when He Shall Thunder in the Sky hit the bookstores. I don't think I knew of it in advance, but I snapped it up and read it in one long summer afternoon with a bottle of soda and a bag of cookies.

    Stubsack: weekly thread for sneers not worth an entire post, week ending 2nd February 2025

    Stubsack: weekly thread for sneers not worth an entire post, week ending 26th January 2025

    Stubsack: weekly thread for sneers not worth an entire post, week ending 19th January 2025

    Facebook "Secretly Trained Its AI on a Notorious Piracy Database, Newly Unredacted Court Docs Reveal"

    Kate Knibbs reports in Wired magazine:

    > Against the company’s wishes, a court unredacted information alleging that Meta used Library Genesis (LibGen), a notorious so-called shadow library of pirated books that originated in Russia, to help train its generative AI language models. [...]
    >
    > In his order, Chhabria referenced an internal quote from a Meta employee, included in the documents, in which they speculated, “If there is media coverage suggesting we have used a dataset we know to be pirated, such as LibGen, this may undermine our negotiating position with regulators on these issues.” [...]
    >
    > These newly unredacted documents reveal exchanges between Meta employees unearthed in the discovery process, like a Meta engineer telling a colleague that they hesitated to access LibGen data because “torrenting from a [Meta-owned] corporate laptop doesn’t feel right 😃”. They also allege that internal discussions about using LibGen data were escalated to Meta CEO Mark Zuckerberg (referred to as "MZ" in the memo handed over during discovery) and that Meta's AI team was "approved to use" the pirated material.

    Elsevier: Proudly charging you money so its AI can make your articles worse

    Retraction Watch reports:

    > All but one member of the editorial board of the Journal of Human Evolution (JHE), an Elsevier title, have resigned, saying the “sustained actions of Elsevier are fundamentally incompatible with the ethos of the journal and preclude maintaining the quality and integrity fundamental to JHE’s success.”

    The resignation statement reads in part,

    > In fall of 2023, for example, without consulting or informing the editors, Elsevier initiated the use of AI during production, creating article proofs devoid of capitalization of all proper nouns (e.g., formally recognized epochs, site names, countries, cities, genera, etc.) as well italics for genera and species. These AI changes reversed the accepted versions of papers that had already been properly formatted by the handling editors.

    (Via Pharyngula.)

    The Professor Assigns Their Own Book — But Now With a Tech Bubble in the Middle Step

    The UCLA news office boasts, "Comparative lit class will be first in Humanities Division to use UCLA-developed AI system".

    The logic the professor gives completely baffles me:

    > "Normally, I would spend lectures contextualizing the material and using visuals to demonstrate the content. But now all of that is in the textbook we generated, and I can actually work with students to read the primary sources and walk them through what it means to analyze and think critically."

    I'm trying to parse that. Really and truly I am. But it just sounds like this: "Normally, I would [do work]. But now, I can actually [do the same work]."

    I mean, was this person somehow teaching comparative literature in a way that didn't involve reading the primary sources and, I'unno, comparing them?

    The sales talk in the news release is really going all in selling that undercoat.

    > Now that her teaching materials are organized into a coherent text, another instructor could lead the course during the quarters when Stahuljak isn’t teaching — and offer students a very similar experience. And with AI-generated lesson plans and writing exercises for TAs, students in each discussion section can be assured they’re receiving comparable instruction to those in other sections.

    Back in my day, we called that "having a book" and "writing a lesson plan".

    Yeah, going from lecture notes and slides to something shaped like a book is hard. I know because I've fuckin' done it. And because I put in the work, I got the benefit of improving my own understanding by refining my presentation. As the old saying goes, "Want to learn a subject? Teach it." Moreover, doing the work means that I can take a little pride in the result. Serving slop is the cafeteria's job.

    (Hat tip.)

    Harmonice Mundi Books: An idea for an ethical academic publisher

    So, after the Routledge thing, I got to wondering. I've had experience with a few noble projects that fizzled for lacking a clear goal, or at least a clear breathing point where we could say, "Having done this, we're in a good place. Stage One complete." And a project driven by volunteer idealism — the usual mix of spite and whimsy — can splutter out if it requires more than one person to be making it a high/top priority. If half a dozen people all like the idea but each of them ranks it 5th or 6th among things to do, academic life will ensure that it never gets done.

    With all that in mind, here is where my thinking went. I provisionally tagged the idea "Harmonice Mundi Books", because Kepler writing about the harmony of the world at the outbreak of the Thirty Years' War is particularly resonant to me. It would be a micro-publisher with the tagline "By scholars, for scholars; by humans, for humans."

    The Stage One goal would be six books. At least one would be by a "big name" (e.g., someone with a Wikipedia article that they didn't write themselves). At least one would be suitable for undergraduates: a supplemental text for a standard course, or even a drop-in replacement for one of those books that's so famous it's known by the author's last name. The idea is to be both reputable and useful in a readily apparent way.

    Why six books? I want the authors to get paid, and I looked at the standard flat fee that a major publisher paid me for a monograph. Multiplying a figure in that range by 6 is a budget that I can imagine cobbling together. Not to make any binding promises here, but I think that authors should also get a chunk of the proceeds (printing will likely be on demand), which would be a deal that I didn't get for my monograph.

    Possible entries in the Harmonice Mundi series:

    • anything you were going to send to a publisher that has since made a deal with the LLM devil

    • doctoral theses

    • lecture notes (I find these often fall short of being full-fledged textbooks, chiefly by lacking exercises, but perhaps a stipend is motivation to go the extra km)

    • collections of existing long-form online writing, like the science blogs of yore

    • text versions of video essays — zany, perhaps, but the intense essayists already have manual subtitles, so maybe one would be willing to take the next, highly experimental step

    Skills necessary for this project to take off:

    • subject-matter editor(s) — making the call about what books to accept, in the case we end up with the problem we'd like to have, i.e., too many books; and supervising the revision of drafts

    • production editing — everything from the final spellcheck to a print-ready PDF

    • website person — the site could practically be static, but some kind of storefront integration would be necessary (and, e.g., rigging the server to provide LLM scrapers with garbled material would be pleasingly Puckish)

    • visuals — logo, website design, book covers, etc. We could have all the cover art be pictures of flowers that I have taken around town, but we probably shouldn't.

    • publicity — getting authors to hear about us, and getting our books into libraries and in front of reviewers

    Anyway, I have just barely started looking into all the various pieces here. An unknown but probably large amount of volunteer enthusiasm will be needed to get the ball rolling. And cultures will have to be juggled. I know that there are some tasks I am willing to do pro bono: they are part of advancing the scientific community, I am already getting a salary, and nobody else is profiting. I suspect that other academics have made similar mental calculations (e.g., about which journals to peer review for). But I am not going to go around asking creative folks to work "for exposure".

    Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 6 October 2024

    Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 29 September 2024

    Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 22 September 2024

    Random Positivity Thread: Happy Computer Memories

    Time for some warm-and-fuzzies! What happy memories do you have from your early days of getting into computers/programming, whenever those early days happened to be?

    When I was in middle school, I read an article in Discover Magazine about "artificial life" — computer simulations of biological systems. This sent me off on the path of trying to make a simulation of bugs that ran around and ate each other. My tool of choice was PowerBASIC, which was like QBasic except that it could compile to .EXE files. I decided there would be animals that could move, and plants that could also move. To implement a rule like "when the animal is near the plant, it will chase the plant," I needed to compute distances between points given their x- and y-coordinates. I knew the Pythagorean theorem, and I realized that the line between the plant and the animal is the hypotenuse of a right triangle. Tada: I had invented the distance formula!

    Off-Topic: Music Recommendation Thread

    So, here I am, listening to the Cosmos soundtrack and strangely not stoned. And I realize that it's been a while since we've had a random music recommendation thread. What's the musical haps in your worlds, friends?

    blakestacey @awful.systems (47 posts, 1.1K comments)