Skip Navigation
Stubsack: weekly thread for sneers not worth an entire post, week ending 27th July 2025
  • Hey, remember the thing that you said would happen?

    The part about condemnation and mockery? Yeah, I already thought that was guaranteed, but I didn't expect to be vindicated so soon afterwards.

    EDIT: One of the replies gives an example for my "death of value-neutral AI" prediction too, openly calling AI "a weapon of mass destruction" and calling for its abolition.

  • Stubsack: weekly thread for sneers not worth an entire post, week ending 27th July 2025
  • Discovered some commentary from Baldur Bjarnason about this:

    Somebody linked to the discussion about this on hacker news (boo hiss) and the examples that are cropping up there are amazing

    This highlights another issue with generative models that some people have been trying to draw attention to for a while: as bad as they are in English, they are much more error-prone in other languages

    (Also IMO Google translate declined substantially when they integrated more LLM-based tech)

    On a personal sidenote, I can see non-English text/audio becoming a form of low-background media in and of itself, for two main reasons:

    • First, LLMs' poor performance in languages other than English will make non-English AI slop easier to identify - and, by extension, easier to avoid

    • Second, non-English datasets will (likely) contain less AI slop in general than English datasets - between English being widely used across the world, the tech corps behind this bubble being largely American, and LLM userbases being largely English-speaking, chances are AI slop will be primarily generated in English, with non-English AI slop being a relative rarity.

    By extension, knowing a second language will become more valuable as well, as it would allow you to access (and translate) low-background sources that your English-only counterparts cannot.

  • Some Off-The-Cuff Predictions about the Next AI Winter

    It’s pretty much a given that we’re in for an AI winter once this bubble bursts - the only thing we can argue on at this point is exactly how everything will shake out. So, let’s beat this dead horse and make some random predictions before it inevitably gets sent to the glue factory. I’ve hardly got anything better to do.

    The Death of “Value-Neutral” AI

    Before this bubble, artificial intelligence was generally viewed as value-neutral. It was generally viewed as a tool, capable of good or evil, bringing about a futuristic utopia or a Terminator-style apocalypse.

    Between the large-scale art theft/plagiarism committed to build the datasets (through coercion, deception, ignoring the victim’s refusal, spamming new scrapers, et cetera), the abused and underpaid workers who classified the datasets, the myriad harms brought by the LLMs themselves (don’t get me fucking started), and the utterly ghoulish acts of the CEOs and AI bros involved (defending open theft, mocking their victims, cultural vandalism, denigrating human work, etcetera), that “value neutral” notion is dead and fucking buried.

    Going forward, I expect artificial intelligence to be viewed not as a tool or a technology, but as an enemy (of sorts), built to perpetrate evil, and capable only of evil. As for its users (assuming it still has users), I expect them to be viewed as tech assholes, class traitors, incompetent dipshits, “prompt goblins” craving approval, and generally worthy only of mockery or condemnation.

    Confidence: Near-certain. Ali Alkhatib’s “Defining AI” (which called for redefining AI as an ideological project to more effectively resist it) and Matthew Hughes’ “People Are The Point” (a manifesto which opposes AI on principle, calling it “an expression of contempt towards people”) have already provided crystal-clear examples of AI being treated as an evil unto itself, and the links in the previous paragraph already show use of AI being treated as a moral failing as well.

    Side-Order of Tech Crash

    It’s no secret that the tech industry has put a horrific amount of cash into this AI bubble - every major AI corp burns billions in VC cash with no end in sight, Microsoft performed mass layoffs to throw money at AI (mass layoffs of people making the company money, mind you), NVidia is blowing billions on AI money-burners (to keep making a killing off of selling shovels in this AI gold rush), the fucking works. And all in pursuit of a Hail Mary pass intended to keep the tech industry’s Endless Growth™ going for just a few years more.

    (Going by David Gerard, previous AI springs were primarily funded by the Department of Defense, with winter setting in whenever their patience for burning cash ran out.)

    With all the billions upon billions thrown into AI, and revenue from said AI being somewhere between Jack and Shit (barring the profits of shovel-sellers like NVidia, as mentioned before), this AI winter will likely kick off with a very wide-ranging tech crash that takes a chunk out of the entire industry, and causes some serious economic woes for good measure.

    Confidence: Very high. Ed Zitron’s gone into punishing detail about the utterly fucked economics of basically everyone involved in this bubble, and I’d be here all day if I went over everything he’s written about. Picking just a single article, here’s him talking about OpenAI being a systemic risk to tech.

    Scrapers Need Not Apply

    Before the AI bubble, scrapers/crawlers were a normal, accepted part of the Internet ecosystem - there was no real incentive to block crawlers by default, since the vast majority were well-behaved and followed robots.txt, and search engine crawlers specifically were something you wanted to welcome, since those earned you traffic from search results.

    Come the AI bubble, this status quo would be completely undermined, for three main reasons.

    First, and most obviously, there’s the theft - far from having any benevolent purposes, the crawlers employed by AI corps are created to outright steal data off your blog/website, then use it to create a slop generator that claims your work as its own and/or tries to put you out of business, making AI crawlers an long-term existential threat to whatever endeavours you go into.

    Second, AI Summary™ services (like Google’s) created through the aforementioned theft have utterly cratered search engine traffic, taking the main upside to allowing crawlers to scrape your site and turning it into a severe downside.

    Last, but not least, are the AI crawlers themselves - thanks to how they DDoS whatever sites or FOSS infrastructure they decide to scrape, and the dirty tricks employed in said scraping (ignoring robots.txt, lying about their user agent, spamming new scrapers, using botnets, etcetera), doing anything short of blocking scrapers on sight is not just a long-term liability to you, but an immediate liability to your website as well.

    As a response to these crawlers, a cottage industry of anti-scraping solutions cropped up providing a variety of ways to fight back. Between dedicated bot-blockers like Anubis, tarpits like Iocaine and Nepenthes, and media-poisoning tools like Glaze and Nightshade, scrapers of all stripes now face an ever-present risk of being blocked from data (especially high quality data), or force-fed misleading data intended to waste their time and poison their datasets.

    As the cherry on top of this anti-scraper shit sundae, the rise of generative AI has flooded the ‘Net with AI slop, which is difficult to identify, near-impossible to avoid, and outright useless (if not dangerous) to scrape. Unless you’re limiting yourself to sources made before 2022 (commonly known as low-background media), chances are you’re gonna have to deal with your dataset getting contaminated.

    Given all this, I expect scraper activity in general (malicious or otherwise) to steeply drop during the AI winter, as all scrapers get treated as guilty (of AI fuckery) until proven innocent, and non-malicious scraper activity drops off as developers deem running them to be not worth the hassle.

    Confidence: Moderate. I already know of one scraper-based project (wordfreq, to be specific) which shut down as a consequence of the AI bubble - I wouldn’t be shocked to see more cases crop up down the line.

    Condemnation and Mockery

    For the past two years, the AI bubble has been inescapable for the public at large.

    On one front, they’ve spent the past two years being utterly inundated with AI hype of every stripe - AI bros hyping up AI as The Future™, wild and spurious claims of Incoming Superintelligence™, rigged tests and cheated benchmarks made directly by the AI corps, and relentless anthropomorphisation of spicy autocompletes and signal-shaped noise generators.

    Especially anthropomorphisation - whether it be painting hallucinations as lies, presenting AI as deceptive or coercive, or pretending they can feel pain, there has been a horrendous amount of time and money spent on trying to deceive the public into believing LLMs are sentient, if not humanlike in their actions.

    On another front, the public has bore witness to a wide variety of harms as a direct consequence of AI’s creation.

    Local environmental catastrophe, global water loss and sky high emissions, widespread job loss, academic misconduct, nonstop hallucinations and misinformation, voice-cloning scams, programming disasters, damaged productivity, psychosis, outright suicide (on multiple occasions), the list goes on and on and on and on and on.

    All of this has been thoroughly burned into the public consciousness over these past two or three years, ensuring AI will retain a major (and deeply negative) presence there, and ensuring AI as a concept will face widespread mockery and condemnation from the public, until long after the bubble bursts.

    Giving some more specifics:

    Confidence: Completely certain. I’m basically “predicting” something that’s already happening right now, and has a very good chance of continuing months, if not years, down the road.

    Arguably, I’m being a bit conservative with this prediction - given the cultural rehabilitation of the Luddites, and the rise of a new Luddite movement in 2024, I could easily argue that the bubble’s started a full-blown resistance movement against the tech industry as a whole.

    2
    Stubsack: weekly thread for sneers not worth an entire post, week ending 27th July 2025
    awful.systems Stubsack: Stubsack: weekly thread for sneers not worth an entire post, week ending 20th July 2025 - awful.systems - awful.systems

    Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret. Any awf...

    Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

    Any awful.systems sub may be subsneered in this subthread, techtakes or no.

    If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

    > The post Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be) > > Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

    (Credit and/or blame to David Gerard for starting this.)

    61
    Godot Showcase - Dogwalk
    godotengine.org Godot Showcase - Dogwalk – Godot Engine

    Julien and Simon from Blender Studio tell us about their experience working on Dogwalk.

    Godot Showcase - Dogwalk – Godot Engine

    A crossover between Godot and Blender was not on my bingo card for 2025, but I'm still pretty happy to see - not just because we got a cool little game out of it, but because interop between the two got a major boost.

    0
    Stubsack: Stubsack: weekly thread for sneers not worth an entire post, week ending 20th July 2025 - awful.systems
  • Found a good security-related sneer in response to a low-skill exploit in Google Gemini (tl;dr: "send Gemini a prompt in white-on-white/0px text"):

    I've got time, so I'll fire off a sidenote:

    In the immediate term, this bubble's gonna be a goldmine of exploits - chatbots/LLMs are practically impossible to secure in any real way, and will likely be the most vulnerable part of any cybersecurity system under most circumstances. A human can resist being socially engineered, but these chatbots can't really resist being jailbroken.

    In the longer term, the one-two punch of vibe-coded programs proliferating in the wild (featuring easy-to-find and easy-to-exploit vulnerabilities) and the large scale brain drain/loss of expertise in the tech industry (from juniors failing to gain experience thanks to using LLMs and seniors getting laid off/retiring) will likely set back cybersecurity significantly, making crackers and cybercriminals' jobs a lot easier for at least a few years.

  • OpenAI investor falls for GPT's SCP-style babble
  • Found a neat tangent whilst going through that thread:

    The single most common disciplinary offense on scpwiki for the past year+ has been people posting AI-generated articles, and it is EXTREMELY rare for any of those cases to involve a work that had been positively received

    On a personal note, I expect the Foundation to become a reliable source of post-'22 human-made work for the same reasons I stated Newgrounds would recently:

    • An explicit ban on AI slop, which deters AI bros and allow staff to nuke it on sight

    • A complete lack of an ad system, which prevents content farms from setting up shop

    • Dedicated quality control systems (deletion and rewrite policies, in this case) which prevent slop from gaining a foothold and drowning out human-made work

  • Stubsack: Stubsack: weekly thread for sneers not worth an entire post, week ending 20th July 2025 - awful.systems
  • Tangential: I’ve heard that there are 3D printer people that print junk and sell them. This would not be much of a problem if they didn’t pollute the spaces they operate in.

    So, essentially AI slop, but with more microplastics. Given the 3D printer bros are much more limited in their ability to pollute their spaces (they have to pay for filament/resin, they're physically limited in where they can pollute, and they produce slop much slower than an LLM), they're hopefully easier to deal with.

  • Stubsack: Stubsack: weekly thread for sneers not worth an entire post, week ending 20th July 2025 - awful.systems
  • Similarly, at the chip production facilities, a committee of representatives stands at the end of the production line basically and rolls a ten-sided die for each chip; chips that don’t roll a 1 are destroyed on the spot.

    Ah, yes, artificially kneecap chip fabs' yields, I'm sure that will go over well with the capitalist overlords who own them

  • A Mini-Essay on Newgrounds' Resistance to AI Slop

    Recently, I found myself ruminating on the general lack of AI slop over on Newgrounds (a site I use rather heavily, and have been since I joined in 2023). The only major case I've seen in recent memory was an influx of vibe-coded shovelware I saw last month.

    If the title didn't tip you off, I personally believe it to be due to Newgrounds being naturally resistant to contamination with AI slop. Here's a few reasons why I think that is:

    An Explicit Stance

    First off, I'll get the obvious reason out the way.

    Newgrounds has explicitly banned AI slop from being uploaded since September 2022,very early into the bubble. Whilst the guidelines carve out some minor exceptions for using AI to assist human work, simply generating a piece of slop and hitting “Upload” is off the table.

    The only real development since then was a site update in early March 2024, which added the option to flag a submission as AI slop.

    Both of these moves made the site’s stance loud and clear: AI slop of all stripes is not welcome on NG.

    Beyond giving the mods explicit permission to delete slop on sight, the move likely did plenty to deter AI bros from setting up shop - if they weren't gonna get some easy clout from spewing their slop on there, why bother?

    DeviantArt provides an obvious point of contrast here - far from take any concrete stance against AI slop, the site actively welcomed it, launching a slop generator of their own in November 2022 and doing nothing to rein in slop as it flooded the site.

    Slop-proof Monetization

    A second, and arguably less important factor, is Newgrounds’ general approach to monetisation - both in making money and paying it out.

    In terms of making money, Newgrounds has pushed heavily towards running ad-free since the start of the decade - as of this writing, Newgrounds relies near-exclusively on Supporter (a subscription service which started in 2012, for context) for revenue. (Right now, adverts run exclusively on A-rated submissions (i.e. porn), which require an account to view.)

    At the same time, the site wound down its previous rev-share system (which directly ran on ad revenue), leaving just the monthly cash prizes for payouts.

    The overall effect of this change has been to render NG outright inhospitable to content farms (AI-based or otherwise) - being reliant on ad revenue to turn a profit off their low quality Content™, the near-total lack thereof renders running one on there impractical.

    (Arguably, this reason isn't a particularly important one - being a niche animation site dwarfed by the likes of YouTube and Instagram, NG likely fell well under the radars of content farms even before its ad-free push.)

    DeviantArt, once again, provides an easy point of contrast - as the site itself has proudly paraded, the site's monetisation features have enabled AI bros to make a quick buck off of flooding it with slop.

    Judgment/Scouting

    Wrapping this up with something that doesn't have a parallel in dA, I'm gonna look at the judgment and scouting systems used on the site. Though originally intended to maintain a minimum level of quality, these systems have helped prevent AI slop from gaining a foothold as well.

    Judgment

    For the main Portal (which covers animations and games), a simple voting process called judgment is used - users vote from 0 to 5 on uploaded works, with low-scoring submissions being automatically deleted (referred to as being ‘blammed’).

    Whilst rather simple, the process has proven effective in keeping low-effort garbage off of Newgrounds - and with “low-effort garbage” being a perfect description of AI slop, the judgment process has enabled users to get rid of AI slop without moderator intervention, reducing the mods’ workload in the process.

    Scouting

    For the Audio Portal and the Art Portal, a vetting system (referred to as “scouting”) is used instead.

    By default, work by unscouted artists appears in the "Undiscovered Artists" section of the Art/Audio Portals, hidden away from public view unless someone actively opts-in to view it or they find it from checking the user’s account.

    If an already-scouted user or a moderator sees an unscouted user, they have the option to scout them, essentially vouching that their work follows site guidelines and is of sufficiently high quality. The effects of this are twofold:

    • First, the user’s work is placed into the “Approved Artists” section of the appropriate Portal, granting a large boost to its visibility.

    • Second, the user is granted the ability to scout other users, vouching for their work in turn.

    Said ability is something users are required to exercise with caution - if the scouted user is later caught breaking site guidelines (or if their work is deemed too poor-quality), they can be de-scouted by an Art/Audio Moderator, and those who scouted them can be stripped of their ability to scout other users.

    This system creates an easy method of establishing trust among the userbase (arguably equivalent to a PGP-style web of trust) - simply by knowing someone's been scouted, you can be confident they're posting human-made work, and scouted users can in turn extend that trust by scouting other users.

    Additionally, Art/Audio Moderators are equipped to handle any breaches in said trust, whether by de-scouting users for posting slop, or removing scouting abilities from users who can't be trusted with them, enabling trust to be quickly restored.

    As a secondary benefit, any slop which does get submitted is effectively hidden from view by default, making it easy for human-made work to drown out the slop, rather than the other way around.

    Conclusion

    The large-scale proliferation of generative AI has been a disaster for the Internet at large, flooding practically every corner of it with AI slop of all stripes.

    Given all that, its kinda miraculous to know that there's any corner of the ‘Net which has braved the slop-nami and come out unscathed, let alone one as large (if rather niche) as Newgrounds is.

    So, if you're looking for human-made work produced after 2022…well, I don't know where to search for most things, but for art, music, games or animation, you already know where to start :P

    0
    Stubsack: Stubsack: weekly thread for sneers not worth an entire post, week ending 20th July 2025 - awful.systems
  • The deluge of fake bug reports is definitely something I should have noted as well, since that directly damages FOSS' capacity to find and fix bugs.

    Baldur Bjanason has predicted that FOSS is at risk of being hit by "a vicious cycle leading to collapse", and security is a major part of his hypothesised cycle:

    1. Declining surplus and burnout leads to maintainers increasingly stepping back from their projects.

    2. Many of these projects either bitrot serious bugs or get taken over by malicious actors who are highly motivated because they can’t relay on pervasive memory bugs anymore for exploits.

    3. OSS increasingly gets a reputation (deserved or not) for being unsafe and unreliable.

    4. That decline in users leads to even more maintainers stepping back.

  • Stubsack: Stubsack: weekly thread for sneers not worth an entire post, week ending 20th July 2025 - awful.systems
  • Potential hot take: AI is gonna kill open source

    Between sucking up a lot of funding that would otherwise go to FOSS projects, DDOSing FOSS infrastructure through mass scraping, and undermining FOSS licenses through mass code theft, the bubble has done plenty of damage to the FOSS movement - damage I'm not sure it can recover from.

  • Stubsack: Stubsack: weekly thread for sneers not worth an entire post, week ending 20th July 2025 - awful.systems
  • Reading through some of the examples at the end of the article it’s infuriating when these slop reports have opened and when the patient curl developers try to give them benefit of the doubt the reporter replies with “you have a vulnerability and I cannot explain further since I’m not an expert”

    At that point, I feel the team would be justified in telling these slop-porters to go fuck themselves and closing the report - they've made it crystal clear they're beyond saving.

    (And on a wider note, I suspect the security team is gonna be a lot less willing to give benefit of the doubt going forward, considering the slop-porters are actively punishing them for doing so)

  • Stubsack: Stubsack: weekly thread for sneers not worth an entire post, week ending 20th July 2025 - awful.systems
  • This is pure speculation, but I suspect machine learning as a field is going to tank in funding and get its name dragged through the mud by the popping of the bubble, chiefly due to its (current) near-inability to separate itself from AI as a concept.

  • Stubsack: Stubsack: weekly thread for sneers not worth an entire post, week ending 13th July 2025
    awful.systems Stubsack: Stubsack: weekly thread for sneers not worth an entire post, week ending 6th July 2025 - awful.systems

    Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret. Any awf...

    Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

    Any awful.systems sub may be subsneered in this subthread, techtakes or no.

    If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

    > The post Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be) > > Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

    (Credit and/or blame to David Gerard for starting this.)

    132
    Stubsack: Stubsack: weekly thread for sneers not worth an entire post, week ending 6th July 2025
    awful.systems Stubsack: Stubsack: weekly thread for sneers not worth an entire post, week ending 29th June 2025 - awful.systems

    Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret. Any awf...

    Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

    Any awful.systems sub may be subsneered in this subthread, techtakes or no.

    If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

    > The post Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be) > > Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

    (Credit and/or blame to David Gerard for starting this. Also, happy 4th July in advance...I guess.)

    148
    Rolling the ladder up behind us - Xe Iaso on the LLM bubble
    xeiaso.net Rolling the ladder up behind us

    Who will take over for us if we don't train the next generation to replace us? A critique of craft, AI, and the legacy of human expertise.

    Rolling the ladder up behind us
    2
    Stubsack: Stubsack: weekly thread for sneers not worth an entire post, week ending 29th June 2025
    awful.systems Stubsack: weekly thread for sneers not worth an entire post, week ending 22nd June 2025 - awful.systems

    Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret. Any awf...

    Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

    Any awful.systems sub may be subsneered in this subthread, techtakes or no.

    If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

    > The post Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be) > > Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

    (Credit and/or blame to David Gerard for starting this.)

    82
    Some More Quick-and-Dirty Thoughts on AI's Future
    awful.systems Some Quick-and-Dirty Thoughts on AI's Future - awful.systems

    (This is basically an expanded version of a comment on the weekly Stubsack - I’ve linked it above for convenience’s sake.) This is pure gut instinct, but I’m starting to get the feeling this AI bubble’s gonna destroy the concept of artificial intelligence as we know it. On the artistic front, there’...

    (No clue why I didn't get around to this earlier, I've had this in drafts for too long.)

    Eight months ago, as you probably know, I predicted the current AI bubble would destroy artificial intelligence as a concept, focusing on the unrelenting slop and failures of AI, and on the near-universal backlash it receives whenever it rears its ugly, slop-ridden head.

    As it turns out, I had completely failed to recognise the the political elements of this entire bubble. In retrospect, I should've recognised it a lot fucking earlier.

    Between Baldur Bjarnason outlining the esoteric fascist elements at the heart of the AI bubble, AI slop's enthusiastic adoption by fascists of all stripes, Damien Williams' notes on authoritarians' love for gen-AI, Ashley Lynch calling AI slop inherently fascist, Pavel calling AI a specifically right-wing phenomenon and probably some extra stuff I've missed, its become clear the outright fascist nature of AI has been staring me in the face the entire time.

    With all that in mind, I'd like to expand on my previous piece with three additional predictions.

    1. AI-as-Fascism

    Right off the bat, I expect AI as a concept will pick up a public perception of being inherently fascist, or at least a tool of fascism. Beyond all the ink spilled about AI's fascist nature, Donald Trump going all-in on AI has done plenty to link his administration with the tech, whether through making AI slop of deportees, or letting Elon Musk's AI Poweredtm Department Of Government Efficiency go to town on the federal gov.

    Long-term, I expect this will hamper future attempts to start new AI bubbles/AI springs, as attempts to revive the tech get treated as morally equivalent to creating the Fourth Reich.

    1. The Wider Tech Industry

    On a wider front, I expect the tech industry at large will pick up a similar stench of Eau de Fash as well. Whilst the tech industry has long enjoyed a perception of apoliticality (which James Allen-Robertson has talked about (spoilers for Devs BTW), their own heavy involvement with the Trump administration has done plenty to undermine that.

    Between their sucking up to Trump and AI's own stench of Eau de Fash, I can see the public starting to view the tech industry at large as a Nazi bar once the bubble bursts. Silicon Valley's given 'em plenty of reason to do so.

    1. A Bone for The Humanities

    Ending this off on a vaguely positive note, I suspect the bubble's burst will earn the humanities some begrudging respect once the dust settles - primarily through cannibalising a fair bit of the cultural cachet that STEM has built up over the decades.

    On one front, the slop-nami has given us an absolute torrent of slop flooding the Internet, notable both in its uniquely inhuman shittiness, and in AI bros' breathless adoration of it. Given that, I expect programmers/software engineers will come to be viewed as inherently incapable of making anything on par with anyone who has taken up art as a hobby/profession, and incapable of understanding art with any sort of depth to it.

    Additionally, I suspect a stereotype of programmers/software engineers being hostile to art/artists may form, thanks to the rather drastic toll this bubble has had on artists, and the ongoing rhetoric of "democratising art" that the bubble's given us. (The age-old "learn to code" adage may also come back to haunt them as well, if this comes to pass.)

    On a wider front, the breathless "AI doomsday" criti-hype, more general over-the-top AI hype, and nonstop hallucination-induced mishaps will likely also contribute to making STEM as a discipline look out-of-touch with reality, making the humanities look grounded and reasonable by comparison as the public looks in confusion at AI bros' inability to recognise LLMs' shittiness for what it is.

    (EDIT: Slightly expanded the introduction, adding an example from Pavel.)

    0
    We started porting LEGO Island to... everything? [MattKC]

    Good news: a portable version of the LEGO Island decomp just came out, meaning ports to other OSes or devices are now possible.

    Hell, there's even a browser port now, available at https://isle.pizza/. If you wanna learn more about the decomp, check out the video I linked.

    3
    The Psychology Behind Tech Billionaires
    www.rollingstone.com What You've Suspected Is True: Billionaires Are Not Like Us

    Why Elon Musk and other tech billionaires have a different perspective

    What You've Suspected Is True: Billionaires Are Not Like Us

    New Rolling Stone piece from Alex Morris, focusing heavily on our very good friends and the tech billionaires they're buddies with.

    (Also, that's a pretty clever alternate title)

    2
    Stubsack: weekly thread for sneers not worth an entire post, week ending 22nd June 2025
    awful.systems Stubsack: weekly thread for sneers not worth an entire post, week ending 15th June 2025 - awful.systems

    Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret. Any awf...

    Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

    Any awful.systems sub may be subsneered in this subthread, techtakes or no.

    If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

    > The post Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be) > > Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

    (Credit and/or blame to David Gerard for starting this.)

    102
    Stubsack: weekly thread for sneers not worth an entire post, week ending 15th June 2025
    awful.systems Stubsack: weekly thread for sneers not worth an entire post, week ending 8th June 2025 - awful.systems

    Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret. Any awf...

    Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

    Any awful.systems sub may be subsneered in this subthread, techtakes or no.

    If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

    > The post Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be) > > Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

    (Credit and/or blame to David Gerard for starting this.)

    102
    Stubsack: weekly thread for sneers not worth an entire post, week ending 8th June 2025
    awful.systems Stubsack: weekly thread for sneers not worth an entire post, week ending 1st June 2025 - awful.systems

    Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret. Any awf...

    Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

    Any awful.systems sub may be subsneered in this subthread, techtakes or no.

    If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

    > The post Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be) > > Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

    (Credit and/or blame to David Gerard for starting this. Also, happy Pride :3)

    115
    Stubsack: weekly thread for sneers not worth an entire post, week ending 1st June 2025
    awful.systems Stubsack: weekly thread for sneers not worth an entire post, week ending 25th May 2025 - awful.systems

    Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret. Any awf...

    Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

    Any awful.systems sub may be subsneered in this subthread, techtakes or no.

    If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

    > The post Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be) > > Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

    (Credit and/or blame to David Gerard for starting this.)

    125
    Stubsack: weekly thread for sneers not worth an entire post, week ending 25th May 2025
    awful.systems Stubsack: weekly thread for sneers not worth an entire post, week ending 18th May 2025 - awful.systems

    Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret. Any awf...

    Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

    Any awful.systems sub may be subsneered in this subthread, techtakes or no.

    If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

    > The post Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be) > > Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

    (Credit and/or blame to David Gerard for starting this.)

    171
    Stubsack: weekly thread for sneers not worth an entire post, week ending 18th May 2025

    Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

    Any awful.systems sub may be subsneered in this subthread, techtakes or no.

    If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

    > The post Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be) > > Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

    (Credit and/or blame to David Gerard for starting this.)

    171
    Stubsack: weekly thread for sneers not worth an entire post, week ending 11th May 2025
    awful.systems Stubsack: weekly thread for sneers not worth an entire post, week ending 4th May 2025 - awful.systems

    Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret. Any awf...

    Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

    Any awful.systems sub may be subsneered in this subthread, techtakes or no.

    If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

    > The post Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be) > > Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

    (Credit and/or blame to David Gerard for starting this.)

    194
    Stubsack: weekly thread for sneers not worth an entire post, week ending 4th May 2025
    awful.systems Stubsack: weekly thread for sneers not worth an entire post, week ending 27th April 2025 - awful.systems

    Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret. Any awf...

    Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

    Any awful.systems sub may be subsneered in this subthread, techtakes or no.

    If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

    > The post Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be) > > Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

    (Credit and/or blame to David Gerard for starting this.)

    174
    Stubsack: weekly thread for sneers not worth an entire post, week ending 27th April 2025
    awful.systems Stubsack: weekly thread for sneers not worth an entire post, week ending 20th April 2025 - awful.systems

    Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret. Any awf...

    Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

    Any awful.systems sub may be subsneered in this subthread, techtakes or no.

    If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

    > The post Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be) > > Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

    (Credit and/or blame to David Gerard for starting this.)

    167
    InitialsDiceBearhttps://github.com/dicebear/dicebearhttps://creativecommons.org/publicdomain/zero/1.0/„Initials” (https://github.com/dicebear/dicebear) by „DiceBear”, licensed under „CC0 1.0” (https://creativecommons.org/publicdomain/zero/1.0/)BL
    BlueMonday1984 @awful.systems
    Posts 74
    Comments 790