Stubsack: weekly thread for sneers not worth an entire post, week ending 23rd December 2024
  • In other news, Character.AI has ended up in the news again for allowing school shooter chatbots to flourish on its platform.

    You want my off-the-cuff take, this is definitely gonna fuck c.ai's image even further, and could potentially leave them wide open to a lawsuit.

    On a wider front, this is likely gonna give AI another black eye, and push us one step closer to the utter destruction of AI as a concept that I predicted a couple months ago.

  • Stubsack: weekly thread for sneers not worth an entire post, week ending 23rd December 2024

    Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

    Any awful.systems sub may be subsneered in this subthread, techtakes or no.

    If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

    > The post Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be)
    >
    > Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

    (Semi-obligatory thanks to @dgerard for starting this.)

    Apple Intelligence AI mangles headlines so badly the BBC officially complains
  • But Apple Intelligence has its good points. “I find the best thing about Apple intelligence is that since I haven’t enabled it, my phone optimized for onboard AI has incredible battery life,” responded another Bluesky user. [Bluesky]

    Y'know, if Apple had simply removed the AI altogether and gone with that as a marketing point, people would probably buy more iPhones.

    At the bare minimum, AI wouldn't be actively driving people away from buying them.

  • Stubsack: weekly thread for sneers not worth an entire post, week ending 15th December 2024
  • Baldur Bjarnason just put out his interim notes on tech

    Reading through his notes, I get the feeling my off-the-cuff predictions from two months ago are very likely to come true - much more likely than I had anticipated. I'm gonna focus on Predictions 2 and 3:

    2. A Decline in Tech/STEM Students/Graduates

    Whether or not tech crashes like I expect, AI likely did some serious damage to the notion of "tech/STEM = high-paying job" as it sent the job market into turmoil (with junior positions being particularly affected).

    3. Tech/STEM’s Public Image Changes - For The Worse

    Whilst my original post focused on AI turning the public against the tech industry, one thing I didn't account for was the tech industry directly aligning itself with Trump. As such, tech/STEM's public image is now firmly intertwined with Trump's public image - with predictable results.

  • Stubsack: weekly thread for sneers not worth an entire post, week ending 15th December 2024
    Stubsack: weekly thread for sneers not worth an entire post, week ending 8th December 2024
  • My hands are requesting the CEO's home address (in Minecraft real life):

    Quick bonus I found in the replies:

    And a quick sidenote from me:

    This is sorta repeating a previous prediction of mine, but I expect this AI bubble's gonna utterly tank the public image of tech as a whole. When you develop a tech whose primary use case boils down to "make the world worse so the line can go up", it's gonna be virtually impossible for the public to forgive you.

    Being more specific, I expect artists/musicians/creatives in general to be utterly hostile to AI, if not tech as a whole - AI has made their lives significantly harder in a variety of ways, and all signs point to the tech industry having done so willingly.

  • Stubsack: weekly thread for sneers not worth an entire post, week ending 8th December 2024
    Stubsack: weekly thread for sneers not worth an entire post, week ending 1st December 2024
  • New post from Brian Merchant: No thanks to generative AI, which is about AI-run publisher Spines and their attempt to enshittify the literature world. Pulling a paragraph near the end here:

    For another, the needle can move here; if the noise is loud enough, AI publishing can get slapped with a stigma that can at least help slow the erosion of the industry. Public shame can be a powerful tool, when warranted! So yeah: This is why I’m thankful that we’re building this community, and that there are people out there willing to go to the mat to oppose things like the AI-enabled automation of book production. (I fully resent that ‘AI enabled automation of book production’ is a phrase I had to write in 2024.)

    Giving my thoughts, I feel Merchant and co. have a headstart when it comes to moving the needle here, for two main reasons:

    • AI has been thoroughly stripped of whatever "wow factor" it once had - showing off that your gen-AI system can make books isn't gonna impress Joe Public the way it would've back in '22 or '23.

    • The one-two punch of the slop-nami and the plagiarism lawsuits has indelibly associated "AI" as a concept with "zero effort garbage made of stolen shit" - as a consequence, using or supporting it will immediately disgust a good portion of the crowd right out of the gate.

  • Stubsack: weekly thread for sneers not worth an entire post, week ending 1st December 2024
  • On the one hand, that spectacular failure could potentially dissuade the military from buying in and prolonging this bubble. On the other hand, having an accountability sink for war crimes would be a tempting offer to your average army.

  • Stubsack: weekly thread for sneers not worth an entire post, week ending 1st December 2024

    Stubsack: weekly thread for sneers not worth an entire post, week ending 24th November 2024

    Where's Your Ed At - Lost In The Future
    Stubsack: weekly thread for sneers not worth an entire post, week ending 17th November 2024

    Some Off-The-Cuff Predictions on Trump's Presidency

    Well, Trump's got elected, I'm deeply peeved about the future, and I need to feel like I can see the future coming.

    Fuck it. Here's some off-the-cuff predictions, because I need to.

    1. There's a Spike in Infanticides

    Under normal circumstances, anyone who doesn't want to deal with raising crotch goblins has two major options: use contraception to keep the pregnancy from happening at all, or shitcan the unborn fetus by getting an abortion.

    Abortion's been on the way out since Roe v. Wade got double-tapped, and contraception's probably gonna get nuked as well, so sending the baby to an early grave is probably gonna be the only option for most unwanted births.

    Chance: Low/Moderate. This is more-or-less pure gut feeling I'm going on, and chances are there's gonna be a lot of abortions getting recorded as infanticides, skewing the numbers.

    2. America's Reputation Nosedives

    Gonna just copy-paste istewart's comment for this, because bro said it better than I could:

    > I am left thinking that many people here in the US are going to have a hard time accepting that having this person, at this stage in his life, as the national figurehead will do permanent damage to the US’ prestige and global standing. Doesn’t matter if Reagan was sundowning, media was more controlled then and his handlers were there to support the institution of the presidency at least as much as they were there to support a narcissist. In 2016, other countries could look at Trump as a temporary aberration and wait him out. This time, it’s clear that the US is no longer a reliable partner.

    Chance: Near-Guaranteed. With 2016, the US could point to Hillary winning the popular vote to blunt the damage. 2024 gives no real way to spin it - a majority of Americans explicitly wanted Trump as president. Whatever he does, it's in their name.

    3. Violence Against Silicon Valley

    I've already predicted this before, but I'm gonna predict it again. Whether it's because Trump scapegoats the Valley as someone else predicted, labor comes to believe peace is no longer an option, someone decides "fuck it, time to make Uncle Ted proud", or God-knows-what-else, I fully expect there's gonna be direct and violent attacks on tech at some point.

    No clue on the exact form either - could be someone getting punched for wearing Ray-Ban Autodoxxers, could be rank-and-file employees getting bombed, could be Trumpies pulling a Pumped Up Kicks, fuck if I know.

    Chance: Moderate. The tech industry's managed to piss off both political wings for one reason or another, though the left's done a good job not resorting to pipe bombs during my lifetime.

    4. Another Assassination Attempt

    With Trump having successfully evaded justice on his criminal convictions (which LegalEagle's discussed in depth), any hope that he's going to see the inside of a jail cell is virtually zero. This leaves vigilante justice as the only option for anyone who wants this man to face anything approaching consequences for his actions.

    Chance: Low. Presidents are very well-protected these days, though Trump's own ego and stupidity may end up opening him up to an attempt.

    5. A Total Porn Ban

    Project 2025 may be defining everything even remotely queer as "pornography", but nothing's stopping a Trump presidency from putting regular porn on the chopping block as well. With the existing spate of anti-porn laws in the US, there's already some minor precedent.

    Chance: Moderate/High. Porn doesn't enjoy much political support from either the Dems or the Reps, though the porn industry could probably get the Dems' support if it knows what it's doing.

    Stubsack: weekly thread for sneers not worth an entire post, week ending 10th November 2024

    Stubsack: weekly thread for sneers not worth an entire post, week ending 3rd November 2024

    Some Quick-and-Dirty Thoughts on the Character.ai lawsuit

    (This is an expanded version of two of my comments [Comment A, Comment B] - go and read those if you want)

    Well, Character.ai got themselves into some real deep shit recently - repeat customer Sewell Setzer shot himself, and his mother, Megan Garcia, is suing the company, its founders and Google as a result, accusing them of "anthropomorphising" their chatbots and offering “psychotherapy without a license,” among other things, and demanding a full-blown recall.

    Now, I'm not a lawyer, but I can see a few aspects which give Garcia a pretty solid case:

    • The site has "mental health-focused chatbots like “Therapist” and “Are You Feeling Lonely,” which Setzer interacted with" as Emma Roth noted writing for The Verge

    • Character.ai has already had multiple addiction/attachment cases like Sewell's - I found articles from Wired and news.com.au, plus a few user testimonies (Exhibit A, Exhibit B, Exhibit C) about how damn addictive the fucker is.

    • As Kevin Roose notes for NYT "many of the leading A.I. labs have resisted building A.I. companions on ethical grounds or because they consider it too great a risk". That could be used to suggest character.ai were being particularly reckless.

    Which way the suit's gonna go, I don't know - my main interest's on the potential fallout.

    Some Predictions

    Win or lose, I suspect this lawsuit is going to sound character.ai's death knell - even if they don't get regulated out of existence, "our product killed a child" is the kind of Dasani-level PR disaster few companies can recover from, and news of this will likely prompt any would-be investors to run for the hills.

    If Garcia does win the suit, it'd more than likely set a legal precedent which denies Section 230 protection to chatbots, if not AI-generated content in general. If that happens, I expect a wave of lawsuits against other chatbot apps like Replika, Kindroid and Nomi at the minimum.

    As for the chatbots themselves, I expect they're gonna rapidly lock their shit down hard and fast, to prevent themselves from having a situation like this on their hands, and I expect their users are gonna be pissed.

    As for the AI industry at large, I suspect they're gonna try and paint the whole thing as a frivolous lawsuit and Garcia as denying any fault for her son's suicide, a la the "McDonald's coffee case". How well this will do, I don't know - personally, considering the AI industry's godawful reputation with the public, I expect they're gonna have some difficulty.

    You Can't Make Friends With The Rockstars - Ed Zitron on the tech press

    Gonna add the opening quote, because it is glorious:

    > You cannot make friends with the rock stars...if you're going to be a true journalist, you know, a rock journalist. First, you never get paid much, but you will get free records from the record company.
    >
    > [There’s] fuckin’ nothin' about you that is controversial. God, it's gonna get ugly. And they're gonna buy you drinks, you're gonna meet girls, they're gonna try to fly you places for free, offer you drugs. I know, it sounds great, but these people are not your friends. You know, these are people who want you to write sanctimonious stories about the genius of the rock stars and they will ruin rock 'n' roll and strangle everything we love about it.
    >
    > Because they're trying to buy respectability for a form that's gloriously and righteously dumb.
    >
    > — Lester Bangs, Almost Famous (2000)

    EDITED TO ADD: If you want a good companion piece to this, Devs and the Culture of Tech by @UnseriousAcademic is a damn good read, going deep into the cultural issues which leads to the fawning tech press Zitron so thoroughly tears into.

    Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 20 October 2024

    Some Quick-and-Dirty Thoughts on AI's Future
    awful.systems Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 13 October 2024 - awful.systems


    (This is basically an expanded version of a comment on the weekly Stubsack - I've linked it above for convenience's sake.)

    This is pure gut instinct, but I’m starting to get the feeling this AI bubble’s gonna destroy the concept of artificial intelligence as we know it.

    On the artistic front, there's the general tidal wave of AI-generated slop (which I've come to term "the slop-nami") which has come to drown the Internet in zero-effort garbage, interesting only when the art's utterly insane or its prompter gets publicly humiliated, and, to quote Line Goes Up, "derivative, lazy, ugly, hollow, and boring" the other 99% of the time.

    (And all while the AI industry steals artists' work, destroys their livelihoods and shamelessly mocks their victims throughout.)

    On the "intelligence" front, the bubble's given us public and spectacular failures of reasoning/logic like Google gluing pizza and eating onions, ChatGPT sucking at chess and briefly losing its shit, and so much more - even in the absence of formal proof LLMs can't reason, it's not hard to conclude they're far from intelligent.

    All of this is, of course, happening whilst the tech industry as a whole is hyping the ever-loving FUCK out of AI, breathlessly praising its supposed creativity/intelligence/brilliance and relentlessly claiming that they're on the cusp of AGI/superintelligence/whatever-the-fuck-they're-calling-it-right-now, they just need to raise a few more billion dollars and boil a few more hundred lakes and kill a few more hundred species and enable a few more months of SEO and scams and spam and slop and soulless shameless scum-sucking shitbags senselessly shitting over everything that was good about the Internet.

    ----

    The public's collective consciousness was ready for a lot of futures regarding AI - a future where it took everyone's jobs, a future where it started the apocalypse, a future where it brought about utopia, etcetera. A future where AI ruins everything by being utterly, fundamentally incompetent, like the one we're living in now?

    That's a future the public was not ready for - sci-fi writers weren't playing much with the idea of "incompetent AI ruins everything" (Paranoia is the only example I know of), and the tech press wasn't gonna run stories about AI's faults until they became unignorable (like that lawyer who got in trouble for taking ChatGPT at its word).

    Now, of course, the public's had plenty of time to let the reality of this current AI bubble sink in, to watch as the AI industry tries and fails to fix the unfixable hallucination issue, to watch the likes of CrAIyon and Midjourney continually fail to produce anything even remotely worth the effort of typing out a prompt, to watch AI creep into and enshittify every waking aspect of their lives as their bosses and higher-ups buy the hype hook, line and fucking sinker.

    ----

    All this, I feel, has built an image of AI as inherently incapable of humanlike intelligence/creativity (let alone Superintelligence™), no matter how many server farms you build or oceans of water you boil.

    Especially so on the creativity front - publicly rejecting AI, like what Procreate and Schoolism did, earns you an instant standing ovation, whilst openly shilling it (like PC Gamer or The Bookseller) or showcasing it (like Justine Moore, Proper Prompter or Luma Labs) gets you publicly and relentlessly lambasted. To quote Baldur Bjarnason, the “E-number additive, but for creative work” connotation of “AI” is more-or-less a permanent fixture in the public’s mind.

    I don't have any pithy quote to wrap this up, but to take a shot in the dark, I expect we're gonna see a particularly long and harsh AI winter once the bubble bursts - one fueled not only by disappointment in the failures of LLMs, but widespread public outrage at the massive damage the bubble inflicted, with AI funding facing heavy scrutiny as the public comes to treat any research into the field as done with potentially malicious intent.

    Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 13 October 2024

    "The Subprime AI Crisis" - Ed Zitron on the bubble's impending collapse
    www.wheresyoured.at The Subprime AI Crisis


    > None of what I write in this newsletter is about sowing doubt or "hating," but a sober evaluation of where we are today and where we may end up on the current path. I believe that the artificial intelligence boom — which would be better described as a generative AI boom — is (as I've said before) unsustainable, and will ultimately collapse. I also fear that said collapse could be ruinous to big tech, deeply damaging to the startup ecosystem, and will further sour public support for the tech industry.

    Can't blame Zitron for being pretty downbeat in this - given the AI bubble's size and side-effects, it's easy to see how its bursting could have some cataclysmic effects.

    (Shameless self-promo: I ended up writing a bit about the potential aftermath as well)

    32
    Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 9 September 2024

    (Semi-obligatory thanks to @dgerard for starting this)

    243
    Ed Zitron on Google's antitrust loss
    www.wheresyoured.at Monopoly Money

    Last week, in the midst of the slow, painful collapse of the generative AI hype cycle, something incredible happened. On Monday, a Federal Judge delivered a crushing ruling in the multi-year-long antitrust case filed against Google by the Department of Justice. In 300-pages of dense legal text, Jud...

    0
    pivot-to-ai.com Humane Ai Pin returns are now outpacing sales

    Humane was founded by former Apple employees Imran Chaudhri and Bethany Bongiorno. They wanted something that would rival the iPhone. The Ai Pin (that’s “Ai”, not “AI”) would take commands by…

    3
    Some Quick and Dirty Thoughts on "The empty brain"
    aeon.co Your brain does not process information and it is not a computer | Aeon Essays

    Your brain does not process information, retrieve knowledge or store memories. In short: your brain is not a computer


    This started as a summary of a random essay Robert Epstein (fuck, that's an unfortunate surname) cooked up back in 2016, and evolved into a diatribe about how the AI bubble affects how we think of human cognition.

    This is probably a bit outside awful's wheelhouse, but hey, this is MoreWrite.

    The TL;DR

    The general article concerns two major metaphors for human intelligence:

    • The information processing (IP) metaphor, which views the brain as some form of computer (implicitly a classical one, though you could probably cram a quantum computer into that metaphor too)
    • The anti-representational metaphor, which views the brain as a living organism, which constantly changes in response to experiences and stimuli, and which contains jack shit in the way of any computer-like components (memory, processors, algorithms, etcetera)

    Epstein's general view is, if the title didn't tip you off, firmly on the anti-rep metaphor's side, dismissing IP as "not even slightly valid" and openly arguing for dumping it straight into the dustbin of history.

    His main piece of evidence for this is a basic experiment, where he has a student draw two images of dollar bills - one from memory, and one with a real dollar bill as reference - and compare the two.

    Unsurprisingly, the image made with a reference blows the image from memory out of the water every time, which Epstein uses to argue against any notion of the image of a dollar bill (or anything else, for that matter) being stored in one's brain like data in a hard drive.

    Instead, he argues that the student re-experienced seeing the bill when drawing it from memory - an ability their brain developed by changing in response to the many dollar bills they'd seen up to that point.

    Another piece of evidence he brings up is a 1995 paper from Science by Michael McBeath regarding baseballers catching fly balls. Where the IP metaphor reportedly suggests the player roughly calculates the ball's flight path with estimates of several variables ("the force of the impact, the angle of the trajectory, that kind of thing"), the anti-rep metaphor (given by McBeath) simply suggests the player catches them by moving in a manner which keeps the ball, home plate and the surroundings in a constant visual relationship with each other.
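    McBeath's account echoes an older observation usually credited to Seville Chapman: for a fielder already standing at the catch point, the tangent of the ball's elevation angle rises at a constant rate, so a player can succeed just by running to cancel any "optical acceleration" - no trajectory math required. A quick sketch of that invariant (the function names and launch numbers are my own, not from either paper):

    ```python
    import math

    G = 9.81  # gravitational acceleration, m/s^2

    def tan_elevation(t, observer_x, vx, vz):
        """Tangent of the ball's elevation angle, as seen by a stationary observer."""
        ball_x = vx * t
        ball_z = vz * t - 0.5 * G * t * t
        return ball_z / (observer_x - ball_x)

    def max_optical_acceleration(observer_x, vx, vz, steps=40):
        """Largest discrete second difference of tan(elevation) over the flight."""
        flight_time = 2 * vz / G
        # Sample up to 95% of the flight to dodge the 0/0 at the catch moment.
        ts = [0.95 * flight_time * i / steps for i in range(steps + 1)]
        tans = [tan_elevation(t, observer_x, vx, vz) for t in ts]
        return max(abs(tans[i + 1] - 2 * tans[i] + tans[i - 1])
                   for i in range(1, steps))

    # Ball launched at 45 degrees, 30 m/s.
    vx = vz = 30.0 / math.sqrt(2)
    landing_x = vx * (2 * vz / G)

    at_catch_point = max_optical_acceleration(landing_x, vx, vz)
    ten_metres_off = max_optical_acceleration(landing_x + 10.0, vx, vz)
    print(at_catch_point, ten_metres_off)
    ```

    Standing at the landing spot, the second differences of tan(elevation) are zero to floating-point precision; ten metres off, they very much aren't - and that difference is the only cue the heuristic needs.
    
    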

    The final piece I could glean from this is a report in Scientific American about the Human Brain Project (HBP), a $1.3 billion project launched by the EU in 2013, made with the goal of simulating the entire human brain on a supercomputer. Said project went on to become a "brain wreck" less than two years in (and eight years before its 2023 deadline) - a "brain wreck" Epstein implicitly blames on the whole thing being guided by the IP metaphor.

    Said "brain wreck" is a good place to cap this section off - the essay is something I recommend reading for yourself (even if I do feel its arguments aren't particularly strong), and it's not really the main focus of this little ramblefest. Anyways, onto my personal thoughts.

    Some Personal Thoughts

    Personally, I suspect the AI bubble's made the public a lot less receptive to the IP metaphor these days, for a few reasons:

    1. Artificial Idiocy

    The entire bubble was sold as a path to computers with human-like, if not godlike intelligence - artificial thinkers smarter than the best human geniuses, art generators better than the best human virtuosos, et cetera. Hell, the AIs at the centre of this bubble are running on neural networks, whose functioning is based on our current understanding of how the brain works. [Missed this incomplete sentence first time around :P]

    What we instead got was Google telling us to eat rocks and put glue on pizza, chatbots hallucinating everything under the fucking sun, and art generators drowning the entire fucking internet in pure unfiltered slop, identifiable by the uniquely AI-like errors it makes. And all whilst burning through truly unholy amounts of power and receiving frankly embarrassing levels of hype in the process.

    (Quick sidenote: Even a local model running on some rando's GPU is a power-hog compared to what it's trying to imitate - digging around online indicates the human brain does what it does on roughly 20 watts of power.)

    With the parade of artificial stupidity the bubble's given us, I wouldn't fault anyone for coming to believe the brain isn't like a computer at all.

    2. Inhuman Learning

    Additionally, AI bros have repeatedly and incessantly claimed that AIs are creative and that they learn like humans, usually in response to complaints about the Biblical amounts of art stolen for AI datasets.

    Said claims are, of course, flat-out bullshit - last I checked, human artists need only a few references to produce something good and original, whilst your average LLM will produce nothing but slop no matter how many terabytes upon terabytes of data you shovel into its training set.

    This all arguably falls under the "Artificial Idiocy" heading, but it felt necessary to point out - these things lack the creativity or learning capabilities of humans, and I wouldn't blame anyone for taking that to mean that brains are uniquely unlike computers.

    3. Eau de Tech Asshole

    Given how much public resentment the AI bubble has built towards the tech industry (which I covered in my previous post), my gut instinct's telling me that the IP metaphor is also starting to be viewed in a harsher, more "tech asshole-ish" light - not merely a reductive/incorrect view of human cognition, but a sign you put tech over human lives, or don't see other people as human.

    Of course, the AI industry providing a general parade of the absolute worst scumbaggery we know (with Mira Murati being an anti-artist scumbag and Sam Altman being a general creep as the biggest examples) is probably helping that impression along, alongside all the active attempts by AI bros to mimic real artists (exhibit A, exhibit B).

    33
    BlueMonday1984 @awful.systems