Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 22 September 2024
Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.
Any awful.systems sub may be subsneered in this subthread, techtakes or no.
If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.
The post-Xitter web has spawned so many “esoteric” right-wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).
Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.
Our DSO has now greenlit the stupid Copilot integration because "Microsoft said it's okay" (of course they did), and he was also at some stupid AI convention yesterday, and whatever fucking happened there, he's become a complete AI bro and is now preaching the Gospel of Altman: everyone who's not using AI will be obsolete in a few years and we need to ADAPT OR DIE. It's the exact same shit the CEO is spewing.
He wants an AI that handles data security breaches by itself. He also now writes emails with ChatGPT, even though just a week ago he was hating on people who did that. I sat with my fucking mouth open in that meeting and people asked me whether I was okay (I'm not).
I need to get another job ASAP or I will go clinically insane.
I’m so sorry. the tech industry is shockingly good at finding people who are susceptible to conversion like your CEO and DSO and subjecting them to intense propaganda that unfortunately tends to work. for someone lower in the company like your DSO, that’s a conference where they’ll be subjected to induction techniques cribbed from cults and MLM schemes. I don’t know what they do to the executives — I imagine it involves a variety of expensive favors, high levels of intoxication, and a variant of the same techniques yud used — but it works instantly and produces someone who can’t be convinced they’ve been fed a lie until it ends up indisputably losing them a ton of money
Yeah, I assume that's exactly what happened when the CEO went to Silicon Valley to talk to "important people". Despite previously being on a cost-saving course, he dumped tens of thousands into AI infrastructure that hasn't delivered anything so far, and he's suddenly very happy to send people to AI workshops and conferences.
But I'm only half-surprised. He's somewhat known for making weird decisions after talking to people who want to sell him something. This time it's gonna be totally different, of course.
The "important people" line is a huge part of how the grift works and makes tech media partially responsible. Legitimizing the grift rather than criticizing it makes it easy for sales folks to push "the next big thing." And after all, don't you want to be an important person?
He wants an AI that handles data security breaches by itself. He also now writes emails with ChatGPT
He is the data security breach.
E: Dropped a T. But hey, at least ChatGPT uses SSL to communicate, so the data breach is now constrained to the ChatGPT training data. So it isn't that bad.
I have realized, working at a corporation, that a lot of employees will just mindlessly regurgitate the company message. And not in an "I guess this is what we have to work on" way, but as if it had replaced whatever worldview they had previously.