The whole thing has a vaguely ex-catholic vibe where sin is simultaneously the result of evil actions on earth and also something that's inherently part of your soul as a human being because dumb woman ate an apple. As someone who was raised in the church to a degree it never felt unreal and actually resonated pretty hard, but also yeah it doesn't make a lot of sense logically.
They say the unexamined life isn't worth living, but outsourcing the examination to an LLM gives you more time to hustle and grind, maximizing financial returns. That's what they mean, right?
So data lake and data warehouse are different words for the giant databases of business data that you can perform analytics on to understand your deep business lore or whatever. I assume that a data lake house is similar to the other two but poorly maintained and inconvenient to access, but with a very nice UI and a boat dock.
One of the only reasons I'm hesitant to call Rationalism a cult in its own right is that Yudkowsky and friends always seem to respond to this element of cultiness by saying "oh, let me explain our in-group jargon in exhaustive detail so that you can more or less understand what we're trying to say" rather than "you just need to buy our book and attend some meetings and talk to the guru and wear this robe..."
The Alex Jones set makes fighting with satanists trying to seduce you to darkness look real fun and satisfying, but for some reason they only seem to approach high-profile assholes who lie about everything and never ordinary Christians! Thankfully we now have LLMs to fill the gap.
See, what you're describing with your sister is exactly the opposite of what happens with an LLM. Presumably your sister enjoys Big Brother and failed to adequately explain or justify her enjoyment of it to your own mind. But at the start there are two minds trying to meet. Azathoth preys on this assumption; there is no mind to communicate with, only the form of language and the patterns of the millions of minds that made its training data, twisted and melded together to be forced through a series of algebraic sieves. This fetid pink brain-slurry is what gets vomited into your browser when the model evaluates a prompt, not the product of a real mind that is communicating something, no matter how similar it may look when processed into text.
This also matches up with the LLM-induced psychosis that we see, including these spiral/typhoon emoji cultists. Most of the trouble starts when people start asking Azathoth about itself, and the deeper you peer into its not-soul the more inexorably trapped you become in the hall of broken funhouse mirrors.
Given the amount of power some folks want to invest in them it may not be totally absurd to raise the spectre of Azathoth, the blind idiot God. A shapeless congeries of matrices and tables sending forth roiling tendrils of linear algebra to vomit forth things that look like reasonable responses but in some unmistakeable but undefinable way are not. Hell, the people who seem most inclined to delve deeply into their forbidden depths are as likely as not to go mad and be unable to share their discoveries if indeed they retain speech at all. And of course most of them are deeply racist.
Not gonna lie, I didn't entirely get it either until someone pointed me at a relevant xkcd that I had missed.
Also I was somewhat disappointed in the QAA team's credulity towards the AI hype, but their latest episode was an interview with the writer of that "AGI as conspiracy theory" piece from last(?) week and seemed much more grounded.
Also, were you ever actually "asked to leave" as such or did you just start to recognize the bullshit and show yourself out? Or did you, as seems to be the more common trail to sneerclub, drop offline for unrelated reasons and circle back some time later to realize you were no longer 15?
Try to prevent "slums" forming where people who don't meet your group's standards congregate (these generally get more likely the longer you wait to kick people out)
I think that this nets all of us here at sneerclub an honorable mention on the list. Good job, everyone!
It's a losing proposition either way, right? Without investor money OpenAI (and friends) can't keep buying chips from Nvidia, but SoftBank doesn't have the money to keep giving OpenAI without selling off their stake in Nvidia. Yes, it's incredibly dumb to ditch their share of the only part of this that's actually making money, but if they don't keep the cash flowing then nobody is going to be making money. Like, greatly oversimplifying here, obviously, but it seems like SoftBank loses either way.
It's almost like this whole bubble is really bad actually.
It's worth noting how much the whole "agentic" marketing scheme is the opposite of this reality, too. Because after all, the dream they're selling is being able to do the Star Trek thing and just tell your computer to do it in plain English. But if that were what these companies were actually doing it would be very easy to migrate away if you wanted to, since you could just say "send me all our data in a format that $Competitor can easily onboard. I'm done with this shit" and then give the competitor's system the same plain English prompt. The reality is that they don't actually want to build the thing they're advertising, even if they could, because their whole business model is to make interacting with the computer as high-friction as possible so you'll pay them to do it for you.
New site looks good! I think Let's Encrypt is still the easiest and cheapest way to set up a decent cert, but I've been away from IT for over a year now and someone else here can probably help point you in the right direction. At least for now the site probably doesn't actually have security concerns a cert would address, but it pops up a browser alert on first hit, so it's probably a good idea?
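For reference, and just guessing that it's a bog-standard setup (certbot installed, nginx serving the domain directly, no CDN in front, domain name purely hypothetical), the usual incantation is something like:

```
# hypothetical example: assumes certbot is installed and nginx is serving the domain
sudo certbot --nginx -d example.com -d www.example.com
```

On most distros the certbot package also sets up automatic renewal for you, so once it's working it's basically fire-and-forget.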
Also I just started listening to the latest episode while writing this up and had forgotten how great that opening medley is.
I remember back before I realized just how full of shit Siskind was, I used to buy into some of the narrative re: "credentialism", so I understand the way they're trying to sell it here. But even extending far more benefit than mere doubt can justify, we're still looking at yet another case of trying to create a (pseudo)scientific solution to a socially constructed problem. Like, if the problem is that bosses and owners are trying to find the best candidate, we don't need new and exciting ways to discriminate; they could just actually invest in a process for doing that. But actually solving that problem would inconvenience the owning/managing classes and doesn't create opportunities to further entrench racial biases in the system. Clearly using an AI-powered version of the punchline for "how racist were the old times" commentary is better.
I would assume nothing stops them, but I would love to get an analysis of why they may not be looking into this from someone who actually knows what they're talking about. Best I can come up with from a complete layman's perspective is that they're concerned about the valuation they'd end up with. Not sure if the retail market has enough juice to actually pay for a company that is hypothetically one of the most valuable companies in the world, and puncturing that narrative might bring the whole bubble down (in a way that costs a lot of normal investors their shirts, of course).
Between this and the IPO talk it seems like we're looking at some combination of trying to feel out exit strategies for the bubble they've created, trying to say whatever stuff keeps the "OpenAI is really big" narrative in the headlines, and good old fashioned business idiocy.
I feel like the private leaderboards are also more in keeping with the spirit of the thing. You can't really have a friendly competition with a legion of complete strangers that you have no interaction with outside of comparing final times. Even when there's nothing on the line, the consequences for cheating or being a dick are nonexistent, whereas in a private group you have to deal with all your friends knowing you're an asshole going forward.