AI Bubble Side Effect
Is there a better algorithm to generate code? Hmm, maybe theoretically, but it certainly doesn't exist now.
I don't know if there is a better algorithm, but there is definitely a lot of prior work in this field:
https://en.wikipedia.org/wiki/Comparison_of_code_generation_tools
Yes, smart, precise helper tools and code analysis are what we want, not some mumbo jumbo.
I'm honestly not that certain that there's a better algo for NLP. LLMs do suck at most of the things we try to make them do, but the NLP part has been consistently great.
People should look at something like Inform 7 and everything that has gone into making software understand natural speech. Even a basic LambdaMOO parser takes a while to set up.
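For a sense of how much hand-wiring that takes, here's a minimal sketch of a MOO-style verb/object command parser; the verb table, synonyms, and handler names are made-up examples, and a real parser needs far more of this:

```python
# Toy MOO-style command parser: split a typed command into verb, direct
# object, preposition, and indirect object, then look the verb up in a
# hand-built table. Every verb, synonym, and preposition has to be wired
# up by hand, which is the point: rule-based "understanding" is all setup.

PREPOSITIONS = {"with", "in", "on", "to", "at", "from"}

VERBS = {  # made-up verb table mapping words to handler names
    "get": "do_take", "take": "do_take", "grab": "do_take",
    "drop": "do_drop", "look": "do_look", "examine": "do_look",
    "put": "do_put",
}

def parse(command: str):
    words = command.lower().split()
    if not words:
        return None
    verb, rest = words[0], words[1:]
    if verb not in VERBS:
        return None  # "I don't understand that."
    # Split the rest of the sentence at the first preposition.
    dobj, prep, iobj = [], None, []
    target = dobj
    for w in rest:
        if prep is None and w in PREPOSITIONS:
            prep = w
            target = iobj
        else:
            target.append(w)
    return {
        "handler": VERBS[verb],
        "dobj": " ".join(dobj) or None,
        "prep": prep,
        "iobj": " ".join(iobj) or None,
    }

if __name__ == "__main__":
    print(parse("put brass lantern in sack"))
    # {'handler': 'do_put', 'dobj': 'brass lantern', 'prep': 'in', 'iobj': 'sack'}
```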
Now it's free to map basic language to functions. My 4B models use less RAM/CPU than watching a 720p music video. It sucks that people overhype AI, because transformers are really cool.
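As a rough illustration of what "map basic language to functions" looks like: a minimal sketch that asks a small local instruct model, loaded through the Hugging Face text-generation pipeline, to pick a function name for a request. The model path and the function registry are placeholders I made up; swap in whatever ~4B model you actually run.

```python
# Minimal sketch: route a natural-language request to one of a few known
# functions with a small local instruct model. Model path and function
# registry are placeholders, not real endpoints.
from transformers import pipeline

FUNCTIONS = {
    "set_timer": "start a countdown timer",
    "play_music": "play a song or playlist",
    "get_weather": "report the current weather",
}

generate = pipeline(
    "text-generation",
    model="path/to/your-4b-instruct-model",  # placeholder model path
    device_map="auto",
)

def route(utterance: str) -> str:
    menu = "\n".join(f"- {name}: {desc}" for name, desc in FUNCTIONS.items())
    prompt = (
        "Pick the single best function for the user's request.\n"
        f"Functions:\n{menu}\n"
        f"Request: {utterance}\n"
        "Answer with only the function name.\nFunction:"
    )
    # The pipeline returns the prompt plus the completion by default.
    raw = generate(prompt, max_new_tokens=8, do_sample=False)[0]["generated_text"]
    tokens = raw[len(prompt):].split()
    guess = tokens[0].strip(".,") if tokens else ""
    # Fall back if the model wanders off the menu.
    return guess if guess in FUNCTIONS else "unknown"

print(route("can you put on some jazz while I cook?"))  # expected: play_music
```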
The bubble is already starting to burst, or at the very least to deflate. I'm currently at a technical conference where one of the main topics is AI. About half the sentiment in the talks is about shortcomings, challenges, and alternatives to LLMs in pretty much every area. Devs and execs alike have noticed that the advancement of LLMs has stalled and that the promised capabilities won't be achieved anytime soon.
Out of curiosity what were they trying to do with LLMs that they're now trying to do with something else?
I've noticed a slowdown in the adoption of functional programming concepts (monads, pattern matching, option/result, immutability, pipes) in favour of shitty vibe-coded Python.
🙄 corpos not talking about real tech doesn't mean people aren't developing real tech
they just aren't the five people who own English-speaking newspapers
For "boilerplate code generation", you can just write a script, use a language with metaprogramming capabilities (even the C preprocessor is capable of it to a llimited degree), etc.
LLMs are like Excel. They're not great at most things, but they can handle most things you throw at them.
Excel is like a hammer. As long as you use it correctly, it will reliably pound nails, every time.
LLMs are like a golden retriever. It might bring back the ball most of the time, but sometimes it might just take a shit in front of you.
Bro just reinvented opportunity cost
LLMs are spell checkers on steroids. Mathy-maths aren't much better.
LLMs/AI can do anything. We just need to keep training them. It's not like their effectiveness has started to plateau or the feedback loop of AI slop is poisoning the training data. You just don't understand. AI can't fail. We need to light more money on fire. Don't worry about the smoke, the electricity required to run the LLMs makes that carbon little more than a drop in the bucket. Plus, AI can solve the climate crisis - we just need more of your data to train the model. It can do anything. It'll pay off in the end - you'll see. When it does, we can make boatloads of money by firing all of the workers who ultimately make up our customer base and buy our products. Don't you understand? It can't fail. More money. More training data. It has to work.