You can hardly get online these days without hearing some AI booster talk about how AI coding is going to replace human programmers. AI code is absolutely up to production quality! Also, you’re all…
The general comments that Ben received were that experienced developers can use AI for coding with positive results because they know what they’re doing. But AI coding gives awful results when it’s used by an inexperienced developer. Which is what we knew already.
That should be a big warning sign that the next generation of developers is not going to be very good. If they're waist-deep in AI slop, they're only going to learn how to deal with AI slop.
As a non-programmer, I have zero understanding of the code and the analysis and fully rely on AI and even reviewed that AI analysis with a different AI to get the best possible solution (which was not good enough in this case).
What I'm feeling after reading that must be what artists feel like when AI slop proponents tell them "we're making art accessible".
Watched a junior dev present some data operations recently. Instead of just showing the SQL that worked, they copy-pasted a prompt into the data platform's assistant chat. The SQL it generated was invalid, so the dev simply told it "fix" and it made the query valid, much to everyone's amusement.
The actual column names did not reflect the output they were mapped to; there was no way the nicely formatted results were accurate. The average-duration column fed the total-count output. The junior dev was cheerfully oblivious: it produced output shaped like the goal, so it must have been right.
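That failure mode is easy to reproduce: a query can be perfectly valid, return nicely formatted results, and still map the wrong aggregate to a column label. A minimal sketch (the `jobs` table and column names are invented for illustration):

```python
import sqlite3

# Invented example data standing in for the demo's data platform.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE jobs (id INTEGER, duration_ms INTEGER)")
conn.executemany("INSERT INTO jobs VALUES (?, ?)", [(1, 100), (2, 300)])

# Syntactically valid, nicely labelled, and wrong: the average duration
# lands in the column labelled total_count.
mislabeled = conn.execute(
    "SELECT AVG(duration_ms) AS total_count FROM jobs"
).fetchone()[0]

# What the labels should actually map to.
total_count = conn.execute("SELECT COUNT(*) FROM jobs").fetchone()[0]
avg_duration = conn.execute("SELECT AVG(duration_ms) FROM jobs").fetchone()[0]

print(mislabeled)    # 200.0 -- an average, despite the column name
print(total_count)   # 2
print(avg_duration)  # 200.0
```

The mislabeled query runs without error and the output looks plausible, which is exactly why it can sail through a demo unchallenged.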
In so many ways, LLMs are just the tip of the iceberg of bad ideology in software development. There have always been people who come into the field and develop heinously bad habits. Whether it's the "this is just my job, the only thing I think about outside work is my family" types or the juniors who only know how to copy-paste snippets from web forums.
And look, I get it. I don't think 60-80 hour weeks are required to be successful. But I'm talking about people who are actively hostile to their own career paths, who seem to hate programming except that it pays well and lets them raise families. Hot take: that sucks. People selfishly obsessed with their own lineage and utterly incurious about the world or the thing they spend 8 hours a day doing suck, and they're bad for society.
The juniors are less of a drain on civilization because they at least can learn to do better. Or they used to be able to, because as another reply mentioned, there's no path from LLM slop to being a good developer. Not without the intervention of a more experienced dev to tell them what's wrong with the LLM output.
It takes all the joy out of the job too, something they've been working at for years. What makes this work interesting is understanding people's problems, working out the best way to model them, and building towards solutions. What they want the job to be is a slop factory: same as the dream of every rich asshole who thinks having half an idea is the same as working for years to fully realize an idea in all its complexity and wonder.
They never have any respect for the work that takes because they've never done any work. And the next generation of implementers is being taught that there are no new ideas. You just ask the oracle to give you the answer.
What people want when they say "AI is making art accessible" is high-quality professional art for dirt cheap.
...and what their opposition means when they oppose it is "this line of work was supposed to be totally immune to automation, and I'm mad that it turns out not to be."
There is already a lot of automation out there, and more is better, when used correctly. And that's before we even talk about the outright theft of material from the very artists it is trying so badly to replace.
I think they also want recognition/credit for spending 5 minutes (or less) typing some words at an image generator, as if that were comparable to people who develop technical skills and then create effortful, meaningful work, just because the outputs are (superficially) similar.
I dunno. I feel like the programmers who came before me could say the same thing about IDEs, Stack Overflow, and high level programming languages. Assembly looks like gobbledygook to me and they tell me I'm a Senior Dev.
If someone uses ChatGPT like I use Stack Overflow, I'm not worried. We've been stealing code from each other since the beginning. "Getting the answer" and then having to figure out how to plug it into the rest of the code is pretty much what we do.
There isn't really a direct path from an LLM to a good programmer. You can get good snippets, but "ChatGPT, build me an app" will be largely useless. The programmers who come after me will have to understand how their code works just as much as I do.
All the newbs were just copying lines from Stack Exchange before AI. The only real difference at this point is that the commenting is marginally better.