Know a guy who tried to use AI to vibe code a simple web server. He wasn't a programmer and kept insisting to me that programmers were done for.
After weeks of trying to get the thing to work, he had nothing. He showed me the code, and it was the worst I've ever seen. Dozens of empty files where the AI had apparently added and then deleted the same code. Also some utter garbage code. Tons of functions copied and pasted instead of being defined once.
I then showed him a web app I had made in that same amount of time. It worked perfectly. Never heard anything more about AI from him.
AI isn't ready to replace just about anybody's job, and it probably never will be technically, economically, or legally viable as a replacement.
That said, the C-suite class are certainly going to try. Not only do they dream of optimizing all human workers out of every workforce, they also desperately need to recoup as much as they can of the sunk cost they've collectively dumped into the technology.
Take OpenAI, for example: they lost something like $5,000,000,000 last year and will probably lose even more this year. Their entire business plan relies on at least selling people on the idea that AI will be able to replace human workers. The minute people realize that OpenAI isn't going to conquer the world, and will instead end up as just one of many players in the slop space, the bottom will fall out of the company and the AI bubble will burst.
In all seriousness though, I do worry for the future of juniors. All the things that people criticise LLMs for, juniors do too. But if nobody hires juniors, they will never become seniors.
I had a dude screaming pretty much the same thing at me yesterday on here (on a different account), despite the fact that I'm senior-level and near the top of my field, and that all the objective data, as well as anecdotal reports from tons of other people, says otherwise. Like, okay buddy, sure. People seem to just like fighting things online to feel better about themselves, even if the thing they're fighting doesn't really exist.
AI isn't ready to replace programmers, engineers or IT admins yet. But let's be honest: if some project manager or CTO somewhere hasn't already done it, they're at least planning to.
Then eventually, to save themselves or out of sheer ignorance, they'll blame the resulting chaos on the few remaining people who know what they're doing. They won't be able to admit, or even understand, that the bold decision they took to "embrace" AI and boost the company's bottom line, the one everyone else in their management bubble believed in, is what completely mangled whatever system their company builds or uses. More useful people will get fired and more actual work will get shifted to AI. But because that'll still make the number go up, the management types will look even better and the spread of AI will carry on. Eventually, all systems will become an unwieldy mess nobody can even hope to repair.
And this is just IT; I'm pretty sure most other industries will eventually suffer the same fate. Global supply chains will collapse and we'll all get sent back to the dark ages.
TL;DR: The real problem with AI isn't that it'll become too powerful and choose to kill us, but that corporate morons will overestimate how powerful it already is, and that will cause our eventual downfall.
As an end user with little knowledge about programming, I've seen many times over the years how hard it is for programmers to get things working well. AI as a time saver for certain simple tasks, sure, but no way in hell it'll be replacing humans in my lifetime.
I take issue with the "replacing other industries" part.
I know that this is an unpopular opinion among programmers, but all professions have roles that range from small skill sets and low-level cognitive work to large skill sets and high-level cognitive work.
Generative AI is an incremental improvement in automation. In my industry it might make someone 10% more productive. For any role where it could make someone 20% more productive, that role could have been made more efficient in some other way: training, templates, simple conversion scripts, whatever.
Basically, if someone's job can be replaced by AI then they weren't really producing any value in the first place.
Of course, this means that in a firm with 100 staff, you could get the same output with 91 staff plus Gen AI. So yeah, in that context 9 people might be replaced by AI, but that doesn't tend to be how things go in practice.
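(To spell out the arithmetic behind that 91, assuming the 10% gain applies uniformly: the headcount needed for the same output scales by 1/1.1, and 100 / 1.1 ≈ 90.9, so roughly 91 people.)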
I work in QA; even devs who've worked for 10+ years make dumb mistakes every so often. I wouldn't want to do QA when AI is writing the software, it's just gonna give me even more work 🫠
AI is certainly a very handy tool and has helped me out a lot but anybody who thinks "vibe programming" (i.e. programming from ignorance) is a good idea or will save money is woefully misinformed. Hire good programmers, let them use AI if they like, but trust the programmer's judgement over some AI.
That's because you NEED that experience to notice the AI is outputting garbage. Otherwise it looks superficially okay, but the code is terrible, or fragile, or not even doing what you asked properly. For example, if I asked Gemini to generate a web server with Jetty, it might output something correct, or an unholy mess of Jetty 8, 9, 10, 11 and 12 mixing annotation and programmatic styles, with correct or incorrect pom dependencies.
It's even funnier because the guy is mocking DHH. You know, the creator of Ruby on Rails. Which 37signals obviously uses.
I know from experience that a) Rails is a very junior-developer-friendly framework, yet incredibly powerful, and b) all Rails apps are colossal machines with a lot of moving parts. So when the scared juniors look at the apps for the first time, the senior Rails devs are like, "Eh, don't worry about it, most of the complex stuff happens in the background; the only way to break it is if you genuinely have no idea what you're doing and screw things up on purpose." Which leads to point c) using AI coding on Rails codebases is usually like pulling open the side door of this gargantuan machine and dropping a sack of wrenches into the gears.
English isn’t my first language, so I often use translation services. I feel like using them is a lot like vibe coding — very useful, but still something that needs to be checked by a human.
AI is a tool. Ashish is 100% correct that it may do some things for developers, but its output ultimately still needs to be reviewed by people who know what they're doing. This is closer to the change from punch cards to writing code directly on a computer than to making software developers obsolete.
The way I see it, there are two types of developers we should take into consideration for this discussion:
1. Software engineers
2. Code editors
Most "programmers" these days are really just code editors, they know how to search stack overflow for some useful pointers, copy that code and edit it to what they need. That is absolutely fine, this advances programming in so many ways. But the software engineers are the people that actually answer the stack overflow questions with detailed answers. These engineers have a more advanced skillset in problem solving for specific coding frameworks and languages.
When people say "programmers are cooked", I keep thinking that they mean code editors, not software engineers. This is a similar trend in basically all industries in relation to AI. Yes, AI has the potential to make some jobs in health care obsolete (e.g. radiologists), but that doesn't mean we no longer need surgeons or domain-expert doctors. The same thing applies to programming.
So if you are a developer today, ask yourself the following: do I actually know my stuff well? Am I an expert? If the answer is no and you're basically a code editor (which, again, is fine), then you should seriously consider what AI means for your job.
I once asked ChatGPT to write a simple RK2 algorithm in Python. The function could've been about 3 lines followed by a return statement. It gave me some convoluted code that was 3 functions and about 20 lines. AI still has some time to go before it can handle writing code on its own. I've asked Copilot/ChatGPT several times to write code (just for fun) and it always does this.
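For reference, here's roughly what I mean; a minimal sketch of one RK2 (midpoint method) step, with the function name and the usage example being my own illustration, not ChatGPT's output:

```python
def rk2_step(f, t, y, h):
    # One midpoint-method (RK2) step for the ODE y' = f(t, y)
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    return y + h * k2

# Example: integrate y' = y from t = 0 to t = 1 (exact answer: e ≈ 2.71828)
y, h = 1.0, 0.01
for i in range(100):
    y = rk2_step(lambda t, y: y, i * h, y, h)
print(y)  # ≈ 2.71824
```

Three lines and a return, exactly as advertised.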
I have mixed feelings about that company. They do some interesting things "against the flow", like ditching the cloud and going back to on-prem, hating on microservices, advocating against taking money from VCs, and now hiring juniors. On the other hand, the guy is a Musk fanboy and they push some anti-DEI bullshit. Also, he's a TypeScript hater for some reason...
Hey cool, an AI can program itself as well as a human can now. Think of how this will impact the programmer job market! That's got to be like, the biggest implication of this development.
You're hiring junior programmers for $145k a year? Americans have too much money, I swear. The rest of the world has juniors on less than a third of that if they're in Europe.
I mean honestly… probably. Not yet, but soon. Right now, AI can make lies and shitty code, but it's probably not that far from making OK code. So there is likely going to be a surge in need for highly skilled programmers who can fix trash AI code that is… almost there.