A (human) coder should definitely be able to think about a novel problem and come up with an algorithm to solve it without copying the algorithm from someone else, particularly in circumstances where there is no option but to come up with a novel solution.
I took it more as a (very common among coders) joke about how writing code is actually just googling and copying code from Stack Overflow. (Of course it's exaggerated; the fact that it's not completely true is part of what makes it funny.)
The good coders are good because they know how to copy and paste. It’s the same as “googling it”. We’re good. We ain’t got nothing to prove to no one. And we know where all the bodies are buried.
The difference between me copying code and ChatGPT doing it is that I can understand what the code is doing, rather than just matching a statistical association between strings of words.
In that case the only people who program are those who develop new programming languages from the ground up. And maybe not even them, since they learned how to do it somewhere.
AI is just the next level of abstraction. First there was paper tape, then assembly, then C, then C++ and the higher-level OOP languages, then JavaScript, and now finally this: natural language. It's the next logical step. And I'm sure that at each previous milestone people were having arguments much the same as this one.
Thing is, the bar to get into development has never been lower - and yet you still need to understand both what you are asking the LLM to produce and, even more importantly, the output it actually produces. That second part is, in my opinion, the most likely to blow up in people's faces.
Don't come crying when the mission-critical finance app vibe-coded by your MBA suddenly starts erroring out at 3am every second Saturday because your LLM decided to hallucinate a magic number somewhere in your codebase.
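To make that failure mode concrete, here's a minimal sketch of the kind of bug being described, assuming a plain Python interest calculation; every name and number is invented for illustration:

    # Hypothetical sketch: a hallucinated constant that looks plausible,
    # runs fine, and is simply wrong for the business. Nothing here comes
    # from a real codebase.
    def daily_interest(principal: float, annual_rate: float) -> float:
        # An LLM might plausibly emit 360 (a common banking day-count
        # convention) where the business expects 365. Spot checks on round
        # numbers look fine, and the discrepancy only surfaces later in a
        # reconciliation batch job, at 3am, naturally.
        DAYS_PER_YEAR = 360  # the hallucinated "magic number"
        return principal * annual_rate / DAYS_PER_YEAR

    print(daily_interest(10_000.00, 0.05))  # 1.3888..., not the 1.3698...
                                            # a 365-day year would give

The point isn't this specific bug; it's that a constant like this sails straight past a reviewer who can't actually read the output.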