Can confirm. Was quite unhappy in my mechanical engineering job, had an opportunity to develop something nice in Python, was told we'd do it in Excel/VBA instead, still unhappy.
> was told we'd do it in Excel/VBA instead, still unhappy.
I just threw up in my mouth a little. Fifteen years ago, "I'll stick to Excel" was a (bad, but) defensible position in data automation. Today that's just insanity.
Nice. You can put that on your resume so you can get more of those kinds of jobs.
(/s. I like Excel up to a point, but I really feel your pain too -- and fuck VBA)
Every job lately seems to have been infected by Meta/Google “data driven” leadership. It's so painful and wasteful sometimes.
It's cargo cult mentality. They look at FANGs and see them as success stories, and thus they try to be successful by mimicking visible aspects of FANG's way of doing things, regardless of having the same context or even making sense.
I once interviewed for a big name non-FANG web-scale service provider whose recruiter bragged about their 7-round interview process. When I asked why on earth they need 7 rounds of interviews, the recruiter said they optimized the process down from the 12 rounds of interviews they did in the past, and they do it because that's what FANGs do. Except FANGs do typically 4, with the last being an on-site.
Yeah. I, like most leaders, spent some time learning all that crap. It was awful and worse than useless.
Google and Meta's secrets are recruiting top talent for top dollars, and then buying every startup that threatens their empire. There are no secrets to great management to be had there.
I just threw out my copy of "product engineering at Google".
Will AI steal their jobs? 70% of professional programmers don’t see artificial intelligence as a threat to their work.
If your job can be replaced with GPT, you had a bullshit job to begin with.
What so many people don't understand is that writing code is only a small part of the job. Figuring out what code to write is where most of the effort goes. That, and massaging the egos of management/the C-suite if you're a senior.
I'm an accountant. Components of the job have been automated or systemised for many decades. Most of the tasks that occupied a graduate when I was one 20 years ago don't exist anymore.
Not because AI is doing those tasks but just because everything became more integrated, we configure and manage the flow of data rather than making the data, you might say.
If you had to hire 100 professional programmers in the past, but then AI makes programmers 10% more efficient than previously, then you can do the same work with 91 programmers.
That doesn't mean that 9 people were doing something that an LLM can do, it just means that more work is being completed with fewer programmers.
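The arithmetic in the comment above can be sketched quickly (a minimal illustration, not anyone's actual staffing model; the function name is made up). A 10% efficiency gain means each person produces 1.10 units of the old output, so matching the old total takes 100 / 1.10 ≈ 90.9, rounded up to 91 people:

```python
import math

def headcount_needed(original: int, efficiency_gain: float) -> int:
    """Headcount required to match the original team's output after a
    fractional per-person efficiency gain (e.g. 0.10 for 10%).

    Each worker now does (1 + gain) units of the old work, so we need
    original / (1 + gain) workers, rounded up to a whole person.
    """
    return math.ceil(original / (1 + efficiency_gain))

print(headcount_needed(100, 0.10))  # 91
```

Note the savings are sublinear: a 10% efficiency gain cuts headcount by only about 9%, which is why the comment says 91 rather than 90.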
> If you had to hire 100 professional programmers in the past, but then AI makes programmers 10% more efficient than previously, then you can do the same work with 91 programmers.
You've nailed the root of the misunderstanding by non-programmers. We're already optimized past that target.
Some people think we type all day. We don't. We stare at our screens saying "what the fuck?!" for most of the day. That's especially true for the best programmers doing really interesting work.
There are maybe three living humans who actually know how to correctly build a Windows installer. One of those three is paid to sell software to automate the task for everyone else. The other two retired already. (One is hiding out as a bartender and claims to not speak any English if recognized from their MSI days.)
Pick an interesting topic in programming, and you'll find similarly ludicrous optimization.
There's a few hundred programmers building all banking automation, selling it to millions of bank employees.
It's possible that AI will force a dozen people to stop doing banking automation. It's a lot more likely that the backlog of unmet banking automation need will instead just get very slightly smaller.
Now, the reality of the economics won't stop CIOs from laying off staff and betting that AI will magically expand to fill the gap. We're seeing that now. That's called the "fuck around" phase.
But we've seen "this revolutionary technology will make us not need more programmers" before (several times). The outcomes, when the dust settles, are:
The job is now genuinely easier to do, at least for beginners. (Senior professionals had access to equivalent solutions, before everyone else got excited.)
More people are now programmers. (We laid a bunch of them off, and we meant to not hire any back, but it turned out that our backlog of cool/revolutionary/necessary ideas was more important to leadership than pinching pennies.)
A lot of work that was previously ignored completely now gets done, but done very badly by brand new programmers. (We asked the senior developers to do it, but they said "Fuck you, that's not important, make the new kid do it." I think they're just still cranky that we spent three years laying off staff instead of training...)
The average quality of all software is now a bit worse, but there's a lot more variety of (worse) software now available.
To add on to this, this doesn't necessarily mean that there are fewer programming jobs in total. If people work 10% more efficiently, the cost of labor is only 91% of what it was before, meaning that people might be able to afford to finance more programming projects. What does matter is, for example, things like entry-level jobs disappearing, or the nature of the work changing. Doing less boring gruntwork can make the job more fun, but OTOH digitization sometimes leaves the worker with less agency over what they do, since they have to fit everything into a possibly inflexible digital system.
But that is always happening. Software that can now be built by two programmers needed IBM a few decades ago, just because of hardware, languages, available libraries, and shared knowledge.
But we still have so many "app ideas" that there is more work to be done. I would be happy to have AI write all those apps that I need but have no time or money to build.
My conclusion is that it is only about money and the economy. We are in an unofficial recession, so everyone is cutting costs; as soon as money comes back, we will go back into a bulking/exploration phase.
If all you bring to the job is looking shit up and telling me yes or no, instead of actually trying to help me find solutions or explaining to me what I did wrong, you're just a glorified robot. You're in line for replacement and you'll fucking deserve it.
At least that's what I wanna say to "the computer said" people.
There's a lot of management going "we gotta hit this deadline (that we made up)", combined with "if I hit all my targets and put in some overtime, the boss can buy another sports car this year".
I don't want to work extra to make someone else richer. Maybe if I had a shitload of shares. Maybe. But I don't. So I do my job to professional standards, but I'm not doing 12-hour days.
Maybe it is just my experience, but in the last decade, employers stopped trying to recruit and retain top developers.
I have been a full time software engineer for more than a decade. In the 2010s, the mindset at tech giants seemed to be that they had to hire the best developers and do everything they could to keep them. The easiest way to do both was to be the best employer around. For example, Google had 20% time, many companies offered paid sabbaticals after so many years, and every office had catering once a week (if not a free cafeteria). That way, employees would be telling all of their friends how great it is to work for you and if they decide to look for other work, they would have to give up their cushy benefits.
Then, a few years before the pandemic, my employer switched to a different health insurance company and got the expected wave of complaints (the price of this drug went up, my doctor is not covered). HR responded with "our benefits package is above industry averages". That is a refrain I have been hearing since, even after switching employers. The company is not trying to be the best employer that everyone wants to work at, they just want to be above average. They are saying "go ahead and look for another employer, but they are probably going to be just as bad".
Obviously, this is just my view, so it is very possible that I have just been unlucky with my employers.
I've kinda checked out of the private sector for this reason. I've been having a great time working for a government job. Great benefits, union, etc... pay is about 80 percent of what others make but it's more than enough to get by.
> In the 2010s, the mindset at tech giants seemed to be that they had to hire the best developers and do everything they could to keep them.
Not really. The mindset was actually to hire skilled developers just to dry up the market, so that their competitors would not have skilled labour to develop their own competing products and services.
Then the economy started to take a turn for the worse, and these same companies noted that not only could they not afford to block their competitors from hiring people, but neither could their competitors. Twice the reasons to shed headcount.
It was not a coincidence that we saw all FANGs shed people at around the same time.
This is the first rule of sales. It is not important or necessary to be the best. It is only necessary to be slightly less shitty than your nearest competitor.
The thing that frustrates me about developers who feel powerless over technical debt is... who is actually stopping them from dealing with it? The way I see it, as a software engineer, your customer is sales/marketing/product/etc. They don't care about the details or maintenance, they just want the thing. And that's okay. But you have to include the cost of managing technical debt in the line items the customer wants. That is, estimate based on doing the right things, not taking shortcuts. Your customer isn't reading your commits. If they were, they wouldn't need you.
It would be bizarre if your quote for getting your house siding redone included line items for changing the oil on the work truck, organizing the shop, or training new crew members. But those costs of business are already factored into what you pay at the end of the day.
Yes, this. Refactor first to make the upcoming change easier and cleaner, not after. Don’t ask for permission, don’t even call it refactoring or cleanup. Just call it working on the feature, because that’s what it is. Don’t let non-engineers tell you how to engineer.
> who is actually stopping them from dealing with it?
Management. Someone in management sets idiotic deadlines, then someone tells you “do X”, you estimate and come up with “it will take T amount of time” and production simply tells you “that’s too long, do it faster”
> they don’t care about the details or maintenance
They don’t, they care about time. If there are 6 weeks to implement a feature that requires reworking half the product, they don’t care to know half the product needs to be reworked. They only care to hear you say that you’ll get it done in 6 weeks. And if you say that’s impossible, they tell you to do it anyway
> you have to include the cost of managing technical debt
I do, and when I get asked why my time estimates are so long compared to those of other colleagues, I say that I include the known costs required to develop the feature, plus a buffer for known unknowns and unknown unknowns. Historically, that buffer has been necessary 100% of the time and never included; leaving it out caused development difficulties, and running over cost and over time caused delays and quality issues, internal unhappiness, sometimes mandatory overtime, and usually a crappy product that the customers were unhappy with. That's me doing a good job, right? Except I got told to ignore all of that and only include the minimum time to get all of the dozens of tiny pieces working. We went over time and over cost, and each tiny piece "works" when taken in isolation but doesn't really mix with everything else, because there was no integration time, so each feature just kind of exists there on its own.
Then we do retrospectives in which we highlight all the process mistakes that we ran into only to do them all again next time. And I get blamed come performance review time because I was stressed and I wasn’t at the top of my game in the last year due to being chronically overburdened, overworked, and underpaid.
Yeah, management is totally backwards there; it's like the building manager on a construction project going "all electrical needs to be done in X weeks", when realistically they have no direct control over that deadline being met just by declaring it. The unfortunate difference is that if you do a shitty job wiring a building, you'll fail inspection and have to spend more time and money fixing it. Software can often hobble along; there's no strict quality enforcement that the business can't legally ignore, so you'll always have sad, defeated devs going "okay boss, we'll skip the things we need to get this done faster for you (I hate this job and don't care about the product's long-term success)". Having a steady supply of those people will slowly kill a software company.
In the past, I've dealt with estimate pushback not by explaining what necessary work can be removed, like tests, documentation, or refactoring, but by talking through ways to divide the project more effectively to get more people involved (up to a point, a la mythical man month). That seems to go over more productively. Then we look at nixing optional requirements. But I've also usually dealt with mostly competent engineering management.
I believe at many companies, developers work on giant codebases with many hundreds of thousands or even millions of lines of code.
With such a large codebase, you have no control over any one system, because control is split between groups of devs.
If you want to refactor a single subsystem, it would take coordination across all the groups working on that part and would halt development, probably for months. But first you have to convince all the management people that a refactor is needed, which by itself could take an eternity.
So instead you patch it on your end and call it a day.
> So instead you patch it on your end and call it a day.
Yep!
I'm looking forward to the horror stories that emerge once some percentage of those changes are made solely by unmanaged hallucination-prone AI.
I would feel bad for the developers who have to clean up the mess, but honestly, it's their chance to make $$$$$$ off of that cleanup. If they don't manage to, their union is completely incompetent.
The bloody managers are the biggest problem. Most don't understand code, much less the process of making a software product. They force you into idiotic meetings where they want to change how things work because they "don't have visibility into the process", which just translates to "I don't understand what you're doing".
Also trying to force people who love machines but people less so into leading people is a recipe for unhappiness.
But at least the bozos at the top get to make the decisions and the cheddar for being ignorant and not listening.
> The bloody managers are the biggest problem. Most don’t understand code much less the process of making a software product.
So, I've had my eye on management and started doing some management training. The job of management really isn't to do the work itself (or even to understand the work). That's the job of specialists and technical leads. The job of management is to oversee the workforce (hiring, organizing teams, dictating process, allocating project time, planning mid and long term department goals, etc) not to actually get your hands into the work itself.
It's certainly helpful to understand coding broadly speaking. But I'm in an office where we're supporting dozens of apps, written in and interfaced with at least as many languages. Never mind all the schemas within those languages. There's no way a manager could actually do my job without months (if not years) of experience in the project itself.
At the same time, the managers should understand the process of coding, particularly if they're at the lower tier and overseeing an actual release cycle. What causes me to pull my hair out is managers who think hand-deploying .dlls and fixing user errors with SQL scripts is normal developer behavior and not desperate shit you do when your normal workflows have failed.
Being in a perpetual state of damage control and thinking that this is normal because you inherited from the last manager is the nightmare.
> But at least the bozos at the top get to make the decisions and the cheddar for being ignorant and not listening.
Identifying and integrating new technologies is normal and good managerial behavior.
Getting fleeced by another round of over-hyped fly-by-night con artists time after time after time is not as much.
But AI seems to thread the needle. It's sophisticated and helpful enough to seem useful on superficial analysis. You only really start realizing you've been hoodwinked after you try to integrate it.
Setting aside the absurd executive-level pay (every fucking corporate enterprise is just an MLM that's managed to stay cash positive), it does feel like the problem with AI is that each business is forced to learn the lesson the hard way, because no business journal or news channel wants to admit that it's all shit.