Op-ed: Slowed manufacturing advancements are upending the way tech progresses.
"These price increases have multiple intertwining causes, some direct and some less so: inflation, pandemic-era supply crunches, the unpredictable trade policies of the Trump administration, and a gradual shift among console makers away from selling hardware at a loss or breaking even in the hopes that game sales will subsidize the hardware. And you never want to rule out good old shareholder-prioritizing corporate greed.
But one major factor, both in the price increases and in the reduction in drastic “slim”-style redesigns, is technical: the death of Moore’s Law and a noticeable slowdown in the rate at which processors and graphics chips can improve."
It's not only that chips aren't improving the way they used to; it's that the die can't shrink any more.
Price cuts and “slim” models used to be possible thanks to die shrinks. A console might have launched on a 90nm process, and then a process improvement comes along that lets it be made on 65nm: a ~0.7x linear shrink that roughly doubles the number of chips per wafer while cutting power draw and heat generation. That's what enabled smaller, cheaper revisions.
Now that the current ones are already on like 4nm, there’s just nowhere to shrink to.
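Rough math on why a die shrink was such a gift (a minimal sketch; the die size and the ~0.7x scaling factor are illustrative, and real yields depend on edge loss and defects):

```python
import math

WAFER_DIAMETER_MM = 300  # standard wafer size

def dies_per_wafer(die_area_mm2: float) -> int:
    # Crude upper bound: wafer area / die area, ignoring edge loss and defects.
    wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2
    return int(wafer_area // die_area_mm2)

old_die_mm2 = 300.0                    # hypothetical console chip on the old node
shrunk_die_mm2 = old_die_mm2 * 0.7**2  # full node shrink: ~0.7x linear, so ~0.5x area

print(dies_per_wafer(old_die_mm2))     # 235 dies
print(dies_per_wafer(shrunk_die_mm2))  # 480 dies -- roughly 2x as many chips per wafer
```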
Not to mention that even when some structures do shrink, the scaling isn't uniform across everything on the chip, so you can't just do a 1:1 layout shrink like in the past; you pretty much have to redo the physical design from scratch with a new layout and new timings (which then cascade into many other required changes).
Porting to a new process node (even at the same foundry company) isn't quite as much work as a new project, but it's close.
Same thing applies to changing to a new foundry company, for all of those wondering why chip designers don't just switch some production from TSMC to Samsung or Intel since TSMC's capacity is sold out. It's almost as much work as making a new chip, and performance and efficiency would differ significantly depending on where the chip was made.
"nm" has been a marketing gimmick since Intel launched their long-lived 14nm node. Compare actual transistor density across fabs and the naming falls apart.
It's now just the name of a process, not a measure of how small the transistors are.
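To put numbers on that, here are peak logic densities commonly cited in public reporting for a few nodes (ballpark figures; real density varies a lot by design). Note how the marketing names don't line up with the actual density:

```python
# Approximate peak logic density in millions of transistors per mm^2.
# Ballpark figures from public reporting -- illustrative, not spec-sheet data.
density_mtr_mm2 = {
    "Intel 14nm":    37.5,
    "TSMC N7":       91.2,
    "Intel 10nm":   100.8,  # denser than TSMC's "7nm" despite the bigger name
    "Samsung 5LPE": 126.5,
    "TSMC N5":      138.2,
}

for node, density in sorted(density_mtr_mm2.items(), key=lambda kv: kv[1]):
    print(f"{node:>12}: {density:6.1f} MTr/mm^2")
```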
I haven't paid for a CPU upgrade since 2020, and before that I was using a 22nm CPU from 2014. The market just isn't exciting to me anymore; I don't even want to talk about GPUs.
Back in the late 90s or early 2000s upgrades felt substantial and exciting, now it's all same-same with some minor power efficiency gains.
This article doesn't factor in the new demand that's gobbling up all the CPU and GPU production: AI server farms. Nvidia, for example, which once made graphics cards mainly for gamers, has been struggling to keep up with global AI demand. The whole market is different, then toss tariffs and the rest on top.
I wouldn't blame Moore's law's death; technology is still advancing, but, as usual, based on demand.
Actually, not really: performance per watt of the high-end stuff has been stagnating since the Ampere generation. Nvidia hides it by changing which models it compares in its benchmarks, or by advertising raw performance without power figures.
AI has nothing to do with it. Die shrinks were the reason for “slim” consoles and big price drops in the past. Die shrinks are basically a thing of the past now.
Not exactly, but smaller nodes are getting really expensive. So they could make a "slim" version with a lower power unit, but it would likely cost more than the original.
You don't need a top-end card to match console specs; something like a 6650 XT or 6700 XT is probably enough. Your initial PC build will cost about 2x a console if you're matching specs (maybe 3x if you need a monitor, keyboard, etc.), but you'll make it up with access to cheaper games and the ability to upgrade the PC without replacing it, not to mention the added utility a PC provides.
So yeah, think of PC vs console as an investment into a platform.
If you only want to play 1-2 games, console may be a better option. But if you're interested in older or indie games, a PC is essential.
A 2060 Super for $300, and then another $200 for a decent processor, puts you ahead of a PS5 at a comparable price. Games are cheaper on PC too, with a broader selection. https://pcpartpicker.com/list/zYGmJn here is a mid-tier build for $850; you could cut the processor down, install Linux for free, and I'm sure you've got a computer monitor lying around somewhere... the only thing stopping you is inertia.
I mean, for the price of a mid range graphics card I can still buy a whole console. GPU prices are ridiculous. Never mind everything else on top of that.
Yeah, but remember to factor in that you probably already need a regular computer for non-game purposes, so if you also use it for games you only have to buy one device, not two.
Yeah, GPU prices are kinda ridiculous, but a 7600 is probably good enough to match console quality (essentially the same as the 6650 XT, so get whichever is cheaper), and I see those going for $330. It should be more like $250, so maybe you can find it closer to that when there's a sale. Add $500-600 for mobo, CPU, PSU, RAM, storage, and a crappy case, and you have a decent gaming rig. Maybe I'm short by $100 or so, but that should be somewhere in the ballpark.
So $900-1000 for a PC. That's about double a console, extra if you need keyboard, monitor, etc. Let's say that's $500. So now we're 3x a console.
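A quick tally of those estimates (all numbers pulled from this thread, so treat them as rough):

```python
# Rough totals using this thread's ballpark prices (illustrative only).
gpu = 330         # RX 7600 / 6650 XT class card
core_parts = 550  # midpoint of the $500-600 for CPU, mobo, RAM, PSU, storage, case
fudge = 100       # "maybe I'm short by $100 or so"
pc_total = gpu + core_parts + fudge  # ~$980

console = 500
extras = 500      # monitor, keyboard, etc., if you don't already own them

print(round(pc_total / console, 1))             # ~2.0x a console
print(round((pc_total + extras) / console, 1))  # ~3.0x with peripherals
```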
Entry cost is certainly higher, so what do you get in return?
- deeper catalogue
- large discounts on older games (anything older than a year or so)
- emulation and other PC tasks
- can upgrade piecemeal: next console gen, you just need a new CPU + GPU, and if you go AMD, you can probably skip a gen on your mobo + RAM
- can repurpose the old PC once you rebuild (my old PC is my NAS)
- generally no need to pay a sub for multiplayer
Depending on how many and what types of games you play, it may or may not be cheaper. I play a ton of indies and rarely play AAA new releases, so a console would be a lot more expensive for me. I also have hundreds of games and play 40 or so in a given year (last year it was 50, IIRC). If I save just $10 per game, it works out to the same price as a console after about two years, and I save far more since I wait for sales. Besides, I'll have a PC anyway, so technically I should only count the extra stuff I buy for gaming, i.e. my GPU.
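If you want to sanity-check that against your own habits, the break-even works out to something like this (price gap, games per year, per-game savings, and the skipped multiplayer sub are all assumptions; plug in your own numbers):

```python
def breakeven_years(price_gap, games_per_year, savings_per_game, sub_per_year=0):
    # Years until cheaper PC games (plus any skipped online subscription)
    # cover the extra up-front hardware cost.
    yearly_savings = games_per_year * savings_per_game + sub_per_year
    return price_gap / yearly_savings

# This thread's numbers: PC ~$500 more up front, ~40 games a year,
# ~$10 saved per game, plus ~$60/year you'd otherwise pay for console multiplayer.
print(round(breakeven_years(500, 40, 10, 60), 1))  # ~1.1 years
```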
GPU prices are ridiculous, but those GPUs are also ridiculously more powerful than anything in any console.
The rough equivalent of a PS5 Pro's GPU is... not current gen, not last gen, but the gen before that: one of the weaker cards in AMD's 6-series, the RX 6600, is roughly the same performance as the GPU portion of a PS5 Pro.
The Switch 2 may have an interesting custom mobile-grade Nvidia APU, but at this point it's not out yet, no benchmarks, etc.
Oh right, also: if GPU prices for PCs remain elevated, any future consoles will have elevated prices too. Perhaps not to the same degree, but that's because a console is fairly low-tier hardware compared to the range of PC hardware, because console makers can subsidize hardware costs with game sales, and because they get discounts on components by ordering in huge bulk volumes.
The Steam Deck is basically a PC. You can get mini PCs with APUs of similar performance for very low prices these days. They won't perform like a current-gen console, but they're cheap gaming machines with a huge selection of low-cost games, and you won't have to pay for multiplayer.
Is it Moore's law failing, or have we finally reached the point where capitalists aren't even pretending to advance technology in order to charge higher prices? Are we actually unable to make things faster and cheaper anymore, or is the market controlled by a monopoly that sees no benefit in significantly improving its products? My opinion has been leaning more and more toward the latter since the pandemic.
I don't agree. It is capitalism, but not in a bad way; simply put, it's economic logic. The chip market has shifted from the consumer market to the enterprise market.
Supply is limited while demand has gone way up, and the enterprise market has a lot, and I mean a lot, of money to spend, because for them it's an investment, not entertainment.
There are also some bad capitalist tactics in other areas, hard drives for example, where the big players reduced production to keep prices from falling. They contribute to the problem, but they're not the major factor.
While blaming anything and everything on "capitalism" is disingenuous, it really does have to do with a lack of competition in the space. None of the incumbents have any incentive to really put much effort into improving the performance of gaming GPUs. PC CPUs face a similar issue. They're good enough for the vast majority of users. There is no sizable market that would justify spending huge amounts of money on developing new products. High end gaming PCs and media production workstations are niche products. The real money is made in data centre products.
Moore's law first showed cracks in the mid-2000s, when single-core clock speeds plateaued, pushing the industry to multi-core processors. Memory and storage still had room to improve. Now, the current 5nm-class processes are very close to limits imposed by the laws of physics, both in how finely light can pattern a wafer and in how small a controlled chemical reaction can be done. Unless someone figures out a way to do the whole chip fabrication process in fewer steps, or with higher yield, or with cheaper machines or materials, even at 50nm or larger, don't expect prices to drop.
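For a sense of how close to atomic scale this already is: nothing on a "5nm" chip actually measures 5nm, but even the real dimensions are only a few dozen silicon unit cells across (the pitches below are commonly cited figures for TSMC's N5-class processes; the gate length is a rough assumption):

```python
SI_LATTICE_NM = 0.543  # edge length of silicon's cubic unit cell

# Commonly cited N5-class dimensions -- note none of them is actually 5 nm.
features_nm = {
    "contacted gate pitch": 51,
    "minimum metal pitch": 30,
    "gate length (rough)": 18,
}

for name, size in features_nm.items():
    print(f"{name}: {size} nm ~= {size / SI_LATTICE_NM:.0f} silicon unit cells across")
```

At a few dozen atoms across, effects like quantum tunneling and dopant variation start dominating, which is why "just shrink it again" stopped being an option.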
Granted, if TSMC stopped operating in Taiwan, we'd be looking at roughly 70% of all production going poof, so that can be considered a monopoly (it's also their main defense against China, the "Silicon Shield", so there's more than just capitalistic greed at play for them).
Very interesting! I was aware of the 5nm advancements and that chip features are approaching the physical limits of the material, but since we worked around the single-core issue, I had assumed a similar innovation would appear for this bottleneck. It seems the focus instead turned to integrating AI into GPU architectures and cranking up power consumption for marginal performance gains, rather than working toward a paradigm shift. Thanks for the in-depth explanation, though; I always appreciate an opportunity to learn more about this type of stuff!
Wtf, that headline is backwards, capitalistic thinking. If you're greedy and have unnecessarily high standards that don't actually make a game better, you're the problem. Sorry not sorry, but gamer demand and the companies are at fault here.