It's like how banks figured out there was more money in catering to the super rich and shit all over the rest of us peasants: GPU manufacturers that got big because of gamers have now turned their backs on us to chase the insane "AI" agenda.
Also, friendly advice, unless you need CUDA cores and you have to upgrade, try avoiding Nvidia.
Nvidia doesn't really care about the high-end gamer demographic nearly as much as they used to, because it's no longer their bread and butter. Nvidia's cash cow at this point is supplying hardware for ML data centers. It's an order of magnitude more lucrative than serving the consumer and enthusiast market.
So my next card is probably gonna be an RX 9070XT.
For me it's the GPU prices, the stagnation of the technology (most performance gains come at the cost of stupid power draw) and, importantly, being fed up with AAA games. Most games I played recently were a couple years old, indie titles, or a couple years old indie titles. And I don't need a high-powered graphics card for that. I've been playing far more on my Steam Deck than my desktop PC, despite the latter having significantly more powerful hardware. You can't force fun through sheer hardware performance.
GPU prices are what drove me back to consoles. It was time to overhaul my PC as it was getting painfully out of date. The video card alone was gonna be $700. Meanwhile a whole-ass PS5 that plays the same games was $500.
It's been 2 years since and I don't regret it. I miss mods, but not nearly as much as I thought. It's also SOOO nice to play multiplayer games without cheaters everywhere. I actually used to be one of those people who thought controllers gave an unfair advantage, but... you can use an M/KB on PS5, and guess what? I do just fine! Turns out that the problem was never controllers, it was the cheaters.
But then there is that. The controller. Oh my lord it's so much more comfortable than even the best gaming mouse. I've done a complete 180 on this. So many game genres are just so terrible to play with M/KB that I now tell people whining about controller players this:
Use gaming equipment for gaming and leave office equipment in the office.
Endless infinite growth forever on a fragile and very much finite planet where wages are suppressed and most money is intentionally funneled into the coffers of a small handful of people who are already so wealthy that their descendants 5 generations down the line will still be some of the richest people on the planet.
I just looked up the price and I was like, "Yikes!". You can get a PS5 Pro + the optional Blu-ray drive, a Steam Deck OLED, and a Nintendo Switch 2, and still have plenty of money left to spend on games.
I don't buy every generation and skip 1 if not 2. I have a 40xx series card and will probably wait until the 70xx (I'm assuming series naming here) before upgrading.
Uhhh, I went from a Radeon 1090 (or whatever they're called, it's an older numbering scheme from ~2010) to an Nvidia 780 to an Nvidia 3070 Ti. Skipping upgrades is normal. Console gamers effectively do that as well. It's normal to not buy a GPU every year.
It's just that I'm not impressed; the raster performance bump for 1440p was just not worth the price jump at all. On top of that they have manufacturing issues and issues with their stupid 12-pin connector? And all the shit on the business side, like not providing drivers to reviewers, etc. Fuuucccckk all that man. I'm waiting until AMD gets a little better with ray tracing and switching to team red.
Unfortunately gamers aren't the real target audience for new GPUs, it's AI bros. Even if nobody buys a 4090/5090 for gaming, they're always out of stock as LLM enthusiasts and small companies use them for AI.
I have a 4090. I don't see any reason to pay $4K+ for fake frames and a few % better performance. Maybe next gen, post-Trump, and/or if prices become reasonable and cables stop melting.
In the US, a new RTX 5090 currently costs $2899 at NewEgg, and has a max power draw of 575 watts.
(Lowest price I can find)
... That is a GPU with roughly the cost and power usage of an entire, quite high-end gaming PC from 5 years ago... or even just a reasonably high-end PC from right now.
...
The entire move to the realtime raytracing paradigm, which has enabled AAA game devs to get very sloppy with development by not really bothering to optimize lighting or textures... which has necessitated the invention of intelligent temporal upscaling and frame generation... the whole, originally advertised point of all this was to make high-fidelity 4K gaming an affordable reality.
This reality is a farce.
...
Meanwhile, if you jump down to 1440p, well, I've got a future build plan sitting in a NewEgg wishlist right now.
RX 9070 (220 W) + Minisforum BD795i SE (mobo + non-removable, high-end AMD laptop CPU with performance comparable to a 9900X at about half the power draw)... so far my pre-tax total for the whole build is under $1500, and, while I need to double- and triple-check this, I think the math on the power draw works out to a 650 watt power supply being all you'd need... potentially with enough wattage headroom left over to also add in some extra internal HDD storage drives.
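For anyone who wants to sanity-check that 650 W figure, here's a rough back-of-the-envelope sketch; aside from the 220 W GPU number quoted above, the CPU, motherboard, and margin values are my own ballpark assumptions, not spec-sheet figures:

```python
# Rough PSU headroom check for the build above.
# Every wattage here other than the GPU's is a ballpark assumption, not a measured figure.

components_w = {
    "RX 9070 (board power)": 220,          # the 220 W figure quoted above
    "laptop-class CPU (BD795i SE)": 120,   # generous guess for sustained boost
    "motherboard + RAM + SSD": 40,         # rough allowance
    "fans + peripherals": 20,              # rough allowance
}

total_w = sum(components_w.values())   # ~400 W sustained
transient_margin = 1.4                 # GPUs spike well above their average draw
recommended_psu_w = total_w * transient_margin

print(f"Estimated sustained draw: {total_w} W")
print(f"With ~40% transient margin: {recommended_psu_w:.0f} W")
print(f"Headroom on a 650 W PSU: {650 - recommended_psu_w:.0f} W")
# => roughly 400 W sustained, ~560 W with margin, leaving ~90 W of slack
#    on a 650 W unit for extra HDDs and the like.
```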
If you want to go a bit over the $1500 mark, you could fit this all in a console sized ITX case.
That is almost half the cost of the RTX 5090 alone, and will get you over 90fps in almost all modern games with ultra settings at 1440p, though you will have to futz around with intelligent upscaling and frame gen if you want realtime raytracing as well at similar framerates, and realistically, probably wait another quarter or two for AMD driver support and FSR 4 to become a bit more mature and properly implemented in said games.
Or you could swap in maybe a 5070 (non-Ti; the Ti is $1000 more) Nvidia card, but seeing as I'm making a Linux gaming PC, you know, for the performance boost from not running Windows, AMD's Mesa drivers are where you wanna be.
For the price of one 5090 you could build 2-3 midrange gaming PCs lol. It's crazy that anyone would even consider buying it unless they're rich or actually need it for something important.
Not surprised. Many of these high end GPUs are bought not for gaming but for bitcoin mining and demand has driven prices beyond MSRP in some cases. Stupidly power hungry and overpriced.
My GPU, an RTX 2060, is getting a little long in the tooth and I'll hand it off to one of the kids for their PC, but I need to find something that is a tangible performance improvement without costing eleventy stupid dollars. Nvidia seems to be lying a lot about the performance of that 5060, so I might look at AMD or Intel next time around. Probably need to replace my PSU while I'm at it.
I bought my most expensive dream machine last year (when the RTX 4090 was still the best) and I am proud of it. I hope it'll be my rig for at least 10 years.
I got an RX 6800 XT for €400 in May 2023, which was at that point an almost 3-year-old card. Fast-forward to today: the RX 9060 XT 16GB costs more and is still slower in raster. The only things going for it are FSR4, a better encoder, and slightly better RT performance, which I couldn't care less about.
Oh totes. NVIDIA continuing to lie even more blatantly to their faces; driver-bricking issues on updates; missing ROPs; even more burn problems with a connector they knew continued to be problematic, and they lied about it; they and their retail partners releasing very limited inventory and then serving internal scalping while being increasingly hostile to the rest of their consumers; ray tracing performance improvements they have to exclusively push in certain games, requiring the newest, most expensive hardware to actually get any benefit from their cards; false MSRP pricing and no recourse for long-time loyal customers except a lottery in the US while the rest of the regions get screwed. Totes, it's just that it's "too expensive", because when have gamers ever splurged on their hobby?
I think the Steam Deck can offer some perspective. If you look at the top games on SD, it's like Baldur's Gate, Elden Ring, Cyberpunk, etc., all games that run REALLY poorly. Gamers don't care that much.
I'm just using my GTX 1650 (4GB GDDR6 VRAM) and it works fine for most of the things I do. I can use Linux + the FSR hack to squeeze framerates out of games that perform poorly,
and it runs my SCP:SL and TF2 fine.
For SCP:SL I'm using the FSR hack to squeeze out more framerate until Nvidia fixes VKD3D.
Maybe my next card will be an RX 6650 XT/AMD, but I still might stick with my GTX 1650.
I'm still using my GTX 1070. There just aren't enough new high-spec games that I'm interested in to justify paying the outrageous prices that NVIDIA is demanding and that AMD follows too closely behind on. Even if there were enough games, I'd refuse to upgrade out of principle, I will not reward price gouging. There are so many older/lower-spec games that I haven't yet played that run perfectly for me to care. So many games, in fact, that I couldn't get through all of them in my lifetime.
First: Nobody gives a shit about the ray tracing craze, like, not really. It applies to a thin slice of games, and it's an option easily turned off or avoided. Seeing as AAA games are most of the ones developing for it anyway, and seeing as most of those are utter shit, yeah, I'm not buying into the craze and spending obnoxious amounts of money on it.
Increasingly across many markets, companies are not targeting average or median consumers. They're only chasing whales, the people who can pay the premium. They've decided that more mid tier customers aren't worth it -- just chase the top. It also means a lower need for customer support.
I'm sitting on a 3060 Ti and waiting for the 40-series prices to drop further. Ain't no universe where I would pay full price for the newest gens. I don't need to render anything for work with my PC, so a 2-3 year old GPU will do just fine.
I bought a 3070 for far more than I should've back when that was new, and I don't plan to make that mistake twice. This GPU is likely going to be staying in this PC til it croaks. Never felt the need for anything more powerful anyway, it runs everything I need it to on high settings.
I was gifted a 2080 Ti about a year or so ago and I have no intention of upgrading anytime soon. The former owner of my card is a friend who had it in his primary gaming rig; back when SLI wasn't dead, he had two.
So when he built a new main rig with a single 4090 a few years back, he gifted me one card and left the other in his old system, which he started using as a spare/guest computer for impromptu LANs. It's still a decent system, so I don't blame him.
In any case, that upgraded my primary computer from a 1060 3GB.... So it was a welcome change to have sufficient video memory again.
The cards keep getting more and more power hungry and I don't see any benefit in upgrading... Not that I can afford it.... I haven't been in school for a long time, and lately I barely have time to enjoy YouTube videos, never mind a full-assed game. I literally have to walk away from a game for so long between sessions that I forget the controls. So either I can beat the game in one sitting, or the controls have to be similar enough to the defaults I'm used to (left click to fire, right click to ADS, WASD for movement, Ctrl or C for crouch, space to jump, E to interact, F for flashlight, etc. etc...) that I don't really need to relearn anything.
This is a big reason why I haven't finished some titles that I really wanted to, like TLoU or Doom Eternal.... Too many buttons to remember. It's especially bad with Doom, since if you don't remember how and when to use your specials, you'll run out of life, armor, ammo, etc. pretty fast. Remembering which special gives what and how to trigger it.... Uhhh.... Is it this button? *gets slaughtered by an imp* Okay, not that button. *reloads* Let's try this... *killed by the same imp* Not that either.... Hmmm. *goes and looks at the key mapping* Ohhhhh. Okay. *reloads* I got it this time.... *dies anyway due to other reasons*
Bought a 5700 XT on release for £400 and ran that til last year, when the 7900 GRE released in the UK. Can't remember what I paid, but it was a lot less than the flagship 7900, and I foresee it lasting many years as I have no desire to go above 2K.
AMD GPUs have been pretty great value compared to Nvidia recently, as long as you're not tying your self-worth to your average FPS figures.
I'm rocking a GTX 1660 and have no plans to upgrade. Ray tracing is a scam, and all the "AAA" titles that are too VRAM-hungry for my card are not that attractive anyway.
Why not just buy a cheaper one? The X060 or X070 series is usually fine in price and runs everything at high enough settings. Flagship is for maxed-out everything at 4K+ resolutions. And in those cases, everything else is larger and more expensive as well: the monitor needs to be 4K, you need a huge-ass PSU and a large case to fit the PSU and card in, and even the power draw and energy costs just start growing exponentially.
I'm on a 2080 or 2090 (I forget which). I thought I'd upgrade to the 40xx now that 5090s are out. I looked at the prices and absolutely not. The 5090s are around 500k JPY, and ordering from the US would work out to about the same with exchange, tax, and any possible tariff that exists this week. Salaries here are also much lower on average than in the West, even for those of us in software.
4070s are still around 100k, which is cheaper than the 250k-ish they were at last time I looked.