NVIDIA's RTX 50 series of GPUs arrives later this month, with pricing starting at $549.
The first salvo of RTX 50 series GPUs will arrive in January, with pricing starting at $549 for the RTX 5070 and topping out at an eye-watering $1,999 for the flagship RTX 5090. In between those are the $749 RTX 5070 Ti and the $999 RTX 5080. Laptop variants of the desktop GPUs will follow in March, with pricing there starting at $1,299 for 5070-equipped PCs.
I've ditched my gaming PC and am currently playing my favorite game (Kingdom Come: Deliverance) on an old laptop, which means I can't go higher than 800x480.
And honestly, the immersion works. After a couple minutes I don't notice it anymore.
VR enthusiasts can put it to use. The higher-end headsets have resolutions of over 5000 x 5000 pixels per eye.
You are basically rendering the entire game twice, once for each eye, and you're pushing something like eight times as many pixels as a typical 1080p game.
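For a rough sense of the scale, here's a back-of-the-envelope sketch. The per-eye resolution used below is an assumed 3840x2160 for illustration, not any particular headset's spec:

    # Stereo VR pixel count vs. a single 1080p frame (illustrative numbers only)
    eye_w, eye_h = 3840, 2160        # assumed per-eye resolution
    flat_w, flat_h = 1920, 1080      # typical 1080p game
    vr_pixels = 2 * eye_w * eye_h    # both eyes rendered every frame
    flat_pixels = flat_w * flat_h
    print(f"VR: {vr_pixels:,} px/frame")             # 16,588,800
    print(f"1080p: {flat_pixels:,} px/frame")        # 2,073,600
    print(f"Ratio: {vr_pixels / flat_pixels:.1f}x")  # 8.0x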
The prices are high, but what's really shocking are the power consumption figures: the 5090 is 575W(!!), while the 5080 is 360W, the 5070 Ti is 300W, and the 5070 is 250W.
If you are getting one of these, factor in the cost of a better PSU and your electric bill too. We're getting closer and closer to the limit of power from a US electrical socket.
Residential circuits in the US are 15-20A; very rarely are they 10A, though I've seen some super old ones or split 20A breakers in the wild.
A single duplex outlet must be rated to the same amperage as the breaker in order to be code, so with a 5090 PC you're at around half the capacity of what you'd normally find, worst case. Nice big monitors take about an amp each, and other peripherals are negligible.
You could easily pop a breaker if you've got a bunch of other stuff on the same circuit, but that's true for anything.
I think the power draw on a 5090 is crazy, crazy high, don't get me wrong, but let's be reasonable here - electricity costs, yes, but we're not getting close to the limits of a circuit/receptacle (yet).
Actually the National Electrical Code (NEC) limits loads on 15 Aac receptacles to 12 Aac, and on 20 Aac receptacles to 16 Aac, iirc, because those are the breaker ratings and you size those at 125% of the load (conversely, 1/125% = 80%, so loads should be 80% of the breaker ratings).
So with a 15 Aac outlet and a 1000 Wac load at a minimum 95% power factor, you're drawing about 8.8 Aac (1000 W / (120 V x 0.95)), which is ~73% of the capacity of the outlet (8.8/12). For a 20 Aac outlet, 8.8 Aac is ~55% of capacity (8.8/16).
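In code, that works out to something like this (a sketch assuming 120 V nominal mains; the 1000 W load and 95% power factor are the same figures as above):

    # Continuous-load check against the NEC 80% rule (sketch; 120 V assumed)
    def outlet_utilization(load_w, breaker_a, volts=120.0, power_factor=0.95):
        continuous_limit_a = 0.8 * breaker_a    # 12 A on a 15 A breaker, 16 A on a 20 A
        amps = load_w / (volts * power_factor)  # current drawn by the load
        return amps, amps / continuous_limit_a

    for breaker in (15, 20):
        amps, util = outlet_utilization(1000, breaker)
        print(f"{breaker} A circuit: {amps:.1f} A, {util:.0%} of the continuous limit")
    # 15 A circuit: 8.8 A, 73% of the continuous limit
    # 20 A circuit: 8.8 A, 55% of the continuous limit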
Nonetheless, you're totally right. We're not approaching the limits of the technology, unlike with electric car chargers.
That's just the GPU, assuming the rest of the build is efficient. If we do 575W GPU + 350W CPU + 75W RGB fans + 200W monitors + 20% buffer, we're at 1440W, or 12A. Now we're close to popping a breaker.
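Spelled out (same illustrative figures as above, at an assumed 120 V):

    # Back-of-the-envelope total system draw (sketch; numbers from the comment above)
    loads_w = {"GPU": 575, "CPU": 350, "RGB fans": 75, "monitors": 200}
    subtotal = sum(loads_w.values())   # 1200 W
    total = subtotal * 1.2             # +20% buffer -> 1440 W
    amps = total / 120                 # ~12 A at 120 V
    print(f"{total:.0f} W ~= {amps:.0f} A on a 120 V circuit")  # 1440 W ~= 12 A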
This makes me curious: what is the cheapest way to get a breaker that can handle more power? It seems like every way I can think of would cost as much as several 5090s.
Welp, looks like I'll start looking at AMD and Intel instead. Nvidia is pricing itself at a premium that's impossible to justify compared to the competition.
There will be people that buy it: professionals that can actually use the hardware and can justify it via things like business tax benefits, and those with enough money to waste that it doesn't matter.
For everyone else, competitors are going to be much better options, especially with Intel's very fast progression into the dedicated-card game with Arc and its generational improvements.
I'm probably one of those people. I don't have kids, I don't care much about fun things like vacations, fancy food, or yearly commodity electronics like phones or leased cars, and I'm lucky enough to not have any college debt left.
A Scrooge McDuck vault of unused money isn't going to do anything useful when I'm 6 feet underground, so I might as well spend a bit more (within reason*) on one of the few things that I do get enjoyment out of.
* Specifically: doing research on what I want; waiting for high-end parts to go on sale; never buying marked-up AIB partner GPUs; and only actually upgrading things every 5~6 years after I've gotten good value out of my last frivolous purchase.
Yeah, it’s all priorities. I don’t see myself buying a $2000 GPU any time soon, but if I was single and playing PC games every day in 4K or VR, I could get thousands of hours of use over the next few years from that GPU.
Compare that with other types of entertaining products and activities (vacations, cars, etc) and it starts to look not bad in comparison.
Still not in the plans for my particular situation though, lol.
My company could buy me this (for video editing), but I mostly need it for the VRAM, which should be cheap. I would like to be able to afford it without it doubling the price of my PC.
The RTX 4090 was released as the first model of the series on October 12, 2022, launched for $1,599 US, and the 16GB RTX 4080 was released on November 16, 2022 for $1,199 US.
So they dropped the 80 series in price by $200 while increasing the 5090 by $400.
Pretty smart honestly. Those who have to have the best are willing to spend more and I’m happy the 80 series is more affordable.
There's gonna be as many tariffs as there were walls that got built and paid for by Mexico.
Not because it's bad for the American people.
It's because the same people in Congress who would install tariffs are making hundreds of millions hand over fist on insider trading stocks. They aren't gonna fuck up the gravy train for Trump's dumb ass campaign ramblings.
Maybe. There's not much precedent for what's coming - Trump is FAR more influential to the GOP than he was in his first term. I certainly hope that nobody will be able to get anything done, but I'm also not counting on it.
My question is: will the 5080 perform half as fast as the 5090? Or is it going to be like the 4080 vs 4090 again, where the 4080 was like 80% of the price for 60% of the performance?
I think that at higher resolutions (4K) there's going to be a bit bigger difference than in the 40 series, because of the 256-bit vs 384-bit memory bus in the 4080 vs 4090, compared to 256-bit vs 512-bit in the 5080 vs 5090.
That memory throughput and bandwidth might not get such a big bump in the next gen or two.
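To put rough numbers on those bus widths: peak bandwidth is roughly bus width (in bytes) times per-pin data rate. The data rates below are approximate assumptions on my part, not official specs:

    # Rough peak memory bandwidth sketch: (bus_bits / 8) * per-pin rate in Gbps -> GB/s
    def bandwidth_gbs(bus_bits, gbps_per_pin):
        return bus_bits / 8 * gbps_per_pin

    cards = {
        "RTX 4080 (256-bit, ~22.4 Gbps GDDR6X)": (256, 22.4),
        "RTX 4090 (384-bit, ~21 Gbps GDDR6X)":   (384, 21.0),
        "RTX 5080 (256-bit, ~30 Gbps GDDR7)":    (256, 30.0),
        "RTX 5090 (512-bit, ~28 Gbps GDDR7)":    (512, 28.0),
    }
    for name, (bits, rate) in cards.items():
        print(f"{name}: ~{bandwidth_gbs(bits, rate):.0f} GB/s")
    # ~717, ~1008, ~960, ~1792 GB/s respectively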
Yeah, I'm never willing to pay for the best. I usually build a new computer with second-best parts. With these prices, my next computer will be built with third-best stuff, I guess.
Yeah sure, the 5090 will be $2k the same way the 3080 went for $800... I watched them peak at $3,500 (seriously, I screenshotted it, but the screenshot got lost as I gave up on the salt).
The 4090 is sitting at 2,400 ($2,500) right now over here; I can 100% assure you the 5090 will cost more than that when it gets here.
It's called upcycling: if the frame is generated, then the kids have to mine fewer virgin frames from the mines. Before this breakthrough we were rendering so many frames that the landfills were completely full.
I forgot they weren't doing an 80-tier or higher competitor next generation. Their pricing is still out of whack, since they just find the competing Nvidia card and price theirs a bit cheaper. I know people will still buy their cards, and they're still usually better value than Nvidia, but their prices are still blowing up.