Thing is: there is always the "next better thing" around the corner. That's what progress is about. The only thing you can do is choose the best available option for you when you need new hardware and be done with it until you need another upgrade.
Really, my rule of thumb has always been to upgrade only when it's a significant upgrade.
For a long time I didn't really upgrade until the new part was a 4x improvement over my old one; certain exceptions were occasionally made. Nowadays I'm a bit more opportunistic in my upgrades, but I still seek out 'meaningful' upgrades - ones that are a decent jump over the old, typically a 50% improvement in performance, or upgrades I can get really cheap.
Yeah, it's always that: "I want to buy the new shiny thing! But it's expensive, so I'll wait a while for its price to come down." You wait a while, the price comes down, you buy the new shiny thing, and then the newest shiny thing comes out.
Yep. There will always be "just wait N months and there will be the bestest thing that beats the old bestest thing". You are guaranteed to get buyer's remorse when shopping for hardware. Just buy what best suits your needs and budget at the time you decide is best for you (or at the time your old component bites the dust), and then stop looking at any development on those components for at least a year. Just ignore any deals, new releases, whatever, and be happy with the component you bought.
I bought a 1080 for my last PC build, downloaded the driver installer and ran the setup. There were ads in the installer for the 20 series, which had launched the day before. FML
Yep. I bought a 4080 just a few weeks ago. Now there are ads for the refresh all over... Thing is: your card didn't get any worse. You thought the card was a good value proposition when you bought it, and it hasn't lost any of that.
Just 10-15 years at least, for smartphones/electronics overall too. Process nodes are harder to shrink now, more than ever. Holding on to my 12nm ccp phone like there's no tomorrow...
AMD is the better decision, but my nVidia works great with Linux too. I'm on OpenSUSE, and nVidia hosts their own OpenSUSE drivers, so it works from the get-go once you add the nVidia repo.
I had an Nvidia GTX 660 back in 2013, and it was a pain in the arse being on a leading-edge distro: it used to break Xorg for a couple of months every time there was an Xorg release (which admittedly is really rare these days, since it's in sunset mode). Buying an AMD card was the best hardware decision I've made - no hassles, and I've been on Wayland since Fedora 35.
The article speculates about a 5% gain for the 4080 Super but a 22% gain for the 4070 Super, which makes sense because the base 4070 was really disappointing compared to the 3070.
Really, the RTX 4080 is going to be a sweet spot in terms of performance envelope. That's a card you'll see have some decent longevity, even if it's not being recognized as such currently.
It will depend on the performance uplift offered by the 50XX series and on game development studios' appetite for more power.
But TBH I don't see Nvidia being able to mass-produce a chip that's twice as fast without increasing its price again.
Meaning, nobody will get the most powerful next-gen chip, game devs will have to take that into account, and the RTX 4080 will stay relevant for a longer time.
Besides, according to SteamDB, most gamers still have an RTX 2080 or a less powerful GPU. Studios won't sell their games if they can't be played decently on those cards.
The power gap between high-end GPUs is growing exponentially. It won't stay sustainable for very long.
I'm looking to get a 4090 this Black Friday, and even with these refreshes, it doesn't seem like my purchasing decision would really be affected, unless they also refresh the 4090.
For the vast majority of customers who aren't looking to spend close to a grand on a card that is only marginally better than one at half the price, AMD has plenty to offer.
Only slightly related question: is there such a thing as an external nVidia GPU for AI models? I know I can rent cloud GPUs but I am wondering if long-term something like an external GPU might be worth it.
A 3090 (used) is the best bang for your buck for any LLM / StableDiffusion work right now. I've seen external GPU enclosures, though they probably cost as much as slapping a used 3090 into a barebones rig and running it headless in a closet.
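For a sense of what that kind of headless box actually runs: below is a minimal single-GPU inference sketch using the Hugging Face transformers library. It assumes torch, transformers and accelerate are installed, and the model name is only an example of a ~7B model that fits in a 3090's 24 GB at fp16 - swap in whatever you actually use.

    # Minimal single-GPU inference sketch for a used 3090 (24 GB VRAM).
    # Assumptions: torch, transformers and accelerate are installed; the model
    # below is just an example of a ~7B model that fits in 24 GB at fp16.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_name = "mistralai/Mistral-7B-Instruct-v0.2"  # example, swap for your model

    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForCausalLM.from_pretrained(
        model_name,
        torch_dtype=torch.float16,  # half precision keeps a 7B model well under 24 GB
        device_map="auto",          # let accelerate place the weights on the GPU
    )

    prompt = "Why do people buy used 3090s for local LLM work?"
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=128)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))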
Generally speaking, buying outright works out cheaper than renting, because you can keep running the device for years, or sell it later to reclaim some of the capital.
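To put a rough number on that, here is a back-of-the-envelope break-even sketch. Every figure in it (used 3090 price, resale value, cloud hourly rate, electricity cost) is an assumption you'd swap for your own numbers.

    # Buy-vs-rent break-even sketch. All figures below are assumptions, not quotes.
    used_3090_price = 700.0            # USD, assumed used-market price
    resale_value = 350.0               # USD, assumed resale when you're done with it
    cloud_rate_per_hour = 0.50         # USD/hr, assumed rate for a comparable 24 GB cloud GPU
    power_cost_per_hour = 0.35 * 0.15  # ~350 W under load at an assumed $0.15/kWh

    net_ownership_cost = used_3090_price - resale_value
    hourly_saving = cloud_rate_per_hour - power_cost_per_hour
    break_even_hours = net_ownership_cost / hourly_saving
    print(f"Break-even after roughly {break_even_hours:.0f} GPU-hours")  # ~780 h with these numbers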
Perhaps this is a good place to ask, now the topic has been raised. I have an ASUS TUF A15 laptop with an nVidia GTX 1650 Ti graphics card, and I am SO sick of 500MB driver "updates" that are basically beta tests that break one thing or another. What are the chances of upgrading to a Radeon/AMD graphics card? Or am I stuck with this shit?
In a laptop? Practically none. There are some very rare 'laptops' out there - really chonky desktop replacements - that have full-size desktop GPUs inside them. The vast majority, on the other hand, have 'mobile' versions of these GPUs that are basically permanently attached to the laptop's motherboard (if not part of the mobo itself).
In my experience, AMD is not more reliable when it comes to updates. I had to do a clean install three times to get my RX 6600 to function properly, and months later I have a freezing issue that may be caused by my GPU.
You could use a separate external GPU if you have Thunderbolt ports. It's not cheap and you sacrifice some performance, but it's worth it for the flexibility in my opinion. Check out https://egpu.io/