I just upgraded this year, to a Ryzen 9 5900X from my old Ryzen 5 2600. Still running a 1070, though.
I do video editing and other CPU-intensive stuff on the side, plus a lot of multitasking, so it's worth the money, at least in the long run.
I also mostly play Minecraft and Factorio, so.
Ryzen 5000 is a great upgrade path for those who don't want to buy into AM5 yet. Very affordable. 7000 isn't worth the money unless you get a good deal; same for 9000, though you could justify it alongside a new motherboard and RAM.
I'm the one person who people go to for PC part advice, but I actually try to talk them down. Like, do you need more RAM because your experience is negatively impacted by not having enough, or do you just think you should have more just because?
I built a PC in 2011 with an AMD Phenom II. Can't remember which one; it may have been a 740. And I'm pretty sure it had a Radeon HD 5450 until FO4 came out in 2015 and I needed a new graphics card. Upgraded to a Radeon R7 240 and some other AM3-socketed CPU I found for like, $40 on eBay. By no means was I high-end gaming over here. And it stayed that way until 2020, when I finally gutted the whole thing and started over. It ran everything I wanted to play. So I got like, 9 years out of about $600 in parts, and that's including disc drives, power supply, case, and RAM. And I'm still using the case. I got my money's worth out of it, for sure. The whole time we were in our apartment, it was hooked up to our dumb TV, so it was our only source of Netflix, YouTube, DVDs, and Blu-rays. It was running all the time. Then I gave all the innards to my buddy to make his dad a PC for web browsing. It could still be going in some form, as far as I know.
I showed this to my penultimate daughter, who co-opted my (literally 2014) Dell PC; the only thing I'd ever done to it was add memory, and it's still a beast. I said "look, your 4chan twin" and she cracked up. But if she doesn't steal it when she moves out, I'll probably be able to get ten more years out of it.
I originally built my current PC back in 2016 and only just "upgraded" it last year. I put upgrade in quotes because it was literally a free motherboard and GPU my buddy no longer needed. I went from a Core i5 6600K to a Ryzen 5 5500GT, and from a GTX 960 4GB to a GTX 1070. Still plays all the games I want it to, so I have no desire to upgrade it further right now. I think part of it is that I'm still using 1080p 60Hz monitors.
I'm still using the i7 I built up back in 2017 or so... Upgraded to an SSD some years ago, and will be upping the RAM to 64GB (the max the motherboard can handle) in a few days when it arrives...
If you had a top-of-the-line PC in 2014, you'd be talking about a 290X/970/980, which would probably work really well for most games now. For CPU, that'd be something like 4th-gen Intel or AMD Bulldozer, which despite its terrible reputation probably runs better nowadays thanks to better multi-threading.
A lot of the trending tech inflating minimum requirements nowadays is stuff like ray tracing (99% of games don't even need it) and higher-FPS/resolution monitors that aren't that relevant if you're still pushing 1080p/60. Let's not even begin with Windows playing forced obsolescence every few years.
Hell, most games that push the envelope of minimum specs, like Indiana Jones, are IMO just unoptimised messes built on UE5 rather than legitimately out of scope for hardware from the last decade. Stuff like Nanite hasn't delivered on enabling photorealistic asset optimisation, but it HAS enabled studios to cut back on artist labour in favour of throwing money at marketing.
I thought anon was the normie? The average person doesn't upgrade their PC every two years. The average person buys a PC and replaces it when nothing works anymore. Anon is the normie; they are the enthusiasts. Anon just isn't hanging with a group of people with matching ideologies.
They're invested in PC gaming as social capital where the performance of your rig contributes to your social value. They're mad because you're not invested in the same way. People often get defensive when others don't care about the hobbies they care about because there's a false perception that the not caring implies what they care about is somehow less than, which feels insulting.
Don't yuck others' yum, but also don't expect everyone to yum the same thing.
People want shiny new things. I've had relatives say stuff like "I bought this computer 2 years ago and it's getting slower, it's awful how you have to buy a new one so quickly." I suggest things to improve it, most of which are free or very cheap and which I'd happily do for them. But they just go out and buy a brand-new one, because that's secretly what they wanted to do in the first place; they just don't want to admit they're that materialistic.
One upside of AAA games turning into unimaginative shameless cash-grabs is that the biggest reason to upgrade is now gone. My computer is around 8 years old now. I still play games, including new games - but not the latest fancy massively marketed online rubbish games. (I bet there's a funner backronym, but this is good enough for now.)
I had an i5-2500K from when they came out (I think 2011? Around that era) until 2020 - overclocked to 4.5GHz, it ran solid the whole time. Upgraded the graphics card, drives, memory, etc., but that was incremental as needed. Now on an i7-10700K. The other PC has been sat on the side and may become my daughter's or wife's at some point.
The computer I built in 2011 lasted until last summer. I was smiling widely when I came to tell my wife and my friend, and my friend asked why I was smiling when my computer no longer worked.
"Because now he can buy a new one" my wife quickly replied 😁
My $90 US AWOW mini with a Celeron J4125, 8 gigs of shared memory, and a 128GB SSD seems to run FreeDoom as well as any of them fancy water-cooled GamerBoi custom boxes do...
Yeah, I'm with you anon. Here's my rough upgrade path (dates are approximate):
2009 - built PC w/o GPU for $500, only onboard graphics; worked fine for Minecraft and Factorio
2014 - added GPU to play newer games (~$250)
2017 - built new PC (~$800; kept old GPU) because I needed to compile stuff (WFH gig); old PC becomes NAS
2023 - new CPU, mobo, and GPU (~$600) because NAS uses way too much power since I'm now running it 24/7, and it's just as expensive to upgrade the NAS as to upgrade the PC and downcycle
So for ~$2200, I got a PC for ~15 years and a NAS (drive costs excluded) for ~7 years. That's less than most prebuilts, and similar to buying a console each gen. If I didn't have a NAS, the 2023 upgrade wouldn't have had a mobo, so it would've been $400 (just CPU and GPU), and the CPU would've been an extreme luxury (1700 -> 5600 is nice for sim games, but hardly necessary). I'm not planning any upgrades for a few years.
Yeah it's not top of the line, but I can play every game I want to on medium or high. Current specs: Ryzen 5600, RX 6650 XT, 16GB RAM.
People say PC gaming is expensive. I say hobbies are expensive; PC gaming can be inexpensive. This is ~$150/year, which is pretty affordable... And honestly, I could be running that OG PC from 2009 with just a second GPU upgrade, for a grand total of $800 over 15 years, if all I wanted was to play games.
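A quick back-of-envelope check of that per-year figure, using the rough dollar amounts from the list above (a sketch of the poster's approximate spend, not exact prices):

```python
# Rough spend per upgrade, taken from the list above (approximate).
upgrades = {
    2009: 500,  # initial build, onboard graphics only
    2014: 250,  # added a GPU
    2017: 800,  # new PC, old GPU carried over; old PC became the NAS
    2023: 600,  # new CPU, mobo, and GPU
}

total = sum(upgrades.values())
years = 2024 - 2009  # ~15 years of service
print(f"total: ${total}, per year: ${total / years:.0f}")
# -> total: $2150, per year: $143  (roughly the ~$150/year claimed)
```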
Still on a 1060 here. Sure, it's too slow for anything from the PS5 era, but that's what my PS5 is for.
It does have a 1 in 4 chance of bluescreening when I quit FFXIV, but I don't know what's causing that. Running it at 100% doesn't seem to crash it; possibly something about the drivers not freeing shit properly, I dunno.
I upgraded last year from an i7-4770K to an i7-12700K, and from a GTX 750 Ti to an RTX 3060 Ti, because 8 threads and 2GB of VRAM were finally not enough for modern games. And my old machine still runs as a home server.
The jump was huge and I hope I'll have money to upgrade sooner this time, but if needed I can totally see that my current machine will work just fine in 6-8 years.
I've upgraded pretty much everything in my 2009 PC and only just finally bought a new CPU. I just need a new case for everything. The last straws were Elden Ring being CPU-bottlenecked at 20 FPS and Helldivers 2 requiring some instruction that wasn't on my CPU.
Yeah, I'm daily-driving a laptop from 2019 with an i7-9750, a GTX 1650, and 16GB of RAM. No upgrades except storage. The GPU is the only thing that sometimes makes me go "hm."
Only stopped using my Bulldozer-era box because it started crashing and freezing, and because a BIOS fix Asus support suggested nuked my board. I had the thing maxed out... 12 SSDs in soft RAID, GTX 570s in SLI. It was a monster. I still have most of the parts, and I'm sure it would run a lot of stuff just fine, at the cost of heat and noise :]
My current PC used for gaming is a self built one from 2014. I have upgraded a few things during the years, most notably GPU and memory, but it did an excellent job for over a decade.
Recently it started to show its age with various weird glitches and also some performance issues in several newer games and so I've just ordered a new one. But I'm pretty proud of my sustainable computing achievement.
Maybe it's just my CPU or something wrong with my setup, but I feel like new games (especially ones that run on Unreal Engine 5) really kick my computer's ass at 1440p. For reference, I got a 7900 XTX just last year and I'm using a Ryzen 9 3900XT I got in 2020. I remember getting new cards like 10 years ago and being able to crank the settings up to max with no worries, but nowadays I feel like I've got to worry about lowering settings or resorting to upscaling or frame generation.
Games don't feel very optimized anymore, so I can see why people might be upgrading more frequently, thinking it's just their PC being weak. I miss the days when we could just play games at native resolution.
I still have my 2014 machine. I've upgraded it with an M.2 drive and more RAM. Everything else is perfectly fine, and I wouldn't see the difference with a newer machine. I'll keep it for as long as I can, because the longer I wait, the better the machine I replace it with will be.
Also, I just wouldn't know what to do with it after. I can't bring myself to throw away a perfectly good machine, but keeping it would be hoarding.
I was with them until my girlfriend gifted me a 180Hz monitor last year. Now I can't deal with less than 90 FPS, so I finally had to upgrade my RX 580 (I just found out it stopped getting driver updates in January 2024, so I guess it was about time). High refresh rates ruin you.
Upgrading my Ryzen 7 1700 and GTX 1080 to a 5800X3D and RX 7900 XT this weekend. Still waiting on the CPU, but it's cool to be able to go from the first to the last gen this motherboard can support.
It's easy to go too far in either direction instead of just doing what fits your needs (which in fairness, can sometimes be difficult to precisely pin down). Blindly going "it's old, I need to upgrade" or "it still runs, it's not worth upgrading" will sometimes be right but it's not exactly tailored advice.
Someone I know was holding out for ages on a 4790K (2014), and upgraded a year or two ago to a then-current-gen system and said the difference it made to their workflow was huge - enough that they actually used that experience to tell their boss at work that the work systems (similar to what they had had themselves) should get upgraded.
At the end of 2022, I had had my current monitor(s) for about 10 years and had spent years hearing everyone say "wow, upgrading my monitor was huge" - that 1440p was such an upgrade over 1080p, and/or that a high refresh rate (120+Hz) was such an upgrade over 60Hz. I am (or at least was in the past) a pretty competitive player in games, so you'd think I'd be a prime candidate for it, but after swapping my primary monitor from a 60Hz 1200p screen to a 144Hz 1440p screen I... honestly could barely notice the difference in games (yes, the higher refresh rate is definitely enabled, and ironically I can tell the difference easily outside of games lol).
I'm sensitive to input latency, so I can (or at least could, don't know if I still can) easily tell the difference between the responsiveness of ~90 FPS and ~150 FPS in games, so it's extra ironic that pumping the refresh rate of the screen itself didn't do much for me.
I'll do you one, no, two better: my computer's from 2012. I can even play modern games on high settings sometimes. It wasn't even a highly specced one at the time. I think I put about $1200 into the actual components AND the monitor/keyboard.
My 2013 librebooted T440p ThinkPad
Says hold my beer.
Browses the web like it's a 2025 desktop.
It's amazing.
Except for the compile times (it runs Gentoo :D)
My current PC is an Asus ROG with a GTX 1070 (and a piece-of-shit screen that gets all fucky if it heats up) that I bought used back in late 2019. The old hard drive failed some time ago and I had to replace it. Sometimes the main SSD seems to get strangely fucky too (BSODs followed by disk scans), as does the memory (BSODs about "corrupted_page_memory", plus complete freezes under Linux Mint where not even Ctrl+Alt+F1 worked), which makes me think the components aren't exactly high quality (considering how shitty the screen is, and Asus in general these past years, that's no surprise).
Still, I fully intend to keep this bad boy as my main workhorse for at least another 2 years, possibly longer. After that, I'll probably relegate it to being the party game machine.
I could say I still run my 2014 (or 15, I don't remember) PC, but it's Ship of Theseus'd at this point, the only OG parts left are the CPU, PSU, case, and mobo.
My PC is still largely the same, in general spirit, as when I built it (c. 2014-2015), but I have had to upgrade some key components over time. First was the move from a 1TB WD Blue HDD to a Samsung 860 Pro 128GB SSD for my OS drive, and, related to that, soon after I moved my games drive from an HDD to an SSD as well. Next, I upgraded my GPU from an Nvidia GeForce GTX 760 to an Nvidia GeForce GTX 1080. That build state lasted a decently long time, until I switched from Windows to Linux and swapped my Nvidia GPU for an AMD Radeon RX 6600 (not exactly an upgrade, more of a side-grade) to improve the user experience. The most recent change (last year, iirc?) was upgrading my RAM from 8GB of DDR3 to 16GB of DDR3. My CPU (Intel Core i5-4690K) is really starting to show its age, though, so I've been wanting to upgrade it, but that would entail a near rebuild of my entire system, so I've been avoiding it. Unfortunately, it's increasingly becoming more of an issue.
I always keep my PCs for about 8 years. Usually it's only necessary to update the HDD/SSD and the GPU during that time. Mine will be 4 years old by the end of this year, and I am now actively checking out 4TB SSDs to replace my current 1TB SSD.
Unfortunately, this strategy may stop working. With the advent of ARM in desktop PCs, machines seem to be becoming more monolithic: RAM and GPU not swappable, and I think Macs don't even allow you to plop in more RAM. I don't like this development.
Same, same. Except I don't really play games, but use the computer for other hobbies. It's still plenty fast and does everything I need it to do. So why buy something that does exactly the same, just is newer and looks different?
When talking about hardware: if it works for you, keep using it till it doesn't. But when talking about desktop operating systems, you should be aware of when yours loses security-update support, and try to upgrade to a different one that works for you but still gets security updates.