I think there's also a "Netflix effect" where old games are increasingly accessible as an alternative to newer crap, kinda like (from my personal observations) how a lot of young people seem to be really fluent in old movies and TV due to streaming and YT.
I think indies helped younger and older gamers alike become less impressed by graphics than in the past. Gamers expect more, and there are plenty of indies and old games people haven't played.
I was playing the original not too long ago, and they freaking "updated" it with AI upscaling. It looks like absolute trash. I couldn't figure out how to get rid of it.
Why would you do that and also remaster it?
And the requirements for those minimal improvements are vast. If you need to pull down 200 GB for a minor graphical upgrade, that's just not worth it compared to an older game that looks a bit worse but is smaller and runs better on newer hardware.
I would even go lower than that. If you showed me Prey (2017) or even Alien: Isolation (2014) and told me they came out today, I would probably believe you.
Which is why they were ignored for a decade or two, despite game engines easily achieving those FPS numbers, and were pulled out exactly when hardware vendors ran out of other arguments to convince people to replace their existing screens and GPUs?
Maybe you're only seeing the marketing now, probably because the customer base that cares about it is finally large enough thanks to the business around eSports, but higher frame rates give you better response times, and we've known this for a very long time. In my world, fighting games, the games usually only draw at 60 FPS, but they can run in a 120 Hz or 144 Hz mode so they can poll for inputs more frequently, which makes them feel better to play. Resolution ought to have a tangible impact in FPS games as well.
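To illustrate what "draw at 60 FPS but poll at 120 Hz" could look like, here's a minimal sketch of a fixed-timestep loop that samples input every tick but only renders every other tick. This isn't how any particular fighting game actually does it; `poll_input` and `draw` are hypothetical placeholders standing in for the real input and render calls.

```python
# Sketch: decouple input polling (120 Hz) from rendering (60 FPS).
# poll_input() and draw() are hypothetical placeholders, not a real game API.
import time

POLL_HZ = 120            # how often we sample the controller
POLL_DT = 1.0 / POLL_HZ
POLLS_PER_DRAW = 2       # render on every 2nd poll tick -> 60 FPS

def poll_input(tick):
    """Placeholder: read controller/keyboard state for this tick."""
    return f"input at tick {tick}"

def draw(frame):
    """Placeholder: render one frame."""
    pass

def run(seconds=1.0):
    start = time.monotonic()
    next_tick = start
    tick = frame = 0
    while time.monotonic() - start < seconds:
        # Sleep until the next 120 Hz tick boundary.
        delay = next_tick - time.monotonic()
        if delay > 0:
            time.sleep(delay)
        poll_input(tick)                  # inputs sampled every 1/120 s
        if tick % POLLS_PER_DRAW == 0:    # but we only draw every other tick
            draw(frame)
            frame += 1
        tick += 1
        next_tick += POLL_DT
    print(f"polled {tick} times, drew {frame} frames in ~{seconds:.1f}s")

if __name__ == "__main__":
    run()
```

The point is that input latency is bounded by the polling interval, not the render rate, so doubling the poll rate halves the worst-case time between pressing a button and the game noticing it, even if the picture still updates at 60 FPS.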