What's your "this is totally fine and I'm going to have a great time" FPS (refresh rate). ?
For me, anything 25 FPS or higher is 100% fine and I'll be enjoying my time. I never play competitive online shooters, though; it's all single-player games like GOW and the like. I game on a 60 Hz 4K monitor. The GPU is an AMD RX 6600 alongside a Ryzen 7 5700G and 32 GB of RAM. My games are set to medium most of the time at 4K, with demanding titles on low. Surprisingly, GOW and GOW Ragnarok are both set to ultra and I still get around 40ish FPS.
Depends on the game. If it's not really demanding on reaction time and the framerate is locked, I'm fine with 30, like Okami. However, if the framerate isn't locked and I still can't hit at least 60 FPS on my 1440p monitor, I'd probably just play something else (because I know I could have a better experience if I could run it properly).
However, for shooters and reaction-heavy games I always aim to max out my 144 Hz monitor; even 60 FPS can feel sluggish to me.
Well, I first played Dragon Age Origins with the framerate fluctuating between 10 and 20 FPS. It wasn't the most fun I've ever had, but ever since, 30-60 has felt like luxury. So yeah, anywhere from 10 to 30 is fine for me, but the more active a game is, the closer I want a 30 minimum with a target of 60.
I have a very simple process for dealing with all of this - I never check my framerate in the first place, so I never know what it is.
I just play games. If there's noticeable stuttering or lag, then maybe I try to do something about it, and if there's not, then I just play and don't worry about it.
That's actually a good way of doing it. I used to be this way, but I don't know how or why I started using Steam's built-in FPS counter and MangoHud. I'm going to stop using them so I don't have to keep glancing at it all the time. Thank you.
It's not that I notice it more when I have a framerate counter turned on; I'm just no longer questioning how bad or how frequent the drops are when I have it enabled.
I'm old enough that I remember when 28 FPS @ 320x200 was considered a target, and my vision isn't as hot as it used to be. So long as I'm not noticing any obvious issues, I don't really care enough to check.
I think I'm a bit spoiled with my 144 Hz monitor; anything below maybe 120 FPS starts to bug me. Thankfully my PC is pretty powerful and I don't really play graphics-heavy games (mostly just Minecraft) so my framerate is usually quite stable.
I know OP has a point that they weren't asking for your opinion on games, but I really like your stance of demanding performance from the game devs, especially on older hardware. There's a culture of "must have the newest hardware to run everything maxed" that's just dumb consumerism.
...back in the CRT era I needed at least a 72 Hz refresh rate to not feel any discomfort; that doesn't exactly correlate with framerates on modern LCD displays, but I think it's a good proxy for the threshold of general perceptibility...
...are greater framerates smoother? ...sure, especially in my peripheral vision, but 72 FPS is generally good enough, beyond which returns start diminishing...
There are a lot of games at 30 I've played through just fine, but for FPS games that extra 10-15 is about my minimum unless it's on console with aim assist. I grew up playing Saints Row 2 at single-digit framerates, but I just can't do that anymore.
25FPS and 480 pixels vertically is enough for me to get sucked in and forget the world around me.
Which is nice cause that way I can play open world RPGs like Kingdom Come on an old laptop.
40 is fine, and I can go lower depending on how badly I want the experience. I grew up relatively poor; I am not going to completely pass up an experience I'm looking forward to over a lower framerate.
Anything realtime needs to be at least 60 FPS, and the closer to my monitor's 144 Hz the better. Something like a city builder, turn-based strategy, or non-time-critical relaxed co-op stuff is fine at 30+.
I'd never want to play any shooter at lower than 60, no RTS, no racing game and so on.
My monitor dynamically adjusts its refresh rate to match what my GPU is spitting out, within reason. Anything above 40ish is fine, though competitive stuff does benefit from more. Below that, even if my monitor is matching frame for frame, I definitely notice.
For me, it highly depends. Turn-based strategy games, I can easily play at a much lower framerate (30 is fine tbh though I always appreciate more). FPS-style games? 60 is a bare minimum, but 100+ is what I would consider to be enjoyable.
25 and above with drops is fine for me. I grew up with an ATI card in a low-end machine, so if the stupid game runs, I'm happy.
I don't understand the stupid "4K 250 Hz perfect-black OLED or it's shit for stupid people" attitude.
As long as my 1080p doesn't ghost, it's fine.
T. Made art for many games some of you have played.
Competitive FPS/action games I want at 120, story games with FPS gameplay at 60, and anything turn-based or slow-paced is probably fine at 30 or 40. It also depends on a lot of other factors. On my handheld (Steam Deck-like) I aim for 30 or 40, but on my main PC I always shoot for 60 or higher.
That and I usually tune my settings so I get a bit more than 60, then lock the framerate to reduce stutter.
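For anyone curious, one way to do that kind of cap on Linux is MangoHud's built-in limiter. A minimal sketch of ~/.config/MangoHud/MangoHud.conf (the 60 is just an example target; set it to whatever your rig sustains):

# cap slightly below what the machine can hold to even out frame pacing
fps_limit=60
# keep the limiter active but hide the overlay itself
no_display=1

Then launch the game through it, e.g. Steam launch options: mangohud %command%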
If it's a fast-paced action game, 60 is a must. If it's turn-based, or otherwise just slow enough to not matter, I'll sometimes accept a stable 30 - but only if it's truly stable; any dips below that are not okay.
Highly depends on the type of game. For first-person shooters, 120+ FPS is a must. I skipped the more recent CoDs because I couldn't get them to run at that target consistently enough on my PC without turning them into a blurry DLSS smear.
For racing games, where motion is typically going in one direction with only smooth direction changes, a lower framerate is fine (like 60 to 80), although the added smoothness from a high framerate is obviously still nice.
For slower-paced or turn-based games I'm fine going as low as 40 FPS, as long as it's consistent, without drops or frame-pacing issues.
When I play it's usually solo games, and I've never had an issue with 20+ FPS. If performance drops below that, I'm visually OK with 16 FPS, but usually in that range my system is struggling with the game mechanics themselves, and that's the deal breaker for me.
I feel like 20 FPS would be OK for me if I had absolutely no way to get at least 25. But 15? 16? That's very jittery. I remember that happening in Alan Wake 2, and it was playable, but to be honest I was kind of annoyed by it.
I can comfortably play some games down to 12fps ±3ish, if it isn't something that's fast paced.
I have yet to play anything where I'm skilled enough for higher than 30fps to matter response-wise, and while I can notice the difference between 60fps and 240fps on my monitor, I gotta say it doesn't do much for me.
Maybe I just don't know what to look for, what I'm missing, or how to set up my laptop right, but who knows. My eyes could be stuck on 720p for all I know.
I play a ton of simulation and strategy games (and some that I would hazard to classify as virtual railfanning/model railroading, like Railroads Online and Transport Fever 2) so I crank up the prettiness, download as much custom content as will load and enjoy the scenery at 20-40 FPS
I too grew up on machines that were mid-low range and was constantly asking more of them than they could handle, so I learned to stomach pretty miserable FPS. In the end though it's highly context sensitive - the less movement (and in particular camera movement) the game has the lower the frame rate you can get away with.
As a general rule I would say 25 FPS is the absolute lower limit, but around 40 is probably more in line with your "this is fine and I'm going to have a great time" definition. However, for something like a fast paced shooter it's more like 60 FPS minimum.
Around maybe 40 or so I start to notice it. 50 and higher I'm content. My monitor only supports 60 Hz. Around 20 or less I'm annoyed. It's tolerable for turn based games though. Not enjoyable, just tolerable.
My target is 60, but depending on the game I find framerates down to 20 technically playable (if stable), though I need a bit of time to get used to it.
For framerates above 60, however, I can't really feel any difference, so I usually set a cap at 60 to reduce heat, and because the onboard sound card is poorly isolated and picks up noise from the GPU.
If it's not 60 or higher, I can't stand it. But it has to be consistent: constant fluctuations between 120 and 140 are felt even if not necessarily seen. I generally just try to get 60 since my display is 60 Hz. What's annoying is that I could be doing 1440p at 60 with my specs, but for some reason setting the display to that specific resolution locks it to 30 Hz.
The display is 4K and has 60 Hz available at 4K and every other resolution. My PC can't handle 4K @ 60 for most things, though.
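If the 60 Hz mode just isn't being advertised at 1440p, forcing a custom mode sometimes works around it. A sketch, assuming Linux with X11, and assuming HDMI-1 is the output name (check xrandr's own listing for the real one):

cvt 2560 1440 60
# cvt prints a CVT modeline like the one below; register it and attach it to the output
xrandr --newmode "2560x1440_60.00" 312.25 2560 2752 3024 3488 1440 1443 1448 1493 -hsync +vsync
xrandr --addmode HDMI-1 2560x1440_60.00
xrandr --output HDMI-1 --mode 2560x1440_60.00

No guarantee the display accepts it over that cable, but it's a cheap thing to try.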
Maybe this has changed since I've upgraded my gaming specs, but I used to average 14 FPS in Kerbal Space Program and had a great time with it; docking is a nightmare at that framerate, but otherwise it's more than playable.
Back in my poverty gaming days I 100%-ed a pirated The Simpsons Hit and Run with potato graphics at slideshow speeds; I'm talking multiple seconds per frame, with around 80% frame droppage.
Nowadays I just care that it looks decent and runs smoothly for the games I play, which is mostly Civilization and Stellaris
Anything VR really needs to be 90 or more, but around 60 is good for most things.
I actually think the choppy framerate in Cyberpunk is really immersive, so it's cool all the way down to 30, or with the smearing of DLSS Performance; but most games don't give you progressive brain damage in the first 2 hours like it does.
Weirdly enough, I actually care more about framerate on "pancake" (non-vr) games than I do on VR games. I can deal with 10fps in vrchat in a crowded instance. I need more like 20~30 for non-vr games.
That said, I get mentally exhausted when the framerate is <30 for an extended period of time in VRChat.
There's a reason I only upgraded to a 2K monitor and not 4K: I'm not willing to sacrifice that much performance just to play at a higher resolution. 25 FPS is way too low for me.
108 fps is what I play Fallout New Vegas at (to avoid physics behaving too weirdly) and I think that's fine. I think I've gone down to 90 and been somewhat ok with that, but anything below that is no bueno.
Non-fps games I'll cap lower, like 72 fps for a civilization game is perfectly fine.
But if you want beautiful games like God of War (or do you mean gears of war?) and are fine with a lower framerate, that makes sense to me.
I like how we humans have totally different likes and dislikes. I 100% understand you and will never judge you. You like what you like, and that's very good. I mean, God of War, yes, it's freaking gorgeous.
Maybe it's because I grew up with 60hz CRT monitors in the 90s, the ones that'd give you a headache if you sat in front of them for too long 😅 Or maybe you just get so used to 144 fps once you make the switch that it's impossible to go back.
GOW running at 40ish FPS even at ultra, as you say, must mean they cared about making a well-optimized game. I ought to give it a go just for the "Boy" meme.
I played BG3 at less than 15fps for a while, but upgraded my PC when the video card crashed on about half of the cutscenes and whenever fireworks were used at close range
I don't really obsess about framerates myself and I've never had the kind of budget to have the latest and greatest parts but from what I've seen, somewhere around 30fps is fine.
And even though you didn't ask, the last setting that I ever sacrifice is draw distance. I'll turn down textures and shadows and reflections and everything else before I sacrifice draw distance. I don't need realistic graphics to be able to immerse myself and have a good time. But things popping in and out of existence in front of your eyes are the ultimate immersion breaker for me.
30 is acceptable for most games but stuff where the gameplay is mainly the movement itself (platformer, racing, first person shooter) needs to hit 60.
I could go lower than 30 for the visuals on a lot of games but that’s the threshold where the interface starts feeling unresponsive and that really gets to me.
My personal minimum is a stable 40/s, which is roughly where I start noticing the lower framerate without paying attention to it.
With 30/s I need time to get used to it, and I usually underclock (or, rather, power-limit) my GPU to hit an average of 50, unless the game in question is either highly unstable (e.g. Helldivers 2) or so light I don't have to care (e.g. Selaco).
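The power-limit part is a one-liner. A sketch assuming an NVIDIA card on Linux with nvidia-smi available (150 W is just an example value; stay inside the range the query reports):

# check the supported power range first
nvidia-smi -q -d POWER
# then set a lower cap (resets on reboot unless reapplied)
sudo nvidia-smi -pl 150

AMD exposes something similar on Linux through the amdgpu hwmon power cap.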
That made me laugh 😂. Simple is always good, as long as you're having fun. A 3070 would definitely manage even 4K at 30 FPS on medium or low. My RX 6600 does that easily in pretty much every game.
Back in the days of CRT displays I had a 120 Hz Trinitron, paired with a video card and 3D goggles (which shuttered each eye in turn) to give 60 Hz per eye.
No way could that system or video card keep up with anything more modern than Turok 1, but it was nice for the couple of years it was good enough.
I wish I still had that Trinitron; I'd need a deeper desk though.
@psud I do like my displays because they're so old now and still working. One of them is so old that the most modern connection it has is DVI; to get it working I had to buy an adapter for my graphics card.
Because I have two displays, their colors are so different that if I move a window between them so it shows on both, I can see at a glance how far off the colors are on each.
They also have ghosting issues with fast movement. I still like them.
I started playing on a PC in the 90s so as long as it’s above 40 with consistent frame pacing it’s fine. Those VRR displays and games targeting 40 are a game changer for me and why I play on Xbox with a modern LG OLED.
For shooters, especially competitive ones, as high as possible up to my monitor's refresh rate (165Hz). Everything else 60 FPS is fine. Even 30 FPS can be fine, especially if I'm playing something on Switch.
i am 100% with you. there must be something to it if it's that important to so many people but i genuinely can't tell the difference as long as it's stable
and if it does make a difference, for competitive games wouldn't you want it to be consistent between all players instead of "better" based on whoever has more horsepower? it all makes no sense to me
@penquin oh! Well, in that case: I used to be a 1080p 60 Hz monitor kinda guy, and about a year ago I had to upgrade to dual 1440p 165 Hz monitors.
While I can definitely feel the difference, a drop to 60 FPS is barely noticeable, and even 30 FPS is acceptable.
I grew up with slower machines, so sub-30 was fairly normal; even older consoles targeted 30 and faltered below that. At this point I'll take anything above what's acceptable for film (24 FPS).