I'm watching a streamer play the game, and from what I can see, it looks like I'd have some fun; others probably feel the same way.
I'm just not interested in playing at like 30fps on a 3080. Maybe some patches or driver updates will improve things, and then I'll check it out in the Steam Winter Sale or something.
Microsoft? Fix bugs? Are we thinking of the same Microsoft? The best they can do is integrate a weather app nobody asked for. And install Candy Crush without your consent.
Here are some numbers. I'm at 42 hours played. Resolution is 2K (1440p), settings are on High, on a Ryzen 5800X and a Radeon RX 6800 XT. The last session of ~5 hours had an average FPS of 108. Starfield is more optimized than BG3 and Remnant 2, at least for AMD. I had to lower a lot of Remnant 2 settings and it still averages around 55.
On my 3080 with a 5900X I'm consistently getting 60fps at 1080p (unfortunately that's the only screen I have for now), meanwhile BG3 would dip to the low 10s after a few minutes of playing, every time.
EDIT: I would also like to add that I didn't use DLSS or FSR in either game, since my hardware is more than capable of running both at maximum quality at 60fps, 1080p.
That's exactly what I have, but I play on 3840x1600, 24:10 Ultrawide.
I don't remember BG3 giving me any problems, even in Act 3 before the last patch, which supposedly addresses some performance issues. I loaded up a save just now and got ~50fps running around in the Lower City (a very short test, only like two minutes). That's with most settings maxed and DLSS Quality.
Depending on the area, I'd probably get similar numbers in Starfield (going by the benchmarks I've seen), but for me there's a difference between playing an FPS and an isometric RPG.
I heard this is because NVIDIA didn't fund optimizations. AMD did, so it's running a lot better for them. I said fuck it and bought an Xbox on a two-year payment plan with Game Pass included, 'cause there's no way my 1080 Ti is ever gonna play this game that well. End of an era.
My 980 is pulling 30fps with most things on medium and high, and shadows on low, since shadows have the biggest effect on performance.
Also, turn off resolution scaling, on Nvidia at least; I don't know about AMD cards.
With resolution scaling it doesn't matter whether you're using AMD or Nvidia; it does the same thing and looks the same on both vendors.
If your GPU supports it (RTX cards), you can mod DLSS into the game and (supposedly) get better image quality at the same scaling level as the unmodded FSR2, or lower the scaling even further for better performance while still getting an image comparable to a higher FSR2 preset.
I've also watched some streams, and the performance hasn't even been my biggest concern. I'm just... not interested? It hasn't been gripping me. Even though there are these shiny new things and bells and whistles, it still just looks like another Bethesda game to me, but with a blander setting this time. Though maybe it's more fun to play than watch. I just haven't really seen anything that makes me go "goddamn I gotta get a piece of that".
AMD folks are having a good time, but Nvidia folks will need to wait. The game is purposefully not optimized for Nvidia at the moment due to the AMD sponsorship. (Though it's worth pointing out that many AAA titles tend to be optimized for Nvidia but not AMD at launch.)
CPUs are a different story, though; there, AMD is much worse than Intel.
Simply untrue with recent AMD CPUs. Intel has a slight advantage, but it's not the blowout it used to be, and Intel loses entirely once power consumption and cost are taken into account.
But of course, games rely largely on GPU power, and the CPU concern is generally secondary.
In Starfield the 13900K is 20% faster than the best AMD offering, the 7800X3D. Even the 13600K is better than any AMD CPU, and a 13100 is on the same level as the 5800X3D. I wouldn't call that just a slight advantage.
It's only this game right now; that's why I'm saying something might be up.