Which consoles? Consoles just use AMD APUs, which have the exact same hardware features as AMD's current desktop CPUs and GPUs. UE5 games run like crap on consoles too.
My comment on a different post relates to this well:
I think a lot of the plugin tooling for Unreal promotes bad practices in asset management, GPU optimization, and memory management. What I'm trying to say is that it allows shitty/lazy developers and asset designers to rely on overpriced hardware to lift their unoptimized dogshit code, blueprints, models, and textures to acceptable modern fps/playability standards. This has been prevalent for a few years, but it's especially egregious now. Young designers with polygon and vertex counts that are out of control. Extraneous surfaces and naked edges. Uncompressed audio. Unbaked lighting systems. Memory leaks.
In my personal experience experimenting with Unreal, the priorities match those of my day job developing DNNs and parametric CAD modelling applications: effective resource, memory, and parallelism management from the outset of a project is (or should be) axiomatic.
I think Unreal 5 runs exceptionally well when that's the case. A lot of the time, you can turn off all of the extra hardware acceleration and frame-generation AI crap if your logic systems and assets are designed well.
I know this is a bit of an "old man yells at cloud" rant, but if you code and make models like ass, of course your game is gonna turn out like ass. And then they turn around and say "tHe EnGiNe SuCkS".
There are a lot of bold promises in Unreal Engine 5 advertisements that get taken up by publishers and producers - and then baked into the game budgets...
And then, near the end of the project, when it turns out that performance isn't good because the advertised promises were a bit too bold, there's no money left for optimization...
Nothing annoys me more as a AAA dev than weird, annoying armchair-critic nerd-ass dweebs thinking we're "lazy". Okay, you try this shit, then. I have the UE5 editor open right now, and I'm busy squashing bugs and optimizing our game. I've been doing it for 18 hours a day for months, sometimes 7 days a week. This stuff is way harder than you think, even with the proper amount of time, money, staffing, and expertise.
Hellblade 2 looked and ran great on my Steam Deck. Fortnite must be running great or millions of kids would complain. So I posit that it is not the engine's fault.
Yep, developers will optimize their game for their own wants and needs. It's not the engine's fault, it's the developers' fault for using techniques that don't perform well on current hardware. Unreal 5 could run well at launch on then-current technology.