This may be a shocker, but games with the scope of Cyberpunk 2077 take years of effort to make. We simply cannot pump them out as fast as consumers and shareholders demand their release.
Hello Games had a similar issue with No Man's Sky. Ubisoft also did with both Division games.
This revisionism is quite tiring but I guess that the development companies are counting on it.
The problem with Cyberpunk was not "just bugs" but a 40-minute video that told lots of lies and was later clearly stated to be "fake," made to drive up the hype.
What you see today was shown as if it were "ready" four years ago, and we still don't have the hacking shown in that video.
On top of that, there are all the design decisions that are simply terrible and that no amount of patching will fix, like the looter-shooter approach to loot, levels on enemies, and so on.
Overall, it's not a matter of "realistic expectations". We were lied to and that's just it.
One of the few games I don't regret buying before release was Baldur's Gate 3, but that's an anomaly. For most games, I'm happy to wait a year or more until they're in better shape.
With realistic expectations, the game has always been a good experience, imo of course. I did not follow any coverage of the game until after release, so I wasn’t sure what to expect. I’m not excusing their shortcomings, but I feel like the community leaned hard into the “bad game circlejerk” as soon as it came out. I played once at release and got the worst ending. After Edgerunners, I played it through three times, the last of which on very hard and with all the endings earned.
I enjoyed it! The 2.0 update is an interesting shakeup. I’m playing through a fourth time and having a good time.
Increasing complexity, tighter deadlines, demand for higher profit margins, a decline in education quality. There are a lot of reasons, and not all of them are necessarily bad. It's good that we can simulate what we can. I think the profit motive is just starting to show its ruinous power as shareholders demand more and more.
Unfortunately, it's here again with 2.0 so far. I started playing the game at 1.3, so this is the buggiest I've ever seen it. Vertex explosions, jumpy character animations, skills not working correctly, incorrect sound effects being played.
This is indeed the new normal, and I shouldn't expect Phantom Liberty to run smoothly next week either. It took months after the recent big Witcher 3 update for it to play okay on mid-spec systems.
I think I was happier when I was still catching up on games from a couple of generations ago. Now that I've done that, I keep running into this stuff. 😕
This is the new narrative for Cyberpunk 2077. I'm guessing CD Projekt greased some palms ahead of the new DLC release.
But make no mistake, and don't fall for it: Cyberpunk is still a wholly buggy and unfinished game with extremely janky mechanics that will never be patched out.
Only if you can overlook such issues, and I know from personal experience that some can, should you consider paying for the new DLC.
It was a very different experience for me. I had a blast playing this game when it first released and didn't hit any game-breaking moments. This could be due to me playing on PC? So with the latest patch I loaded up my V and found nothing of merit had changed. Seriously, I found it to be the same game with small UI and skill changes. I was shocked, to say the least, given the enormous patch size. I still haven't left training in NC, so I might find more changes, but so far it's still an enjoyable game.
It’s weird - when I played at launch, I had precisely one bug that impacted my gameplay. Other than that, the game ran pretty smooth and was a joy to play.
Now mind you, I was playing on a PC with a Xeon, 64GB of RAM, and an RTX 2080 Ti. Nothing ran badly on that system three years ago. Nowadays the older CPU, slower RAM, and admittedly older GPU without all the newest bells and whistles (DLSS Frame Gen, I’m looking at you) can’t quite measure up to the latest titles.
Cyberpunk, at launch, was great. For me. Specifically for me. I loved it and still do. But this article hits a point for me that I’ve been struggling to find reason to write about without feeling like I’m ignoring people who primarily play on consoles or can’t afford a nice PC. Regardless…
Man it fuckin’ sucks how you can spend a huge amount of money on a new GPU and then four months later a new one comes out that blows it out of the water. New hardware is so much better and - because all the game devs are using that hardware to design their games both on and for - systems like mine that are still fairly new can’t run the latest games at high settings anymore.
It used to be that if you ponied up the money for a high-end rig, you could expect decent performance for years to come. But I guess blowing a grand on a GPU these days just means you’ll be doing it again in a year or something, instead of the decade or so before.
I’m not saying my PC is bad. Most of what I play runs excellently. But when I spend a grand on just a GPU I expect that GPU to run the newest games at high settings for a long time. Jedi Survivor, Starfield, both run like crap on my system. Never mind the 2TB NVMe drive everything’s installed on.
Well you got here by waiting 3 years to play a game that was perfectly fine about a month post-launch.
As someone who played on launch, with less than "best" specs - it was fine. I had the odd T-posing NPC or texture flicker, but I had decent frames almost everywhere and had a blast with a great game.
Hard to disagree with the article; it seems safer to wait at least a few weeks or months to play a new game because there are often things to be fixed after launch. Many games have multiple ambitious and complex systems that need to be tuned post-launch. Combine this with the high expectations and hype that marketing teams foster, and you have a recipe for regret and disappointment with day-one experiences.
I first learned, with Oblivion 17 years ago, the wisdom of waiting until the bulk of the bug-squashing was done before expecting to play a reasonably stable game.
Granted that Cyberpunk 2077 was a particularly egregious example of the problem, but still...