Valve launches Proton 9.0 with improved game compatibility and more - KitGuru

www.kitguru.net
You're viewing a single thread.

38 comments
  • From the article ...

    Older PC games have also received some love, as the new version addresses the issue of playing these games on high-core count CPUs. Proton reduces the number of CPU cores observed by games such as Far Cry 2 and 4, The Witcher 2: Assassins of Kings Enhanced Edition, Lara Croft and the Guardian of Light, Warhammer 40,000: Space Marine, Dawn of War II, Dawn of War II—Chaos Rising, Dawn of War II—Retribution, Outcast—Second Contact, and Prototype, allowing them to run more smoothly.

    Anyone know of any details as to why this becomes an issue? Why do many cores cause older games to not work properly, requiring Proton to hide the extra cores from them?

    Anti Commercial-AI license (CC BY-NC-SA 4.0)

    • I am not aware of any games having a problem with too many cores*. But most of those (from memory) seem like peak Pentium era games. For the sake of this explanation I will only focus on Intel because AMD was kind of a dumpster fire for the pertinent parts of this.

      Up until probably the late 00s/early 10s, the basic idea was that a computer processor should be really really fast and powerful. Intel's Pentium line was basically the peak of this for consumers. One core with little to no threading but holy crap was it fast and had a lot of nice architectural features to make it faster. But once we hit the 4 GHz clock speed range, the technology required to go considerably faster started to get really messy (and started having to care about fundamental laws of physics...). And it was around this time that we started to see the rise of the "Core" line of processors. The idea being that rather than have one really powerful processor you would have 2 or 4 or 8 "kind of powerful" processors. Think "i4" as it were. And now we are at the point where we have a bunch of really powerful processors and life is great.

      But the problem is that games (and most software outside of HPC) were very much written for those single powerful cores. So if Dawn of War ran well on a chonky 4 GHz Pentium, it didn't have the logic to split that load across two or three cores of a 3 GHz i4. So you were effectively taking a game meant to run on one powerful CPU core and putting it on one weaker CPU core that also may have lower bandwidth to memory or be missing instructions that helped speed things up.

      To put it in video game (so, really, gun) terms: it is the difference between firing a high-powered DMR and switching to a machine gun but still treating it like it is semi-automatic.

      But the nice thing is that compatibility layers (whether it is settings in Windows or funkiness with wine/proton) can increasingly use common tricks to make a few threads of your latest AMD chip behave like a pretty chonky Pentium processor.
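      For the curious, one concrete knob here is the WINE_CPU_TOPOLOGY environment variable documented in Proton's README, which limits how many logical CPUs a game is allowed to see. Whether Proton 9.0 uses exactly this mechanism under the hood for these titles is my assumption, but as a user-side sketch it would look like this in a game's Steam launch options:

      ```shell
      # Steam → right-click the game → Properties → Launch Options.
      # Report only 4 logical CPUs (cores 0-3) to the game:
      WINE_CPU_TOPOLOGY=4:0,1,2,3 %command%
      ```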

      *: Speculation as I am not aware of any games that did this but I have seen a lot of code that did it. A fundamental concept in parallel/multithreaded programming is the "parallel for". Let's say you have ten screws to tighten on your IKEA furniture. The serial version of that is that you tighten each one, in order. The parallel version is that you have a second Allen key and tell your buddy to do the five on that side while you do the five on this side. But a lot of junior programmers won't constrain that parallel for. So there might be ten screws to tighten... and they have a crew of thirty people fighting over who gets to hold the Allen key and who tightens what. So it ends up being a lot slower than if you just did it yourself.
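      The footnote's unconstrained parallel for can be sketched in Python (names and workloads are illustrative, not from any actual game):

      ```python
      from concurrent.futures import ThreadPoolExecutor

      SCREWS = list(range(10))  # ten tiny tasks, like the IKEA screws

      def tighten(screw):
          # a trivially small unit of work
          return screw * 2

      # Serial version: tighten each screw yourself, in order.
      serial = [tighten(s) for s in SCREWS]

      # Unconstrained "parallel for": a crew of thirty fighting over ten screws.
      # Spinning up and scheduling the pool can cost more than the work itself.
      with ThreadPoolExecutor(max_workers=30) as pool:
          oversubscribed = list(pool.map(tighten, SCREWS))

      # Constrained version: size the crew to the job (or to the machine).
      with ThreadPoolExecutor(max_workers=2) as pool:
          constrained = list(pool.map(tighten, SCREWS))

      # All three produce the same result; only the overhead differs.
      assert serial == oversubscribed == constrained
      ```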

    • Doesn't The Witcher 2 support Linux natively anyway?
