"Easy fix with my massive unparalleled intellect. Just turn off the sensor"
If you needed any more proof that Nvidia is continuing to enshittify their monopoly and milk consumers: hey, let's remove one of the critical things that lets you diagnose a bad card and catch bad situations that might put the GPU at death's door! Don't need that shit. Just buy new ones every 2 years, you poors!
If you buy an Nvidia GPU, you are part of the problem here.
The only good thing about Nvidia this generation is that their prices on the low-end cards were lower than expected, forcing AMD to cut the lunatic pricing of their 9070 and other cards in half.
Surely if the card is damaged due to overheating, the customer won't be blamed since they can't keep track of the hottest part of the card, right? Right?
Yeah NVIDIA is a bullshit company and has been for a while.
AMD and Intel need to get their raytracing game up so they become real competitors for NVIDIA, especially now that there are more games that require raytracing.
This is incorrect. The new Indiana Jones game requires raytracing, as does the upcoming Doom game. Whether you like it or not, traditional rasterized graphics are starting to be phased out, at least across the AAA gaming space. The theoretical reduction in developer workload makes it pretty much an inevitability at this point, once workflows and optimizations are figured out. Though I doubt rasterized graphics will ever completely go away, much like how pixel art games are very much still a thing decades after becoming obsolete.
I've never bought Nvidia, but they're becoming more like Apple every day. Why be consumer friendly for niche PC builders? The average gamer already associates Nvidia with performance, so it's time to rely on good ol' brand loyalty!
The problem is, it's not just an association. NVIDIA cards are the fastest cards, hands down. I wish Intel and AMD would provide competition on the high end, but they just don't.
Even worse, the best next-gen AMD GPU won't even beat AMD's best last-gen GPU, and they say this themselves.
To me, buying Nvidia for performance is like buying an APC as a daily driver for work because of its safety rating. The long-term cost does not at all seem worth it.
The drop in clocks in certain situations, which a lot of outlets are "conveniently" attributing to CPU limitations, has all the hallmarks of throttling... It's hard to criticise the incumbent monopoly holder when they have a history of blacklisting outlets that espouse consumer advocacy.
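For anyone who wants to check this themselves, here's a rough, untested sketch using the public NVML API that ships with the driver (GPU index 0 and the -lnvidia-ml link flag are assumptions on my part; adjust for your setup). It polls the graphics clock alongside NVML's throttle-reason bitmask: if the clock drops while a thermal slowdown bit is set, that's throttling, not a CPU limit.

```c
// Rough sketch: poll the graphics clock and NVML's throttle-reason bitmask.
// Build (Linux): gcc throttle_check.c -o throttle_check -lnvidia-ml
#include <stdio.h>
#include <nvml.h>

int main(void) {
    nvmlReturn_t rc = nvmlInit_v2();
    if (rc != NVML_SUCCESS) {
        fprintf(stderr, "nvmlInit failed: %s\n", nvmlErrorString(rc));
        return 1;
    }

    nvmlDevice_t dev;
    rc = nvmlDeviceGetHandleByIndex_v2(0, &dev); /* GPU 0 assumed */
    if (rc != NVML_SUCCESS) {
        fprintf(stderr, "no handle for GPU 0: %s\n", nvmlErrorString(rc));
        nvmlShutdown();
        return 1;
    }

    unsigned int clockMHz = 0;
    unsigned long long reasons = 0;
    nvmlDeviceGetClockInfo(dev, NVML_CLOCK_GRAPHICS, &clockMHz);
    nvmlDeviceGetCurrentClocksThrottleReasons(dev, &reasons);

    printf("graphics clock: %u MHz\n", clockMHz);
    if (reasons & nvmlClocksThrottleReasonSwThermalSlowdown)
        printf("throttling: software thermal slowdown\n");
    if (reasons & nvmlClocksThrottleReasonHwThermalSlowdown)
        printf("throttling: hardware thermal slowdown (severe)\n");
    if (reasons & nvmlClocksThrottleReasonSwPowerCap)
        printf("throttling: power cap\n");
    if (reasons == nvmlClocksThrottleReasonNone)
        printf("no throttle reasons set -- a clock drop here points elsewhere\n");

    nvmlShutdown();
    return 0;
}
```

Run it in a loop while benchmarking; the bitmask doesn't need the hotspot readout, so it should still work even with the sensor hidden.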
I wonder if there was some other reason for this removal; e.g., I could imagine some change in this generation that made the hotspot sensor redundant.
But yeah, it's far more likely to be for the reasons you outlined. Absolutely diabolical.
Unlikely, as the hotspot sensors/detection logic is baked into the chip silicon and its microcode; AIBs can only change the PCB around the die. I'd almost guarantee the thermal sensors are still present to avoid fires, but if Nvidia has turned off external reporting outside the chip itself (beyond telling the driver that the thermal limit has been reached), I doubt AIBs are going to be able to crack it either.
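To be clear about what "external reporting" is left: as far as I know, the public NVML API only ever exposed the single NVML_TEMPERATURE_GPU (core/edge) sensor, and hotspot was read through undocumented NVAPI calls, which is the readout that has reportedly gone away. A minimal sketch of what you can still query (same assumptions as above about the link flag and device index):

```c
// Minimal sketch: read the only GPU temperature NVML exposes publicly.
// Hotspot was never part of this enum; tools like HWiNFO read it through
// undocumented NVAPI calls -- the path that has reportedly been cut off.
// Build (Linux): gcc temp_check.c -o temp_check -lnvidia-ml
#include <stdio.h>
#include <nvml.h>

int main(void) {
    if (nvmlInit_v2() != NVML_SUCCESS)
        return 1;

    nvmlDevice_t dev;
    if (nvmlDeviceGetHandleByIndex_v2(0, &dev) == NVML_SUCCESS) { /* GPU 0 assumed */
        unsigned int tempC = 0;
        if (nvmlDeviceGetTemperature(dev, NVML_TEMPERATURE_GPU, &tempC) == NVML_SUCCESS)
            printf("GPU core/edge temp: %u C (hotspot: not available here)\n", tempC);
    }

    nvmlShutdown();
    return 0;
}
```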
Also, the way Nvidia operates, if an AIB deviates from Nvidia's mandatory process, they'll get blackballed and put out of business. So they won't. Daddy Jensen knows best!