Diablo IV Beta Reportedly Causing Issues with GeForce RTX 3080 Ti, Potentially Leading to Card Failure

It would just mask failure caused by faulty hardware.
aufkrawall2:

It would just mask failure caused by faulty hardware.
Okay, maybe I'm giving the manufacturers ideas here so they can keep using cheap components, but my point still stands.
Paying the actual price of a GPU should make high-quality components mandatory, not cheap ones. I can accept it on the cheaper models. Take Asus as an example (the equivalent applies to other brands): the Dual model is lower quality, and cheaper components, a simpler power delivery, and a simpler cooler are acceptable there, along with lower clocks. But if I'm going to buy a TUF or a Strix, I expect the best cooling and components; that's why I pay the Asus tax.
Thrasher1984:

Hot take (pun intended): Manufacturers should introduce a hardware limit of 500 FPS via BIOS. It would prevent a lot of clueless people from BBQing their cards or unnecessarily RMAing because "coil whine". As a result, a lot of energy and resources could be saved. Also, the GPU BIOS modding community would get a popularity boost. Everyone wins!
Such a safeguard limit should be implemented in the GPU's firmware, not in drivers by users or in apps by devs. 2-3K FPS should be enough; 500 is bad for development purposes.
Alessio1989:

Such a safeguard limit should be implemented in the GPU's firmware, not in drivers by users or in apps by devs. 2-3K FPS should be enough; 500 is bad for development purposes.
@Thrasher1984's idea is not bad, though I would say there is an easier solution: limit the FPS to 500 by default in the drivers, and whoever wants it disabled... they can disable it!
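For illustration, a driver- or app-level cap like the one suggested above boils down to a frame limiter: render a frame, then wait out the rest of the frame budget. Below is a minimal sketch in Python; the render_frame() callback and the 500 FPS default are assumptions for the example, and a real driver-level cap runs beneath the application rather than in it.

```python
import time

def run_with_fps_cap(render_frame, cap_fps=500):
    """Call render_frame() in a loop, never faster than cap_fps."""
    frame_budget = 1.0 / cap_fps  # 500 FPS -> a 2 ms budget per frame
    while True:
        start = time.perf_counter()
        render_frame()
        # Sleep off whatever is left of the budget. Real limiters
        # busy-wait the last fraction of a millisecond, because
        # time.sleep() is too coarse to hit a 2 ms target reliably.
        remaining = frame_budget - (time.perf_counter() - start)
        if remaining > 0:
            time.sleep(remaining)
```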
Pryme:

Paying the actual price of a GPU should make high-quality components mandatory, not cheap ones. I can accept it on the cheaper models. Take Asus as an example (the equivalent applies to other brands): the Dual model is lower quality, and cheaper components, a simpler power delivery, and a simpler cooler are acceptable there, along with lower clocks. But if I'm going to buy a TUF or a Strix, I expect the best cooling and components; that's why I pay the Asus tax.
On a 6400 XT, sure... But Asus saving a couple of bucks at best on components in a $700+ product is unacceptable... At the end of the day, just sell them for $702 😏 (the same goes for Gigabyte, MSI, or whatever manufacturer).
If the software is in fact bricking a GPU, I would without hesitation call that a defective GPU. It just so happens that this particular software was what brought the defect out and caused it to fail. As Solfaur mentions above, any software could cause a defective GPU to fault. It could be a failing PSU as well, I would guess.

I have to say I've never seen a defective GPU that just dies like that. Many years ago, I had a Voodoo 5.5k that I got to stop booting by fitting custom heatsinks on it that didn't quite fit...;) I fixed it, though, by putting it in the oven (I don't remember if I baked it or broiled it) for about 10 minutes, and voilà, it booted once again! I guess when the solder slightly liquefied, a contact I had somehow loosened on the PCB was remade. I never had another problem with it.

One other time, even before that, I bought an original TNT2 GPU that came clocked at 150 MHz, and I found that I could not overclock it even 1 MHz higher...;) At 151 MHz it crashed during boot every time, yet it would run all day at 150 MHz. That's the closest I've had to a GPU dying.
Neo Cyrus:

The cards blowing up is not a software issue; it never is. The hardware should be fine no matter what software is running, all D4 has done is expose design flaws in the hardware that managed to stay hidden until now.
Despite that, you can make most hardware fail with software... but that's really bad.
Astyanax:

no it doesn't
That was deep. Care to elaborate? In my opinion, if, for example, my 3080 runs at a max of 250 W instead of 330 W, then even if it's somehow extremely utilized by some software, it can't really be pushed to 100% (or "more"), so there's more headroom. I can't imagine a scenario where software destroys a card just by some unexpected "command" while keeping the card at low usage; this destruction happens through extreme utilization of some component. When all components are forced to work at low usage through undervolting, it prevents this from happening to some extent.
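The power-limit half of that argument (capping the board at 250 W instead of 330 W) can be expressed against NVIDIA's NVML API. Here is a hedged sketch using the pynvml bindings; this is a power cap rather than undervolting proper, it needs admin/root rights to apply, and note that the firmware still clamps any request to its own allowed range, which is the earlier point about safeguards belonging in firmware.

```python
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

    # Current limit and the range the firmware allows, in milliwatts.
    current_mw = pynvml.nvmlDeviceGetPowerManagementLimit(handle)
    min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
    print(f"limit: {current_mw / 1000:.0f} W "
          f"(allowed {min_mw / 1000:.0f}-{max_mw / 1000:.0f} W)")

    # Clamp the requested 250 W cap into the firmware-allowed range.
    target_mw = max(min_mw, min(250_000, max_mw))
    pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)
finally:
    pynvml.nvmlShutdown()
```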
I always set the Nvidia max FPS cap to 125 on a 120 Hz screen. It always plays great. I had a few older games where there was no cap, and some game menus would run at 999 FPS and the fans and coil whine would try to launch like a rocket.
If software can cause a condition where the card faults, it's almost always faulty components. It is possible that there is a condition where the firmware is at fault (say, not limiting voltage properly under certain load conditions), but it is very unlikely.
JiveTurkey:

I always set the Nvidia max FPS cap to 125 on a 120 Hz screen. It always plays great. I had a few older games where there was no cap, and some game menus would run at 999 FPS and the fans and coil whine would try to launch like a rocket.
If you're using an adaptive-sync panel, set the cap below your screen's max refresh; Blur Busters recommend going 3 below.
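As a worked version of that rule of thumb (the 3 FPS margin is the Blur Busters recommendation quoted above; the helper name is made up for illustration):

```python
def adaptive_sync_cap(max_refresh_hz, margin=3):
    """FPS cap that keeps frame delivery inside the adaptive-sync range."""
    return max_refresh_hz - margin

print(adaptive_sync_cap(120))  # 117, for the 120 Hz panel discussed above
print(adaptive_sync_cap(144))  # 141
```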
Thrasher1984:

Hot take (pun intended): Manufacturers should introduce a hardware limit of 500 FPS via BIOS. It would prevent a lot of clueless people from BBQing their cards or unnecessarily RMAing because "coil whine". As a result, a lot of energy and resources could be saved. Also, the GPU BIOS modding community would get a popularity boost. Everyone wins!
Goodbye, 3DMark Night Raid. You didn't think this idea through; this benchmark is more important than anything.
I can't change a single setting in my Radeon driver settings or my card crashes out completely, forcing my PC to reset and come back with the drivers completely missing, requiring me to use DDU and reinstall fresh drivers. This issue has persisted through at least 4 or 5 different drivers at this point, and my card is a 6750 XT. It's quite frustrating, but as long as I don't touch a single setting in the Radeon graphics settings, it's fine. The only things I'm currently able to use are custom resolution and refresh rate profiles. This isn't even in Diablo, btw; it's in any game I try.
Rich_Guy:

It's not just Nvidia's cards, as AMD's are blowing up on it too. 😛
My 6700XT handled it just fine. Framerates were a bit lower than I was expecting, but no hardware failures. Thermals were well within spec for the 4-5 hours I was playing.
Played on my Vega 56 card with the May 2022 drivers; temps were around 73°C all the time, sometimes around 69°C. There's some stuttering in the cutscenes and in the game itself, but besides that it was mostly 55 FPS, no problems. The storyline had me cracking up though: when I was drinking in the cabin and I fell out, and that killer came along and tried to murder me while I was sleeping lmao.
I played D4 yesterday for a few hours with my shitty Radeon 6700 XT (2700 MHz core, 2150 MHz memory) with no issues (luckily). But I was scared, so before playing the game I forced an FPS lock to my monitor's refresh rate (1440p, 75 Hz FreeSync) using RTSS. The game is great, and the graphics are excellent. The GPU was boosting from 500-2400 MHz, using 11 of the 12 GB of GPU memory available and 13 of 16 GB of RAM.
I was watching Sasha Grey play this on Twitch; it looked no different than Diablo 3.
The first thing I do after installing drivers is limit the global max FPS to my monitor's refresh rate minus 4.
Neo Cyrus:

If you're using an adaptive-sync panel, set the cap below your screen's max refresh; Blur Busters recommend going 3 below.
I have seen talk of that -3. G-Sync already caps at 118 when it's on. Just letting G-Sync max out has been the smoothest gameplay for me. I know there are layers to the way these things interact with each other, and suboptimal settings can cause stutter even while providing decent frames. I just haven't had time to dig in and test. I run a water-cooled 4090 and play mostly older games, so I'm usually capped out anyway. Still, the mess of overlapping settings is annoying.