AMD Talks Radeon RX 5700 Temps And Max Boost on Their Blog

AMD: we already set the shit on fire, no need to light it further!
Thanks HH. AMD did talk about GPU temperatures, but I wonder more about memory, because with a blower cooler the memory temperature sits very close to the GPU's, as they share the same cooling.
Fox2232:

Thanks HH. AMD did talk about GPU temperatures, but I wonder more about memory, because with a blower cooler the memory temperature sits very close to the GPU's, as they share the same cooling.
Memory is probably even hotter than the GPU itself. Isn't that the case on Polaris and Vega as well?
I remember the old Nvidia drivers (for my 7900, I think) stating a GPU slow-down temperature of 110 degrees C; you just weren't allowed to see the highest temperature. AMD lets you see the temperature instead of hiding it like Nvidia. Think about it: why does a 1080 Ti turn down its auto boost already at 50-55 degrees, when that is a very low temperature for a GPU?
Undying:

Memory is probably even hotter than the GPU itself. Isn't that the case on Polaris and Vega as well?
GN's review of the Sapphire Pulse showed that memory runs about 15 degrees hotter than the core, at least in a noise-normalized test (40 dBA). Don't know about Polaris, but on my Vega 56 HBM temps are usually within a few degrees of the core (I use the Vega 64 BIOS, so my HBM runs at 1000 MHz). But you can't really compare them, HBM being on the die and having lower power consumption than GDDR.
TLD LARS:

I remember the old Nvidia drivers (for my 7900, I think) stating a GPU slow-down temperature of 110 degrees C; you just weren't allowed to see the highest temperature. AMD lets you see the temperature instead of hiding it like Nvidia. Think about it: why does a 1080 Ti turn down its auto boost already at 50-55 degrees, when that is a very low temperature for a GPU?
Not sure about the accuracy of that, but can you provide links or show ANY scenario where performance is compromised on a 1080 Ti because of this? I smell fanboy BS.
Vananovion:

GN's review of the Sapphire Pulse showed that memory runs about 15 degrees hotter than the core, at least in a noise-normalized test (40 dBA). Don't know about Polaris, but on my Vega 56 HBM temps are usually within a few degrees of the core (I use the Vega 64 BIOS, so my HBM runs at 1000 MHz). But you can't really compare them, HBM being on the die and having lower power consumption than GDDR.
Polaris also runs hotter on the memory. For instance, my core tops out at 65°C and the VRAM is at 78°C.
Angushades:

frack off fanboi ur full of crap. Go back to brown-nosing companies.
Yeah, let's stay on topic and keep cool heads, please? No need for yet another thread full of crap to sift through. On topic: temps are directly related to fan speed, and noise. You want a cooler chip, you're going to have to accept a bit more noise. We are always complaining that the GPU is too hot, too noisy, too something "I" don't like... I've personally always tweaked the fan speed of my GPUs. They are always set for low noise, and I prefer a bit better base cooling.
alanm:

Not sure about the accuracy of that, but can you provide links or show ANY scenario where performance is compromised on a 1080ti because of this? I smell fanboy BS.
Angushades:

frack off fanboi ur full of crap. Go back to brown-nosing companies.
A 2080 Ti also starts to turn down its boost clock around 50-60 degrees, even though that is a very low temperature. Link: Gamers Nexus [youtube=PpDG13PrNPg] https://img.purch.com/2080-ti-noise-temp-voltage/o/aHR0cDovL21lZGlhLmJlc3RvZm1pY3JvLmNvbS9RL08vNzk4MDAwL29yaWdpbmFsLzAxLUNsb2NrLVJhdGUtR2FtaW5nLVRILnBuZw==
Caesar:

https://i.ibb.co/Qv4yVDW/xx.jpg
cryohellinc:

https://i.imgur.com/c4jt321.png
sverek:

AMD: we already set the shit on fire, no need to light it further!
Where are the junction temperatures for the Nvidia GPUs? Oh, I forgot, there aren't any exposed for the plebs to know the actual temperature. And you ignore the fact that, while Nvidia doesn't publish the internal junction temperature, when the core temp passes 32°C both Pascal and Turing cards drop clocks: 23 MHz per 10°C, until they hit 53°C, then the clocks drop much faster. Anyone who has watercooled Turing and Pascal can confirm it.
Fediuld:

Where are the junction temperatures for the Nvidia GPUs? Oh, I forgot, there aren't any exposed for the plebs to know the actual temperature. And you ignore the fact that, while Nvidia doesn't publish the internal junction temperature, when the core temp passes 32°C both Pascal and Turing cards drop clocks: 23 MHz per 10°C, until they hit 53°C, then the clocks drop much faster. Anyone who has watercooled Turing and Pascal can confirm it.
NVIDIA GPUs are designed to operate reliably up to their maximum specified operating temperature. This maximum temperature varies by GPU, but is generally in the 105C range (refer to the nvidia.com product page for individual GPU specifications).
https://nvidia.custhelp.com/app/answers/detail/a_id/2752/~/nvidia-gpu-maximum-operating-temperature-and-overheating Edit: For a GTX 1080 it's about 94°C: https://www.geforce.com/hardware/desktop-gpus/geforce-gtx-1080/specifications Maximum temperatures normally decrease as node changes bring efficiency gains.
No more cold winter nights: with AMD you get the full package!
Hilbert Hagedoorn:

In what can only be described as an informative article, AMD uncovered some design features on the RX 5700 series in terms of internal temperatures and basically, they are saying that any RX 5700 is... AMD Talks RX 5700 Temps And Max Boost on Their Blog
On the other hand, I remember my Nvidia GTX 260 reaching 100°C and never burning out (it is still working, though it has been repurposed as a paperweight).
This is essentially the same as the Radeon VII in regards to junction temperatures, safe ranges, etc. In my experience (both with a Radeon VII and a 5700 XT), the temperatures are not an issue at all if you are willing to put up with a little more noise. Personally, I don't mind a louder fan if it's just air-movement noise and isn't high pitched. I always have fans running in whatever room I'm in (ceiling fan, tabletop fan, AC unit, etc.), so I'm probably more desensitized to air movement/noise than the typical person.
XenthorX:

No more cold winter nights: with AMD you get the full package!
Sure, then the same applies to Nvidia. Your GPU is proof.
Fox2232:

Sure, then the same applies to Nvidia. Your GPU is proof.
Stop taking everything as a personal attack, really; it isn't kindergarten. Had it been an Nvidia product, I would have said the same. Chill, buddy.
alanm:

Not sure about the accuracy of that, but can you provide links or show ANY scenario where performance is compromised on a 1080 Ti because of this? I smell fanboy BS.
He didn't say anything about performance being compromised.
sykozis:

He didn't say anything about performance being compromised.
It's clearly implied when he mentioned "auto boost turns down" at 50-55°C (as if it's a flaw). He then gives a 2080 Ti clocks/temp chart that shows clocks going down as temps rise. Well, whoop-de-do. All known benchmarks indicate the card performs well above anything else on the GPU market. The card's main appeal (and the 1080 Ti's) is the fantastic performance it offers, despite any boost/temp behavior. So what's the point in him mentioning it, other than obfuscation?