Cyberpunk 2077 / GeForce RTX 4090 runs up to 2850 MHz / 50C temperatures

Wow. Those temps are really impressive, actually, considering the clock and power. Steady 60 fps, do you think?
This is the only chart that mattered to me. No 60 FPS RTRT @ 4K without DLSS or upscaling. Not buying anything until that GPU performance target has been hit. Thanks, moving on to other things.
Valken:

This is the only chart that mattered to me. No 60 FPS RTRT @ 4K without DLSS or upscaling. Not buying anything until that GPU performance target has been hit. Thanks, moving on to other things.
When the RTX 5000 series is out, there will be another demanding game that runs at 30 fps. So? You can't judge the usefulness of a card by evaluating software with arbitrarily demanding rendering requirements. Unless all you do is play Cyberpunk 24/7, but you would get tired of it fast and then move on to something else.
TaskMaster:

Wow. Those temps are really impressive, actually, considering the clock and power. Steady 60 fps, do you think?
The temperatures are impressive indeed. My 3090 temps range from 65°C to 75°C. There have been a few times I've seen 80°C. Granted, mine is overclocked. Stock temps are normally between 58°C and 65°C, depending on the game. To be honest, Gurus, it looks like I'm rolling with the 4090. There is so much information being released now and most of it is positive in regard to performance and tech. I may kick myself in November when AMD delivers an awesome card, but hey, I will still be very happy. 🙂
TaskMaster:

Wow. Those temps are really impressive, actually, considering the clock and power. Steady 60 fps, do you think?
Despite all the memes, this is why big GPU coolers are good: silent and runs cool. I'm not missing the 2-slot GPUs from 5 years ago that ran 84°C with fans screaming at 90% speed.
TaskMaster:

Wow. Those temps are really impressive, actually, considering the clock and power. Steady 60 fps, do you think?
Seems too good to be true. And I find it curious that they're testing such a high-end card at only 1440p instead of 4K...
Valken:

This is the only chart that mattered to me. No 60 FPS RTRT @ 4K without DLSS or upscaling. Not buying anything until that GPU performance target has been hit. Thanks, moving on to other things.
You cannot generalize it like that as it depends on the game. Sure, if you wait a generation maybe next one will hit a steady 60fps in Cyberpunk but what about other games that will look even better and be even more demanding?
TheDigitalJedi:

The temperatures are impressive indeed. My 3090 temps range from 65°C to 75°C. There have been a few times I've seen 80°C. Granted, mine is overclocked. Stock temps are normally between 58°C and 65°C, depending on the game. To be honest, Gurus, it looks like I'm rolling with the 4090. There is so much information being released now and most of it is positive in regard to performance and tech. I may kick myself in November when AMD delivers an awesome card, but hey, I will still be very happy. 🙂
Those are some nice temps. Sure, one can never compare directly, since ambient temps, chassis, chassis fans and GPU fan speeds matter a lot, but my 3080 runs at 60-65°C undervolted with fans at up to 60% speed, which is silent enough for me.
H83:

Seems too good to be true. And I find it curious that they're testing such a high-end card at only 1440p instead of 4K...
Agreed, it looks a bit too good. And you're right - why not test at 4K? This card is definitely supposed to be a true 4K card.
asturur:

When the RTX 5000 series is out, there will be another demanding game that runs at 30 fps. So? You can't judge the usefulness of a card by evaluating software with arbitrarily demanding rendering requirements. Unless all you do is play Cyberpunk 24/7, but you would get tired of it fast and then move on to something else.
For me, I only play certain games and I play them for a long time. So using CP2077 as a benchmark tells me the RAW performance at 4K with RTRT.
Netherwind:

You cannot generalize it like that as it depends on the game. Sure, if you wait a generation maybe next one will hit a steady 60fps in Cyberpunk but what about other games that will look even better and be even more demanding? Agreed, it looks a bit too good. And you're right - why not test at 4K? This card is definitely supposed to be a true 4K card.
I can. It's my money and how I want to spend it. Again, where are the 4K benchmarks like you quoted? It's comparing flagship-class (and flagship-priced) GPUs at 1440p! Come on... At least be honest that in RAW 4K performance with RTRT, the progress is only 20%, so we consumers can choose to drop down the resolution, use blurry upscaling rendered at 1080p, or skip it. There is ALREADY enough RASTER performance in the market with the AMD 6x00 series and the RTX 3x00/2080 series. The next LEAP is really when we'll get really good RTRT performance. It's not the end of the world, so why not just give us the facts and let us choose?
Valken:

I can. It's my money and how I want to spend it. Again, where are the 4K benchmarks like you quoted? It's comparing flagship-class (and flagship-priced) GPUs at 1440p!
Well in that case, wait for the next generation.
TheDigitalJedi:

Ouch! This is with overclocks on a "Water Cooled" system LOL! All settings maxed. DLSS on Balanced @ 4K https://imgur.com/Cz8gdDY https://imgur.com/nl0pSdF https://imgur.com/GRByqfQ https://imgur.com/0q2SeXS https://imgur.com/hZVaDPV
That's pretty far from 55-57°C and a 2800 MHz boost 😀
Waiting for HH and the gang.
Netherwind:

Well in that case, wait for the next generation. That's pretty far from 55-57°C and a 2800 MHz boost 😀
I think @TheDigitalJedi's numbers are from his own rig with a 3090 - NOT a 4090.
50-57°C while using 460 W is impressive. The 4090 does have one gigantic cooler, but still. I hope this is not just Nvidia massaging the numbers again.
Horus-Anhur:

50-57°C while using 460 W is impressive. The 4090 does have one gigantic cooler, but still. I hope this is not just Nvidia massaging the numbers again.
If true, then honestly no one has anything to worry about in regard to case temps either, as lots of 57°C air won't be nearly as bad as dumping scorching-hot 80°C air onto the rest of the components. But even if the numbers aren't true, I think I will be OK - even with my 3090 Strix running full tilt at 480 watts, all temps are looking very decent - CPU maxing out at 59°C, mobo at 45°C, etc. https://i.imgur.com/E4LVPki.jpg So with a 4090 Strix drawing potentially 100 watts more... maybe things would get 5°C hotter, but I don't think it will make too much of a difference.
Touting the lower temperatures definitely tells me something about this series.
TaskMaster:

Wow. Those temps are really impressive, actually, considering the clock and power. Steady 60 fps, do you think?
Horus-Anhur:

50-57°C while using 460 W is impressive. The 4090 does have one gigantic cooler, but still. I hope this is not just Nvidia massaging the numbers again.
Well, when you consider how gargantuan the heatsink is, it had better cool below 60°C. It's honestly a little irritating, because that means they could have made it one slot thinner. Granted, different workloads cause different amounts of heat, and CB2077 isn't exactly well-optimized; I wouldn't be surprised if there's a bottleneck somewhere. But for argument's sake, let's say even Furmark doesn't exceed 70°C - why do we need such a huge heatsink? If it's for the sake of noise, just use an AIO cooler. At least then it would make the price more understandable.
asturur:

When the RTX 5000 series is out, there will be another demanding game that runs at 30 fps. So? You can't judge the usefulness of a card by evaluating software with arbitrarily demanding rendering requirements. Unless all you do is play Cyberpunk 24/7, but you would get tired of it fast and then move on to something else.
Or, from another perspective, just lower some settings a notch. Of course, at 4K you're somewhat limited in what you can lower without it becoming obvious, but AA is something that doesn't need to be totally maxed out, yet maxing it can put a big dent in performance. Personally, I don't even like 8x or especially 16x AA, since it sometimes just looks blurry. As far as I understand (but maybe I'm wrong; I never tried it myself), AA is effectively useless if you use supersampling.
Dragam1337:

I think @TheDigitalJedi numbers are from his own rig with a 3090 - NOT a 4090.
...oh 😳 I thought those were leaked numbers. My bad!
schmidtbag:

Well, when you consider how gargantuan the heatsink is, it had better cool below 60°C. It's honestly a little irritating, because that means they could have made it one slot thinner. Granted, different workloads cause different amounts of heat, and CB2077 isn't exactly well-optimized; I wouldn't be surprised if there's a bottleneck somewhere. But for argument's sake, let's say even Furmark doesn't exceed 70°C - why do we need such a huge heatsink? If it's for the sake of noise, just use an AIO cooler. At least then it would make the price more understandable. Or, from another perspective, just lower some settings a notch. Of course, at 4K you're somewhat limited in what you can lower without it becoming obvious, but AA is something that doesn't need to be totally maxed out, yet maxing it can put a big dent in performance. Personally, I don't even like 8x or especially 16x AA, since it sometimes just looks blurry. As far as I understand (but maybe I'm wrong; I never tried it myself), AA is effectively useless if you use supersampling.
Love all the hating from people who were never gonna buy a 4090 regardless... And you are incorrect about using anti-aliasing and downsampling/supersampling in conjunction - it works great. Skyrim with 4x SGSSAA + 4x MSAA looks amazingly clean. Though your AA comment is entirely off the mark, as this game uses TAA, which is either on or off.
Dragam1337:

Love all the hating from people who were never gonna buy a 4090 regardless...
You don't have to buy a product to know whether there was a poor design choice. Case in point - the other article posted today: https://www.guru3d.com/news-story/zotac-mentions-that-atx-12vhpwr-adapter-can-connect-disconnect-30x-only.html The thing is, such design choices can prevent people from buying the GPU. What if your PC needs more than just a single PCIe slot? What if you wanted to do an ITX build? What if you planned to build a pre-built PC for someone and the weight of this honking heatsink breaks it out of the PCIe slot? What if you don't want to burn $50 (rough guess) on the heatsink alone when you know you're going to replace it with a water block? You can have the full power of a 4090 without something so comically large.
And you are incorrect about using anti-aliasing and downsampling/supersampling in conjunction - it works great. Skyrim with 4x SGSSAA + 4x MSAA looks amazingly clean. Though your AA comment is entirely off the mark, as this game uses TAA, which is either on or off.
Got screenshots? And I was speaking generally, where there are plenty of games that let you choose many different levels or even different types of AA. Some yield better results than others. Some drain performance a lot more than others.