AMD Radeon RX 6700 XT (reference) review




Priced at $479 USD, AMD has released its mainstream-to-high-end Radeon RX 6700 XT, a product meant to do battle with the GeForce RTX 3060 Ti and RTX 3070 from team green. Armed with 12 GB of graphics memory, will it offer enough performance for a graphics card with such a hefty price tag?
cucaulay malkin
Senior Member
Posts: 5519
Posted on: 03/28/2021 06:22 PM
Base clock. That's simple.
then do not accuse ME of taking things out of context, alright sport?
base clock for intel has nothing to do with anything except running desktop apps. the 2500K I had a decade ago was 3.3 GHz base; my current one has a 2.9 GHz base and a 4.6 GHz turbo. but what difference does 1.7 GHz make to you?
Fox2232
Senior Member
Posts: 11809
Posted on: 03/28/2021 06:38 PM
then do not accuse ME of taking things out of context, alright sport?
base clock for intel has nothing to do with anything except running desktop apps. the 2500K I had a decade ago was 3.3 GHz base; my current one has a 2.9 GHz base and a 4.6 GHz turbo. but what difference does 1.7 GHz make to you?
Turbo depends on the number of loaded cores.
AMD's GPU in system: less threading in the driver, smaller load on the CPU = higher clock due to fewer loaded cores.
nVidia's GPU in system: more threading in the driver, higher load on the CPU = lower clock due to more loaded cores.
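To illustrate that mechanism, here is a minimal Python sketch with a hypothetical per-core turbo table; the bin values and the 6-core CPU are made up for illustration and are not taken from any spec sheet. The point is only that the more cores the driver's worker threads keep busy, the lower the turbo bin every loaded core (including the game's main thread) is allowed to run at.

# Hypothetical turbo bins (GHz) for an imaginary 6-core desktop CPU,
# indexed by how many cores are loaded at once. Real tables differ per SKU.
BASE_CLOCK_GHZ = 2.9
TURBO_BINS_GHZ = {1: 4.6, 2: 4.5, 3: 4.3, 4: 4.2, 5: 4.1, 6: 4.0}

def effective_clock_ghz(loaded_cores: int) -> float:
    """Clock the busy cores run at, given how many cores are loaded."""
    if loaded_cores < 1:
        return BASE_CLOCK_GHZ
    return TURBO_BINS_GHZ.get(loaded_cores, BASE_CLOCK_GHZ)

# A driver that spreads its work across more worker threads keeps more cores
# busy, so every loaded core drops into a lower bin.
print(effective_clock_ghz(2))   # lighter driver threading -> 4.5
print(effective_clock_ghz(6))   # heavier driver threading -> 4.0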
- - - -
And yes, when nVidia's entire stack is affected and the problem is visible even on mainstream GPUs, trying to frame it as a 3090-only discussion is taking things out of context.
You made this argument:
how does this number relate to the number of rtx 3090 owners, the card used for their tests? is it more than 5.02%? is that who you are concerned about? people who can afford a 3090 and can't get a 4 GHz cpu cause life is hard?
You tried to take a $1500 MSRP GPU as the basis for an argument that the people who would be affected might as well buy a proportionally expensive CPU.
But the reality is that $300 GPUs are affected too. And $300 is not some magical end of the problem; it is simply the lowest tier they tested on both sides, and it still showed quite a difference.
Now go and make a statement about the usual CPU clock while gaming for 90% of gamers. You can look at the Steam stats on number of CPU cores, brands and base clocks.
You will not like it, as it does not fit your narrative.
Only a tiny fraction of nVidia users are not affected by this.
- - - -
And that post of yours about increased boost clocks? Composition fallacy. So here is the correction: "Some people have new non-K Intel CPUs which clock to 4.6 GHz under a few lightly threaded workloads. But that does not mean all people have them."
Still, the overwhelming majority have those lower-clocked CPUs.
cucaulay malkin
Senior Member
Posts: 5519
Posted on: 03/28/2021 07:03 PM
Turbo depends on the number of loaded cores.
show us how many intel cpus come without turbo. please. or don't bother responding.
trying to frame it as a 3090-only discussion is taking things out of context.
You tried to take a $1500 MSRP GPU as the basis for an argument

they did
Turbo depends on the number of loaded cores.
Only a tiny fraction of nVidia users are not affected by this.
as if you could know that
please... stop
I gotta watch the second video, but knowing HUB, it's misleading one way or another. why isn't any normal tech site reporting on this?
I'm trying to find the average performance charts; there is a part of the video with that title, but no charts.
why?
Fox2232
Senior Member
Posts: 11809
Posted on: 03/28/2021 07:13 PM
show us how many intel cpus come without turbo. please. or don't bother responding.

they did
as if you could know that
please... stop
I gotta watch the second video, but knowing HUB, it's misleading one way or another. why isn't any normal tech site reporting on this?
bold: Read what you quoted. My statement was: "Turbo depends on the number of loaded cores."
Your reply: "show us how many intel cpus come without turbo."
Where did I imply that Intel's CPUs have no Turbo? Your reply is a complete false construct presuming I wrote something I did not!
red: They showed the same effect on a wide variety of GPUs. You specifically took a GPU with a $1500 USD MSRP as the basis for a fallacious argument.
underlined: Only a tiny portion of gaming systems run overclocked CPUs. You know it too. Asking me to "stop" will not change reality.
Fox2232
Senior Member
Posts: 11809
care to answer my question? what is being reported in those clock charts? hm?
Base clock. That's simple. In other words, the clock that 4/6-core CPUs sit at in CPU-intensive games. And that is why the non-OC CPU results are the more important ones: most CPUs in gaming PCs simply can't be pushed past their stock boost clocks.