

AMD Radeon RX 6700 XT (reference) review

Posted by: Hilbert Hagedoorn on: 03/17/2021 03:12 PM [ 228 comment(s) ]

Priced at $479 USD, AMD has released its mainstream-to-high-end Radeon RX 6700 XT, a product meant to battle the RTX 3060 Ti and RTX 3070 from team green. Armed with 12 GB of graphics memory, will it offer enough performance to justify such a hefty price tag?

Read article




cucaulay malkin
Senior Member



Posts: 4644
Posted on: 03/28/2021 07:20 PM
Quote: "Read what you quoted. My statement: 'Turbo depends on number of loaded cores.'"

Yes, and there is a reason why it's called turbo and base is called base. I would imagine a person of your knowledge would figure it out.
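A minimal sketch of the point about turbo and loaded cores (the clock bins below are made up for illustration, not any specific CPU's real turbo table): the advertised turbo only applies at low loaded-core counts, so an all-core gaming load sits much closer to base clock.

```python
# Hypothetical per-core turbo table (invented bins, not a real CPU's spec):
# the more cores are loaded, the lower the maximum sustainable turbo clock.
TURBO_BINS_GHZ = {1: 4.6, 2: 4.5, 3: 4.4, 4: 4.3, 5: 4.2, 6: 4.1}
BASE_CLOCK_GHZ = 2.9

def max_clock(loaded_cores: int) -> float:
    """Highest sustainable clock for a given number of loaded cores.
    Falls back to base clock when the load exceeds what the turbo
    table covers (e.g. power/thermal limits)."""
    return TURBO_BINS_GHZ.get(loaded_cores, BASE_CLOCK_GHZ)

for n in (1, 4, 6):
    print(f"{n} loaded core(s): {max_clock(n):.1f} GHz")
# A game hammering all 6 cores runs at the all-core bin (4.1 GHz here),
# well below the 1-core headline turbo (4.6 GHz).
```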

Quote: "They showed the same effect on a wide variety of GPUs. You specifically took a GPU with an MSRP of $1,500 USD as the basis for a fallacious argument."

Yeah, I'm just seeing the second video now; I wasn't aware of it before. In the first one they tried a 3090 and a 10100. They must've felt the response wasn't positive.

There are just a couple of games among those they chose where the RTX 2060 is comparable to the 5600 XT, i.e. where they are very close in performance. The results are okay there, or the impact is very tiny, e.g. in Death Stranding.
They are limiting CPUs either on thread count or on single-core performance; there is certainly something going on on the NVIDIA side in those cases.

A 9400F + 2060 is a good, realistic example. But why is no 6/12 CPU tested in either of those videos?

From what I can gather, HUB made two videos, and while the problem was evident in the first one already, they made about 50 minutes of material and it's still impossible to judge the extent of the problem. The results vary from margin of error when the hardware is paired well to huge when they're testing entry-level CPUs with high-end cards.

I still haven't got the answer to the first thing I asked, though: is this something NVIDIA can address, fix, or alleviate, or is it just how Turing and Ampere work?

Fox2232
Senior Member



Posts: 11809
Posted on: 03/28/2021 07:38 PM
The i5-9400F is a 6C/6T CPU. Not long ago, this very forum would argue that 6C/6T CPUs would last for another 4 years.
And when people come to me for advice on a new system, or to build one for them, I sometimes have to persuade them to get 6C/12T because they think they would be "fine" with 4C/8T.
And I even tell them that when they build a new system that has to last until AM5 prices settle, they should aim for 8C/16T.

So while 6C/12T CPUs may help in those constrained situations, they would have a lower turbo clock. And people on the ground generally felt OK-ish with fewer cores.
- - - -
The thing you wrote about other tech sites not reporting similar results, or not testing at all, is simple: they decided against it, or they have articles in the works.
Even an article saying "We ran tests on this and that hardware and found no difference from our OC test platform" would be valuable and could make for a big article.

So maybe when HH is done with his tests, he will bring much more detailed data.

cucaulay malkin
Senior Member



Posts: 4644
Posted on: 03/28/2021 08:09 PM
This is a different kettle of fish. How good or bad 6/6 is, is another matter. The question is how much of a performance hit there is across a stack of CPUs, and that's a question HUB has not answered. Maybe it gave other tech journalists a clue to check it out, though.

6/6 was never an option for me, and neither was 8/8, frankly. I only got 6/12 because it was cheap at the time, but the plan was to get 8/16 from the very beginning once they dropped in price. Going with Intel turned out better than AMD: I paid less for the 10500 than for the 3600 and got a faster CPU, same for the 10700F vs the 3700X.
I'd like to see an example of current midrange tested, e.g. a 10400.

And let's not kid ourselves: a review that found no difference would earn HUB very few clicks, and one that put NVIDIA in any way over AMD would straight up set their comment section on fire and do tremendous damage. YouTube channels are not like tech sites. If you want to make money, you have to find the right people to subscribe, with content suited to them. That's why, when I watch a video on human evolution, the next thing they suggest I watch is not Genesis, to show me all kinds of different perspectives.

Robbo9999
Senior Member



Posts: 1588
Posted on: 03/28/2021 08:53 PM
Quote: "Base clock. That's simple. In other words, the clock 4/6-core CPUs have in CPU-intensive games. And the reason why non-OC CPU results are more important: most CPUs in gaming PCs simply can't override the boost clock."

As a quick aside (and related): I'm in a queue for a 3080 GPU, and I'd be planning to use it with my 6700K, which runs at 4.69 GHz, with 16 GB (2 sticks) of DDR4 RAM at 3233 MHz (14-15-15-32-240-1T, dual rank). I was/am concerned that I'll get less fps in CPU-limited games with my prospective 3080 vs my existing GTX 1070; what do you reckon? (I included my RAM details because dual-rank RAM combined with the quite tight timings I have has proven to boost CPU performance in games.) I've not looked into this in enough detail to know the answer, but I can see you have investigated the topic, so I figured I'd get your viewpoint. I've got a 180 Hz G-Sync monitor, so in my case I'm aiming for a stable 171 fps in games; for example, I can keep that stable pretty much constantly in BF1, except on the Amiens map.

(As it stands I probably won't ever get a 3080 GPU, as I believe my vendor will provide a refund rather than live with the loss they'd make, given I bought the GPU slightly above MSRP.)
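To put a fixed fps target like 171 in concrete terms (the per-frame costs below are invented for illustration, not measurements): the CPU has to finish its per-frame work inside a fixed time budget, and whichever of the CPU or GPU needs more time per frame sets the cap.

```python
# Frame-time budget for a fixed fps cap: at 171 fps the whole pipeline
# has 1000 / 171 ≈ 5.85 ms per frame.
TARGET_FPS = 171
budget_ms = 1000.0 / TARGET_FPS

# Illustrative, made-up per-frame costs (not benchmark data):
cpu_ms = 7.0   # time the CPU needs to prepare one frame in a CPU-heavy game
gpu_ms = 4.0   # time the GPU needs to render it

# The slower stage dictates the achievable frame rate.
effective_fps = 1000.0 / max(cpu_ms, gpu_ms)

print(f"budget per frame: {budget_ms:.2f} ms")
print(f"effective fps:    {effective_fps:.0f}")
# -> ~143 fps: CPU-bound, so the 171 fps target is missed even though
#    the GPU alone could have delivered it.
```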

Stormyandcold
Senior Member



Posts: 5772
Posted on: 03/28/2021 11:03 PM

I'm looking at the same upgrade, and the CPU bottleneck is roughly 30% or more. I forget the exact numbers, but there's definitely a bottleneck vs the latest CPUs.
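As a rough worked example of what a "roughly 30%" bottleneck means (numbers invented for illustration, not from any benchmark): the delivered frame rate is the lower of what the GPU could render and what the CPU can feed it.

```python
# Rough bottleneck arithmetic with hypothetical numbers: suppose the 3080
# could render 150 fps in some game, but the older quad-core can only
# prepare ~105 frames per second.
gpu_capable_fps = 150.0  # hypothetical GPU-limited frame rate
cpu_capable_fps = 105.0  # hypothetical CPU-limited frame rate

delivered = min(gpu_capable_fps, cpu_capable_fps)
loss_pct = (1 - delivered / gpu_capable_fps) * 100

print(f"delivered: {delivered:.0f} fps, loss vs GPU potential: {loss_pct:.0f}%")
# -> delivered: 105 fps, loss vs GPU potential: 30%
```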



