NVIDIA Sells Two SKUs of each Turing GPU (a normal and OC model)

by Hilbert Hagedoorn on: 09/17/2018 02:13 PM | source: techpowerup | 50 comment(s)

So a colleague website just posted this, and since the cat is out of the bag we might as well post it too. A while ago we noticed two separate GPU hardware IDs for the same Turing GPU; let's call them the A and B models of each GPU. One is a standard SKU, the other an OC version. Here is how that works.

Let's take the GeForce RTX 2080 Ti as an example; the GPU in there is a TU102. Basically, NVIDIA offers two chips based on TU102: one is the TU102-300, the other the TU102-300-A. The A suffix denotes the OC SKU. When a board partner uses the TU102-300 (and not the TU102-300-A), it is not allowed to factory-tweak the card. Such a product would thus run at reference clock frequencies and would end up in the cheaper blower-style cooler products. Likely the better-yielding GPUs end up as A models.
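For what it's worth, the two SKUs can be told apart in software via the PCI device ID. Below is a minimal sketch of such a check, assuming the pynvml Python bindings; the ID-to-variant table is illustrative placeholder data for this sketch, not confirmed values.

import pynvml

# Placeholder mapping of PCI device IDs to SKU variants; the IDs here are
# assumptions for illustration, verify them against your own card.
VARIANTS = {
    0x1E04: "TU102-300 (non-A: reference clocks, no factory OC)",
    0x1E07: "TU102-300-A (A model: factory OC permitted)",
}

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)
    pci = pynvml.nvmlDeviceGetPciInfo(handle)
    device_id = pci.pciDeviceId >> 16  # upper 16 bits: device ID, lower 16: vendor ID
    print("PCI device ID: 0x%04X" % device_id)
    print(VARIANTS.get(device_id, "unknown variant"))
finally:
    pynvml.nvmlShutdown()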

Now here's why I wanted to write a news item on this: you can still manually tweak that non-A model. You could grab Afterburner or any tool of your preference and overclock it yourself. Chances are, however, that the overclock will end up lower than on an A model.
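The overclock itself is applied through a tool like Afterburner; what you can do programmatically is watch the result. A small monitoring sketch, again assuming the pynvml bindings (NVML exposes readouts here, not the clock offsets themselves):

import time
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)
    # Sample core/memory clocks and board power once per second.
    for _ in range(10):
        core = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
        mem = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_MEM)
        watts = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0  # API reports milliwatts
        print("core %4d MHz | mem %4d MHz | %6.1f W" % (core, mem, watts))
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()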

I hope that clarifies a thing or two. BTW, TPU, who reported this, is spot on; we verified this with many AIBs weeks ago already. And from what we learned, all of them are simply opting for the A (OC) series GPUs.












vbetts | Moderator | Posts: 15142 | Joined: 2006-07-04
#5585671 Posted on: 09/17/2018 08:29 PM
Vega is not bad at all. What really killed Vega is availability, which was due to low yields and also miners snatching up the cards as soon as they were available. Because Pascal had higher yields and more market time before Vega came out, it's easier to find a 1080 at MSRP or lower than a Vega at MSRP or lower. As for the reference design, the 1080 sells for $550 straight from Nvidia. I don't think AMD sells its own branded Vega 64? Just board partners do?

Fox2232 | Senior Member | Posts: 11808 | Joined: 2012-07-20
#5585678 Posted on: 09/17/2018 08:59 PM
> Here you go again with your nonsense, taking the meaning far beyond the points made.
> And yes, RX580 uses nearly 2x the wattage.
> Don't like facts?

> Whatever you want to believe, it's a fact that NV is significantly more power efficient out of the box. Regardless that the above is a non-factory card, the power difference between it and stock is what, 30 watts? Still a huge difference from the 1060's power consumption.

Seeing those graphs. Lovely. Facts:
- Fury X is hard-capped at 300W. It required a vBIOS edit to allow it to eat up to 360W.
- I have access to two RX-580s, by twist of fate... Yes, exactly the mentioned Sapphire RX-580 Nitro+ 8GB is one of them. It does not even go to 180W without moving the Power slider in MSI AB. And when it does, the workload is taking it to ~60fps or less.
- And guess why the RX 570/580 were favorite mining cards. Do you remember Ethereum? 23MHash/s on a GTX 1060, 28MHash/s on an RX-580. You should have advised those guys better; maybe they would not have bought as many of those RX-570/580s.
- Quite a few of those wattages in your TPU images are quite ridiculous, and I mean Hilbert's calculated values are actually much more accurate (and both of those RX-580s are special OC editions).

Yogi | Senior Member | Posts: 324 | Joined: 2015-06-25
#5585683 Posted on: 09/17/2018 09:03 PM
> Here you go again with your nonsense, taking the meaning far beyond the points made.
> And yes, RX580 uses nearly 2x the wattage.
> Don't like facts?

Whatever you want to believe, it's a fact that NV is significantly more power efficient out of the box. Regardless that the above is a non-factory card, the power difference between it and stock is what, 30 watts? Still a huge difference from the 1060's power consumption.

Edit: Guru3D shows the RX580 stock using 191 watts, just so I'm not picking and choosing. That's nearly a 60% increase in power consumption.

Which is pretty bad.

Going off average US electric rates, that's about a $2.50 difference for a solid week of running FurMark, or 10 bucks a month difference running the cards 24/7.
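(For reference, the arithmetic behind an estimate like that: the wattage delta and electricity rate below are assumptions to adjust, and the cost scales linearly with both.)

# Back-of-the-envelope electricity cost of a power-draw delta.
# Both inputs are assumptions; plug in your own figures.
delta_watts = 191 - 134  # stock RX 580 vs GTX 1060 load figures cited in this thread
usd_per_kwh = 0.13       # assumed average US residential rate

for label, hours in (("per week", 24 * 7), ("per month", 24 * 30)):
    kwh = delta_watts * hours / 1000.0
    print("%s: %.1f kWh -> $%.2f" % (label, kwh, kwh * usd_per_kwh))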

D3M1G0D | Senior Member | Posts: 2068 | Joined: 2017-03-10
#5585688 Posted on: 09/17/2018 09:12 PM
> Here you go again with your nonsense, taking the meaning far beyond the points made.
> And yes, RX580 uses nearly 2x the wattage.
> Don't like facts?

That's an extreme example. The Nitro cards are highly overclocked and not representative of most 580s. For instance, Guru3D's review of MSI's RX 580 Gaming X shows 191 watts (vs 134 watts for the 1060).

https://www.guru3d.com/articles-pages/msi-radeon-rx-580-gaming-x-review,5.html

Pascal is much more power-efficient overall, but saying that it's 2X is overblown.

> And guess why the RX 570/580 were favorite mining cards. Do you remember Ethereum? 23MHash/s on a GTX 1060, 28MHash/s on an RX-580. You should have advised those guys better; maybe they would not have bought as many of those RX-570/580s.

To be fair, many of those miners were modding and undervolting their GPUs. You can also reduce power consumption greatly on a 1060, although the efficiency per watt is still in favor of Polaris. Also, DaggerHashimoto just ran better on AMD hardware (Nvidia GPUs are better for other algorithms).
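(To put numbers on the efficiency-per-watt point: at the stock load figures cited in this thread, the 1060 actually comes out ahead per watt, which is exactly why the undervolting mattered; the tuned RX 580 wattage below is an assumption, not a measured value.)

# MH/s per watt for the Ethereum hashrates quoted above. Stock wattages are
# the review figures cited in this thread; the undervolted one is assumed.
cards = [
    ("GTX 1060 (stock)", 23, 134),
    ("RX 580 (stock)", 28, 191),
    ("RX 580 (undervolted, assumed)", 28, 130),
]
for name, mhs, watts in cards:
    print("%-30s %.3f MH/s per watt" % (name, mhs / watts))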

Agent-A01 | Senior Member | Posts: 11621 | Joined: 2010-12-27
#5585690 Posted on: 09/17/2018 09:14 PM
> Seeing those graphs. Lovely. Facts:
> - Fury X is hard-capped at 300W. It required a vBIOS edit to allow it to eat up to 360W.
> - I have access to two RX-580s, by twist of fate... Yes, exactly the mentioned Sapphire RX-580 Nitro+ 8GB is one of them. It does not even go to 180W without moving the Power slider in MSI AB. And when it does, the workload is taking it to ~60fps or less.
> - And guess why the RX 570/580 were favorite mining cards. Do you remember Ethereum? 23MHash/s on a GTX 1060, 28MHash/s on an RX-580. You should have advised those guys better; maybe they would not have bought as many of those RX-570/580s.
> - Quite a few of those wattages in your TPU images are quite ridiculous, and I mean Hilbert's calculated values are actually much more accurate (and both of those RX-580s are special OC editions).

Well, considering I found several reviews within just two minutes of looking that show the Fury X above 300W, that proves your statement incorrect.

Another 100W+ difference between an OC 1060 and an OC 580.

Also, Tom's Hardware shows 224 watts under load for the Nitro 580 as well.

Are you saying every reviewer out there is wrong? I mean, the data proves you wrong several times again.

Lastly, you bring in more things that have absolutely zero relevance to the post. How exactly is Ethereum mining performance relevant to the previous statements?

It's not.

And btw, TechPowerUp doesn't use calculations. It measures power consumption at the DC input directly with dedicated hardware, which gets rid of human error. Mind you, that equipment costs several thousand dollars.
