NVIDIA Sells Two SKUs of each Turing GPU (a normal and OC model)
So a colleague website just posted this, and since the cat is out of the bag we might as well post it also. A while ago we noticed two separate GPU hardware IDs for the same Turing GPU. Let's call them an A and a B model for each GPU. One is a standard SKU, the other an OC version. And here is how that works.
So let's take the GeForce RTX 2080 Ti as an example; the GPU in there is a TU102. Basically, NVIDIA offers two chips based on TU102: one is the TU102-300 and the other the TU102-300-A. That A suffix denotes the OC SKU. When a board partner uses the TU102-300 (and not the TU102-300-A), they are not allowed to factory-tweak it. Thus such a product would end up at reference clock frequencies and in the cheaper blower-style cooler products, right? Likely the better-yielding GPUs end up as an A model.
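For illustration, this is how the split shows up in software: each variant reports its own PCI device ID. A minimal Python sketch follows; the IDs used (0x1E04 for TU102-300, 0x1E07 for TU102-300-A) are assumptions based on public PCI ID listings, so treat them as placeholders to verify against pci.ids.

```python
# Sketch: map PCI device IDs to the two RTX 2080 Ti SKUs.
# The IDs below are assumptions taken from public PCI ID listings
# (0x1E04 = TU102-300, 0x1E07 = TU102-300-A); verify against pci.ids.
TU102_SKUS = {
    0x1E04: "TU102-300 (non-A: no factory OC allowed)",
    0x1E07: "TU102-300-A (A model: factory OC allowed)",
}

def identify_tu102(device_id: int) -> str:
    """Return the SKU name for a given TU102 PCI device ID."""
    return TU102_SKUS.get(device_id, "unknown TU102 variant")

# Example: a hardware ID string such as PCI\VEN_10DE&DEV_1E07 means
# vendor 0x10DE (NVIDIA) and device 0x1E07.
print(identify_tu102(0x1E07))  # -> TU102-300-A (A model: factory OC allowed)
```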
Now here's why I wanted to write a news item on this: people can still manually tweak that non-A model. So you could grab Afterburner or any tool of your preference and overclock it yourself. Chances are, however, that the overclock will come in lower than on the A model.
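Before reaching for Afterburner, you can check how much headroom the card reports. Here's a minimal sketch using the pynvml bindings (assuming an NVIDIA driver is present and the `nvidia-ml-py` package is installed); applying an actual offset remains the job of a tool like Afterburner:

```python
# Sketch: read the current vs. rated maximum graphics clock via NVML.
# Assumes an NVIDIA driver and pynvml (pip install nvidia-ml-py).
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
    cur = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
    rated = pynvml.nvmlDeviceGetMaxClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
    print(f"graphics clock: {cur} MHz (rated max: {rated} MHz)")
finally:
    pynvml.nvmlShutdown()
```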
I hope that clarifies a thing or two. BTW, TPU, who reported this, is spot on; we verified this weeks ago already with many AIBs. And from what we learned, all of them are simply opting for the A (OC) series GPUs.
Senior Member
Posts: 2478
Joined: 2010-05-26
So basically 70%+ ASIC quality is now being turned into a money grab, and it will probably always be this way from now on. Nvidia wants more for the higher-ASIC cards.
Senior Member
Posts: 319
Joined: 2015-06-25
@Yogi
You forgot the time frame you want to cover.
There is a difference between needing something to use the PC/play games on for a couple of months before getting something bigger,
and being willing to spend more on a bigger chip because you want to keep it for years/future game releases, even if it's not the best ratio...
That's still determined by a prospective buyer's budget, and again by perf/€.
If I was to give advice it would be: 1) what resolution do you want to use,
2) what's your budget,
3) if the budget exceeds the minimum tier of GPU for that resolution, would you be better off spending on an adaptive-sync monitor, a higher-tier GPU, or something else like a larger SSD, etc.? (Roughly the filter sketched below.)
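For what it's worth, that advice reduces to a simple filter-and-rank. A sketch with made-up cards and scores, purely illustrative:

```python
# Sketch of the advice above: filter by resolution tier and budget,
# then rank by perf/euro. All cards and numbers are invented.
from collections import namedtuple

Card = namedtuple("Card", "name price_eur perf tier")  # tier indexes TIERS
TIERS = ["1080p", "1440p", "2160p"]

CARDS = [
    Card("budget card", 250, 100, 0),    # hypothetical entries
    Card("midrange card", 450, 160, 1),
    Card("flagship card", 800, 220, 2),
]

def advise(resolution, budget):
    """Best perf/euro card that handles the resolution and fits the budget."""
    wanted = TIERS.index(resolution)
    fits = [c for c in CARDS if c.price_eur <= budget and c.tier >= wanted]
    return max(fits, key=lambda c: c.perf / c.price_eur, default=None)

print(advise("1440p", 500))  # -> the midrange card in this toy data
```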
Senior Member
Posts: 11808
Joined: 2012-07-20
People want AMD to produce better products in every way
AMD's last few generations have usually been marginally slower, while running hotter and being significantly more power hungry.
Selling the RX Vega 64 for more than a 1080 Ti (which is significantly faster) is not a good product for the price.
Unfortunately miners did jack up the prices, but AMD cards just haven't been comparable to the equivalent NV cards as a whole.
RX580 is a bit faster than the equivalent 1060 but uses more than double the wattage to do so.
That's not a competitive architecture.
A similar comparison holds for Vega vs the 1070/1080.
The extreme power gap between AMD and NV shows that AMD is forced to run higher voltages and clock speeds to reach similar performance.
Pascal and Kepler are clocked low out of the box with a huge clock ceiling.
That's like Chevy tuning their V8 to the max, sacrificing efficiency/MPG, to get performance comparable to another manufacturer's.
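That intuition follows from the usual dynamic-power approximation P ≈ C·V²·f: power grows with the square of voltage and linearly with clock, so chasing clocks via voltage costs disproportionate power. A back-of-the-envelope run with invented voltage/clock numbers:

```python
# Back-of-the-envelope dynamic power: P ~ C * V^2 * f.
# The voltages and clocks below are invented for illustration only.
def rel_power(v, f, v0=0.9, f0=1500):
    """Power relative to a (v0 volts, f0 MHz) baseline, same capacitance."""
    return (v / v0) ** 2 * (f / f0)

# Pushing ~13% more clock at ~17% more voltage:
print(f"{rel_power(1.05, 1700):.2f}x baseline power")  # ~1.54x for ~1.13x clock
```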
So the statement "I wish AMD was more competitive" is a good one.
If they could produce all-around better cards than NV, NV would have to put much more effort into their designs while not raising their prices.
1200 USD for a 2080 Ti is because AMD is not competitive.
If Vega 64 was 500 USD and performed better than the 1080 Ti, do you think the 2080 Ti would be 1200?
Absolutely not.
And lastly, your experience is not everyone's experience.
That's your opinion.
I can tell you that Vega 64 is definitely not fast enough for me, and my experience would not be fantastic either (even my 2100 MHz 1080 Ti is not fast enough).
I play at 165 Hz 1440p and find <100 fps jarring.
Well, maybe AMD would be more competitive if people did not persuade everyone around (including themselves) that they should not be buying their products.
Especially if they use false statements to achieve that.
So, you think that the RX 580 uses more than double the wattage of a GTX 1060? I guess that 2 × 120 < 185 now. And in reality that RX 580 usually draws 140~160 W depending on the game. Performance per watt is almost the same for those two cards.
So, good job. Persuade more people. Then you can blame AMD some more for nVidia's pricing. Maybe blame AMD for Intel's pricing... Sorry, I forgot you already did.
Now imagine what would happen if AMD fitted Ryzen exactly to Intel's pricing. Both would have sales at those lovely Intel prices.
Now imagine that AMD releases Navi next year and prices it to fit nVidia's new prices. Who will you blame? What will the statement be? "Evil AMD is not reducing prices"?
Surprise, surprise... AMD is not here to make nVidia's GPUs more affordable. That's not their purpose at all.
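Spelling out the arithmetic with the figures quoted in the post (120 W TDP for the GTX 1060; 185 W rated and 140~160 W in-game for the RX 580):

```python
# The arithmetic from the post above, using its own quoted figures.
gtx1060_tdp = 120   # W, GTX 1060 TDP as quoted
rx580_rated = 185   # W, RX 580 rated board power as quoted
rx580_game = 150    # W, midpoint of the claimed 140~160 W in-game draw

print(rx580_rated / gtx1060_tdp)  # 1.54x rated -- not "more than double"
print(rx580_game / gtx1060_tdp)   # 1.25x using the claimed in-game draw
```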
Senior Member
Posts: 11616
Joined: 2010-12-27
Well, maybe AMD would be more competitive if people did not persuade everyone around (including themselves) that they should not be buying their products.
Especially if they use false statements to achieve that.
So, you think that the RX 580 uses more than double the wattage of a GTX 1060? I guess that 2 × 120 < 185 now. And in reality that RX 580 usually draws 140~160 W depending on the game. Performance per watt is almost the same for those two cards.
So, good job. Persuade more people. Then you can blame AMD some more for nVidia's pricing. Maybe blame AMD for Intel's pricing... Sorry, I forgot you already did.
Now imagine what would happen if AMD fitted Ryzen exactly to Intel's pricing. Both would have sales at those lovely Intel prices.
Now imagine that AMD releases Navi next year and prices it to fit nVidia's new prices. Who will you blame? What will the statement be? "Evil AMD is not reducing prices"?
Surprise, surprise... AMD is not here to make nVidia's GPUs more affordable. That's not their purpose at all.
Here you go again with your nonsense, taking the meaning far beyond the points made.
And yes, the RX 580 uses nearly 2x the wattage.
Don't like facts?
[image: RX 580 power-consumption chart (non-reference card), referenced below]
Whatever you want to believe, it's a fact that NV is significantly more power efficient out of the box. Even though the above is a non-factory card, the power difference between it and stock is what, 30 watts? Still a huge difference from the 1060's power consumption.
Edit: Guru3D shows the RX 580 stock using 191 watts, just so I'm not picking and choosing.
That's nearly a 60% increase in power consumption over the 1060.
Which is pretty bad.
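For reference, the "nearly 60%" follows from the numbers cited (the 191 W stock Guru3D figure against the 1060's 120 W TDP):

```python
# The "nearly 60%" figure from the edit, computed from the cited numbers.
rx580_stock = 191   # W, Guru3D stock figure cited above
gtx1060_tdp = 120   # W

increase = (rx580_stock - gtx1060_tdp) / gtx1060_tdp
print(f"{increase:.0%}")  # -> 59%
```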
Senior Member
Posts: 8091
Joined: 2014-09-27
I find it confusing when people say "I wish AMD was more competitive".
My gaming experience is fantastic; I can max all my games out, so the statement is kind of nonsense to me.
It's clear people don't own a Vega, because if they did they would realise how stupid that statement sounds.
The thing is that it is not. I would get the Vega cards over their counterparts, but not over the 2070.
Hence AMD has nothing for this level of performance and up.
As long as there are still several AIB competitors and prices aren't affected, why does it matter? Sure, it sucks for some of the smaller partners, but like I said, the market is already over-saturated. The only thing to worry about is if Nvidia attempts to weed out all AIB partners.
It matters because it's yet another tickbox that pushes the final prices even higher than the supposed MSRP. Instead of yields standardizing and the average OC of later-produced chips stabilizing, you can now sell those chips at a premium over your supposed MSRP, because there is no one out there to stop you.
That's because people have stupid logic that goes "if AMD can't create 1080 Ti performance, none of their products are worth considering", even if they have a display that doesn't warrant so much processing power.
AMD only matters when there's a cheap offer on a Vega 56 now. Even a Vega 64 at MSRP is not worth it over the 2070.