NVIDIA Sells Two SKUs of each Turing GPU (a normal and OC model)

https://forums.guru3d.com/data/avatars/m/242/242134.jpg
@Yogi You forgot the time frame you want to cover. There is a difference between needing something to use the PC/play games for a couple of months before getting something bigger, and being willing to spend more on a bigger chip because I want to keep it for years/future game releases, even if it's not the best ratio...
https://forums.guru3d.com/data/avatars/m/216/216490.jpg
Does this only apply to the AIB partners' selection/preference, or does it also apply to the cards sold directly on Nvidia's website? If so, how do customers know which one they will get if they pre-order directly from Nvidia? Nvm, the 2080 Ti Founders Edition already has a factory OC applied, which indicates the TU102-300-A SKU. And if they are only selling the TU102-300-A (OC SKU) on the Nvidia website, what will happen to all the remaining non-OC TU102-300 chips, since no one is opting for them? Or do the AIBs indeed opt for the non-OC TU102-300, but only for the blower-style/cheapest version of each brand? Last but not least, wasn't the RTX 2080 NDA supposed to be lifted today (Sept 17th)? Just checked and it seems it's been pushed back to September 19th.
https://forums.guru3d.com/data/avatars/m/231/231931.jpg
Pimpiklem:

I find it confusing when people say "I wish AMD was more competitive." My gaming experience is fantastic; I can max all my games out, so the statement is kind of nonsense to me. It's clear people don't own a Vega, because if they did they would realise how stupid that statement sounds.
People want AMD to produce better products in every way. AMD's last few generations have usually been marginally slower while being hotter and significantly more power hungry. Selling the RX Vega 64 for more than a 1080 Ti (which is significantly faster) is not a good product for the price. Unfortunately miners did jack up the prices, but AMD cards just haven't been comparable to the equivalent NV cards as a whole. The RX 580 is a bit faster than the equivalent 1060 but uses more than double the wattage to do so. That's not a competitive architecture, and it's a similar comparison for Vega vs the 1070/1080.

The extreme power gap between AMD and NV shows that AMD is forced to run higher voltages and clock speeds to reach similar performance, while Pascal and Kepler are clocked low out of the box with a huge clock ceiling. That's like Chevy tuning their V8 to the max while sacrificing efficiency/MPG to get comparable performance to another manufacturer. So the statement "I wish AMD was more competitive" is a good one. If they could produce all-around better cards than NV, NV would have to put much more effort into their designs while not raising their prices. $1200 for a 2080 Ti is because AMD is not competitive. If the Vega 64 was $500 and performed better than the 1080 Ti, do you think the 2080 Ti would be $1200? Absolutely not.

And lastly, your experience is not everyone's experience; that's your opinion. I can tell you that the Vega 64 is definitely not fast enough for me, and my experience would not be fantastic either (even my 2100 MHz 1080 Ti is not fast enough). I play at 165 Hz 1440p and find <100 fps jarring.
https://forums.guru3d.com/data/avatars/m/270/270233.jpg
Yogi:

It's not only at the higher end of the market that Nvidia has this superior mystique. I remember a few years ago a mate of mine was convinced that a GTX 970 was an upgrade over his RX 480. There's no doubt that the highest tiers like the 1080 Ti/Titan and the new 20xx series have no competition, but at the lower tiers I would not recommend Nvidia over AMD or vice versa. The only recommendation I would give would be based on performance/€.
Yeah, Nvidia dominates in the mainstream and budget markets as well, with the GTX 1060 vastly outselling the RX 480/580. In reality, these gamers would be far better off with an AMD GPU + FreeSync monitor, but Nvidia's brand is extremely strong.
https://forums.guru3d.com/data/avatars/m/242/242134.jpg
Not if they don't need FreeSync/G-Sync. The 1060 6GB is fast enough to run most games at decent settings @60 with regular vsync, and it's not like a lot of those buying 1060s have G-Sync monitors. Removing the FreeSync "advantage" from the equation means you need a 580 (rather than a 480) to be on the 1060's level.
https://forums.guru3d.com/data/avatars/m/259/259654.jpg
Pimpiklem:

I find it confusing when people say "I wish AMD was more competitive." My gaming experience is fantastic; I can max all my games out, so the statement is kind of nonsense to me. It's clear people don't own a Vega, because if they did they would realise how stupid that statement sounds.
The thing is that it is not. I would get the Vega cards over their counterparts, but not over the 2070. Hence AMD has nothing for this level of performance and up.
schmidtbag:

As long as there are still several AIB competitors and prices aren't affected, why does it matter? Sure, it sucks for some of the smaller partners, but like I said, the market is already over-saturated. The only thing to worry about is if Nvidia attempts to weed out all AIB partners.
It matters because it's yet another tickbox that pushes the final prices even higher than the supposed MSRP. Instead of the usual pattern where yields and the average OC headroom of chips produced later in the run stabilize things, now you can sell those chips at a premium over your supposed MSRP, because there is no one out there to stop you.
schmidtbag:

That's because people have stupid logic that goes "if AMD can't create 1080 Ti performance, none of their products are worth considering", even if they have a display that doesn't warrant so much processing power.
AMD now only matters if there's a cheap offer on a Vega 56. Even a Vega 64 at MSRP is not worth it over the 2070.
https://forums.guru3d.com/data/avatars/m/225/225084.jpg
So basically 70%+ ASIC quality is now being turned into a money grab, and it will probably always be this way from now on. Nvidia wants more for the higher-ASIC cards.
https://forums.guru3d.com/data/avatars/m/263/263841.jpg
fry178:

@Yogi You forgot the time frame you want to cover. There is a difference between needing something to use the PC/play games for a couple of months before getting something bigger, and being willing to spend more on a bigger chip because I want to keep it for years/future game releases, even if it's not the best ratio...
That's still determined by a prospective buyer's budget and, again, perf/€. If I were to give advice it would be: 1) what resolution do you want to use, 2) what's your budget, and 3) if the budget exceeds the minimum tier of GPU for that resolution, would you be better off spending on an adaptive-sync monitor, a higher-tier GPU, or something else like a larger SSD, etc.
https://forums.guru3d.com/data/avatars/m/243/243702.jpg
Agent-A01:

People want AMD to produce better products in every way. AMD's last few generations have usually been marginally slower while being hotter and significantly more power hungry. Selling the RX Vega 64 for more than a 1080 Ti (which is significantly faster) is not a good product for the price. Unfortunately miners did jack up the prices, but AMD cards just haven't been comparable to the equivalent NV cards as a whole. The RX 580 is a bit faster than the equivalent 1060 but uses more than double the wattage to do so. That's not a competitive architecture, and it's a similar comparison for Vega vs the 1070/1080.

The extreme power gap between AMD and NV shows that AMD is forced to run higher voltages and clock speeds to reach similar performance, while Pascal and Kepler are clocked low out of the box with a huge clock ceiling. That's like Chevy tuning their V8 to the max while sacrificing efficiency/MPG to get comparable performance to another manufacturer. So the statement "I wish AMD was more competitive" is a good one. If they could produce all-around better cards than NV, NV would have to put much more effort into their designs while not raising their prices. $1200 for a 2080 Ti is because AMD is not competitive. If the Vega 64 was $500 and performed better than the 1080 Ti, do you think the 2080 Ti would be $1200? Absolutely not.

And lastly, your experience is not everyone's experience; that's your opinion. I can tell you that the Vega 64 is definitely not fast enough for me, and my experience would not be fantastic either (even my 2100 MHz 1080 Ti is not fast enough). I play at 165 Hz 1440p and find <100 fps jarring.
Well, maybe AMD would be more competitive if people did not persuade everyone around them (including themselves) that they should not be buying AMD products, especially if they use false statements to do it. So, you think the RX 580 uses more than double the wattage of a GTX 1060? I guess 2 × 120 < 185 now. And in reality that RX 580 usually draws 140~160W depending on the game; performance per watt is almost the same for those two cards. So, good job. Persuade more people. Then you can blame AMD some more for nVidia's pricing. Maybe blame AMD for Intel's pricing... Sorry, I forgot you already did. Now imagine what would happen if AMD fitted Ryzen exactly to Intel's pricing: both would have sales at those lovely Intel prices. Now imagine that AMD releases Navi next year and prices it to fit nVidia's new prices. Who will you blame? What will the statement be? "Evil AMD is not reducing prices"? Surprise, surprise... AMD is not here to make nVidia's GPUs more affordable. That's not their purpose at all.
https://forums.guru3d.com/data/avatars/m/231/231931.jpg
Fox2232:

Well, maybe AMD would be more competitive if people did not persuade everyone around them (including themselves) that they should not be buying AMD products, especially if they use false statements to do it. So, you think the RX 580 uses more than double the wattage of a GTX 1060? I guess 2 × 120 < 185 now. And in reality that RX 580 usually draws 140~160W depending on the game; performance per watt is almost the same for those two cards. So, good job. Persuade more people. Then you can blame AMD some more for nVidia's pricing. Maybe blame AMD for Intel's pricing... Sorry, I forgot you already did. Now imagine what would happen if AMD fitted Ryzen exactly to Intel's pricing: both would have sales at those lovely Intel prices. Now imagine that AMD releases Navi next year and prices it to fit nVidia's new prices. Who will you blame? What will the statement be? "Evil AMD is not reducing prices"? Surprise, surprise... AMD is not here to make nVidia's GPUs more affordable. That's not their purpose at all.
Here you go again with your nonsense, taking the meaning far beyond the points made. And yes, the RX 580 uses nearly 2x the wattage. Don't like facts?
https://tpucdn.com/reviews/Sapphire/RX_580_Nitro_Plus/images/power_maximum.png
https://tpucdn.com/reviews/Sapphire/RX_580_Nitro_Plus/images/power_average.png
https://tpucdn.com/reviews/Sapphire/RX_580_Nitro_Plus/images/perfwatt_2560_1440.png
Whatever you want to believe, it's a fact that NV is significantly more power efficient out of the box. Even though the above is a non-reference card, the power difference between it and stock is what, 30 watts? Still a huge difference from the 1060's power consumption. Edit: Guru3D shows the RX 580 at stock using 191 watts, just so I'm not picking and choosing. That's nearly a 60% increase in power consumption, which is pretty bad.
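(For reference: 191 W measured for the RX 580 against the GTX 1060's 120 W rated TDP works out to 191/120 ≈ 1.6, which is where the "nearly 60%" figure comes from.)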
https://forums.guru3d.com/data/avatars/m/156/156133.jpg
Moderator
Vega is not bad at all. What really killed Vega is availability, which was due to low yields and also miners snatching up the cards as soon as they were available. Because Pascal had higher yields and more market time before Vega came out, it's easier to find a 1080 at MSRP or lower than a Vega at MSRP or lower. For the reference design, the 1080 sells for $550 straight from Nvidia. I don't think AMD sells their own branded Vega 64? Just board partners do?
https://forums.guru3d.com/data/avatars/m/243/243702.jpg
Agent-A01:

Here you go again with your nonsense, taking the meaning far beyond the points made. And yes, the RX 580 uses nearly 2x the wattage. Don't like facts?
https://tpucdn.com/reviews/Sapphire/RX_580_Nitro_Plus/images/power_maximum.png
https://tpucdn.com/reviews/Sapphire/RX_580_Nitro_Plus/images/power_average.png
https://tpucdn.com/reviews/Sapphire/RX_580_Nitro_Plus/images/perfwatt_2560_1440.png
Whatever you want to believe, it's a fact that NV is significantly more power efficient out of the box. Even though the above is a non-reference card, the power difference between it and stock is what, 30 watts? Still a huge difference from the 1060's power consumption.
Seeing those graphs. Lovely. Facts:
- The Fury X is hard-capped at 300W. It required a vBIOS edit to allow it to eat up to 360W.
- I have access to two RX 580s, by a twist of fate... Yes, the exact Sapphire RX 580 Nitro+ 8GB mentioned is one of them. It does not even go to 180W without moving the Power slider in MSI AB, and at the times it does, the workload is taking it to ~60 fps and less.
- And guess why the RX 570/580 were favorite mining cards. Do you remember Ethereum? 23 MHash/s on a GTX 1060, 28 MHash/s on an RX 580. You should have advised those guys better; maybe they would not have bought as many of those RX 570/580s.
- Quite a few of the wattages in your TPU images are quite ridiculous, and I mean Hilbert's calculated values are actually much more accurate (and both of those RX 580s are special OC editions).
https://forums.guru3d.com/data/avatars/m/263/263841.jpg
Agent-A01:

Here you go again with your nonsense, taking the meaning far beyond the points made. And yes, the RX 580 uses nearly 2x the wattage. Don't like facts? Whatever you want to believe, it's a fact that NV is significantly more power efficient out of the box. Even though the above is a non-reference card, the power difference between it and stock is what, 30 watts? Still a huge difference from the 1060's power consumption. Edit: Guru3D shows the RX 580 at stock using 191 watts, just so I'm not picking and choosing. That's nearly a 60% increase in power consumption, which is pretty bad.
Going off average US electric rates, that's a difference of about $2.50 for a solid week of running FurMark, or 10 bucks a month running the cards 24/7.
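Rough back-of-the-envelope math behind that estimate, as a sketch only; the ~100 W load delta and ~$0.13/kWh rate are assumptions for illustration, not figures taken from the reviews above:
[code]
# Sketch of the electricity-cost gap discussed above.
# Assumptions (mine, not from the thread): ~100 W load difference between
# the cards and ~$0.13/kWh, roughly the average US residential rate.
WATT_DIFFERENCE = 100        # W, assumed load delta (RX 580 vs GTX 1060)
RATE_USD_PER_KWH = 0.13      # $/kWh, assumed average US rate

def extra_cost(hours):
    """Extra cost in USD of running the higher-draw card for `hours`."""
    kwh = WATT_DIFFERENCE / 1000 * hours
    return kwh * RATE_USD_PER_KWH

print("One week, 24/7:  $%.2f" % extra_cost(24 * 7))    # ~$2.18
print("One month, 24/7: $%.2f" % extra_cost(24 * 30))   # ~$9.36
[/code]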
https://forums.guru3d.com/data/avatars/m/270/270233.jpg
Agent-A01:

Here you go again with your nonsense, taking the meaning far beyond the points made. And yes, the RX 580 uses nearly 2x the wattage. Don't like facts?
That's an extreme example. The Nitro cards are highly overclocked and not representative of most 580s. For instance, Guru3D's review of MSI's RX 580 Gaming X shows 191 watts (vs 134 watts for the 1060). https://www.guru3d.com/articles_pages/msi_radeon_rx_580_gaming_x_review,5.html Pascal is much more power-efficient overall, but saying that it's 2X is overblown.
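(For the record: 191 W vs 134 W is 191/134 ≈ 1.43, i.e. roughly 43% more, well short of 2x.)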
Fox2232:

- And guess why the RX 570/580 were favorite mining cards. Do you remember Ethereum? 23 MHash/s on a GTX 1060, 28 MHash/s on an RX 580. You should have advised those guys better; maybe they would not have bought as many of those RX 570/580s.
To be fair, many of those miners were modding and undervolting their GPUs. You can also reduce power consumption greatly on a 1060, although the efficiency per watt is still in favor of Polaris. Also, DaggerHashimoto just ran better on AMD hardware (Nvidia GPUs are better for other algorithms).
https://forums.guru3d.com/data/avatars/m/231/231931.jpg
Fox2232:

Seeing those graphs. Lovely. Facts:
- The Fury X is hard-capped at 300W. It required a vBIOS edit to allow it to eat up to 360W.
- I have access to two RX 580s, by a twist of fate... Yes, the exact Sapphire RX 580 Nitro+ 8GB mentioned is one of them. It does not even go to 180W without moving the Power slider in MSI AB, and at the times it does, the workload is taking it to ~60 fps and less.
- And guess why the RX 570/580 were favorite mining cards. Do you remember Ethereum? 23 MHash/s on a GTX 1060, 28 MHash/s on an RX 580. You should have advised those guys better; maybe they would not have bought as many of those RX 570/580s.
- Quite a few of the wattages in your TPU images are quite ridiculous, and I mean Hilbert's calculated values are actually much more accurate (and both of those RX 580s are special OC editions).
Well, considering that with just 2 minutes of looking I've found several reviews showing the Fury X above 300W, that proves your statement is incorrect. https://i.imgur.com/c7B4hHj.png Another 100W+ difference between an OC 1060 and an OC 580. Also, Tom's Hardware shows 224 watts under load for the Nitro 580 as well. Are you saying every reviewer out there is wrong? I mean, the data proves you wrong several times over. Lastly, you bring in more things that have absolutely zero relevance to the post. How exactly is Ethereum mining performance relevant to the previous statements? It's not. And btw, TechPowerUp doesn't use calculations. It uses hardware to measure power consumption at the DC input directly, which gets rid of human error. Mind you, that equipment costs several thousand dollars.
https://forums.guru3d.com/data/avatars/m/156/156133.jpg
Moderator
Enough of this conversation. Only warning.
https://forums.guru3d.com/data/avatars/m/271/271560.jpg
As schmitty and vbetts said... this reeks of Intel marketing. There has been a lot of engineering cross-pollination; I guess some in marketing have moved over as well... Honestly, this is a major fail that should never have seen the light of day. And it is very true that AIB manufacturers have very thin margins that are being squeezed thinner. In the past, the lower-binned chips would have been a different model if the variance in headroom and performance was this high. Again, greed and sloppy marketing for a very short-lived product staring obsolescence in the face (from its own company at the very least, not counting AMD and Intel).
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
PrMinisterGR:

It matters because it's yet another tickbox that pushes the final prices even higher than the supposed MSRP. Instead of the usual pattern where yields and the average OC headroom of chips produced later in the run stabilize things, now you can sell those chips at a premium over your supposed MSRP, because there is no one out there to stop you.
That's assuming absolute worst case scenario where there isn't enough AIB competition. I don't see that happening any time soon.
https://forums.guru3d.com/data/avatars/m/242/242134.jpg
@Yogi And I still prefer to buy the more efficient unit, as the power has to be generated somewhere; the less my PC needs, the less greenhouse effect (while doing the same thing with it)... Same reason I would have preferred a Golf GTI over a Pontiac Firebird at the same HP (1.6L vs 7L), not that I was old enough to drive 😉
https://forums.guru3d.com/data/avatars/m/251/251862.jpg
Obviously Nvidia is in business to make money, but is this really a problem? Since the partners can't stay within the lines and want to make everything "Super OC/RGB", this lets Nvidia try to maintain some quality. I'd prefer to know my factory-OC'd card was binned by Nvidia, and not just slapped with a bigger heatsink and price tag. This also seems like it might be a good way to distinguish product lines like Strix and Aorus.

Since the source was TPU, I'll quote them: "Two device IDs per GPU is very unusual. For example, all GTX 1080 Ti cards, whether reference or custom design, are marked as 1B06. Titan Xp on the other hand, which uses the same physical GPU, is marked as 1B02. NVIDIA has always used just one ID per SKU, no matter if custom-design, reference or Founders Edition." After this they show a picture from 'nv_dispi.inf' which shows two 2080 Tis, two 2080s, and two 2070s. The quote is just wrong from what I can tell. They've used multiple IDs per GPU forever. Here are a few:

[spoiler]
NVIDIA_DEV.0401 = "NVIDIA GeForce 8600 GT"
NVIDIA_DEV.0402 = "NVIDIA GeForce 8600 GT"
NVIDIA_DEV.0422 = "NVIDIA GeForce 8400 GS "
NVIDIA_DEV.0424 = "NVIDIA GeForce 8400 GS "
NVIDIA_DEV.05E2 = "NVIDIA GeForce GTX 260"
NVIDIA_DEV.05EA = "NVIDIA GeForce GTX 260 "
NVIDIA_DEV.0601 = "NVIDIA GeForce 9800 GT"
NVIDIA_DEV.0605 = "NVIDIA GeForce 9800 GT "
NVIDIA_DEV.0612 = "NVIDIA GeForce 9800 GTX/9800 GTX+"
NVIDIA_DEV.0613 = "NVIDIA GeForce 9800 GTX+"
NVIDIA_DEV.062D = "NVIDIA GeForce 9600 GT "
NVIDIA_DEV.062E = "NVIDIA GeForce 9600 GT "
NVIDIA_DEV.0637 = "NVIDIA GeForce 9600 GT "
NVIDIA_DEV.0610 = "NVIDIA GeForce 9600 GSO"
NVIDIA_DEV.0635 = "NVIDIA GeForce 9600 GSO "
NVIDIA_DEV.0CA0 = "NVIDIA GeForce GT 330 "
NVIDIA_DEV.0CA7 = "NVIDIA GeForce GT 330 "
NVIDIA_DEV.0CA5 = "NVIDIA GeForce GT 220 "
NVIDIA_DEV.0CAC = "NVIDIA GeForce GT 220 "
[/spoiler]
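As a side note, if you want to see which device ID your own card reports and compare it against those nv_dispi.inf entries, something like the sketch below could work. This is only an illustration of mine, assuming a Linux system with pciutils (lspci) installed; it is not taken from the article.
[code]
# Minimal sketch (assumption: Linux with `lspci` from pciutils available).
# Prints the PCI vendor:device IDs of NVIDIA adapters so they can be compared
# against the device IDs listed in NVIDIA's nv_dispi.inf.
import re
import subprocess

def nvidia_device_ids():
    """Return (description, vendor:device) pairs for NVIDIA PCI devices."""
    out = subprocess.run(["lspci", "-nn"], capture_output=True,
                         text=True, check=True).stdout
    ids = []
    for line in out.splitlines():
        # Example line:
        # 01:00.0 VGA compatible controller [0300]: NVIDIA Corporation ... [10de:1b06] (rev a1)
        if "NVIDIA" in line:
            match = re.search(r"\[10de:([0-9a-f]{4})\]", line, re.IGNORECASE)
            if match:
                ids.append((line.split(": ", 1)[-1], "10de:" + match.group(1)))
    return ids

if __name__ == "__main__":
    for desc, dev_id in nvidia_device_ids():
        print(dev_id, "-", desc)
[/code]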