AMD Radeon RX rumor: 12nm Polaris 30 in the 4th quarter

Shrinking an existing chip to a new process isn't free - that's why I don't think we'll ever see Polaris on 7nm, not without some major retooling. However, this 12nm is effectively a "very much improved" 14nm: the chip's size won't change, but the circuit paths will be more precise, causing less leakage and thus allowing higher frequencies (like we saw with Zen+ compared to Zen). Polaris by itself works well, provided newer APIs are used. So if we assume a 10% clock increase over Polaris 20 (which is already 10% faster than Polaris 10), along with slightly faster VRAM, this would keep the mid-range chip right where it's at its best: 1080p with all settings maxed out, or 1440p with lower settings. Drivers will be mature from day one, and the RAM quantity is already good (8GB). It may still not compete with the GeForce GTX 1070, but if it finally shows up at MSRP, it will be very interesting.
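For what it's worth, the two 10% steps compound. A quick back-of-the-envelope sketch in Python: the RX 480's 1266 MHz reference boost is real, but the percentages are only the speculation from this post, not confirmed specs.

```python
# Clock compounding for the rumored Polaris 30 (speculative figures).
polaris_10 = 1266                 # MHz, RX 480 reference boost clock
polaris_20 = polaris_10 * 1.10    # claimed +10% for Polaris 20
polaris_30 = polaris_20 * 1.10    # speculated +10% for Polaris 30

print(f"Polaris 20 (claimed):    {polaris_20:.0f} MHz")  # ~1393 MHz
print(f"Polaris 30 (speculated): {polaris_30:.0f} MHz")  # ~1532 MHz
# Two stacked 10% bumps compound to +21% over Polaris 10, not +20%.
```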
All I want is to be able to get one at MSRP; I don't care about anything else.
A couple of thoughts... I've been hearing about a refresh/shrink of the mid-range, and heard a stupid (economically) rumor that it was Vega. Polaris at 12nm would actually make a lot of sense: it could be sold dirt cheap at a profit, with both the process and the architecture being mature. But that doesn't mean it's true. The other thought: anyone who thinks the Vega release was a bomb or ineffective is a fanboy. They've sold every one they could make and are filling back-orders, while the behemoth (cue dark metal music) Nvidia is sitting on shiploads of product. From a business perspective, that "lack of production" at AMD literally saved them from over-production. Both companies are doing quite well, thank you.
AMDfan:

Why would they shrink Polaris to 12nm when Vega 20 on 7nm is in full production (as we speak), launching Q4 2018 / Q1 2019? I say there will be no Polaris 30 cards.
It makes some sense. Whatever mid-range Vega product was intended to ship probably didn't, due to the cost of HBM2 (mobile Vega in desktop form, perhaps). Obviously Navi will not arrive fast enough to fill the gap this year. Using an existing design on 12nm (a 14nm derivative) is far cheaper and faster than trying to speed up the development of whatever comes after, and it will better compete with what Nvidia has on the market, especially since the current cards are getting stale in people's minds. Really, any improvement is better than nothing for AMD.
Seikon:

Well, my RX 480 is choked by memory bandwidth; using better timings gives me a bigger performance boost than going from 1303 to 1420 on the core. So if they manage to bring a 1600MHz Polaris + GDDR6, it should beat a GTX 1070, if you ask me :/
First off, to all the naysayers: Polaris and GCN have never been a "slouch", and people need to get this through their thick skulls. The ONLY reason Ngreedia's stuff appears "so fast" is that they chopped out as much as they possibly could AND emulate or mess with the software/hardware as much as they can to ensure they can clock things up as high as possible. Per mm², Polaris and GCN have been VERY good overall since they launched, with the first generation being the Radeon 7xxx series.

Now, as far as memory being the "bottleneck": keep in mind the design raises or lowers the timings based on the speed the memory operates at. With GDDR5 especially, there are "sweet spots" where on YOUR card you may see, for example, a massive jump in performance at 1300 on the memory clock, while on mine I may see a more gradual climb up to a maximum of 1290, after which performance drops off. There are many subtimings, clock gating, power controls, etc. that WE DO NOT HAVE ACCESS TO. Every single GPU or CPU is "unique", plus or minus a whole bunch of factors, even if it is the SAME model from the SAME company (such as MSI or ASUS, or even directly from AMD). Upping the memory to, say, GDDR6 will not automatically make a card "faster" if the shaders and the rest of the design were NOT built to take full advantage of it (let alone the games, software, and OS being able to effectively tap into it).

One has to run multiple benchmarks, games, and mining programs on YOUR card to test where performance increases as well as where it decreases. These things are not like cars, where giving it X more power will ALWAYS mean X more speed to the "wheels".

Either way, I will take a full-fat Radeon over a gimped-out Ngreedia GeForce any day of the week and twice on Sundays. They are built better (use higher-quality components) and are backed by a company trying to help everyone, not just its own wallet.
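To put some rough numbers on the bandwidth point, a minimal sketch: the 8 Gbps / 256-bit figures are the RX 480/580 8GB reference specs, while the GDDR6 data rate is just an assumed example, not anything announced.

```python
# Peak memory bandwidth: effective data rate (Gbps) * bus width (bits) / 8.
def peak_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

print(peak_bandwidth_gbs(8.0, 256))   # RX 480/580 8GB reference: 256.0 GB/s
print(peak_bandwidth_gbs(12.0, 256))  # assumed 12 Gbps GDDR6: 384.0 GB/s
```

A 50% jump in peak bandwidth on paper, though, as argued above, the rest of the chip has to be able to use it.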
Just remember, with Nvidia "it is the way you got played": basically paying for a souped-up V4 instead of the V8 it "should be". Funny, because if you take those extra-pricey Ngreedia cards and ask them to do all the extras that Radeons can without resorting to any tricks, they fall flat on their ass. It has been this way since Kepler/Fermi, and somewhat with Maxwell (because those still had the majority of things in hardware). Just like a V4 with a nitro shot can win many races and not use as much fuel, when you need the "power", they do NOT have it. Nvidia wants everyone to play by the rules they set, but it is OK if they "cheat" the system to make the race unfair. Funny as hell that many TWIMTBP titles, as well as PhysX (when it was able), ran WAY better 99% of the time on Radeons; that is, until Nvidia threw hissy fits and made devs and the OS maker screw with things to make themselves "appear" fast and everyone else (mainly AMD) look like it was drawing more power than it should or running slower in comparison (while still looking far nicer).

It was not all that many generations ago that Nvidia cards were the ones using a god-awful amount of power and shooting out significant amounts of heat (often enough burning themselves out sooner than they should have), because Nvidia refused (and still does) to use the best-quality components on their products. But sure enough, when Nvidia decides to "chop things away" to get power use down to generally more reasonable levels, THEN AMD is the "bad guy" because they are not "identical" to what Ngreedia is doing. Demonize AMD when they do not give "enough" power, which slows them down, but give them crap when they use about the same power as they always have (or Nvidia used to), and OMG, they are the devil. It is not AMD's fault that everyone puckers up and kisses Intel/Nvidia arse every two seconds and dumps their wallet for the "pleasure" of puckering up... AMD builds the best they possibly can and does the most they can, where and when they are able. Intel and Nvidia really do not have to try very hard; hell, they can royally screw up and people still just throw money at them.

Anyway: in per-mm² performance, performance per watt, transistor count, ACTUAL power consumption (constant and absolute, as well as the amps drawn from the 12V rail), heat output, and overall capability, Radeons from the 7xxx series all the way to the current Vega have been VERY competitive if people judge them on their own merits. AND AMD has some really good software backing it these days, not just the "flash but no substance" Nvidia likes to throw around. We would NOT have GDDR2/3/4/5/HBM without AMD putting them on THEIR GPUs in the first place and Nvidia letting AMD do all the "hard work" of proving they were worth using before Nvidia "wised up" and started using them as well. Nvidia likes everyone else's cookies but does not want to share any.

I personally hope that whatever this so-called Polaris 30 is, they end up having plenty on the shelves so the selling price does not jump through the roof within the first month or two, where gamers either have to go without or pay through the nose because sellers (such as Amazon, Newegg, or the AIBs) end up being greedy mofos and charging more than they should. If AMD deems they should sell at $200 USD, then they damn well should be at that price, not automatically an extra 50% because of a fancier fan, with the sellers throwing another 20% on top of that as well, FFS ^.^
@Dragonstongue: Your wall of text is full of things even hardcore AMD fans would disagree with. I think you are mixing things older than eight years into the current state of affairs. The days when nVidia delivered higher performance per dollar by sacrificing IQ are long gone. Yes, there were a few instances where game optimization reduced texture quality here and there, but generally IQ is pretty much the same on both sides. And sadly, while you may be right about nVidia taking technological shortcuts to achieve higher clocks or performance in general, who's to say that is wrong? They still deliver a bit higher gaming performance on cheaper cards than AMD. Yes, AMD can shine under heavy workloads, and therefore one can say they are better in some way... but it is a situation like the Sandy Bridge i5 vs. the Bulldozer 8C/8T. The i5 was the better choice for a very long time, and even now it is for most games. Bulldozer's overall strength did not make it a better product, just a product which outlives the i5 in usability.

As for perf/watt, AMD GPUs are not power hungry by design, but by AMD's choice. AMD released all the GCN high-end cards, and then even the higher mid-range ones, well above the inflection point of the power efficiency curve. On almost all of those cards you can cut power consumption by a good 10~15% by cutting the clock by 4~5%; downclock by 10% and power goes down 30~35%. AMD could have released all those GPUs clocked lower, kept the better coolers, and left overclocking to the customer or the AIB partner. Then everyone would say they are good OCers and power efficient, since most sites measure power efficiency at stock clocks. Hey, but I agree with the sentiment that nVidia is harmful to the PC environment. I do not like them, but they still make good GPUs.
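The downclocking arithmetic roughly follows the classic CMOS dynamic-power relation, P ∝ f·V². A minimal sketch, assuming voltage has to track frequency about 1:1 near the top of the curve; this is a textbook idealization, not a measured Polaris/Vega profile.

```python
# Dynamic power scales roughly as f * V^2 (classic CMOS approximation).
def relative_power(f_scale: float, v_scale: float) -> float:
    return f_scale * v_scale ** 2

# Assume voltage must track frequency 1:1 near the top of the V/f curve:
for cut in (0.05, 0.10):
    s = 1.0 - cut
    print(f"-{cut:.0%} clock -> about -{1 - relative_power(s, s):.0%} power")
# -5% clock  -> about -14% power
# -10% clock -> about -27% power
```

That lands in the same ballpark as the 10~15% and 30~35% figures quoted above; real cards deviate because static leakage and the actual voltage curve are not this tidy.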
I do need to use less power. My 1080ti is used at full pelt 6 hours a day as I'm a heavy gamer and the power company must love me.
bemaniac:

I do need to use less power. My 1080ti is used at full pelt 6 hours a day as I'm a heavy gamer and the power company must love me.
Mine will go broke with me: 48 watts of hardcore gaming 😛 https://abload.de/img/dd2g8sao.png
They should release a fatter Polaris with more CUs and ROPs, couple it with GDDR6, and also up the clocks with the more refined process technology. It could be an interesting GPU, even if it ate as much or even a little more energy than the current RX 580. That is, if they can't make a Vega version with a memory controller using GDDR6 instead of the troublesome HBM2.
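For a rough sense of what a "fatter" Polaris could mean on paper, a sketch using the standard FP32 throughput formula: the RX 580 numbers are its reference specs, while the wider configuration is purely invented for illustration.

```python
# FP32 throughput: stream processors * 2 ops/clock (FMA) * clock (MHz) / 1e6.
def fp32_tflops(stream_processors: int, clock_mhz: float) -> float:
    return stream_processors * 2 * clock_mhz / 1e6

print(fp32_tflops(2304, 1340))  # RX 580 reference: ~6.2 TFLOPS
print(fp32_tflops(2816, 1500))  # invented wider/faster part: ~8.4 TFLOPS
```

On paper only, of course; TFLOPS do not translate 1:1 into game performance across architectures.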
Celcius:

Carping about the energy efficiency of any GCN-based video card in mid-2018 is very much like flogging the proverbial dead horse. GCN has never been about energy efficiency, and likely never will be.
Ummmmm, what are you on, or what did you partake of? AMD has for many years now targeted the "middle", aiming for acceptable performance at a reasonable cost and power consumption.

GCN first generation (Radeon 7700/7800/7900, well, maybe not the 7900 as far as outright power effectiveness goes) went up against Nvidia's Fermi/Kepler (500 and 600 series). In most cases the AMD cards ended up using less power and had the muscle to back it up, whereas the Nvidia counterparts might have been "mighty" but were not quite as effective in some things (without resorting to BS to make it happen).

GCN 2 (the 7790/8790 and R7/R9 260 through 290 series, as well as some of the 300s) went up against the Nvidia 600 series primarily and was once again comparable overall in price, performance, and power use. Obviously there were some exceptions, but those usually came with very potent performance to back up the power and heat produced... no different from, say, Nvidia's 400/500/600/700 series, where some cards were quite "efficient" but many were not built nearly as well overall, started cutting things back, yet still chewed power and belched heat like nobody's business.

GCN 3 (primarily the R9 380/X, R9 Fury, and RX 530) was more or less directly up against the Nvidia 700 series, and once again AMD came out ahead with the sweet-spot strategy they had been using since the HD 4xxx series. There were some oddballs such as the 750 Ti, but for the most part AMD was very, very comparable; not always winning the absolute performance crown, no, but more often than not buyers were getting a good deal on price, performance, and power use (if they paid attention).

GCN 4 (pretty much the RX 460+ and RX 540+) had a much more difficult time, being effectively an "optimized" and in some cases rebadged take on previous generations (probably because AMD was having a hell of a time keeping the doors open after Bulldozer, working on the semi-custom chips for consoles, and maybe even redoing things to make sure Zen was successful). They ran headfirst into the "top" Nvidia chips from the 700/900 generations. It was obviously a much harder fight for AMD at the "top" performance tier, against the likes of the Titans or the 980 Ti, but they still offered some competition AND did not cut corners to make it happen. Their power use was in nearly all cases at expected levels (besides the RX 470/480 not following PCI-SIG specs, but then again Nvidia has broken this protocol many times over the years, going all the way back to the GeForce 6000 series, but no one remembers that, huh). At least overall, AMD did what they could to support DX11 and DX12, much more so than Nvidia, without resorting to BS tricks and software emulation, etc.
Where AMD "dropped the ball" is that they did not even try to throw something out there to compete directly with the Nvidia 1000 series. The RX 500s were left to deal with it even though they were meant to deal with the Nvidia 900 series (which they did for the most part, but they ran out of muscle against the highest-end ones and burned a bunch of extra power doing so; then again, the 980/980 Ti and Titans were not exactly power-sipping or low-temperature cards either).

Now we have GCN 5, the Vega generation. For good or for bad, this is the Radeon that competes directly with the 1070/1080 and, so far, the 2000 series. Yes, they use a good chunk of power, but they also do pretty much everything in hardware, not picking and choosing software emulation to "play a game" with supported features, which Nvidia has been doing more and more since the 600 series. Is Nvidia "clever" about this? Sure. But when they claim FULL DX11 or DX12 support without the capability to support it in full, they are lying through their teeth.

Anyway, it has been a very interesting "battle" since AMD released their HD 4000 and Nvidia their GTX 200 series. AMD, in my opinion, brought far more to the table in features that benefit everyone, including Nvidia when they choose to use them; tessellation included, for which they convinced MSFT to remake the "rules" so that they (Nvidia) could emulate it in software while AMD had to do it in hardware. Conversely, since that GTX 200 vs. HD 4000 era, Nvidia has done more and more to "lock things down", even if it means their own customers "suffer", just so long as their competitors (AMD and Intel) suffer the most... Nvidia does the least amount possible, wants to charge absolutely the most for the "experience", and often enough uses under-spec components on their products to keep more in their pocket as well. MY OPINION.

GCN (like VLIW5 and VLIW4 before it) followed what AMD had been doing for many generations: start in the middle, then pump it up for the "big boys" or chop the die down for the budget/mobile segment. They have done a pretty good job on performance, price, heat, power, durability of the design, and "staying power" over years and years of use. Of course there were exceptions; there always are. When something was designed for, say, 180W of ACTUAL power use, it is not exactly easy to keep it running well when you drive it at 240-350W instead; things start falling apart and performance ends up not matching the extra power given. But the performance/mainstream/budget Radeons actually did fairly well at this. Nvidia, on the other hand, tends to target the TOP first and cut things back to reach the middle and then the bottom, with exceptions where they did a more "custom" design for the segment. Anyway, to state that GCN was not built for power efficiency is a dumb-arse thing to say; it depends on the generation, what they went up against, what they were capable of doing, and so forth.
But GCN was very much designed around "use what is required" and tweaked from there to address different parts of the market. Yes, it is 2018, and no, GCN does not directly compare to, say, the GTX 1xxx generation (they were not meant to), but even then, there are some cards that actually DO compare very well against them in ACTUAL power usage (not the moronic TDP everyone likes to throw around). The GTX 1000 series needs to be taken with a grain of salt, only because they chopped things away to "speed them up" coming from the 900 series. They were "clever", and they also had the benefit of being mostly on TSMC 16nm, while AMD's RX cards (most of them) were on GloFo 14nm, which unfortunately suffered from higher-than-expected power draw when pushed above a certain speed, or more leakage, and so came with somewhat higher per-transistor power requirements. AMD "kept the fat", Nvidia decided to "cut the fat". The main difference here, IN MY OPINION at least, is that GCN can "do it all" unless it is an Nvidia-proprietary way of doing things; in that respect GCN is the more "capable" architecture when it comes to DX and, obviously, Vulkan.

Do some reading, take off the rose (green and blue) colored glasses, use your brain to try to be unbiased, and compare them on their own merits. Radeon is not a "pretender"; they are built well and do things well. There are always limits no matter what the product is, but at least they do not try to "hide" those limitations, unlike Nvidia, who pays developers to tweak the code or finds others to blame for the shortcomings: "don't look at what we did wrong, look at what they did".