AMD Radeon RX rumor: 12nm Polaris 30 in the 4th quarter
Mitch 74
Shrinking an existing chip to a new process isn't free - that's why I don't think we'll ever see Polaris on 7nm, not without some major retooling. However, 12nm is really a much-improved 14nm: the chip's size won't change, but circuit paths will be more precise, causing less leakage and thus allowing higher frequencies (as we saw with Zen+ compared to Zen).
Polaris by itself works well - provided newer APIs are used. As such, if we assume a 10% clock increase over Polaris 20 (which is already 10% faster than Polaris 10) along with slightly faster VRAM, this would keep this mid-range chip right where it's best: 1080p with all settings maxed out, or 1440p with lower settings. Drivers will be mature right away, and the RAM quantity is already good (8 GB). It may still not compete with the GeForce GTX 1070, but if it finally shows up at MSRP, it will be very interesting.
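A quick back-of-the-envelope check of the compounded clock claim above (illustrative arithmetic only; the ~10% figures come from the comment, not from benchmarks):

```python
# Compounding the two ~10% clock-speed steps mentioned above:
# Polaris 10 -> Polaris 20 -> hypothetical Polaris 30.
polaris_20_over_10 = 1.10   # Polaris 20 ~10% faster than Polaris 10
polaris_30_over_20 = 1.10   # assumed ~10% gain from the 12nm refresh

total = polaris_20_over_10 * polaris_30_over_20
print(f"Polaris 30 vs Polaris 10: +{(total - 1) * 100:.0f}%")
```

So a 12nm Polaris 30 at those clocks would land roughly 21% above the original Polaris 10, before accounting for faster VRAM.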
Silva
All I want is to be able to get one at MSRP, I don't care about anything else.
tunejunky
a couple of thoughts...
i've been hearing about a mid-range refresh/shrink, and heard a (economically) stupid rumor it was Vega. Polaris at 12nm would actually make a lot of sense, and it could be sold dirt cheap at a profit (both process and architecture being mature).
but that doesn't mean it's true.
the other thought: anyone who considers the Vega release a bomb or ineffective is a fanboy.
they've sold every one they could make and are filling back-orders, while the behemoth (cue dark metal music) Nvidia is sitting on shiploads of product.
from a business perspective, that "lack of production" at AMD literally saved them from over-production.
both companies are doing quite well, thank you.
user1
Dragonstongue
Fox2232
@Dragonstongue : Your wall of text is full of things even hardcore AMD fans would disagree with. I think you're mixing things more than 8 years old into the current state of affairs. The days when nVidia delivered higher performance/$ by sacrificing image quality are long gone. Yes, there were a few instances where game optimization reduced texture quality here and there, but generally image quality is pretty much the same on both sides.
And sadly, while you may be right about nVidia taking technological shortcuts to achieve higher clocks or performance in general, who's to say it is wrong? They still deliver a bit higher gaming performance on cheaper cards than AMD.
Yes, AMD can shine under heavy workloads, and therefore one can say they are better in some way... but it's a situation like the Sandy Bridge i5 vs. the 8C/8T Bulldozer. The i5 was the better choice for a very long time, and even now it is for most games. Bulldozer's overall strength did not make it a better product - just a product that outlives the i5 in usability.
With perf/watt, AMD GPUs are not power hungry by design, but by AMD's choice. AMD released all of its GCN high-end cards, and even the higher mid-range ones, clocked well above the inflection point of the power-efficiency curve.
On almost all of those cards you can cut power consumption by a good 10-15% by cutting the clock by 4-5%. Downclock by 10% and power goes down 30-35%. AMD could have released all those GPUs clocked lower, given them better cooling, and left overclocking to the customer or AIB partner. Then everyone would say they are good overclockers and power efficient, as most sites measure power efficiency at stock clocks.
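The disproportionate savings described above follow from how dynamic power scales. A rough sketch, assuming dynamic power goes as f·V² and that near the top of the voltage/frequency curve voltage must rise roughly in step with frequency (so power behaves close to f³); the numbers are an illustration of the commenter's claim, not measured data:

```python
def relative_power(clock_fraction: float) -> float:
    """Approximate relative dynamic power for a relative clock,
    assuming voltage scales linearly with frequency (P ~ f * V^2 ~ f^3).
    This is a rule-of-thumb model, not a measurement."""
    return clock_fraction ** 3

# A 5% downclock cuts modeled power by ~14%:
print(f"-5% clock  -> {(1 - relative_power(0.95)) * 100:.0f}% less power")
# A 10% downclock cuts modeled power by ~27%:
print(f"-10% clock -> {(1 - relative_power(0.90)) * 100:.0f}% less power")
```

The simple cubic model lands in the same ballpark as the quoted 10-15% and 30-35% figures; real savings vary per card because the actual voltage/frequency curve isn't perfectly linear.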
Hey, but I agree with the sentiment that nVidia is harmful to the PC ecosystem. I do not like them, but they still make good GPUs.
bemaniac
I do need to use less power. My 1080ti is used at full pelt 6 hours a day as I'm a heavy gamer and the power company must love me.
Noisiv
https://abload.de/img/dd2g8sao.png
Mine will go broke with me.
48 Watts of hardcore gaming 😛
Kaarme
They should release a fatter Polaris with more CUs and ROPs and couple it with GDDR6, also upping the clocks thanks to the more refined process technology. It could be an interesting GPU, even if it ate as much or even a little more energy than the current RX 580. That is, if they can't make a Vega version with a memory controller using GDDR6 instead of the troublesome HBM2.
Dragonstongue