Rumor: AMD's Navi 21 GPU has a 255-watt TGP and boosts up to 2.4 GHz
AMD will reveal its first RDNA 2 based Radeon products on October 28th. Meanwhile, new specifications of the AMD Navi 21 XT "Big Navi" GPU for the Radeon RX 6900 XT graphics card have been reported by a credible source. Based on the reported specs, the AMD Radeon RX 6900 XT looks like a beast of a graphics card with high GPU clocks.
In his tweet, Patrick Schur claims that the AMD Navi 21 XT GPU, supposedly the "Big Navi" chip that enthusiast gamers have been waiting for, will be used in the Radeon RX 6900 XT graphics card; there are also reports that a faster XTX variant exists. The Navi 21 XT GPU would reportedly be clocked at 2.4 GHz, the highest frequency ever listed for a desktop graphics card, which suggests that even higher boost frequencies could be possible. With recent AMD cards, though, the advertised boost clock and the actual game clock are two different things, so we're not quite sure what to make of that figure yet.
The GPU is also said to feature a TGP (total graphics power) of 255W. That figure covers the GPU itself, not the power draw of the entire card. For comparison, the GeForce RTX 3080 is rated at 320W and the GeForce RTX 3070 at 220W, though NVIDIA's numbers describe the whole board. The graphics card is said to come with 16GB of GDDR6 memory, which matches several previous leaks. As for benchmarks, the performance appears to be very close to that of the NVIDIA GeForce RTX 3080. AMD will officially introduce its family of Radeon RX 6000 series graphics cards on October 28.
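For a rough idea of how a GPU-only TGP relates to the power draw of a complete card, here is a minimal back-of-the-envelope sketch in Python. Only the 255W TGP comes from the leak; the memory, fan and voltage-regulator figures below are hypothetical placeholders used purely to illustrate the arithmetic.

# Back-of-the-envelope sketch: GPU-only TGP vs. total board power.
# Only the 255 W TGP is from the leak; every other value is a hypothetical
# placeholder chosen just to show how the numbers could add up.

tgp_w = 255            # reported TGP: power budget for the GPU chip itself
memory_w = 30          # assumed GDDR6 memory power (hypothetical)
fans_misc_w = 15       # assumed fans, display outputs, misc. board logic (hypothetical)
vrm_efficiency = 0.90  # assumed voltage-regulator efficiency (hypothetical)

# Total board power covers everything on the card, including conversion losses.
total_board_power_w = (tgp_w + memory_w + fans_misc_w) / vrm_efficiency
print(f"Estimated total board power: {total_board_power_w:.0f} W")  # ~333 W with these assumptions

With those placeholder values, the card as a whole would land in the same ballpark as the 320W GeForce RTX 3080, which is why a GPU-only TGP and a whole-board TDP should not be compared directly.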
Senior Member
Posts: 3951
Joined: 2009-09-08
This aspect is going to be very interesting for me in the near future. Now that AMD, Intel and Nvidia can no longer rely on smaller nodes to increase performance, what are they going to do? Chiplets seem to be the answer on the CPU side and maybe on GPUs too, but even those have physical limits, and then what's next? Are software solutions like DLSS the answer, or is performance going to stall for a few years until something new solves the issue?
Senior Member
Posts: 13807
Joined: 2004-05-16
Chiplets for GPUs are definitely coming. Nvidia has already de-risked them and has working inference chiplet prototypes. AMD is certainly going the same route. I imagine that for high performance computing, both companies will utilize chiplet-based designs for their next architectures (Hopper & CDNA2) - gaming variants will probably take longer due to the sophistication of scheduling gaming workloads. DLSS, RT and similar shifts in graphics paradigms are still necessary though, not only from a performance scaling perspective but for visual quality as well. RT, for example, while in its infancy, can theoretically push far greater image fidelity than traditional raster-based techniques ever could.

The problem is that there is going to be a transition period - which Nvidia felt the need to start with Turing - and from a financial/competition/timing standpoint that was a really good idea. They essentially paused general performance momentum and spent a whole generation just developing a new feature set, but they did this at a time when their competitor had completely surrendered the high-end market. Now they have established a new model for graphics that has a whole slew of possibilities for optimization - 5nm node shrink fails? Who cares, the next architecture can just improve RT cores by 200%, or the denoising process becomes massively improved, or some new technique for ray traversal is developed and suddenly you can do twice the rays per scene, enabling higher fidelity/faster performance without a node shrink. DLSS is the same way: improvements to the technology over time will enable performance gains that don't need to be realized via die shrinks.
But all of this could only happen because someone made that lateral move to enable all these features and start heading down these paths. And when you compare a company like Nvidia, which holds a massive monopoly yet still attempts to innovate, with someone like Intel - who probably deserves to have their entire board of directors ousted - it's night and day.
Senior Member
Posts: 11809
Joined: 2012-07-20
Almost like skipping one generation, right? Almost... but they were selling Turing GPUs built on a cheaper manufacturing process for a lot of money. And that makes current prices look like a good improvement.
As the market leader they were, and still are, dictating prices. Current prices are exactly where nVidia wants them. They feel no pressure. And it does not matter how many or how few GPUs they get from Samsung.
So, yes, they are doing another Turing in the sense of making fools out of their clients.
I am sure you completely misinterpret nVidia's reasons. They went for the raytracing paradigm shift for profit reasons.
Had they given you traditional rasterization GPUs with that many transistors, it would have been the last generation people bought for the next 4 years. Those GPUs would just cut through 4K without problems. And game developers would have a hard time writing shader code that reduces fps while improving IQ.
Jensen explained his business strategy a long time ago, and it has not changed one bit.
Senior Member
Posts: 13807
Joined: 2004-05-16
Almost like skipping one generation, right? Almost... but they were selling Turing GPUs built on a cheaper manufacturing process for a lot of money. And that makes current prices look like a good improvement.
As the market leader they were, and still are, dictating prices. Current prices are exactly where nVidia wants them. They feel no pressure. And it does not matter how many or how few GPUs they get from Samsung.
So, yes, they are doing another Turing in the sense of making fools out of their clients.
Cheaper manufacturing process, yes, but they were also massive cards. Their profit margins didn't significantly increase over the last two years - in fact they actually dipped. $700 for a 40% performance increase on today's nodes isn't a bad improvement, regardless of previous generation pricing. Based on AMD's earlier numbers, they are in exactly the same boat.
Keep in mind the Radeon VII was also $700 for a card that went defunct within a year, but for some reason everyone writes that off as a one-time event or something.
I am sure you completely misinterpret nVidia's reasons. They went for the raytracing paradigm shift for profit reasons.
Had they given you traditional rasterization GPUs with that many transistors, it would have been the last generation people bought for the next 4 years. Those GPUs would just cut through 4K without problems. And game developers would have a hard time writing shader code that reduces fps while improving IQ.
Jensen explained his business strategy a long time ago, and it has not changed one bit.
Everything a company does is for profit reasons. They are legally obligated to do things for profit reasons - but it's a question of short term vs. long term, and you can make a profit while establishing good foundational technologies. Those two things are not mutually exclusive.
Technological shifts have to come at some point. They chose Turing for it, probably because they knew AMD was going to completely abdicate the high-end market. That's the perfect time to make a shift like that and introduce a bunch of technologies you've been working on for decades in order to gain a favorable position over the competition. It creates a leverage point for the future, it creates yet more value-add for Nvidia customers, and it potentially creates more sales. In this case it didn't - nearly all of Nvidia's growth during Turing was in datacenter, and it was there BECAUSE of their investments in Tensor Cores, which they then pivoted to gaming. Yet Nvidia forecasted this - probably because they knew Turing was a long-term investment and not a short-term one, and probably because Nvidia knew they could save a ton of money by pivoting technology they were building for datacenter into gaming.
10 years from now, when everything is raytraced, beautiful looking, AI-upscaled from 480p to 16K or whatever - are we really going to sit here and go "man, I wish we had 770mm² worth of shaders in Turing instead"? No. Literally no one is going to care.
Senior Member
Posts: 13807
Joined: 2004-05-16
Put anyone with general knowledge of what to do in his shoes and watch them perform. nVidia is basically a monopoly in the mid to high end and in mindshare; he doesn't need to do crap to succeed. Steam surveys until recently had AMD users at under 2%. His leadership in the last 2 years has been a joke, and I find it hilarious when people praise him. Oh wow, he made them big profits, great. At the cost of everyone but his biggest nuthuggers loathing him, when he could have made more profit without everyone hating him by being less eager to please stockholders... according to everyone not named Jensen.
He had his company sandbag for an entire generation due to the lack of competition and basically said as much, all but verbatim. Now it's going to come back to bite him in his obese, overpaid, overhyped, overvalued ass. I think anyone who thinks he's a good leader is insane.
This is such an edgy 4head take lol
Nvidia is in the position it's in because of Jensen, not despite him. They made big profits because of the investments and risks they took in AI - the majority of Nvidia's growth over the last several years has been in datacenter, not gaming. They pivoted that datacenter technology to the mainstream market with Turing - which, again, was essentially a risky investment in the future, but a necessary one, as anyone with a brain sees the writing on the wall for traditional performance and graphics paradigms. We know node shrinks are becoming exponentially more expensive while delivering smaller and smaller density gains each year - you can't just keep slapping more cores on and praying that TSMC bails you out every generation. RT opens up a different avenue for performance and better graphics. DLSS, as a technology, has the potential to overcome the breakdown of Moore's law and Dennard scaling. Everyone else had to sit through a single generation where, if they wanted more performance, they had to pay a few hundred more for what's essentially a luxury product - boohoo, cry a river about it. Nvidia took the opportunity to massively change up their architecture at a time when they had no competition - that's literally the definition of good leadership. Some of those changes, notably RT, are becoming the bedrock for the future of graphics. They did all this while being one of the top companies to work for and paying their engineers above-average salaries.
I don't think the guy is a saint, I'm not saying he was responsible for all these things, and I'm not saying you can't point out mistakes the company has made in hindsight (which is easy), but the idea that he should be ousted is asinine. Nvidia took risks instead of stagnating, they massively increased their value while diversifying their market, they delivered some excellent products for consumers in the last decade, and they did all of this while treating their employees really well. If those qualities aren't the mark of good leadership, then what is?
Also I don't think you know what the word obese means, which makes me question your ability to judge anything.
The performance jumps from Turing and Ampere have been somewhat lackluster versus previous generations, and I feel that it's intentional. I think they got the idea that they dominated the market and lowered their performance goals in order to make things easier for the engineers. As Jensen said, when jumping on the shoulders of giants, you take small hops. Now that they have competition, they should go back to attempting larger performance gains, if they can solve the power draw issues.
The performance jump with Turing was lackluster because Nvidia, and all semiconductor manufacturers operating on the bleeding edge, are facing a massive problem: node shrinks are fucked. All the density and switching gains they used to bring have basically dried up, and the ones they do bring cost something like 5x what they did just 5 years ago. So what do you do? Do you just keep smashing your head against the wall, hoping a company like TSMC/Samsung/ASML/etc has a breakthrough? Or do you switch up your designs to create new possibilities to build on? Nvidia chose to take a generation and dedicate it to establishing a boatload of new technologies - RT, DLSS, Mesh Shaders, VRS, etc. The majority of these required a lot of die space, but the trade-off is that they can now improve upon them in the future, giving them the opportunity to create performance gains where they might not be able to by relying on node shrinks alone.