This is looking more and more like RDNA2 but with better RT performance and better video codecs. Rather underwhelming.
This would be fine, great even, if they at least gave them better prices while doing so. It seems neither Nvidia nor AMD is willing to budge on pricing. I think ppl would be OK staying on current-gen GPUs if they dropped prices so ppl could actually afford em.
"7000 series" is short for: "How to sell the massive existing stock of 6000 series"
Frikin' cryptomining f'ed up two full generations of GPUs. Sheesh...
And to my surprise, one year after the ETH mining shutdown, at least 40% of that mining power is STILL operational on other coins.
(And yes, while miners have stopped buying new GPUs, they aren't selling existing farms either as long as they have access to cheap or free electricity)
As always, the price dictates if its a good or bad product.
Very true, but Nvidia sets the price trend and AMD has been cocky enough lately to not fall far behind Nvidia's pricing. So, since this is bound to perform better than a 4060Ti in most tests (bear in mind AMD has improved their drivers a lot in the past few months) then I expect this is going to cost a bare minimum of $500.
Prices are not going to be very good because AMD can't get enough chips from TSMC, just like Nvidia.
Apparently, Apple has already bought all of next year's 3nm capacity from TSMC, so everyone else will have to wait their turn...
Isn't the 6700 XT already a direct competitor for the 4060 Ti, even the 16GB version (although rather more expensive), and for $350?
Yeah, they're not lasting forever (or who knows), but you have to have really bad electricity prices and/or play 24/7 for the electricity savings to actually be meaningful over a realistic period of time.
Picking the more expensive product just for power efficiency is pretty dumb if it only saves you 50 bucks over 5 years of use.
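To put rough numbers on the "50 bucks over 5 years" point, here's a quick sketch of the savings math. All figures (60 W load-power difference, 2 hours/day, $0.15/kWh) are hypothetical assumptions for illustration, not numbers from this thread:

```python
# Hypothetical inputs -- adjust for your own card, playtime, and tariff.
watts_saved = 60          # how much less the efficient card draws under load (W)
hours_per_day = 2         # daily gaming time
price_per_kwh = 0.15      # electricity price in $/kWh
years = 5

# Energy saved over the whole period, converted W -> kW.
kwh_saved = watts_saved / 1000 * hours_per_day * 365 * years
savings = kwh_saved * price_per_kwh

print(f"{kwh_saved:.0f} kWh saved, ${savings:.2f} over {years} years")
# -> 219 kWh saved, $32.85 over 5 years
```

Even doubling the playtime or the tariff only gets you into the $60-130 range, which is the point: efficiency alone rarely justifies a big price premium.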
So if they launch the 7800 XT with very similar performance to the 6800 XT but better efficiency and some extra features here and there, I'd say it's at least somewhat enticing. Otherwise it's the good ol' strat of forcing people to buy new, more expensive garbage because you stopped producing the previous generation, which was actually the better product in perspective.
AMD and Nvidia both have completely reworked their memory/cache subsystems over the last couple of generations. These numbers are basically arbitrary in a vacuum. Is it enough for the GPU it's feeding? That's the only question to ask.
Some Nintendo emulator developer whined because newer cards can't play Zelda at a billion-x resolution scaling, and then all of the lemming retards on Reddit decided that bus-width is now some super important metric and that AMD/Nvidia are cutting corners. I really do hate that website.
And then there's forum users who don't comprehend the developer's message.
What's not to comprehend? "If you saturate the cache, you're left with the performance of a 128-bit-wide card, and it's very easy to saturate the cache when using the resolution scaler" was literally the message. A very specific scenario that has not affected a single other piece of gaming software. The rest of you are trying to make it into something it isn't.
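For anyone curious what "left with the performance of a 128-bit-wide card" means in raw numbers, the fallback bandwidth on cache misses is just bus width times per-pin data rate. A minimal sketch, assuming 18 Gbps GDDR6 (an illustrative figure, not a quote from the thread):

```python
def bandwidth_gbs(bus_width_bits, gbps_per_pin=18):
    """Peak memory bandwidth in GB/s: (bus width in bytes) * data rate per pin."""
    return bus_width_bits / 8 * gbps_per_pin

# What the card falls back to when the cache is saturated:
print(bandwidth_gbs(128))  # 288.0 GB/s
# Versus a wider previous-gen bus:
print(bandwidth_gbs(256))  # 576.0 GB/s
```

As long as the workload mostly hits the large on-die cache, the narrow bus doesn't matter; extreme resolution scaling blows past the cache and exposes the raw number above.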
If these 7700 XT specs are correct, then it should be a bit faster than a 6800.