AMD was always good at gaming.
I would dare to say the best budget card in history was the HD 5770... that card sold like hotcakes with extra syrup!
Hmm, the 5770 was in an interesting space: a feature level 11 card at a point in time when Nvidia had none, so it basically had no competition feature-wise.
When competition did arrive, it sat between the GTS 450 and GTX 460. The former was 10 dollars cheaper but offered only 84% of the performance of the 5770 (where tessellation wasn't used). The latter was a fair bit faster but came at additional cost, since it had twice as many memory modules.
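To put those numbers together, here's a quick performance-per-dollar sketch. Only the "$10 cheaper" and "84% of the performance" figures come from above; the $160 price for the 5770 is purely an illustrative assumption.

```python
# Rough value comparison from the figures above.
# ASSUMPTION: an illustrative $160 price for the HD 5770; the GTS 450
# is quoted above as $10 cheaper at ~84% of the 5770's performance.
hd5770 = {"price": 160, "perf": 1.00}  # 5770 as the performance baseline
gts450 = {"price": 150, "perf": 0.84}  # 84% of the 5770, 10 dollars less

def perf_per_dollar(card):
    # Relative performance divided by price: higher means better value.
    return card["perf"] / card["price"]

print(f"HD 5770: {perf_per_dollar(hd5770):.5f} perf/$")
print(f"GTS 450: {perf_per_dollar(gts450):.5f} perf/$")
```

On these assumed numbers the 5770 still comes out ahead on performance per dollar, which is why the $10 saving didn't make the GTS 450 the better buy.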
I am really not a fan of this.
Ultimately, this approach means AMD cards would offer much less value to the customer compared to Nvidia.
With Nvidia, you basically get the same architecture that supercomputers use in your home, at a similar price point, and you can use it for all kinds of purposes: training ML models, DLSS, really handy AI features in content creation software, accelerating Blender, noise cancellation, AI effects in video conferencing, and more.
You're not getting any of that with an AMD GPU because it lacks ML acceleration; it's not a supercomputer architecture but one dumbed down for gaming. So with AMD you basically get a GPU that is good at gaming and nothing else, while with Nvidia you get a GPU you can do anything with, at high performance.
Until AMD steps up their game by adding proper ML acceleration and becomes competitive in software, I will always choose Nvidia because I get much more value for my money there.
Technically you didn't skip the 400 series, since the 500 series was just a tweak of it.
You know what I mean, the 580 was a serious leap from the 480. OK, maybe not the same leap that the 8800 GTX was over the 7800 GTX, but a leap nonetheless.
It was on the same arch and had the same 1.5 GB of VRAM. Nvidia didn't actually have a leap in performance until Kepler, and AMD had the legendary Tahiti. 🙂
We can argue the differences between GF110 and GF100 if we want; I think it was the texture samplers ported from GF104 and full-rate FP16?
Many of you guys are very technically knowledgeable; I'm not. I just judge things by heat, performance, OCability, etc. I don't get into the weeds on the technical side, to be honest. I don't even know what names any particular CPU or GPU core has, but I've built dozens of custom watercooled rigs, fabricated panels, modded cases, etc. I was an aircraft engineer before I decided joining the Army was way sexier.
AMD did use that approach, up until RDNA - arguably RDNA 2.
GCN had excellent compute performance and did better than Nvidia's (gaming) offerings in many specialized tasks due to being essentially the same cards as their pro lines.
Unfortunately that also meant the cards weren't as ideally suited for actual gaming as they could be, considering heat/power and die size.
So we got RDNA and CDNA.
I, too, like the idea of 'fully enabled' gaming graphics cards - in theory.
In practice, I'll gladly sacrifice the features I don't have much use for (I primarily watch YouTube, argue with strangers on the web, and game with my card) to get more performance in my primary usage scenario.
And here we are.
Once we've established which new features are actually usable for gaming long-term, those will trickle down regardless. A good bet is to keep a lookout for what gets introduced on the consoles.