AMD Radeon Vega Frontier Edition vs Nvidia Titan Xp

Only faster than a 1080? That's a bummer.
I don't get why AMD is choosing to show these two tests that Nvidia could easily counter with driver updates.. they should be showcasing FP16 performance in compute/deep learning workloads. This whole release just seems so weird to me.
Only faster than a 1080? That's a bummer.
Indeed, I reckon for the majority of benchmarks Vega will be on the cusp of 1080 Ti performance, but I imagine in DX12/Vulkan workloads the gap will be significant, particularly at high resolutions (just like how the Fury X can maintain relatively high performance at high resolutions). But time will tell.
The point is that you now have a choice: 1080 Ti or Vega with similar performance, so just buy the cheaper one. I have a 1080 Ti I'd swap for Vega if it's got more VRAM though. I used 10GB in ROTR today. After 12 solid hours of DiRT 4 without a break I strangely also had about 10GB of VRAM used, but I think the unpatched game had a leak. Looks like AMD went with 8GB... urgh, I can already think of 8 games out now that I could overfill that with.
No surprises, Vega will be just above the regular (non-Ti) 1080. Just as expected and discussed before. At 300W. Good. Very good.
So wait, they put a workstation card up against a gaming card, and it won in workstation tests and lost in gaming tests. Okaaaayy.
I don't understand the OpenGL Cinebench results.. a stock Ti scores near 140fps
So wait, they put a workstation card up against a gaming card, and it won in workstation tests and lost in gaming tests. Okaaaayy.
It's most likely on normal Re-Live drivers, just going by the fact that it would be barely faster than the W7100 workstation card. It's no more a workstation card than the Titan Xp in this situation. Going by the driver numbers, those are "basic" drivers, not workstation ones.
It's most likely on normal Re-Live drivers, just going by the fact that it would be barely faster than the W7100 workstation card. It's no more a workstation card than the Titan Xp in this situation. Going by the driver numbers, those are "basic" drivers, not workstation ones.
Yeah, but I guess my question is: what's the point? It's like "our non-workstation card outperforms Nvidia's non-workstation card in workstation benchmarks" - ok? But who is buying either one of those cards for workstations? If they were showing other benchmarks in the mix, fine, but they aren't - they keep benching workloads that neither card is intended for. Why?
Only faster than a 1080? That's a bummer.
You expected AMD to leapfrog Nvidia? A year later they can win over the 1080, and that's good considering the new tech developed for the card. It's still late af though.
I don't get why AMD is choosing to show these two tests that Nvidia could easily counter with driver updates.. they should be showcasing FP16 performance in compute/deep learning workloads. This whole release just seems so weird to me.
Because the cards can just perform average compared to Nvidia, unfortunately.
Looks like AMD went with 8gb.............urgh I can already think of 8 games already out that I could overfill that with.
With the new memory controller it doesn't matter; the card will manage the memory and the game will work even if you try to overfill it.
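The over-commit behavior being claimed here can be pictured as a toy LRU pager: pages that don't fit in local VRAM get spilled to system memory and paged back in on access. This is a purely illustrative sketch, not AMD's actual HBCC implementation; the class and method names are hypothetical.

```python
from collections import OrderedDict

class HBCachePager:
    """Toy model of a high-bandwidth cache controller: local VRAM holds a
    fixed number of pages; overflow is transparently spilled to system
    memory and paged back in on demand, evicting the least-recently-used
    page. Illustrative only - not AMD's real design."""

    def __init__(self, vram_pages):
        self.vram_pages = vram_pages
        self.vram = OrderedDict()   # page id -> data, kept in LRU order
        self.sysmem = {}            # pages spilled out of VRAM

    def touch(self, page, data=None):
        if page in self.vram:
            self.vram.move_to_end(page)          # hit: refresh LRU position
        else:
            if page in self.sysmem:
                data = self.sysmem.pop(page)     # miss: page back in
            if len(self.vram) >= self.vram_pages:
                old, old_data = self.vram.popitem(last=False)
                self.sysmem[old] = old_data      # evict coldest page
            self.vram[page] = data
        return self.vram[page]

# A game "using" 3 pages on a card with room for only 2 still works:
pager = HBCachePager(vram_pages=2)
pager.touch("A", b"a")
pager.touch("B", b"b")
pager.touch("C", b"c")   # VRAM full: "A" is spilled to system memory
```

The point of the sketch is just that oversubscription degrades into paging traffic rather than a crash - which is also why it only "doesn't matter" as long as the working set per frame still fits in VRAM.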
I don't understand the OpenGL Cinebench results.. a stock Ti scores near 140fps
Cinebench OpenGL is bugged on Nvidia's Pro line of cards, let alone on consumer-grade drivers. FE scores are low as well - Polaris pro cards score 180fps. https://hothardware.com/reviews/nvidia-quadro-p6000-and-p5000-workstation-gpu-reviews?page=5 The main takeaway is this: AMD went to great trouble picking tests that tell us nothing. We already knew that Vega FE > Titan Xp in SPECviewperf and such, yet it gets annihilated even by Maxwell Pro cards.
It's most likely on normal Re-Live drivers, just going by the fact that it would be barely faster than the W7100 workstation card. It's no more a workstation card than the Titan Xp in this situation. Going by the driver numbers, those are "basic" drivers, not workstation ones.
Fair enough, but do you think those drivers have been optimised at all for that Vega card?
I don't understand the OpenGL Cinebench results.. a stock Ti scores near 140fps
Even my old 780 got ~180-190fps; it's highly CPU bound.. they used an R7 1800X there, so that's why it's so low, but then again AMD still scored higher vs the Titan Xp..
I don't get why AMD is choosing to show these two tests that Nvidia could easily counter with driver updates.. they should be showcasing FP16 performance in compute/deep learning workloads. This whole release just seems so weird to me.
I'm pretty sure the average gamer couldn't care less (let alone understand) such benchmarks. This is a Radeon, not a FirePro. AMD also likely doesn't want cryptocurrency miners to get inspired and take all the sales. I'm not saying you're wrong, but you have to remember the targeted demographic, which is specifically gamers who want a "good enough" 4K experience without buying a piece of hardware that is worth roughly as much as every other component in their PC combined (keep in mind, if you're budgeting on a GPU, you're budgeting the rest of the system too).
As always: AMD is playing second fiddle. "Don't go for second best baby / Put your love to the test..." :banana:
Even my old 780 got ~180-190fps; it's highly CPU bound.. they used an R7 1800X there, so that's why it's so low, but then again AMD still scored higher vs the Titan Xp..
So what? The $450 Quadro P2000 scores higher than the Titan Xp. The benchmarks are entirely driver based, and Nvidia gimps the driver on its consumer cards, Titan Xp included. It appears AMD does the same, considering their Polaris-based WX 7100 performs basically the same as this: https://hothardware.com/ContentImages/Article/2581/content/spec1.png So is the point of this test to show us that AMD gimps its driver slightly less than Nvidia? Also, why no deep learning benchmarks? This card is being marketed as a Titan Xp competitor - the Titan Xp is the most popular deep learning training card. Frontier should be way better, both because its FP32 perf is higher and because it has double the inferencing performance thanks to the mixed math operations the Titan Xp lacks. Yet no deep learning benchmarks at all. Why? Makes absolutely zero sense.
I'm pretty sure the average gamer couldn't care less (let alone understand) such benchmarks. This is a Radeon, not a FirePro. AMD also likely doesn't want cryptocurrency miners to get inspired and take all the sales. I'm not saying you're wrong, but you have to remember the targeted demographic, which is specifically gamers who want a "good enough" 4K experience without buying a piece of hardware that is worth roughly as much as every other component in their PC combined (keep in mind, if you're budgeting on a GPU, you're budgeting the rest of the system too).
You realize this card is $1200/$1700 right? It's essentially AMD's Titan XP except they aren't benchmarking it against any of the things Titan XP is used for.
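For context on the "mixed math" point above: the usual mixed-precision recipe stores tensors in FP16 (halving memory and bandwidth, and doubling throughput on hardware with packed FP16 math) while accumulating in FP32 to keep precision. A minimal NumPy sketch of the storage/precision trade-off - purely illustrative, since a real deep-learning benchmark would use a GPU framework:

```python
import numpy as np

# Hypothetical illustration, not AMD's or Nvidia's benchmark code.
rng = np.random.default_rng(0)
a16 = rng.random((256, 256)).astype(np.float16)  # FP16 storage
b16 = rng.random((256, 256)).astype(np.float16)

# FP16 tensors cost half the memory/bandwidth of FP32 ones:
assert a16.nbytes == a16.astype(np.float32).nbytes // 2

# Accumulate in FP32: FP16 only carries ~3 decimal digits of precision,
# so long dot products are summed in the wider type.
c = a16.astype(np.float32) @ b16.astype(np.float32)
```

The hardware angle is that a GPU with packed FP16 ALUs can issue two FP16 operations per FP32 lane per clock, which is where the "double the inferencing performance" claim comes from.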
Only faster than a 1080? That's a bummer.
They're referencing back to their experience at Computex. They didn't get any frame rate numbers for this comparison. I'd be interested to see how the consumer edition fares in these workstation operations.
You realize this card is $1200/$1700 right? It's essentially AMD's Titan XP except they aren't benchmarking it against any of the things Titan XP is used for.
Nope, I didn't. I stand corrected.
The point is that you now have a choice: 1080 Ti or Vega with similar performance, so just buy the cheaper one. I have a 1080 Ti I'd swap for Vega if it's got more VRAM though. I used 10GB in ROTR today. After 12 solid hours of DiRT 4 without a break I strangely also had about 10GB of VRAM used, but I think the unpatched game had a leak. Looks like AMD went with 8GB... urgh, I can already think of 8 games out now that I could overfill that with.
I know this is old news, but here: https://www.thebitbag.com/amd-vega-10-specs-vega-gpu-to-double-the-usable-graphics-memory-capacity-via-high-bandwidth-cache-feature/220835
Here is a comparison between Vega FE and a Quadro P5000 (both priced at $1800) on SPECViewPerf. Keep in mind that the Quadro has been available for almost a year and has a 180W TDP, while the Vega FE's TDP is 375W! https://s30.postimg.org/r33ywsz7l/untitled-1.png