AMD Announces RX 5000 Series Graphics Processors at Computex - Demos RX 5700

MonstroMart:

I don't think PCIe 4.0 will give any advantage (it shouldn't, anyway). But yeah, it will need to be tested first to be 100% sure.
I think for multi-GPU/CrossFire it should be a nice little kick in the pants... ;)
MonstroMart:

I don't think PCIe 4.0 will give any advantage (it shouldn't, anyway). But yeah, it will need to be tested first to be 100% sure.
MBTP:

PS5 says hi! Maybe it has everything to do with streaming capability between the NVMe drive and the GPU... We'll have to wait and see.
Yeah, actually Hilbert needs to use a new rig to test the upcoming NVMe drives using PCIe 4.0. So, no matter what, he will end up with a new rig in his hardware testing cave.
I guess this architecture is going inside the new consoles and doesn't have dedicated raytracing hardware? If so, that means those chips will be able to do decent raytracing even without specialized hardware, since the PS5 has been announced to be capable of it. If that's the case, then either Nvidia will have a serious edge in raytraced games, or their specialized chips will be overkill and not cost-efficient. The next few years will be very interesting in the real-time graphics field. 🙄
I can't wait to see the die mock-up diagrams.
Kaarme:

Yeah, actually Hilbert needs to use a new rig to test the upcoming NVMe drives using PCIe 4.0. So, no matter what, he will end up with a new rig in his hardware testing cave.
Maybe someone will send him a 3800X and a motherboard to test soon, so he can stop using that ancient 5960X.
Ricardo:

I guess this architecture is going inside the new consoles and doesn't have dedicated raytracing hardware? If so, that means those chips will be able to do decent raytracing even without specialized hardware, since PS5 has been announced to be capable of doing it. If that's the case, then either nvidia will have serious edge on raytraced games, or their specialized chips will be overkill and not cost-efficient. Next few years will be very interesting in the real time graphics field. 🙄
My guess is a custom chip, maybe with Gen1 raytracing support, and the next AMD GPUs next year might get Gen2?
CPC_RedDawn:

Didn't you guys see that PCIe 4.0 3DMark benchmark? 69% better performance over a 2080 Ti? That can't be right, just from a bandwidth upgrade? Going from PCIe 2.0 to PCIe 3.0 wasn't much different, definitely not 69%!
The 2080 Ti isn't saturating x16 PCIe 3.0, so at this point PCIe 4.0 means nothing for gaming GPUs; Navi doesn't need it. But this is how new standards enter the market: better to have it before you need it, and the PCIe standard usually stays well ahead of GPU requirements. Dual GPU is one case where the 4.0 standard would help, because mainstream boards split x16 into two x8 links, but that's wasteful territory currently. The biggest news for PCIe 4.0 is probably SSDs. The x4 PCIe 3.0 link used by current M.2 NVMe drives is a limitation that tops out at 3.9 GB/s, whereas x4 PCIe 4.0 would be limited at 7.87 GB/s. Intel will definitely want to be on PCIe 4.0 by the time these new 5 GB/s drives become commonplace. Of course, there are still a ton of operations that happen well within the spec of x4 PCIe 3.0; I am merely focusing on peak throughput.
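As a back-of-envelope check on the peak-throughput figures in that comment (not from the article itself), the per-link numbers follow directly from the signalling rate and the 128b/130b line encoding that PCIe Gen3 and Gen4 both use:

```python
def pcie_throughput_gbs(gen_gts, lanes):
    """Peak one-direction PCIe throughput in GB/s.

    gen_gts: signalling rate per lane in GT/s (8 for Gen3, 16 for Gen4).
    Gen3/Gen4 use 128b/130b line encoding, so every 130 bits on the
    wire carry 128 bits of payload data.
    """
    return gen_gts * (128 / 130) * lanes / 8  # divide by 8: bits -> bytes

# The x4 M.2 NVMe links discussed above:
print(round(pcie_throughput_gbs(8, 4), 2))   # Gen3 x4 -> 3.94 GB/s
print(round(pcie_throughput_gbs(16, 4), 2))  # Gen4 x4 -> 7.88 GB/s
```

This matches the ~3.9 GB/s and ~7.87 GB/s figures quoted above; real drives land somewhat lower once protocol overhead is accounted for.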
Jayp:

2080 Ti isn't saturating 16x PCIE 3.0
It can and does; gaming isn't the only application these cards have to handle.
Pretty much everyone seems to be saying this is a new architecture called RDNA, specifically made with gaming in mind, and that it won't be GCN (it will coexist with GCN, which is geared toward compute). I remember 2-3 weeks ago some guys here saying it would be 200% GCN for sure, like it was impossible it was going to be a new architecture.
From all I can find, RDNA is an evolution of GCN that makes up for the shortcomings of its previous incarnations. We won't know more till the tech sites get hardware briefs during the review phase.
Darn, looks sweet. Now the question remains: will it have HDMI 2.1? ;)
Hmm.... I'm growing more hesitant about getting this. It's nice in a way that it's a new architecture but if it really is that different from GCN then I might want to wait for the next gen.
MonstroMart:

I don't think PCIe 4.0 will give any advantage (it shouldn't, anyway). But yeah, it will need to be tested first to be 100% sure.
Maybe for this gen it won't matter much, but I'm sure after the next gen, with all the necessary tweaks, PCIe 4.0 will be a big deal for GPUs running with x8 lanes.
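The x8-lane point checks out arithmetically: assuming the 128b/130b encoding both generations use, an x8 PCIe 4.0 link carries exactly the same raw bandwidth as a full x16 PCIe 3.0 slot (a rough sketch, ignoring protocol overhead):

```python
def link_gbs(gts_per_lane, lanes):
    # 128b/130b encoding: 130 bits on the wire carry 128 data bits
    return gts_per_lane * (128 / 130) * lanes / 8  # bits -> bytes

# x8 Gen4 (16 GT/s) matches x16 Gen3 (8 GT/s): ~15.75 GB/s each way
assert link_gbs(16, 8) == link_gbs(8, 16)
```

So a board that splits its lanes x8/x8 between two devices would give each of them today's full x16 Gen3 bandwidth.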
schmidtbag:

Hmm... I'm growing more hesitant about getting this. It's nice in a way that it's a new architecture, but if it really is that different from GCN then I might want to wait for the next gen. Maybe for this gen it won't matter much, but I'm sure after the next gen, with all the necessary tweaks, PCIe 4.0 will be a big deal for GPUs running with x8 lanes.
I forgot about the NVMe drives when I wrote that, though. It is sort of a big deal if you're building a new system now. I mean, if Zen 2 is on par with Intel, why would you build a new computer without PCIe 4.0 and the possibility of a boot drive supporting it? Unless you're a fanboy, it wouldn't make any sense to build a new system without PCIe 4.0 if the CPUs are more or less equal.
why no HDMI 2.1 ??? WHY
dfsdfs1112:

why no HDMI 2.1 ??? WHY
Agreed. But at least it is finally a new core; GCN has delivered rather well for a very long time, but the efficiency has been lacking.
Variable rate shading is a feature of the WDDM 2.6 driver model; all AMD has to do is expose feature support to DirectX and have their driver do the talking between the API and the hardware.
Nvidia: mild performance improvements, first-generation RTX, high prices. AMD: mild performance improvements and better efficiency. Personally, I will be waiting until the next generation from Nvidia or AMD. Nothing this generation is all that enticing to me playing at 2K.
Astyanax:

it can and does, gaming isn't the only application these cards have to handle.
First off, I was talking about gaming. Also, there are more powerful GPUs than the 2080 Ti for non-gaming workloads that are not limited by x16 PCIe 3.0. Sure, there could be edge cases. I'll accept you providing me a good source showing real-world cases where a GPU is limited by x16 PCIe 3.0.
Since they are revisiting the 5000-series numbering again, are we also going to get a more modern Batmobile-like shroud too?