Gigabyte Radeon RX 5600 XT Gaming OC review


alanm:

Oh pls.. 1st gen RTX cards are useless for RT at anything above 1080p.
Eh, I disagree. At the midrange they're certainly not great, but newer titles are pushing performance up nicely. In Wolfenstein, for example, the 2080 Ti can do 4K @ 60 without DLSS. It also remains to be seen how consoles having RT support will drive game performance, adoption, and techniques. RTX cards also have mesh shaders, VRS, and a few other features that AMD doesn't support at the moment, so there is definitely value-add with RTX. Some other things: Nvidia's encoder is superior to RDNA's, enough to make an appreciable difference in image quality, and Nvidia currently supports VRR while AMD does not. So outside of sheer performance there is value in going Nvidia. How much are those things worth? Depends on the person.
Undying:

The 2060S is a $400 GPU using DLSS at 50% resolution, and it was not maxed out because the game uses more than 8GB of VRAM. I also think it was 1440p, not 4K. Credit goes to the amazing id Tech engine optimization and great performance in the first place.
US dollar prices (they don't even include tax!) mean nothing to me, and I think to most of this website's community, because I'm pretty sure the big majority here are Europeans. For example, in my country the MSI RX 5700 GAMING X is 530 euro and the MSI RTX 2060 SUPER GAMING X is 550 euro. I would never recommend that any of my friends save a measly 20 euro and buy a GPU that lacks so many features; I would feel terrible later talking to a friend who saw some cool ray tracing gameplay only to learn he can't use that feature on his shiny new AMD GPU.
Glottiz:

Good luck with your AMD GPU, which is basically already outdated, legacy hardware. People buy Nvidia RTX because of ray tracing, DLSS, G-Sync, crazy fast workstation renders thanks to the RT cores, etc. It seems like the only jaded fanboy here is you, and frankly, I sense a bit of buyer's remorse in your post.
Aww, isn't this so adorable. Looks like someone just got released from the Nv Emo Support Clinic. But I still don't understand why they're still charging a co-pay? o_O
Undying:

There is no RTX 2060 price drop; it's all on paper. Good luck finding the EVGA KO version anywhere, and the Nvidia 2060 FE is not in stock. I'm sure Nvidia did it just to try to be competitive in the $300 range.
Shhh, I was trying to offer some hope. Perhaps late fourth quarter? :D
Administrator
READ! The thread from this point onwards remains on topic and decent. Fight, and I WILL hit that ban button. Trust me, I don't care who you are or how long you've been here.
Undying:

The 2060S is a $400 GPU using DLSS at 50% resolution, and it was not maxed out because the game uses more than 8GB of VRAM. I also think it was 1440p, not 4K.
Glottiz:

snip
It was already discussed.
Interesting tidbit from AnandTech:
Now, to be sure, AMD has not changed the reference specifications for the Radeon RX 5600 XT. So despite what's going on, the baseline hasn't changed. Case in point: the Pulse 5600 XT's quiet mode BIOS has the same 135W TGP (~150W TBP) power limit both with the old and new BIOS, as well as the same 1460MHz rated game clock.
...
With all of that said, however, the presence of factory overclocked RX 5600 XT cards and AMD's decision to further overclock them presents a major wildcard. In keeping with AnandTech editorial policy, I'm not going to write any recommendations based on factory overclocked cards. But it is none the less interesting to note how they seemed to be destined to end up on a tier of their own; the memory overclock in particular giving the RX 5600 XT a several percent boost in performance. Factory overclocked cards are of course nothing new, but with our Sapphire card going for $289 – just a $10 premium – the line between factory and reference cards is going to be blurry.
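A rough way to read those power figures: the ~15W gap between the quoted 135W TGP and ~150W TBP is the non-GPU board overhead (memory, VRM losses, fan). Here's a quick sketch of that arithmetic; the overhead value is an assumption inferred only from the quoted numbers, not an official AMD formula.

```python
# Rough TGP -> TBP estimate. The ~15 W overhead (memory, VRM losses, fan)
# is inferred from the quoted 135 W TGP / ~150 W TBP figures and is an
# assumption, not an official AMD specification.

BOARD_OVERHEAD_W = 150 - 135  # delta implied by the quote

def estimate_tbp(tgp_watts: float) -> float:
    """Estimate total board power from the GPU-only power limit."""
    return tgp_watts + BOARD_OVERHEAD_W

for tgp in (135, 150, 160):  # example inputs; only 135 W comes from the quote
    print(f"TGP {tgp} W -> ~{estimate_tbp(tgp)} W TBP")
```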
I dunno; for someone coming from a sub-$250 card this might be a good choice, but anyone with a $300-400 card would probably want to opt for a 5700. It still feels weird that cards in this price range only have 6GB of VRAM.
Let's be real. No surprise here: e-tailers are riding the new RX 5600 XT wagon; it's business they're after. Let the dust settle, and little by little prices will come down a bit. What really "sells me" is the idle quietness and the power consumption. I know what lengths I went to tweaking my RX 480 just to NOT hear it when only browsing. But I'll wait a bit and then decide which version to get, or whether to get something else.
Hilbert Hagedoorn:

it's my inner g33k. I salute you for recognizing that command properly 🙂
Assuming that number in your profile represents the number of times the ball of dust and iron has spun around the other ball of very hot gas since you started observing the light from the second ball, we are close in the aspect of inner g33k1|\|355 ;-) Lovely computers those were, C64, Spectrum, VIC... probably the first and last generation where home g33k5 understood how the computer moved every bit of data. After that it has all become abstraction over abstraction over more abstraction. These days, unless one is a hardcore transistor engineer working for Intel, AMD, nVidia, ARM, Samsung, Apple... designing chips, everyone else is just working with some layer of abstraction. Actually, even those engineers are using software tools to design chips: logic goes in, layers of silicon come out. We are slowly but surely getting to the "Computer, make me a sandwich" level of knowledge needed to use one...
Kool64:

I dunno; for someone coming from a sub-$250 card this might be a good choice, but anyone with a $300-400 card would probably want to opt for a 5700. It still feels weird that cards in this price range only have 6GB of VRAM.
I'm guessing that for 1080p resolution, 6GB would be more than enough. I've been downscaling to 1080p nowadays. I don't know why, but if I play a game that lets you scale down the resolution, like BF5/MW, it looks better at 1440p @ 75% than at plain 1080p. And I still get all the performance I want.
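For what it's worth, the arithmetic backs that up. A quick sketch, assuming the in-game resolution scale applies per axis (it does in some titles; others scale the total pixel count instead):

```python
# Effective render resolution for a per-axis resolution scale
# (assumption: the slider scales width and height, not total pixel count).

def scaled_resolution(width: int, height: int, scale_pct: float):
    """Return the effective render resolution for a per-axis scale."""
    return round(width * scale_pct / 100), round(height * scale_pct / 100)

w, h = scaled_resolution(2560, 1440, 75)
print(f"1440p @ 75% renders at {w}x{h} ({w * h / 1e6:.2f} MP)")   # 1920x1080, 2.07 MP
print(f"Native 1080p is 1920x1080 ({1920 * 1080 / 1e6:.2f} MP)")  # same pixel cost
# Same render cost as 1080p, but the HUD and the final output stay at the
# panel's native 1440p, which can look cleaner than scaling a 1080p image
# up to a 1440p monitor.
```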
Kool64:

I dunno; for someone coming from a sub-$250 card this might be a good choice, but anyone with a $300-400 card would probably want to opt for a 5700. It still feels weird that cards in this price range only have 6GB of VRAM.
I kind of wonder how the 6GB of RAM thing is going to play out with the Xbox One X having 12GB - I feel like a lot of this gen's midrange is going to be wiped out by next-gen games, especially ones that utilize RT.
Games are already taking more than 6GB of VRAM at 1080p. In Deus Ex: Mankind Divided I regularly use over 7GB on my 1070 with all the options turned on.
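For anyone who wants to sanity-check their own numbers, here's a minimal sketch using NVIDIA's NVML Python bindings (assuming the pynvml package is installed). Note that it reports VRAM allocated on the card, which games often over-allocate, so treat it as an upper bound:

```python
# Minimal VRAM readout via NVIDIA's NVML bindings (pip install pynvml).
# This shows memory *allocated* on the GPU, not what a game strictly needs.
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
    name = pynvml.nvmlDeviceGetName(handle)
    if isinstance(name, bytes):  # older pynvml versions return bytes
        name = name.decode()
    mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
    used_gib = mem.used / 1024**3
    total_gib = mem.total / 1024**3
    print(f"{name}: {used_gib:.2f} GiB used of {total_gib:.2f} GiB")
finally:
    pynvml.nvmlShutdown()
```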
Prices are all over the place right now, upwards of $345 for the Strix. Just get a 5700 for the extra couple of bucks.
Decent perf, but too expensive right now. This does nothing to bring down prices of Nvidia cards and help consumers. I get it, AMD wants higher margins, but they won't increase market share with this strategy.
Better to find a cheap 5700 and be done with it. I'm just pissed off, AMD.
Denial:

I kind of wonder how the 6GB of RAM thing is going to play out with the Xbox One X having 12GB - I feel like a lot of this gen's midrange is going to be wiped out by next-gen games, especially ones that utilize RT.
That's also something that's been in the back of my mind. The next-gen consoles are gonna have plenty of CPU cores and VRAM to throw around, so unless there's good optimization, we could see next-gen PC game ports with very high system requirements. Call of Duty 2 immediately comes to mind; it beat the crap out of my 128MB GeForce 6800 at the time.
Denial:

Eh, I disagree. At the midrange they're certainly not great, but newer titles are pushing performance up nicely. In Wolfenstein, for example, the 2080 Ti can do 4K @ 60 without DLSS. It also remains to be seen how consoles having RT support will drive game performance, adoption, and techniques. RTX cards also have mesh shaders, VRS, and a few other features that AMD doesn't support at the moment, so there is definitely value-add with RTX. Some other things: Nvidia's encoder is superior to RDNA's, enough to make an appreciable difference in image quality, and Nvidia currently supports VRR while AMD does not. So outside of sheer performance there is value in going Nvidia. How much are those things worth? Depends on the person.
While I fully applaud Nvidia for introducing RT as a consumer option (though I instantly take that back on pricing), I would be very careful about bashing another company with generic statements. The reader/buyer needs to be informed exactly which RT techniques (e.g. reflection, refraction, shadows) are being used in a game or piece of software, and which are not. They should also be aware of exactly how much is being used in the game - 5% of the scene? It's entirely possible... inevitable... that optimisation stages include "how much can we reduce the RT burden while still being able to state 'RT-ON' on the box". Take that important piece of information out and you leave yourself open to standard replies.
Moderator
Nice card! The black screen bug is worrying for AMD, though. I also read in another review that all cards besides the Sapphire have an old BIOS installed, which requires flashing to get the new speeds, and their card died while flashing. Best to get the Sapphire, or at least wait for updates, it seems.