AMD Announces Radeon RX 5500 series Navi 14 - available next month

Good to see AMD working its way into the mobile market. I wonder where it will position itself performance-wise?
AcidSnow:

I'm stuck on a 1080p projector for at least 2 more years, so although I really want a 5700 XT, I might just wait and see how these cards do... Because the 5700 XT and its little brother the 5700 both get a ridiculous 120FPS+ in most games (except for the new Metro Exodus, that's like ~68fps @1080p) and my R9 290X 4GB is still doing pretty well in games, it's my i7-920 CPU that is starting to slow down my frames!
Cards that can run 1080p at good frame rates, even if just a solid 60fps, are what I expect most people are on. I was always happy playing on a 30'' TV. Then I got a 4K one, and a GPU capable of driving it is costly. The strange thing is that I couldn't go back to 1080p now.
schmidtbag:

We should abandon the performance hunt when we've reached adequate levels of performance, which we most certainly have not (for GPUs, anyway), especially now that real time raytracing is a thing. I've held off on getting a new GPU until I find something that can adequately play games in 4K@60FPS and doesn't cost more than twice the rest of my PC combined.
Why not go with 1440p/144Hz? It's considered the sweet spot at the moment - balance of high resolution and high refresh rate.
Shaxuul:

Is it another "1080p" card, or can it handle 1440p, too?
This^... an RX 580 can already play anything at 1080p@60fps. I paid $60 for an old mining card... damn thing works great!
D3M1G0D:

Why not go with 1440p/144Hz? It's considered the sweet spot at the moment - balance of high resolution and high refresh rate.
I'm not a competitive gamer, and of the competitive games I do play, none of them would give me much of an advantage at a higher refresh rate. My display is currently 42", and I don't intend to step down from that, since it also doubles as my living room TV. So, I prefer detail and immersion over improved latency and smoothness. Personally, I don't have any gripes about the smoothness of 60FPS. I wouldn't complain if I could go higher, but if it cranks up the price of my next display then it isn't something I'm going to pursue.
schmidtbag:

I'm not a competitive gamer, and of the competitive games I do play, none of them would give me much of an advantage at a higher refresh rate. My display is currently 42", and I don't intend to step down from that, since it also doubles as my living room TV. So, I prefer detail and immersion over improved latency and smoothness. Personally, I don't have any gripes about the smoothness of 60FPS. I wouldn't complain if I could go higher, but if it cranks up the price of my next display then it isn't something I'm going to pursue.
That's why I suggested 1440p/144Hz - it's higher refresh rate AND higher resolution 😉. It's far less demanding than 4K (a 5700/XT should be more than adequate) and provides a good mix of quality and quantity. I'm not a competitive gamer either but I play a lot of first-person shooters and a high refresh rate monitor generally provides a better experience.
D3M1G0D:

That's why I suggested 1440p/144Hz - it's higher refresh rate AND higher resolution 😉. It's far less demanding than 4K (a 5700/XT should be more than adequate) and provides a good mix of quality and quantity. I'm not a competitive gamer either but I play a lot of first-person shooters and a high refresh rate monitor generally provides a better experience.
At 42", I don't think 1440p will have sufficient pixel density if I'm [sometimes] sitting within 1m away from the display (if I'm using keyboard+mouse). Besides, getting a 40"+ 1440p@144Hz display is going to be absurdly expensive, assuming such a thing even exists. There are some ultrawide monitors that might be that large but those are woefully unappealing to me. But hypothetically, let's say I went to a more normal display size for a gaming PC, like 27": for me personally, the added cost of a higher refresh rate just isn't worth it. Even though I have the money to get a 7980XE with a pair of 2080Tis, I don't care about gaming enough to burn that kind of money, and I don't play the kinds of games that warrant such hardware. Meanwhile, since I care more about visual quality than refresh rate, I'd have to actually decrease detail levels in order to maintain that high refresh rate in some cases. So really, such a display is going in the opposite direction of what I want. I totally see the appeal to them, but I don't recall ever playing a game at 60FPS thinking "this isn't fast enough" or "wow, this is really choppy". I do, however, recall several occasions getting distracted by how poorly certain details resolve at a distance, or being able to distinguish individual pixels a little too easily. Even if I'm playing an FPS (which I very rarely do; last time I extensively played one was about 3 years ago), I prefer being able to see my target from far away, rather than track it quickly.
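[Editor's note: the pixel-density point above is easy to quantify. Pixels per inch follows directly from a display's resolution and diagonal size; a minimal Python sketch, using the display sizes discussed in the thread (42" and 27", both assumed 16:9):]

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch of a display, from its resolution and diagonal size."""
    diagonal_px = math.hypot(width_px, height_px)  # pixels along the diagonal
    return diagonal_px / diagonal_in

# Pixel density at the sizes mentioned in the thread:
print(round(ppi(2560, 1440, 42)))  # 1440p at 42" -> ~70 PPI
print(round(ppi(3840, 2160, 42)))  # 2160p at 42" -> ~105 PPI
print(round(ppi(2560, 1440, 27)))  # 1440p at 27" -> ~109 PPI
```

At 42", 1440p lands around 70 PPI, noticeably coarser than the same resolution on a 27" monitor, which supports the worry about sitting within 1m of it.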
Gotta say, love the small size. Now, what I find odd is this: if I were to jump from a 570 to this for around $200, where is the upgrade? Same for the 5700, or any Nvidia card. What I paid two years ago buys the same today (cost per fps). I really don't see an upgrade in fps per dollar spent (from either Nvidia or AMD).
schmidtbag:

Meanwhile, since I care more about visual quality than refresh rate, I'd have to actually decrease detail levels in order to maintain that high refresh rate in some cases. So really, such a display is going in the opposite direction of what I want. I totally see the appeal to them, but I don't recall ever playing a game at 60FPS thinking "this isn't fast enough" or "wow, this is really choppy". I do, however, recall several occasions getting distracted by how poorly certain details resolve at a distance. Even if I'm playing an FPS (which I very rarely do; last time I extensively played one was about 3 years ago), I prefer being able to see my target from far away, rather than track it quickly.
Well, you don't have to play at 144 FPS to enjoy such a monitor - I typically aim for around 100 FPS (I like to keep settings high as I like high IQ as well 😉). And as I said, 4K is far more taxing on the GPU than 2.5K (it's more than twice the pixels). I can typically play at max settings at 1440p at well above 60 FPS on my 1080 Ti while I need to decrease settings to the lowest to achieve borderline 60 at 4K. A 2080 Ti can play at higher settings, although probably not ultra and definitely not with ray-tracing. My estimation is that an affordable GPU capable of 4K@60 at max settings will be a long time coming so a 1440p monitor will see plenty of use while providing a nice resolution bump (~78% more pixels). It's a nice stepping stone towards eventual 4K - just thought I'd put it out there.
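[Editor's note: the resolution arithmetic in this exchange can be checked with a couple of multiplications; a minimal Python sketch:]

```python
# Total pixel counts for the resolutions discussed in the thread
RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "2160p": (3840, 2160),  # "4K"
}

pixels = {name: w * h for name, (w, h) in RESOLUTIONS.items()}

# 4K really is more than twice the pixels of 1440p (exactly 2.25x):
print(pixels["2160p"] / pixels["1440p"])  # -> 2.25

# 1440p vs 1080p is roughly a 78% pixel bump:
print(pixels["1440p"] / pixels["1080p"])  # -> ~1.78
```

So the "more than twice the pixels" claim holds exactly (2.25x), and the 1440p-over-1080p bump is about 78%.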
D3M1G0D:

Well, you don't have to play at 144 FPS to enjoy such a monitor - I typically aim for around 100 FPS (I like to keep settings high as I like high IQ as well 😉). And as I said, 4K is far more taxing on the GPU than 2.5K (it's more than twice the pixels). I can typically play at max settings at 1440p at well above 60 FPS on my 1080 Ti while I need to decrease settings to the lowest to achieve borderline 60 at 4K. A 2080 Ti can play at higher settings, although probably not ultra and definitely not with ray-tracing. My estimation is that an affordable GPU capable of 4K@60 at max settings will be a long time coming so a 1440p monitor will see plenty of use while providing a nice resolution bump (~78% more pixels). It's a nice stepping stone towards eventual 4K - just thought I'd put it out there.
True, but if I'm going to spend that much money on a monitor, I'm going to want to take advantage of its capabilities. Even opting for 120Hz can be expensive. So, I'm going to need a beefy GPU (and CPU) anyway to get my money's worth. I'd rather save money with a higher-res (but lower-Hz) display and put the savings toward a better GPU. It is worth pointing out that if I got a 4K display, I wouldn't be using AA. That can really butcher performance for a visual effect that I personally don't even like. As far as I'm aware, a 1080 Ti can play most games in 4K at or close to 60FPS with AA disabled, with most other details maxed out. I'm fine with lowering something like shadows or disabling motion blur (which I also don't like) if that helps increase my framerate for a minimal visual loss. So, although I agree a GPU that can play AAA games at ultra with 4K@60Hz isn't going to come for a long time, I'm not seeking ultra settings. Actually, I hardly ever play AAA titles either, but we don't need to derail this thread into "games I prefer to play and how I prefer to play them". That being said, you and I have very different tastes, which is perfectly fine. Although I'm itching for an upgrade (and itching, because I've got poison ivy on my arms and legs...) I'm willing to wait longer until a product exists that does what I want for a reasonable price.
D3M1G0D:

Why not go with 1440p/144Hz? It's considered the sweet spot at the moment - balance of high resolution and high refresh rate.
But even for 1440p@144 you need a very expensive GPU. I currently have a 1070, and it's definitely not powerful enough for this resolution. Ideally I would need a 2080 Super, but that is too expensive in Canada. The 2070 Super is more affordable, but it's not as bulletproof. I'm considering a 5700 XT because, for a temporary card performing nearly as well as the 2070 Super, the price is not too bad, and it would certainly be better than the 1070. But right now, even at 1440p@144, if you are not willing to spend 2080 Super/Ti money, then it's borderline at best.
schmidtbag:

As far as I'm aware, a 1080Ti can play most games in 4K at or close to 60FPS with AA disabled, with most other details maxed out.
Not really. As I said before, I need to decrease settings to the lowest to get 60 FPS at 4K on my 1080 Ti. When I initially got my 4K monitor I tried playing Fallout 4 with high settings and it was a slideshow (it was when I realized just how brutal 4K really was - what runs perfectly smoothly at 1440p is literally unplayable at 2160p). Doom is probably the only game that I can run comfortably at 4K with high settings. A 2080 Ti might do what you want, but it's going to be a while before such a GPU becomes affordable for the masses (I'm guessing at least two more years).
Vananovion:

Why does AMD put normal fans on lower-end reference cards and blower-style fans on higher end ones? The only reason I can think of is helping their partners earn more on higher-margin cards. Anyway I want to see bigger Navi, not these small ones. What is this, a GPU for ants?
In airflow-restricted PC cases, a blower will do a better job, since the heat is ejected out the back of the computer rather than dumped inside the case; this is more of an issue with higher-power video cards. Low-end cards do not produce as much heat and are intended to be cheaper, hence the simpler heatsink design.
D3M1G0D:

Not really. As I said before, I need to decrease settings to the lowest to get 60 FPS at 4K on my 1080 Ti. When I initially got my 4K monitor I tried playing Fallout 4 with high settings and it was a slideshow (it was when I realized just how brutal 4K really was - what runs perfectly smoothly at 1440p is literally unplayable at 2160p). Doom is probably the only game that I can run comfortably at 4K with high settings. A 2080 Ti might do what you want, but it's going to be a while before such a GPU becomes affordable for the masses (I'm guessing at least two more years).
Umm, something is wrong with your system. I have a 1080 Ti and I do not decrease settings to even medium; the lowest I drop to is high, to average a solid 60fps. Most games I max out minus AA, and I get a solid 60fps at 4K.
D3M1G0D:

Not really. As I said before, I need to decrease settings to the lowest to get 60 FPS at 4K on my 1080 Ti. When I initially got my 4K monitor I tried playing Fallout 4 with high settings and it was a slideshow (it was when I realized just how brutal 4K really was - what runs perfectly smoothly at 1440p is literally unplayable at 2160p). Doom is probably the only game that I can run comfortably at 4K with high settings. A 2080 Ti might do what you want, but it's going to be a while before such a GPU becomes affordable for the masses (I'm guessing at least two more years).
Something sounds seriously wrong there. You shouldn't be suffering that great of a performance loss going to 4K. Even with AA on, you should see better results than that. There's plenty of benchmarks out there of the 1080Ti yielding playable (as in, 30FPS+) frame rates in 4K at max or near max detail, including from Guru3D.
Anybody else notice the 8-pin power connector? Granted, AMD always goes overkill on their reference board power design (the 5700 XT has a 40A rail of which only a fraction is ever used), but it still kind of surprised me https://i.imgur.com/C90aOdt.jpg
schmidtbag:

Something sounds seriously wrong there. You shouldn't be suffering that great of a performance loss going to 4K. Even with AA on, you should see better results than that. There's plenty of benchmarks out there of the 1080Ti yielding playable (as in, 30FPS+) frame rates in 4K at max or near max detail, including from Guru3D.
Well, he didn't state any numbers for his performance, so for all we know the need to go to minimum details was to obtain 120fps or similar.
David3k:

Anybody else notice the 8-pin power connector? Granted, AMD always goes overkill on their reference board power design (the 5700 XT has a 40A rail of which only a fraction is ever used), but it still kind of surprised me https://i.imgur.com/C90aOdt.jpg
My little 1060 has an 8-pin too.
schmidtbag:

Something sounds seriously wrong there. You shouldn't be suffering that great of a performance loss going to 4K. Even with AA on, you should see better results than that. There's plenty of benchmarks out there of the 1080Ti yielding playable (as in, 30FPS+) frame rates in 4K at max or near max detail, including from Guru3D.
AlmondMan:

Well, he didn't state any numbers for his performance, so for all we know the need to go to minimum details was to obtain 120fps or similar.
60. I can accept 50, or maybe a brief dip into the 40s, but anything below that I consider unplayable (even with FreeSync, the experience is generally poor).
Silva:

Could everyone please stop calling these midrange? It's absurd how brainwashed people are these days. RX5700XT should be the 250€ midrange card, not a 3 year old performance card...
Well, if we're going to get 5700-5500-5300, it's the mid-range for Navi anyway. Nvidia's mid-range (the 2070) moved to $500 well before AMD introduced their Navi lineup.
Vananovion:

Why does AMD put normal fans on lower-end reference cards and blower-style fans on higher end ones? The only reason I can think of is helping their partners earn more on higher-margin cards. Anyway I want to see bigger Navi, not these small ones. What is this, a GPU for ants?
It's a 3D representation of what the card might look like if released by AMD themselves, no? I guess custom cards won't be far from this design anyway, with custom radiators and fans for some. I don't think the margin is going to be huge on this one; $150 MSRP seems to be the price to come, and I don't see why they would compare it to a GTX 1650 otherwise.
David3k:

Anybody else notice the 8-pin power connector? Granted, AMD always goes overkill on their reference board power design (5700 XT has a 40A rail that only a fraction of its capacity is used) but it still kind of surprised me https://i.imgur.com/C90aOdt.jpg
Is that really surprising? It's not the lowest card in the Navi lineup. The RX 5300 should be the 75-watt card relying only on PCIe slot power (at least for non-OC variants, if they go that way). Even the GTX 1650 has a power connector on OC cards, because without it they are limited to the frequencies Nvidia planned.
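[Editor's note: the power-budget arithmetic behind this exchange follows from the nominal PCIe limits (75W from the x16 slot, 75W per 6-pin connector, 150W per 8-pin connector). A minimal Python sketch:]

```python
# Nominal PCIe power limits in watts, per the PCI Express specification
SLOT = 75        # x16 slot power
SIX_PIN = 75     # 6-pin auxiliary connector
EIGHT_PIN = 150  # 8-pin auxiliary connector

def board_power_budget(six_pins: int = 0, eight_pins: int = 0) -> int:
    """Maximum nominal board power for a given connector configuration."""
    return SLOT + six_pins * SIX_PIN + eight_pins * EIGHT_PIN

print(board_power_budget())              # 75  -> slot-only card (the hypothetical non-OC RX 5300 case)
print(board_power_budget(eight_pins=1))  # 225 -> one 8-pin, ample headroom for a card in this class
```

A single 8-pin on the RX 5500 therefore gives a 225W ceiling, which is why it reads as overkill for a card aimed at the GTX 1650 segment, but it leaves plenty of overclocking headroom.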