MSI Radeon R9 390X Gaming 8G OC review


Why aren't there any 380X and 370X? Hmmm, looks like AMD is gonna release them later. Let's see what they have up their sleeves 🙂
AMD need to fix their drivers, you got that one spot on. The 8GB of VRAM is good if you're gonna buy two and CrossFire them; I wish my GTX 970s had 8GB of VRAM each. I think it's a bonus that it's got 8GB of VRAM, but the high power usage and heat in CrossFire is a big problem for that card.
Great review. Garbage product.
Two points: 1) Was this card tested with the proper 300 series drivers, or, as with OC3D TV's review of the same card, with the older, non-optimized drivers that came on the driver CD? (That review was just posted on OC3D's YouTube channel.) 2) You bag AMD about drivers using Witcher 3 as an example, really?... Did you not know that, because it is an Nvidia GameWorks title, AMD had zero access to the game code until the game was released? In other words, AMD could not do any driver optimization until the game hit the shelves. Why is this so? Because all game devs that partner with Nvidia GameWorks sign a contract that prohibits them from sharing any code with AMD. In fact, every man and his dog knows the story about Witcher 3 and how Nvidia tried to sabotage its performance on AMD cards... yet you used it as an example? I thought this site provided unbiased GPU reviews? Instead it seems you are trying to continue with the "AMD have **** drivers" BS that all Nvidiot-biased sites are famous for... goodbye Guru3D...
Yes, we started with the preliminary driver to get the review going, and once the new batch from yesterday's public release was out, we updated and migrated the results (which showed tiny / minor differences really). As to your remark, I guess you must be really happy with the driver support from AMD then. For the record, I am an editor; I watch and observe. If AMD fails to release a WHQL driver for over half a year, or is not capable of releasing a day-zero driver alongside AAA game releases, then I find that bothersome. AMD does read these reviews, you know; comments are noted, and in the end these comments might get you better drivers more often compared to what you have gotten in the past 12 months. That's bad how? If you disagree with that, totally fine. Calling me biased is, however, stupid. Why do certain members react like you do these days? So little respect is left, it seems. Anyway, take care.
AMD had zero access to the game code until the game was released?
Nvidia themselves said this was incorrect, as did CD Projekt RED. AMD had many months to get involved; AMD simply don't invest money like Nvidia do (possibly can't afford to). If this was some big conspiracy like you seem to think, how come AMD still don't have working (with AA) CrossFire drivers for The Witcher 3? How long do they need?
So how much power does it use when you overclock it? Nearly as much as two 980s? Way more than a single one, that's for sure. Also, having to fit a triple-slot cooler is a pity, and yet it still runs hotter than most dual-slot coolers on Nvidia cards. All in all, the only decent thing about it is the 8GB of VRAM, but is that worth the price premium over an R9 290X, for example? Good review btw 🙂
This card is looking pretty nice. I came here just to see power consumption, temps, and how loud it would be. At this price it's very competitive, but they need to go a little bit lower on price to really hit the sweet spot. The GTX 980 is overpriced, no way. The 980 Ti, the Fury cards (all of them) and this card are at very good price points. The 970 is just not good enough for 4K.
I'm happy with how this beats the 980 =)
Someone else said it perfectly. I will tweak that statement: Hilbert, as always, your prompt and detailed coverage is amazing and much appreciated. Now, I'll be blunt: this whole 3xx lineup from AMD is an absolute joke. I am stunned that they even dare release them, especially with the financial situation they are in. My advice? Go grab a GTX 970 G1 and overclock the crap out of it. Or shoot, even a used GTX 980 G1/MSI 4G and overclock it. Much cooler, faster and, overall, a better card. Here's hoping the Fury X is at least some competition, as my GTX 980 Ti G1 is running stable at 1522 on the core, on air, at 72°C under load. Nothing to see here folks. *sigh*
The 970 is a 1080p GPU. I like the new 390X; looks like the 980 has a competitor. I agree, lower the price and it may sell better. I think the 980 is a better GPU, so AMD need to fight on price.
Depends on the game. Shadow of Mordor is one example of high VRAM requirements: http://www.guru3d.com/articles_pages/msi_radeon_r9_390x_gaming_8g_oc_review,16.html In future titles we could begin to see 8GB of VRAM as mandatory for 4K.
Alright, the game that "uses" over 2GB of VRAM to play a 10MB .bink file. Sure, 100% legit.
This MSI Gaming 390X is £350 in the UK, while you can get an Asus DCUII 290X for £200. That is a saving of £150, and you're getting pretty much the same card, minus the extra VRAM. Push both to their limit and they will perform pretty much the same. No way is the 390X a good deal.
Nice way to give a two-year-old card a price hike. Good job, AMD. Anyone care to say only Nvidia are greedy now?
It's like AMD took a 290X, slapped another 4GB on it and overclocked it a little bit. Boom. A 290X will perform the same as a 390X 95% of the time. I didn't expect much more tbh, seeing as the 280X was really just a 7970 GHz Edition, etc. The "tweaks" really amount to 1-2 extra fps, not worth £150 at all. Fury X had better deliver, otherwise it's going to be all green this time around.
That's crazy talk. Get out of here with that mess. 😀
He's not being biased towards Nvidia; AMD drivers have been below par for a long time. CrossFire or SLI driver support for games is essential. GTA 5 doesn't have GameWorks, but from day one I had an SLI driver because Nvidia released a driver for the game right on time. AMD are behind Nvidia with drivers; it's a fact, nothing to do with bias.
It also has 4GB of VRAM, in other words not good enough for 4K.
No gaming card on the planet is good enough for 4K, unless one wants to play at some 20-40 fps, which I personally don't. Maybe Pascal will achieve 4K at 60 fps in 2016; we'll see.
Imho this "new" card(s) are just last beat of 28nm chips. They dont invest too much 'cause Next year (maybe) will be a real battle on 16nm tech. So why they should spend money & research on 28nm chips when in 2016-2017 the new 16nm chips will arive??? The new Fury X chip is a prove of their new upcoming tech on 16nm.
Also, anyone saying to CF two of these: you've got to be insane, especially if you are planning on CF support in the first two weeks of any game, or if you play any lesser-known games. We've reached a point where a single GPU (980 Ti and hopefully Fury X) is more than sufficient for most games. SLI and CF can be great, but as a game reviewer I am tired of the spotty support, especially from AMD. And about no card being good enough for 4K... are you serious? My G1 980 Ti can play all these games at 60 FPS on a single GPU at 4K: * Alien: Isolation * Book of Unwritten * Diablo III * Dark Souls II * Grim Dawn * Ori * Path of Exile * Torchlight II * Elite: Dangerous * Remember Me * Guild Wars II * Hard Reset * Hand of Fate * Lost Planet 3 * Just Cause 2 * Deus Ex Human Rev * Sacred 2 * Trine 3 * Batman AC * Tomb Raider * Skyrim * Max Payne 3 * MGS V: Ground Zeroes. Want me to list more? The most demanding games at 4K are still Crysis 3, Witcher 3 and DA: Inquisition.
They haven't really spent money on the 390X and below; they're the same cards as the old gen. And the performance shows.
Indeed, they don't spend too much on these "new" cards. Maybe they will spend more on the new cards on 16nm.