AMD Teases PCs with Radeon Fury X2 Dual FIJI GPUs

Comments from our message forum:
This card is dedicated to VR, and there will be no better solution for VR within the next year. The Oculus Rift and the Vive each use two 1080x1200 displays. Gemini assigns one Fiji processor with 4 GB of HBM to each 1080x1200 display. By placing two GPUs on one PCB, their cooperation under LiquidVR can be synchronized better than in normal CrossFire.
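To make the "one GPU per eye" idea concrete, here is a minimal, self-contained sketch. The GpuQueue and EyeView types are stand-ins invented for illustration, not the actual LiquidVR API:

```cpp
// Conceptual sketch of affinity multi-GPU VR rendering: one GPU per eye.
// GpuQueue/EyeView are hypothetical stand-ins, NOT the LiquidVR API.
#include <thread>
#include <cstdio>

struct EyeView { const char* name; int width; int height; };

struct GpuQueue {
    int id;
    void render(const EyeView& eye) const {
        // Real code would record and submit a GPU command list here.
        std::printf("GPU %d renders %s eye (%dx%d)\n",
                    id, eye.name, eye.width, eye.height);
    }
};

int main() {
    GpuQueue gpu0{0}, gpu1{1};                     // the two Fiji chips on one PCB
    EyeView left{"left", 1080, 1200}, right{"right", 1080, 1200};

    // Both eyes of the SAME simulation frame are rendered in parallel,
    // one GPU each, then joined before compositing. AFR CrossFire instead
    // alternates whole frames, which adds latency and pacing jitter.
    std::thread t0([&]{ gpu0.render(left);  });
    std::thread t1([&]{ gpu1.render(right); });
    t0.join();
    t1.join();
    // ...both eye buffers would be handed to the HMD compositor here.
}
```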
This card will be another AMD PR stunt in the dual-GPU niche. Theoretical on-paper power and actually usable power are two different things. CrossFire support since the Crimson drivers (especially 16.1) is &%$·&, and AMD's previous dual-GPU card, the 295X2, has had ZERO support from AMD. Its official page, www.amd.com/en-us/products/graphics/desktop/r9/295x2, still shows this message: http://i.imgur.com/UX9ZTMz.png
What a silly comment. How can you say that such a powerhouse will be obsolete in 4 months? You do realize it's a dual Fiji GPU? 😀
Yeah, it won't be obsolete, but as you said before, it's late. They would be better off focusing on their new GPUs.
With the uncertain future of multi-GPU in general, I'm not sure about this. CrossFire support isn't improving in DX11 games, there's only 4 GB of effective VRAM, and the price tag will likely follow the 295X2's.
The untimely release of this product, which is still not in stores, gives it a short lifespan in terms of relevance, and that makes it unattractive to anyone spending this much money on a high-end video card with all the new gimmicks. Other than running current games at decently good FPS, you are locked into hardware that is past the technology peak of the present. It's the best of the old generation, one breath away from becoming unattractive to people with money who want the new HDMI, the new GCN, the new "full DX12 support," and so on. If you want to use VR in the months to come and have money to spend to enjoy it for, say, a year and a half before moving on to the next best thing, then yes, this is for you. But if you want to invest in a GPU for "future-proofing," this is the worst card to get. PS: Polaris is a generation that gets a die shrink, a VRAM upgrade, a GCN upgrade, HDMI 2.0a, up-to-date DX12 support, and who knows what else. It may not be the fastest GPU, but it brings more to the table than three generations of AMD GPUs together, so "obsolete" is not off the table.
People who keep saying DX12 will solve the memory sharing need to stop. First of all, DX12 hasn't even taken off yet, and second, we don't even know if game devs will use that feature; it could be a complete flop.
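For what it's worth, the reason adoption is so uncertain is that DX12's explicit multi-adapter puts all the work on the engine. Here is a minimal sketch of just the first step, enumerating each GPU and creating a device for it; these are real DXGI/D3D12 calls, but everything beyond this point, including any memory pooling, would have to be hand-written per engine:

```cpp
// Minimal sketch: adapter enumeration under DX12 explicit multi-adapter.
// Creating one device per GPU is the easy part; splitting work and memory
// across them (the "shared VRAM" people hope for) is entirely up to the
// game engine from here on.
#include <dxgi1_4.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);

        // One independent D3D12 device per physical GPU; each has its own
        // pool of VRAM that the engine must fill and schedule explicitly.
        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device)))) {
            std::wprintf(L"Adapter %u: %ls, %zu MB VRAM\n",
                         i, desc.Description, desc.DedicatedVideoMemory >> 20);
        }
    }
}
```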
True, it can be used for the VR headsets coming out. Heck, even one Fury X is enough for the VR headsets. To me this card can also be used for 4K.
I hate how both AMD and Nvidia are complete *******s and market these cards as having twice the amount of VRAM. While technically correct, it doesn't apply in real life, and the average Joe won't have a clue. Yeah, I'm not holding my breath for that at the moment; SFR doesn't exactly scale well, and DX12 would have to pull some serious magic tricks to change that. It would be amazing if it happened. I mean, look: I had a GTX 560 that was basically killed by its 1 GB of VRAM.
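The arithmetic behind why the doubled number is marketing, in a toy sketch: under AFR each GPU renders its own whole frames, so each one keeps a full mirror of the scene's resources, and capacity doesn't add up:

```cpp
// Toy illustration of why "2 x 4 GB" does not behave like 8 GB under AFR:
// each GPU holds a complete mirror of the scene's resources.
#include <cstdio>

int main() {
    const double per_gpu_vram_gb = 4.0;   // one Fiji with 4 GB HBM
    const int    gpu_count       = 2;

    const double advertised = per_gpu_vram_gb * gpu_count; // what the box says
    const double effective  = per_gpu_vram_gb;             // AFR mirrors everything

    std::printf("Advertised: %.0f GB, effective under AFR: %.0f GB\n",
                advertised, effective);
    // Only an explicit split (SFR, or a DX12 multi-adapter engine that stores
    // different resources on each GPU) could raise the effective number.
}
```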
Yeah, I agree. I learned my lesson with the GeForce 9800 GX2. No more GPU sandwiches for me.
^ Haha GPU sandwiches. I'll remember that.
Not really; 4 GB of HBM can handle 4K resolution very well. Of course, if you feel the need to go above 4K, then yes, 4 GB of HBM probably won't be enough.
You're saying that like HBM makes a difference. HBM or GDDR5, VRAM capacity is VRAM capacity. It's all up to the devs to actually do their jobs and optimize. I mean, look at the beautiful VRAM usage of The Witcher 3; Shadow of Mordor looks like a bloody joke in comparison.
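A rough back-of-the-envelope supporting that point; the G-buffer layout below is hypothetical, but it shows that screen-sized buffers at 4K are small next to 4 GB, so it's textures and streaming, i.e. developer optimization, that decide whether the capacity is enough:

```cpp
// Back-of-the-envelope: raw render-target memory at 3840x2160.
// The target counts below are hypothetical; real engines differ.
#include <cstdio>

int main() {
    const long long w = 3840, h = 2160;
    const long long bytes_per_texel = 4;   // e.g. RGBA8, or a 32-bit depth buffer

    const long long g_buffer_targets = 4;  // albedo, normals, material params...
    const long long extra_targets    = 3;  // depth, HDR buffer, post-FX scratch

    long long total = w * h * bytes_per_texel * (g_buffer_targets + extra_targets);
    std::printf("~%lld MB just for screen-sized buffers\n", total >> 20);
    // Prints ~221 MB: tiny next to 4096 MB, which is why texture assets and
    // streaming, not the resolution itself, determine whether 4 GB suffices.
}
```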
Yeah, I agree. It's way too late, and the way AMD seems to move in slow motion annoys the living crap out of me. This card could have been released half a year ago; let's be honest, you have the GPU, and it's not such a hard job to cram two of them onto one PCB. Yet another missed opportunity. This card isn't even released yet, and it already faces a price cut as soon as Pascal starts rolling. And I have a feeling Pascal will hit the road first. I sure hope I'm wrong; AMD would lose a ton of sales, just like the 390 barely sells compared to how the 970 sold.
I had the GeForce 9800 GX2 myself; still have it, actually, as I never managed to sell it. Then I swapped over to AMD with the dual-GPU 5970 (sold), and I'm currently still on the dual-GPU 7990. I was going to get their latest dual-GPU card, the 295X2, last year but decided not to, as it was still too expensive. To be honest, there is a lot of trouble actually using the two GPUs in some games. I mostly used to play the Battlefield series, which sort of supported them and got faster frame rates, but now I am mostly playing Star Wars Battlefront, and that has absolutely atrocious support for two GPUs; it actually makes the game unusable. I had to roll back the terrible Crimson drivers, which would not allow disabling of CrossFire at all, in order to get Star Wars working. In fact, when installed, Crimson completely broke this game, caused two completely out-of-the-ordinary instant PC shutdowns on a machine that had been stable and unchanged for years, and also completely destroyed the HDMI audio output on a media server machine I have. I really don't know what card to get next. Probably never dual GPU again, and maybe Nvidia again for the first time in years.
Yeah, it's frustrating, because when dual GPU works, it works well. I just find it ridiculous that AMD and Nvidia tout 4K, PhysX, GameWorks, PureHair, etc., which require more GPU power than the top GPUs can deliver. Then there's the issue that if you buy two mid- to lower-end cards to save money for SLI/CrossFire and a new game doesn't support dual GPU, you're simply stuck with a single low-to-mid card struggling to reach a stable FPS.
This is exactly why I didn't pull the trigger on another 970 to SLI with my existing one. It's ridiculous how greedy the gaming industry has gotten: frame rate caps and no multi-GPU support. That's a huge issue if you ask me, although to be fair, the FPS cap trend is more or less over because the respective devs have been shamed out of proportion. I've never had SLI/CrossFire, and if this trend continues, I may never. Or perhaps Vulkan or DX12 will make the task easier, encouraging the industry to support it well.
It appears that, since the new consoles now use the same AMD GPUs, the big development houses using the big game engines are actively developing only for a single GPU, since that is all any console has. AMD, Nvidia, and the motherboard manufacturers will never admit it, though, since they make lots of money selling the whole dual-GPU, SLI, CrossFire idea to PC gamers. The thing I can't understand is why big games like Star Wars, which use the well-known Frostbite engine that worked perfectly well with Battlefield, still can't work right with more than one GPU. I have also heard that a lot of the big game algorithms and modern rendering techniques are simply not compatible with multi-GPU, which is very sad. It appears to me that, now that consoles are number one, multi-GPU is dead as a dodo.
Yeah, it's frustrating, because when dual GPU works, it works well. Then there's the issue that if you buy two mid- to lower-end cards to save money for SLI/CrossFire and a new game doesn't support dual GPU, you're simply stuck with a single low-to-mid card struggling to reach a stable FPS.
^ This, +1. I think more and more of us enthusiasts are moving away from CrossFire and SLI because of the statement you made above. My last SLI setup was a GTX 690, and my last CrossFire setup was 4x 290X Quadfire. When they worked, they delivered some amazing performance, but for a couple of years now I have noticed a trend in both camps, Nvidia and AMD, where support for SLI/CrossFire has been lacking in a lot of titles; I see it more and more often. The Gemini dual Fury X will no doubt be a beast of a card, but I wonder how long that mere 4 GB of HBM memory will last in the years to come, with new AAA games sometimes exceeding 4 GB of video RAM at 4K. I love AMD cards, but Gemini is too late to the party in my opinion. :/