AMD Teases PCs with Radeon Fury X2 Dual FIJI GPUs

While I don't doubt this will be a monster of a card, I can see 4GB being very limiting for these two GPUs.
While I don't doubt this will be a monster of a card, I can see 4GB being very limiting for these two GPUs.
Maybe if they use the DX12-exclusive feature that allows for combined memory, so you get 8GB of HBM, this would make more sense. I still can't wait to see how this performs.
Maybe if they use the DX12-exclusive feature that allows for combined memory, so you get 8GB of HBM, this would make more sense. I still can't wait to see how this performs.
Even though it sounds very promising, it's a massive "what if". Pretty cool card if it's going to be that small 😀
I hate how both AMD and Nvidia are complete *******s and market these cards as having twice the amount of VRAM. While technically correct, it doesn't apply in real life. The average Joe won't have a clue.
Maybe if they use the DX12-exclusive feature that allows for combined memory, so you get 8GB of HBM, this would make more sense. I still can't wait to see how this performs.
Yeah, I'm not holding my breath for that at the moment; SFR doesn't exactly scale well. DX12 would have to pull some serious magic tricks to change that. It would be amazing if it happened. I mean, look, I had a GTX 560 that was basically killed off by its 1GB of VRAM.
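For context, this is roughly what that DX12 path looks like from the application side: a minimal sketch of DirectX 12 explicit multi-adapter, assuming the two Fiji GPUs are exposed as separate, unlinked DXGI adapters (an illustration only; nothing here is confirmed for this card).
```cpp
// Sketch: enumerate both GPUs as independent D3D12 devices. Each device owns its
// own 4 GB HBM pool; the application must split the frame (SFR/AFR) and copy any
// shared resources across adapters itself - memory is not pooled automatically.
// Build note: link against dxgi.lib and d3d12.lib.
#include <dxgi1_4.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
#include <vector>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE) continue;  // skip the WARP software adapter

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device)))) {
            std::printf("GPU %u: %ls, %zu MB dedicated VRAM\n", i, desc.Description,
                        (size_t)(desc.DedicatedVideoMemory / (1024 * 1024)));
            devices.push_back(device);  // one device (and one memory pool) per GPU
        }
    }
    return 0;
}
```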
Too late, AMD. Just focus on Polaris and release it before the competition.
While I don't doubt this will be a monster of a card, I can see 4GB being very limiting for these two GPUs.
Not really; 4GB of HBM can handle 4K resolution very well. Of course, if you feel the need to go above 4K, then yes, 4GB of HBM probably won't be enough.
Not really; 4GB of HBM can handle 4K resolution very well. Of course, if you feel the need to go above 4K, then yes, 4GB of HBM probably won't be enough.
First-generation VR goggles aren't anywhere near 4K resolution to begin with. The 90Hz requirement does matter, but that's about raw GPU muscle rather than memory capacity, so 4GB is enough for it.
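To put that in numbers, a quick back-of-the-envelope comparison of raw pixel throughput (panel specs only; in practice VR render targets are supersampled somewhat above panel resolution to compensate for lens distortion):
```cpp
// Raw pixel throughput: first-generation VR headsets vs. 4K at 60 Hz.
#include <cstdio>

int main() {
    const long long vr_pixels  = 2160LL * 1200 * 90;   // Rift CV1 / Vive: 2x 1080x1200 @ 90 Hz
    const long long uhd_pixels = 3840LL * 2160 * 60;   // 4K UHD @ 60 Hz

    std::printf("First-gen VR: %lld pixels/s\n", vr_pixels);    // ~233 million
    std::printf("4K @ 60 Hz  : %lld pixels/s\n", uhd_pixels);   // ~498 million
    return 0;
}
```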
Am I the only one here who noticed that both GPUs are supposed to be running at the same speed as a single GPU? Considering how AMD shipped the Fury X with liquid cooling and it requires 2x 8-pin PCIe power connectors, I'm finding it really hard to believe this Fury X2 can even function (it also makes me wonder whether the Fury X really needs both of those connectors). I'm surprised they didn't go with two Nano chips, which would be safer to power and easier to cool.
Am I the only one here who noticed that both GPUs are supposed to be running at the same speed as a single GPU? Considering how AMD shipped the Fury X with liquid cooling and it requires 2x 8-pin PCIe power connectors, I'm finding it really hard to believe this Fury X2 can even function (it also makes me wonder whether the Fury X really needs both of those connectors). I'm surprised they didn't go with two Nano chips, which would be safer to power and easier to cool.
4x 8-pin, I guess, plus the 75W or whatever it is from the PCIe 3.x slot itself. (Lots of power, so I'd guess a lot of heat, both from that and from the GPU dies themselves, plus stuff like the VRMs, so it's likely using a water-cooled solution of some sort, maybe a bigger version of the Fury X one?)
Am I the only one here who noticed that both GPUs are supposed to be running at the same speed as a single GPU? Considering how AMD shipped the Fury X with liquid cooling and it requires 2x 8-pin PCIe power connectors, I'm finding it really hard to believe this Fury X2 can even function (it also makes me wonder whether the Fury X really needs both of those connectors). I'm surprised they didn't go with two Nano chips, which would be safer to power and easier to cool.
Decently binned Fiji parts plus better power scaling and it's probably doable. No one expected the Nano, as a full Fiji, to do what it did either, lol. GCN's voltage scaling is pretty intense. Besides, are the final clock speeds confirmed, or are we just assuming it's going to run at full Fury X speed?
4x 8-pin, I guess, plus the 75W or whatever it is from the PCIe 3.x slot itself. (Lots of power, so I'd guess a lot of heat, both from that and from the GPU dies themselves, plus stuff like the VRMs, so it's likely using a water-cooled solution of some sort, maybe a bigger version of the Fury X one?)
Well, it says in the article that photos of it showed only 2x 8-pin. I think with the right binning and scaling (similar to the Nano) you can pull it off. I'm sure Fox has a better understanding of it; he basically explained exactly how the Nano was going to work before it even launched.
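For a rough sense of why binning matters here, a back-of-the-envelope power budget based on the connector specs and AMD's published typical-board-power figures for the single-GPU cards (nothing here is specific to the X2; it is just the arithmetic behind "two Nano-like GPUs fit, two Fury X-clocked GPUs don't"):
```cpp
// Power budget for a dual-Fiji card fed by 2x 8-pin. Connector figures are
// PCIe/PEG spec limits; per-GPU figures are AMD's typical board power for the
// single-GPU Fury X and Nano, used here only as a proxy.
#include <cstdio>

int main() {
    const int slot_w   = 75;                    // PCIe x16 slot (spec)
    const int pin8_w   = 150;                   // each 8-pin PEG connector (spec)
    const int budget_w = slot_w + 2 * pin8_w;   // 375 W available

    const int fury_x_w = 275;                   // Fury X typical board power
    const int nano_w   = 175;                   // Nano typical board power

    std::printf("Budget with 2x 8-pin   : %d W\n", budget_w);
    std::printf("Two Fury X-clocked Fiji: ~%d W (over budget)\n", 2 * fury_x_w);
    std::printf("Two Nano-binned Fiji   : ~%d W (fits)\n", 2 * nano_w);
    return 0;
}
```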
Am I the only one here who noticed that both GPUs are supposed to be running at the same speed as a single GPU? Considering how AMD shipped the Fury X with liquid cooling and it requires 2x 8-pin PCIe power connectors, I'm finding it really hard to believe this Fury X2 can even function (it also makes me wonder whether the Fury X really needs both of those connectors). I'm surprised they didn't go with two Nano chips, which would be safer to power and easier to cool.
I'm thinking the 2x 8-pins are on the Fury X for overhead; better to have more than enough than just enough. If that is the case, then I am also worried that the X2 variant will not have a whole lot of headroom... Who knows, maybe power is regulated differently with the bridge chip onboard?
I'm thinking the 2x 8-pins are on the Fury X for overhead; better to have more than enough than just enough. If that is the case, then I am also worried that the X2 variant will not have a whole lot of headroom... Who knows, maybe power is regulated differently with the bridge chip onboard?
Pretty much what I was thinking. @Denial Yeah, the voltage scaling could have a lot to do with it. Makes it pretty easy to advertise and sell a GPU with high specs even if it will never reach them for extended periods of time.
This is a waste, AMD. Don't even bother with it; just focus on the next generation of cards. However, there will be some people out there who will buy one of these cards.
This is a waste, AMD. Don't even bother with it; just focus on the next generation of cards. However, there will be some people out there who will buy one of these cards.
Ditto. This would have been interesting 6 months ago, but now? Yup, definitely going to spend €800 just to buy a GPU that is going to be obsolete in 3-4 months.
Ditto. This would have been interesting 6 months ago, but now? Yup, definitely going to spend €800 just to buy a GPU that is going to be obsolete in 3-4 months.
That's arguable. I think people have this idea that Polaris/Pascal are going to be significantly faster than their predecessors; they really aren't. The cost of manufacturing essentially forbids them from increasing the transistor count much, and I've heard rumblings that the clock speeds will be similar too, due to leakage. I predict the new Polaris/Pascal top-end parts will only be 20-25% faster than their current models and will most likely be priced similarly, in the $650 range. So if AMD can sell this at $1000, I think it will hold the crown until 2017, when both companies refresh. I think it could be worth it for enthusiasts looking to spend that kind of money anyway, especially for VR.
This is a waste, AMD. Don't even bother with it; just focus on the next generation of cards. However, there will be some people out there who will buy one of these cards.
This card is dedicated to VR, and there will be no better VR solution within the next year. The Oculus Rift and the Vive each have a resolution of 2x 1080x1200. Gemini uses a single Fiji processor with 4GB of HBM for each 1080x1200 display. By placing two GPUs on one PCB, the cooperation of the GPUs under LiquidVR can be synchronized better than in normal CrossFire.
Ditto. This would have been interesting 6 months ago, but now? Yup, definitely going to spend €800 just to buy a GPU that is going to be obsolete in 3-4 months.
What a silly comment. How can you say that such a powerhouse will be obsolete in 4 months? You do realize it's a dual Fiji GPU? 😀
It definitely won't have HDMI 2.0... pass!
Depends on what the price will be and what the performance will be in OpenCL; then I might go with this card. But Polaris is still around the corner (end of the year, I hope), so I don't know; it all depends on performance, in my view. Thanks, Jura