
Both Mantle and DX12 can combine video memory

by Hilbert Hagedoorn on: 02/03/2015 03:33 PM | source: | 48 comment(s)

Robert Hallock (Head of Global Technical Marketing at AMD) shared something interesting on Twitter earlier on. You guys know that when you have a graphics card combo with dual GPUs, the memory is split up per GPU, right?

Thus an 8GB dual-GPU graphics card is really 2x 4GB. That will be different once DirectX 12 comes into play, and apparently it already is with Mantle. Basically, he states that two GPUs will finally act as ‘one big’ GPU. Here's his complete quote:

Mantle is the first graphics API to transcend this behavior and allow that much-needed explicit control. For example, you could do split-frame rendering with each GPU and its respective framebuffer handling 1/2 of the screen. In this way, the GPUs have extremely minimal information, allowing both GPUs to effectively behave as a single large/faster GPU with a correspondingly large pool of memory.

Ultimately the point is that gamers believe that two 4GB cards can’t possibly give you the 8GB of useful memory. That may have been true for the last 25 years of PC gaming, but that's not true with Mantle and it's not true with the low-overhead APIs that follow in Mantle’s footsteps. – @Thracks (Robert Hallock, AMD)

There is a catch, though: this is not done automatically. The new APIs allow memory stacking, but game developers will need to specifically optimize their games for it. An interesting statement.
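To give a rough idea of what that developer-side control looks like, here is a minimal, hypothetical sketch of how DirectX 12 lets a developer place a resource heap in the memory of one specific GPU of a linked (CrossFire/SLI-style) adapter via node masks, rather than mirroring it on both. The helper name and sizes below are made up for illustration, and Mantle's equivalent calls are different.

// Minimal sketch (not AMD's implementation): with an explicit API such as
// Direct3D 12, the developer decides which GPU node of a linked adapter
// physically owns a heap, and which nodes may access it.
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// Hypothetical helper: create a default heap owned by 'creationNode' and
// visible to the GPUs listed in the 'visibleNodes' bitmask.
// Error handling is omitted for brevity.
ComPtr<ID3D12Heap> CreateHeapOnNode(ID3D12Device* device,
                                    UINT creationNode,
                                    UINT visibleNodes,
                                    UINT64 sizeInBytes)
{
    D3D12_HEAP_DESC desc = {};
    desc.SizeInBytes                 = sizeInBytes;
    desc.Alignment                   = D3D12_DEFAULT_RESOURCE_PLACEMENT_ALIGNMENT;
    desc.Properties.Type             = D3D12_HEAP_TYPE_DEFAULT;
    desc.Properties.CreationNodeMask = 1u << creationNode; // GPU that owns the memory
    desc.Properties.VisibleNodeMask  = visibleNodes;       // GPUs allowed to access it
    desc.Flags                       = D3D12_HEAP_FLAG_NONE;

    ComPtr<ID3D12Heap> heap;
    device->CreateHeap(&desc, IID_PPV_ARGS(&heap));
    return heap;
}

// Instead of duplicating 4GB of assets on each GPU, a game could keep
// different halves of its data on each node (sizes are illustrative):
//   auto heapGpu0 = CreateHeapOnNode(device, 0, 0x1, 2ull << 30); // 2GB on GPU 0
//   auto heapGpu1 = CreateHeapOnNode(device, 1, 0x2, 2ull << 30); // 2GB on GPU 1

Again, whether the combined memory is actually useful depends entirely on the game deciding what to put where.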

Keep in mind, though, that this is a marketing rep talking ...





jimdove
Senior Member



Posts: 345
Joined: 2010-12-04

#5006740 Posted on: 02/03/2015 04:15 PM
I still will not go multi-GPU even if this is true; the issues with games not supporting SLI, or doing it poorly, are too great imo. I am biased, as I had a nightmare nine months with a 5970 back in '09: poor ATI drivers + terrible CrossFire support in the games I was playing = RMA and lots of rage :D I cannot go through that again (first world problems)

Lane
Senior Member



Posts: 6361
Joined: 2005-02-25

#5006741 Posted on: 02/03/2015 04:18 PM
The latest Civilization: Beyond Earth already uses SFR with Mantle:

http://www.anandtech.com/show/8643/civilization-beyond-earth-crossfire-with-mantle-sfr-not-actually-broken

Thankfully, AFR is not the only approach to multi-GPU. Mantle empowers game developers with full control of a multi-GPU array and the ability to create or implement unique MGPU solutions that fit the needs of the game engine. In Civilization: Beyond Earth, Firaxis designed a "split-frame rendering" (SFR) subsystem. SFR divides each frame of a scene into proportional sections, and assigns a rendering slice to each GPU in AMD CrossFire configuration. The "master" GPU quickly receives the work of each GPU and composites the final scene for the user to see on his or her monitor.


Of course, in the case of SFR (split-frame rendering), developers need to put in more time and work with it, compared to fixing a few problems with AFR (texture/light flashing, scaling).
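To make that concrete, here is a tiny, API-agnostic sketch of what that slicing amounts to; the names are illustrative and not taken from the Firaxis code. Each GPU renders its own proportional slice of the frame into its own local memory, and the "master" GPU composites the slices.

// Illustrative sketch of split-frame rendering (SFR) work division.
#include <cstdint>
#include <vector>

struct Slice { uint32_t firstRow, lastRow; }; // half-open row range [firstRow, lastRow)

// Divide a frame of 'frameHeight' rows into one proportional slice per GPU.
std::vector<Slice> SplitFrame(uint32_t frameHeight, uint32_t gpuCount)
{
    std::vector<Slice> slices;
    for (uint32_t gpu = 0; gpu < gpuCount; ++gpu) {
        slices.push_back({ frameHeight * gpu / gpuCount,
                           frameHeight * (gpu + 1) / gpuCount });
    }
    return slices;
}

// Usage idea for a 2160-row frame and two GPUs:
//   auto slices = SplitFrame(2160, 2); // GPU 0 -> rows [0,1080), GPU 1 -> rows [1080,2160)
// Each GPU only needs the resources for its own slice, which is why the
// combined memory behaves more like one large pool than two mirrored copies.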

DeskStar
Senior Member



Posts: 1307
Joined: 2011-01-11

#5006742 Posted on: 02/03/2015 04:19 PM
Yeah, I hear you... It does sound a bit fishy, as neither GFX card company has truly addressed stacked/unified RAM architecture or how they plan on introducing it to the public. All in due time, I suppose.

Makes me feel like my kids on Christmas, or something........waiting ever so patiently to find out what the companies are going to give us.

vbetts
Moderator



Posts: 15139
Joined: 2006-07-04

#5006759 Posted on: 02/03/2015 04:42 PM
Even if I added a second 970 to my rig, I would still be under the power restraints of the Mac Pro and the silly power connectors that come with it. I would add a second one, no problem, if developers did use this API.

I would have more than 4GB of memory then! (Joke)

A M D BugBear
Senior Member



Posts: 3916
Joined: 2005-09-27

#5006764 Posted on: 02/03/2015 04:49 PM
Quad TITANs standing by to stack some memory...! Just hope they don't pull one over on us about how this is just for the newest cards, with no backward compatibility for us guys with the older hardware.

Please don't let this be another marketing rep blowing smoke, attempting to rattle the cages of us hardware freaks out there.

As soon as games hit 6-8GB of VRAM, your quad Titans, even if they scaled at 100% full speed, all four of them will be on their knees. 8GB of VRAM??? Forget it.

Maybe 6GB of VRAM with the right settings, etc., but a full 8GB of VRAM? Even with four GTX 980s OC'ed and 100% scaling, they will be on their knees.

5-6GB sounds good; over that, our cards ain't fast enough yet to do something like that and maintain very high FPS at the same time.

LOL, and LG is talking about 8K resolution this year? PLEASE!!

Games won't move, period, at that resolution. As of now, no. OLD games, perhaps. Today's games? FORGET IT...

If people own 4K, try maxing out with DSR; it should give you a hint of how fast it will move, lol.

5GB sounds perfect.

Shoot, playing Far Cry 4 at slightly over 2560x1600 (DSR), with maxed-out settings plus NVIDIA Inspector 2x MSAA + 2x SGSSAA, the sucker is moving at like 17+ to 20+ FPS. Seems to be moving alright, looks great, lol.

I can't imagine running Far Cry 4 at 8K with everything maxed out at 60 FPS. Holy good god moly, that's NEVER going to happen for a LONG, LONG TIME.

As for Mantle + DX12:

All I've got to say is this: it's about time. How many years later, and they still hadn't found a way to stack the VRAM? I knew it was just a matter of time. Good god, about time. I wonder when this will be coming out in full force? I don't expect it anytime soon.

Matter of time before NVIDIA comes up with theirs, lol.




