Middle Earth: Shadow of Mordor GeForce GTX 970 VRAM stress test

https://forums.guru3d.com/data/avatars/m/260/260317.jpg
I have a GTX 970 iChill AirBoss Ultra in SLI, and at 4K in Dying Light I get full GPU usage at 60 fps with everything maxed. I'm very happy with my 970 SLI. https://www.youtube.com/watch?v=ziJRYsBZwQ0 — video showing 4 GB of VRAM in use at 4K.
https://forums.guru3d.com/data/avatars/m/105/105985.jpg
OK, OK, so Nvidia is a lying ho-bag, but besides that it's all pink inside. So why are so many people really upset (don't get me wrong, I would gripe too)? They still have a really good card. Anyway, it seems a lot of misinformed people care too much about RAM capacity these days... they all forget the big picture. Great testing, boss!
https://forums.guru3d.com/data/avatars/m/255/255677.jpg
I'm really surprised by how they're handling this. First they hid the true specs and the memory configuration, hoping no one would ever notice. Then some people noticed and proved Nvidia wrong. Then Nvidia admitted it. And then SILENCE. Their forums are suspiciously quiet. Only yesterday ManuelG and some other guy came to the forums claiming we can return our cards (country doesn't matter), but this will be hard because we are talking about independent third-party companies like Asus, Gigabyte, MSI etc., and those are not involved in the matter and it's not their mistake. Nvidia answered me on Twitter, and those guys in their forums basically say we can return the card, but if the store or the company doesn't want to refund us we can contact Nvidia and they will contact them to get the matter solved. So this is not a real solution, and here we see once again how they handle the matter: pretending it's not a big deal, and let's forget it, everyone will forget it soon. They also claimed they will try to compensate for the performance and the 0.5 GB partition in an upcoming driver, which proves something is not quite right about the performance anyway. They have yet to release an official statement as a whole company to apologise to GTX 970 buyers and offer them a full-scale refund/upgrade/discount program or something like that at least. Instead they are sending some individuals from their company into their forums who will try to help, as if they do it from their own good will and not as a company as a whole. Again I will say it's not that the card is bad or that you can't play games with it, but they committed fraud by covering up the real specs and memory configuration. Also, from their own words I gather that there is a real performance drop in some situations at least, and that's why they are trying to improve it in future drivers. Nvidia got me really upset. That's not a professional way to handle your customers. The customers who bought an expensive card with fake specs.
https://forums.guru3d.com/data/avatars/m/224/224067.jpg
^^ Use paragraphs, please! 😉
https://forums.guru3d.com/data/avatars/m/101/101440.jpg
Any OpenGL test?
https://forums.guru3d.com/data/avatars/m/16/16662.jpg
Administrator
Any OpenGL test?
OpenGL is irrelevant to DX-class gaming. Obviously NV is virtualizing the memory space for DirectX, then partitioning it to prioritize the 3.5 GB and deliver as little data as possible to that 512 MB. That doesn't happen in OpenGL, and as such there will be a performance drop in OpenGL. But that's stating and measuring the obvious.
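A toy model of the prioritization described above: an allocator that fills the fast segment first and only spills into the slow one when forced. The allocator itself is hypothetical, and the bandwidth figures are rough public numbers for the 970's segments, purely for illustration:

```python
# Toy model of a partitioned VRAM pool (3.5 GB fast + 0.5 GB slow)
# where the driver places data in the fast segment first.
# All numbers are illustrative, not measured.

FAST_CAP_GB, SLOW_CAP_GB = 3.5, 0.5
FAST_BW, SLOW_BW = 196.0, 28.0   # GB/s, approximate public figures

def place(request_gb, fast_used=0.0, slow_used=0.0):
    """Return (gb_in_fast, gb_in_slow) for a new allocation."""
    in_fast = min(request_gb, FAST_CAP_GB - fast_used)
    in_slow = request_gb - in_fast
    if slow_used + in_slow > SLOW_CAP_GB:
        raise MemoryError("out of VRAM")
    return in_fast, in_slow

def effective_bandwidth(fast_gb, slow_gb):
    """Bandwidth-weighted average if accesses are spread evenly."""
    total = fast_gb + slow_gb
    if total == 0:
        return FAST_BW
    return (fast_gb * FAST_BW + slow_gb * SLOW_BW) / total

# A 3.2 GB working set stays entirely in the fast segment:
f1, s1 = place(3.2)
# A 3.9 GB working set spills 0.4 GB into the slow segment:
f2, s2 = place(3.9)
```

Under this simple model the average bandwidth of a 3.9 GB working set only drops to roughly 179 GB/s, because the slow segment holds just a tenth of the data; which matches the observation that the measured hit stays small as long as the driver keeps the hot data in the fast partition.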
https://forums.guru3d.com/data/avatars/m/248/248203.jpg
We do slowly wonder, though, why certain US press is always so prioritized and cherry-picked… Nvidia?
:banana: Nice job, Hilbert. I wonder why some folks demand testing past the card's "marketed" capabilities; it's just damned silly to debate which of 11-15 fps is the better experience. Maybe they just want to see a product/card fail. Though, is there a possibility of looking at an SLI setup? IIRC there was some dropped-frame ugliness going on... just asking.
https://forums.guru3d.com/data/avatars/m/244/244590.jpg
So it's a scam! Repent, Nvidia, REPENT!!
https://forums.guru3d.com/data/avatars/m/80/80129.jpg
I'm really surprised by how they're handling this. First they hid the true specs and the memory configuration, hoping no one would ever notice. Then some people noticed and proved Nvidia wrong. Then Nvidia admitted it. And then SILENCE. Their forums are suspiciously quiet. Only yesterday ManuelG and some other guy came to the forums claiming we can return our cards (country doesn't matter), but this will be hard because we are talking about independent third-party companies like Asus, Gigabyte, MSI etc., and those are not involved in the matter and it's not their mistake. Nvidia answered me on Twitter, and those guys in their forums basically say we can return the card, but if the store or the company doesn't want to refund us we can contact Nvidia and they will contact them to get the matter solved. So this is not a real solution, and here we see once again how they handle the matter: pretending it's not a big deal, and let's forget it, everyone will forget it soon. They also claimed they will try to compensate for the performance and the 0.5 GB partition in an upcoming driver, which proves something is not quite right about the performance anyway. They have yet to release an official statement as a whole company to apologise to GTX 970 buyers and offer them a full-scale refund/upgrade/discount program or something like that at least. Instead they are sending some individuals from their company into their forums who will try to help, as if they do it from their own good will and not as a company as a whole. Again I will say it's not that the card is bad or that you can't play games with it, but they committed fraud by covering up the real specs and memory configuration. Also, from their own words I gather that there is a real performance drop in some situations at least, and that's why they are trying to improve it in future drivers. Nvidia got me really upset. That's not a professional way to handle your customers. The customers who bought an expensive card with fake specs.
I doubt they intentionally hid the specs of the card, considering their own tools report the correct ones. I'm also pretty sure they disabled ROPs for binning purposes, rather than to artificially decrease performance. I mean, we see what, a 4% performance drop? If the 970 performed 4% faster, would Nvidia really lose that many 980 sales? I doubt it.
https://forums.guru3d.com/data/avatars/m/258/258664.jpg
I'm still not sure you can really feel the performance decrease with full VRAM on the 970. Somehow I think you won't lose that many fps, and then again, you won't be sitting at exactly 30 or 60 fps and dipping below right when it hits you, so that you feel the lag/tearing. But, then again, I hope people find a solution for themselves. If the cards really do go back to the retailers/Nvidia, we'll see a 990 with two (cut) GM204 chips soon, or other buyers will look for them on the aftermarket and get those tri-SLI 970s running 🙄
https://forums.guru3d.com/data/avatars/m/80/80129.jpg
I'm still not sure you can really feel the performance decrease with full VRAM on the 970. Somehow I think you won't lose that many fps, and then again, you won't be sitting at exactly 30 or 60 fps and dipping below right when it hits you, so that you feel the lag/tearing.
People know about it now, and the placebo effect will kick in hardcore. It's the same in all enthusiast communities. Nvidia could release the same driver twice with a different version number and there would be multiple posts about how FPS has decreased or increased, or how some random bug in a random game was fixed/created. The best example I've ever seen was in Android. I think it was on the Galaxy Nexus, when a popular kernel dev was the first to implement a voltage-control feature on the GNex. Literally everyone and their mother raved about how it gave them better battery life. People were claiming it gave them an extra hour of screen-on time and whatnot. Then the dev came out and said that the implementation was actually broken, and the voltage control didn't do anything. Even after that, people still claimed it did something. So yeah, now that people know there is something weird going on, all kinds of performance issues will be attributed to it.
https://forums.guru3d.com/data/avatars/m/243/243702.jpg
HH, for the sake of consistency this kind of testing is very bad. Please pick settings everyone can agree the game runs very well at (let's say around 55-60 fps). Measure maximum VRAM utilization during testing. Close the tested game. Then preallocate enough VRAM with another game/application so that in the next tests the game hits 3.9-3.95 GB of VRAM utilization. This is the only way to realistically see the real impact of this slow part of VRAM. Increasing details/effects will certainly decrease the frame rate, and it's pretty hard to decide exactly where the performance decrease came from.
https://forums.guru3d.com/data/avatars/m/16/16662.jpg
Administrator
Measure maximum VRAM utilization during testing. Close the tested game. Then preallocate enough VRAM with another game/application so that in the next tests the game hits 3.9-3.95 GB of VRAM utilization. This is the only way to realistically see the real impact of this slow part of VRAM.
That is the most preposterous methodology I have ever heard of. We test in real-world scenarios; we test games in the manner you'd use them at home. We're not forcibly pushing a graphics card into problems where you know in advance you will get them. Obviously, using your methodology, ANY graphics card will run into massive issues.
https://forums.guru3d.com/data/avatars/m/220/220755.jpg
Great investigative work from Hilbert; I really enjoyed reading the facts about the latest Nvidia adventure. This community is very demanding and will always push cards to their limits; I suppose they didn't count on that.
https://forums.guru3d.com/data/avatars/m/243/243702.jpg
That is the most preposterous methodology I have ever heard of. We test in real-world scenarios; we test games in the manner you'd use them at home. We're not forcibly pushing a graphics card into problems where you know in advance you will get them. Obviously, using your methodology, ANY graphics card will run into massive issues.
It will not run into issues. The GTX 780 Ti has 3 GB of VRAM; say you have good performance while a game utilizes 1.5 GB of VRAM. Then you close it and preallocate another 1.5 GB of VRAM. Starting the game again will give exactly the same result as before, since the game will still use only 1.5 GB of VRAM, just in another memory region. Real-world scenario: I play Unreal Tournament 3 at maximum details in windowed mode, Alt+Tab away from it and forget it is running. The off-screen game puts no load on the GPU/CPU as it sleeps (well-done engine), but its VRAM does not get unloaded. Then I happen to start another windowed game with no performance impact and enjoy it fully, because I still have enough VRAM and the data is still accessed at peak rate. I did this with 6 games at once, each using from 80 to 500 MB of VRAM, and there was no impact. Or is AMD somehow special and therefore friendly to multiple applications keeping data in VRAM?
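The preallocation argument above can be sketched with simple arithmetic. This is a simulation of the proposed test, not actual VRAM allocation; the fill-fast-first placement policy and the no-relocation assumption are assumptions, not confirmed driver behaviour:

```python
# Simulate the preallocate-then-test idea: on a card with one
# uniform pool (e.g. GTX 780 Ti, 3 GB) a second working set fits
# without penalty; on a 3.5 GB + 0.5 GB split card it can be
# forced into the slow segment. Sizes/policy are assumptions.

def spill_into_slow(prealloc_gb, game_gb, fast_cap_gb, slow_cap_gb):
    """GB of the game's data that end up in the slow segment,
    assuming the driver fills the fast segment first and never
    relocates the preallocated block."""
    if prealloc_gb + game_gb > fast_cap_gb + slow_cap_gb:
        raise MemoryError("out of VRAM")
    fast_left = max(fast_cap_gb - prealloc_gb, 0.0)
    return max(game_gb - fast_left, 0.0)

# 780 Ti style card (uniform pool, slow segment size 0):
# preallocating 1.5 GB still leaves a 1.5 GB game entirely "fast".
uniform = spill_into_slow(1.5, 1.5, 3.0, 0.0)

# 970-style split: preallocating 3.4 GB forces most of a 0.5 GB
# game working set into the slow segment.
split = spill_into_slow(3.4, 0.5, 3.5, 0.5)
```

This is exactly why the poster expects the 780 Ti result to be unchanged while the 970 would show a measurable difference, if (and only if) the driver really leaves the preallocated block where it is.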
https://forums.guru3d.com/data/avatars/m/80/80129.jpg
It will not run into issues. The GTX 780 Ti has 3 GB of VRAM; say you have good performance while a game utilizes 1.5 GB of VRAM. Then you close it and preallocate another 1.5 GB of VRAM. Starting the game again will give exactly the same result as before, since the game will still use only 1.5 GB of VRAM, just in another memory region. Real-world scenario: I play Unreal Tournament 3 at maximum details in windowed mode, Alt+Tab away from it and forget it is running. The off-screen game puts no load on the GPU/CPU as it sleeps (well-done engine), but its VRAM does not get unloaded. Then I happen to start another windowed game with no performance impact and enjoy it fully, because I still have enough VRAM and the data is still accessed at peak rate. I did this with 6 games at once, each using from 80 to 500 MB of VRAM, and there was no impact. Or is AMD somehow special and therefore friendly to multiple applications keeping data in VRAM?
That is not a real-world scenario. Who the hell is running multiple games in the background and honestly caring about performance? That's a scenario for, like, the three people who multi-box WoW accounts or some crap. No one does that.
data/avatar/default/avatar04.webp
HH, for the sake of consistency this kind of testing is very bad. Please pick settings everyone can agree the game runs very well at (let's say around 55-60 fps). Measure maximum VRAM utilization during testing. Close the tested game. Then preallocate enough VRAM with another game/application so that in the next tests the game hits 3.9-3.95 GB of VRAM utilization. This is the only way to realistically see the real impact of this slow part of VRAM. Increasing details/effects will certainly decrease the frame rate, and it's pretty hard to decide exactly where the performance decrease came from.
How in the world is that supposed to show the 3.5 GB threshold for performance loss? It has already been discussed that the driver moves data around, so I can only guess that the preallocated data would be moved to the slower section when running the second test. If there was any significant stuttering involved, then FCAT would have shown it by now. Given the ridiculous settings required to even reach 4 GB, this shouldn't be a problem for any 970 owner.
https://forums.guru3d.com/data/avatars/m/243/243702.jpg
How in the world is that supposed to show the 3.5 GB threshold for performance loss? It has already been discussed that the driver moves data around, so I can only guess that the preallocated data would be moved to the slower section when running the second test. If there was any significant stuttering involved, then FCAT would have shown it by now. Given the ridiculous settings required to even reach 4 GB, this shouldn't be a problem for any 970 owner.
If you had read that very long thread about the 970, you would know it has been done. And that user replicated it several times with exactly the same results: current drivers are not cycling data around based on use. Because if you have already filled the 3.5 GB part and then fill the 0.5 GB part, you would have to unload part of the 3.5 GB, move the slow 0.5 GB data block there, and reload the unloaded data from system memory into the slow part. And that, my dear, would cause a severe momentary freeze.
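A back-of-the-envelope version of that freeze argument: evicting a block from the fast segment means an on-card copy plus a reload of the evicted data over PCIe. Both bandwidth figures here are rough, assumed numbers (roughly PCIe 3.0 x16 for the reload, an assumed on-card copy rate for the shuffle), purely to get an order of magnitude:

```python
# Rough cost of shuffling 0.5 GB between VRAM partitions and
# reloading the evicted data from system RAM over PCIe, as the
# post describes. All bandwidths are approximate assumptions.

PCIE_BW_GBPS = 16.0     # ~PCIe 3.0 x16, one direction
VRAM_COPY_GBPS = 100.0  # assumed on-card copy rate

def freeze_ms(shuffled_gb):
    """Milliseconds stalled if shuffled_gb is copied within VRAM
    and the same amount is re-uploaded over PCIe."""
    copy = shuffled_gb / VRAM_COPY_GBPS
    reload = shuffled_gb / PCIE_BW_GBPS
    return (copy + reload) * 1000.0

# Swapping the full 0.5 GB slow partition in one go:
stall = freeze_ms(0.5)
```

Under these assumptions the stall for a full 0.5 GB swap lands around 36 ms, i.e. more than two whole frames at 60 fps, which is exactly the kind of momentary freeze the post says FCAT would catch if the driver were really shuffling data like that.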
data/avatar/default/avatar15.webp
If you had read that very long thread about the 970, you would know it has been done. And that user replicated it several times with exactly the same results: current drivers are not cycling data around based on use. Because if you have already filled the 3.5 GB part and then fill the 0.5 GB part, you would have to unload part of the 3.5 GB, move the slow 0.5 GB data block there, and reload the unloaded data from system memory into the slow part. And that, my dear, would cause a severe momentary freeze.
Wouldn't that be something that would happen during the game's loading screen? I don't know, though; I'm just speculating. Can you think of a reason why it wouldn't show up in FCAT?
https://forums.guru3d.com/data/avatars/m/227/227853.jpg
Wouldn't that be something that would happen during the game's loading screen? I don't know, though; I'm just speculating. Can you think of a reason why it wouldn't show up in FCAT?
I second this; shouldn't it happen while loading the game itself, i.e. when switching to it?