Star Wars: Battlefront Beta PC graphics performance review

As if that makes any difference, lmao. A 660 runs games 50 to 100% faster than the PS4's GPU (which is severely power-constrained, obviously), and the 960 is up to 50% faster than the 660. So these are still poor numbers for a game running post-process anti-aliasing intended for the Xbox 360 and PS3. Now get a ****ing clue.
How does a 1.9 TFLOPS GPU run games 50 to 100% faster than the PS4's GPU, which has 1.83 TFLOPS? How? Really... you would need around a 770-level GPU to run games at PS4 level. A 660 won't get up to speed against the damn Xbox One either, even if it has more compute power on paper, since your PC has serious bottlenecks compared to closer-to-the-metal consoles. Now, a 960 can and will perform better than the PS4 and X1 in multiplats.
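For reference, the theoretical figures being argued over fall straight out of core counts and clocks: FP32 throughput is two FLOPs (one fused multiply-add) per shader per clock. A quick back-of-the-envelope check using the commonly quoted reference specs; actual boost clocks vary by card:

```python
# Back-of-the-envelope FP32 throughput: 2 FLOPs (one FMA) per shader per clock.
def tflops(shaders: int, clock_mhz: float) -> float:
    return 2 * shaders * clock_mhz * 1e6 / 1e12

for name, shaders, clock in [
    ("GTX 660 (960 cores @ ~1033 MHz boost)", 960, 1033),
    ("PS4 GPU (1152 ALUs @ 800 MHz)", 1152, 800),
    ("GTX 960 (1024 cores @ ~1178 MHz boost)", 1024, 1178),
]:
    print(f"{name}: {tflops(shaders, clock):.2f} TFLOPS")
# -> roughly 1.98, 1.84 and 2.41 TFLOPS respectively
```

So on paper the 660 and the PS4's GPU are nearly even; any real-world gap comes from clocks, drivers and the rest of the system, not raw FLOPS.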
Administrator
Gents ... language!
Are Eyefinity resolutions supported in the beta? 5760x1200, etc.?
I like what DICE did here in terms of optimization. Judging not only by the results but by the recommended hardware too, this is apparently a game that loads a ton of stuff quickly but doesn't go higher than a 4 GB VRAM buffer. It probably allocates memory dynamically according to what's available and makes well-balanced use of the CPU, RAM, SSD and GPU. Cards with less than 4 GB of memory will possibly suffer some hiccups at 4K ultra settings, but that is to be expected. I think Frostbite is a really good engine that uses both architectures (AMD and NVIDIA) well and does not gimp the performance of one or the other. These results are what I expect to see from a good engine that takes into account people who don't have high-end systems but want to have a great experience too.
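A minimal sketch of the kind of budget-driven streaming described above, assuming a greedy fill-to-a-cap heuristic; all names, sizes and thresholds here are illustrative, not Frostbite internals:

```python
# Hypothetical budget-driven texture streaming: fill whatever VRAM is
# available up to a hard cap, and drop to lower mips once it runs out.
VRAM_CAP_MB = 4096  # the ceiling observed in the article's VRAM graphs

def plan_streaming(available_vram_mb: int, textures: list[tuple[str, int]]):
    """Greedily keep full-resolution textures until the budget runs out,
    then fall back to half resolution (roughly a quarter of the size)."""
    budget = min(available_vram_mb, VRAM_CAP_MB)
    plan, used = {}, 0
    for name, size_mb in sorted(textures, key=lambda t: -t[1]):
        if used + size_mb <= budget:
            plan[name], used = "full", used + size_mb
        else:
            plan[name], used = "half", used + size_mb // 4
    return plan, used

# A 2 GB card keeps the big assets sharp and degrades the rest -- which
# would show up as exactly the 4K "hiccups" mentioned above.
print(plan_streaming(2048, [("terrain", 900), ("characters", 700), ("props", 500)]))
```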
Looks like I had better get rid of my Palit 780 🙁 and spend stupid money on a 980 Ti
Yeah, which seems odd to say the least. If you are coming from the same IP address, why should the beta care how many GPUs you use? Perhaps it's because they are trying to get a fix on how different hardware runs the game, but still, during a 24-hour lockout period they are getting zero data about anything. An IP lock would make a lot more sense if they are trying to control the spread of the beta.
An IP lock, though, can easily be circumvented just by renewing your IP lease.
Why not use the new TAA anti-aliasing option? It's much better AA than FXAA, especially in motion.
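For context, the core of TAA is just an exponential blend of the current jittered frame into a history buffer, which is why it resolves edges in motion where a purely spatial filter like FXAA cannot. A minimal NumPy sketch; the blend weight is an assumed typical value, and reprojection/clamping are omitted:

```python
import numpy as np

ALPHA = 0.1  # typical weight given to the newest frame (assumed, not the game's)

def taa_accumulate(history: np.ndarray, current: np.ndarray) -> np.ndarray:
    # Real TAA first reprojects `history` with motion vectors and clamps it
    # to the neighborhood of `current` to avoid ghosting; omitted here.
    return (1.0 - ALPHA) * history + ALPHA * current

# A stair-stepped edge alternates between 0 and 1 under sub-pixel jitter;
# accumulation converges each pixel toward its true ~0.5 edge coverage.
history = np.zeros(4)
for frame in (np.array([1, 0, 1, 0]), np.array([0, 1, 0, 1])) * 30:
    history = taa_accumulate(history, frame)
print(history.round(2))  # ~[0.43, 0.53, 0.43, 0.53], depending on jitter phase
```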
Likely, but first I need EA to solve the hardware-change limitation for me. I swap out four cards, and I am locked out of the game.
Thanks chief, nothing less than I expected. Perfectly fair too 🙂
Looks like I had better get rid of my Palit 780 🙁 and spend stupid money on a 980 Ti
Reminds me of back when I got a 580 Lightning once BF3 was about to come around 😀
I like what DICE did here in terms of optimization. Judging not only by the results but by the recommended hardware too, this is apparently a game that loads a ton of stuff quickly but doesn't go higher than a 4 GB VRAM buffer. It probably allocates memory dynamically according to what's available and makes well-balanced use of the CPU, RAM, SSD and GPU. Cards with less than 4 GB of memory will possibly suffer some hiccups at 4K ultra settings, but that is to be expected. I think Frostbite is a really good engine that uses both architectures (AMD and NVIDIA) well and does not gimp the performance of one or the other. These results are what I expect to see from a good engine that takes into account people who don't have high-end systems but want to have a great experience too.
"Memory buffer" is a somewhat overused term. Many engines simply fill the memory buffer to the maximum, with unneeded data that will never actually be computed, and empty it whenever they want; that is just one of two ways of doing it. So this doesn't necessarily mean the game really uses this amount of data (or needs this amount of storage).
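In other words, allocated is not the same as used. An illustrative sketch of the distinction (all names invented; monitoring tools generally report the reserved pool, not the working set):

```python
# Illustrative pool allocator: a monitoring tool sees the whole reservation
# as "used VRAM", even if the frame never touches most of it.
class VramPool:
    def __init__(self, reserve_mb: int):
        self.reserved_mb = reserve_mb  # what an overlay/monitor would report
        self.touched_mb = 0            # what rendering actually needs

    def alloc(self, size_mb: int) -> bool:
        if self.touched_mb + size_mb > self.reserved_mb:
            return False
        self.touched_mb += size_mb
        return True

pool = VramPool(reserve_mb=4096)  # engine grabs ~4 GB up front
pool.alloc(1800)                  # ...but the scene only needs ~1.8 GB
print(pool.reserved_mb, pool.touched_mb)  # 4096 vs 1800
```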
Looks like I had better get rid of my Palit 780 🙁 and spend stupid money on a 980 Ti
Why don't you just get a 390?
How does a 1.9 TFLOPS GPU run games 50 to 100% faster than the PS4's GPU, which has 1.83 TFLOPS? How? Really... you would need around a 770-level GPU to run games at PS4 level. A 660 won't get up to speed against the damn Xbox One either, even if it has more compute power on paper, since your PC has serious bottlenecks compared to closer-to-the-metal consoles. Now, a 960 can and will perform better than the PS4 and X1 in multiplats.
Power/thermal constraints, like I said.
Hilbert: Maybe you should try using a motherboard with 2 or 3 PCIe slots that run at 16x. Then, insert a different GPU into each slot at the same time. That way, you can benchmark more GPU varieties at a time without triggering a hardware change. Obviously, you wouldn't be benchmarking on all GPUs at the same time. Of course, this depends on how the game detects the hardware. This idea might not work but you could get double or triple the results before the game starts complaining.
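If the lockout is keyed to a fingerprint of the enumerated hardware (an assumption; EA has not documented the mechanism), the idea would work because the fingerprint only changes when the device list changes. A hypothetical sketch:

```python
import hashlib

# Hypothetical hardware fingerprint: hash the sorted list of installed GPUs.
# This is a guess at how the lock might work, not EA's actual scheme.
def hw_fingerprint(devices: list[str]) -> str:
    return hashlib.sha256("|".join(sorted(devices)).encode()).hexdigest()[:16]

installed = hw_fingerprint(["GTX 980 Ti", "R9 Fury X", "GTX 960"])
reordered = hw_fingerprint(["GTX 960", "R9 Fury X", "GTX 980 Ti"])
swapped   = hw_fingerprint(["GTX 980 Ti", "R9 Fury X", "R9 390X"])

print(installed == reordered)  # True  -> benchmarking any installed card is fine
print(installed == swapped)    # False -> physically swapping a card trips the lock
```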
Well, that's interesting: a 290X slightly beating a GTX 980 that's a good $200 more. If I hadn't already purchased a 980 Ti, I would definitely be getting a 290X right about now. Best bang for your dollar. The 970 and 780 Ti are far behind in contrast.
Well, that's interesting: a 290X slightly beating a GTX 980 that's a good $200 more. If I hadn't already purchased a 980 Ti, I would definitely be getting a 290X right about now. Best bang for your dollar. The 970 and 780 Ti are far behind in contrast.
So glad I picked up a 290X 6 months ago. I paid less for it then than the 390X costs today, even though they are practically the same card! Absolutely awesome value for the money.
Sorry, I didn't really understand exactly what you meant to say, but analysing the memory usage graph on the last page I can make a reasonable guess about it. As you probably know, much of the data that sometimes goes unused and gets discarded is at other times used as a cache, and it is better to cache it in VRAM than in RAM because of the swapping and the much higher speed of VRAM. In the graphs you can spot the 780 Ti and 960 filling their memory completely at 4K, while the 980 Ti goes past them without filling up. Looking at the Ultra HD FPS graph on page 6, the 780 Ti (3 GB) and 970 (3.5 GB) probably aren't as smooth as the others, and they sit at the bottom excluding the 2 GB cards... I can't attest whether that data is really being used in real time or discarded, or whatever technique is used, but the larger buffers do seem to offer better gameplay. Sorry for the English.
Played it this morning (completed it in 9 minutes on the first attempt) on a user's system here in my office @ 1440p. It seemed to play well enough, even though I had to crank the settings down a bit. Even at medium graphics it still looks great and plays pretty smooth... no crazy hitching or anything. I'm not a huge Star Wars fan or anything, but after playing this, it seems like this game should really get a lot of Star Wars fans excited. Also glad to see the Ti's 1440p performance... very respectable @ 80 fps, along with the Fury X.
I thought the 780 Ti's performance looked great @ 1080p, just what I kind of expected. The 970 is slower in raw GPU power than the 780 Ti, so those numbers look spot on. With a little OC I'm sure it will catch up to the 290X/390X or the GTX 980; IMO, GK110 is not old bones just yet 🙂
It would be okay if these results were with 4x MSAA, but it seems to be cheapo FXAA; the graphics are like 99% console, and it runs there at 60 fps. P.S. Seriously, a GTX 960 should be like 50-80% faster than the PS4's GPU, not slower.
Are you happy with every console-to-PC port that does not improve on graphics at all? Yes, we have magnitudes-stronger hardware in PCs, and we expect games to have improved shaders, texture resolution, and so on. So, unless you can exactly replicate the resources and settings used, there is no way to objectively compare PC and console performance.
Power/thermal constraints, like I said.
Yeah, within those constraints the PS4 will beat a 660... since it delivers its nearly 2 TFLOPS of performance as a whole package for the same or less wattage than a 660 or 660 Ti alone. And the X1 delivers its 1.4-1.5 TFLOPS at 120 watts max. You are going to need considerably more than matching theoretical numbers on PC, though.
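Putting the thread's own figures against the GTX 660's 140 W board TDP (a reference-spec number; note the console wattage covers the entire system, which is the point being made):

```python
# Rough perf-per-watt from the numbers in this thread plus the GTX 660's
# 140 W reference TDP. Card-only TDP vs whole-console draw is apples to
# oranges -- which is exactly the efficiency argument above.
systems = {
    "PS4 (whole console, approx.)": (1.84, 140),  # ~TFLOPS, ~watts under load
    "GTX 660 (card alone)":         (1.98, 140),
}
for name, (tflops, watts) in systems.items():
    print(f"{name}: {tflops / watts * 1000:.1f} GFLOPS/W")
# -> ~13.1 vs ~14.1 GFLOPS/W: near parity even before counting the PC's
#    CPU, RAM and PSU overhead.
```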
It looks like it should be playable for everyone with a decent rig, which is nice. Now we just need to find out if it's good or meh.