BioShock Infinite VGA Graphics Benchmark performance test


You'd better read up on that guy... He has been modding NV drivers for years; his word is truer than anyone else's on PC forums!
Says the guy with 2 posts and a same-day join date? :P Honestly though, the game runs super smooth; it even runs on my APU-powered laptop, surprisingly. Just hooked up my 360 controller and it feels like a console, hah. http://imageshack.us/scaled/landing/89/img20130326213356.jpg
You're quoting the guy who promised a 100% performance increase in Crysis 3 over the regular NVIDIA driver? :stewpid: He also promised AMD GPU PhysX. lmao
Shh, it IS that guy. Obvious troll is obvious.
Banned based on a post count of 1? Priceless.
Shh, it IS that guy. Obvious troll is obvious.
Sorry, been at work all day, a little slow today. What a freaking moron.
Wait, wait, wait, what the heck is this? You want to tell me NVIDIA GPUs run better in this game? lol. Even the GTX 650 Ti is better than the HD 7870, and the HD 7970 GHz is defeated by the GTX 680? I think there's something wrong with your bench, or perhaps Guru3D is highly paid by NVIDIA, or what? Check this: http://gamegpu.ru/action-/-fps-/-tps/bioshock-infinite-test-gpu.html All those benches show AMD GPUs are so strong in this game. Perhaps when DDOF is used NVIDIA shows nice performance and the GTX 680 is at the same performance as the 7970 GHz, but when I compare your bench with that one there is a lot of difference, really a big difference. I know every bench is different (hardware, scene, etc.), but this... You need to check your review again.
Well, it's just a matter of time before members themselves prove who is correct. A lot of people here have similar hardware, or hardware capable of the same performance, so just wait and see if their benches match the ones in the review and you'll have your answer. I've personally never found any significant variation in the benches posted here, so I have no doubt that G3D is legit. I'll post a bench later on that should resemble the 690 benches the most.
Have no clue whether that [H] forum guy is a troll or what, yet one has to admit there's something fishy about the newest NVIDIA drivers with their tremendous fps boost in a couple of AMD-sponsored titles that presumably stress the video card's compute capabilities hard. To set plain guesswork aside, just look at the series of benchmark diagrams from that gamegpu.ru site. Try to explain how it is that NVIDIA's performance without DOF is considerably lower than AMD's, while adding DOF brings NVIDIA up to and even above AMD!? Isn't it obvious that the DOF feature has always been a weak point of the Kepler generation, which one can see, for example, in a PC game like Metro? I honestly think NVIDIA has managed to change something in a very subtle and inconspicuous way in their game processing profiles to avoid hard computations, and the idea that they substituted FP16 for FP64 fits perfectly into my understanding of the issue. Anyway, that is purely my opinion and pure guessing.
Great stuff, Hilbert. BTW: I like the way you write. You have your own vision and that's a good thing these days. I have seen you write about things you don't like about both Nvidia and AMD. So don't let anyone get under your skin. Keep writing like you always do. Even if I don't always agree, it keeps me glued to this site. 🙂
I've just compared our 25x16 Ultra Quality with DDOF FXAA performance with theirs, and there is very little difference. You are confused: the GTX 650 Ti you refer to is the BOOST edition released yesterday, as clearly marked in the charts, not the regular Ti. BTW, the next guy who claims we are BIASED toward either AMD or NVIDIA, or insinuates that we get paid for these reviews, will get banned. I'm not joking; I spend a lot of time and work on these articles and do not wish or deserve to be insulted.
You're doing a good job, Hilbert, keep at it and don't let anyone get in your way; some people are too biased. The game runs well on the GTX 680 and 7970 GHz, so what's all this bias talk about? The ONLY downside of BioShock for me is the stuttering issue.
Nice work, Mr. Hilbert.
Thanks for the review HH, it's sad people accuse you because they prefer to live in their own world.
Hm, oh well, NVIDIA's driver is newer than AMD's, so the better-optimized drivers shone a bit at the last minute. I recall AMD's attitude about performance gains in the era after 12.11, but NVIDIA doesn't hold back; they're answering with their new driver in all the newer games! Great work Hilbert, a nice and professional review once again, as always, thanks 😀
Why aren't the video card memory sizes listed in any of the graphs? If you don't know which version of a card is being benchmarked it makes the results a lot less helpful. Specifically - is the 7850 you used a 1GB or 2GB model?
Why aren't the video card memory sizes listed in any of the graphs? If you don't know which version of a card is being benchmarked it makes the results a lot less helpful. Specifically - is the 7850 you used a 1GB or 2GB model?
Usually they are the reference or original card, so most likely 2 GB; if they used a different version, it will be listed. For example, there is a 6 GB version of the 7970, but you don't see every benchmark showing 3 GB, 3 GB, 3 GB, etc. If they used a non-reference card or a specific model from a review, they will list it as such: HIS, MSI TF3, Sapphire, etc.

As for benchmarks in general, they are a rough indication of how cards will perform in that particular game with that particular setup using those drivers. Not everyone has the same setup, and they will not get the same results. Just because the benchmark says a card gets 60 fps doesn't mean I'll get that with the same card; now if I had a copy of his entire rig, then yes. Also, these cards run at stock/factory defaults unless an overclock is specified, so you can't say "zomg my card is faster than that, how come it's on the low end? :O" You have to take the setup and how the benchmark works into consideration before you can truly understand it.

Personally, I'd note these benchmarks are given to us for free; we don't pay to see them, and a lot of time and work is put into them. They may not cater to your liking or meet your requirements for a detailed benchmark, but they work for most people. I look at multiple reviews, but I'm not going to compare every one of them. Go with whatever benchmark/reviewer you like and stick with that if you want. I follow many, but I like how this one is set up, so I always come back and read up on things.
Great review, Hilbert. I really like the new frame latency tests and I'm looking forward to your FCAT review. The FCAT reviews on Tom's Hardware and PC Perspective have really challenged people's perspectives a bit, and gotten others in a twitter. http://www.tomshardware.com/reviews/graphics-card-benchmarking-frame-rate,3466.html http://www.pcper.com/reviews/Graphics-Cards/Frame-Rating-Dissected-Full-Details-Capture-based-Graphics-Performance-Testin I have to say it's pretty funny... but apparently my last post here was in November 2005. I had to change my PC configuration and get rid of the listing that had my video card as a 6800GT overclocked to 410 MHz.
Wow, that was a good article from Tom's. But I'm not comfortable with using NVIDIA software to measure AMD hardware; not because it might cripple AMD's side of things, but rather because it might make NVIDIA's look better. I understand that it is essentially recording and analysis of captured video, but what about runt frames? Is what FCAT measures a good metric, and is there any chance it can be biased in any way?
Wow, that was a good article from Tom's. But I'm not comfortable with using NVIDIA software to measure AMD hardware; not because it might cripple AMD's side of things, but rather because it might make NVIDIA's look better. I understand that it is essentially recording and analysis of captured video, but what about runt frames? Is what FCAT measures a good metric, and is there any chance it can be biased in any way?
I'm sure there's a chance, and I'm sure NVIDIA is jumping on it because it makes AMD look bad, but they have a lot more to gain by releasing a good tool that makes their competition look bad than they do by lying. Consider: "We release a solid tool that holds us accountable and makes AMD look bad," versus "We release a biased tool that makes AMD look bad and then, when it's found out, makes us look 100 times worse."
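For context on the runt-frame question: FCAT-style analysis tags every rendered frame with a colored overlay bar and then counts how many scanlines of the captured output each frame actually occupied; a "runt" is a frame so small it contributes almost nothing visible. A minimal sketch of that classification step in Python, where the 20-scanline cutoff is my own illustrative assumption, not FCAT's actual threshold:

```python
# Classify captured frames as full, runt, or dropped, in the style of
# capture-based analysis. Input: scanlines each frame occupied on screen.
RUNT_THRESHOLD = 20  # scanlines; hypothetical cutoff, not FCAT's real value

def classify_frames(scanline_counts):
    """Return per-frame labels and a summary count per category."""
    labels = []
    for n in scanline_counts:
        if n == 0:
            labels.append("dropped")   # frame never reached the display
        elif n < RUNT_THRESHOLD:
            labels.append("runt")      # shown, but too few lines to matter
        else:
            labels.append("full")
    summary = {k: labels.count(k) for k in ("full", "runt", "dropped")}
    return labels, summary

labels, summary = classify_frames([540, 3, 0, 537, 12, 528])
print(labels)   # ['full', 'runt', 'dropped', 'full', 'runt', 'full']
print(summary)  # {'full': 3, 'runt': 2, 'dropped': 1}
```

The point of counting runts and drops separately is exactly the bias question above: a plain fps counter credits runt frames as delivered frames, while a capture-based count does not.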
One thing I don't get: why is the GTX 650 Ti Boost beating a 6970 at 1900x1200?
FCAT looks good, Hilbert. 🙂 Do the spikes resemble stuttering?
Administrator
FCAT looks good, Hilbert. 🙂 Do the spikes resemble stuttering?
Yes, but please don't draw conclusions from this chart, as that's Sim City and I was simply rotating and zooming in quickly to see if I could detect that in the chart. But that's a R7950, and you'll notice that the weird spikes we see with Fraps have vanished. This is just step 34 of 100 though! It'll be a while before we have a good grip on and understanding of the analysed data 🤓
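On the spikes question: a common way to flag stutter in a Fraps-style frame-time log (one frame time in milliseconds per frame) is to mark frames that take much longer than the run's typical frame. A rough sketch, where the 2x-median threshold is my own assumption rather than anything from this review's methodology:

```python
from statistics import median

def find_spikes(frame_times_ms, factor=2.0):
    """Return indices of frames whose time exceeds `factor` x the median."""
    base = median(frame_times_ms)
    return [i for i, t in enumerate(frame_times_ms) if t > factor * base]

# 16.7 ms is roughly 60 fps; the 55 ms frame is a visible hitch
times = [16.7, 16.5, 17.0, 55.0, 16.8, 16.6]
print(find_spikes(times))  # [3]
```

A spike list like this only says a frame was slow to render; whether it shows up as visible stutter on screen is exactly what the capture-based FCAT approach is meant to settle.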