Final Fantasy XV Official Site shows 2080 and 2080Ti Benchmark Results
If you browse the Final Fantasy XV website, its benchmark results list now shows new entries at 2560x1440 and 3840x2160. The scores shown are averages calculated across the various GPUs.
According to the page, the resulting aggregate scores may be affected by the CPU, memory, graphics-driver and other specifications used in the individual tests; they are for reference purposes only and do not constitute a guarantee of performance when running the release version of the game.
Fox2232
Senior Member
Posts: 11808
Joined: 2012-07-20
#5585391 Posted on: 09/17/2018 08:51 AM
I'm not an expert on VRAM architecture, but if I recall correctly (please correct me if I'm wrong), GDDR5X chips came in 4, 6, 8, 12 and 16 GB capacities, and the same goes for GDDR6. The 11 GB we have on the 1080 Ti/2080 Ti is 3x 4 GB chips, but one of them is cut down for whatever reason. I was honestly expecting to see at least 12 GB, but leaned more towards 16 GB of VRAM on the new generation from Nvidia. We don't see that. Instead, it's 11 GB again, this time GDDR6 (which is the main contributor to the increase in performance of Turing).
And if they release a Titan (which they probably won't, as my personal prediction is that Turing will be one of the shortest-lived generations, and the release of a Ti model alongside everything else suggests that), it will have 12/16 GB.
If they are cutting down the 4 GB chip (which I highly doubt; it's probably limited somehow at the software/BIOS level), that will cost them more than simply fitting a full 4 GB chip. But if their goal is to use that as another selling point - hey, the 2080 Ti has 11 GB, but the Titan has one more, go for the Titan and we charge you an extra $600 - then that's just disgusting.
On the AMD side, the way HBM works offers two options right now: 1 stack = 8 GB, 2 stacks = 16 GB. The next Navi will probably still have 8 GB in order to keep prices competitive and avoid the shortages seen with the initial Vega release.
11 GB on card comes from 11x 8Gbit chips. Each chip has 32bit interface... 11*32=352bit.
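The arithmetic behind that correction is easy to verify. A minimal sketch, using only the figures stated above (11 chips of 8 Gbit each, with a 32-bit interface per chip):

```python
# Memory math from the post above: 11x 8 Gbit chips, each on a
# 32-bit interface, give the 2080 Ti's 11 GB / 352-bit configuration.
CHIPS = 11
CHIP_CAPACITY_GBIT = 8   # 8 Gbit = 1 GB per chip
CHIP_BUS_WIDTH = 32      # interface width per chip, in bits

capacity_gb = CHIPS * CHIP_CAPACITY_GBIT // 8  # Gbit -> GB
bus_width = CHIPS * CHIP_BUS_WIDTH             # total memory bus width

print(f"{capacity_gb} GB on a {bus_width}-bit bus")  # 11 GB on a 352-bit bus
```

The same arithmetic explains why a 12-chip card would come out to 12 GB on a 384-bit bus.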
PrMinisterGR
Senior Member
Posts: 8103
Joined: 2014-09-27
#5585418 Posted on: 09/17/2018 09:28 AM
Yeah, they could; that's the thing. Nobody stopped this from being an 18bn-transistor Pascal with 6k+ shaders. But then we would have one and the same, again.
These cards are just a stopgap. They couldn't fit enough raster power under the hood and add their "AI" cores as well. So yeah, it makes sense that even a full two years later there isn't much of a performance increase.
cryohellinc
Senior Member
Posts: 3534
Joined: 2014-10-20
#5585422 Posted on: 09/17/2018 09:37 AM
11 GB on card comes from 11x 8Gbit chips. Each chip has 32bit interface... 11*32=352bit.
Well, my bad then.
quantum hacker
Member
Posts: 70
Joined: 2018-09-14
#5585433 Posted on: 09/17/2018 10:00 AM
Glad I scored a 1080 for $350 the other day. I figured I would want at least 2080 Ti-level strength for 4K, as the 1080 Ti seems a bit on the edge, but I wanted a bump up from the 970 so I could at least play around with some 4K things, like a custom 3840x1620 resolution in Assetto Corsa, for instance. That 2080 just looks like a lame duck at that price. If you're already laying down nearly $1,000 for a card, you may as well get the Ti variant, since you're likely going to be using it for a while. I have a feeling many 2080 owners are going to be like "I should have just gotten the 2080 Ti," since they're going to want to feel the purchase price was justifiable. I would cringe looking at benchmarks knowing my new $800 card is second-rate.