Nvidia talks Pascal 16GB Memory at 1 TB/s bandwidth

Moore's law (if you can even call it a law) is about transistor count, not performance. It's also been adjusted from a year to 18 months since its inception. As far as fabrication goes, it definitely will slow down. Intel's only real method of competing in ARM's space is with smaller processes. They've been hiring the best and brightest engineers on the planet to do just that. But classical physics can no longer be used to govern the properties of silicon (or any material, really) past 14nm. So for the first time ever, they are applying quantum theories that haven't been fully tested to actual physical applications. On the math end, it was easy to simulate and predict how stuff would behave up until this point. Now they have to hire physicists to solve decades-old physics problems in order to continue. Unless there is some major unifying breakthrough in physics, I don't see any semblance of Moore's law continuing. In fact, with Intel's 14nm delay, it's already dead for the most part in the consumer space.
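For reference, a minimal formalization of the point (my gloss, not the poster's): Moore's law describes exponential growth in transistor count,

\[ N(t) \approx N_0 \cdot 2^{\,t/T} \]

where the doubling period T was originally a year and is now usually quoted as roughly 18-24 months. Nothing in that formula says anything about performance or clock speed.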
According to the Steam Hardware Survey for October 2015, less than one tenth of one percent of users are running 4K monitors. There are 18 times as many people running QHD displays as there are people running 4K. This idea that 4K is relevant to the actual market, rather than just the marketing, is a myth. The 900 series has sold well for Nvidia because it's a good product. They went up in VRAM, which everyone wants more of; the cards run quiet and cool; they get great fps-per-dollar; the power consumption is a bonus; and there's very little competition from AMD. If they just improve performance and VRAM again, then Pascal will sell well regardless of 4K performance. Obviously its 4K performance will be better than Maxwell's, but how much better is all but irrelevant to their sales figures.
Wow, I would have thought 720p would be the highest and not 1080p. My Steam rig streams at 720p for latency purposes. I say bring it on; I'm finally starting to dive into 4K gaming and I'm loving it so far.
I'm more than likely going to be buying one of these next year; I really need to replace my two 7970s. I want one single awesome GPU now!
Moore's law (if you can even call it a law) is about transistor count, not performance. It's also been adjusted from a year to 18 months since its inception.
Transistor count is often (not always) directly correlated to performance. If you can fit more transistors into a smaller package under the same architecture, you will (generally speaking) get more performance. But suppose we were to reach the physical limitation of silicon transistors: an architecture will then have reached its maximum speed. If you want more performance out of the same design, you'll have to either:

* Increase the frequency (which draws more power; see the rough sketch below)
* Add more cores (which also draws more power)
* Add more cores but lower the frequency (which won't always help performance)
* Or optimize the software (very undependable)

Otherwise, you'll have to design a new architecture. But honestly, there's only so much you can do to get the perfect ratio of performance, TDP, and cost. There's still a long way to go until we reach that limit, but performance improvements have been noticeably slowing down. We're going to have to move on from silicon eventually. I think there's plenty of room for silicon to get us decent performance at 4K resolutions, but I personally think it's going to require some rethinking of GPU architectures; I don't think smaller fabrication nodes, tri-gate transistors, or HBM2 memory are enough.
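A rough sketch of why the frequency bullet above bites (my gloss, not the poster's): dynamic CMOS power is commonly approximated as

\[ P_{\text{dyn}} \approx \alpha \, C \, V^2 \, f \]

where \(\alpha\) is the switching activity, \(C\) the switched capacitance, \(V\) the supply voltage, and \(f\) the clock frequency. Higher clocks usually also require higher voltage, so power grows faster than linearly with frequency, which is why "just clock it higher" stops being an option long before the transistors themselves give out.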
That may be so, but hardware performance has dramatically increased too. IGPs in modern PCs and tablets (if you include Tegra) can outperform some of the best GPUs from back when 1080p went mainstream.
Possibly, but in-game graphics have also improved. Battlefield 4 or GTA V would be unplayable (with high graphics settings) at 1080p on an 8800 GTX. Since 4K became generally available (Guru3D only started doing 4K reviews with the release of the 780 Ti), the best GPUs have improved by about 50%. That's nowhere near the 400% needed, and 400% won't be seen for quite a long time.
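To put numbers on that 400% (my arithmetic, though it's straightforward): 4K pushes exactly four times the pixels of 1080p,

\[ \frac{3840 \times 2160}{1920 \times 1080} = \frac{8{,}294{,}400}{2{,}073{,}600} = 4 \]

so, to a first approximation, the same settings and framerate need roughly four times the shading and fill-rate throughput.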
On tech forums there is too much "hype" for 4K and 1440p, while fewer than 2% of people use 1440p and fewer than 0.3% use 4K. HD and Full HD are dominant: HD because laptops still use 1366x768, and most desktops are 1080p.
Too much hype!? Not sure what you're talking about here. I game at 1440p and have to say I much prefer the 27" screen size and the higher res; I have no interest in going back to 1080p. No doubt I'll feel the same if I go 4K; why would I want to go back? It's the same with 120Hz, 144Hz and 165Hz: once you go higher, damn near no one wants to go back, it's just better lol. It's just not mainstream yet, mostly because it requires a lot of money, especially 4K.
I game at 1440p and have to say I much prefer the 27" screen size and the higher res; I have no interest in going back to 1080p. No doubt I'll feel the same if I go 4K; why would I want to go back? It's the same with 120Hz, 144Hz and 165Hz: once you go higher, damn near no one wants to go back, it's just better lol
I see what you mean, but you'd be surprised how quickly you can adapt. Try playing an N64 on an old CRT TV and you're probably going to think "holy hell, this is an ugly, awful experience", but 3 hours later, suddenly it's not so bad anymore. Our brains are great at filling in gaps and refining data that doesn't exist; it just takes a moment. Many people don't really see the reason to upgrade because 1080p is usually "good enough". 4K is incomparably better looking than 1080p, but until you see it for yourself you'll never think that, because your brain does such a good job of making 1080p look better than it really is. This is one of those things where "ignorance is bliss" is very true.
Too much hype!? Not sure what you're talking about here.
Less than 1/10 of 1% of people are using 4K, but 4K is all over the marketing material for graphics cards from both AMD and Nvidia. That's marketing hype. No one is saying QHD or 4K are bad things.
Transistor count is often (not always) directly correlated to performance. If you can fit more transistors into a smaller package under the same architecture, you will (generally speaking) get more performance. […]
I honestly doubt we'll be seeing "Big Pascal" by mid-2016. It just makes no sense to think there are still new 980 Ti models being announced with GP100 around the corner. Also, 16GB of VRAM on consumer cards? Really? I remember reading a rumour about NV focusing on their compute/workstation line before the consumer class for Pascal.
I saw a great jump in GPU power demand going from a 1080p monitor to a QHD one: I need almost 2X the power I did before, and I'm getting around half the FPS with the same settings. Let alone 4K. The GPU power for mainstream single-card 4K just isn't here, and we have a long way to go until it happens. Anyone who jumps into 4K must also buy a powerful GPU and drop some graphical settings in games to get a smooth 60FPS. GPUs must make a more dramatic leap in performance to support 4K well. I think HBM is here to help with this, but I don't expect it in Pascal. Also, 4K monitors and G-Sync monitors are way too expensive today. Prices must come down before it becomes a mainstream thing. The more you read about and love PC hardware, the more you realize you must put your hand deep, deep into your pockets.
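The "almost 2X" observation lines up with raw pixel counts, assuming load scales roughly with pixels shaded (my arithmetic, not the poster's):

\[ \frac{2560 \times 1440}{1920 \times 1080} = \frac{3{,}686{,}400}{2{,}073{,}600} \approx 1.78 \]

QHD pushes about 1.78 times the pixels of 1080p, so needing nearly twice the GPU for the same settings is exactly what you'd expect.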
16 gigs, a smaller card, and less power usage with more performance: very nice. But how much will it cost, and will game developers use the power? The consoles now struggle to do 1080p and 30 fps in new titles; with Pascal the gap will be massive if they don't bring out a PS5 and Xbox Two.
Well, let's hope that developers DO NOT decide to require 16 gigs of VRAM, because they would lose 99% of gamers who have no intention of shelling out $500+ for a GPU, or who already have decent 3/4/8 gig GPUs :3eyes: I mean, eventually, yeah, I would like to see 16 gig VRAM cards in mainstream PCs and devs taking advantage of that, BUT that is at least 5 years away, if even then...
I'd bet that if a consumer card does get 16GB+, it will only be the joke of a series known as Titan. The next x80 Ti will probably get 8GB or so; just my guess.
So...? There's always something to look forward to. Volta will be old news by the time it's available, and people will have started looking forward to the next big thing.
Or, in other words, life.
The whole VRAM situation seems to have shifted away from size now. 16GB will be nice, but future games will still need to work on 4GB or less. It seems like game engines are fine-tuning towards an all-you-can-eat method, so I think most of that 16GB of VRAM will mainly just help speed up larger maps and things like load times. Of course, it will probably give you more AA options and help at higher res, but the percentage of people/gamers with a 16GB card will be tiny for the first few years. Just like the 4K users now in the Steam hardware survey.
I think we will end up needing eight times the amount of VRAM in the consoles just to play ports with max textures, like always. The Xbox 360 had 256MB, and ports were already needing 1.5/2GB... 24GB of VRAM might be in our near future XD.
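For what it's worth, the arithmetic behind that guess (my reading of the post, with the graphics split as an assumption):

\[ 256\,\text{MB} \times 8 = 2\,\text{GB}, \qquad 3\,\text{GB} \times 8 = 24\,\text{GB} \]

The first product matches the last-gen pattern the poster describes; the second applies the same 8x multiplier to a roughly 3GB graphics slice of the current consoles' 8GB of shared memory, which is where the 24GB figure would come from.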
Well, my next GPU won't be bought till Pascal is released at the earliest. I'm curious what they're gonna charge, though.
Upgrade or not, just wait and see what the performance increase is, and whether the price is reasonable (still not interested in Titan)... Hopefully AMD really competes with a NEW WEAPON, not REBRANDING :3eyes:
Well, one would assume that by the time Pascal comes out it would have inherent support for DX12 and most other features that have been available for some time now. Not sure about that "single card" situation, or solution... let us hope so. If we are still talking about hitting that 60fps mark at 4K come 2017 (Volta), then I would say something is definitely wrong with one's setup, or an upgrade has been overdue. I play 4K at 60+FPS right now with 99% of the bells and whistles turned on. And it is amazing with a monitor supporting more than a measly 60Hz, whilst using DSR at 100Hz with 4K enabled. That's the main reason I have not stepped up to a 4K monitor as of yet... 60Hz feels like garbage after anything 100Hz+ for the last couple of years.
You're gaming with quad TXs, aren't you? Definitely one of the most powerful setups available, at any cost. So of course you're running great 😀 But yeah, 4K with all the eye candy on a single card at 60Hz... not so sure about it. It might even turn out that by the time that's reality, 8K will be what 4K is now, and everybody will have switched to 1440p. Still some time ahead, I'd say.
You're absolutely right. Why should people on an enthusiast forum for hardware get excited over new hardware?
Especially not over what follows too soon after a major upgrade. Still, I was excited going from X58 to X99 😉
You're gaming with quad TXs, aren't you? […]
I upgraded from a GTX 570 and a ****ty (cheap in 2006) TN FHD monitor to a 980 Ti and a Crossover 2795QHD. Seeing that IPS panel turn on for the first time felt like looking straight into the sun; it's really awesome making such a big jump all at once. Anyway, with DSR I can run many games at 4K without much of a problem. Just need to tweak graphics settings to achieve that 60/75/90fps 😀 I'm also planning on upgrading to X99 from my X58 setup, but I'm seriously considering waiting for X199/X109 (whatever they choose to call it). I'm undecided because on one hand I can buy a cheap 6-core Xeon and some DDR3 as a stopgap; on the other hand it seems silly buying a new CPU and RAM to keep for around a year. Difficult decisions...
I'm also planning on upgrading to X99 from my X58 setup, but I'm seriously considering waiting for X199/X109 (whatever they choose to call it). I'm undecided because on one hand I can buy a cheap 6-core Xeon and some DDR3 as a stopgap; on the other hand it seems silly buying a new CPU and RAM to keep for around a year.
Why exactly is a 6-core Xeon only going to last you a year? I'm not sure what you're using now or what you do besides gaming, but as long as what you have is a quad core and newer than Sandy Bridge, I can't imagine a cheap 6-core Xeon is going to be an upgrade. The cheaper Xeons are crap for gaming purposes, because their frequencies are so low and they have a mediocre number of PCIe lanes.