Gigabyte GeForce RTX 3070 Series Renders Find Their Way to the Web

8GB. Cool, so a 1080P card then?
Moonbogg:

8GB. Cool, so a 1080P card then?
Is the 2080 Super a 1080p card with its 8GB VRAM? Not really, right? The amount of VRAM doesn't dictate "max resolution" in games. It will work nicely at 1440p and even 4K, depending on the game.
Moonbogg:

8GB. Cool, so a 1080P card then?
Try 1440 to 4K, as 8GB is plenty and will be for 3+ years. Get with the facts, not make-believe just because you want it to be correct.
Aura89:

Try 1440 to 4K, as 8GB is plenty and will be for 3+ years. Get with the facts, not make-believe just because you want it to be correct.
Let's not speak NOW, let's speak when next-gen games start popping up. It will change fast. We need to have the same discussion in 2 years.
kapu:

Let's not speak NOW, let's speak when next-gen games start popping up. It will change fast. We need to have the same discussion in 2 years.
You mean when the next series of cards are being readied for release?
No DVI connector either, huh. [SPOILER]*looks nervously at monitor*[/SPOILER]
Aura89:

Try 1440 to 4K, as 8GB is plenty and will be for 3+ years. Get with the facts, not make-believe just because you want it to be correct.
Is that why 16GB and 20GB versions of the 3070 and 3080 are coming? Ok then.
Maybe hold off, there's rumoured to be a 16GB version in the wings.
emperorsfist:

No DVI connector either, huh. [SPOILER]*looks nervously at monitor*[/SPOILER]
Nvidia's ditched those ages ago. Maybe they had 'em on the high-end 700 series last?
Alex13:

Nvidia's ditched those ages ago. Maybe they had 'em on the high-end 700 series last?
What are you talking about? Even a 1080 has a DVI-D port.
My 900-series card and 20-series didn't; guess some cards still had 'em. Need that VGA too. *Dell nods*
Alex13:

Nvidia's ditched those ages ago. Maybe they had 'em on the high-end 700 series last?
My GTX 1080 says otherwise. It's true, however, that I wasn't paying much attention to the 2xxx series after the benchmarks came in...
emperorsfist:

My GTX 1080 says otherwise. It's true, however, that I wasn't paying much attention to the 2xxx series after the benchmarks came in...
The 1080 FE doesn't have one, but AIB cards do, even on the 1080 Ti. With the 2000 series it's only DP and HDMI.
kapu:

Let's not speak NOW, let's speak when next-gen games start popping up. It will change fast. We need to have the same discussion in 2 years.
Let's speak with historical information, not an unknown future that hasn't happened yet.
Undying:

Is that why 16GB and 20GB versions of the 3070 and 3080 are coming? Ok then.
So Nvidia can get money out of suckers? Because they wish to pay more for something that won't benefit them? Absolutely. Just like AMD's currently released 16GB models, just like AMD's previous 4GB vs 8GB models, etc. Barely any difference, if any, and it showed up years after release, when saving the money and buying a new GPU later would have helped far more than the few, very few, extra FPS you got for overspending on a GPU that didn't need that much memory. Facts are facts, history is history. Get enough people complaining your GPUs don't have enough memory, even though you as the company that makes said GPUs know otherwise, and why wouldn't you release higher-memory versions and rake in more cash just to satisfy those who don't want to look at reality? Win-win for Nvidia/AMD: more money, more sales.
Aura89:

Just like AMD's currently released 16GB models, just like AMD's previous 4GB vs 8GB models, etc. Barely any difference, if any, and it showed up years after release, when saving the money and buying a new GPU later would have helped far more than the few, very few, extra FPS you got for overspending on a GPU that didn't need that much memory. Facts are facts, history is history. Get enough people complaining your GPUs don't have enough memory, even though you as the company that makes said GPUs know otherwise, and why wouldn't you release higher-memory versions and rake in more cash just to satisfy those who don't want to look at reality? Win-win for Nvidia/AMD: more money, more sales.
You are comparing low/mid-range cards with high-end cards. The 570, which comes in 4GB and 8GB versions, is a slow card not capable of 4K gaming, so yeah, 99 times out of 100 you won't need more than 4GB. But when you're talking about a 3070/3080, that 8/10GB is crossing a mighty fine line in 4K gaming at maxed settings now, let alone moving forward. As someone else said, the consoles have about 16GB of total memory, so probably 8-10GB of VRAM, and games are obviously going to use it; the devs and artists constrained by the underpowered machines of the last gen are being let loose, and as a result VRAM requirements will go up even at 1080p, and they climb fast at higher resolutions like 4K. If you only ever play at 1440p with a 3080 you'll never have any issues, but I'd say quite a few people buying it will have the ability to play at 4K.
Oh, and I just got told none of the 3080s I wanted are in stock until November. If that's right, AMD has a chance to cause some damage with a good reveal and product.
Aura89:

Try 1440 to 4K, as 8GB is plenty and will be for 3+ years. Get with the facts, not make-believe just because you want it to be correct.
People have been confusing memory allocated with memory utilized since the beginning of computers. Also, the higher the bandwidth, the faster said memory can be filled, so you can get away with less without any impact, since you can preload resources faster. I think what everyone should do is read reviews of higher-VRAM cards and decide off the data.
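For what it's worth, here's a minimal sketch of reading the driver's own memory counter, assuming the third-party pynvml package (Python bindings for NVIDIA's NVML) and an NVIDIA GPU with current drivers. NVML reports memory *allocated* by processes, the same figure most monitoring overlays display, not what the GPU actually touches each frame, which is exactly the allocated-vs-utilized confusion above.
[CODE]
# Minimal sketch: query VRAM counters via NVML (pip install pynvml).
# NVML's "used" figure is allocation, not per-frame utilization.
from pynvml import (
    nvmlInit,
    nvmlShutdown,
    nvmlDeviceGetHandleByIndex,
    nvmlDeviceGetMemoryInfo,
)

nvmlInit()
try:
    handle = nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
    info = nvmlDeviceGetMemoryInfo(handle)  # counters are in bytes
    gib = 2**30
    print(f"total VRAM:       {info.total / gib:.1f} GiB")
    print(f"allocated (used): {info.used / gib:.1f} GiB")  # what overlays show
    print(f"free:             {info.free / gib:.1f} GiB")
finally:
    nvmlShutdown()
[/CODE]
Monitoring tools generally read this same allocation counter, which is why a game can appear to "use" nearly all of an 8GB card's memory while running just as well on a card with less.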
ACEB:

You are comparing low/mid-range cards with high-end cards.
I'm stating facts. I'm sure Radeon VII users are super happy about their 16GB of VRAM, it's lasted them so well compared to newer GPUs. Same with the RTX Titan and its 24GB. Same with the Titan X/Xp and its 12GB. I'm sure they all love the fact that those GPUs cost a lot of money but are getting destroyed at 4K by GPUs with much less RAM, where the additional RAM was... supposed to make their GPU last longer..... (Only talking gaming here; yes, workstations can utilize that extra RAM.) So no, I'm stating facts; if you don't wish to believe these facts, you can put a tinfoil hat on at any time.
Aura89:

I'm stating facts. I'm sure Radeon VII users are super happy about their 16GB of VRAM, it's lasted them so well compared to newer GPUs. Same with the RTX Titan and its 24GB. Same with the Titan X/Xp and its 12GB. I'm sure they all love the fact that those GPUs cost a lot of money but are getting destroyed at 4K by GPUs with much less RAM, where the additional RAM was... supposed to make their GPU last longer..... (Only talking gaming here; yes, workstations can utilize that extra RAM.) So no, I'm stating facts; if you don't wish to believe these facts, you can put a tinfoil hat on at any time.
These aren't facts, they are copes. The 3070's 8GB is obsolete right out of the box.
Mundosold:

These aren't facts, they are copes. The 3070's 8GB is obsolete right out of the box.
I'm sorry you don't believe facts. They aren't something that should have to be believed. But some people wish to wear tinfoil hats either way. Glad I'm not one of those people, and actually look at historical information, and don't base my future on incorrect, non-factual information. ...Unless you, ya know... have an actual, provable, non-theoretical scenario where your statement is correct in at minimum 2 scenarios? Yeah... doubt it. Sorry, I just simply don't want to believe in what isn't there. Not interested, and neither should you be. Or anyone else for that matter.