GALAX GeForce GTX 1070 Box Photo and Renders Surface
Neo Cyrus
$379 USD for the non-Sucker Edition. That's $500 CAD. And I thought the $400-$450 prices of the 970s were absolutely outrageous for a card that's the bottom of what's realistically necessary for acceptable settings at 1440p.
With tax that's $565 for a lower-midrange card. Pass. And before anyone says anything about it not being towards the lower end due to how it performs compared to the last gen... No **** it should blow away the previous generation; hasn't it been like 20 months since the release of the last gen? That doesn't change the fact this 1070 is what would have been labeled something like 1060 LE in the past due to its relative place in the actual spectrum.
If Vega doesn't save the day I guess I'll buy a Neo PS4 or something. Cheaper than a low-mid range video card.
Vxheous
Koniakki
Something tells me the GTX 1070 will cost between €500-600. Mark my words.
I seriously hope I'm wrong.
sverek
Ryu5uzaku
0blivious
I want better performance and I want it to cost less and I want a pony and I want it right now.
Ryu5uzaku
Koniakki
https://cdn.meme.am/instances/500x/11314373.jpg
:D
Neo Cyrus
Undying
I'm expecting at least 450€ in the EU. Gonna wait for Polaris before I make my choice.
MainFrame Alpha
Ryu5uzaku
sverek
AndreasGuido
I've got big hopes for this one, but it all depends on the price. I wish it were something like £300-350 😀 But I would imagine it's going to be in the £450 region.
gUNN1993
kinggavin
Nvidia have cut the GTX 1070 down a little too much, I think; they should have given the GTX 1070 more CUDA cores than the GTX 980, and the UK price is probably gonna be £400. The problem for me with Pascal is we know Nvidia is gonna release that 1080 Ti in a few months, and it will probably have 12 or 16GB of VRAM and be much faster at a good price point, just like the GTX 980 Ti was. I'm gonna stick with the GTX 970 for now and wait for the GTX 1080 Ti.
Denial
Vxheous
Neo Cyrus
http://www.gpureview.com/show_cards.php?card1=36&card2=32#
The 8800GTX/Ultra was the top end single GPU card, that's the equivalent of the current Titans by every metric but price. At least back then when they sold the Ultra binned version at a stupid price it was barely any different in actual performance, it was just a few MHz faster (37MHz LOL) like the 9800 Pro Vs XT example from earlier, but even less of a difference % wise. Now you'll get an entirely different architecture that has shaders cut to boot.
Yeah, I'm only talking about apples-to-apples, single-GPU cards. I honestly don't even bother remembering dual-GPU cards; I've intentionally avoided multi-GPU setups. Maybe DX12/Vulkan/whatever will change that with the change in processing style.
The bottom line is: either nVidia's/AMD's research and development is unsustainable in terms of the resulting market prices at the rate they're going, or they're just abusing the niche market for every drop they can squeeze out. The truth is often somewhere in the middle, but I'm having an impossible time believing AMD/nVidia couldn't do much better if they really wanted to.
The 9800 Pro was still a high end card with nothing cut, the 1070 is a joke compared to what the Titan will be. The 9800 XT was just 32MHz higher on the core and 25MHz on the memory - Monchis
Bang for buck relative to consoles has plummeted hard in recent years. I remember I could easily play Xbox games at twice console resolution (480i vs 768p) on my ATI 8500, and friends with the FX5200 did it too; pff, I was able to play Call of Duty 2 on DX7 at 40-60fps @1024x768 smoothly. Then a couple of years after the Xbox 360 arrived, one could get an $80 HD4670 (don't remember the Nvidia equivalent) and easily play console games at twice console resolution as well (1024p vs 540p). But today it seems that Nvidia and AMD have decided that you need to drop a small fortune to get the same amount of physical memory as the consoles and play at the same Walmart-TV 1080p resolution. Oh, and if you want smooth gameplay you'd better grab a new G-Sync/FreeSync monitor, because the card won't get you there by itself. IMO these two need to go under investigation again, because they are known to be naughty.