Nvidia Fabs 200 special Cyberpunk 2077 graphics cards based on 2080 Ti

https://forums.guru3d.com/data/avatars/m/198/198862.jpg
Hitman1985:

By the time the game comes out, the RTX 2077 video card will only run the game on medium quality settings; everybody knows how it works.
The game will need a 2080 Ti to run properly with ray tracing on.
https://forums.guru3d.com/data/avatars/m/258/258664.jpg
fantaskarsef:

Probably some special edition to sell leftover Turing cards. I wouldn't count on Ampere being teased yet.
Nailed it.
https://forums.guru3d.com/data/avatars/m/272/272728.jpg
RavenMaster:

FFS... I need a GPU with an HDMI 2.1 socket so I can run my LG C9 OLED @ 4K/120Hz.
Why not use 2560x1440 upscaled by the TV? That's what I'm planning to use with the yet-to-be-released 48-inch OLED. The cards aren't fast enough to drive 120 fps at 4K, so why bother?
https://forums.guru3d.com/data/avatars/m/218/218363.jpg
Another one of those contests where nVidia has forgotten that Sweden & Iceland exist 🙁 Apparently we're not as good as our Scandinavian neighbors, per the rules below:
OPEN ONLY TO LEGAL RESIDENTS OF AUSTRALIA, AUSTRIA, BELGIUM, CANADA (EXCLUDING THE PROVINCE OF QUEBEC), COLOMBIA, CROATIA, CZECH REPUBLIC, DENMARK, FINLAND, FRANCE, GERMANY, GREECE, IRELAND, JAPAN, NEW ZEALAND, NORWAY, PERU, POLAND, SOUTH KOREA, SPAIN, SWITZERLAND, TAIWAN, THE NETHERLANDS, UNITED KINGDOM, THE UNITED STATES OF AMERICA (EXCLUDING PUERTO RICO AND ITS OTHER TERRITORIES AND POSSESSIONS).
https://forums.guru3d.com/data/avatars/m/258/258664.jpg
Oh I just realised, Canada except Quebec... 😀
https://forums.guru3d.com/data/avatars/m/227/227994.jpg
Let's first see whether this turns out to be Cyberjunk before making hardware skins.
https://forums.guru3d.com/data/avatars/m/215/215813.jpg
Error8:

Why not use 2560x1440 upscaled by the TV? That's what I'm planning to use with the yet-to-be-released 48-inch OLED. The cards aren't fast enough to drive 120 fps at 4K, so why bother?
I am currently using 1440p 120Hz, as you say. But one RTX 2080 Ti does around 70 fps on Ultra settings in AAA titles when the resolution is set to 4K 60Hz. With two 2080 Tis and NVLink you get around 120 fps with V-sync off, but because the native resolution is set to 4K 60Hz, you get screen tearing and a bit of micro-stuttering. At 1440p 120Hz everything is smooth as butter: no micro-stutters, no screen tearing, and G-sync activated. I just need either a DisplayPort 1.4 to HDMI 2.1 adapter or an RTX 3080 Ti with native HDMI 2.1 to achieve 4K 120Hz smoothness.
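For what it's worth, here is a rough back-of-the-envelope sketch (Python) of why 1440p/120Hz fits over today's connections while 4K/120Hz pushes you to HDMI 2.1 or the very edge of DP 1.4. It assumes 8-bit RGB, roughly 8% blanking overhead, and the commonly quoted effective link rates, so treat the numbers as ballpark only:

```python
def video_bandwidth_gbps(h, v, refresh_hz, bits_per_pixel=24, blanking_overhead=0.08):
    """Approximate uncompressed video bandwidth in Gbit/s, including blanking."""
    return h * v * refresh_hz * bits_per_pixel * (1 + blanking_overhead) / 1e9

# Commonly quoted effective (post line-coding) data rates, in Gbit/s.
links = {
    "HDMI 2.0 (18 Gbps link, 8b/10b)": 14.4,
    "DP 1.4 HBR3 (32.4 Gbps link, 8b/10b)": 25.9,
    "HDMI 2.1 FRL (48 Gbps link, 16b/18b)": 42.7,
}

modes = {
    "1440p @ 120 Hz": (2560, 1440, 120),
    "4K @ 120 Hz": (3840, 2160, 120),
}

for mode, (h, v, hz) in modes.items():
    need = video_bandwidth_gbps(h, v, hz)
    print(f"{mode}: needs ~{need:.1f} Gbit/s (8-bit RGB)")
    for link, capacity in links.items():
        verdict = "fits" if need <= capacity else "does NOT fit"
        print(f"  {link}: {verdict}")
```

With 10-bit HDR the 4K/120 figure climbs to roughly 32 Gbit/s, which is where DSC or HDMI 2.1 becomes the only option.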
https://forums.guru3d.com/data/avatars/m/198/198862.jpg
It would be cool if the card was named RTX2077.
https://forums.guru3d.com/data/avatars/m/145/145154.jpg
So they slapped a cyberpunk sticker/plate on it. That's easily worth an extra $200 now. Maybe more.
https://forums.guru3d.com/data/avatars/m/216/216349.jpg
sverek:

The promotion is on the next level. I hope CD Projekt will stand by its brand and deliver a quality product that deserves the hype it created.
I have no doubt that they are going to deliver a great product in terms of technical quality; it's their trademark. The problem is whether the game is going to be fun, and I have serious doubts about that...
https://forums.guru3d.com/data/avatars/m/115/115462.jpg
What a big pile of leather-jacket cyberpunk MEH... By the time the actual game comes out, I'm 99% sure the 3000 series will be out, and I'm willing to bet that a 2080 Ti will not be enough to max the game out. I get that it's a limited-edition giveaway thing, but still... it's really, really weird and pointless, all things considered. As for the game, I'm a big CD Projekt fan, but there's so much money going into marketing and hype around CP2077, coupled with consecutive delays... I just don't know. I hope I'm wrong, but I'm starting to get a bad feeling about it.
https://forums.guru3d.com/data/avatars/m/80/80129.jpg
Cyberdyne:

DisplayPort is a dumb name for consumers to understand, and HDMI carries audio.
DisplayPort carries audio, the overall spec is way better, and it's royalty-free. I couldn't care less what the consumer understands; if it's the only port on the display and the device, they are going to plug it in... or just rename it.
https://forums.guru3d.com/data/avatars/m/267/267641.jpg
Maybe someone can make some thermal-resistant stickers... to make the same thing cheaply.
https://forums.guru3d.com/data/avatars/m/242/242471.jpg
Netherwind:

Another one of those contests where nVidia has forgotten that Sweden & Iceland exist 🙁 Apparently we're not as good as our Scandinavian neighbors, per the rules below:
OPEN ONLY TO LEGAL RESIDENTS OF AUSTRALIA, AUSTRIA, BELGIUM, CANADA (EXCLUDING THE PROVINCE OF QUEBEC), COLOMBIA, CROATIA, CZECH REPUBLIC, DENMARK, FINLAND, FRANCE, GERMANY, GREECE, IRELAND, JAPAN, NEW ZEALAND, NORWAY, PERU, POLAND, SOUTH KOREA, SPAIN, SWITZERLAND, TAIWAN, THE NETHERLANDS, UNITED KINGDOM, THE UNITED STATES OF AMERICA (EXCLUDING PUERTO RICO AND ITS OTHER TERRITORIES AND POSSESSIONS).
Now that sucks, my country is excluded too, but not Croatia... wtf xD If I win I'll just give my aunt's name in Switzerland and she'll send it to me 😀
https://forums.guru3d.com/data/avatars/m/79/79740.jpg
Denial:

Yeah, it's going to require a new display controller, so I wouldn't expect it until Ampere. It's kind of crazy that the TV tech is ahead of the GPU tech. I kind of wish HDMI died, though. DP 1.4 is better; I don't really see the purpose of HDMI anymore.
HDMI is more flexible than DP 1.4+ (the latter being only for monitors at this stage). HDMI can be plugged into AV receivers for better audio as well. I don't think TV tech will incorporate DP anytime soon.
https://forums.guru3d.com/data/avatars/m/80/80129.jpg
alanm:

HDMI is more flexible than DP 1.4+ (the latter being only for monitors at this stage). HDMI can be plugged into AV receivers for better audio as well. I don't think TV tech will incorporate DP anytime soon.
It's only for monitors because TVs don't have it. AVRs should have been incorporating it too... My point is that since DP 1.3 (2014) it's met or exceeded HDMI in every single category, including audio. Every AVR/TV/GPU/etc. since 2015 should have had DP, and we should be at a point now where HDMI is dead. The only reason HDMI still exists is that the consortium of shitty companies collecting licensing fees on it are the same people that make the TVs.
https://forums.guru3d.com/data/avatars/m/156/156348.jpg
Man, I would love to win one of these...
https://forums.guru3d.com/data/avatars/m/232/232130.jpg
H83:

I have no doubt that they are going to deliver a great product in terms of technical quality; it's their trademark. The problem is whether the game is going to be fun, and I have serious doubts about that...
Whether the game is fun or not depends on your tastes; it can be fun for 99% of gamers and yet not fun for you.
https://forums.guru3d.com/data/avatars/m/218/218363.jpg
-Tj-:

Now that sucks, my country is excluded too, but not Croatia... wtf xD If I win I'll just give my aunt's name in Switzerland and she'll send it to me 😀
Good idea! I'll send a relative's name as well :>
https://forums.guru3d.com/data/avatars/m/216/216349.jpg
sverek:

Whether the game is fun or not depends on your tastes; it can be fun for 99% of gamers and yet not fun for you.
Of course. It's just that I found W3 boring in terms of gameplay, and I fear that Cyberpunk will be the same, although I hope I'm wrong.