GeForce RTX 2080 and 2080 Ti - An Overview Thus far


https://forums.guru3d.com/data/avatars/m/270/270041.jpg
If this stuff is true, then this will be the first time an XX80 card doesn't beat the previous XX80 Ti; in fact, by TFLOPs it will actually be slightly weaker. And the new XX80 Ti has less than 50% more TFLOPs, compared to the more-than-double jump we saw from the 980 Ti to the 1080 Ti... Interesting to see if tensor cores, or the architecture, are used to make up for this. So far it seems more like a 1080 Ti with extra cores and a few added features.
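The TFLOPs comparison above can be sketched with the usual back-of-the-envelope formula (2 FMA ops per core per clock). The core counts and boost clocks below are the rumored launch specs circulating at the time, not confirmed figures:

```python
# Rough FP32 throughput estimate: TFLOPs = 2 ops (FMA) * shader cores * boost clock (GHz).
# Core counts/clocks are rumored pre-launch specs, not confirmed numbers.
def tflops(cores: int, boost_ghz: float) -> float:
    return 2 * cores * boost_ghz / 1000.0  # GFLOPs -> TFLOPs

gtx_1080_ti = tflops(3584, 1.582)   # ~11.3 TFLOPs (known card)
rtx_2080    = tflops(2944, 1.800)   # ~10.6 TFLOPs (rumored specs)
rtx_2080_ti = tflops(4352, 1.545)   # ~13.4 TFLOPs (rumored specs)
print(f"{gtx_1080_ti:.1f} {rtx_2080:.1f} {rtx_2080_ti:.1f}")
```

With those numbers the 2080 indeed lands slightly under the 1080 Ti, and the 2080 Ti's uplift over the 1080 Ti is well short of 50%.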
https://forums.guru3d.com/data/avatars/m/196/196426.jpg
@nz3777 Ray tracing, in layman's terms, is like using a laser pointer to draw something on a wall while holding a camera shutter open until you're done drawing (i.e., moving the laser beam all over the place). Any object in the path of the light will cast a shadow, any reflective one will bounce the ray in another direction, and any translucent one will refract the ray through it, also changing its direction. But instead of a laser, this is done via mathematics, for EVERY PIXEL on the screen, and real-time ray tracing means it has to be done 30-60 or more times every second. This is extremely difficult to achieve, because the mathematics involved are very complex.
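The per-pixel process described above can be shown with a toy sketch: one camera ray per pixel, tested against a single hypothetical sphere. Real renderers do this millions of times per frame, plus shadow, reflection, and refraction bounces, which is why real-time ray tracing is so expensive:

```python
# Toy version of the per-pixel ray tracing described above: for every pixel,
# shoot one ray from the camera at the origin and test it against one sphere.
WIDTH, HEIGHT = 8, 8
SPHERE_CENTER, SPHERE_RADIUS = (0.0, 0.0, 3.0), 1.0

def hits_sphere(ray_dir):
    # Solve |t*d - c|^2 = r^2 for t; a hit exists when the discriminant >= 0.
    cx, cy, cz = SPHERE_CENTER
    dx, dy, dz = ray_dir
    a = dx*dx + dy*dy + dz*dz
    b = -2.0 * (dx*cx + dy*cy + dz*cz)
    c = cx*cx + cy*cy + cz*cz - SPHERE_RADIUS**2
    return b*b - 4*a*c >= 0

image = []
for y in range(HEIGHT):
    row = ""
    for x in range(WIDTH):
        # Map the pixel to a ray direction through a virtual screen at z = 1.
        u = (x + 0.5) / WIDTH * 2 - 1
        v = (y + 0.5) / HEIGHT * 2 - 1
        row += "#" if hits_sphere((u, v, 1.0)) else "."
    image.append(row)
print("\n".join(image))
```

This prints a small ASCII disc where rays hit the sphere; scaling it to millions of pixels, dozens of bounces, and 60 frames a second is the hard part.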
https://forums.guru3d.com/data/avatars/m/238/238795.jpg
This is making me very happy that I pulled the trigger on a 1080 Ti and decided the 2080 looked underwhelming from the initial leaks. This seems to be more and more like an accurate bit of info. Minus a few new performance-killing effects like Nvidia's wish-it-were-real ray tracing, there doesn't seem to be much difference. I am still curious about benchmarks, though; optimizations and drivers can help the 2080 improve. I still feel the 1080 Ti will be enough for me until the next refresh. Hopefully. I said it before: with PC games being console ports for the most part, and consoles not being upgraded until 2020, a 1080 Ti should suffice until the next consoles hit and more power is required. The 10xx series seems like it was meant for this current gen, and anything additional is just extra gravy.
https://forums.guru3d.com/data/avatars/m/223/223196.jpg
Two of these puppies are so very much WANT, but very little NEED. Then there's the matter of CAN AFFORD. Well, I'm going to wait and see the reviews and benchmarks, then decide.
https://forums.guru3d.com/data/avatars/m/224/224796.jpg
I think I'm more excited about NVLink than anything else so far. 🙂
https://forums.guru3d.com/data/avatars/m/79/79740.jpg
wavetrex:

@nz3777 Ray tracing, in layman's terms, is like using a laser pointer to draw something on a wall while holding a camera shutter open until you're done drawing (i.e., moving the laser beam all over the place). Any object in the path of the light will cast a shadow, any reflective one will bounce the ray in another direction, and any translucent one will refract the ray through it, also changing its direction. But instead of a laser, this is done via mathematics, for EVERY PIXEL on the screen, and real-time ray tracing means it has to be done 30-60 or more times every second. This is extremely difficult to achieve, because the mathematics involved are very complex.
So by the time a few games use it, we could be on the heels of Turing's successor, which will improve upon it further. So ray tracing on Turing may not be anything for gamers to be excited about.
https://forums.guru3d.com/data/avatars/m/270/270008.jpg
There are likely going to be one or two Gameworks games that use ray tracing in the near future, but I really don't see this becoming a popular feature for 4-5 years. Also, I wouldn't be surprised if Intel were very competitive with ray tracing when they finally release their GPU. I guess what I'm getting at is that this whole ray-tracing-in-hardware thing is mostly a gimmick at this point, unless you are a game developer; and if you are a game developer, you have some new toys.
data/avatar/default/avatar17.webp
Good write-up, looking forward to the full reviews when they come out. Is there any indication of HDMI 2.1, or any solution for 4K 120 Hz 10-bit HDR + G-Sync etc. for the top-end screens?
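For context on why HDMI 2.1 matters for that question, a quick bandwidth estimate shows 4K 120 Hz 10-bit doesn't fit in HDMI 2.0's 18 Gbit/s link. This ignores blanking and encoding overhead, so the real requirement is even higher:

```python
# Back-of-the-envelope bandwidth check for 4K 120 Hz 10-bit HDR.
# Uncompressed RGB needs width * height * refresh * bits-per-pixel bits/s;
# real links add blanking/encoding overhead, so treat this as a floor.
def required_gbps(w, h, hz, bits_per_channel, channels=3):
    return w * h * hz * bits_per_channel * channels / 1e9

need = required_gbps(3840, 2160, 120, 10)   # ~29.9 Gbit/s of raw pixel data
hdmi_2_0 = 18.0   # Gbit/s total link rate
hdmi_2_1 = 48.0   # Gbit/s total link rate
print(f"need ~{need:.1f} Gbit/s; HDMI 2.0 = {hdmi_2_0}, HDMI 2.1 = {hdmi_2_1}")
```

So without HDMI 2.1 (or DisplayPort 1.4 with compression/chroma subsampling), 4K 120 Hz 10-bit HDR is out of reach.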
https://forums.guru3d.com/data/avatars/m/242/242134.jpg
According to some posts, we should stop R&D because games are not going to use the features coming with each new gen. Guess what: no new stuff hardware-wise means no developers will introduce it into games... And so far, most people are looking for improved performance, not new features. So, not buying a new product just because you can't fully use it? OK. Maybe someone needs to explain that to people buying 100-room mansions, or cars that can go 65+ mph (as most countries have speed limits). 😀 As long as no card below the Ti has more than 8 GB, I will skip this, as Siege will need more than 8 GB of VRAM to go beyond 1440p, and I won't pay more than 800; I doubt I'll get an LC Ti for that..
https://forums.guru3d.com/data/avatars/m/239/239459.jpg
All sounds really exciting, but I won't be upgrading until I start seeing options in games that my current 1070 can't do. It's like when PhysX came out and I started seeing all these new games (well, mainly Mafia 2 at the time), and I had to get a card that could do it because I felt like I was missing out. I'll most likely skip this next generation myself, but it all depends on how quick the devs are to implement all this new technology into their games. If this DXR takes off, then I may have to dig into my wallet, but even then I'm not going to upgrade until there's a nice watercooled version available with a pre-applied block, like my Seahawk. I don't want to be messing around with waterblocks again and voiding my warranty.
https://forums.guru3d.com/data/avatars/m/79/79740.jpg
fry178:

According to some posts, we should stop R&D because games are not going to use the features coming with each new gen. Guess what: no new stuff hardware-wise means no developers will introduce it into games... And so far, most people are looking for improved performance, not new features. So, not buying a new product just because you can't fully use it? OK...
It's not so much that; it's the massive RT marketing blitz Nvidia is unleashing upon us, hoping to reel in unsuspecting gamers by the millions who may be led to believe it will transform all their games instantly. Most buyers will likely not know wtf RT is, but will be swept up by all the hype regardless.
https://forums.guru3d.com/data/avatars/m/238/238382.jpg
I really hope crypto mining doesn't take off again when the new cards are released... still waiting for prices to go back to normal *__*
https://forums.guru3d.com/data/avatars/m/199/199386.jpg
[youtube=yVflNdzmKjg] Thought this was relevant...
data/avatar/default/avatar36.webp
Now I totally understand what to look for with the ray tracing. Thanks, guys. This in theory should bring games more to life; very sweet! But of course we are still keeping PhysX, right? This is just more icing on the cake. Also, AMD has to respond to this? Nvidia cannot be touched at the moment, and in the future as well. Too damn strong!
data/avatar/default/avatar38.webp
I was really hoping that the 2060 would be competitive with the 1080, but it looks like it won't be. Or am I wrong?
https://forums.guru3d.com/data/avatars/m/242/242134.jpg
@DW75 And? When has the bus been a bottleneck on xx60 cards and up? So far I have seen "horsepower" and VRAM amount being the issue, not the bus. Just because it's slower on paper doesn't mean it will perform worse. Go back through previous-gen Ti cards and the like: why haven't they been bottlenecked, even though bus width hasn't gone past 256/384-bit for years now? As long as they keep improving other things like compression, it's not an issue. And similar with VRAM: an xx60/70 will run out of power before running into VRAM issues. I switched between xx50/Ti/60/70/80 cards (upgrades for me, new builds for others), and across all games from 2000 until R6 Siege (the newest game for me), running 1080p/1440p, VRAM wasn't the issue, but performance was...
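The bus-width point above comes down to simple arithmetic: bandwidth is bus width times per-pin data rate, so faster memory can offset a narrower bus. The GDDR6 speed for the 2080 below is an assumption based on the rumored 14 Gbps figure:

```python
# Memory bandwidth (GB/s) = bus width (bits) * per-pin data rate (Gbit/s) / 8.
# Shows why a narrower bus is not automatically slower: faster memory can
# compensate, and delta-color compression stretches the effective number further.
def bandwidth_gbs(bus_bits, gbps_per_pin):
    return bus_bits * gbps_per_pin / 8

gtx_1080    = bandwidth_gbs(256, 10)  # 320 GB/s (GDDR5X)
gtx_1080_ti = bandwidth_gbs(352, 11)  # 484 GB/s (GDDR5X)
rtx_2080    = bandwidth_gbs(256, 14)  # 448 GB/s (assumed GDDR6 @ 14 Gbps)
print(gtx_1080, gtx_1080_ti, rtx_2080)
```

A hypothetical 256-bit GDDR6 card would land near 1080 Ti bandwidth despite having 96 fewer bus bits.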
https://forums.guru3d.com/data/avatars/m/66/66219.jpg
Nice editorial, got me up to speed nicely. 🙂 Hanging out for that 2080 Ti upgrade... hope it's released with, or soon after, the 2080.
data/avatar/default/avatar10.webp
Is there any leaked mining info? I sold 65% of my mining farms, and I am ready if those cards prove their worth; just better performance and saving 50%+ on power will save me a few MWh per month.
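For scale, the energy math behind that claim is straightforward: power draw times hours gives energy, billed in kWh. The farm size and wattages below are made-up illustrative numbers, not measured card figures:

```python
# Rough mining-farm energy math: power (W) * hours = energy (Wh), here in MWh.
# Farm size and wattages are hypothetical examples, not real card figures.
cards = 500                        # hypothetical farm size
old_watts, new_watts = 250, 125    # assume new cards halve power draw
hours_per_month = 24 * 30

def mwh_per_month(watts):
    return cards * watts * hours_per_month / 1e6  # Wh -> MWh

saved = mwh_per_month(old_watts) - mwh_per_month(new_watts)
print(f"saved ~{saved:.0f} MWh per month")
```

At a few hundred cards running 24/7, halving power draw really does save tens of MWh a month.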
https://forums.guru3d.com/data/avatars/m/211/211933.jpg
I wasn't planning to buy this gen, but if they release the Ti this soon, I'll find it very hard not to upgrade.