GeForce GTX 1660 Spotted in Presentation - TU116 with GDDR5 and GDDR6 inbound

10 Series, 20 Series. RTX is the new thing! Wait, there's more GTX 10 series. Hold on, we have a 1600 series too but it's really just a 10 series refresh. Have you heard about Ray-Tracing? My biggest issue with Nvidia is their pricing; my second is their new direction in naming their cards. The stock market doesn't expect them to grow in 2019. I cannot disagree.
RTX is just another feature, like PhysX, that will be used in a limited number of titles for a limited number of years. Nice to have, but not widely adopted. Same with DX12: the hype train was real, but look at it today. No one cares whether a game runs in DX11 or DX12, just that it runs.
As a GTX 970 owner trying to finally upgrade, for a reasonable price, to something that will suit my needs, this may be it, depending on price and performance. Or I could wait even fecking longer for ATi or Nvidia at 7nm... Looking forward to the review. Will GDDR6 make a difference, and how much will that difference cost?
0blivious:

Hold on, we have a 1600 series too but it's really just a 10 series refresh.
Except it's not a refresh; the 16 series cards are still Turing GPUs, just without the RT cores (and possibly without Tensor cores as well).
Aura89:

...it's the first iteration. Your comment makes no sense. That's like saying that because tessellation could not be done even remotely well on anything but high end when originally released, it somehow never caught on... oh wait, it's in basically all games now. Not sure you understand the definition of "refresh".
No... tessellation was running fine on Nvidia when they tried to push it hard (AMD was the one having issues with it) and it didn't require dedicated hardware. Completely different! Even if it's based on the Turing architecture with all the memory/cache changes, the performance difference at the same CUDA count will be minimal. Now of course it depends on the 1660 price. If it's at $150~$180 then it's great, because the 1060 cost $250 at launch, but I bet it won't be.
ezodagrom:

Except it's not a refresh; the 16 series cards are still Turing GPUs, just without the RT cores (and possibly without Tensor cores as well).
Fox2232:

So, they can show better in-game trailers to catch more people? Same thing as implementing Ultra details, which cost you 50% of performance for minimal visual gain. But there are games where those FPS-costly features look really good in comparison to lower details (like BDO).
Never saw a situation where you lose 50% of your performance from HIGH to ULTRA. And again, it doesn't require dedicated hardware that makes the chips much more expensive.
HardwareCaps:

Even if it's based on the Turing architecture with all the memory/cache changes, the performance difference at the same CUDA count will be minimal. Now of course it depends on the 1660 price. If it's at $150~$180 then it's great, because the 1060 cost $250 at launch, but I bet it won't be.
The RTX 2060 has the same CUDA count as the 1070, yet it delivers 1070 Ti level performance. I wouldn't be surprised if the base 1660 ends up either trading blows with the RX 590 or being just a little bit weaker than it. Price-wise, it seems the 1660 6GB will be $230 and the 1660 Ti $280, apparently.
HardwareCaps:

Never saw a situation where you lose 50% of your performance from HIGH to ULTRA. And again, it doesn't require dedicated hardware that makes the chips much more expensive.
Never saw a situation where you get much visual improvement from high to ultra either.
I'm glad Nvidia finally realized that they can use Ti on their 60 series (not sure why they've avoided it for so many years) but it's a shame we don't get 2060 performance without raytracing. But, I understand why they did this - they need to convince people that the RT cores are extra special, even outside of raytracing purposes. Here's what I don't get though: Is Nvidia planning on releasing a 2050 or 2050Ti? Because if so, those would likely fall under the performance of the 1660 and 1660Ti. I'm not sure if Nvidia plans on using raytracing on the 2050s, but if they don't, that's going to make their product lineup even more confusing than it already is.
NCC1701D:

The RTX sales numbers have spoken and shareholders are not happy. Money is king. It will be interesting to see what Nvidia does with their lineup next gen. Will they abandon the RTX series, or will they still offer it when they've had some time to mature it further for better performance? Interesting times indeed.
I wish I knew the sales figures. How does the 2080Ti sell compared to the 1080Ti?
Netherwind:

I wish I knew the sales figures. How does the 2080Ti sell compared to the 1080Ti?
Yeah, I wonder about that also.
schmidtbag:

Is Nvidia planning on releasing a 2050 or 2050Ti? Because if so, those would likely fall under the performance of the 1660 and 1660Ti. I'm not sure if Nvidia plans on using raytracing on the 2050s, but if they don't, that's going to make their product lineup even more confusing than it already is.
There will probably be no 2050 this gen. These cards are here to replace the lower tier. As you can see, besides the 1660, even a 1650 is now coming.
HardwareCaps:

Never saw a situation where you lose 50% of your performance from HIGH to ULTRA. And again, it doesn't require dedicated hardware that makes the chips much more expensive.
I mentioned it for those who "Never Saw". BDO = Black Desert Online.
Fox2232:

I mentioned it for those who "Never Saw". BDO = Black Desert Online.
BDO is horribly built and not done by a respectable studio. The vast majority of games don't show this kind of behavior. Your point is invalid.
That pretty much does it then. There will be no mainstream RTX option for Turing, and Nvidia is indeed repeating their mistake with the GeForce 4. This will significantly hold back ray-tracing developments (and DLSS), making it a niche feature in a few games instead of a game changer - we can basically forget about ray-tracing for this generation. Way to shoot yourself in the foot, Nvidia. Then again, I sort of suspected that this might be the case, seeing how underpowered the 2080 Ti was in ray-tracing. There was little hope that a mainstream card would deliver acceptable performance.
D3M1G0D:

There will be no mainstream RTX option for Turing, and Nvidia is indeed repeating their mistake with the GeForce 4. This will significantly hold back ray-tracing developments (and DLSS), making it a niche feature in a few games instead of a game changer - we can basically forget about ray-tracing for this generation.
To be fair I don't think Nvidia's midrange cards supporting RTX and DLSS would have changed anything with regards to adoption of those technologies in games. PC gaming is a niche market when you consider gaming overall, aside from some franchises or genres that never had big adoption on consoles to begin with, and Nvidia is one of three players in PC graphics. I wouldn't expect realtime raytracing to see big adoption in new titles overall until the consoles support it, which might be as early as next generation if we're lucky. With that in mind Nvidia's decision to restrict RTX and DLSS to the enthusiast space makes a certain amount of sense, as it's going to be a minor feature add more than a necessity for the foreseeable future. The inclusion may be an incentive for customers willing to pay more but hardly worth paying for in a more budget-constrained market. It's basically just product segmentation which, at this point in time, makes sense as realtime raytracing isn't in any way a necessary feature.
schmidtbag:

Here's what I don't get though: Is Nvidia planning on releasing a 2050 or 2050Ti? Because if so, those would likely fall under the performance of the 1660 and 1660Ti. I'm not sure if Nvidia plans on using raytracing on the 2050s, but if they don't, that's going to make their product lineup even more confusing than it already is.
Doubt it. I can imagine everyone cringing as they see the 2050's RT benchmarks. Nvidia's shame would be complete.
HardwareCaps:

BDO is horribly built and not done by a respectable studio. The vast majority of games don't show this kind of behavior. Your point is invalid.
Therefore you saw it. Which makes your "Never Saw" statement... And it does exist, therefore it is valid. Have you played it? Have you seen the visual differences between those detail levels? Have you seen those real-time reflections and other changes in detail as you max it out?
schmidtbag:

Here's what I don't get though: Is Nvidia planning on releasing a 2050 or 2050Ti? Because if so, those would likely fall under the performance of the 1660 and 1660Ti. I'm not sure if Nvidia plans on using raytracing on the 2050s, but if they don't, that's going to make their product lineup even more confusing than it already is.
If they were planning a 2050/2050 Ti, there would be no 16 series at all. I think the RTX 2060 is as low as RTX will get in this generation. If anything, it seems like there's just a rumored 1650 for $180, and the 1050 Ti will get its price lowered.