GeForce GTX 950 Ti Mid-range Graphics Card To be Outed

> Well yeah obviously, if you don't turn on vsync in place of it then you will get tearing.
Yeah, but when V-Sync is on it introduces massive amounts of input lag. When G-Sync was announced and people were like "you have to use it to experience the difference," I kinda rolled my eyes. Then I bought a Swift and was like, holy f*ck, it's real. When you're operating in the 35-60 FPS window, G-Sync makes it feel like 90+. And with Freesync, aside from the ghosting issues and the below-sync-range stuff, it can be implemented for free in any monitor. It's like a free "better" button when buying a monitor. So why not?
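To make the input-lag point concrete, here's a toy model (all numbers are illustrative assumptions, not measurements) of how long a finished frame sits idle before scan-out under fixed-refresh V-Sync versus a variable-refresh panel. The key effect: with V-Sync at 60 Hz, a frame that just misses a refresh tick waits almost a full 16.7 ms interval; a VRR display refreshes as soon as the frame is ready.

```python
import math

# Toy comparison of pre-scan-out waiting time: fixed-refresh V-Sync vs. a
# variable-refresh (G-Sync/Freesync style) panel. Numbers are illustrative.

REFRESH_MS = 1000 / 60  # 60 Hz fixed-refresh panel

def vsync_wait(render_ms):
    """With V-Sync, a finished frame waits for the next fixed refresh tick."""
    ticks = math.ceil(render_ms / REFRESH_MS)
    return ticks * REFRESH_MS - render_ms  # idle time before scan-out

def vrr_wait(render_ms, min_interval_ms=1000 / 144):
    """A VRR panel scans out as soon as the frame is ready (within its range)."""
    return max(0.0, min_interval_ms - render_ms)

for render_ms in (17.0, 20.0, 25.0):  # render times in the ~40-59 FPS zone
    print(f"{render_ms:.0f} ms frame: "
          f"V-Sync adds {vsync_wait(render_ms):.1f} ms, "
          f"VRR adds {vrr_wait(render_ms):.1f} ms")
```

A 17 ms frame (just missing the 16.7 ms tick) waits another ~16.3 ms under V-Sync but 0 ms under VRR, which is why the 35-60 FPS range is exactly where variable refresh feels so much smoother.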
> Yeah but when V-Sync is on it introduces massive amounts of input lag. [...] It's like a free better button when buying a monitor. So why not?
If your monitor comes with it and so does your GPU, then yeah, it's a no-brainer to use it. I'm not sure about Freesync, but G-Sync monitors are often at least $100 more expensive. That's not something I'm willing to pay for when I feel my 60 FPS experience is smooth. If it isn't, I don't really mind turning down something like draw distance, shadows, or AA. After about an hour of playing something, you don't even notice those things have been toned down. But that's just me. I know a lot of people around here can't accept anything below 110%, which is usually why they'll bash a company like AMD for making a sub-par product even though its total equity is a fraction of either Intel's or Nvidia's net income.
The 7870 (aka the 370) isn't so bad. I used it to run my 4K monitor for a few weeks between selling my GTX 980 and buying my 980 Ti, and it ran older games at 4K very smoothly. That only shows how far ahead of its time the 7870 was, because I wouldn't be running anything on that GTX 570 with its DisplayPort 1.1a and 1280 MB of RAM. Though I agree they shouldn't still be selling it today; we shouldn't see anything older than GCN 1.1 at this point. But the reality is that these are details only a minority will understand and care about; the average guy who walks into a computer store won't know or care. All that truly matters is whether it will run their game at the smoothness they desire, and for things like WoW or CS:GO, yes, it does.
Hm, might be interesting as a dedicated PhysX card, depending on the price, especially if there are low profile, small form factor variants like with the 750 Ti.
Those 128-bit and narrower buses needed to die out like 3+ years ago.
> Yeah but when V-Sync is on it introduces massive amounts of input lag. [...] It's like a free better button when buying a monitor. So why not?
I can tell you that the Freesync ghosting issues only happen in the AMD Freesync demo; in real gaming tests I don't see ghosting. I used my friend's 290X and saw no ghosting at all with the AMD Freesync hotfix driver. Also, you're right, most people need to experience G-Sync/Freesync in person; games that run at a 60 FPS lock feel like more than 60 FPS. Now I don't know why Nvidia wastes its R&D time on a 950/950 Ti instead of fixing those driver bugs people want fixed.
> I can tell you that Freesync ghosting issues only happen in the AMD Freesync demo, real gaming test i don't see ghosting [...] i don't know why Nvidia waste there R&D time on 950/950 Ti & fix those drivers bugs that people want to be fix.
I don't know if they've fixed it since, but PC Perspective explored the issue, and it definitely affects frames in games and all other content. Basically, the panel is unable to overdrive the pixels while it's in the Freesync window. On the MG279Q, for example, from 35-90 FPS the panel turns off overdrive and activates Freesync, and in that FPS window it will ghost. If you exceed 90 FPS, overdrive returns to normal and you don't get ghosting anymore. G-Sync, on the other hand, calculates the overdrive in the module and updates the display accordingly. AMD could fix this (or may have already), but it requires specific paths written for each type of panel, which would mean updated drivers and whatnot. Regardless, modern panels don't ghost that badly even without overdrive, so it's not a big deal.

I think the biggest issue I have with Freesync is the lack of a specified "sync window". There are monitors out there where the minimum is 45 Hz, and that's unacceptable. AMD/VESA should have defined a minimum window and not left it up to OEMs.

Anyway, yeah, G-Sync/Freesync is something you have to experience before you can comment on it. I know a few people with 4K Acer G-Sync monitors who basically say it's the difference between 4K being acceptable and unplayable. Any game that operates in the 35-60ish FPS zone immediately feels 10x more enjoyable. I will never again buy a main gaming monitor without either tech.
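The overdrive behaviour described above for the MG279Q can be sketched as a tiny state model: inside the panel's Freesync range, overdrive is reportedly disabled (hence the ghosting); above the range, normal overdrive resumes. The range limits and the on/off behaviour here are assumptions taken from the description in this thread, not vendor specifications, and below-range behaviour is simplified.

```python
# Toy model of the reported MG279Q behaviour: pixel overdrive is disabled
# whenever the refresh rate is inside the Freesync window (~35-90 Hz).
# Limits and logic are illustrative assumptions based on the thread above.

FREESYNC_MIN_HZ = 35
FREESYNC_MAX_HZ = 90

def panel_state(fps):
    """Return (freesync_active, overdrive_active) for a given frame rate."""
    in_window = FREESYNC_MIN_HZ <= fps <= FREESYNC_MAX_HZ
    # In this model, Freesync engages only inside the window, and
    # overdrive only outside it -- the trade-off the post describes.
    return in_window, not in_window

for fps in (60, 85, 120):
    vrr, od = panel_state(fps)
    print(f"{fps} fps: Freesync={'on' if vrr else 'off'}, "
          f"overdrive={'on' if od else 'off'}")
```

The point of the sketch is the coupling: in the exact FPS range where you want variable refresh most (35-90), this panel gives up overdrive, whereas a G-Sync module computes overdrive per refresh and avoids the trade-off.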