AMD FreeSync LCD displays to be available in 2015

I wish this technology was vendor agnostic. I always go with the most power-efficient, well-performing GPU, whether it be NVidia or AMD. Currently NVidia is destroying AMD in those categories; as we all know, AMD's flagship draws massive power and can't even keep up with NVidia, which draws way less. But that could easily turn. It sucks to be locked into one vendor because of FreeSync or G-Sync. There needs to be one standard that works on both.
Adaptive Sync is a VESA standard and is part of the DisplayPort 1.2a specification. FreeSync is simply AMD's feature naming for Adaptive Sync. NVidia isn't "destroying AMD" in power efficiency or performance. Which card performs best is very dependent on the specific game being played. There really isn't much difference between NVidia and AMD in power consumption.
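For intuition, here's a toy sketch of the scheduling difference Adaptive Sync allows (my own illustration, not the actual VESA signaling; the 40-144 Hz panel range is an assumption): a fixed-refresh display scans out on a rigid clock no matter when a frame finishes, while an adaptive-sync display can hold the vertical blanking interval and scan out when the frame is ready, within the panel's supported range.

```python
# Toy model of fixed vs. adaptive refresh (illustration only, not the VESA protocol).
# With v-sync, a frame that misses a fixed 60 Hz deadline waits for the next
# scanout (stutter); without it, the display tears. Adaptive sync scans out
# when the frame is ready, clamped to the panel's min/max refresh interval.

FIXED_HZ = 60.0
MIN_HZ, MAX_HZ = 40.0, 144.0            # hypothetical panel range

def fixed_scanout(frame_done_ms):
    """Next 60 Hz tick at or after the frame's completion time (v-sync)."""
    period = 1000.0 / FIXED_HZ
    ticks = -(-frame_done_ms // period)  # ceiling division
    return ticks * period

def adaptive_scanout(frame_done_ms, last_scanout_ms):
    """Scan out when the frame is ready, within the panel's timing limits."""
    min_interval = 1000.0 / MAX_HZ       # can't refresh faster than max Hz
    max_interval = 1000.0 / MIN_HZ       # can't hold vblank past min Hz
    earliest = last_scanout_ms + min_interval
    latest = last_scanout_ms + max_interval
    return min(max(frame_done_ms, earliest), latest)

# A frame finishing at 18 ms misses the 16.7 ms tick and waits until 33.3 ms
# on a fixed display, but is shown almost immediately on an adaptive one.
print(fixed_scanout(18.0))               # ~33.3 ms
print(adaptive_scanout(18.0, 0.0))       # 18.0 ms
```

The clamp is the important part: the panel still has hard timing limits, and adaptive sync only gives the GPU freedom inside that window.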
Yeah, there is... The R9 290X uses more power and produces more heat than the GTX 780 Ti, and for all that extra power use I'm not seeing the performance lead that should be associated with it. AMD lost this round. I'm not a fanboy either, as I own a 5870. I think when the 5870 came out it was against the GTX 580, and the 580 was the more power-hungry card then, so I went with AMD. But they lost this most recent round, no doubts about it. I don't really factor price into it. I'm blessed to be able to afford whatever I really want.
The technology is vendor agnostic, lol... it is a display connector standard. Every monitor that wants to be VESA compliant and uses a DisplayPort 1.2a or newer connection will provide it. AMD has already done their driver for it; NVidia will surely reuse their G-Sync work to do the same (less happy, maybe, are Asus and BenQ, who licensed the technology and the add-in module from NVidia). Either way, I fully expect NVidia to drag their feet and delay compatibility with this DisplayPort standard as long as possible (otherwise, why buy a specific G-Sync monitor that only works with their GPUs?). I say that, but who knows, maybe NVidia is preparing a little something they could add to the G-Sync system to keep it interesting.
The HD5K series didn't compete against the GTX500 series...lol. The HD6K series competed against the GTX500 series. The R9 290X draws, according to Hilbert's calculations, 286 watts. The GTX 780 Ti draws, according to Hilbert's calculations, 262 watts. That's all the difference at full load. That difference won't even be noticeable on your power bill. Like I said, there's very little difference in power consumption. If you were really "blessed to be able to afford whatever you really want", that difference would be completely meaningless. My HD7950 runs at 100% load 24/7... For me, that difference would be almost noticeable. For a gamer, it wouldn't be noticeable at all. We're talking a couple of dollars a year here for gamers.
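To put rough numbers on that, here's the back-of-the-envelope arithmetic behind the power-bill point (the $0.12/kWh electricity price is an assumption; the 286 W and 262 W full-load figures are Hilbert's, from the post above):

```python
# Back-of-the-envelope cost of the ~24 W gap between the two cards.
# The $0.12/kWh rate is an assumed average; adjust for your local tariff.
delta_watts = 286 - 262          # Hilbert's full-load figures
price_per_kwh = 0.12             # assumed electricity price in USD

def yearly_cost(hours_per_day):
    kwh = delta_watts / 1000 * hours_per_day * 365
    return kwh * price_per_kwh

print(f"2 h/day gaming: ${yearly_cost(2):.2f} per year")    # ~$2.10
print(f"24/7 full load: ${yearly_cost(24):.2f} per year")   # ~$25.23
```

So roughly two dollars a year at typical gaming hours, and about $25 a year at 24/7 full load.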
I care about power consumption for heat and noise, not the electric bill. And yeah, I couldn't remember 4 years ago, but the 5870 was the cooler, quieter GPU and the 480 was a loud, hot beast. Now the R9 290X is the loud, obnoxious beast.
Can you really see NVidia supporting this? Probably have to rely on 'hacked' drivers for support.
Anyhow, I'll believe it when it hits the reviewers. I'm fairly sure it works, but the drivers need to be adapted to it, and that's where I'm not sure it will work at launch, just like Mantle needed a few patches and updates.
AMD has a good head start on it; basically, they already presented this in a whitepaper 6 years ago. More importantly, they already have the driver. For Mantle it's more a question of getting the games debugged. Mantle is still new for developers, and they need time to learn. Can't really throw stones at them: 2 patches for Mantle, when current games at release need more like 10 patches to fix the bugs (and BF4 is a good example of that).
Whether NVidia decides to support it or not doesn't change the fact that it's been adopted by VESA and added to the DisplayPort 1.2a specification. NVidia can either choose to do the right thing and support it, or ignore it and screw their customers over. VESA is the organization that decides on standards and specifications for PC display connectivity, not NVidia.
AMD has a good head start on it; basically, they already presented this in a whitepaper 6 years ago. More importantly, they already have the driver. For Mantle it's more a question of getting the games debugged. Mantle is still new for developers, and they need time to learn. Can't really throw stones at them: 2 patches for Mantle, when current games at release need more like 10 patches to fix the bugs (and BF4 is a good example of that).
I was actually thinking of optimisation as an NVidia topic, i.e. whether they will ever support it the way they do G-Sync. And that's right, I meant that Mantle is a good thing; it just didn't work perfectly out of the box, but it has made a fairly good improvement so far! The only thing is, I'm not sure I'd go for it if I had the performance (and GPU) to be able to choose, as I've only seen a single comparison screenshot for BF4 so far, and I thought the image quality of DX11 was better than Mantle's. But I guess that's just my opinion, and a matter of personal taste in how things look. And it was way back, when Mantle had its debut; maybe that has changed now.
Whether NVidia decides to support it or not doesn't change the fact that it's been adopted by VESA and added to the DisplayPort 1.2a specification. NVidia can either choose to do the right thing and support it, or ignore it and screw their customers over. VESA is the organization that decides on standards and specifications for PC display connectivity, not NVidia.
No, NVidia never screws over its customers 🙄 As I said above, I doubt they will fully and perfectly support it, as that would make their G-Sync more or less useless, wouldn't it? So I doubt 100% support.
If they choose not to support it, they won't be compliant with future DisplayPort specifications. They'll be stuck at the current spec.
As sykozis said, they'll be forced to support it if they ever want to move forward with DisplayPort. That being said, I'm sure they'll add some value to G-Sync in order to keep it useful.
as I've only seen a single comparison screenshot for BF4 so far, and I thought the image quality of DX11 was better than Mantle's. But I guess that's just my opinion, and a matter of personal taste in how things look. And it was way back, when Mantle had its debut; maybe that has changed now.
It was a gamma issue on BF4's side in the Mantle codepath, fixed long ago.
I think you confused the numbers. Your 5870 is nowhere near a 580. It's weaker than a 560 even.
VESA is the organization that decides on standards and specifications for PC display connectivity, not NVidia.
And it's NVidia that decides whether to support it, not VESA!
Well, I mean, if they want to claim DP 1.3 compatibility and onward, they're forced to support it, so... yeah, they kinda have to. Not to mention the fact that if FreeSync is as good as G-Sync, all the NVidia fannerds will just whine until they do. Me included.
NVidia can't claim full DP 1.3 support without supporting FreeSync. Doing so would be a blatant lie and false advertising. It's either support all the features or be stuck with DP 1.2.
Adaptive-Sync is an optional feature in DisplayPort 1.2a.
And it's NVidia that decides whether to support it, not VESA!
If NVidia wants to claim compliance with connectivity specifications, they have to support all required features.
Adaptive-Sync is an optional feature in DisplayPort 1.2a.
Yes, in DP 1.2a. Notice he said DP 1.3?
You mean Adaptive Sync, which has already been brought to 1.2a as an optional feature, will be enforced as mandatory in 1.3? That train has already passed, and DP 1.3 will deal with new features, not look back at the introduction of old ones. Monitor manufacturers want the DP compliance sticker without the additional cost, so it's a reasonable expectation that Adaptive Sync will remain optional. All in all, it's highly unlikely that DP 1.3 will change compliance requirements for an existing 1.2a feature.
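As a footnote on how support gets advertised at all: a monitor exposes its supported refresh range through its EDID, and a driver can read that to decide whether a variable-refresh window even exists. Below is a minimal sketch, assuming a standard EDID 1.4 base block with a Display Range Limits descriptor (tag 0xFD); it ignores the rate-offset flags and the vendor extension blocks that real drivers also handle, and the ≥10 Hz gap test is just an illustrative heuristic, not anything from the spec.

```python
# Sketch: read the vertical refresh range from an EDID Display Range Limits
# descriptor (tag 0xFD). Simplified: ignores the rate-offset bits in byte 4
# and vendor-specific extension blocks that some adaptive-sync monitors use.

def vertical_range(edid: bytes):
    """Return (min_hz, max_hz) from the first range-limits descriptor, or None."""
    # The four 18-byte display descriptors live at offsets 54, 72, 90, 108.
    for off in (54, 72, 90, 108):
        d = edid[off:off + 18]
        if d[0:3] == b"\x00\x00\x00" and d[3] == 0xFD:
            return d[5], d[6]   # min/max vertical field rate in Hz
    return None

def looks_variable(rng):
    """Illustrative heuristic: a wide min-to-max gap suggests a usable window."""
    return rng is not None and rng[1] - rng[0] >= 10

# Usage with a real EDID blob, e.g. on Linux (path varies per system):
# edid = open("/sys/class/drm/card0-DP-1/edid", "rb").read()
# rng = vertical_range(edid)
# print(rng, looks_variable(rng))
```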