Nvidia responds to AMD FreeSync


And isn't Thunderbolt, from Intel, faster? I have a GTX 680 on 3 x 24-inch displays and I never have screen tearing, with v-sync off all the time... and I get over 100 fps in some games. I don't see the need for this crap; v-sync only helps with SLI, as I have done it before. I guess someone along the line will hack it and create some form of software that could duplicate G-Sync. 50/50, my vote.
I smiled reading your comment and then laughed when I saw your forum rank - newbie. Forgivable. But I demand silence from you from now on. :3eyes:
So, brand deathmatch as always, lads? :wanker: Why on earth would any company let its biggest rival have the technology it has invested time and money in? Still living in a dreamworld, are we? And I must say that if AMD are just bluffing, which they most likely are, as I can't see how the display vendors who would support this could fail to see the background of it, it's a very pathetic move. Now I really hope that their Mantle thing will actually work, or else it's going to be another disappointing year for AMD. Whether you like it or not, AMD is nowhere near Nvidia or Intel when it comes to bringing new ideas & tech. So let's just hope their Mantle works and that next-gen Nvidia chips won't be too quick for them, as usual.
Just lol
If this keeps up, won't this be a win for AMD since I think laptops outsell desktops by a significant margin?
G-Sync is very nice tech, and it definitely solves a horrible problem that has been there for years. I can't understand why the industry didn't address this issue years ago, but anyway... I'd love to have it, but I don't have Nvidia GPUs or the compatible monitors. And to be honest, even if I DID have Nvidia GPUs, I don't think I would buy those horrible 1080p "gaming" monitors (gaming = cheapo, IMO) just for this. In my case I need more professional monitors for serious work when not gaming, so until these more expensive monitors (larger, higher res, IPS) support this technology, I won't be pushed to Nvidia over it...
I would not bet on current desktop monitors supporting this without some kind of official firmware update (not gonna happen) or community-driven firmware hacking.
I don't think community firmware hacks would be an issue. The community surrounding these monitors is nuts and super-active. I have an X-Star (one of the OC-able PLS panels), and the Qnix/X-Star thread on OC.N is giant - nearly 1,300 pages at this point. If a $300 Korean 1440p monitor can get a firmware flash that enables FreeSync, that would really deal a blow to G-Sync. The question is whether it can actually be done.
G-Sync is very nice tech, and it definitely solves a horrible problem that has been there for years. [...] until these more expensive monitors (larger, higher res, IPS) support this technology, I won't be pushed to Nvidia over it...
So, that new Asus 27" 120 Hz G-Sync display sounds like a good fit. It's a given that there are 4K models coming soon, too (although sadly probably 60 Hz, which is a shame, as I'd luuuurve 120 Hz 4K desktop use). CDJay
It would be nice if dynamic refresh rates became a standard, but as long as FreeSync or G-Sync is proprietary, I don't see it taking off. Nvidia could just man up and sell G-Sync chips to monitor manufacturers that work regardless of what brand your GPU is. I would rather see 120 Hz (or faster) screens become more available and more affordable. Moving from a CRT that was capable of 110 Hz to a 60 Hz LCD was painful. I don't understand why manufacturers haven't even tried to recover from that decade-long setback.
They should share the technology with AMD for the same excuse as ever....... IT'S FOR THE CHILDREN, THINK ABOUT THE CHILDREN!!!! :P j/k I'm out 😉
They are going to fail. Nvidia obviously did the research and found that the only way to make this work on desktop displays is to design and implement the G-Sync module. They even said in the interview, "Nvidia's intent in building the G-Sync module was to enable this capability and thus to nudge the industry in the right direction." This direction is to have DisplayPort 1.3 on all monitors going forward. If AMD had put any effort into this at all, they would have shown it off on a modern desktop PC. But no, they put something together quickly to make sure everyone knew that they had this technology too. Just as a side note, I walked into my local Best Buy yesterday, and out of the 30-40 monitors they had on display, 2 of them had DisplayPort inputs. Everything else was DVI/HDMI. The display industry has progressed very little in the last few years.
VESA is the industry standards body for this tech, and VBLANK control is something AMD has worked on for a few generations; even the 6000 series supports variable VBLANK. When VESA makes VBLANK a standard and monitors support it, then everyone will have it. Nvidia definitely didn't do its homework, since the concept of VBLANK dates back to the mid-to-late '90s.
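The variable-VBLANK idea being argued over here can be sketched with a toy timing model. This is purely illustrative - the function names and numbers are mine, not from any driver, scaler firmware, or the VESA spec - but it shows why stretching the vertical blanking interval matters: with a fixed refresh and v-sync on, a finished frame has to wait for the next vblank boundary, while a variable-refresh scheme starts scanout as soon as the frame is ready.

```python
import math

# Toy model of fixed vs. variable refresh (a sketch, not how any real
# driver or G-Sync/FreeSync implementation works).

def fixed_refresh_wait(frame_ready_ms, refresh_ms=1000 / 60):
    """With v-sync on and a fixed refresh, a finished frame waits for the
    next vblank boundary before scanout - the wait is perceived as stutter."""
    next_vblank = math.ceil(frame_ready_ms / refresh_ms) * refresh_ms
    return next_vblank - frame_ready_ms

def variable_refresh_wait(frame_ready_ms, min_frame_ms=1000 / 144):
    """With variable refresh, the vblank interval is stretched until the
    frame is ready, so scanout starts immediately (a frame faster than the
    panel's maximum rate still has to wait for the minimum interval)."""
    return max(0.0, min_frame_ms - frame_ready_ms)

# A frame that took 20 ms just misses the ~16.7 ms vblank on a 60 Hz panel
# and waits ~13.3 ms; with variable refresh it is displayed at once.
print(round(fixed_refresh_wait(20.0), 1))     # ~13.3
print(round(variable_refresh_wait(20.0), 1))  # 0.0
```

The same model also shows the v-sync-off trade-off the earlier posts mention: displaying the frame immediately mid-scanout avoids the wait but produces tearing instead.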
So, brand deathmatch as always, lads? [...] Whether you like it or not, AMD is nowhere near Nvidia or Intel when it comes to bringing new ideas & tech.
Are you trolling, or just that ignorant? Who licenses x64 to whom? Who had the first multi-core x86 processor? While Intel is currently the best choice for enthusiast CPUs, it has not always been this way. Nvidia and AMD (ATI) have always been neck and neck; sometimes one is better than the other for a generation, but that is about the extent of the disparity between them. GCN has proven itself to be on par with, if not better than, Kepler; as soon as AMD got off their asses and made an appropriately sized die, the almighty Titan was dethroned, forcing Nvidia to make the 780 Ti and bring their prices back to earth - and this is good for everyone. I actually really want an Nvidia GPU to try LightBoost on my BenQ, but I am making so much money from mining that I can't justify going green. While no company is perfect, blanket statements like the one you made are better suited to a pro-eugenics argument.
Who licenses x64 to whom?
To be fair, Intel licensed x86 to AMD, which served as the base for x86-64, a.k.a. AMD64. If it's about who had 64-bit first, then it's definitely Intel: Intel released their first Itanium (a.k.a. IA-64) CPU in 2001, while AMD released their first AMD64-based CPU (the Opteron) in 2003.
Who had the first multi-core x86 processor?
It depends on who you ask, but AMD and Intel released dual-core CPUs at roughly the same time in 2005. Most people probably don't really remember, but the Pentium D was out around the same month as the Athlon 64 X2. AMD released the dual-core Opteron a month earlier, but really, that's all the e-peen AMD is going to get: one month. Now, if what you meant was more than two cores, then I don't really remember who went first.
Intel produced the first dual-core processor, but AMD produced the first "native" dual-core processor. The Pentium D was nothing more than two Pentium 4 dies slapped on a single chip with an interconnect between them; the Athlon 64 X2 used a single die containing two processor cores. Core 2 Duo was the first processor from Intel designed from the start to be multi-core. Core 2 Quad was the first quad-core (developed the same way as the Pentium D)... but the Phenom X4 was the first "native" quad-core. Obviously the "native" part makes no difference, because Conroe was still faster... even with interconnects increasing latency.

Intel created x86. Intel then created IA-64. AMD created the x64 extensions to the x86 instruction set, commonly referred to as "AMD64" (or "EM64T" when referring to Intel processors). Intel didn't intend for IA-64 to be used in the consumer market; the original IA-64-based Itanium processors lacked backwards compatibility with 32-bit applications, as they didn't support the x86 instruction set. Since AMD64 is nothing but an extension to the x86 instruction set, 64-bit processors using x86-64 maintain backwards compatibility with older software.

AMD had the first 1 GHz processor.....
Intel produced the first dual-core processor, but AMD produced the first "native" dual-core processor. The Pentium D was nothing more than two Pentium 4 dies slapped on a single chip with an interconnect between them.
It was, but still 'dual-core' nonetheless. When Bulldozer was out, its 'modules' were technically more akin to a single traditional core (because the two integer cores in a module share an FPU), but most people, and AMD themselves, would just count each module as two cores anyway.
AMD had the first 1 GHz processor.....
That's more of a technical hurdle than the 'new ideas & tech' discussed in the post I was replying to. Intel probably has more technical breakthroughs in fabrication processes, whereas AMD excelled more in designs (using readily available fabrication processes). Then again, AMD was late to realize that the GHz race was over when Intel released the Core Duo, the C2D predecessor available only in laptops. So, no, Core 2 Duo wasn't Intel's first 'native' dual-core.
nvidia sucks with this gsync..
Can you elaborate?
Moderator
Have you ever felt that a good deal of PC users don't know, or care to know, how things work, but will comment on it anyway? Sure, we rub each other the wrong way many times, but many of those times good information comes out of it, maybe even a different but believable point of view. This thread has many reasonable points of view, and some clearly ignorant ones.
I don't even know why I respond to such drivel; it's probably easier to just put the dude on ignore, as he does not appear to have any info I need from him.