AMD Holds Press Conference on May 31st – Likely Announces Vega and/or HEDT

the ES samples are out already 😀
and they have Quad channel memory. They say that Dual channel memory was partly responsible for limiting Ryzen's gaming performance where the GPUs were facing light loads. Will be very interesting to see how these perform on the new S3 (internal code name) socketed platform 🙂
I dunno if we should be smiling. OEMs are less than excited about Ryzen and AMD is still fighting for their lives. And they still haven't learned how to do a proper product launch. How many times does it have to be said: having great hardware is just half of the success equation. I am looking at the Vega/Ryzen combo and I just know that getting both would be double trouble. Take Linux. GCN tanks compared to NV, and then Ryzen tanks compared to Intel. And on top of that, FreeSync does not work. That's double/triple tanking 😀 And all that within an open source software stack, which makes that famous AMD finger pointing and blaming of others not even possible.
I dunno if we should be smiling. OEMs are less than excited about Ryzen and AMD is still fighting for their lives.
Source? OEMs aren't using Ryzen because they're mostly only interested in the A320/A300 chipsets and the APUs. In case you haven't noticed, most OEM PCs don't support overclocking or decent GPUs. Of the outlying OEMs that cater to enthusiasts, some are already making products involving the B350/X370 and Ryzen (like Alienware or iBUYPOWER).
Take Linux. GCN tanks compared to NV, and then Ryzen tanks compared to Intel. And on top of that, FreeSync does not work. That's double/triple tanking 😀 And all that within an open source software stack, which makes that famous AMD finger pointing and blaming of others not even possible.
You seem to have tunnel vision. There are several tests where AMD GPUs (on the open source drivers) outperform Nvidia in Linux. Sure, Nvidia wins most of the time, but that's because game devs are specifically only supporting Nvidia. There is very little AMD can do to change this aside from bribing devs or making application-specific code in their kernel drivers, which they adamantly avoid. It wasn't very long ago that AMD got OpenGL 4.5 support, and their Vulkan support still needs work. Compared to the Windows drivers, AMD on Linux is closing the performance gap on a monthly basis. If you want the best Linux gaming experience, Nvidia is currently the only realistic option. If you want an adequate experience, AMD is perfectly fine. Very rarely will my R9 290 not keep up with whatever I play on Linux with the open source drivers.
I dunno if we should be smiling. OEMs are less than excited about Ryzen and AMD is still fighting for their lives. And they still haven't learned how to do a proper product launch. How many times does it have to be said: having great hardware is just half of the success equation. I am looking at the Vega/Ryzen combo and I just know that getting both would be double trouble. Take Linux. GCN tanks compared to NV, and then Ryzen tanks compared to Intel. And on top of that, FreeSync does not work. That's double/triple tanking 😀 And all that within an open source software stack, which makes that famous AMD finger pointing and blaming of others not even possible.
No one cares about the 1% that use Linux!
and they have Quad channel memory. They say that Dual channel memory was partly responsible for limiting Ryzen's gaming performance where the GPUs were facing light loads. Will be very interesting to see how these perform on the new S3 (internal code name) socketed platform 🙂
I fail to see how memory bandwidth is the reason the CPU has issues with game performance, or how quad channel will fix that. Memory speeds aren't going to fix that.
I fail to see how memory bandwidth is the reason the CPU has issues with game performance.
LGA1151 has the same issue. It's the whole reason why I won't go near that platform, or AM4 for that matter, and that includes the lanes. GPUs just get held back. Fallout 4 is the most extreme example on LGA1151; it's super pronounced on lower-clocked i5s paired with a high-end GPU.
No one cares about the 1% that use Linux!
Except those who do... Also, roughly 1% of Steam users are Linux users, and possibly around 2% when you include unidentified OSes and those who run Steam in Wine (those users would be recognized as Windows users). Many Linux users play games but refuse to use Steam out of principle. So even when strictly looking at gamers (which is a strong minority of home Linux users), that would make up a minimum of 2% of the PC gaming market. That's tens of thousands of people.
I fail to see how memory bandwidth is the reason the CPU has issues with game performance, or how quad channel will fix that. Memory speeds aren't going to fix that.
IGPs tend to be very memory intensive. Just memory frequency alone has a very linear effect on performance.
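For what it's worth, the channel-count effect is just arithmetic: theoretical peak bandwidth scales linearly with the number of 64-bit channels. A quick sketch (DDR4-3200 is an arbitrary example speed, not something either platform is claimed to guarantee):

```python
def ddr4_peak_gbs(mt_per_s, channels):
    # Peak bandwidth (GB/s) = channels x 8 bytes per 64-bit transfer x MT/s / 1000
    return channels * 8 * mt_per_s / 1000

print(ddr4_peak_gbs(3200, channels=2))  # 51.2  GB/s, dual channel (e.g. AM4)
print(ddr4_peak_gbs(3200, channels=4))  # 102.4 GB/s, quad channel (e.g. HEDT)
```

Whether a CPU-bound game actually uses that extra bandwidth is a separate question, which is exactly what the posters above disagree on.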
If you want the best Linux gaming experience, Nvidia currently is the only realistic option. If you want an adequate experience, AMD is perfectly fine. Very rarely will my R9 290 not keep up with whatever I play in Linux with open source drivers.
If you want to use open drivers then AMD is the only game in town.
No one cares about the 1% that use Linux!
No, you have that wrong. People like you don't care, that's all. If no one cared, AMD wouldn't work on Linux at all. Plenty of people care, just more of them than you can understand.
If you want to use open drivers then AMD is the only game in town.
Intel's open drivers are also very good, particularly if you use Skylake. But, if you want open drivers and a high-performance gaming system, AMD is the only option.
Except those who do... Also, roughly 1% of Steam users are Linux users, and possibly around 2% when you include unidentified OSes and those who run Steam in Wine (those users would be recognized as Windows users). Many Linux users play games but refuse to use Steam out of principle. So even when strictly looking at gamers (which is a strong minority of home Linux users), that would make up a minimum of 2% of the PC gaming market. That's tens of thousands of people. IGPs tend to be very memory intensive. Just memory frequency alone has a very linear effect on performance.
Well... Linux represents 2.09% of OS market share... I have a hard time imagining that 100% of them are playing games on Steam. (You'll tell me it's the same for Windows or OSX, but honestly 1.2% or 2% won't make any difference...) https://www.netmarketshare.com/operating-system-market-share.aspx?qprid=10&qpcustomd=0 But the problem is not really Linux, it's old OpenGL code that gets completely flushed by Nvidia's code and routines. I have never understood how an open source community could have been so bad at keeping their code "open" and not one-sided (setting aside that the director of Khronos is number two on Nvidia's leadership board). Old OpenGL was a joke on its own; let's hope they all move quickly to Vulkan (at least for gaming; I use plenty of professional OpenGL-based 3D software). One example: it was AMD who financed and rewrote the OpenGL viewport of Blender to accelerate its real-time rendering.
Well... Linux represents 2.09% of OS market share... I have a hard time imagining that 100% of them are playing games on Steam. (You'll tell me it's the same for Windows or OSX, but honestly 1.2% or 2% won't make any difference...) https://www.netmarketshare.com/operating-system-market-share.aspx?qprid=10&qpcustomd=0
First of all, I said PC gaming market, not the PC market as a whole. Secondly, by your logic, are you telling me 93% of Mac users are gamers? Because according to the Steam surveys, 2.99% of their userbase are Mac users, and according to the site you sourced, 3.21% of all users are Mac users. That 93% doesn't include anyone who plays games in emulators, from 3rd-party sources, or via the App Store. If we give the benefit of the doubt that there are some Mac users in "Other", there's a good chance there's a similar number of Linux users too.
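That 93% is just the ratio of the two shares cited in the thread; a quick back-of-the-envelope check (both percentages are taken from the posts above, and the comparison deliberately mirrors the parent's own loose logic of dividing a Steam share by an overall OS share):

```python
steam_mac_share = 2.99  # % of Steam survey respondents on macOS (per the post)
os_mac_share = 3.21     # % overall macOS market share cited from netmarketshare

# Under the parent's logic, this fraction of all Mac owners
# would have to be Steam gamers:
implied = steam_mac_share / os_mac_share * 100
print(round(implied))  # 93
```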
Is there something in the phrase you quoted that you don't understand?
(You'll tell me it's the same for Windows or OSX, but honestly 1.2% or 2% won't make any difference...)
Source? OEMs aren't using Ryzen because they're mostly only interested in the A320/A300 chipsets and the APUs. In case you haven't noticed, most OEM PCs don't support overclocking or decent GPUs. Of the outlying OEMs that cater to enthusiasts, some are already making products involving the B350/X370 and Ryzen (like Alienware or iBUYPOWER).
OEMs are excited about Ryzen but cannot go all-in because the platform is too unstable right now. They need to release the next microcode update, the one that adds compatibility with Hynix memory and other brands and is stable with high-speed memory in general. Once that happens, I think you will see more OEMs embracing Ryzen. Until then ...
Take Linux. GCN tanks compared to NV,
[youtube]iYWzMvlj2RQ[/youtube] You seriously have no idea what you're talking about. AMD cards currently have the best open source drivers by far on Linux. If you say that the open source driver doesn't matter, you'll just prove you literally have no clue what you're saying.
and then Ryzen tanks compared to Intel.
LOL. Not only is it not, but if you have any sort of recent kernel (4.10+), you actually get an extra 20%+ on Geekbench SMT tests while Intel's performance is the same between the two. Ryzen is probably the best CPU you can get for Linux right now, especially if you go source-based and compile for it. It destroys the 6900K and a lot of much more expensive, even dual-socket Xeons.
And on top of that, FreeSync does not work. That's double/triple tanking 😀
Freesync works. Gsync doesn't work when a compositor is present, which means you have to kill your DE's compositor to make it work. That means you don't get window borders to draw.
And all that within an open source software stack, which makes that famous AMD finger pointing and blaming of others not even possible.
For FreeSync you need AMDGPU-PRO, which is part open source. And it works. Not only are you completely clueless, you're one of the most rabid fanbois I've seen in a while. The only reason I respond to your posts lately is to make sure that people don't fall for any of the things you say.
You seem to have tunnel vision. There are several tests where AMD GPUs (on the open source drivers) outperform Nvidia in Linux. Sure, Nvidia wins most of the time, but that's because game devs are specifically only supporting Nvidia. There is very little AMD can do to change this aside from bribing devs or making application-specific code in their kernel drivers, which they adamantly avoid. It wasn't very long ago that AMD got OpenGL 4.5 support, and their Vulkan support still needs work. Compared to the Windows drivers, AMD on Linux is closing the performance gap on a monthly basis. If you want the best Linux gaming experience, Nvidia is currently the only realistic option. If you want an adequate experience, AMD is perfectly fine. Very rarely will my R9 290 not keep up with whatever I play on Linux with the open source drivers.
Actually, I am pleasantly surprised at just how well the 290 works in Linux overall. I've tried radeon; now I'm on the amdgpu driver (4.10 kernel and Mesa 17.1), and I am not even considering the proprietary driver. True, the games run below Windows FPS, but they work well, and I don't need hundreds of FPS for this kind of gaming. But that does not mean I wouldn't like more performance, or that I should be madly in love with the possibilities of "triple tanking" with a double AMD combo (have you seen Dota 2 Vulkan with Fury/Ryzen on Phoronix?). Actually, that's quadruple tanking LOL. Fury tanks, Ryzen tanks, Vulkan tanks, and on top of that FreeSync does not work, regardless of what Primeminister says. Unless someone thinks that "working" means working only in a handful of games and only with the proprietary driver.
For FreeSync you need AMDGPU-PRO, which is part open source.
For someone who is that vocal about open source, that's a doubly laughable statement. First for pointing out that we should use the proprietary driver for FreeSync, and that's after glorifying AMD's open source efforts (which are to be commended, btw). And then for pointing out that it's only partly open source. LMAO, partly open source. Yet the proprietary part is also the one that deals with the supposedly open standard FreeSync. The whole thing is hilarious.
These GTX 20 series specs already make Vega look old.
NVIDIA GTX 20 series hardware specs and release dates (NOTE: GTX 20 series GPUs will NOT make use of HBM2):
NVIDIA TITAN Xv – GV102 "Volta" (fully enabled GV102): 16.5 billion transistors, 42 SMs, 5,376 CUDA cores, 336 texture units, 96 ROPs, 384-bit, 24 GB GDDR6, 768 GB/s memory bandwidth, TSMC 12nm FFN. Release date: Q3 2018.
NVIDIA GTX 2080 Ti – GV102 "Volta" (cut-down GV102): 16.5 billion transistors, 40 SMs, 5,120 CUDA cores, 320 texture units, 88 ROPs, 352-bit, 22 GB GDDR6, 704 GB/s memory bandwidth, TSMC 12nm FFN. Release date: Q4 2018, around the holidays.
NVIDIA GTX 2080 – GV104 "Volta" (fully enabled GV104): 11 billion transistors, 28 SMs, 3,584 CUDA cores, 224 texture units, 64 ROPs, 256-bit, 16 GB GDDR6, 512 GB/s memory bandwidth, TSMC 12nm FFN. Release date: Q1 2018.
NVIDIA GTX 2070 – GV104 "Volta" (cut-down GV104): 11 billion transistors, 21 SMs, 2,688 CUDA cores, 168 texture units, 64 ROPs, 256-bit, 16 GB GDDR6, 410 GB/s memory bandwidth. Release date: Q1 2018.
NVIDIA GTX 2060 Ti – GV106 "Volta" (fully enabled GV106): 6.6 billion transistors, 14 SMs, 1,792 CUDA cores, 112 texture units, 48 ROPs, 192-bit, 12 GB GDDR6, 300 GB/s memory bandwidth, TSMC 12nm FFN. Release date: Q3 2018.
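Whatever the merits of the rumor, the bandwidth figures are at least internally consistent: peak memory bandwidth is just bus width times per-pin data rate. A quick check (the ~16 Gb/s per-pin GDDR6 rate is inferred from the numbers in the post, not confirmed anywhere):

```python
def mem_bandwidth_gbs(bus_width_bits, gbps_per_pin):
    # Peak bandwidth (GB/s) = bus width in bits x per-pin rate (Gb/s) / 8 bits per byte
    return bus_width_bits * gbps_per_pin / 8

print(mem_bandwidth_gbs(384, 16))  # 768.0 -> the listed Titan Xv figure
print(mem_bandwidth_gbs(352, 16))  # 704.0 -> the listed 2080 Ti figure
print(mem_bandwidth_gbs(256, 16))  # 512.0 -> the listed 2080 figure
```

The 2070's 410 GB/s on the same 256-bit bus would imply a slower ~12.8 Gb/s per-pin rate, which is one reason to treat the list with skepticism.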
NVIDIA GTX 20 series hardware specs and release dates (NOTE: GTX 20 series GPUs will NOT make use of HBM2):
NVIDIA TITAN Xv – GV102 "Volta" (fully enabled GV102): 16.5 billion transistors, 42 SMs, 5,376 CUDA cores, 336 texture units, 96 ROPs, 384-bit, 24 GB GDDR6, 768 GB/s memory bandwidth. Release date: Q3 2018.
NVIDIA GTX 2080 Ti – GV102 "Volta" (cut-down GV102): 16.5 billion transistors, 40 SMs, 5,120 CUDA cores, 320 texture units, 88 ROPs, 352-bit, 22 GB GDDR6, 704 GB/s memory bandwidth, TSMC 12nm FFN. Release date: Q4 2018, around the holidays.
NVIDIA GTX 2080 – GV104 "Volta" (fully enabled GV104): 11.2 billion transistors, 28 SMs, 3,584 CUDA cores, 224 texture units, 64 ROPs, 256-bit, 16 GB GDDR6, 512 GB/s memory bandwidth, TSMC 12nm FFN. Release date: Q1 2018.
NVIDIA GTX 2070 – GV104 "Volta" (cut-down GV104): 11.2 billion transistors, 21 SMs, 2,688 CUDA cores, 168 texture units, 64 ROPs, 256-bit, 16 GB GDDR6, 410 GB/s memory bandwidth. Release date: Q1 2018.
NVIDIA GTX 2060 Ti – GV106 "Volta" (fully enabled GV106): 6.6 billion transistors, 14 SMs, 1,792 CUDA cores, 168 texture units, 48 ROPs, 192-bit, 12 GB GDDR6, 300 GB/s memory bandwidth. Release date: Q3 2018.
Nice troll post, why did you join these forums to post this exact post in three different forums?