AMD RX Vega Shown Against GTX 1080 at Budapest Event

https://forums.guru3d.com/data/avatars/m/63/63170.jpg
I would expect this to be the air-cooled version if there was a $300 difference in AMD's favour. I can't see a water-cooled card going for less than an air-cooled card (unless they used an expensive water-cooled 1080...). Not too long to wait 🙂
https://forums.guru3d.com/data/avatars/m/108/108389.jpg
Wow, on the TV it says "where we heal our planet". Yeah, good luck healing our planet with a 400W heater that performs the same as the competition's 200W GPU.
data/avatar/default/avatar06.webp
I would expect this to be the air-cooled version if there was a $300 difference in AMD's favour. I can't see a water-cooled card going for less than an air-cooled card (unless they used an expensive water-cooled 1080...). Not too long to wait 🙂
They were comparing the total costs of the two systems (including FreeSync and G-Sync monitors), so most of the $300 difference is made up by the monitor cost. I feel like they've tried to hide as much as possible about Vega. We don't know which system was which, but one was generally felt to offer a small but noticeable improvement over the other, so we don't have any real idea of its performance. We don't know the price - that was hidden behind the monitor price differential. I don't think we'll know much at all until reviews after 31st July (and I fully expect there'll be more 'wait for mature drivers' after that, going by their marketing tactics so far).
https://forums.guru3d.com/data/avatars/m/179/179962.jpg
Don't know about you, but my attention was on the two Hungarian babes in the back :wanker:, especially the one on the right 🤓 🤓. That's why "I like VEGA" :3eyes:
https://forums.guru3d.com/data/avatars/m/232/232130.jpg
Doesn't look good at all. 1) No FPS counter. 2) The monitor could be using FreeSync, thus making the image look smoother than Nvidia's. 3) AMD is just stalling for time here. Is it trying to magically boost fps with drivers or BIOS?
"AMD told attendees that the AMD and NVIDIA systems had a $300 difference in AMD's favor."
This is a plainly stupid statement. What are the "systems"? Nvidia running with an Intel i9 CPU? I would love to see Vega succeed, but AMD makes it very hard to believe in. Very unprofessional.
data/avatar/default/avatar16.webp
Doesn't look good at all. ... I would love to see Vega succeed, but AMD makes it very hard to believe in. Very unprofessional.
Why unprofessional!? They have a turd and do EVERYTHING to make us think it is actually not a turd. Quite professional, I believe.
data/avatar/default/avatar36.webp
Doesn't look good at all. 1) No FPS counter. 2) The monitor could be using FreeSync, thus making the image look smoother than Nvidia's. 3) AMD is just stalling for time here. Is it trying to magically boost fps with drivers or BIOS? This is a plainly stupid statement. What are the "systems"? Nvidia running with an Intel i9 CPU? I would love to see Vega succeed, but AMD makes it very hard to believe in. Very unprofessional.
According to a few people from /r/amd who were at the event, both systems were meant to have FreeSync/G-Sync enabled, though apparently a problem was identified with one of the systems halfway through, which they took time out to correct.
data/avatar/default/avatar03.webp
Doesn't look good at all. 1) No FPS counter. 2) The monitor could be using FreeSync, thus making the image look smoother than Nvidia's. 3) AMD is just stalling for time here. Is it trying to magically boost fps with drivers or BIOS? This is a plainly stupid statement. What are the "systems"? Nvidia running with an Intel i9 CPU? I would love to see Vega succeed, but AMD makes it very hard to believe in. Very unprofessional.
Secrecy always has a reason behind it. If you had something that was soooo great, there would be no reason to do anything but show it off and hype it everywhere with big blinking labels. I lost hope long ago. Luckily the CPU side looks way better.
https://forums.guru3d.com/data/avatars/m/232/232130.jpg
According to a few people from /r/amd who were at the event, both systems were meant to have FreeSync/G-Sync enabled, though apparently a problem was identified with one of the systems halfway through, which they took time out to correct.
In that case, the "Nvidia system" probably just added cost to itself by having G-Sync.
Why unprofessional!? They have a turd and do EVERYTHING to make us think it is actually not a turd. Quite professional, I believe.
Eh, yeah. At this stage, I don't want to know how deep AMD's rabbit hole goes. Communication-wise, AMD is digging a grave for Vega.
data/avatar/default/avatar19.webp
SERIOUSLY? Guru3D is now on the low level of ****? A random Reddit user claims he was "told" by an AMD employee that it was a 1080, WCCF steals the BS and makes an article; that's business as usual... but Guru3D... this is really low. I would take it down, Hilbert, or at least not draw conclusions from that BS. The facts are that the monitors/computers were masked off, and no one knows which was which or with what components. A random user claiming he was told something is just BS, not news.
https://forums.guru3d.com/data/avatars/m/243/243702.jpg
Wow, on the TV it says "where we heal our planet". Yeah, good luck healing our planet with a 400W heater that performs the same as the competition's 200W GPU.
Good idea. 400W of power (which is not the GPU figure in the article) comes to at most 9.6 kWh in one day. A daily commute to work and back by car is in the range of 100 kWh burned (2 people, 1 hour), the one sourced by nuclear power, the other by fossil fuel. Then you have lovely 140MW airplanes; a flight taking 3 hours... 420 MWh (a 400-person trip). So, yes, horrid GPUs powered by electricity. = = = = And please note that heaters are not competing with GPUs. And I am sure a 400W heater does its job better than a 200W GPU.
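To make the back-of-the-envelope numbers above easy to check, here is a minimal sketch of the same arithmetic. The wattages, durations and passenger counts are the figures quoted in the comment, not measured data.

```python
# Back-of-the-envelope energy comparison using the figures quoted above.
# All inputs are the commenter's assumptions, not measurements.

GPU_SYSTEM_W = 400        # whole-system draw quoted in the thread, watts
HOURS_PER_DAY = 24

gpu_kwh_per_day = GPU_SYSTEM_W * HOURS_PER_DAY / 1000
print(f"400 W system running 24 h: {gpu_kwh_per_day:.1f} kWh")   # 9.6 kWh

CAR_COMMUTE_KWH = 100     # claimed fuel energy for a 1 h round-trip commute, 2 people
PLANE_MW = 140            # claimed average power draw of an airliner
FLIGHT_HOURS = 3

flight_mwh = PLANE_MW * FLIGHT_HOURS
print(f"Car commute: {CAR_COMMUTE_KWH} kWh, 3 h flight: {flight_mwh} MWh")  # 100 kWh, 420 MWh
```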
data/avatar/default/avatar24.webp
Especially with how they've "cloaked" the screens, I can totally see the price being influenced by G-Sync the most. Overall system cost is not what interests us at this point; the card itself is the info we need (and one might even say "deserve" at this point; all the secrecy is getting old).
https://forums.guru3d.com/data/avatars/m/103/103120.jpg
More than a full year after the GTX 1080. Seems like the whole architecture is another flop... So we can hardly expect any noticeable advances in the market or in competition.
data/avatar/default/avatar25.webp
I was there. Check-in: OK. 60-80 people standing in front of the doors: OK. What is inside? --> The third person I asked finally said something -> you can play 2 minutes with each game. What games? - Dunno. Okay, let's drink a glass of water. Oh, it's 16:04 and the free drinks were only until 16:00. Cool! Well, let's see how fast the queue moves. Well, in 20 minutes nothing changed, except the temperature was nearly like standing in a sauna. Okay, someone told me they let 20 people inside at a time, but you couldn't see what was inside. ---> 2-3 hours standing in a sauna without water (or you go out, pay for it, and lose your place in the queue) ----------> bye-bye and **** you AMD. Standing in a little dark place, seeing only nerds, boah man. Never again such an event.
https://forums.guru3d.com/data/avatars/m/16/16662.jpg
Administrator
BeZol, did you take any photos I can add to this article?
https://forums.guru3d.com/data/avatars/m/16/16662.jpg
Administrator
Not that you can base an opinion on AIDA64 GFX results, but if two RX 580s in CrossFire are faster than an AIB 1080 Ti, that would disappoint how, precisely?
data/avatar/default/avatar37.webp
Budapest, I've been there many times. One of the best-looking cities I have ever seen in my life... Gorgeous women too! My god.
data/avatar/default/avatar11.webp
= = = = And please note that heaters are not competing with GPUs. And I am sure a 400W heater does its job better than a 200W GPU.
It certainly does. OTOH... in the middle of winter, having a 400W GPU, you are no better or worse off than having a 400W heater. Let's do it one more time:
Q: What is the difference, heat-production wise, between a GPU which consumes 400W and a 400W heater?
A: NONE. 100% of the electrical energy is converted into heat in both cases, very slightly diminished by noise, LED lights and fan movement.
Q: If your GPU produces 400W, how big does your cooler have to be?
A: At least 400W TDP.
Everyone agrees? Right. Let us proceed then:
=== $100 QUESTION BELOW === JOIN NOW AND WIN BIG === $$$ ===
Q: What is the difference between a GPU which consumes 400W and a 400W TDP GPU?
A: ??
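As a sanity check on the Q&A above, a minimal sketch of the energy-balance argument: essentially all electrical power drawn by a GPU ends up as heat in the room, just as with a resistive heater, so the cooler has to be sized for roughly the full power draw. The 0.5% "non-heat" loss fraction below is an illustrative guess for light, sound and airflow, not a measured value.

```python
# Energy balance for a GPU vs. a resistive heater, per the Q&A above.
# The non-heat loss fraction is an illustrative guess; it is tiny either way.

def heat_output_w(electrical_w: float, non_heat_fraction: float = 0.0) -> float:
    """Power that ends up as heat in the room, in watts."""
    return electrical_w * (1.0 - non_heat_fraction)

heater_w = heat_output_w(400)                          # 400 W, all of it heat
gpu_w = heat_output_w(400, non_heat_fraction=0.005)    # ~398 W once LEDs/noise/airflow are deducted

# The cooler must move at least this much heat, hence a TDP rating of ~400 W.
print(f"Heater: {heater_w:.0f} W of heat, GPU: {gpu_w:.0f} W of heat")
```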
https://forums.guru3d.com/data/avatars/m/63/63170.jpg
Especially with how they've "cloaked" the screens, I can totally see the price being influenced by G-Sync the most. Overall system cost is not what interests us at this point; the card itself is the info we need (and one might even say "deserve" at this point; all the secrecy is getting old).
From what I've read, the AMD system had FreeSync and the Nvidia system had G-Sync. Someone surmised that G-Sync monitors are about $200 more expensive than FreeSync ones, so that would leave about $100 difference in price for the card. They were supposed to be identical systems apart from the monitor tech and the GPUs.
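For what it's worth, the price reasoning in this comment reduces to a single subtraction. A minimal sketch with the figures quoted above; the $200 G-Sync premium is the commenter's estimate, not a confirmed number.

```python
# Rough breakdown of the quoted $300 system price gap, using the
# commenter's assumed ~$200 G-Sync-vs-FreeSync monitor premium.

system_price_gap_usd = 300   # total difference AMD quoted between the two systems
monitor_premium_usd = 200    # assumed G-Sync premium over a comparable FreeSync monitor

card_price_gap_usd = system_price_gap_usd - monitor_premium_usd
print(f"Implied GPU price difference: ${card_price_gap_usd}")   # ~$100
```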