Here it is: Radeon RX Vega reference card shows up on photo

Maaan it's been ages since I've last been rickrolled! 😀
Rick will be like, WTF, this track is still pulling views!
I think people have forgotten how unexpected Pascal's performance actually was. Even Nvidia themselves said it had exceeded expectations, and it was one of the largest leaps in performance in living memory; I personally don't remember such an increase. RX Vega just didn't reach expectations and landed in between the 1080 and the 1080 Ti. Still not a bad card for the money, but that's the whole point, isn't it? The leader can ask whatever they want and get away with it. Hopefully it'll bring 1080 prices down at least.
Did you see the rumored price for the Swedish market? 120 euros more than a 1080 Ti... and 400 euros more than a 1080. :3eyes: I guess the final MSRP will be lower, but AMD's CEO believes mining will increase (or sustain?) AMD GPU sales:
Lisa Su, CEO of chip maker Advanced Micro Devices, said in an analyst conference call that enthusiasm for cryptocurrency and gaming will drive demand for the company’s Vega graphics chips in the third quarter.
https://venturebeat.com/2017/07/25/amd-ceo-cryptocurrency-mining-and-gaming-will-boost-q3-graphics-shipments/
I didn't take mining into consideration for the top-end RX Vega cards, because I don't believe they'll make economic sense for it, and given the size of that thing I bet power consumption is pretty high. It looks like it has 600 W worth of cooling on it.
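For what it's worth, the economics argument is just back-of-the-envelope arithmetic: a card that earns well but burns a lot of power can still come out behind. Every number in this sketch (hashrate, payout rate, power draw, electricity price) is a made-up placeholder, not a real Vega spec:

```python
# Back-of-the-envelope mining profitability.
# All inputs below are hypothetical placeholders, NOT measured Vega figures.
def daily_profit(hashrate_mhs, payout_per_mhs_day, power_watts, price_per_kwh):
    """Revenue minus electricity cost for one day of mining."""
    revenue = hashrate_mhs * payout_per_mhs_day   # $/day earned
    energy_kwh = power_watts / 1000 * 24          # kWh consumed per day
    cost = energy_kwh * price_per_kwh             # $/day spent on power
    return revenue - cost

# Example: 40 MH/s at $0.06 per MH/s per day, a 300 W card, $0.20/kWh
print(daily_profit(40, 0.06, 300, 0.20))  # ~0.96 $/day
```

Bump the power draw toward the kind of figure being joked about above and the margin shrinks fast, which is the whole point.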
Yeah...I'm skeptical as to why they keep doing these blind gaming tests. Seems odd.
I hate it when they can't be bothered to show the results in the post itself, and instead make people sit through a video just to find out. I get the whole wanting-hits-for-the-video thing, but is it really that hard to also put the results in writing?
In this case the (subjective) result could be: "My 100 Hz sync is better than your 100 Hz sync!" What about GPU performance? "Was this a GPU performance review?" LOL
[youtube]VcgRaf38uCQ[/youtube]
I find it irritating that people think making a card that will crush the competition is as simple as writing "OMG, one year later they should crush the 1080 Ti...". If it were that simple, Intel would have been in the GPU market a long time ago, even at the mid-range. Iris, anyone? And so many companies disappeared along the way: S3, Matrox, XGI, 3dfx, PowerVR (or left the PC market). They didn't disappear because they failed to think about how fast their cards should be; they disappeared because the competition was so fierce they couldn't keep up. Because it's HARD! Edit: I'm not saying Vega is a good card or that you should buy one; that's for everyone to judge after the official reviews.
Congratulations, both systems were able to produce 100 fps at 100 Hz. This feels like AMD sticking two fingers up at Nvidia fanboys just because the monitor costs more. It's a PR move to try to shake a few Nvidia users out of their fanboyism and get them to join team red: "Damn, if I'd only known FreeSync would have cost me $300 less, I would have jumped ship." ^^
I find it irritating that people think making a card that will crush the competition is as simple as writing "OMG, one year later they should crush the 1080 Ti...". If it were that simple, Intel would have been in the GPU market a long time ago, even at the mid-range. Iris, anyone?
I agree - let's not forget how expensive it is to design these things. GPUs are arguably more advanced than CPUs, yet you'll often find GPU architectures get replaced more often than CPU architectures. Nvidia makes more money than AMD, so they can afford major changes more often.
And so many companies disappeared along the way: S3, Matrox, XGI, 3dfx, PowerVR (or left the PC market). They didn't disappear because they failed to think about how fast their cards should be; they disappeared because the competition was so fierce they couldn't keep up. Because it's HARD!
You're right about the competition being fierce, though the situation isn't quite as rough as you might think.

Matrox (to my knowledge) is still around, but pretty much only found in servers. I don't think competition was the problem; they just didn't play things smart. VIA (S3/Chrome) is also still around, but almost completely vanished from mainstream PCs once Intel integrated the northbridge and pretty much forced their own southbridge to be used (I don't think VIA had much to do with AMD). In other words, it wasn't the GPU market that crushed them but rather the evolution of x86. Kind of a shame, because VIA made decent budget hardware, and they started Pico-ITX. 3dfx was bought out by Nvidia, so they're still with us in spirit; SLI was a 3dfx technology. As you pointed out, PowerVR isn't in the PC market anymore, but they're still doing OK in ARM.

I think the real challenge comes down to patents, cost, and manufacturing. Anybody new who got even slightly close to encroaching on AMD's or Nvidia's territory (maybe Intel's too) would get bought out. It would definitely be nice to see some more competition from someone new.
So... I can expect to never get a video card upgrade at this rate. Nice. I guess I'll just play League of Legends and old emulated games, since they require squat for GPU power. If I save $50 a day now, maybe I can afford a GTX 3080 Ti when it's out in 2023. Thanks, miners.
RX failure is a better name.
Currently it seems the card just might cost €1000 here, which would make it more expensive than a 1080 Ti. Getting a 1080 Ti and a G-Sync display might actually be cheaper then. That is, unless that €1000 card is somehow actually faster and AMD has been sandbagging epically, which I doubt.
The HardOCP test seemed like it would be interesting, but it was actually far from it. Doom was a poor choice imo; both cards should be more than capable of easily maintaining 100 fps, so this tells us nothing. It is pointing to Vega being very expensive, though.
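The objection is easy to see with a toy sketch: once both cards are capped at the display's 100 Hz refresh, any raw-performance gap disappears from what the viewer sees. The "cards" and their raw frame rates below are invented for illustration, not benchmark results:

```python
# Toy illustration: a 100 Hz refresh cap hides the raw-performance gap.
# Both cards and their raw frame rates are hypothetical numbers.
def displayed_fps(raw_fps, refresh_hz=100):
    """Frame rate the viewer effectively sees on a refresh-capped display."""
    return min(raw_fps, refresh_hz)

card_a_raw = 140  # hypothetical faster card
card_b_raw = 115  # hypothetical slower card, still above the cap

print(displayed_fps(card_a_raw))  # 100
print(displayed_fps(card_b_raw))  # 100 - indistinguishable in a blind test
```

As long as both raw frame rates sit above the refresh rate, the blind test is guaranteed to come out a tie regardless of which card is actually faster.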
Why? The sole purpose is to prove that people's expectations are skewed. AMD's point (or at least what they're gambling on) is that despite the numbers, you're not going to notice a difference, so why not go with them and pay less? Blind tests are the best tests: they remove bias. In blind tests, Pepsi wins over Coke. Cheap wine wins over expensive. More qualified people get the job regardless of their race, sex, ethnicity, or religion.
Sorry for being late to the party, but I have to disagree with this. This was a crap blind test, because there wasn't a direct comparison of the two items in question. They added extra variables in the form of the G-Sync and FreeSync monitors (whose whole function is to hide any dropped frames). They were purposefully quiet about which monitors were chosen, definitely to hide the real value of Vega, and scuttlebutt also says they artificially increased the price gap between the monitors by choosing a higher-tier monitor for Nvidia.

Imagine a Pepsi vs. Coke blind test where each is drunk after eating a ghost chili. Again there's no difference, but only a small percentage of people would be interested in comparing the flavours when combined with a super-hot chili. Most people just want to know if they're getting a good flavour by itself, or if the cheaper version is inferior. For a meaningful blind test, put Vega on a non-adaptive-sync monitor vs. a GTX 1080 on a non-adaptive-sync monitor (preferably the same damn monitor) and then ask which is better. Why didn't they do this?

AMD's marketing really turns me off them. I love what they're offering on the CPU front, but I still think they advertised Ryzen like gits (comparing Ryzen's single-threaded performance to Intel's HEDT line, and gaming + streaming performance to Intel's consumer line; why not do straight-up gaming and say "look, for a good chunk of change less, we offer 90% of an i7/i5"? Because they think they'll get more sales by not being upfront). I think they're purposefully doing the same thing in avoiding a Vega vs. 1080 comparison, because Vega isn't better in performance vs. performance or performance/cost (why all the shenanigans with G-Sync/FreeSync if it were?). That's the expectation I'm taking into launch. The only optimism I have is for a gradual increase in performance through driver improvements (the same way a launch RX 480 was closer to a 970 than a 980).
The HardOCP test seemed like it would be interesting, but it was actually far from it. Doom was a poor choice imo; both cards should be more than capable of easily maintaining 100 fps, so this tells us nothing. It is pointing to Vega being very expensive, though.
Pretty sure AMD told HardOCP how to conduct the test and which game to use. The only place where they deviated was the choice of graphics card: AMD wanted them to use a 1080, and they used a 1080 Ti.