Intel Arc A770 Now Outperforms RTX 3060 According to New Intel Slides

https://forums.guru3d.com/data/avatars/m/175/175902.jpg
Outperform... in performance per dollar. So despite Intel really having something good in hand with Arc, and despite the huge effort they have put into their drivers (the weak point of these cards)... it reads more like "let's run a marketing campaign showing hand-picked results where our card beats the competition" than anything else.
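The performance-per-dollar framing above is just arithmetic: an average-FPS figure divided by street price. A minimal sketch, where the FPS and price numbers are made-up placeholders and not taken from Intel's slides:

```python
# Hypothetical average-FPS and street-price figures (placeholders only,
# not from Intel's slides) to illustrate the perf-per-dollar metric.
cards = {
    "Arc A770 16GB": {"avg_fps": 100.0, "price_usd": 350.0},
    "RTX 3060 12GB": {"avg_fps": 95.0, "price_usd": 400.0},
}

def fps_per_dollar(avg_fps: float, price_usd: float) -> float:
    """Performance per dollar: average FPS divided by price."""
    return avg_fps / price_usd

for name, c in cards.items():
    print(f"{name}: {fps_per_dollar(c['avg_fps'], c['price_usd']):.3f} FPS/$")
```

The point of the metric is that a slower card can still "win" if its price drops faster than its frame rate does, which is exactly the comparison the slides lean on.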
https://forums.guru3d.com/data/avatars/m/178/178348.jpg
Really just goes to show how "half baked" these cards were on release and that they needed a bit more dev time. Personally, I'd still avoid them until, first, we're sure Intel will continue to support these cards (there have been mutters they were going to give up) and, second, we see how the next versions perform.
https://forums.guru3d.com/data/avatars/m/198/198862.jpg
You know what's interesting? It will outperform and outlast even the 3070.
https://forums.guru3d.com/data/avatars/m/243/243189.jpg
Stairmand:

Really just goes to show how "half baked" these cards were on release and that they needed a bit more dev time. Personally, I'd still avoid them until, first, we're sure Intel will continue to support these cards (there have been mutters they were going to give up) and, second, we see how the next versions perform.
It is better for consumers, though. Priced as cheap as or cheaper than a 3060, or somewhere between a 6600 XT and a 6700, you get performance that keeps increasing to some undefined point, likely ending up ahead of the others at its price point, with loads of VRAM for the money too. It rewards early adopters and those buying now at a discount, assuming Intel doesn't suddenly cancel development like they sometimes do with new projects.
https://forums.guru3d.com/data/avatars/m/273/273754.jpg
Stairmand:

Really just goes to show how "half baked" these cards were on release and that they needed a bit more dev time. Personally, I'd still avoid them until, first, we're sure Intel will continue to support these cards (there have been mutters they were going to give up) and, second, we see how the next versions perform.
The cards weren't half-baked. The drivers were the issue. The drivers should, at the very least, have had far fewer basic operational bugs. Otherwise, they were fine for Intel's first real attempt at a graphics card. But over the last few months they improved a lot, and it shows how committed they are to providing a real GPU with support. I think we're at a point where I would legitimately start to recommend the cards.
https://forums.guru3d.com/data/avatars/m/54/54823.jpg
Dead Island 2 perf leak
https://forums.guru3d.com/data/avatars/m/239/239175.jpg
I'll sell my gaming PC and replace it with slides then.
data/avatar/default/avatar19.webp
The card is really coming along. I “downgraded” from a 3080, simply because I’m tired of Nvidia’s bullshit and I want to support a third competitor. Here’s a video (not mine) that gives a current look at performance. [youtube=l59xt0clbpE]
https://forums.guru3d.com/data/avatars/m/54/54823.jpg
Greggy_D:

The card is really coming along. I “downgraded” from a 3080, simply because I’m tired of Nvidia’s bullshit and I want to support a third competitor. Here’s a video (not mine) that gives a current look at performance. [youtube=l59xt0clbpE]
On the bright side, you won't run into VRAM limits.
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
Undying:

You know what's interesting? It will outperform and outlast even the 3070.
In theory it probably could, at least the 16GB version. The tricky part is whether Intel will be willing to invest further into it. Intel has a bad tendency to prematurely abandon products. This one in particular has already burned a lot of money, so there is an incentive for Intel to cut their losses.
data/avatar/default/avatar40.webp
schmidtbag:

In theory it probably could, at least the 16GB version. The tricky part is whether Intel will be willing to invest further into it. Intel has a bad tendency to prematurely abandon products. This one in particular has already burned a lot of money, so there is an incentive for Intel to cut their losses.
No, he is on the money. It has 16 GB of VRAM, therefore it will outlast it. It's the new thing in this hobby.
https://forums.guru3d.com/data/avatars/m/271/271560.jpg
schmidtbag:

In theory it probably could, at least the 16GB version. The tricky part is whether Intel will be willing to invest further into it. Intel has a bad tendency to prematurely abandon products. This one in particular has already burned a lot of money, so there is an incentive for Intel to cut their losses.
The difference is that the software Arc team has proven the hardware Arc team right. Intel has been very quick on the trigger, to your point, but the question is the underlying viability of the products that were cancelled. Many would be quick to point out Intel's previous efforts in the GPU space, but there is a huge difference between then and now in the GPU market. Right now GPU sales are a garnish on Intel's plate, nothing serious. However, Intel has regained their focus and is holding on to market share vs. AMD. Selling GPUs is a proven source of knock-on CPU sales and vice versa, as AMD has shown with RDNA 2. What I would say to Intel (:p) is: hire more software engineers and double down on production for the 2nd gen.
data/avatar/default/avatar14.webp
Since Intel is selling these cards at a loss, I can imagine that they don't want them to become a runaway sales success (hence the very limited supply). I think they are using the current crop as demonstrators for what could come with Battlemage. Of course, to make some of their investment back, they will eventually have to sell those with a positive profit margin.
https://forums.guru3d.com/data/avatars/m/283/283018.jpg
Undying:

You know what's interesting? It will outperform and outlast even the 3070.
Indeed, Intel has come a long way from their first venture into the discrete GPU arena. This is all good news for me and my friends in the cornfields of America, the home of shallow pockets. Intel also clearly has the cash to do something real, and so much more, with their newfound GPU venture. It's also good news for bringing the business back to the U.S. and not depending solely on foreign nations and men wearing leather jackets to the party.
https://forums.guru3d.com/data/avatars/m/56/56686.jpg
How is the DX11 performance? I heard the card takes a huge hit there. Also, the card should NOT be run in systems that don't have ReBAR, so it seems it was designed that way, and any system without it will suffer greatly?
https://forums.guru3d.com/data/avatars/m/248/248627.jpg
It's too bad the drivers are what really prevented adoption of these cards. I'll be looking more seriously at the next gen depending on pricing; Nvidia has just straight up lost their minds on pricing and people keep getting suckered into buying them...
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
tunejunky:

The difference is that the software Arc team has proven the hardware Arc team right. Intel has been very quick on the trigger, to your point, but the question is the underlying viability of the products that were cancelled. Many would be quick to point out Intel's previous efforts in the GPU space, but there is a huge difference between then and now in the GPU market. Right now GPU sales are a garnish on Intel's plate, nothing serious. However, Intel has regained their focus and is holding on to market share vs. AMD. Selling GPUs is a proven source of knock-on CPU sales and vice versa, as AMD has shown with RDNA 2. What I would say to Intel (:p) is: hire more software engineers and double down on production for the 2nd gen.
It is true the GPU market is much larger now, but that's mostly because of the compute space. That's an important distinction, because there aren't as many application-specific optimizations to do; it's easier to tap into.
https://forums.guru3d.com/data/avatars/m/239/239175.jpg
schmidtbag:

It is true the GPU market is much larger now, but that's mostly because of the compute space. That's an important distinction, because there aren't as many application-specific optimizations to do; it's easier to tap into.
Everybody hoped governments around the world would ban cryptocurrency. Now we hope they ban AI.
https://forums.guru3d.com/data/avatars/m/268/268248.jpg
Often the card gets a huge boost with DXVK, especially on older games. For example, Arc + DX9 + GTA 4 = stutter fest, while Arc + DXVK + GTA 4 seems to be really smooth and mostly, if not completely, free of stuttering! It would be interesting to see DXVK results mixed in... although that would be a pain in the butt and, admittedly, something I am not willing to experiment with myself.
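For context, DXVK works by shipping drop-in replacement DLLs that translate Direct3D calls to Vulkan, so "using DXVK" with a D3D9 game just means placing its d3d9.dll next to the game executable. A minimal sketch of that step, with stand-in directory names rather than real install paths:

```shell
# Minimal sketch of how DXVK is dropped into a game install.
# The directory names below are illustrative stand-ins, not real paths.
mkdir -p dxvk-release/x32 game-dir
touch dxvk-release/x32/d3d9.dll      # stand-in for DXVK's 32-bit D3D9 DLL
# GTA IV is a 32-bit Direct3D 9 game, so DXVK's x32 d3d9.dll goes next to
# the game executable; DXVK then translates the game's D3D9 calls to Vulkan.
cp dxvk-release/x32/d3d9.dll game-dir/
```

With the real files, that single copy into the game's install folder is essentially all the setup DXVK needs; deleting the DLL reverts the game to the native D3D9 path.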
https://forums.guru3d.com/data/avatars/m/246/246088.jpg
Who'd have thought we would be looking at Intel as our gaming saviours.