Intel DG2-448 Arc GPU spotted: high-perf Arc GPU could be easier to produce than DG2-512

DG2-512 will match the 3080? I thought it was only gonna compare to the 3070. Intel could be improving their drivers. 🙂
Undying:

DG2-512 will match the 3080? I thought it was only gonna compare to the 3070. Intel could be improving their drivers. 🙂
Give it another 2 generations, maybe then yes.
Undying:

DG2-512 will match the 3080? I thought it was only gonna compare to the 3070. Intel could be improving their drivers. 🙂
Intel has some of the largest and best driver teams in the world. I would not doubt they make huge strides optimizing the drivers over the next few months.
Don't see the point of the DG2-128EU; can't the iGPUs or APUs do a similar job? Waste of wafers if you ask me. It's great to see Intel betting big with the DG2-512EU, but my interest is in this new DG2-448: I think it will yield the most interesting GPUs from a price/performance point of view. The DG2-384EU doesn't look too bad considering the amount of memory and all, but only 80W? Is Intel really tuning these cards for efficiency? This could be a big slap at Nvidia for pushing power levels up lately.
what was that card with a 448-bit bus? and why aren't we seeing anything wider than 384-bit these days? wouldn't it make more sense to make the 3080 Ti a 14 GB 448-bit G6 card rather than 384-bit with awfully power-inefficient G6X?
cucaulay malkin:

what was that card with a 448-bit bus? and why aren't we seeing anything wider than 384-bit these days? wouldn't it make more sense to make the 3080 Ti a 14 GB 448-bit G6 card rather than 384-bit with awfully power-inefficient G6X?
cost. it's basically far more expensive to build buses wider than 384-bit.
cucaulay malkin:

what was that card with a 448-bit bus? and why aren't we seeing anything wider than 384-bit these days? wouldn't it make more sense to make the 3080 Ti a 14 GB 448-bit G6 card rather than 384-bit with awfully power-inefficient G6X?
A wider bus has several disadvantages. It needs more traces on the PCB, which complicates design and increases cost. It also complicates the memory controller design and uses more space on the die, leaving less for other units. And it consumes more power inside the die and heats up the chip. I forget the exact scaling, but it's not very efficient to just increase bus width. With the greater speeds of GDDR, more techniques for data compression, and better and bigger caches, wider buses aren't that necessary.
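For a quick back-of-the-envelope check of the 448-bit G6 vs 384-bit G6X question raised above: peak bandwidth is just bus width in bytes times effective data rate. The 14 Gbps and 19 Gbps figures below are typical GDDR6/GDDR6X speeds of that generation, used purely for illustration:

```python
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bytes transferred per cycle times
    effective per-pin data rate in Gbps."""
    return bus_width_bits / 8 * data_rate_gbps

# Hypothetical 448-bit card with 14 Gbps GDDR6
print(bandwidth_gbs(448, 14))  # 784.0 GB/s

# 3080 Ti-style 384-bit bus with 19 Gbps GDDR6X
print(bandwidth_gbs(384, 19))  # 912.0 GB/s
```

So even before counting the PCB and die-area costs, the narrower bus with faster memory comes out ahead on raw bandwidth.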
cucaulay malkin:

what was that card with a 448-bit bus? and why aren't we seeing anything wider than 384-bit these days? wouldn't it make more sense to make the 3080 Ti a 14 GB 448-bit G6 card rather than 384-bit with awfully power-inefficient G6X?
exactly because of power efficiency. a wider bus always draws more than a narrower one, normalized to total bandwidth. p.s. gddr6x is not the issue: https://abload.de/img/micronpower2myjz3.png
16 GB VRAM is the new standard? .... assuming there's no "political" involvement during game development
JamesSneed:

Intel has some of the largest and best driver teams in the world. I would not doubt they make huge strides optimizing the drivers over the next few months.
It seems to me graphics card drivers are the most difficult of all hardware drivers to get working perfectly. It's not like AMD wanted to acquire a bad reputation with its drivers: they are just that difficult to create. At least Intel has needed to make drivers for its iGPUs for many years now, so they have a significant background in the work and aren't starting from nothing.
Horus-Anhur:

It needs more traces on the PCB, which complicates design and increases cost.
I have a feeling the PCB of a video card costs something like a dollar for the manufacturers. Maybe making it more complicated with a wider bus would make it cost 1.5 dollars. Or maybe they are a dollar more if they look fancy and not ugly green. I have never ordered custom PCBs, but from what I have seen in ads, they don't seem to be expensive even if you aren't ordering them by the thousands.
Kaarme:

It seems to me graphics card drivers are the most difficult of all hardware drivers to get working perfectly. It's not like AMD wanted to acquire a bad reputation with its drivers: they are just that difficult to create. At least Intel has needed to make drivers for its iGPUs for many years now, so they have a significant background in the work and aren't starting from nothing.
This is very true. Since the beginning of nVidia, they have made sure they have the best drivers. But it also helps that they have been the leading GPU manufacturer for over 2 decades now, only missing out during the nv30 vs R300 generation. This meant that most game developers prioritized testing and optimization on nVidia. I remember, a few years ago, an interview with a dev about issues his game had with AMD. He said he didn't have the budget and time to do as much on AMD's hardware. An indie dev even said he couldn't test on AMD GPUs, because he only had a couple of nVidia cards and no AMD cards.
Kaarme:

I have a feeling the PCB of a video card costs something like a dollar for the manufacturers. Maybe making it more complicated with a wider bus would make it cost 1.5 dollars. Or maybe they are a dollar more if they look fancy and not ugly green. I have never ordered custom PCBs, but from what I have seen in ads, they don't seem to be expensive even if you aren't ordering them by the thousands.
Like I said, it's one of several disadvantages. And consider that both AMD and nVidia no longer go for such wide buses on gaming GPUs.
Horus-Anhur:

This is very true. Since the beginning of nVidia, they have made sure they have the best drivers. But it also helps that they have been the leading GPU manufacturer for over 2 decades now, only missing out during the nv30 vs R300 generation. This meant that most game developers prioritized testing and optimization on nVidia. I remember, a few years ago, an interview with a dev about issues his game had with AMD. He said he didn't have the budget and time to do as much on AMD's hardware. An indie dev even said he couldn't test on AMD GPUs, because he only had a couple of nVidia cards and no AMD cards. Like I said, it's one of several disadvantages. And consider that both AMD and nVidia no longer go for such wide buses on gaming GPUs.
what do you think will sell better? a game with fsr 1.0 implemented, or one with "rtx on" on the box and rtx-on trailers all over the news and youtube? they have cutting-edge hardware, software, and game developers behind them, and their main competitor amd is fine with getting the leftovers, which they can still sell at unprecedented margins.
cucaulay malkin:

what do you think will sell better? a game with fsr 1.0 implemented, or one with "rtx on" on the box and rtx-on trailers all over the news and youtube?
The interviews I referred to were in the days of Kepler, Maxwell and GCN. A long time ago, in the GPU market. Of course DLSS and RTX sell much better. But in over 2 decades, nVidia never had such a big lead in features. In fact, nVidia has done so well that RTX has become synonymous with ray-tracing. Such is nVidia's dominance. Personally I don't think RT is that important. For me, the image quality doesn't justify the performance loss. So RTX is one of the first things I sacrifice to get better frame rates. The big feature that gives a huge lead to nVidia is DLSS 2.x. Getting a 30-40% performance improvement with little loss in image quality, sometimes even becoming better than native, is just a huge feature. FSR is OK if there's nothing else. But given the choice, I'll always pick DLSS.
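To put rough numbers on where that 30-40% comes from: DLSS Quality mode renders internally at roughly 2/3 of native resolution per axis before upscaling, so the GPU shades well under half the pixels. A minimal sketch (the 2/3 scale factor is the commonly cited Quality-mode value; the actual uplift varies per game):

```python
def internal_resolution(width: int, height: int, scale: float = 2 / 3):
    """Internal render resolution before the upscaler reconstructs to native."""
    return round(width * scale), round(height * scale)

# DLSS Quality at 1440p renders internally at about 1707x960
w, h = internal_resolution(2560, 1440)
shaded_fraction = (w * h) / (2560 * 1440)
print(w, h)                       # 1707 960
print(round(shaded_fraction, 2))  # 0.44 of native pixels shaded
```

Shading ~44% of the pixels is why the uplift can be so large even after the reconstruction pass takes its own slice of frame time.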
Horus-Anhur:

The interviews I referred to were in the days of Kepler, Maxwell and GCN. A long time ago, in the GPU market. Of course DLSS and RTX sell much better. But in over 2 decades, nVidia never had such a big lead in features. In fact, nVidia has done so well that RTX has become synonymous with ray-tracing. Such is nVidia's dominance. Personally I don't think RT is that important. For me, the image quality doesn't justify the performance loss. So RTX is one of the first things I sacrifice to get better frame rates. The big feature that gives a huge lead to nVidia is DLSS 2.x. Getting a 30-40% performance improvement with little loss in image quality, sometimes even becoming better than native, is just a huge feature. FSR is OK if there's nothing else. But given the choice, I'll always pick DLSS.
most people use dlss to recoup the rt performance hit. look at the 2070S doing better with rt + dlss quality than at native with no rt:
[SPOILER]
(attached benchmark screenshots: Guardians of the Galaxy performance charts)
[/SPOILER]
cucaulay malkin:

most people use dlss to recoup the rt performance hit
I use it to get to around 100 fps. If there's room for RT, I might turn on a couple of settings.
Horus-Anhur:

I use it to get to around 100 fps. If there's room for RT, I might turn on a couple of settings.
which rtx + dlss games really require more than 70-80 fps for an extra fluid experience? most of them are single-player.
cucaulay malkin:

which rtx + dlss games really require more than 70-80 fps for an extra fluid experience? most of them are single-player.
I prefer having around 100 fps. And yes, even in single-player games. It's just smoother.
Kaarme:

It seems to me graphics card drivers are the most difficult of all hardware drivers to get working perfectly. It's not like AMD wanted to acquire a bad reputation with its drivers: they are just that difficult to create. At least Intel has needed to make drivers for its iGPUs for many years now, so they have a significant background in the work and aren't starting from nothing. I have a feeling the PCB of a video card costs something like a dollar for the manufacturers. Maybe making it more complicated with a wider bus would make it cost 1.5 dollars. Or maybe they are a dollar more if they look fancy and not ugly green. I have never ordered custom PCBs, but from what I have seen in ads, they don't seem to be expensive even if you aren't ordering them by the thousands.
Yeah, it is very hard. However, I have a lot of confidence in Intel because, like you said, they have a history of doing the drivers for the iGPUs, and they have a ton of manpower to throw at it. This coming from a guy with a Zen profile pic 😉