Intel Architecture day 2021: Xe and XeSS

The core count discussion is always chicken and egg. There are few games that make use of more cores because most people don't have CPUs with many cores, but at the same time most people don't have more cores because there are no games that use them. Either way, regarding CPUs, AMD still seems the more interesting option, at least to me. I'm far more interested in Intel's GPUs, which could really shake the market up, although their claims about their version of upscaling seem ridiculous.
Backstabak:

The core count discussion is always like chicken and egg.
I get what you are saying but we all know the egg came first, as the chicken evolved from another egg-laying lifeform.
Loobyluggs:

It’s not about supporting games, it’s about making them.
Where does it say that there will be no CPUs from Intel with more cores/threads? Depending on which part of game development you work in, you might consider something meant for professional use instead of an average desktop CPU.
Mineria:

Where does it say that there will be no CPUs from Intel with more cores/threads? Depending on which part of game development you work in, you might consider something meant for professional use instead of an average desktop CPU.
I already have; the cost versus performance is tempting, but the net time saved is not significant enough to warrant it when other items can be purchased instead, like a monitor for HDR color grading. Desktop CPUs from AMD have better multi-threaded performance for obvious reasons, and Intel has not yet met the curve on this. This is not me debating anything, because this is not a debate; this is me being consistently surprised with Intel as a company over the last 5 or so years. We need another 'Sandy Bridge'-equivalent performance boost from Intel, and I really do not think we are going to get it yet; they have been riding on the coattails of that design for (nearly) ten years. I would argue they have burnt through their credibility as a desktop CPU provider. In short, their CPU designs are dated.
My focus is on making money. Who cares about cores? Fanboys or no fanboys. And then all that guessing going on here. Did well with AMD; now cashed out. Wall Street loves Intel: their relationship with Microsoft, their strong financial position, and their independent chip manufacturing. Looking for some more "easy stock cash" by Intel's next 4th quarter report. Money talks, BS walks!
I hope Intel succeeds; competition only benefits consumers. I look forward to seeing Intel Thread Director Technology in practice, and to whether it will also end up benefiting AMD processors, especially their larger models.
Loobyluggs:

I would argue they have burnt through their credibility as a desktop CPU provider. In short, their CPU designs are dated.
Fully depends on how the new design plays out. Alder Lake comes with up to 8+8 cores; not quite sure how that works, but their main concern is power consumption, guess we can thank California for that. As a regular desktop CPU for everyday tasks and gaming, Intel is doing fine; for workstations, I guess they put more effort towards Xeon instead. I can understand that it can be frustrating for small or part-time game developers, but hey, you can always use an AMD CPU that costs more; at least here Intel has been pretty aggressive with their prices, the tables have turned. Intel's CPU design isn't exactly dated; I can agree that they are currently outperformed in some benchmarks, but that is about it.
Mineria:

Intel's CPU design isn't exactly dated; I can agree that they are currently outperformed in some benchmarks, but that is about it.
Appreciate the comment, but c'mon... 5nm chip designs versus 14nm? That tells you the design of the Intel chips is dated. And wtf is 'Intel 7'? Some theoretical fab process to pacify investors? Need more detail on this, but the last five years have been wasted by Intel, and they have been caught up to and surpassed by AMD for desktops.
Loobyluggs:

Appreciate the comment, but c'mon... 5nm chip designs versus 14nm? That tells you the design of the Intel chips is dated. And wtf is 'Intel 7'? Some theoretical fab process to pacify investors? Need more detail on this, but the last five years have been wasted by Intel, and they have been caught up to and surpassed by AMD for desktops.
It's not all about gate sizes though. Apart from that, Intel's 10nm yield had its issues, so they used the time to optimize and stabilize their 14nm design; Alder Lake comes with a 10nm design. Besides that, Intel also spent time shrinking SRAM cell size. So it will be interesting to see what Alder Lake really brings when it comes to everyday use. AMD did nothing to get down to 7nm and 5nm, btw; TSMC did, and I suppose Samsung has 7nm processors in production as well. Still, those are just marketing numbers.
Mineria:

It's not all about gate sizes though. Apart from that, Intel's 10nm yield had its issues, so they used the time to optimize and stabilize their 14nm design; Alder Lake comes with a 10nm design. Besides that, Intel also spent time shrinking SRAM cell size. So it will be interesting to see what Alder Lake really brings when it comes to everyday use. AMD did nothing to get down to 7nm and 5nm, btw; TSMC did, and I suppose Samsung has 7nm processors in production as well. Still, those are just marketing numbers.
But, surely that is not the general consensus? AMD's design can be shrunk, Intel's cannot.
have removed 23 tests
[screenshots: Opera Snapshot_2021-08-22_161208_wccftech.com.png, Screenshot 2021-08-22 142238.png]
The Ryzen used DDR4-3200 and the 12900K DDR5-4800, on Win10 Pro.
Loobyluggs:

But, surely that is not the general consensus? AMD's design can be shrunk, Intel's cannot.
[youtube=1kQUXpZpLXI]
DeskStar:

One needs to play a game under the Vulkan API. In Wolfenstein II: The New Colossus I have no problem utilizing 40 to almost 60% of a 24-core/48-thread 3960X. It's all down to the graphics engine/API. Even vanilla Minecraft can use 55% to 60% of my 3960X.
Never said games don't use them, just that it's rarer for games to use 8+ cores, or in some cases they use a few cores fully and then a load of cores only a small amount. Also, just because a game is using them does not mean there is a real-world benefit to it, hence why I mentioned that Hardware Unboxed video cutting cores off the 11900K: there were very small drops in performance, or in some cases no drops at all, even when cutting down to 6 cores from 10.
Horus-Anhur:

https://www.3dcenter.org/news/intel-stellt-das-alchemist-grafikchip-design-der-xe-hpg-architektur-vor
https://pbs.twimg.com/media/E9UngzrVIAEIqf-?format=png&name=medium
BTW, we need a section for Intel GPUs and Drivers in the Graphics Cards part of the forum.
~6800 with tensor cores for image reconstruction would be sweeeet
kanenas:

have removed 23 tests
[screenshots: Opera Snapshot_2021-08-22_161208_wccftech.com.png, Screenshot 2021-08-22 142238.png]
The Ryzen used DDR4-3200 and the 12900K DDR5-4800, on Win10 Pro.
DDR5-4800 is the slowest DDR5 can be, and it's slow as molasses since it's 4800 CL40. DDR4-3200 has 25% higher read and 20% higher write than DDR5-4800 CL40. Early DDR5 is gonna suck. I have found a kit of Ripjaws V 4600 CL19 at 119 EUR, new. How long before DDR5 matches that in price/perf? Probably two generations.
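The CAS latency comparison can be put in absolute terms: first-word latency in nanoseconds is CL cycles divided by the memory clock, where the clock in MHz is half the MT/s transfer rate. A minimal sketch, using the CL values quoted above (the helper name is just for illustration):

```python
# First-word latency: CL cycles at a clock of (transfer_rate / 2) MHz,
# which simplifies to latency_ns = cas_latency * 2000 / transfer_rate_mts
def first_word_latency_ns(cas_latency, transfer_rate_mts):
    return cas_latency * 2000 / transfer_rate_mts

# DDR4-3200 CL19 vs. DDR5-4800 CL40 (JEDEC baseline)
ddr4 = first_word_latency_ns(19, 3200)  # ~11.9 ns
ddr5 = first_word_latency_ns(40, 4800)  # ~16.7 ns
print(f"DDR4-3200 CL19: {ddr4:.1f} ns")
print(f"DDR5-4800 CL40: {ddr5:.1f} ns")
```

So despite the 50% higher transfer rate, baseline DDR5-4800 CL40 has roughly 40% worse first-word latency than a good DDR4-3200 CL19 kit, which is consistent with the point about early DDR5.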
cucaulay malkin:

~6800 with tensor cores for image reconstruction would be sweeeet
Probably in RDNA3.
So the rumors that DG2-512 will only be competing with the 3070 and 6700 XT were true?
Undying:

So the rumors that DG2-512 will only be competing with the 3070 and 6700 XT were true?
Even if so, that's a good start for a first-time discrete GPU release from the company. Their follow-up to that should be the really interesting thing to see.
alanm:

Even if so, that's a good start for a first-time discrete GPU release from the company. Their follow-up to that should be the really interesting thing to see.
I agree with that, but considering it's still six months away, I'm afraid their competitors will already be teasing or even releasing new cards.