Intel Teams Up with ASUS and Colorful for First Dedicated Iris Xe (DG1) Graphics Cards
Intel has announced that it is entering the discrete GPU market in collaboration with add-in board (AIB) partners, the first two being ASUS and Colorful. The initial dedicated graphics cards are intended for entry-level users and small businesses. The cards are OEM-only and cannot be purchased in stores.
Based on the Iris Xe architecture, the 10 nm DG1 GPU has 80 Execution Units (640 shader units). Intel has not shared specifications such as clock speed for this non-Max variant; the mobile Iris Xe MAX is clocked at 1650 MHz. The cards come with 4 GB of LPDDR4X memory on a 128-bit bus interface. They will not be available to the public; instead, they are OEM parts that will end up in pre-built systems.
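For a rough sense of scale, the raw numbers can be worked out from the published specs. The Python sketch below is a back-of-the-envelope calculation only; the desktop card's clock is unpublished, so the mobile Iris Xe MAX's 1650 MHz is borrowed purely as an assumption:

```python
# Back-of-the-envelope DG1 math from the published specs.
eus = 80                 # Execution Units
alus_per_eu = 8          # FP32 ALUs per Xe-LP EU ("shader units")
clock_ghz = 1.65         # ASSUMED: borrowed from the mobile Iris Xe MAX;
                         # Intel has not published the desktop clock.

shader_units = eus * alus_per_eu               # 80 * 8 = 640
# Each ALU can retire one fused multiply-add (2 FLOPs) per cycle.
peak_fp32_tflops = shader_units * 2 * clock_ghz / 1000

print(f"{shader_units} shader units, ~{peak_fp32_tflops:.2f} TFLOPS FP32")
# -> 640 shader units, ~2.11 TFLOPS FP32 (at the assumed clock)
```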
-- Intel -- Intel co-designed and partnered with two ecosystem partners, including ASUS, to launch the Intel® Iris® Xe discrete desktop graphics cards (code-named “DG1”) in systems targeted at mainstream users and small- and medium-sized businesses. The cards are sold to system integrators, who will offer Iris Xe discrete graphics as part of pre-built systems.
Following the launch of Intel® Iris® Xe MAX for notebooks, Intel’s first Xe-based discrete graphics processing unit, Intel and its partners saw the opportunity to better serve the high-volume, value-desktop market with improved graphics, display, and media acceleration capabilities.
The new cards offer a compelling upgrade over existing options in this market segment. They feature three display outputs; hardware video decode and encode acceleration, including AV1 decode support; Adaptive-Sync; DisplayHDR support; and artificial-intelligence capabilities thanks to DP4a deep-learning inference acceleration. The Iris Xe discrete graphics cards come with 80 execution units and 4 gigabytes of video memory.
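DP4a itself is a simple primitive: a four-element dot product of 8-bit integers accumulated into a 32-bit result, executed in one instruction. The Python sketch below only illustrates the semantics of that operation, not Intel's implementation:

```python
def dp4a(acc: int, a: list[int], b: list[int]) -> int:
    """Four-way int8 dot product with 32-bit accumulate: the operation
    a DP4a instruction performs in a single instruction on supporting hardware."""
    assert len(a) == len(b) == 4
    return acc + sum(x * y for x, y in zip(a, b))

# One step of a quantized (int8) neural-network layer, the typical use case:
print(dp4a(10, [1, -2, 3, 4], [5, 6, -7, 8]))  # 10 + (5 - 12 - 21 + 32) = 14
```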
Senior Member
Posts: 3941
Joined: 2009-09-08
AIB for low-end entry + OEM only...
I wasn't expecting an RTX 3090 or 6900 XT killer of course, but I feel this is not a great start for Intel in the GPU segment...
Entry level, in other words: "it just sucks, and we want to make at least some money on it"...
This was supposed to be their cutting-edge product and a competitor to AMD/NV?
A GPU that is barely faster than the PS4 from 10 years ago?
This is just their entry-level GPU, so it's not meant to be very powerful.
And they have already announced bigger and better versions to be released in the future, so we need to wait.
If they can produce something on par with the 3060, I think that's going to be a very good start for them.
Senior Member
Posts: 19505
Joined: 2008-08-28
Nvidia and AMD always start with the high end. I expected a beefy GPU showcase from Intel, not an OEM GT 1030 competitor, eh.
Senior Member
Posts: 6284
Joined: 2010-10-17
How good or bad do we think this is going to be? Will it be overpriced like most Intel products?
Senior Member
Posts: 565
Joined: 2016-05-24
For those who believe that entry level is a good start: wait for the TDP. If it is quite high, as I expect, around 50-75 W, it means the card is forced to be entry level because it sucks and will not scale up well with more power. I know that even mid-range cards are 200 W these days, but I guess that's just a thing of the present generation, and 120-160 W will be normal again... And even if you multiplied this card's performance by 2 or 3, it wouldn't be comparable. It just sucks.
Well, there was no Cyberpunk-capable driver at launch, or even a week or two after that. It means Intel knows it sucks and isn't even investing in the drivers...
Let's call it an educated guess: Intel planned something much better, and now they are in damage-control mode...
Senior Member
Posts: 2979
Joined: 2013-03-10
Of course this won't be a flagship card; that would be too risky. They are taking the right approach.
This card effectively exists to test the manufacturing process and driver stack, to ensure everything goes well and to iron out any bugs.
Intel already sold a supercomputer to the US government a while ago, requiring Intel CPUs plus Intel GPUs. Obviously that computer is still waiting to be built, yet Intel needs to deliver on schedule. That's where their top priority lies, not in gaming GPUs. Those will come eventually.