Intel's discrete graphics cards will not start at $200

So is this a real attempt at getting into competitive discrete graphics or are they just dipping their toes in the water again?
Will it be on 14nm? 😀
If it doesn't start at $200, where does it start? At $400 like the competition? This generation we are f*cked with these prices.
Undying:

If it doesn't start at $200, where does it start? At $400 like the competition? This generation we are f*cked with these prices.
Maybe Intel will leapfrog Nvidia's prices.
Undying:

If it doesn't start at $200, where does it start? At $400 like the competition? This generation we are f*cked with these prices.
Pretty sure the article says $100 and up...
Petr V:

Will it be on 14nm? 😀
14nm+++++++++*
Aura89:

Pretty sure the article says $100 and up...
They want to hit every segment, but then again, where does it start? Just like with Nvidia, first will come the expensive showcase products, then the $100 peasant cards. It's Intel, after all...
Undying:

They want to hit every segment, but then again, where does it start? Just like with Nvidia, first will come the expensive showcase products, then the $100 peasant cards. It's Intel, after all...
"peasant cards. Its Intel afterall.." LMAO.
Oh well, price is relative; it depends on what they will offer.
Overall, the more competition, the better. But there's a catch: does Intel already have a discrete graphics architecture ready for production and manufacturing? Is this what Raja has been concocting until now? If that's the case, let's see what they're offering.
cryohellinc:

14nm+++++++++*
DAMMIT! You beat me to it.
There goes my TPU comment posted on Friday: "As an Intel product, I can hardly believe this at this point, but hey, go along and we will see, good luck there."
Kind of an odd statement for him to make. He didn't really say anything about what we should expect; instead, he basically just listed the markets that will rake in the most cash for Intel. I guess if you're an investor, what he said is very appealing, but for everyone else it's like "we don't care how much money you make, we want something competitive". Anyway, I'm sure Intel learned from their previous mistakes. Intel's near-infinite funding and Raja's experience with GPUs will likely make this GPU competitive. Though, since this is Intel's first serious GPU (one that is actually a GPU and not just a bunch of Atoms "bonded" together), I think it'll be best to wait for the 2nd generation, especially considering the drivers will be very immature.
Petr V:

Will it be on 14nm? 😀
Would be pretty good if it is, honestly. Intel's 14nm has extremely good yields and great performance, and its density is roughly equivalent to TSMC's 10nm process. Whether or not the GPU will be good is another story entirely. Intel has a lot of driver catch-up to do.
Denial:

Would be pretty good if it is, honestly. Intel's 14nm has extremely good yields and great performance, and its density is roughly equivalent to TSMC's 10nm process. Whether or not the GPU will be good is another story entirely. Intel has a lot of driver catch-up to do.
I'd actually say this is the best opportunity for them to use 10nm. I'm still certain Intel was ready with 10nm 2 years ago, but couldn't use it because it wasn't able to achieve the same clock speeds as 14nm, and didn't offer enough of a performance boost to make up for that. But these GPUs are a clean slate with nothing (from Intel) to compare to. They don't need to push the limits with clock speed either. I assume they know where their node falls short, and could've worked the GPU's architecture around those problems. But yeah, I don't think it'd be a bad thing if they used 14nm.
I don't really understand why some people assume Intel GPUs will be bad just because it's Intel doing it. Intel isn't a person, and Raja, the guy designing these new GPUs (as I understand it), already designed Vega for AMD. I mean, I'm not saying Vega was awesome or that Intel GPUs will be the second coming of Christ, but it's pretty naive to think that such an engineering team with Intel funding can't make good/decent hardware. Like Denial pointed out above, it's much more likely that Intel's drivers will be a problem, rather than the architecture/hardware itself. Still, it's pretty clear that they want to enter the GPU battle along with NVIDIA and AMD, and these statements of "cost $100, all the way to Data Center-class graphics" and "2-3 years - to have the full stack" just make it a matter of when.
Ricardo:

I don't really understand why some people assume Intel GPUs will be bad just because it's Intel doing it. Intel isn't a person, and Raja, the guy designing these new GPUs (as I understand it), already designed Vega for AMD. I mean, I'm not saying Vega was awesome or that Intel GPUs will be the second coming of Christ, but it's pretty naive to think that such an engineering team with Intel funding can't make good/decent hardware.
People still cling to impressions from 10 years ago. Intel GPUs have actually been very good at what they're intended to do for a long while. That being said, they suck at gaming (and even then, not as much as people think) because they're not built for gaming. They're great for things like driving multiple high-res monitors, low-power video decoding, certain workstation tasks, and OpenCL.
Even if Intel puts out a $200 GPU in 2-3 years, it doesn't mean anyone will want it... ;)