Intel's 14th Generation Meteor Lake Processors: Emphasizing AI and Energy Efficiency

https://forums.guru3d.com/data/avatars/m/258/258664.jpg
So... that VPU... are they putting it into all them CPUs now?
data/avatar/default/avatar39.webp
fantaskarsef:

So... that VPU... are they putting it into all them CPUs now?
They at least say "All SKUs", so all Meteor Lake CPUs will have it. Whether it continues into the future depends on adoption, I guess, but at least this upcoming generation should be consistent.
https://forums.guru3d.com/data/avatars/m/50/50906.jpg
I'm assuming that this is Intel's response to AMD's Ryzen AI (Xilinx) on the Ryzen 7040 family (and presumably most future products).
https://forums.guru3d.com/data/avatars/m/175/175902.jpg
heffeque:

I'm assuming that this is Intel's response to AMD's Ryzen AI (Xilinx) on the Ryzen 7040 family (and presumably most future products).
Not only AMD; there are also Qualcomm, Apple... and more. Nearly everyone is on the train 🙂.
https://forums.guru3d.com/data/avatars/m/248/248291.jpg
At this moment, every company is fighting over Nvidia's leftovers in the AI market.
https://forums.guru3d.com/data/avatars/m/50/50906.jpg
rl66:

Not only AMD; there are also Qualcomm, Apple... and more. Nearly everyone is on the train 🙂.
Well, Qualcomm and Apple are not in the x86 industry...
https://forums.guru3d.com/data/avatars/m/258/258664.jpg
nevcairiel:

They at least say "All SKUs", so all Meteor Lake CPUs will have it. Whether it continues into the future depends on adoption, I guess, but at least this upcoming generation should be consistent.
Good, thanks, I guess I missed that one. I still have to find out why I should want one of those VPUs, but at least I know you can't buy a CPU without it.
https://forums.guru3d.com/data/avatars/m/50/50906.jpg
fantaskarsef:

Good, thanks, I guess I missed that one. I still have to find out why I should want one of those VPUs, but at least I know you can't buy a CPU without it.
More and more programs will use AI. Examples are programs that use your webcam (doing a better job of blurring the background, etc.) or your microphone (doing a better job of removing background noise)... Also the obvious Photoshop, video editing, etc., and probably more programs that we can't imagine right now could use it, but will.
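Those webcam features boil down to segmentation plus selective blur: a model labels each pixel as person or background, the background gets blurred, and the two are composited. A toy sketch in plain Python (the hard-coded mask stands in for what an ML model would produce; this is not any vendor's actual code):

```python
def box_blur(img):
    """3x3 box blur on a grayscale image (list of lists), clamping at edges."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total, n = 0, 0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        total += img[ny][nx]
                        n += 1
            out[y][x] = total // n
    return out

def blur_background(img, mask):
    """Keep 'person' pixels (mask == 1), replace the rest with blurred pixels."""
    blurred = box_blur(img)
    return [[img[y][x] if mask[y][x] else blurred[y][x]
             for x in range(len(img[0]))] for y in range(len(img))]
```

A real implementation runs the segmentation model and blur on the NPU/GPU per frame; the compositing step is exactly this per-pixel select.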
https://forums.guru3d.com/data/avatars/m/216/216349.jpg
I'm getting so tired of hearing that everything has or is going to use AI...
https://forums.guru3d.com/data/avatars/m/50/50906.jpg
Well... get used to it. Years ago it was "cloud this, cloud that" and it has now been normalized. Before that it was "internet 2.0 this, internet 2.0 that" and now it's a given. AI is a breakthrough that is in its infancy (and thus is currently advancing rapidly) and will only get better, more powerful and gain more use cases as time advances.
https://forums.guru3d.com/data/avatars/m/258/258664.jpg
heffeque:

More and more programs will use AI. Examples are programs that use your webcam (doing a better job of blurring the background, etc.) or your microphone (doing a better job of removing background noise)... Also the obvious Photoshop, video editing, etc., and probably more programs that we can't imagine right now could use it, but will.
Webcam? Don't have one. Microphone? I'm well understood right now (as in, good mic with little background noise), so I don't need that. I do no Photoshop or video editing... so I still don't need one, please thanks mkay? I know it's just my use case, but I don't see the sense in paying for a VPU, cooling a VPU, having one in my power budget while gaming, etc.
heffeque:

Well... get used to it. Years ago it was "cloud this, cloud that" and it has now been normalized. Before that it was "internet 2.0 this, internet 2.0 that" and now it's a given. AI is a breakthrough that is in its infancy (and thus is currently advancing rapidly) and will only get better, more powerful and gain more use cases as time advances.
Yeah, I believe you are right. For better or worse, with little to no real gain in what you do with all that stuff, they will push it everywhere they can. I still don't upload my pictures to the cloud and don't need cloud space either. Thanks, I can afford my own hard drives. I don't need my own webpage, as much as they wanted me to believe it during the MySpace days. And I will live long enough to be the grandpa telling you "I told you we don't need it" just before the nukes from Terminator burn us down.
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
I too find it a little annoying how buzzy of a term AI has become, but, it makes sense considering how rapidly it is growing. It'd be nice if it were only used where machine learning was involved at some point. What I don't get though is why the average person would want a dedicated processor for it. I think it makes sense in the server space, but I don't see why a home user couldn't just do FP16 on a GPU (though from what I recall, on Nvidia, there's no performance advantage to doing that).
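For what it's worth, you can poke at FP16 behaviour without any GPU at all: Python's struct module has supported the IEEE 754 half-precision format character 'e' since 3.6. A quick sketch of the rounding involved:

```python
import struct

def to_fp16(x):
    """Round-trip a float through IEEE 754 half precision (FP16)."""
    return struct.unpack('e', struct.pack('e', x))[0]

# FP16 has a 10-bit mantissa: roughly 3 decimal digits of precision,
# which is why it's usually adequate for neural-net inference weights.
print(to_fp16(0.1))      # 0.0999755859375
print(to_fp16(65504.0))  # 65504.0 — the largest finite FP16 value
```

The narrow range (max 65504) and coarse precision are exactly what dedicated NPU hardware is built to tolerate cheaply.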
https://forums.guru3d.com/data/avatars/m/224/224952.jpg
I've been waiting for Meteor Lake as my next upgrade, to replace a 10700K for gaming and my 6700K for browsing, light gaming, security etc. But the 6700K only uses 57 to 60W idle with a 1080 Ti and is on practically 24/7. I bet a Meteor Lake system won't be able to match that, especially with a modern GPU. When on 24/7, every 10W difference is approx £25 a year in electricity; it adds up, to the point that the Titanium-efficiency PSU probably paid for itself vs the Gold standard. But the PSU is still only around 90% efficient at 75W (10% of the 750W PSU). There needs to be a higher efficiency standard than Titanium covering 0 to 9% of max power use, given the size of PSUs we now have to fit and the minimum loads PCs achieve. I eagerly await Meteor Lake power use tests of motherboards and CPUs.
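Those figures are easy to sanity-check; a rough sketch (the ~£0.28/kWh tariff is my assumption, not a quoted rate):

```python
# "Every 10W difference is approx £25 a year" for a 24/7 machine:
watts = 10
kwh_per_year = watts * 24 * 365 / 1000   # 87.6 kWh
tariff = 0.28                            # assumed GBP per kWh
print(round(kwh_per_year * tariff, 2))   # 24.53

# At 90% PSU efficiency, a 75W DC load draws noticeably more at the wall:
print(round(75 / 0.90, 1))               # 83.3 W
```

So the £25/10W rule of thumb holds at that tariff, and the ~8W of conversion loss at idle is why low-load efficiency standards matter.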
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
Mufflore:

But the 6700K only uses 57 to 60W idle with a 1080 Ti and is on practically 24/7. I bet a Meteor Lake system won't be able to match that, especially with a modern GPU.
Why do you say that? While 60W for a 1080 Ti and a K-series CPU is decent, it's still pretty high for idle if efficiency matters to you. Meteor Lake may have a lot more bells and whistles, but it's much more refined.
https://forums.guru3d.com/data/avatars/m/282/282473.jpg
schmidtbag:

Why do you say that? While 60W for a 1080 Ti and a K-series CPU is decent, it's still pretty high for idle if efficiency matters to you. Meteor Lake may have a lot more bells and whistles, but it's much more refined.
It is quite high; a 10700F + 6800 use about 10-20W altogether. I'd rather not have it run at 60W when the PC is on almost 24/7.
https://forums.guru3d.com/data/avatars/m/56/56686.jpg
How is your 6700K + 1080 Ti using that much wattage at idle? My 6700K + 1070 Ti barely uses 30 watts idle combined, and that's with Firefox open on these forums. Unless HWiNFO64 and RTSS aren't showing me the full truth, normally both are under 10 watts idle with nothing running other than the Windows desktop. I'm interested in this energy efficiency: can they do a 95W TDP CPU again that actually runs around that at max load? Never mind PL2; as it stands, when and if I build a new PC, PL2 will be locked to 10-20% over PL1 or disabled outright.
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
cucaulay malkin:

It is quite high; a 10700F + 6800 use about 10-20W altogether. I'd rather not have it run at 60W when the PC is on almost 24/7.
That's why I prefer ARM for my home server. I don't know if the entire system ever pulled more than 15W from the wall. Granted, it's not very fast, though the platform is several years old so I can get more than double the performance within the same power envelope. However, Mufflore's PC is used as a PC and not a server, so of course I'm not exactly comparing apples to apples here. In any case, it's not hard to make an x86 system sip power, even under load.
https://forums.guru3d.com/data/avatars/m/56/56686.jpg
I've seen other instances where idle power draw is wrong. My uncle has a 1050 Ti in an i5-8400 (that's 65 watts but idles around 10 watts too); the card should be pulling about 10 watts at idle, but in his system it draws 45 watts. I put the card in another system and it idled at around 10 watts, and as far as I can tell both systems' power-saving settings are set up the same. So one build pulling a certain idle wattage doesn't mean a pretty much identical system will pull the same. My whole idea behind new PC builds is: if consoles do this (x) with 300 watts or less, I want to do 2x that, preferably under 400 watts. That's why I like CPUs that max out at 95 watts and prefer GPUs under 200 watts maxed out (ideally around 150 watts). It was perfectly doable out of the box without messing with the BIOS or settings; at least it was before PL2 became a thing.
https://forums.guru3d.com/data/avatars/m/216/216349.jpg
My system also uses around 60W at idle because my card runs its memory at full speed to support the high refresh rate (144Hz) of my screen. The card alone burns 45W just showing the desktop background...