Intel introduces Adaptive Boost Technology for Core i9-11900K and Core i9-11900KF

Hmm interesting. I hope that this will be looked at with the reviews, quite curious to see if that actually does make a difference in regards to different workloads (boosting higher in gaming, more average boost across 4 cores when doing calculation workloads etc).
Now I need one, just gotta update my Z490 Ace.
I think for benchmarks this will make a huge difference. I hope this will be available on Z490 and 500-series chipsets too.
From the image in the article, only TVB is constrained to a 70°C limit; the other three boost technologies are noted as having 100°C limits. In the article you have it as 70°C for Adaptive Boost as well 😉

Adaptive Boost will only be available for the Core i9 K and KF models, in other words only the Core i9-11900K and Core i9-11900KF. With a processor temperature below 70°C, Thermal Velocity Boost achieves 5.3 GHz on two cores; that's 5.2 GHz without TVB. If more than two cores are working, a maximum of 5.1 GHz is possible, but so far only on up to four cores when TVB activates, so four cores max. Adaptive Boost now takes it up a notch: given adequate power delivery and temperatures within the limit, all eight cores can boost to 5.1 GHz. So for more than four active cores, the up-to-5.1 GHz far exceeds the previous specification of 4.8 or 4.9 GHz.
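The boost ladder described above can be sketched as a small decision function. The clocks and the 70°C TVB cut-off are taken from the comment and the slide; the function itself is purely an illustration of how the tiers stack, not Intel's actual algorithm, and the function and parameter names are made up.

```python
# Illustrative sketch of the i9-11900K boost tiers discussed above.
# Frequencies/temps come from the article's slide; logic is a toy model.

def max_boost_ghz(active_cores: int, temp_c: float, abt_enabled: bool) -> float:
    """Highest opportunistic boost clock for a given load, per the slide."""
    if active_cores <= 2:
        # Thermal Velocity Boost adds 100 MHz below 70 degrees C.
        return 5.3 if temp_c < 70 else 5.2
    if active_cores <= 4:
        # Up to four cores can already reach 5.1 GHz via TVB.
        return 5.1
    # More than four cores: 4.8-4.9 GHz by default, but Adaptive Boost
    # allows 5.1 GHz on all eight cores if power and thermals permit.
    return 5.1 if abt_enabled else 4.9
```

With ABT off, an eight-core load tops out around 4.9 GHz; with it on, the same load may opportunistically hold 5.1 GHz.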
Does this have any advantage over just setting all cores manually to 5.1?
PSU requirements: 1.21 jiggawats. Don't get your uranium from terrorists to power this, just wait for Mr. Fusion.
Glottiz:

Does this have any advantage over just setting all cores manually to 5.1?
A manual 5.1 GHz all-core may need more than 300 W at full load, and even a 360 mm radiator could be in the 90°C area, especially if AVX-512 is used. I think this CPU is going to run on the laptop principle: boost to the moon and hope the load is gone before the CPU hits 100°C.
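That "laptop principle" of opportunistic boosting can be sketched as a tiny control loop: clock up while there is thermal headroom, back off near the limit. All numbers and names here are made up for illustration; real boost governors are far more involved.

```python
# Toy model of thermally-opportunistic boosting ("boost and hope the
# load ends before hitting the limit"). Values are illustrative only.

def next_clock_ghz(current_ghz: float, temp_c: float,
                   t_limit: float = 100.0, step: float = 0.1) -> float:
    """Pick the next clock based on how close the die is to its limit."""
    if temp_c >= t_limit - 5:
        # Near the thermal limit: throttle down (floor at base-ish clock).
        return max(current_ghz - step, 3.5)
    if temp_c < t_limit - 20:
        # Plenty of headroom: boost up toward the 5.1 GHz ABT ceiling.
        return min(current_ghz + step, 5.1)
    # In between: hold the current clock (hysteresis band).
    return current_ghz
```

A short burst finishes while the loop is still in the "boost up" branch; a sustained AVX-512 load drives temperatures into the throttle branch, which is exactly the behavior the comment predicts.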
Glottiz:

Who started this myth? It's not even true. AMD or Intel, almost same power usage...
You can attribute some of the blame to that 11700K Anand "review" where everyone saw the AVX-512 extreme stress test on a poor air cooler.
Intel is like: I heard you like boost, so we put boost in your boost so you can get more boost. More or less this is how it works.
Witcher29:

Now i need one just gotte update my z490 ace.
Yeah, for sure! Upgrade from CL? I mean downgrade. XD
BReal85:

Yeah, for sure! Upgrade from CL? I mean downgrade. XD
While everyone should take Intel's claims and marketing slides with a grain of salt, do people really think they are going to flat-out lie about their 11900K claims? It is their most binned SKU, and people are really downplaying the role of their AI/DL Boost stuff for workflow applications.
Glottiz:

Who started this myth? It's not even true. AMD or Intel, almost same power usage... https://tpucdn.com/review/amd-ryzen-7-5800x/images/power-gaming.png
Well, these new ones aren't on there. And that chart is gaming, which usually doesn't tax the CPU. We will see when the full review is out. I expect the review to be... hot lol
Glottiz:

Who started this myth? It's not even true. AMD or Intel, almost same power usage... https://tpucdn.com/review/amd-ryzen-7-5800x/images/power-gaming.png
Intel is still very efficient for a lot of home/gaming PC usage; it's the max stress-test power draw with AVX on that's utterly ridiculous. But yeah, I don't think people who choose Ryzen over Intel for home rigs because of power savings realize this:

Idle: https://pclab.pl/zdjecia/artykuly/mbrzostek/2020/cml_s/wykresy/power_idle.svg
Video playback: https://pclab.pl/zdjecia/artykuly/mbrzostek/2020/cml_s/wykresy/power_video.svg
Gaming: https://pclab.pl/zdjecia/artykuly/mbrzostek/2020/cml_s/wykresy/power_3d.svg

All values are for the whole system with an RTX 2080. Here's a great test, albeit in Polish, telling you exactly what every task costs in energy: https://pclab.pl/art84541.html (you won't learn this from memes or YouTube trash). And this is the power draw for every minute of the CPU sitting idle: https://pclab.pl/zdjecia/artykuly/mbrzostek/2020/cpu_energy/idle_60sek.svg
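The kind of per-task cost accounting that test did is just watts times hours converted to kWh. A minimal sketch, where the electricity price is an assumed figure for illustration:

```python
# Rough per-task energy cost, in the spirit of the linked pclab test.
# The $0.15/kWh price is an assumption; plug in your local rate.

def task_cost_usd(power_w: float, hours: float,
                  price_per_kwh: float = 0.15) -> float:
    """Cost in USD of a load drawing power_w watts for `hours` hours."""
    kwh = power_w * hours / 1000.0
    return kwh * price_per_kwh

# Example: a whole system pulling 450 W for a 2-hour gaming session
# consumes 0.9 kWh at the assumed rate.
session_cost = task_cost_usd(450, 2)
```

Even a large idle-power gap between two CPUs adds up to only a few dollars a year at typical rates, which is why per-task numbers are more informative than memes.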
Glottiz:

Who started this myth? It's not even true. AMD or Intel, almost same power usage... https://tpucdn.com/review/amd-ryzen-7-5800x/images/power-gaming.png

Well, gaming may vary a lot depending on whether you are playing with vsync off and uncapped, or vsynced. I can imagine a game like The Witcher 3 being played with vsync. I don't know the details of that review, but yes, AMD and Intel don't have totally different power consumption in normal gaming.
cucaulay malkin:

intel is still very efficient for a lot of home/gaming pc usages it's that max stress test power draw with avx on that's utterly ridiculous. ...
Sadly, that's last gen. Do they have one with Zen 3? AMD Zen 3 looks better here: https://www.guru3d.com/index.php?ct=articles&action=file&id=65966
kapu:

Sadly is last gen . Do they have with Zen3?
They closed pclab last year, I think. It sucks, because some of their tests were very informative, like this one. They were the only ones that e.g. tested Skylake on DDR3 vs DDR4. They are missed.
cucaulay malkin:

they closed pclab last year I think it sucks cause some of their tests were very informative,like this one they were they only ones that e.g. tested skylake on ddr3 vs ddr4. they are missed.
Zen 2 was not that great in terms of idle power consumption; only full MT load was much better than Intel. But it's a different story on Zen 3. Still, Intel's 14nm+++++++++ is optimized through the roof, so it is pretty good in some scenarios.
Yes, but mainly due to boosting algorithms.
To benefit from boosting the boost, to boost the boost of Rocket Lake up to the moon, the CPU must work within Intel's spec temperature range. Without a $$$ TEC I think you can't boost anything, and btw Intel is boosting themselves sick, but reviews will show. Edit: now we have a situation of 1000W modded HOF cards + Intel ABT + TVB + exotic cooling devices where a new PSU must be invented. 1h of gaming = $1, maybe even more.