Video from Intel CEO Bob Swan to COMPUTEX: did Intel give up on benchmarks?

"You shouldn't pay attention to actual, measurable metrics. Instead, just eat up our marketing without questioning it." - that's what I got from it.
lol, actually, if we only ever looked at benchmarks, Intel looks alright; they can mostly keep up with AMD. It's when you look around that their product gets rekt: higher prices, no PCIe 4.0, worse upgrade paths, shorter platform lifespans, worse power efficiency, software-locked features left & right, hardware vulnerabilities. Their mobile products today are especially bad compared to Ryzen, not even a contest; on desktop they can at least crank up power & heat to stay on the chart. But he's talking like Ryzen 4000 is still some far-off rumor, or maybe he thinks wifi cards are more important? As usual for a decadent company, upper management seems to have no idea about anything.
So... as Intel moves away from benchmarks, they focus on the platform? Which is currently behind AMD's in some rather prominent features? This doesn't make sense to me in any way...
Undying:

Why? Intel always dominated benchmarks.
A few reasons, probably. 1) They don't really "dominate" benchmarks in any sense, even gaming. They are a tad better there, but they get their faces smashed in everything else, including the platform. 2) They have performance estimates for Zen 3. I think the unified CCX will be a bigger deal than most people realize.
Mineria:

Problem is that you won't really get that many fps @4K with newer AAA titles maxing things out; at 4K and above there is no huge benefit to using an Intel CPU over AMD for gaming. There are probably 2080 Ti owners that run lower resolutions, but I would guess most of them have a 4K monitor?
The majority actually use 1440p 165Hz, though I personally use 4K 60Hz... and yeah, at 60Hz even my old 4930K is plenty.
Sadly, the pre-schoolers Intel hired to author the "moah cores" Userbenchmark site haven't been as effective as Intel would prefer, thus the company has decided that talking about CPU performance with benchmarks is not as effective as pointing out, for instance, what great cold-climate heaters their 14nm CPUs can be. "Why buy a separate space heater for your Antarctic home when you can buy an Intel-inside computer and get compute performance and home heating at the same time?" That sort of thing... clever Intel... /s
You always know you're in for a YT treat when the comments are turned off. "Our opinion sucks so much we cannot afford conversation on it". Next, the like/dislikes. I don't get Intel. How can you take so many years to change architectures with that kind of R&D cash and engineering?
If Intel is managed by this guy, good luck... I would not buy stock in such a company... he sounds and looks like someone from the 1970s...
So why does Intel suddenly not care about benchmarks when they used game speed and AI capacity to advertise "how good" their CPUs are??? The "when to buy AMD or Intel" question lingers here... as always...

Buy Intel when you can get the same performance for the same price as AMD and do NOT wish to upgrade your CPU later to a newer-generation CPU, or don't feel you need PCIe 4.0 for anything now or in the near future. If you have a good AC unit or live in Antarctica, the Intel CPU is a sure bet.

Buy AMD to save a bunch and still get similar performance (if a bit less in games, but more in productivity), a socket with a longer life (even though we're nearing the end of AM4's life, a 3700X used for productivity can upgrade to a 3900X, 3950X, or newer XT models and keep the socket/board in most cases), or if you might need PCIe 4.0 for that fast drive or a future video card (when you get one that supports 4.0). If you don't have AC, live in the desert or another hot climate, live on the Sun, or inside an active volcano, you might want to avoid the heat from Intel processors and choose AMD for its 7nm efficiency.

It's not about which one is better or faster; it's about what YOU need to do with it, what you WANT TO or CAN AFFORD, or otherwise think is a GOOD purchase. Not about what someone else says; it's about what YOU need to use the PC for and what is most efficient for YOU. Do NOT forget to factor in the cost of the motherboard, the cost of coolers (a Ryzen 3950X and most Intel K, KF, or X-series processors do not come with coolers in the package), delidding tools, good paste, an extra system fan or two, and so forth.

How would the above fit me if I had to take my own advice? I primarily run simple 2D games and conversely do content creation for 3D games, and I like being able to upgrade my processor, so I chose a 3700X/X570 combo due to a great deal at Micro Center at the end of July last summer. It would have cost considerably more to go with a 9900K/Z390 or Z370 motherboard at the time with similar features, and it would have been harder to keep cool (I definitely wanted something I can tolerate being in the same room with for those 8~12-hour content creation sessions; very happy with that). This was a big jump from my 4790K that was nearing 5 years old, so I was very happy, and it doesn't get nearly as hot in here to boot.

So choose what's right for yourselves, and be happy with it no matter what anyone says: it's your money. Until then, we can all just sit back while fanboys sling mud at one another in the wccftech cattle pasture, and bask in the glory of competition returning to the CPU marketplace/industry.
ruthan:

If Intel is managed by this guy, good luck... I would not buy stock in such a company... he sounds and looks like someone from the 1970s...
Idk, I actually kind of like him based on some of his decisions already. Word on the street from the Intel engineers is that they like him too (probably mostly because he increased CapEx spending by something like $20B this year). Personally I think most people are taking his words out of context with this whole thing. Intel has already kind of shifted in the way he's talking about on a broader scale: they are no longer building one architecture and trying to force it into everything, both up and down; they are branching out and building specific architectures for specific industries/problems. Going into the GPU space for HPC is one example of that. Building dedicated AI "engines" inside their CPUs is another. I think the benchmark comment was just an offhand way to segue into that. I also think it's about shifting Intel's corporate culture, especially with the whole "solving problems for society" part, which was a much larger aspect of the talk than moving away from benchmarks. There are a lot of people coming out of college now looking for jobs that offer them the ability to solve the world's problems, whether that be at Nvidia doing AI research for developing AI-based radiology systems, at SpaceX launching men to the Moon/Mars eventually, or at various other biotech/tech firms working on directly saving people's lives or bettering life in general. Intel has always kind of been bad at this. Even when I was in school (2010ish), their job-fair advertising was kind of weak at gathering the attention of the mostly leftist college students at the top of their game. While Nvidia was advertising how their new Tesla GPUs (at the time) were solving scientific problems and you could come work on all these interesting projects, furthering space exploration and whatnot, Intel was advertising stuff like "join us and work on the latest wifi chipset".
Microsoft has also been pushing similar initiatives lately; there is a reason why Nadella comes out on stage and talks about building software for people, solving real "human" problems and challenges around the world. It's because customers want to hear these messages now (especially with all the recent COVID stuff), and potential employees are looking to work on those kinds of projects, mostly because education systems are pushing them toward that kind of work. The nice part about both Microsoft and Nvidia is that they actually back up their talk with real products and solutions; their corporate cultures internally have definitely shifted in that direction as well. Intel needs to follow suit or it will have trouble getting the best engineers/workers and become like an IBM, where it's basically a last choice for most people.
Um, Earth to Bob Swan: games and other forms of real-world applications are NOT synthetic benchmarks.
Dragam1337:

Read what i replied to... what res and hz does the average 2080 ti owner use.
You're right, I read it wrong. We still don't have data for that anyway, so we're all just speculating.
Denial:

Idk, I actually kind of like him based on some of his decisions already. Word on the street from the Intel engineers is that they like him too (probably mostly because he increased CapEx spending by something like $20B this year). Personally I think most people are taking his words out of context with this whole thing. Intel has already kind of shifted in the way he's talking about on a broader scale: they are no longer building one architecture and trying to force it into everything, both up and down; they are branching out and building specific architectures for specific industries/problems. Going into the GPU space for HPC is one example of that. Building dedicated AI "engines" inside their CPUs is another. I think the benchmark comment was just an offhand way to segue into that.
This makes sense. AMD's continued existence has proven that there is a limit to how much manpower and money you can pour into a single architecture at a time. Intel is a massive company; they are much better off specializing on a wide scale, and then possibly bringing the specialized tools together to make one great general one. I really hope we get something like the PS5 I/O subsystem out of this :P
Denial:

I also think it's about shifting Intel's corporate culture, especially with the whole "solving problems for society" part, which was a much larger aspect of the talk than moving away from benchmarks. There are a lot of people coming out of college now looking for jobs that offer them the ability to solve the world's problems, whether that be at Nvidia doing AI research for developing AI-based radiology systems, at SpaceX launching men to the Moon/Mars eventually, or at various other biotech/tech firms working on directly saving people's lives or bettering life in general. Intel has always kind of been bad at this. Even when I was in school (2010ish), their job-fair advertising was kind of weak at gathering the attention of the mostly leftist college students at the top of their game. While Nvidia was advertising how their new Tesla GPUs (at the time) were solving scientific problems and you could come work on all these interesting projects, furthering space exploration and whatnot, Intel was advertising stuff like "join us and work on the latest wifi chipset".
Intel has horrible internal issues. See "blue badge" vs "green badge", and that's the least of it. They do need to project a proper corporate image to attract the people they need, and right now it doesn't seem to be happening.
Dragam1337:

The majority actually use 1440p 165Hz, though I personally use 4K 60Hz... and yeah, at 60Hz even my old 4930K is plenty.
@4K 60Hz my 4970K system is a bottleneck for the RTX 2080S, even when overclocked, so I doubt that a 4930K is plenty. Even at lower resolutions there is an fps drop with RTX cards on these old CPUs; plenty of comparisons on YouTube show it. Heck, even with a 5GHz OC the 4970K falls behind.
Mineria:

@4K 60Hz my 4970K system is a bottleneck for the RTX 2080S, even when overclocked, so I doubt that a 4930K is plenty. Even at lower resolutions there is an fps drop with RTX cards on these old CPUs; plenty of comparisons on YouTube show it. Heck, even with a 5GHz OC the 4970K falls behind.
First off, it's a 4790K, not a 4970K. Secondly, the 4790K is a 4-core and the 4930K is a 6-core, which makes all the difference in new games. And lowering the resolution doesn't lessen the load on the CPU /facepalm
Dragam1337:

First off, it's a 4790K, not a 4970K. Secondly, the 4790K is a 4-core and the 4930K is a 6-core, which makes all the difference in new games. And lowering the resolution doesn't lessen the load on the CPU /facepalm
You're right, 4790K, my mistake. Yes, it does not lessen the CPU load, but it does lessen the bottleneck; it's still there, though, even with the 4930K, which you can OC to somewhere around 4.5GHz?
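The bottleneck argument in this exchange can be sketched as a toy model (my own illustration with made-up numbers, not anything measured in the thread): delivered fps is roughly the minimum of the CPU's ceiling and the GPU's ceiling, and only the GPU's ceiling scales with resolution.

```python
# Toy model: delivered fps is capped by whichever stage finishes a frame
# last. Lowering the resolution raises the GPU ceiling but leaves the CPU
# ceiling untouched, so a CPU bottleneck persists at lower resolutions.

def gpu_fps(fps_at_4k: float, width: int, height: int) -> float:
    """Assume GPU throughput scales roughly inversely with pixel count."""
    pixels_4k = 3840 * 2160
    return fps_at_4k * pixels_4k / (width * height)

def effective_fps(cpu_cap: float, gpu_cap: float) -> float:
    """The slower of the two stages sets the delivered frame rate."""
    return min(cpu_cap, gpu_cap)

cpu_cap = 70.0  # hypothetical ceiling for an older six-core CPU

fps_4k = effective_fps(cpu_cap, gpu_fps(60.0, 3840, 2160))     # GPU-bound: 60.0
fps_1440p = effective_fps(cpu_cap, gpu_fps(60.0, 2560, 1440))  # CPU-bound: 70.0
```

Under these invented numbers, dropping from 4K to 1440p lifts the GPU ceiling from 60 to 135 fps, but delivered fps stops at the CPU's 70: the bottleneck shrinks but doesn't vanish, which is the point both posters are circling.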