Apple M1 chip outperforms Intel Core i7-11700K in PassMark (single-thread)

fredgml7:

I find the M1 very impressive, and I wonder if that's going to be a trend in the home computing market. I would gladly change to RISC (or something more complex based on it) in the near future, but I despise soldered stuff (CPU, RAM, etc.) not allowing upgrades, especially on desktops but also on notebooks/laptops.
A trend in what, specifically? These days, computer hardware is becoming less and less interesting. The days of overclocking are nearing an end (now it's basically just providing a cooling solution capable of maintaining boost clocks). All motherboards are pretty much the same thing with ever-so-slightly different variations, and even then, they're mostly just black with RGB heatsinks. I wouldn't be surprised if in a few years, all motherboards come with soldered-on CPUs.
Strijker:

But damn impressive, that new M1 chip. I wish we had that chip for PCs (though then it would most likely be RIP Intel & AMD). I wonder if, with kickass cooling and more power, you could get it even much higher...
It wouldn't matter if the chip came to PCs - most of the performance enhancements are in Apple's software. The CPU itself is otherwise mediocre. ARM chips are also notorious for overclocking poorly.
schmidtbag:

It wouldn't matter if the chip came to PCs - most of the performance enhancements are in Apple's software. The CPU itself is otherwise mediocre. ARM chips are also notorious for overclocking poorly.
Really, just software? I'm not buying that part. C'mon, mediocre? It's amazing. I'd rather bash Apple whenever I can, but now I just can't. MS could make Windows run on ARM CPUs; they already made a build for that in recent times if I'm not mistaken. But yeah, all the other software would have to be altered for that as well, though when taking that road it's possible. It could take some time, but Apple is more hardcore about big changes for sure. Yeah, ARM chips are mostly in phones, with no good coolers either, and Apple's main concern is silence over heat: let it burn and throttle, but let it be silent (if possible), looking at past history. But this thing is 15 watts.
I hope this leads to desktop parts that use 15 watts with that single-core performance; it would be amazing if it could be applied to multi-core. Gonna hope in one hand and shit in the other and see which fills up first 🙄
Strijker:

Really, just software? I'm not buying that part. C'mon, mediocre? It's amazing. I'd rather bash Apple whenever I can, but now I just can't. MS could make Windows run on ARM CPUs; they already made a build for that in recent times if I'm not mistaken. But yeah, all the other software would have to be altered for that as well, though when taking that road it's possible. It could take some time, but Apple is more hardcore about big changes for sure. Yeah, ARM chips are mostly in phones, with no good coolers either, and Apple's main concern is silence over heat: let it burn and throttle, but let it be silent (if possible), looking at past history. But this thing is 15 watts.
Yes, mostly software. I assume Apple basically just copied the instructions from Intel that they actually used, excluded the rest, and ended up with a chip that could do pretty much the same thing at 10% of the wattage. Software is absolutely critical to how well something performs. I mentioned Clear Linux earlier as a real-world example of how much performance you can squeeze out of a CPU if you actually care to optimize for it. Apple's closed ecosystem makes it much easier for them to optimize things, because they don't have to retain compatibility with a wide array of platforms.

Windows does run on ARM. I'm using the Lenovo C630 as I write this - an 8-core ARM-based laptop that comes with Windows 10. Windows gets the job done, but Linux has better software support, and macOS is better optimized.

EDIT: Windows also has a compatibility layer to run x86 software, but it's basically just emulation and runs like crap. Mac's Rosetta 2 seems to go a lot further: you're not going to get native-level performance, but it isn't horribly slow either. Linux takes a similar route to Windows, but the driver layer is less complex, so it seems to take less of a performance hit.

Again, the M1 isn't any ordinary ARM chip. Even if you port Windows 10 to the new M1 Macs (which in theory should be possible - Linux is already bootable on them), you're not going to get the same level of performance, because Windows isn't built to be optimized.
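Side note: if you want to check whether a given process is actually being translated by Rosetta 2, macOS exposes a sysctl for that (sysctl.proc_translated). A minimal C sketch, assuming an Apple silicon Mac; on Intel Macs or older macOS the sysctl simply doesn't exist, which the ENOENT check covers:

#include <errno.h>
#include <stdio.h>
#include <sys/sysctl.h>

/* Returns 1 if this process runs under Rosetta 2 translation,
 * 0 if it runs natively, -1 if the answer can't be determined. */
static int process_is_translated(void) {
    int ret = 0;
    size_t size = sizeof(ret);
    if (sysctlbyname("sysctl.proc_translated", &ret, &size, NULL, 0) == -1) {
        if (errno == ENOENT)
            return 0; /* sysctl not present: native (e.g. an Intel Mac) */
        return -1;
    }
    return ret;
}

int main(void) {
    int t = process_is_translated();
    printf("Rosetta 2 translation: %s\n",
           t == 1 ? "yes" : (t == 0 ? "no" : "unknown"));
    return 0;
}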
Undying:

Is there anything else the Apple chip excels at besides PassMark?
A beefed-up M1 Air is probably the best laptop I've used. Really responsive, pretty much zero lag on anything, ten-plus hours of battery life. It also feels much faster than the 2020 13" Air, on clean Big Sur installations. They have definitely done something about I/O with it, and as a package it's great. A colleague has the M1 Mini at home, and he only praises it. The M1 is anything but mediocre. If anything, macOS is actually slower than Windows.
Pretty much every modern processor is internally RISC with a front-end CISC decoder. Surprising to see so many believe they are still primarily CISC-based. That has not been true for a long time.
OldManRiver:

Pretty much every modern processor is internally RISC with a front-end CISC decoder. Surprising to see so many believe they are still primarily CISC-based. That has not been true for a long time.
Care to explain more? I think that x86 has not really changed all that much since its invention and remains a CISC system.
Raserian:

Care to explain more? I think that x86 has not really changed all that much since its invention and remains a CISC system.
To the user, the x86 architecture is *very* CISC. Inside a chip it can be a different story. A modern high-end Intel chip takes those CISC instructions and translates (decodes) them down into an internal RISC-like instruction set, executes those in parallel, and can often "execute" (retire is Intel's term for when an instruction completes execution) as many as 4 at a time (that was what Haswell-class machines could do in 2015). Today's Intel and AMD x86 processors do not execute x86 directly; the instructions get decoded and translated into micro-instructions in the processor front end, and the back end that executes those micro-instructions looks a lot more like a RISC processor.
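To make the decode step concrete: a classic read-modify-write x86 instruction like add [rdi], eax is one CISC instruction to the programmer, but the front end splits it into separate load, ALU, and store operations. The actual micro-op encodings are proprietary, so the three-way split modeled below is purely illustrative:

#include <stdio.h>

/* Hypothetical micro-ops; real Intel/AMD uop formats are undocumented. */
typedef struct {
    const char *desc;
} uop;

int main(void) {
    /* One CISC instruction: add DWORD PTR [rdi], eax
     * (read memory, add a register to it, write the result back). */
    uop decoded[] = {
        { "tmp <- load mem[rdi]" },  /* load micro-op  */
        { "tmp <- tmp + eax" },      /* ALU micro-op   */
        { "store mem[rdi] <- tmp" }, /* store micro-op */
    };
    puts("add [rdi], eax decodes into:");
    for (size_t i = 0; i < sizeof decoded / sizeof decoded[0]; i++)
        printf("  uop %zu: %s\n", i, decoded[i].desc);
    return 0;
}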
schmidtbag:

A trend in what, specifically?
Specifically, SoCs (which implies non-modular parts, unlike what we have nowadays on desktops) and non-x86-like processors.
OldManRiver:

Pretty much every modern processor is internally RISC with a front-end CISC decoder. Surprising to see so many believe they are still primarily CISC-based. That has not been true for a long time.
You say that as though that changes anything discussed to any significant degree, because as Astyanax pointed out, the end result of many modern architectures is something that is very much CISC. By your logic, that's like saying "seaweed isn't a plant, it's a protist, so it's not a vegetable". The things that separate seaweed from being a plant don't separate it from being a vegetable. All that being said, MIPS is a true RISC architecture that is still fairly modern. Even today, the floating-point instructions aren't integral to the architecture, and yet MIPS can still run a full OS.
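To illustrate floating point not being integral to the ISA: build for a MIPS target with GCC's -msoft-float and the compiler emits no FPU instructions at all - float math gets lowered into calls to libgcc's software routines instead. A minimal sketch (mips-linux-gnu-gcc is just the common Debian cross-compiler name):

/* mul.c - compile with: mips-linux-gnu-gcc -msoft-float -O2 -S mul.c
 * With -msoft-float, the multiply below compiles to a call to
 * libgcc's __mulsf3 (single-precision soft-float multiply)
 * rather than an FPU mul.s instruction. */
float scale(float x) {
    return x * 2.5f;
}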
fredgml7:

Specifically, SoCs (which implies non-modular parts, unlike what we have nowadays on desktops) and non-x86-like processors.
Desktop and laptop PCs have become more and more of an SoC. For one big example, there is no longer a separate northbridge and southbridge. I wouldn't be surprised if, within the next decade, there is no longer a discrete motherboard chipset at all. I think it'll still be a long while until RAM becomes fully integrated, like it is on many ARM platforms.

As for non-x86 processors, I think that's very possible. Most tablets run on ARM and have obsoleted desktops and laptops for most people. Apple is moving away from x86. Many of the tools we use have become cloud-based, where it doesn't matter what architecture you use. The Chinese government is trying to move to MIPS. For Linux users, there's a rising interest in POWER, ARM, and RISC-V. MS's second attempt at ARM is actually usable, and I'm sure they're just trying to prepare for the future in case ARM becomes a serious contender.

So yeah, I think it's very possible x86 may lose quite a lot of popularity. It will basically just remain popular with those who want raw CPU processing power, and that's a dying trend thanks to things like OpenCL or CUDA.
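At the source level, portable software mostly stops caring which architecture it's built for anyway. A minimal sketch using the usual predefined compiler macros (the names below are the common GCC/Clang/MSVC ones; other compilers may spell them differently):

#include <stdio.h>

int main(void) {
#if defined(__x86_64__) || defined(_M_X64)
    puts("built for x86-64");
#elif defined(__aarch64__) || defined(_M_ARM64)
    puts("built for 64-bit ARM");
#elif defined(__mips__)
    puts("built for MIPS");
#elif defined(__riscv)
    puts("built for RISC-V");
#elif defined(__powerpc64__)
    puts("built for 64-bit POWER");
#else
    puts("built for some other architecture");
#endif
    return 0;
}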
Magic power of ASUS DARK HERO DOCS
I've seen loads of videos about the M1, in previews and post-release. As a home recording enthusiast, it's very tempting, especially as more software is starting to support it (although there's still a while to go yet). I'm sure a year from now it will be very compelling. It runs emulated Windows apps pretty decently too. I'll definitely be watching its progress in the coming years.
schmidtbag:

You say that as though that changes anything discussed to any significant degree, because as Astyanax pointed out, the end result of many modern architectures is something that is very much CISC....
You meant RISC. I said that because of ridiculous backward claims like that. I think it's funny you cite Sora, who for once, clearly based on my post, got it more or less correct. You got it wrong. Look elsewhere to incite conflict. Thanks.
Undying:

Is there anything else the Apple chip excels at besides PassMark?
Yes, most benchmarks and programs. Everyone was sceptical about it (don't forget that the first version is a tablet CPU), but the more cores they put in it on computers, the more I like it... The good news is that they will make even more powerful versions with more cores. The bad point is that it is Apple exclusive... 🙁
All this talk about CISC and RISC is completely irrelevant. All modern CPUs just translate incoming command streams to whatever they use internally. It's really a pity it's an Apple exclusive. I can even see Apple going to RISC-V in 20 years, just so as not to be at the mercy of Nvidia.
OldManRiver:

You meant RISC. I said that because of ridiculous backward claims like that. I think it's funny you cite Sora, who for once, clearly based on my post, got it more or less correct. You got it wrong. Look elsewhere to incite conflict. Thanks.
I meant what I said. Who the hell is Sora? How can I be "more or less correct" while also getting it wrong? You're the one getting pedantic about what is actually RISC. If you don't want to incite conflict, you're barking up the wrong tree.
schmidtbag:

I meant what I said. Who the hell is Sora? How can I be "more or less correct" while also getting it wrong? You're the one getting pedantic about what is actually RISC. If you don't want to incite conflict, you're barking up the wrong tree.
You're still arguing with me because? More or less correct was not addressed to you. I am not arguing "what is RISC" at all. Learn to read. You're still seeking conflict, duh.
PrMinisterGR:

All this talk about CISC and RISC is completely irrelevant. All modern CPUs just translate incoming command streams to whatever they use internally.
Well no, it is not actually irrelevant what the internal processor structure is. If it were, there would be no differences in uarch... 'cause it would be irrelevant. Man... poor quality posting around here.
OldManRiver:

You're still arguing with me because? More or less correct was not addressed to you. I am not arguing "what is RISC" at all. Learn to read. You're still seeking conflict, duh. Well no, it is not actually irrelevant what the internal processor structure is. If it were, there would be no differences in uarch... 'cause it would be irrelevant. Man... poor quality posting around here.
May I suggest you scream at 'em to get off your lawn?
schmidtbag:

A trend in what, specifically? These days, computer hardware is becoming less and less interesting. The days of overclocking are nearing an end (now it's basically just providing a cooling solution capable of maintaining boost clocks). All motherboards are pretty much the same thing with ever-so-slightly different variations, and even then, they're mostly just black with RGB heatsinks. I wouldn't be surprised if in a few years, all motherboards come with soldered-on CPUs. It wouldn't matter if the chip came to PCs - most of the performance enhancements are in Apple's software. The CPU itself is otherwise mediocre. ARM chips are also notorious for overclocking poorly.
I'd say on Zen 2/3 at least there is a lot of headroom for OC/UV and RAM tweaking. Much more than there ever was on the Intel side. It was quite fun for me 🙂 Zen 3 especially is the most tweakable CPU in recent history.
It's faster than AMD CPUs too, but the point now is to s*it on Rocket Lake as hard as possible. Sad really.