Starting in 2020, Apple Will No Longer Use Intel Processors
Over at Bloomberg, word got out that Apple plans to use self-developed CPUs in its Mac computers. That means Apple will be giving Intel the boot when it comes to processors.
It's even mentioned that it will not stop at processors: under a larger project code-named Kalamata, Apple wants all of its devices, so besides Macs also the iPhones and iPads, to work together seamlessly. Apple is also working on a new software platform called Marzipan (which may be released as early as this year), which makes it possible to run iPhone and iPad apps on Macs. Apple accounts for about 5% of Intel's business; Intel shares dropped by 9.2 percent when the news was revealed.
-- Bloomberg -- The shift would be a blow to Intel, whose partnership helped revive Apple’s Mac success and linked the chipmaker to one of the leading brands in electronics. Apple provides Intel with about 5 percent of its annual revenue, according to Bloomberg supply chain analysis. Intel shares dropped as much as 9.2 percent, the biggest intraday drop in more than two years, on the news. They were down 6.4 percent at $48.75 at 3:30 p.m. in New York.
Apple could still theoretically abandon or delay the switch. The company declined to comment. Intel said, “We don’t comment on speculation about our customers.”
For Apple, the change would be a defining moment. Intel chips remain some of the only major processor components designed by others inside Apple’s product portfolio. Currently, all iPhones, iPads, Apple Watches, and Apple TVs use main processors designed by Apple and based on technology from Arm Holdings Plc. Moving to its own chips inside Macs would let Apple release new models on its own timelines, instead of relying on Intel’s processor roadmap.
“We think that Apple is looking at ways to further integrate their hardware and software platforms, and they’ve clearly made some moves in this space, trying to integrate iOS and macOS,” said Shannon Cross, an analyst at Cross Research. “It makes sense that they’re going in this direction. If you look at incremental R&D spend, it’s gone into ways to try to vertically integrate their components so they can add more functionality for competitive differentiation.”
Senior Member
Posts: 2068
Joined: 2017-03-10
Hey man, don't you know that the A10X is as fast as a Core i7 3770? Geekbench says so! :p
http://browser.geekbench.com/v4/cpu/compare/3036382?baseline=231758
Member
Posts: 23
Joined: 2017-11-24
Why has no one considered that they are designing their own chip in collaboration with Global Foundries on their 7nm fab process? This would be a win/win for both companies. GloFo gets a reliable customer to keep the foundry profitable and share development costs; Apple gets the best possible chips based on the latest process. The roadmap seems to jibe. I say Apple realized Intel had been milking the cow with quad core for years too long, and has only decided to ramp up development as competition grew. Why would Apple want to be strapped down to 10nm Intel chips that are only coming because AMD is on the upswing and looks to overtake Intel in the near future?
Member
Posts: 22472
Joined: 2008-07-14
Apple can buy AMD and call it a day....
With all the cash Apple has, it would be easier to simply buy the whole of AMD and solve all its future desktop/laptop GPU and CPU needs.
Apple already uses AMD GPUs in its iMac Pro; using AMD CPUs and APUs could make sense if Apple bought AMD.
Samsung (not a US company...) can't buy AMD because the US government would block it, Intel can't do it due to monopoly concerns, and the same goes for Nvidia (foreign and monopoly), but Apple is a US company and could buy it without the US government blocking the deal.
Whether or not Apple could buy AMD and be able to produce x86 CPUs that way depends on the terms of the licensing agreement AMD has with Intel and whether or not the license is transferable. It's my understanding, which could very well be wrong, that AMD's x86 license is non-transferable through sale of the company.
NVidia is a US-based company, founded and incorporated in the US. Its present corporate headquarters is in Santa Clara, California, and it was incorporated in the state of Delaware. Anti-trust laws are the only thing stopping NVidia from buying or merging with AMD. Jen-Hsun Huang was a chip designer at AMD before co-founding NVidia. It would be a bit ironic for NVidia to buy out or merge with AMD....
Quite honestly, nobody who values PC gaming should have any desire to see NVidia buy out or merge with AMD. Even if AMD were to sell the graphics division beforehand, it would be the death of affordable computing. The same goes for Apple buying or merging with AMD. Consumer choice would be limited to buying Apple and their overpriced walled garden, or buying overpriced Intel-based systems with Windows. Whether it be NVidia or Apple buying out or merging with AMD, consumers lose.
Btw....
Certificate of Incorporation for NVidia: http://s22.q4cdn.com/364334381/files/doc_downloads/governance_documents/NVIDIA_Corporation_-_Certificate_of_Incorporation.pdf
That would be a neat trick I would love to see.
Big doubt here.
There are workloads where RISC instruction sets supposedly perform better than CISC (x86)... However, nothing Intel or AMD produces today is actually a CISC processor in the conventional sense: modern x86 cores decode the CISC instruction set into RISC-like micro-ops internally before executing them.
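To make that concrete: a modern x86 core takes a CISC-style instruction that operates directly on memory and decodes it into the separate load/operate/store steps a RISC ISA would spell out explicitly. Below is a minimal Python sketch of that decode idea; the instruction syntax and micro-op names are invented for illustration and have nothing to do with Intel's actual decoder.

```python
# Toy sketch: a CISC-style instruction with a memory operand gets
# decoded into RISC-like micro-ops (load / ALU op / store).
# Instruction syntax and micro-op names are invented for this
# example; real x86 decoders are vastly more complex.

def decode_to_uops(instr: str) -> list[str]:
    """Split an 'OP dest, src' instruction into micro-ops."""
    op, dst, src = instr.replace(",", " ").split()
    if dst.startswith("["):             # memory destination: 3 micro-ops
        addr = dst.strip("[]")
        return [
            f"uLOAD  tmp0, {addr}",     # read the memory operand
            f"u{op}   tmp0, {src}",     # do the actual ALU work
            f"uSTORE {addr}, tmp0",     # write the result back
        ]
    return [f"u{op} {dst}, {src}"]      # register-only: single micro-op

for uop in decode_to_uops("ADD [0x1000], eax"):
    print(uop)
# uLOAD  tmp0, 0x1000
# uADD   tmp0, eax
# uSTORE 0x1000, tmp0
```

In the register-to-register case the mapping is one-to-one, which is why the RISC-versus-CISC distinction has largely dissolved inside the execution core.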
Yeah, they cannot do that, except if they pay a lot of money to Intel to license the instruction set, which might cost them more than using Intel CPUs in the first place. If they use anything, it's going to be ARM-based, which is likely slower than Intel's offerings (I don't see ARM producing a high-performance desktop/mobile CPU in the next 2 years) and wouldn't run old programs - or would need binary translation to run them, which is yet again even slower.
They would need a license from Intel for x86 and another license from AMD for x86-64 if they wanted to produce a 64-bit processor, both of which would fall under FRAND since both are necessary to compete in the x86 CPU market at this point.
ARM doesn't produce chips. They simply create chip designs that can either be produced or modified by CPU companies, depending on their license. I think it was Qualcomm that was actually working on a RISC-based server CPU.... Qualcomm, Apple and Samsung are the most likely companies to be capable of producing a RISC-based processor with the necessary level of performance to compete with an i3 or i5 processor in desktop workloads.
Senior Member
Posts: 990
Joined: 2010-08-24
Interesting, considering they use AMD GPUs.
@topic, can't say that I'm surprised by this. They wanna wall their walled garden even higher. I half expect them to do stupid shit like go ARM for the MacBook Pros or something similarly dumb.
Senior Member
Posts: 853
Joined: 2015-05-19
Yeah, they cannot do that, except if they pay a lot of money to Intel to license the instruction set, which might cost them more than using Intel CPUs in the first place. If they use anything, it's going to be ARM-based, which is likely slower than Intel's offerings (I don't see ARM producing a high-performance desktop/mobile CPU in the next 2 years) and wouldn't run old programs - or would need binary translation to run them, which is yet again even slower.
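To illustrate the binary-translation overhead mentioned above: a translator maps each guest (e.g. x86) instruction onto one or more host (e.g. ARM) instructions, so translated code typically executes more host instructions than a native build would. Here is a deliberately tiny Python sketch of the idea, with both "ISAs" invented for illustration:

```python
# Toy binary translator: map "guest" instructions onto "host"
# instruction sequences. Both instruction sets are invented for
# this illustration; the point is that one guest instruction often
# expands into several host instructions, which is part of why
# translated binaries run slower than native ones.

GUEST_TO_HOST = {
    # guest op: (operand names, host instruction templates)
    "ADDMEM": (("mem", "reg"),
               ["LDR tmp, [{mem}]",       # read the memory operand
                "ADD tmp, tmp, {reg}",    # do the arithmetic
                "STR tmp, [{mem}]"]),     # write it back
    "MOVRR":  (("dst", "src"),
               ["MOV {dst}, {src}"]),     # simple ops map one-to-one
}

def translate(program: list[tuple]) -> list[str]:
    """Expand a guest program into a flat host instruction list."""
    host = []
    for op, *args in program:
        names, templates = GUEST_TO_HOST[op]
        operands = dict(zip(names, args))
        host += [t.format(**operands) for t in templates]
    return host

guest = [
    ("ADDMEM", "0x2000", "r1"),  # guest: add r1 into memory at 0x2000
    ("MOVRR", "r2", "r1"),       # guest: copy r1 into r2
]
print(translate(guest))  # 2 guest instructions -> 4 host instructions
```

A real translator also has to emulate condition flags, memory ordering, and self-modifying code, which adds further overhead on top of the simple instruction expansion shown here.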