NVIDIA to license Kepler as a technology to mobile manufacturers
Nvidia today shared its intention to license its graphics technology to other companies in an effort to tap the huge market for smartphones, tablets, and other devices as its core PC business continues to shrink. According to a blog post published by the company, Nvidia will start by licensing out the GPU core used in its Kepler architecture, and it'll also license out its visual computing portfolio, which the company says will let licensees develop their own GPU technology while taking advantage of Nvidia's vast array of patents.
Unsurprisingly, the impetus behind this change is the major shift in computing towards mobile — the company says that the "explosion of Android devices" presents it with a huge business opportunity even as its traditional PC market continues to shrink. If the company's plan works out, we could see Nvidia's Kepler GPU technology in a lot more devices down the line.
Nvidia's announcement echoes CEO Jen-Hsun Huang's comments made earlier this year — he wants his technology to "light every single pixel," so getting more Nvidia technology into mobile devices will be a key part of that plan. And while this new licensing plan represents a definite shift in strategy, it's not something altogether unfamiliar to Nvidia: the company licensed out earlier GPU technology to Sony to help build the graphics technology in the PlayStation 3. It's a strategy that sounds similar to what AMD is already doing by putting its processors in the next-generation gaming consoles from Sony and Microsoft as well as in standard off-the-shelf laptops. The company says these opportunities didn't exist until recently because there was only one type of computing device, the PC, but that's clearly no longer the case — so the company's doing what it can to make sure it'll be inside the currently-dominant smartphone and tablet form factors, as well as whatever may come next.
NVIDIA to Bundle Metro: Last Light for GTX 660 and Above - 04/12/2013 12:04 PM
It seems that the Never Settle (Reloaded) promotion ignited a bit of a mud fight between AMD and NVIDIA. AMD gives away totally awesome games with their graphics cards as you know. And NVIDIA on t...
NVIDIA Tegra 4 benchmarks - 02/25/2013 02:12 PM
When NVIDIA unveiled Tegra 4 back at CES, we really needed to see benchmarks, some more data to help show the difference between Tegra 4 devices and whatever's currently on the market. Over at the M...
NVIDIA Tegra 4i-based Phoenix reference phone shown - 02/21/2013 08:38 AM
NVIDIA has shown its first reference smartphone design. Based on the Tegra 4i "Grey" SoC with ARM Cortex-A9 cores, the Phoenix is a 5" phone with a 1080p Full HD screen. It's not ...
NVIDIA Tegra 4 Will Ship from July - 02/14/2013 10:01 AM
NVIDIA's upcoming Tegra 4 SoC chips will start shipping to device manufacturers in July. In the latest conference call with financial analysts it became clear that shipments will start in t...
NVIDIA Tegra 4 will have 6x more Shader cores - 12/19/2012 09:25 AM
An interesting tidbit of news appeared on the web. Tegra 3, NVIDIA's current SoC used in mobile phones, has 12 shader cores. At CES the Tegra 4 will be announced, but Chiphell has posted some int...
Senior Member
Posts: 21798
Joined: 2008-07-14
You expect these companies to actually be honest in their marketing?
Senior Member
Posts: 2843
Joined: 2009-09-15
Nah, not really, but I expected at least some common sense from them... lol! Even though I'm exaggerating a bit about what was written in the article, it still seems funny to me.

Next up would be like: "......featuring their professional overclocking up to a whopping 3.9 GHz (4770k only)."
Btw my posts in this thread are for entertainment purposes only. No offense meant to AMD, Origin PC, or GURU3D.
Senior Member
Posts: 2843
Joined: 2009-09-15
I quote the article: "........ featuring their professional overclocking up to a whopping 5.0 GHz (FX-9590 only)."
Well hello Mr Origin. And here I thought 5.0 GHz was the default maximum turbo speed of that CPU. lol!