NVIDIA announces Tegra 4i SoC
NVIDIA today announced the Tegra 4i, a smartphone chip with an integrated LTE modem on the same physical die. It is a follow-on to the Tegra 4 announced at CES 2013 in January. Formerly known as "Project Grey", the Tegra 4i, according to NVIDIA, delivers the highest performance of any single-chip smartphone processor while being only half the size of its nearest competitor, Qualcomm's Snapdragon 800.
Tegra 4i features:
- 60 custom NVIDIA GPU cores
- Quad-core CPU based on ARM's newest and most efficient core, the R4 Cortex-A9, plus a fifth battery-saver core
- A version of the NVIDIA i500 LTE modem optimized for integration
- The same Chimera Computational Photography Architecture found in Tegra 4
This brings Tegra 4 goodness to the mainstream smartphone market -- an extremely power-efficient, compact, high-performance mobile processor that enables smartphone performance and capability previously available only in expensive superphones.
NVIDIA will also introduce its “Phoenix” reference smartphone platform for the Tegra 4i processor to demonstrate its unique mobile technologies. Phoenix is a blueprint that phone makers can reference in designing and building future Tegra 4i smartphones to help get them to market quicker.
Senior Member
Posts: 14092
Joined: 2004-05-16
"based on ARM’s newest and most efficient core"
Nope, still not seeing "best" in that sentence. I mean, maybe if I anagram it. Not to mention that nothing in that sentence is wrong either.
The A15 only gets 40% more performance for nearly double the TDP.
I agree, nvidia shooting for performance in a low power environment is like eating a cake while you're on a treadmill - you might enjoy it at the time but in the end it's counterproductive.
However, maybe nvidia doesn't sincerely care about power efficiency. After all, they originally wanted a license to x86, and when intel said no, nvidia shot for the next company who would listen to them. I'm sure if it were up to nvidia, they'd have a real powerhouse of a cpu+gpu, but for now that just isn't going to happen.
What?
The T4i has quad A9s, unlike the T4's quad A15s. Shield has a 4-8 W TDP as calculated from the battery figures given by NVIDIA's CEO -- which is in line with the TDP of the dual-core A15 Exynos powering the Nexus 10, except the T4 has double the cores.
With quad A9s the T4i will come in lower than the quad-core Krait, and Krait is already in phones with decent battery life (the Droid DNA gets about 4 hours of screen-on time with a 2020 mAh battery; a rough power estimate from those numbers is sketched below).
Pretty sure they care about battery life or why else would they build a chip with integrated LTE and A9s?
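For reference, here is a minimal back-of-the-envelope sketch of that kind of estimate: average power draw derived from battery capacity and screen-on time. The 3.8 V nominal cell voltage is an assumption, the 2020 mAh / ~4 hour figures are taken from the Droid DNA example above, and the result is whole-device draw (SoC plus display plus radios), not the SoC's TDP on its own.

```python
# Back-of-the-envelope power estimate from battery capacity and runtime.
# Assumptions (not measured data): a 3.8 V nominal cell voltage and the
# 2020 mAh / ~4 hours screen-on figures quoted for the Droid DNA above.

def average_power_watts(capacity_mah: float, nominal_voltage_v: float,
                        screen_on_hours: float) -> float:
    """Average whole-device power draw: battery energy (Wh) / runtime (h)."""
    energy_wh = (capacity_mah / 1000.0) * nominal_voltage_v
    return energy_wh / screen_on_hours

if __name__ == "__main__":
    # ~2020 mAh * 3.8 V = ~7.7 Wh; spread over 4 hours that is roughly
    # 1.9 W averaged across SoC, display and radios combined.
    watts = average_power_watts(2020, 3.8, 4.0)
    print(f"Average draw: {watts:.2f} W")
```

Obviously this only bounds the average draw over a mixed workload; peak SoC power under a sustained gaming load can sit well above that figure.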
Senior Member
Posts: 19051
Joined: 2009-01-25
Seeing that this SoC is destined for the budget market, this is rather impressive. NVIDIA expects to have these in devices that have a $100-300 price tag (without carrier contract or subsidy from the sellers ... I know people are going to point at the Nexus 4 having a Qualcomm S4 Pro). There are no other SoCs with comparable performance or feature set that can touch the Tegra 4i right now at that price.
Now here's hoping that ST-Ericsson, Broadcom and MediaTek have an answer ready for the new NVIDIA SoC. They're pretty big players in the low- to mid-range markets. I don't think Qualcomm would care too much since they're comfortably sitting in the mid- to high-end market, in which NVIDIA will have a tough time competing, as SoC manufacturers like Samsung and Texas Instruments are up there with Qualcomm.
Samsung is likely to go exclusive with its own SoC designs going forward, as Apple has scaled back its chip orders from Samsung due to their legal spat. Samsung may still use Qualcomm chips in the future, but not as much as it does now in its high-end devices.
deltatux
Senior Member
Posts: 8230
Joined: 2010-11-16
Tegra will continue to lose money for another year, unless Shield turns this around.
Luckily NV has plenty of cash to weather the whole restructuring business until Tegra (5) conquers the world.
Senior Member
Posts: 19051
Joined: 2009-01-25
Tegra will continue to lose money for another year, unless Shield turns this around.
Luckily NV has plenty of cash to weather the whole restructuring business until Tegra (5) conquers the world.
What's funny is that the bulk of the ARM tablet market is dominated by NVIDIA's Tegra 3...
It's hard to find a tablet that isn't powered by Tegra 3; there are a couple, namely the Galaxy Tab 2, Galaxy Note 10.1, Lenovo S2109/S2110 and Dell's XPS 10. That's about it in terms of mid- to high-end tablets that aren't powered by Tegra 3.
deltatux
Senior Member
Posts: 7441
Joined: 2012-11-10
Seconded. I had to buy a new battery for my S3 as the standard 2100 mAh one just got destroyed by Wi-Fi, downloading, games, and web surfing. Even with a battery saver app configured properly and screen brightness turned down, it would only last around 6-8 hours of consistent screen time if I was lucky. I now have a 4300 mAh battery and I can have it on for about two days without needing to charge it. Right now it's on 17% battery and it's been on that since last night with no charge, and it's been on a total of 2 days.
They really need to focus on battery life rather than performance.
I agree, nvidia shooting for performance in a low power environment is like eating a cake while you're on a treadmill - you might enjoy it at the time but in the end it's counterproductive.
However, maybe nvidia doesn't sincerely care about power efficiency. After all, they originally wanted a license to x86, and when intel said no, nvidia shot for the next company who would listen to them. I'm sure if it were up to nvidia, they'd have a real powerhouse of a cpu+gpu, but for now that just isn't going to happen.