NVIDIA unveils Tegra K1 mobile processor, claims 3x the performance of Apple’s 64-bit A7 processor

“‘We believe the Android operating system will be the most important console operating system in the future,’ Nvidia CEO and co-founder Jen-Hsun Huang said at the company’s CES 2014 press conference tonight. Then he effectively declared war on next-generation console gaming by announcing the Tegra K1, the company’s next mobile chip with a whopping 192-cores,” Jason Evangelho reports for Forbes.

“Prior to the announcement, Jen-Hsun caught up the audience with the company’s Tegra milestones like the Tegra 3 being the first mobile quad-core processor. ‘So what can we do next? We could do 8 cores, but that seems a little pedestrian,’ Jen-Hsun joked. Then he unveiled the 192-core Tegra K1, rooted in Kepler architecture,” Evangelho reports. “According to Nvidia, Tegra K1 offers 3x the performance of Apple’s A7 processor.”

“Multiple presentations showed off photo-realistic environments, dynamic lighting, and next-generation level graphics. From what I’ve seen, they have bridged the gap between next-generation console gaming and mobile gaming,” Evangelho reports. “Tegra K1 will be offered in two pin-to-pin compatible versions: a 32-bit quad-core (4-Plus-1 ARM Cortex-A15 CPU) and a custom, Nvidia-designed 64-bit dual ‘Super Core’ Denver CPU.”

NVIDIA Tegra K1 mobile processor available in 32-bit and 64-bit versions

Read more in the full article here.

MacDailyNews Note: According to NVIDIA’s whitepaper: “The architecture of the Kepler GPU in Tegra K1 is virtually identical to the Kepler GPU architecture used in high-end systems, but also includes a number of optimizations for mobile system usage to conserve power and deliver industry-leading mobile GPU performance. While the highest-end Kepler GPUs in desktop, workstation, and supercomputers include up to 2880 single-precision floating point CUDA cores and consume a few hundred watts of power, the Kepler GPU in Tegra K1 consists of 192 CUDA cores and consumes less than two watts (average power measured on GPU power rail while playing a collection of popular mobile games).”
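As a rough sketch of the efficiency comparison in that excerpt (the 250 W desktop figure below is an assumed stand-in for “a few hundred watts”, not a number from the whitepaper):

```python
# Back-of-the-envelope comparison using the core counts quoted above.
# ASSUMPTION: 250 W stands in for "a few hundred watts"; the whitepaper
# excerpt gives no exact desktop wattage.
desktop_cores, desktop_watts = 2880, 250.0
mobile_cores, mobile_watts = 192, 2.0

core_ratio = desktop_cores / mobile_cores    # desktop has 15x the cores
desktop_cpw = desktop_cores / desktop_watts  # ~11.5 cores per watt
mobile_cpw = mobile_cores / mobile_watts     # 96 cores per watt
print(core_ratio, round(desktop_cpw, 1), mobile_cpw)  # 15.0 11.5 96.0
```

On these (assumed) numbers the mobile part delivers roughly 8x more CUDA cores per watt, which is the whitepaper’s central claim.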

[Thanks to MacDailyNews Reader “Take To Task” for the heads up.]

62 Comments

        1. Freaking posts – this was supposed to be a reply to Bob, who for some reason has 5-star votes on that comment from people who obviously don’t realize the K1 has significantly lower power consumption than the A7. Talk about blind faith. I would think everyone here has an iPhone and is sick of the poor battery performance.

            1. But I do belong. I own Apple products and have every right to speak the truth as much as anyone else here, including you! Everyone who upvoted that comment is obviously unaware that the K1 uses much less power than the A7.
              And based on reactions to the newer story about iPhone 6 thickness, it is quite clear that many iPhone owners are fed up with inadequate battery life.

            2. Nothing poor about my phone’s performance, considering the stuff it’s expected to do. The only times my phone’s battery suffers are when I’ve been taking lots of photos, using mapping software, and forgetting to turn the screen off when I put it back in my pocket; I don’t enable auto-screenlock, so I occasionally suffer.
              I do, however, carry a 10,000 mAh battery pack to charge my phone, pad, camera, and any other accessories, something any intelligent person who uses and relies on modern consumer electronics should be doing.
              Instead of snivelling like a little girl, ivid, perhaps you ought to invest twenty or thirty bucks, and then shut up your whining.

            3. That’s your best? You have to resort to insulting me? Give me a break. No one should have to haul around gear to extend battery life; that’s ridiculous. In fact, that just validates what I am saying.
              Having to charge our phones daily or more is lame. Everyone knows this, and sorry if you can’t handle that truth, but don’t insult me for speaking the same truth that many, many iPhone owners complain about on this very site all the time! Extending battery life is one of the top feature requests every year. I wonder why?
              I speak the truth and get insulted; I certainly didn’t “whine” in any way. What a wonderful example you are.

            1. I apologize if I came off as obnoxious by speaking the truth. We all know the iPhone battery is terrible (the competition’s may be too, but I wouldn’t know). What’s obnoxious to me is people jumping on the K1 and assuming it will have terrible battery life when MDN clearly added a note explaining that it takes a mere 2 watts! WTF is so obnoxious about my statements?
              And calling me stupid really makes you sound immature.

            1. My battery lasts 4 hours when idle. This is because the placement of my desk in my office causes constant switching from 3G to GPRS.

              I have resorted to switching to airplane mode when at my desk.

              Oh well. I hope one day we can find a more power friendly way to activate radios.

            2. I agree airplane mode is the savior. When I’m overseas like I am right now my iPhone is on airplane mode and it lasts several days even with frequent photo & video camera use.

    1. Also, it takes 192 cores to ONLY be 3x faster (in theory) than the dual core A7, which can run an iPhone or iPad for 10 hours of typical “real world” usage. WOW! 😉

      1. How many GPU cores does the A7 have? One? Two? (Too lazy to remember or look up.)

        But agreed… 192 does seem like a lot. But did it say “single-purpose core”? Does this mean one core per GPU instruction? Or just a core per screen grid section?

        192 cores sounds cool… Next up: the iPhone 7 with dedicated NVIDIA GPUs…!

      1. The number of times that’s been claimed, when reality shows something rather different. There’s more to it than sounding impressive; there’s real-life experience. I’m sure this will be pretty good, but talk and bluster are easy.

        1. It refers to using the GPU alongside the main CPU, so the GPU performs standard CPU calculations rather than graphics.
          Here is Nvidia’s explanation:
          GPU-accelerated computing is the use of a graphics processing unit (GPU) together with a CPU to accelerate scientific, engineering, and enterprise applications. Pioneered in 2007 by NVIDIA, GPUs now power energy-efficient datacenters in government labs, universities, enterprises, and small-and-medium businesses around the world.

          http://www.nvidia.ca/object/what-is-gpu-computing.html
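The “GPU computing” model described above is data parallelism: one small operation applied across many elements at once, which is why core counts like 192 matter. A minimal Python sketch, using NumPy’s whole-array operations purely as an analogy for a GPU kernel (this is not Tegra or CUDA code, and the function names are our own):

```python
# Analogy only: NumPy's vectorized operation stands in for a GPU kernel
# that runs the same instruction across many cores simultaneously.
import numpy as np

def saxpy_serial(a, x, y):
    # One element at a time -- how a single CPU core would work.
    return [a * xi + yi for xi, yi in zip(x, y)]

def saxpy_parallel(a, x, y):
    # One whole-array operation -- the shape of work a GPU kernel does,
    # each core handling its own elements.
    return a * np.asarray(x) + np.asarray(y)

x, y = [1.0, 2.0, 3.0], [10.0, 20.0, 30.0]
print(saxpy_serial(2.0, x, y))             # [12.0, 24.0, 36.0]
print(saxpy_parallel(2.0, x, y).tolist())  # [12.0, 24.0, 36.0]
```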

  1. The 64-bit Denver core based chip has been announced to be shipping “in the second half of 2014”.

    Well, guess what? Apple’s A8 chip will be shipping then too.

    Will this then-shipping Nvidia chip be 3x the speed of the A8? No one knows for sure, but it is extremely unlikely. Apple’s A8 could even be faster!

    1. Yes, you always have to suspect the motives, let alone the intelligence, of those who claim superiority for a future chip over one that is already approaching its half-life. Intel did the same thing with its new Atom, which in the end proved inferior to the A7 once it was actually released. Apple could easily boast about future chips, but that isn’t its style.

    1. Most Android users just end up owning one somehow. There is not a lot of thought going into the purchase. There is definitely a small minority of Android users who research and want the latest, greatest specs. Most, however, end up with Android because they failed to do any research.

  2. “We believe the Android operating system will be the most important console operating system in the future.”

    Spoken like a true CEO of a company that does not count Apple as a customer.

  3. Good for them for pushing the edge; we all benefit from that. But will the pedestrian – not worth charging money for – Android OS be able to take full advantage of what chip makers are producing?

  4. And what software changes will allow for the addressing of this many cores?

    First the OS has to enable it. Second, the app has to access it.

    And what about backwards compatibility?
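The commenter’s two-step point (the OS has to expose the cores, and the app has to split its work across them) can be sketched with standard-library Python; `sum_of_squares` and its chunking scheme are illustrative names of our own, not any OS or Android API:

```python
# Sketch: the app must explicitly fan work out across however many
# cores the OS reports; extra cores do nothing for unsplit work.
import os
from concurrent.futures import ProcessPoolExecutor

def partial_sum(chunk):
    return sum(i * i for i in chunk)

def sum_of_squares(n, workers=None):
    workers = workers or os.cpu_count() or 1  # cores the OS exposes
    # Partition 0..n-1 by residue class so each worker gets a chunk.
    chunks = [range(start, n, workers) for start in range(workers)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    print(sum_of_squares(1000))  # 332833500 regardless of core count
```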

  5. People posting reactions here seem to be ignorant of the fact that Nvidia has been making excellent processors for a very long time, far longer than Apple has been making its A chips. I think Nvidia is pretty damn good at it.

    1. Yes, but it has generally been outperformed by the A line, or haven’t you noticed? Otherwise you would no doubt have been on here prattling on about raw, potential performance that, like Samsung’s specs, never translates to the real world. So let’s see, shall we, when it’s actually measurable rather than boardroom in nature.

      1. Perhaps you didn’t read the full story: they are showing it in action at CES, including a demo of something that previously ran on the GTX Titan board with the same performance. Then there are Epic’s comments. This is the first Kepler-based mobile chip, and I know what the Kepler architecture is capable of, so I stand by my comment. Nvidia is damn good at it. I am sure the A8 will leapfrog the K1, but then the K2 will be announced soon after.
        Whatever the case, the weakness of the K1 won’t be Nvidia; it will be Android.
