
  • Drumsticks - Tuesday, May 5, 2015 - link

    It's a shame to see competition shut down, although I'm not sure that Nvidia ever really competed here. If I remember correctly, the 4i sucked because it used A9 cores in a market full of A15 and great Krait cores. We'll never really know how it would have fared with a more modern core to stand up better to the competition.
  • mczak - Tuesday, May 5, 2015 - link

    Not to mention the GPU sucked too. It was quite an outdated design at that time, missing heavily on features (needed or not, these are selling points) and not really competitive on performance either. The CPU cores (the late Cortex-A9 revisions were definitely better than earlier ones) were actually more competitive than the GPU: on the CPU side it was competitive with Snapdragon 600, but on the GPU side it was not...
    That said, as a lower-end chip there wasn't much wrong with it (even today it still easily beats the likes of Snapdragon 400/410 on both the CPU and GPU front, though obviously the GPU features are still missing). But I guess their cost structure wasn't really meant to compete with that or the Mediatek stuff...
  • eiriklf - Wednesday, May 6, 2015 - link

    Tegra 4i was targeted at a mid-range market, below the Krait and Cortex-A15 devices.
  • lilmoe - Wednesday, May 6, 2015 - link

    The 4i is what they should have focused on and sold dirt cheap. This is where I find NVidia's strategy problematic:

    1) They focused too much on the high end without having a guaranteed outlet. Qualcomm had their built-in modems and huge customer base, and Samsung had their Galaxy lineup (50%+ of Android at one point). NVidia had neither. YOU DO NOT mess with the big boys head-on unless you know what you're doing. They should have competed primarily in the mid-range against Mediatek and HiSilicon, while still using their superior GPUs AND soft modems for more appeal to OEMs. In silicon, VOLUME is the name of the game.

    2) Even for the high end, their SoC designs either flopped outright or didn't deliver any real value to OEMs whatsoever. They were always late, and support was horrible.

    3) They tried to make proprietary what should have been open. This depended heavily on Google favoring their tech, but they were instead given the finger, for good reason.

    4) Instead of suing, they should have gotten themselves a good deal from Samsung to use their fabs, or made a cross-licensing deal for their GPUs. Exynos + Maxwell at 14nm would have been drool-worthy.

    Too many mistakes. NVidia has too many enemies, and they're making more. I say it's time for a management shuffle.
  • TheJian - Thursday, May 7, 2015 - link

    LOL. The suit is because Samsung wouldn't license the tech they're currently stealing from NV (along with EVERY other SoC/mobile device maker). NV tried for 2 years with Samsung, and Samsung is just continuing with willful infringement. That's why they will likely get stung for far more than Intel (which wasn't willful), and don't forget this is a foreign company screwing an American company, which is why NV chose a JURY trial. Samsung will settle after the Markman hearing, and probably make some deal over fabs, modem use, and possibly a requirement that Samsung use NV IP for X years or some such crap. Or maybe Samsung just pays a huge fine (doubtful; they'll want to deal on fabs, modems, memory for NV's own devices, etc.).

    Anandtech called it the wild west of patent infringement for a reason, and the Markman hearing shows who is about to clean house. All of these guys will be paying NV for a decade or more in some way, shape, or form (they've been infringing on the 2002-2015 patents, and will keep infringing on whatever comes in Pascal, Volta, etc., for decades). At this point we're just waiting to see where the BILLS fall (likely mostly at device makers' feet, I'd say, then some to chip makers, etc., as the piece that infringes most is what's sold into an end user's hands). The 7 patents NV chose to assert were from 1999-2002 or so, and these companies will be infringing on everything created after then as well (probably on many of AMD's patents too, but AMD will wisely wait for NV to do the heavy lifting). You can't really make a GPU today without treading on NV/AMD/Intel (probably in that order).

    I see no mistakes here; they knew exactly what they were doing. The first 4-5 revs of Tegra were just to gain brand recognition etc., waiting for NOW, when the GPU is becoming king as gaming amps up, the 64-bit OS gets polish, etc., and they're able to get DESKTOP GPUs into mobile. Now that mobile is graduating to pretty much playing PC games (albeit turned down a bit), they are all "cheating" as you say, and will pay just like INTEL ($1.5B). The console/handheld rev 2 (coming soon, says an FCC filing) etc. will push gaming further on Android (Google is working to help gaming along massively too), so the GPU only gets more important. Not sure what you mean by Google gave NV the finger; Tegra is in the Nexus 9, and I suspect whatever they make for November will be a 14nm Samsung-produced Tegra. Google will want a top gaming GPU and NV's experience for Android 6, no doubt.

    The whole point of Denver and games amping up on mobile is to push into x86 notebook/PC turf at some point. The console is a start in that direction; next stop, a full PC: 500W PSU, 16-32GB, SSD/HD, discrete NV, etc. Meaning, an ARM box that is just as capable as a full-fledged WINTEL PC, with or without a discrete GPU from NV (low end running amped-up SoCs, higher end using discrete). You just have to wait a bit longer for Unreal 4/Unity 5 etc. games to come out and push the need for a full box (then the apps port over eventually, to some degree). NV isn't alone in this either, but they have the best hand, owning all the GPU tech vs. Qcom/Samsung/Apple/IMG.L/Mediatek etc. Autos etc. will pay the bills on Tegra soon if the consoles/handhelds/GRID stuff doesn't, until they get to the point where everyone is paying up. An example of what is about to happen is MS getting $4-15 for every Android device sold, due to some code Android infringes. Microsoft makes billions from that, far more than they make themselves in mobile…LOL. Qcom gets a percentage of the FULL DEVICE for usage of their tech. NV has multiple examples of deals to put before a court if people desire to push them that far and not settle. It will get ugly for all. Your device doesn't do much without a GPU to put something on screen. Samsung themselves paid MS $1.041B in the first year of that agreement (far larger now, no doubt). Samsung was late paying and got sued…LOL. Why do you think Samsung is constantly trying to make their own OS? This is a big part of it, but there's no way to make your own GPU without infringing, at least not for the foreseeable future.

    Enemies? Who cares? If you have what everyone REQUIRES, they pay up (or possibly get banned from USA sales). What are you going to do with gaming being ~75% of app store sales, go back to a GPU that can't play any games? Once NV wins (I think Samsung settles soon now, long before it gets in front of 12 Americans, probably after the next phase of the court case in June/July), Samsung will either deal or be banned from making GPUs in their chips. How does that work out for Galaxy etc.? Apple is probably already dealing now after the Markman hearing too, as IMG.L chips are in Samsung devices and named as infringing. So if they're illegal in Samsung devices, they are in Apple's too ;) If you have Qcom/Samsung/Apple covered, you're pretty much guaranteed everyone else will get the point or just quit. Apple/Samsung hate each other, but that doesn't stop business when it benefits both. Everyone wants to sell in the USA, so you'll deal. A case in Europe would surely follow at some point if required. How does Qcom's huge customer base react to a banned GPU? ;) You need to start thinking LONG, not just today. This war is just beginning, and the modem is losing its luster as the GPU takes over.

    Everyone is moving into NV/AMD territory now as games kick it up a notch (I hope AMD gets in before NV rules it all). All who have come against these two have failed, including Intel (more than once), Imagination, SGI, Matrox, 3Dlabs, etc. (Google: list of defunct GPU companies). They have ruled gaming for almost 2 decades. Devs don't have to learn anything new to optimize for these two, as they've been doing it for years, and both AMD/NV fully understand working with game devs and making great drivers. The rest have to learn all this stuff, and have to do something just to get legal first. There really is nothing that says NV must license their tech to anyone (these aren't FRAND patents or something; it's proprietary GPU tech). Once you're found guilty, you either pay a price for each device, or you can't make any more devices without making a chip that does GPU stuff radically differently and requires devs to learn new ways to make games (try to get devs to go for that; ask Intel, it didn't work out). This is like trying to get MIPS to take out ARM…LOL. The modem isn't important now, hence the sale. They'll get it from Samsung/Qcom in deals, and with shrinks/better power management now, the power cost of an external modem won't mean much going forward (and they may be able to integrate the tech from either, depending on deals). I think most users would rather play GREAT GAMES than hit their data cap even faster than they do now…LOL. The modem ruled the last decade+; now it's the GPU's turn. The only way that changes is unlimited data for all, CHEAP, and that just won't happen, unfortunately for Qcom.
  • Samus - Wednesday, May 6, 2015 - link

    It took hundreds of millions of dollars in R&D for Intel to put 3G in an SoC. It's amazing that NVidia, with their more limited manufacturing and talent reach, put LTE in an SoC for a fraction of the price, after buying an entire company for less than Intel spent.

    It's too bad they're throwing in the towel. But the real nail in the coffin was Tegra, not the i500. And with everything going integrated, selling the discrete i500 obviously wasn't a hot prospect.

    If this weren't NVidia, AMD would be a natural buyer, since they haven't even started (as far as I know) integrating LTE onto an SoC yet. This would give them a huge head start. And because this is NVidia, I doubt Intel will ever give them an x86 license, not that they'd want it.

    Quite a pickle.
  • Yojimbo - Wednesday, May 6, 2015 - link

    No, it was the lack of good LTE, and Qualcomm using licensing deals around their modem IP to gain an advantage, that did in NVIDIA's efforts, not Tegra. Tegra looked good; it just didn't get many design wins.
  • testbug00 - Wednesday, May 6, 2015 - link

    It didn't get many design wins because it didn't look good once you factored in that OEMs got burned on Tegra 2 and 3. The only Tegra that seems compelling to me for phones is Tegra 3, for a few months before the 28nm transition fully hit. Even at that point, I would personally take 2 higher-clocked A9s over Tegra 3. But that doesn't mean Tegra 3 didn't have its place.

    Every single Tegra after 3 just couldn't fit in phones. It would have been a show exactly like what the SD810 is getting now. Too damn hot (for different reasons than the 810's problems).

    Now, for Chromebooks/WART/mini desktops/etc., Tegra has looked amazing since Tegra 4.
  • Samus - Wednesday, May 6, 2015 - link

    Seriously. What would you rather have in a low-end tablet or Chromebook, a Baytrail Celeron or a Tegra "anything"?

    Tegra is up against two brick walls. It isn't the best ARM solution, especially given the price, and it lacks x86 compatibility to compete with Intel.

    It could have the best integrated baseband modem in the world and it still wouldn't sell.
  • testbug00 - Wednesday, May 6, 2015 - link

    Going by the supposed selling points of the chips, I would say the Tegra. Going by contra-revenue and/or Intel selling chips at a loss, I will take Baytrail.
  • npaladin2000 - Tuesday, May 5, 2015 - link

    It's kinda too bad, then again the baseband they implemented in the Tegra Note, ostensibly Icera-based, really sucked compared to what I'd seen from Qualcomm and what I'm currently seeing from Samsung. It's not just that NVIDIA doesn't need it anymore, it's that it's not competitive anymore.
  • Frenetic Pony - Tuesday, May 5, 2015 - link

    Which is why it's being shut down... I've noticed Nvidia is good at GPUs and hype, but so far not much else. Even its Denver cores, which seemed promising at least, have disappeared with no word on when they'll come back.
  • D. Lister - Wednesday, May 6, 2015 - link

    Their HPC side is doing quite well too. Also :), there's nothing wrong with being good at generating hype - as long as you deliver on that hype consistently enough.
  • testbug00 - Wednesday, May 6, 2015 - link

    The HPC side of Nvidia involves how many Tegra chips again? Oh, right, ZERO. At least, zero publicly announced. And I would assume that means zero total.
  • Samus - Wednesday, May 6, 2015 - link

    Yes, Tegra, hands down, sucks when you look at the big picture. Exynos and Snapdragon have more competitive fully integrated offerings, and then you've got Intel coming in at full speed with an LTE x86 SoC later this year.
  • Yojimbo - Wednesday, May 6, 2015 - link

    NVIDIA did very well with chipsets until that segment dried up for third-party designers. Their Shield line has done reasonably well (using their Tegra), and they have been successful in automotive (again using Tegra). Was the NVIDIA Shield tablet terrible in some way? No, it got good reviews and has sold pretty well. They simply can't compete in any device whose primary use demands a cellular modem. Then it's a matter of getting OEMs to use their Tegras in tablets, even though Qualcomm is supplying chips for most of the OEMs' lineups and is trying hard not to let NVIDIA get a foot in the door.

    But most importantly, NVIDIA's major strength doesn't even lie with their GPUs per se; it lies with the software surrounding the GPUs. That's what allows them to be successful in PC graphics (drivers), HPC (CUDA), automotive, and GPU virtualization. So saying that NVIDIA is only good at GPUs doesn't even capture their core competencies.
  • gpumaven - Wednesday, May 6, 2015 - link

    What were the problems with the Tegra Note baseband?
  • D. Lister - Wednesday, May 6, 2015 - link

    It just seems so ruthless, how the livelihoods of hundreds of people can be written off so casually. I hope those guys received enough severance to keep themselves comfortable until they find a new job.
  • jmunjr - Wednesday, May 6, 2015 - link

    Really? Would it be ruthless on their part if they quit? Of course not. The fact is Nvidia could no longer afford to pay them a lot of money for pretty much no value in return. They're lucky they've been getting paid at all. I find it hilarious that employees need severance to survive. I've always had enough saved as an emergency fund to last about six months of bills. In my (and Nvidia's) field, if one can't get a job in six months, they should be worried about a lot more than severance.
  • D. Lister - Wednesday, May 6, 2015 - link

    Well now, your exponentially manlier outlook on temporary unemployment has made the futility of common courtesy on the internet glaringly obvious. Oh wait, it hasn't, because it isn't.
  • Murloc - Wednesday, May 6, 2015 - link

    In Europe you're usually insured against joblessness under various systems, depending on the country, so the need for an emergency fund is much more limited.
  • psychobriggsy - Wednesday, May 6, 2015 - link

    I'm sure these employees will not have much trouble finding new jobs if Icera doesn't get bought by another company. There's a good chance the company will get bought, but, as the article says, for a lot less than Nvidia paid - there are a lot of SoC designers out there. LG could be a candidate; they like buying cast-off technology (webOS, etc.), and they are working on their own SoC designs.
  • jjj - Wednesday, May 6, 2015 - link

    So they are out of phones for real, and that's no fun (for us).
    Long term this is tricky: as the form factor shifts, likely to glasses, they would be out of that too, and that's not ideal since computer vision is one of their things.
    Ofc it's hard to say where connectivity will be in 5-10 years, and whether not having a modem would block their access to the new form factors. Even if a complex integrated modem is still needed, ARM and the like could be licensing such IP by then.
    More competition would have been nice in mobile, and their future in consumer CE isn't too bright right now, so it makes you wonder about their trajectory over the next 10 years.
    I do wonder if they'll cooperate with a major player on LTE, or maybe invest in someone like Altair Semiconductor (like Sandisk did recently).
  • testbug00 - Wednesday, May 6, 2015 - link

    How is it no fun? They never had a compelling phone product. The best they had was a small gap during the 40nm --> 28nm transition when they had Tegra 3. Even that wasn't really compelling. Even in tablets their product isn't that compelling, going by the numbers in general. Or they asked too much for it; I don't know if Nvidia was aiming for the same kind of premiums it generally gets in the consumer GPU market.

    In the phone space, we didn't even lose a competitor. In the tablet space, perhaps. However, with Samsung's new efforts along with Mediatek, Huawei, Rockchip, and more, I don't think losing NVidia's chips there is that bad.

    Now, if this means they're pulling out of CPUs altogether, I would be sad. I want to see what they can do with their custom CPU core.
  • sonicmerlin - Monday, May 11, 2015 - link

    I think their tablet GPUs are pretty awesome. The Denver CPU performed well on benchmarks, even if performance was inconsistent in the Nexus 9. But that may be Lollipop's fault.
