The AMD Embedded G-Series platform being introduced tonight is the world's first Accelerated Processing Unit (APU) for embedded systems. AMD has a long history of supporting x86-based embedded systems. Starting with the Geode processor in 2003 (obtained from National Semiconductor and used in the OLPC project), AMD went on to introduce AMD64 technology into the embedded market with the AMD Opteron processors in 2005. In 2007, AMD's addition of graphics and other chipset options enabled comprehensive embedded solutions. In 2009, AMD introduced BGA (Ball Grid Array) packaging to meet customer demand.

At CES 2011, AMD gave us a sneak peek at the Embedded G-Series platform based on Brazos. AMD has increased performance and features with every generation while bringing down the power, area, and price barriers for x86 in the embedded market.

The embedded market is dominated by SoCs based on RISC processors such as ARM and MIPS. For most power-sensitive embedded applications, PowerPC and x86-based solutions do not make the cut. x86, in particular, has been the odd one out due to the excessive power consumption of systems based on that architecture. Process shrinks have helped lower power consumption. However, we are still a few nodes away from x86-based solutions being able to truly compete with RISC-based solutions on the power front.

In the meantime, solutions like the one AMD is introducing today integrate premium graphics capabilities within power envelopes similar to what x86 CPUs alone consumed in the previous generation: you get a CPU and a GPU instead of just a CPU. RISC-based embedded solutions may still win on the power front; however, for applications where slightly higher power consumption is not a concern, the AMD Embedded G-Series platform poses a credible x86 threat. MIPS is usually popular in such applications (set-top boxes, digital signage, etc.), and those vendors will face credible opposition from AMD's integrated graphics capabilities.

The AMD Embedded G-Series

  • nitrousoxide - Wednesday, January 19, 2011 - link

    Such stupid typo >.<
  • MeanBruce - Wednesday, January 19, 2011 - link

    Happens to all of us ;)
  • Vesku - Wednesday, January 19, 2011 - link

    Surprised at all the comments offended that this isn't written as a review. This is a news item regarding a press release; of course it's going to read differently than a review. The commentary is sound: these products are being introduced to compete with the Atom embedded product line, with a bit of a question mark over how much market share this higher-performing two-chip AMD solution will grab.
  • jido - Thursday, January 20, 2011 - link

    Is that the kind of processor we can find in a digicam or is it too power-hungry for that?
  • vol7ron - Thursday, January 20, 2011 - link

    I think it'd be overly power hungry, slow, and powerful.

    I think digicams use digital signal processors, which are generally smaller and have much smaller instruction sets, definitely not x86. I think Canon makes its own (DIGIC), but there are other third-party vendors. Again, I think TI chips would be useful in those as well.
  • silverblue - Thursday, January 20, 2011 - link

    And all you're doing here is hijacking AMD articles without providing a counterpoint. Please refrain from making comments unless you can contribute to the discussion.

    I must say that "AMDiot" doesn't roll off the tongue very well. :/ You need to work out a better one!
  • yyrkoon - Thursday, January 20, 2011 - link

    Do not despair in the face of nay-sayers Ganesh.

    All I have to say is that *if* someone does not like the content. . . then they do not need to read it. Personally, I could have got by with a simple data sheet. But you know what? I did not write the article! So I take the content as it was given. Seriously, I doubt 1/10th of the comments on this article were written by people who even understand what embedded truly means. Having delusions of grandeur that they will be running COD4 on this platform someday . . .

    From a point-of-sale standpoint, however, I am not completely convinced this is necessary. Unless you think a cashier would serve us better by staring at a pretty Aero desktop instead of checking the customer out at the check stand. Perhaps self checkout (but that really is just a kiosk). Medical imaging . . . again, not completely convinced, for multiple reasons. A handheld battery-powered gaming platform? Yes, assuming power consumption could be kept low enough. Thin client? Yes. Kiosk? Yes.

    With all of the above said, I am still very sad for the industry. It is very obvious to me that the technology companies are having a hard time thinking outside of the box. Otherwise we might have desktop systems that use laptop technology, instead of requiring a small nuclear power plant to power a moderately sized LAN party. We could have very powerful desktops that sip power. In this person's eyes, the technology companies are going about things wrong.

    Perhaps it would behoove AMD to partner with ARM if they truly care about power consumption in the embedded market.
  • vol7ron - Thursday, January 20, 2011 - link

    I liked the article, but your point "All I have to say is that *if* someone does not like the content. . . then they do not need to read it." does not make sense. How are you to know what the article is or sounds like unless you've already read it?

    "Otherwise we might have desktop systems that use laptop technology."
    Ummm... why? (1)You could just use your laptop (2) There are already motherboards available for desktops that can use laptop procs. What laptop technology are you referring to?

    "We could have very powerful desktops that sip power."
    That's the way things are headed. Why not just underclock your components if you're that cautious?
  • yyrkoon - Thursday, January 20, 2011 - link

    Key words; Flexibility, and cost..

    Example scenario:

    A user needs a certain amount of CPU/GPU performance to perform certain computational tasks. This person may, or may not need a huge screen to perform these tasks. Yet still may need enough performance to get the desired results in a timely fashion.

    Why must this user pay a ridiculous power bill on top of having to buy the expensive, and possibly proprietary, hardware? A laptop GPU, for a very simple example, can use far less power and offer 50%-75% of the performance. At all but the highest resolutions, that performance difference will pale in comparison to the power used to achieve the end goal.

    To be sure, gaming as an application could fit real well into this category. But gaming is not the only application that can make very good use of this type of hardware. Image manipulation, video/audio editing/encoding, medium-duty servers, development systems, multi-function larger embedded systems, etc. All of these could greatly benefit on multiple levels from using this type of hardware (and the list goes on).

    1) Laptops -> notoriously inflexible.
    2) Motherboards you suggest -> also inflexible. Unless, perhaps, the end user is not forced into using a desktop-class power hog of a graphics card. Perhaps single or multiple MXM card slots could be implemented.

    Again, the technology is already there. Why not use it ? Reinventing the wheel in the shape of a triangle just does not seem to be working well for the industry . . .
  • yyrkoon - Thursday, January 20, 2011 - link

    Also . . .

    "I liked the article, but your point "All I have to say is that *if* someone does not like the content. . . then they do not need to read it." does not make sense. How are you to know what the article is or sounds like unless you've already read it?"

    It makes perfect sense. Unless you're the type who needs to be handheld every step of your life. I am sure you have more than 3 brain cells to rub together . . . just be careful you do not start a fire.

    Simply put, in terms perhaps you can now understand: *You* did not pay to view the content. *You* did not write the content. If *you* do not like it, that is your problem. Keep it to yourself. If it is really that much of a problem for *you*, email the author and complain to him in private. You could even be constructive, and give him a step-by-step why *you* do not like it.

    Now, because I feel you won't get this, since I have had to hold your hand a couple of times already: "*You*" meaning anyone who feels compelled to complain about the article.

    Apply said "idea" to my post while you're at it.
