Battery Life

Battery life remains arguably the single largest differentiator between devices right now, and it's of huge concern to enthusiasts and mainstream shoppers alike. We’ve already caught a glimpse of how well 8974 fares from a power perspective inside the LG G2, a device that posted some seriously impressive battery numbers. The Note 3 we’re looking at is also 8974 based since it’s a T-Mobile model, and thus we expect the same kind of battery life.

With this generation of Note, the battery gets even larger. The original Note started with a then quite large 9.25 watt hour battery, the Note 2 moved to 11.78 watt hours, and the Note 3 now moves to a very large 12.16 watt hour battery, with of course the newer 3.8V chemistry and everything that comes along with it. Display size goes up, but that additional draw is offset by power gains in other places.
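For reference, the watt hour figures follow directly from rated capacity and nominal cell voltage (watt hours = amp hours x nominal volts). A quick sanity check, assuming the commonly quoted capacity ratings for each generation:

# Watt hours = amp hours x nominal cell voltage.
# The mAh ratings below are the commonly quoted ones for each generation and
# are assumptions here, not measurements from this review.
batteries = {
    "Galaxy Note":   (2500, 3.7),
    "Galaxy Note 2": (3100, 3.8),
    "Galaxy Note 3": (3200, 3.8),
}

for name, (mah, volts) in batteries.items():
    print(f"{name}: {mah / 1000 * volts:.2f} Wh")

# -> 9.25 Wh, 11.78 Wh, 12.16 Wh, matching the figures above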

After we talked about the panel self refresh features in the G2, a few people reached out and let me know that this feature has been shipping for a while in some phones, and that it’s easy to check for. If we look under the display subsystem we can see that the same MIPI_CMD_PANEL type 9 is used, which refers to this type of interface.


Qualcomm HWC state:
 MDPVersion=500
 DisplayPanel=9

#define MIPI_CMD_PANEL '9'
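You can check your own device the same way, since the panel type shows up in the Qualcomm hardware composer debug output. A minimal sketch of parsing that output (the dump text is assumed to have been captured already, and type 8 is assumed to be the video mode counterpart of the command mode type 9 shown above):

import re

# Panel types as they appear in the Qualcomm HWC debug dump. Type 9
# (MIPI_CMD_PANEL) is the command mode interface discussed above; type 8
# is assumed here to be the video mode equivalent.
PANEL_TYPES = {
    8: "MIPI video mode panel",
    9: "MIPI command mode panel (panel self refresh capable)",
}

def panel_type(dump_text):
    match = re.search(r"DisplayPanel=(\d+)", dump_text)
    if not match:
        return "unknown (no DisplayPanel line found)"
    return PANEL_TYPES.get(int(match.group(1)), "other panel type " + match.group(1))

# Example using the dump shown above:
print(panel_type("Qualcomm HWC state:\n MDPVersion=500\n DisplayPanel=9\n"))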

Our battery life tests are unchanged and consist of a set of popular webpages loaded on a schedule, with the display set to exactly 200 nits, repeated until the battery runs out and the device dies, on both WiFi and cellular data connections. In this case cellular means T-Mobile LTE, which is 10 MHz FDD in my market. I haven’t had a chance to run the Note 3 on HSPA+ yet, or to complete the call test (which is starting to get ridiculous, and probably breaks 24 hours in the case of the Note 3).
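For the curious, the structure of the test is simple. This is not our actual harness, just a minimal sketch of the idea; the page list, interval, and plain HTTP fetch below are stand-ins for a real browser driving a display calibrated to 200 nits:

import time
import urllib.request

# Hypothetical page list and interval; the real test loads popular pages in
# an actual browser on the device, with the display held at 200 nits.
PAGES = ["https://example.com/news", "https://example.com/sports"]
INTERVAL_SECONDS = 60

def run_until_battery_dies():
    # In practice the loop ends when the device shuts off; the elapsed time
    # at that point is the battery life result.
    while True:
        for url in PAGES:
            try:
                urllib.request.urlopen(url, timeout=30).read()
            except OSError:
                pass  # a failed page load shouldn't end the run
            time.sleep(INTERVAL_SECONDS)

if __name__ == "__main__":
    run_until_battery_dies()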

AT Smartphone Bench 2013: Web Browsing Battery Life (4G LTE)

On LTE the Note 3 does very well, coming in just shy of the pack of iPhones at just over 8 hours. Interestingly enough it’s just north of the G2s as well, which have a smaller battery but also a smaller display. The Note 3 is also the first device to ship with Qualcomm’s QFE1100 envelope tracker from the RF360 front end portfolio, which lowers power consumption by up to 20 percent and heat dissipation by up to 30 percent by allowing the power amplifier supply to follow the envelope of the desired output waveform. There’s more on that later in the cellular section.
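The intuition is that a fixed supply has to sit at the voltage required for the signal’s peaks, and everything between the instantaneous envelope and that rail is dissipated as heat; letting the supply track the envelope reclaims most of that headroom. A toy illustration with made-up numbers, not QFE1100 measurements:

import math

# Toy transmit envelope swinging between 0.2 and 1.0 of full scale.
# All values here are illustrative only.
envelope = [0.6 + 0.4 * math.sin(2 * math.pi * n / 50) for n in range(1000)]

V_MAX = 3.4      # fixed supply sized for the peaks (assumed, volts)
HEADROOM = 0.3   # margin an envelope tracker keeps above the signal (assumed)

fixed_waste = sum(V_MAX - v * V_MAX for v in envelope) / len(envelope)
tracked_waste = HEADROOM

print(f"average wasted headroom, fixed supply:   {fixed_waste:.2f} V")
print(f"average wasted headroom, tracked supply: {tracked_waste:.2f} V")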

AT Smartphone Bench 2013: Web Browsing Battery Life (WiFi)

On WiFi the Note 3 does better by 22 percent, but that’s not the kind of huge jump I’m used to seeing between cellular and WiFi testing. This tells me the Note 3’s battery life is really gated by the display, which is almost always the largest consumer of power in a device. That said, the Note 3 does very well all things considered, especially in comparison to the APQ8064 (Fusion 3) phones that came before it, like the SGS4. The new silicon and new process inside MSM8974 definitely help move battery life forward here in the race to sleep game.
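The race to sleep argument is just energy arithmetic: energy is power multiplied by time, so finishing a burst of work faster at somewhat higher power can still cost less total energy if the SoC then drops to a low idle power for the rest of the interval. A worked example with purely illustrative numbers:

# Illustrative numbers only: a 10 second window containing one burst of work.
WINDOW_S = 10.0
IDLE_POWER_W = 0.05

# Slower SoC: busy at 1.0 W for the entire window.
slow_energy = 1.0 * WINDOW_S

# Faster SoC: busy at 1.8 W but finishes in 4 s, then idles for the remainder.
fast_energy = 1.8 * 4.0 + IDLE_POWER_W * (WINDOW_S - 4.0)

print(f"slow SoC: {slow_energy:.1f} J, fast SoC: {fast_energy:.1f} J")
# -> 10.0 J vs 7.5 J: the faster chip uses less energy despite the higher peak draw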

Charging is an interesting story on the Note 3, primarily because of what doesn’t change. The Note 3 continues to use Samsung’s tablet charging specification and charger, which has a maximum output of 2 amps. Like other Samsung devices, the Note 3 draws 2 amps over a considerable portion of the charging curve (the linear, constant-current part of the curve). USB 3.0 doesn’t change things here quite yet; the new supported charge voltages are coming eventually with the power delivery specification.
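As a rough back of the envelope for why that 2 amp rate matters, assume the charger holds 2 amps through the constant-current portion of the curve and then tapers for the final stretch. The split and taper current below are assumptions for illustration, not measurements:

# Back of the envelope charge time for a 3.2 Ah pack charged at 2 A through
# the constant-current phase. The 80/20 split and the average taper current
# are assumed values, not measured ones.
CAPACITY_AH = 3.2
CC_CURRENT_A = 2.0
CC_FRACTION = 0.8
TAPER_AVG_CURRENT_A = 0.9

cc_hours = CAPACITY_AH * CC_FRACTION / CC_CURRENT_A
cv_hours = CAPACITY_AH * (1 - CC_FRACTION) / TAPER_AVG_CURRENT_A

print(f"estimated charge time: {cc_hours + cv_hours:.1f} hours")
# -> roughly 2 hours with these assumptions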

Device Charge Time - 0 to 100 Percent

The Note 3 does charge faster overall than the SGS4, however, thanks in part to the new PMIC (PM8941) that is part of the overall 8974 platform story.


Comments

  • darkich - Tuesday, October 1, 2013 - link

    ..according to the just published analysis on the DisplayMate site, the Note 3 pretty much has THE BEST DISPLAY on the market right now, with excellent readings across the board.
    And the performance!
    Both the processing ability and memory speed are along the lines of a high end ultrabook from just a couple of years ago!

    All in all, gadget of the year imho.
    One that I'll be very happy to spend my money on.
  • repoman27 - Tuesday, October 1, 2013 - link

    "3.0 at present should give you faster transfer rate (it doesn't in practice as you'll soon see), and eventually faster charging, but the Note 3 continues to use Samsung’s 2.0 amp charging spec and rate, but more on that later."

    I thought USB 3.0 only specced up to 900 mA, and Battery Charging 1.2 Specification (which is applicable to both USB 2.0 and 3.0 devices) went up to a max of 1.5 A for "PDs" or portable devices. That would make Sammy's 2.0 A spec proprietary, just like Apple's 2.1 A mode, so they really could have gone wherever they wanted with it up to the 3.0 A safety limit for Micro-B/AB connectors.

    USB Power Delivery Specification 1.0 defines new modes up to 5V @ 2A, or 20V @ 3A for Micro-B/AB, but it requires new detectable cables for > 5 V or 1.5 A.
  • WhitneyLand - Tuesday, October 1, 2013 - link

    Another vote for taking a stronger stance against benchmark manipulation. Yes, AT called it out first, but more can be done. It needs to gain a dedicated subheading in future reviews and ideally dedicated pipeline articles.

    Someone commented that this practice is no different than turning off power savings in a PC's BIOS and running a benchmark, but that is a bad analogy. The problems here are:

    1). Benchmarks become incomparable between devices.

    2). The practice is a deliberate move to make benchmarks more artificial while the best benchmarks try to move closer to approximating real world use.

    3). The lack of full disclosure is disingenuous on the part of manufacturers and weakens trust in the industry.

    Call it cheating or not, the results are not good for anyone. Of all the people on the planet who can improve this situation AT may be in the best position to do so. Situations like these are the highest calling of journalism. We think this is important and respectfully ask for your help.
  • dawheat - Tuesday, October 1, 2013 - link

    I think there's a big difference between running your hardware at over 100% in a benchmark and running it at 100%. There needs to be clarity on what exactly Samsung is doing here (unlike the S4).

    If they are truly only running it at 100%, then I have to wonder why the benchmark isn't doing so by default. I'd be concerned then about software variation between devices, where the benchmark runs one device closer to 100% than another. There should be validation that a CPU benchmark is stressing the CPUs at 100% across all the devices it's being tested on. If not, you're testing the benchmark software as much as the device, instead of just the device.
  • ddriver - Tuesday, October 1, 2013 - link

    Blame the power saving feature implementation - if you ask me it makes no difference whether you explicitly hint the CPU to go to full frequency for a particular task, or the CPU analyzes the load and adjusts the clock accordingly. The latter takes time, so the scores are a little lower, because the CPU doesn't run at full frequency for the duration of the test.

    As I already mentioned, this hack doesn't really make the CPU any better than it is; it just ensures the benchmark is run at the processor's peak capabilities and no performance is lost to underclocking and dynamic frequency adjustment.
  • ddriver - Tuesday, October 1, 2013 - link

    Benchmarks are ALREADY incomparable between OS vendors; measuring CPU performance with JS code that runs on fundamentally different JS engine implementations is pointless, to say the least.
  • dawheat - Tuesday, October 1, 2013 - link

    Brian - your brightness results are in line with the DisplayMate review (http://www.displaymate.com/Galaxy_Note3_ShootOut_1... but could you also see if you can replicate the auto-brightness "overdrive", for lack of a better term, in bright ambient light?
  • tanyet - Tuesday, October 1, 2013 - link

    Just wanted to fix the link

    http://www.displaymate.com/Galaxy_Note3_ShootOut_1...
  • tanyet - Tuesday, October 1, 2013 - link

    I may be reading this wrong, but the impression I got from DisplayMate was that the Note 3 was much brighter.

    "Up until the Galaxy Note 3, OLED displays have been somewhat to significantly dimmer than competing LCD displays. The Note 3 has changed that in a big way…it’s an impressive 55 percent brighter than the Note II and a solid 25 percent brighter than the Galaxy S4. For most image content it provides over 400 cd/m2, comparable or higher than most LCD displays in this size class. Even more impressive is that when Automatic Brightness is turned on, the Note 3 hits an incredible 660 cd/m2 in high ambient light, where it’s needed (85 percent brighter than the Note II and 40 percent brighter than the Galaxy S4 with Automatic Brightness) – the brightest mobile display we have ever tested in the Shoot-Out series. An impressive achievement for OLEDs!"
  • Brian Klug - Tuesday, October 1, 2013 - link

    I consider the ambient light boost method almost along the same lines as the benchmark boost: it isn't something that's there all the time, and it's not accessible unless you're in certain circumstances. Other OEMs have done this, and I continue to measure in a dark room with the slider at 100%.

    That said, the fact that AMOLED can go to 600+ nits is old news; even back with the SGS2 you could modify the kernel and drive the panel that high.

    -Brian
