CPU Performance

The original Note I played with was based on Qualcomm’s APQ8060 platform with MDM9200 baseband (the so-called Fusion 2 platform) and was, for its time, a pretty awesome piece of kit, combining LTE and a dual core SoC. The Note 2 I played with next was based on Samsung’s own Exynos 4412 SoC with quad core Cortex A9 at 1.6 GHz and a Mali-400MP4 GPU. For the Note 3, I’m looking at a T-Mobile variant (SM-N900T if you want to be exact about it), which means it includes a Snapdragon 800 SoC, and Samsung has gone for the 2.3 GHz bin (really 2.265 GHz rounded up). Inside are four Krait 400 CPUs running at up to 2.3 GHz and Adreno 330 graphics at up to 450 MHz, all built on TSMC’s 28nm HPM HKMG process.

I should note that this is MSM8974 and not MSM8974AB, which oddly enough one of Qualcomm’s customers (Xiaomi, with the Mi3) has already announced. The AB bin boosts GPU clocks up to 550 MHz and the LPDDR3 memory interface up to 933 MHz, among a few other changes. I’ve confirmed that GPU clocks on the Note 3 are indeed maxing out at 450 MHz, and quite honestly it’s a bit early for 8974AB in the first place, though it wouldn’t surprise me to see Samsung eventually get that faster bin at some point and put it in something.
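
If you want to sanity check GPU clocks on your own device, the Adreno (kgsl) kernel driver typically exposes them through sysfs, which you can poll from a host machine over adb. A rough sketch of that approach is below; the sysfs paths and units are assumptions that vary by device and kernel, so treat this as an illustration of the method rather than a recipe.

```python
# Rough sketch: poll the Adreno GPU clock over adb while a 3D workload runs.
# Assumes the common kgsl sysfs paths (/sys/class/kgsl/kgsl-3d0/...), which
# vary by device/kernel, and that adb is on the PATH with a device attached.
import subprocess
import time

GPU_CLK = "/sys/class/kgsl/kgsl-3d0/gpuclk"        # current GPU clock (Hz)
GPU_MAX = "/sys/class/kgsl/kgsl-3d0/max_gpuclk"    # max allowed GPU clock (Hz)

def adb_cat(path):
    out = subprocess.run(["adb", "shell", "cat", path],
                         capture_output=True, text=True)
    return out.stdout.strip()

print("reported max GPU clock:", adb_cat(GPU_MAX))

peak = 0
for _ in range(60):                     # sample for roughly a minute
    raw = adb_cat(GPU_CLK)
    clk = int(raw) if raw.isdigit() else 0
    peak = max(peak, clk)
    time.sleep(1)

print("highest clock observed: %.0f MHz" % (peak / 1e6))
```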


I should mention that the Note 3 (like many other Android devices - SGS4, HTC One) detects certain benchmarks and ensures CPU frequencies are running at max while they execute, rather than relying on the benchmark workload to organically drive DVFS to those frequencies. Max supported CPU frequency is never exceeded in this process; the platform simply primes itself for running those tests as soon as they're detected. The impact is likely small since most of these tests should drive CPU frequencies to their max state regardless (at least on the CPU side), but I'm going to make it a point to call out this behavior whenever I see it from now on. Make no mistake, this is cheating plain and simple. It's a stupid cheat that most Android OEMs seem to be ok with and honestly isn't worth the effort.

Update: Of our CPU tests, only AndEBench is affected exclusively by Samsung's optimizations; the performance gain appears to be around 4%. Vellamo is gamed by all of the Snapdragon 800 platforms we have here (ASUS, LG and Samsung). None of this is ok and we want it to stop, but I'm assuming it's not going to. In light of that we're working with all of the benchmark vendors we use to detect and disable any cheats as we find them. We have renamed versions of nearly all of our benchmarks and will have uniquely named versions of all future benchmarks we use. We'll be repopulating our Bench data where appropriate.
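
If you're curious how this behavior shows up from the outside, the simplest check is to log per-core CPU frequencies the moment a benchmark launches and compare against an ordinary app, where DVFS has to ramp up on its own. Below is a minimal host-side sketch of that logging over adb; it assumes the standard Linux cpufreq sysfs nodes and a four core part, and it obviously isn't anything Samsung ships.

```python
# Sketch: log per-core CPU frequencies over adb so you can compare what DVFS
# does right after launching a benchmark vs. a normal app. Standard Linux
# cpufreq sysfs nodes are assumed; core count and paths vary by device.
import subprocess
import time

CORES = range(4)  # Snapdragon 800: four Krait 400 cores (assumption)

def cur_freq(core):
    path = "/sys/devices/system/cpu/cpu%d/cpufreq/scaling_cur_freq" % core
    out = subprocess.run(["adb", "shell", "cat", path],
                         capture_output=True, text=True).stdout.strip()
    return int(out) if out.isdigit() else 0   # offline cores report nothing

print("time(s)  " + "  ".join("cpu%d(MHz)" % c for c in CORES))
start = time.time()
for _ in range(30):                            # ~30 seconds of samples
    freqs = [cur_freq(c) // 1000 for c in CORES]   # scaling_cur_freq is in kHz
    print("%6.1f   " % (time.time() - start) +
          "  ".join("%9d" % f for f in freqs))
    time.sleep(1)
```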

CPU performance is honestly excellent. The Galaxy Note 3 is more or less the fastest Android smartphone we've tested up to this point. In the situations where we can do cross-platform (OS/browser) comparisons, it isn't quite as fast as the iPhone 5s, but in some cases it comes close.

AndEBench - Java

AndEBench - Native

SunSpider Javascript Benchmark 1.0 - Stock Browser

Google Octane Benchmark v1

Mozilla Kraken Benchmark - 1.1

Browsermark 2.0

Vellamo Benchmark - 2.0

GPU Performance

Samsung definitely likes to win, and the Galaxy Note 3 walks away with the GPU performance crown in literally every single offscreen test we've got here. The onscreen tests are obviously governed by display resolution, but all things being equal the Note 3 manages to get the edge over the PowerVR G6430 in Apple's iPhone 5s. It's also interesting to note that the Galaxy Note 3 appears to outperform all other Snapdragon 800 smartphones we've tested thus far. There are a couple of potential explanations here. First, the Galaxy Note 3 is using newer drivers than any of the other S800 platforms we've tested (a quick way to pull these strings from a device is sketched after the list):

Note 3: 04.03.00.125.077
Padfone: 04.02.02.050.116
G2: 04.02.02.050.141
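
Since Adreno embeds its driver build in the GL version string, you can read these over adb by dumping the GLES line the compositor reports. A rough sketch follows; the exact dumpsys SurfaceFlinger output format is an assumption and changes between Android builds.

```python
# Sketch: pull the GLES vendor/renderer/version string (which embeds the
# Adreno driver build on Snapdragon devices). The "GLES:" line in
# `dumpsys SurfaceFlinger` output is an assumption; format varies by Android
# version, so we scan the dump for anything GL-related.
import subprocess

dump = subprocess.run(["adb", "shell", "dumpsys", "SurfaceFlinger"],
                      capture_output=True, text=True).stdout

for line in dump.splitlines():
    if "GLES" in line or "OpenGL" in line:
        print(line.strip())
        break
else:
    print("No GLES line found; dump format differs on this build.")
```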

Secondly, it's unclear how much the manual CPU DVFS override at benchmark launch is influencing things, although I suspect it's significant in the case of something like 3DMark.

Finally, each manufacturer can define its own thermal limits and governor behavior; it could simply be that Samsung is a bit more aggressive on this front. We honestly haven't had enough time to dig into finding out exactly what's going on here (Samsung gave us less than a week to review three devices), but the end result is some incredibly quick scores for the Note 3. If I had to guess I'd assume it's actually a combination of all three factors: newer drivers, pinned CPU frequencies and more lenient thermal limits.
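
For the thermal angle, the generic Linux thermal sysfs interface is usually enough to watch how quickly a device heats up and where the OEM lets it sit during a sustained GPU load. A rough sampling sketch over adb follows; zone names, counts and units (often millidegrees C) are assumptions that differ from device to device.

```python
# Sketch: sample the device's thermal zones over adb while a GPU benchmark
# loops, to compare how aggressively different OEMs throttle. The generic
# Linux thermal sysfs interface is assumed; zone names and units vary widely.
import subprocess
import time

def adb_shell(cmd):
    return subprocess.run(["adb", "shell", cmd],
                          capture_output=True, text=True).stdout.strip()

zones = adb_shell("ls /sys/class/thermal/ | grep thermal_zone").split()

for _ in range(20):                          # ~20 samples, 5 s apart
    temps = []
    for z in zones:
        raw = adb_shell("cat /sys/class/thermal/%s/temp" % z)
        temps.append(raw if raw else "?")
    print(" ".join(temps))
    time.sleep(5)
```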

Update: GFXBench 2.7 isn't affected by any optimizations here, but Basemark X and 3DMark are. We expect the Note 3's performance is inflated by somewhere in the 3 - 10% range. We're working on neutralizing this optimization across our entire suite.

GLBenchmark 2.7 - T-Rex HD

GLBenchmark 2.7 - T-Rex HD (Offscreen 1080p)

GLBenchmark 2.7 - Egypt HD

GLBenchmark 2.7 - Egypt HD (Offscreen 1080p)

3DMark Unlimited - Ice Storm

Basemark X - On Screen

Basemark X - Off Screen

Epic Citadel - Ultra High Quality, 100% Resolution

NAND & USB 3.0 Performance

Our Galaxy Note 3 review sample posted some incredible storage performance results, at least compared to all other Android smartphones we've tested. Sequential read and write performance are both class leading; the latter is nearly 2x better than the next fastest phone we've tested. Random read performance is decent, but it's random write performance that's surprising. Unlike the Moto X, the Galaxy Note 3 doesn't rely on a flash-friendly file system to get great random write performance; this is raw eMMC horsepower (if you can call ~600 IOPS that). The result isn't quite as good as what you get out of the Moto X, but it comes very close. Android 4.3 should bring FSTRIM support to the Galaxy Note 3, so as long as you remember to leave around 20% of your storage as free space, you should enjoy relatively speedy IO regardless of what you do to the phone.
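
To put that random write figure in perspective, IOPS at a 4KB transfer size amount to very little raw bandwidth; a quick back-of-the-envelope conversion (using the ~600 IOPS number above) looks like this:

```python
# Back-of-the-envelope: convert 4KB random write IOPS into MB/s.
iops = 600                      # approximate random write IOPS from the results above
transfer_size_kb = 4            # 4KB random writes
mb_per_s = iops * transfer_size_kb / 1024.0
print("%.1f MB/s" % mb_per_s)   # ~2.3 MB/s of raw random write bandwidth
```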

Sequential Read (256KB) Performance

Sequential Write (256KB) Performance


Random Read (4KB) Performance

Random Write (4KB) Performance

The Galaxy Note 3 ships with USB 3.0; unfortunately, at least in its current state, it doesn't seem to get any benefit from the interface. Although the internal eMMC is capable of being read from at ~100MB/s, sustained transfers from the device over adb averaged around 30MB/s regardless of whether I connected the Note 3 to a USB 2.0 or 3.0 host.
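
For reference, the crude way to arrive at a number like that ~30MB/s figure is simply to time a large adb pull from the device and divide by the file size. A minimal sketch is below; the on-device path is just a placeholder, and you'd want a multi-GB file for a meaningful sustained number.

```python
# Sketch: time an `adb pull` of a large file to estimate sustained
# device-to-host transfer speed. The on-device path is a placeholder;
# use any large file (e.g. a recorded video) for a meaningful result.
import os
import subprocess
import time

REMOTE = "/sdcard/DCIM/Camera/test_clip.mp4"   # placeholder path (assumption)
LOCAL = "pulled_test_file"

start = time.time()
subprocess.run(["adb", "pull", REMOTE, LOCAL], check=True)
elapsed = time.time() - start

size_mb = os.path.getsize(LOCAL) / (1024 * 1024)
print("%.1f MB in %.1f s -> %.1f MB/s" % (size_mb, elapsed, size_mb / elapsed))
```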

Update: USB 3.0 does work on the Note 3, but only when connected to a Windows PC with USB 3.0. Doing so brings up a new USB 3.0 entry in the "USB Computer Connection" picker. Ticking it warns you that using USB 3.0 might interfere with calls and data, but then switches over. Transfer speeds are indeed faster in this mode as well, like you'd expect.


The option only appears on Windows as well; my earlier attempts were on OS X, where this popup never shows up.

Comments

  • darkich - Tuesday, October 1, 2013 - link

    ..according to the just-published analysis on the DisplayMate site, the Note 3 pretty much has THE BEST DISPLAY on the market right now, with excellent readings across the board.
    And the performance!
    Both the processing ability and memory speed are along the lines of a high end ultrabook from just a couple of years ago!

    All in all, gadget of the year imho.
    One that I'll be very happy to spend my money on.
  • repoman27 - Tuesday, October 1, 2013 - link

    "3.0 at present should give you faster transfer rate (it doesn't in practice as you'll soon see), and eventually faster charging, but the Note 3 continues to use Samsung’s 2.0 amp charging spec and rate, but more on that later."

    I thought USB 3.0 only specced up to 900 mA, and Battery Charging 1.2 Specification (which is applicable to both USB 2.0 and 3.0 devices) went up to a max of 1.5 A for "PDs" or portable devices. That would make Sammy's 2.0 A spec proprietary, just like Apple's 2.1 A mode, so they really could have gone wherever they wanted with it up to the 3.0 A safety limit for Micro-B/AB connectors.

    USB Power Delivery Specification 1.0 defines new modes up to 5V @ 2A, or 20V @ 3A for Micro-B/AB, but it requires new detectable cables for > 5 V or 1.5 A.
  • WhitneyLand - Tuesday, October 1, 2013 - link

    Another vote for taking a stronger stance against benchmark manipulation. Yes, AT called it out first, but more can be done. It needs to gain a dedicated subheading in future reviews and ideally dedicated pipeline articles.

    Someone commented this practice is no different than turning off power savings in a PC's BIOS and running a benchmark, but that is a bad analogy. The problems here are:

    1). Benchmarks become incomparable between devices.

    2). The practice is a deliberate move to make benchmarks more artificial while the best benchmarks try to move closer to approximating real world use.

    3). The lack of full disclosure is disingenuous on the part of manufacturers and weakens trust in the industry.

    Call it cheating or not, the results are not good for anyone. Of all the people on the planet who can improve this situation AT may be in the best position to do so. Situations like these are the highest calling of journalism. We think this is important and respectfully ask for your help.
  • dawheat - Tuesday, October 1, 2013 - link

    I think there's a big difference between running your hardware at over 100% in a benchmark and running it at 100%. There needs to be clarity on what exactly Samsung is doing here (unlike the S4).

    If they are truly only running it at 100%, then I have to wonder why the benchmark isn't doing so by default? I'd be concerned then in software variation between devices where the benchmark runs one device closer to 100% than another. There should be validation that a CPU benchmark is stressing the CPUs at 100% across all the devices it's being tested across. If not, you're testing the benchmark software as much as the device, instead of just the device.
  • ddriver - Tuesday, October 1, 2013 - link

    Blame power saving features implementation - if you ask me it makes no difference whether you explicitly hint the CPU to go full frequency for a particular task, or the CPU analyzes load and applies the clocking accordingly. The latter takes time, thus the scores are a little lower, because the CPU doesn't run at full frequency for the duration of the test.

    As I already mentioned, this hack doesn't really make the CPU any better than it is; it just ensures the benchmark is run at the processor's peak capabilities and no performance is lost due to underclocking and adjusting the frequency dynamically.
  • ddriver - Tuesday, October 1, 2013 - link

    Benchmarks are ALREADY incomparable between OS vendors; measuring CPU performance with JS code that runs on fundamentally different JS engine implementations is pointless, to say the least.
  • dawheat - Tuesday, October 1, 2013 - link

    Brian - your brightness results are in line with the DisplayMate review (http://www.displaymate.com/Galaxy_Note3_ShootOut_1... but could you also see if you can replicate the auto-brightness "overdrive", for lack of a better term, in bright ambient light?
  • tanyet - Tuesday, October 1, 2013 - link

    Just wanted to fix the link

    http://www.displaymate.com/Galaxy_Note3_ShootOut_1...
  • tanyet - Tuesday, October 1, 2013 - link

    I may be reading this wrong, but the impression I got from DisplayMate was that the Note 3 was much brighter.

    "Up until the Galaxy Note 3, OLED displays have been somewhat to significantly dimmer than competing LCD displays. The Note 3 has changed that in a big way…it’s an impressive 55 percent brighter than the Note II and a solid 25 percent brighter than the Galaxy S4. For most image content it provides over 400 cd/m2, comparable or higher than most LCD displays in this size class. Even more impressive is that when Automatic Brightness is turned on, the Note 3 hits an incredible 660 cd/m2 in high ambient light, where it’s needed (85 percent brighter than the Note II and 40 percent brighter than the Galaxy S4 with Automatic Brightness) – the brightest mobile display we have ever tested in the Shoot-Out series. An impressive achievement for OLEDs!"
  • Brian Klug - Tuesday, October 1, 2013 - link

    I consider the ambient light boost method almost along the same lines as the benchmark boost: it isn't something that's there all the time, and it's not accessible unless you're in certain circumstances. Other OEMs have done this, and I continue to measure in a dark room with the slider at 100%.

    That said, the fact that AMOLED can go to 600+ nits is old news; even back with the SGS2 you could modify the kernel and drive the panel that high.

    -Brian
