HTPC Credentials - Display Output Capabilities

The Frost Canyon NUC comes with two distinct display outputs capable of supporting up to three simultaneous displays. The characteristics are summarized in the table below. From an HTPC use-case perspective, the entries of interest include the ability to support UHD (3840 x 2160) or higher resolutions, along with HDCP 2.2. The latter enables the display output to be used for viewing protected content such as 4K Netflix streams and UltraHD Blu-ray playback.

Supporting the display of high-resolution protected video content is a requirement for even a casual HTPC user. In addition, HTPC enthusiasts also want their systems to support refresh rates that either match or are an integral multiple of the frame rate of the video being displayed. Most displays / AVRs are able to transmit the supported refresh rates to the PC using EDID metadata. In some cases, the desired refresh rate might be missing from the list of supported modes, and custom resolutions may need to be added.
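As a rough illustration of what that EDID metadata looks like, the Python sketch below parses the four detailed timing descriptors in a display's 128-byte base EDID block and prints the advertised modes. The sysfs path is only an assumption for a Linux test box (on Windows, tools such as Custom Resolution Utility expose the same bytes); the offsets and field layout follow the EDID base block specification.

    # Minimal sketch: read a display's base EDID block and list the detailed
    # timing modes it advertises. The sysfs path is illustrative (Linux, first
    # HDMI connector); offsets follow the EDID 1.3/1.4 base block layout.

    def parse_dtds(edid: bytes):
        """Yield (width, height, refresh_hz) for each detailed timing descriptor."""
        for off in (54, 72, 90, 108):               # four 18-byte descriptor slots
            d = edid[off:off + 18]
            pclk = int.from_bytes(d[0:2], "little") * 10_000  # pixel clock in Hz
            if pclk == 0:                           # slot holds a text/limits descriptor
                continue
            h_active = d[2] | ((d[4] & 0xF0) << 4)
            h_blank  = d[3] | ((d[4] & 0x0F) << 8)
            v_active = d[5] | ((d[7] & 0xF0) << 4)
            v_blank  = d[6] | ((d[7] & 0x0F) << 8)
            refresh = pclk / ((h_active + h_blank) * (v_active + v_blank))
            yield h_active, v_active, refresh

    with open("/sys/class/drm/card0-HDMI-A-1/edid", "rb") as f:
        edid = f.read(128)                          # base block only

    for w, h, hz in parse_dtds(edid):
        print(f"{w}x{h} @ {hz:.3f} Hz")

A 3840 x 2160 descriptor whose pixel clock works out to 23.976 Hz, for example, is what allows the 23p setting discussed below to be selected without adding a custom resolution.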

Custom Resolutions

Our evaluation of the Frost Canyon NUC as an HTPC was done using the native HDMI output connected to a TCL 55P607 4K HDR TV via a Denon AVR-X3400H AV receiver. We tested various display refresh rates ranging from 23.976 Hz to 59.94 Hz. Of particular interest is the 23.976 Hz (23p) setting, which Intel used to have trouble with in the pre-Broadwell days.

The gallery below presents screenshots for the other refresh rates that were tested. The system has no trouble maintaining a fairly accurate refresh rate throughout the duration of video playback.
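To put "fairly accurate" in perspective, a back-of-the-envelope calculation (our own illustration, not part of the review's measurements) shows why the decimal places matter: the mismatch between the display refresh rate and the 24000/1001 frame rate of film content dictates how often a frame must be dropped or repeated.

    # Sketch: how often a frame drop/repeat occurs for a given mismatch between
    # the display refresh rate and the 24000/1001 fps cadence of film content.
    from fractions import Fraction

    content = Fraction(24000, 1001)        # "23.976 fps", exactly

    for display_hz in (Fraction(24), Fraction(23976, 1000), Fraction(24000, 1001)):
        delta = abs(display_hz - content)  # excess/missing frames per second
        if delta == 0:
            print(f"{float(display_hz):.6f} Hz: exact match, no cadence glitches")
        else:
            print(f"{float(display_hz):.6f} Hz: one frame glitch every "
                  f"{1 / float(delta):,.0f} s")

At a flat 24.000 Hz, a glitch lands roughly every 42 seconds; a rate within 0.000024 Hz of the target pushes the interval past 11 hours, which is why madVR's OSD reports the measured refresh rate to several decimal places.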

High Dynamic Range (HDR) Support

The ability of the system to support HDR output is brought out in the first line of the madVR OSD in the above pictures. The display / desktop was configured to be in HDR mode prior to the gathering of the above screenshots.

The CyberLink Ultra HD Blu-ray Advisor tool confirms that our setup (Intel NUC10i7FNH + Denon AVR-X3400H + TCL 55P607) supports HDCP 2.2 along with HDR. However, despite enabling SGX in the BIOS with a 128MB RAM allocation (and also trying to set it to software-controlled) and confirming its activation with the MS Store app, CyberLink's BD Advisor refused to recognize SGX as enabled. As of the time of posting this review, we couldn't get the Frost Canyon NUC to work for UHD Blu-ray playback. If we have better luck later on, the review will be updated to note the needed fix.

[Update (3/17/2020): While I continue to have terrible luck getting SGX to operate correctly with the CyberLink Ultra HD Blu-ray Advisor tool, Intel sent across proof that the Frost Canyon NUC is indeed capable of playing back Ultra HD Blu-rays.

It is likely that most consumers using the pre-installed Windows 10 Home x64 and pre-installed drivers will have a painless experience, unlike mine, which started with the system as a barebones kit.]

Comments

  • The_Assimilator - Monday, March 2, 2020 - link

    It's not, but the point is still valid: nobody buying these things is doing so because they expect them to be graphics powerhouses.
  • HStewart - Monday, March 2, 2020 - link

    But some people are so naive and don't realize the point. I came up in the days when you purchased cards that didn't even have GPUs on them. Not sure what level iGPUs are at, but they can surely run business graphics fine, and even games from a couple of years ago.
  • notb - Thursday, March 5, 2020 - link

    Horrible?
    These iGPUs can drive 3 screens with maybe 1-2W power draw. Show me another GPU that can do this.

    This is an integrated GPU made for efficient 2D graphics. There's very little potential to make it any better.
  • PaulHoule - Monday, March 2, 2020 - link

    Well, Intel's horrible iGPUs forced Microsoft to walk back the graphical complexity of Windows XP. They kept the GPU-dependent architecture, but had to downgrade to "worse than cell phone" visual quality because Intel kneecapped the graphics performance of the x86 platform. (Maybe you could get something better, but developers can't expect you to have it)
  • HStewart - Monday, March 2, 2020 - link

    I think we need actual proof for these biased statements. I think there is a big difference between running a screen at 27 or more inches and one at 6 to 8 inches, no matter what the resolution.
  • Korguz - Monday, March 2, 2020 - link

    we need proof of your biased statements, and yet you very rarely provide any... point is?
  • Samus - Monday, March 2, 2020 - link

    What does screen size have to do with anything? Intel can't make an iGPU that can drive a 4K panel fluidly, meanwhile mainstream Qualcomm SoCs have GPU performance able to drive 4K panels using a watt of power.
  • HStewart - Tuesday, March 3, 2020 - link

    Can Qualcomm actually drive, say, a 32 in 4K screen efficiently? Also, what is being measured here, videos or actual games? That depends on how they are written.
  • erple2 - Saturday, March 14, 2020 - link

    I'm not sure that I understand your statement here, as it doesn't seem to make any sense. I was not aware that the physical dimensions of the screen mattered at all to the GPU, apart from how many pixels it has to individually manage/draw. If your implication is that the complexity and quantity of information that can be made significant on a 32" screen is different from a 5.7" screen, then I suppose you can make that argument. However, I have to make guesses as to what you meant for this to come to that conclusion.

    Generally, the graphical load to display a 4K resolution is independent of whether the actual screen is 6" or 100". Unless I'm mistaken?
  • PeachNCream - Monday, March 2, 2020 - link

    For once, I agree with HStewart (feels like I've been shot into the Twilight Zone to even type that). To the point, though, Windows XP was released in 2001. Phones in that time period were still using black and white LCD displays. Intel's graphics processors in that time period were the Intel Extreme series built into the motherboard chipset (where they would remain until around 2010, after the release of Windows 7). Sure, those video processors are slow compared to modern cell phones, but nothing a phone could do when XP was in development was anything close to what a bottom-feeder graphics processor could handle. I mean crap, Doom ran (poorly) on a 386 with minimal video hardware, and that was in the early 1990s, whereas phones eight years later still didn't have color screens.
