4K, HDMI, and Deep Color

As of this writing (January 2014), there are two HDMI specifications that support 4K video (3840×2160, 16:9): HDMI 1.4 and HDMI 2.0. As far as I know, there are currently no HDMI 2.0 capable TVs on the market (though many were announced at CES this week).

Comparing HDMI 1.4 (Black text) and 2.0 (Orange). Source: HDMI 2.0 FAQ on HDMI.org

A detail that tends to be neglected in all this 4K buzz is Chroma Subsampling. If you’ve ever compared an HDMI signal against something else (DVI, VGA) and the quality looked worse, one of the reasons is Chroma Subsampling (for the other reason, see xvYCC at the bottom of this post).

Chroma Subsampling is extremely common. Practically every video you’ve ever watched on a computer or other digital video player uses it, as do most JPEG files (subsampling is optional in JPEG, but widely used). That’s why we GameDevs prefer formats like PNG that don’t subsample. We like our source data raw and pristine. We can ruin it later with subsampling or other forms of texture compression (DXT/S3TC).

In the land of Subsampling, a descriptor like 4:4:4 or 4:2:2 is used. Images are broken up into 4×2 pixel cells, and the descriptor says how much color (chroma) data is kept: 4:4:4 keeps a chroma sample for every pixel (no subsampling at all), 4:2:2 keeps one per 2×1 block of pixels, and 4:2:0 keeps one per 2×2 block. Chroma Subsampling uses the YCbCr color space (sometimes called YCC) as opposed to the standard RGB color space.

Great subsampling diagram from Wikipedia, showing what the different encodings mean

Occasionally the term “4:4:4 RGB” or just “RGB” is used to describe the standard RGB color space. Also note, Component Video cables, though they are colored red, green, and blue, are actually YPbPr encoded (the Analog version of YCbCr).
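To make the descriptors concrete, here’s a minimal sketch of the two halves of the process: converting RGB to YCbCr, then discarding chroma samples for 4:2:0. The function names and the BT.601 full-range constants are my own illustration, not any particular codec’s implementation:

```python
# Minimal chroma subsampling sketch (BT.601 full-range constants).
# Illustrative only; real encoders filter the chroma before decimating.

def rgb_to_ycbcr(r, g, b):
    """Convert one 8bit RGB pixel to full-range YCbCr."""
    y  =  0.299    * r + 0.587    * g + 0.114    * b
    cb = -0.168736 * r - 0.331264 * g + 0.5      * b + 128
    cr =  0.5      * r - 0.418688 * g - 0.081312 * b + 128
    return y, cb, cr

def subsample_chroma_420(cb, cr):
    """4:2:0 -- keep one Cb/Cr pair per 2x2 block of pixels.
    Luma (Y) is untouched; chroma resolution is halved both ways,
    so three full planes of data shrink to one and a half."""
    cb = [row[::2] for row in cb[::2]]  # every other column, every other row
    cr = [row[::2] for row in cr[::2]]
    return cb, cr
```

For 4:2:2 you’d drop every other column but keep every row, halving chroma only horizontally.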

Looking at the first diagram again, we can make a little more sense of it.

Comparing HDMI 1.4 (Black text) and 2.0 (Orange). Source: HDMI 2.0 FAQ on HDMI.org

In other words:

  • HDMI 1.4 supports 8bit RGB, 8bit 4:4:4 YCbCr, and 12bit 4:2:2 YCbCr, all at 24-30 FPS
  • HDMI 2.0 supports RGB and 4:4:4 in all color depths (8bit-16bit) at 24-30 FPS
  • HDMI 2.0 only supports 8bit RGB and 8bit 4:4:4 at 60 FPS
  • All other color depths require Chroma Subsampling at 60 FPS in HDMI 2.0
  • Peter Jackson’s 48 FPS (The Hobbit’s “High Frame Rate” HFR) is notably absent from the spec

Also worth noting, the most widely supported color depths are 8bit and 12bit. 12bit over HDMI is referred to as Deep Color (as opposed to High Color).

The HDMI spec has supported only 4:4:4 and 4:2:2 since HDMI 1.0. As of HDMI 2.0, it also supports 4:2:0, which is available in HDMI 2.0's 60 FPS modes. Blu-ray movies are encoded in 4:2:0, so I’d assume that’s why it was added.

All this video signal butchering raises the question: which is the better trade-off? More color precision per pixel, or more pixels with subsampled color?

I have no idea.

If I were to guess though, because TVs aren’t right in front of your face like a computer monitor, I’d expect 4K 4:2:2 might actually be better: greater luminance precision, with a bit of chroma fringing.

Some AMOLED and LCD screens use something called a PenTile matrix arrangement of their red, green, and blue subpixels.

The AMOLED screen of the Nexus One: a green subpixel for every pixel, but only a blue or a red for every other pixel, switching red/blue order every line. Not all AMOLED screens are PenTile; the Super AMOLED Plus screen found in the PS Vita uses a standard RGB layout.

So even if we wanted more color fidelity per individual pixel, it may not be physically there.

Deep Color

Me, my latest graphics fascination is Deep Color. Deep Color is the marketing name for more than 8 bits per color channel. It isn’t necessarily something we need in asset creation (not me, but some do want full 16bit color channels). But as we start running filters/shaders on our assets, stuff like HDR (but more than that), we end up losing the quality of the original assets as they are re-sampled to fit into an 8bit RGB color space.

This can result in banding, especially in nearly flat color gradients.

From Wikipedia, though it’s possible the banding shown may be exaggerated
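You can see where the bands come from with a toy experiment: quantize a smooth gradient at different bit depths and count how many distinct steps survive. This sketch (the helper name is mine) prints the numbers in the comments:

```python
# Toy banding demo: quantize a smooth 0..1 gradient at different
# bit depths and count the distinct levels that survive.

def quantize(value, bits):
    """Snap a 0.0-1.0 value to the nearest representable level."""
    levels = (1 << bits) - 1
    return round(value * levels)

width = 3840  # one 4K scanline
for bits in (8, 10, 12):
    scanline = [quantize(x / (width - 1), bits) for x in range(width)]
    print(f"{bits}bit: {len(set(scanline))} levels across {width} pixels")
    # 8bit:  256 levels  -> visible bands ~15 pixels wide
    # 10bit: 1024 levels
    # 12bit: 3840 levels -> every pixel gets its own step
```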

Photographers have RAW and HDR file formats for dealing with this stuff. We have Deep Color, in all its 30bit (10 bits per channel), 36bit (12 bits per channel), and 48bit (16 bits per channel) glory. Or really, just 36bit, though 48bit could serve as a RAW format if we wanted.
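In a framebuffer, 30bit color usually travels as three 10bit channels plus 2 bits of alpha packed into one 32bit word, as in the R10G10B10A2-style formats many GPUs expose. A sketch of that packing; the exact channel order varies by API, so treat this layout as an example:

```python
# Packing 30bit Deep Color into a 32bit word (2-10-10-10 layout).
# Channel order differs between APIs; this one is just illustrative.

def pack_r10g10b10a2(r, g, b, a):
    """r, g, b in 0..1023 (10bit each); a in 0..3 (2bit)."""
    assert all(0 <= c <= 1023 for c in (r, g, b)) and 0 <= a <= 3
    return (a << 30) | (b << 20) | (g << 10) | r

print(hex(pack_r10g10b10a2(1023, 512, 0, 3)))  # 0xc00803ff
```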

So the point of this nerding: An ideal display would be 4K, support 12bit RGB or 12bit YCbCr, at 60 FPS.

The thing is, HDMI 2.0 doesn’t support it!

Perhaps that’s fine though. Again, HDMI is a television spec. Most television viewers are watching video, and practically all video is 4:2:0 encoded anyway (which is supported by the HDMI 2.0 spec). The problem is gaming, where our framerates can reach 60FPS.

The HDMI 2.0 spec isn’t up to spec. ;)

Again, this is probably fine. Nobody is really pushing the now-current generation of consoles as 4K machines anyway. Sony may have 4K video playback support, but most high end games are still targeting 1080p and even 720p. 4K is 4x the pixels of 1080p. I suppose it’s an advantage that 4K only supports 30FPS right now, meaning you only need to push 2x the data of a 1080p60 game to be a “4K game”, but still.

HDMI Bandwidth is rated in Gigabits per second.

  • HDMI 1.0->1.2: ~4 Gb/s
  • HDMI 1.3->1.4: ~8 Gb/s
  • HDMI 2.0: ~14 Gb/s (NEW)

Not surprisingly, 4K 8bit 60FPS is ~12 Gb/s of data, and 30FPS is ~6 Gb/s. Our good friend 4K 12bit 60FPS, though, is ~18 Gb/s, well above the limit of HDMI 2.0.
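These figures are just pixels × bits per pixel × frames per second, uncompressed, ignoring blanking intervals and the link’s encoding overhead (and dividing by 1000³, as mentioned below). A quick sketch, with my own helper name:

```python
# Rough uncompressed video bandwidth: width x height x bpp x FPS.
# Ignores blanking intervals and link encoding overhead (8b/10b etc.),
# so treat the results as ballpark figures, not exact link rates.

def bandwidth_gbps(width, height, bits_per_channel, fps):
    bits_per_pixel = bits_per_channel * 3  # R, G, B (or Y, Cb, Cr)
    return width * height * bits_per_pixel * fps / 1000**3

print(bandwidth_gbps(3840, 2160,  8, 60))  # ~11.9 -> fits HDMI 2.0 (~14)
print(bandwidth_gbps(3840, 2160,  8, 30))  # ~6.0  -> fits HDMI 1.3  (~8)
print(bandwidth_gbps(3840, 2160, 12, 60))  # ~17.9 -> too big for HDMI 2.0
print(bandwidth_gbps(1920, 1080,  8, 60))  # ~3.0  -> fits HDMI 1.0  (~4)
```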

To compare, DisplayPort:

  • DisplayPort 1.0 and 1.1: ~8 Gb/s
  • DisplayPort 1.2: ~17 Gb/s
  • DisplayPort 1.3: ~32 Gb/s (NEW)

They’re claiming 8K and 4K@120Hz (FPS) support with the latest standard, but 18 doesn’t divide that well into 32, so somebody has to have their numbers wrong (admittedly I divided mine by 1000, not 1024). Also, since 8K is 4x the resolution of 4K and the bandwidth only roughly doubled, practically speaking DisplayPort 1.3 can only support 8K 8bit 30FPS. And that 4K@120Hz is 4K 8bit 120FPS. Still, if you don’t want 120FPS, that leaves room for 4K 16bit 60FPS, which should be more than needed (12bit would do). I wonder if anybody will support 4K 12bit 90FPS over DisplayPort?
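Running the same toy helper from the HDMI section against DisplayPort 1.3’s ~32 Gb/s ceiling backs this up:

```python
print(bandwidth_gbps(7680, 4320,  8, 30))   # ~23.9 -> 8K 8bit 30FPS fits
print(bandwidth_gbps(7680, 4320,  8, 60))   # ~47.8 -> 8K 8bit 60FPS doesn't
print(bandwidth_gbps(3840, 2160,  8, 120))  # ~23.9 -> 4K 8bit 120FPS fits
print(bandwidth_gbps(3840, 2160, 16, 60))   # ~23.9 -> 4K 16bit 60FPS fits
print(bandwidth_gbps(3840, 2160, 12, 90))   # ~26.9 -> 4K 12bit 90FPS fits too
```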

And that’s 4K.

1080p and 2K Deep Color

Today 1080p is the dominant high resolution: 1920×1080. To the film guys, true 2K is 2048×1080, but there are a wide variety of devices in the same range, such as 2560×1600 and 2560×1440 (4x 720p). These, including 1080p, are often grouped under the label 2K.

1080p 8bit 60FPS requires ~3 Gb/s of bandwidth, well within the range supported by the original HDMI 1.0 spec (though why we even had to deal with 1080i is a good question; probably because early hardware couldn’t even meet the HDMI spec).

To compare, 1080p 12bit 60FPS requires ~4.5 Gb/s of bandwidth. Even 1080p 16bit 60FPS needs only ~6 Gb/s, well within the range supported by HDMI 1.3 (where Deep Color was introduced). Plenty of headroom still. Only when we push 2560×1440 12bit 60FPS (~8 Gb/s) do we hit the limits of HDMI 1.3.

From a specs perspective, I wanted to note this because Deep Color at 1080p is reasonable to support on the now-current generation of game consoles. Even the PlayStation 3, by its specs, supported this. High end games probably didn’t have enough processing to spare for it, but it’s definitely something to consider supporting on PlayStation 4 and Xbox One. As for PC, many current GPUs support Deep Color in full-screen modes. Again, full-screen, not necessarily on your Desktop (i.e. windowed). From what I briefly read, Deep Color is only supported on the Desktop with specialty cards (FirePro, etc.).

One more thing: YCbCr (YCC) and xvYCC

You may have noticed, watching a video file, that the blacks don’t look very black.

Due to a horrible legacy thing (CRT displays), data encoded as YCbCr uses values from 16 to 235 for luma and 16 to 240 for chroma, instead of 0 to 255. That’s quite the loss, roughly 12-14% of the available range, effectively lowering the precision below 8bit. The only reason it’s still done is compatibility with old CRT televisions, which can be really tough to find these days. Regrettably, that does mean both of the original DVD and Blu-ray movie standards were forced to comply with this.
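The squeeze and its inverse are simple linear rescales. Here’s a sketch of the expansion a player or TV performs when it knows a source is limited (“studio”) range; the 219 and 224 divisors come from the standard 16-235 luma and 16-240 chroma ranges, and the function names are mine:

```python
# Expanding limited ("studio") range values back to full range.
# Luma is coded 16-235 (219 steps), chroma 16-240 (224 steps).
# The head/footroom codes are precision you never get back at 8bit.

def luma_to_full(y):
    return max(0, min(255, round((y - 16) * 255 / 219)))

def chroma_to_full(c):
    return max(0, min(255, round((c - 16) * 255 / 224)))

print(luma_to_full(16))   # 0   -> limited-range black
print(luma_to_full(235))  # 255 -> limited-range white
```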

Sony proposed x.v.Color (xvYCC) as a way of finally moving past this limitation of old CRT displays, using the formerly reserved values to encode a wider range of colors. As of HDMI 1.3 (June 2006), xvYCC and Deep Color are part of the HDMI spec.

Several months later (November 2006), the PlayStation 3 launched. So as a rule of thumb, only HDMI devices newer than the PlayStation 3 could potentially support xvYCC. This means televisions, audio receivers, other set top boxes, etc. It’s worth noting that some audio receivers may actually clip video signals to the 16-240 range, ruining the picture quality of an xvYCC source. Also, the PS3 was eventually upgraded to HDMI 1.4 via a software update, but the only 1.4 feature supported is Stereoscopic 3D.

Source: Wikipedia.

The point of bringing this up is to further emphasize the potential for color banding and terrible color reproduction over HDMI. An 8bit RGB framebuffer is potentially being compressed to fit within the YCbCr 16-240 range before it gets sent over HDMI. The PlayStation 3 has a setting for enabling the full color range (I forget the name used), and other new devices probably do too (though it’s unlikely to be named xvYCC).

According to Wikipedia, all of the Deep Color modes supported by HDMI 1.3 are xvYCC, as they should be.

27 Responses to “4K, HDMI, and Deep Color”

  1. My brain hurts but good info. Btw, something may be wrong with the Heart icon at the bottom of the post, couldn’t get it to increment.

  2. Starbeamrainbowlabs says:

    This is probably the wrong place to post, but I can’t find anywhere else….

    Do you know of an email address that I can contact about the ludumdare.com website?
I can’t load the web page at all! It has been like this for at least a week. Before that, the connection was super slow (like ~1KB / sec). My internet connection is usually ~1MB / sec.

    Traceroute:

    Tracing route to ludumdare.com [207.58.176.222]
    over a maximum of 30 hops:

    1 2 ms 4 ms 5 ms x.x.x.x
    2 34 ms 34 ms 34 ms 10.x.x.x
    3 43 ms 35 ms * 10.x.x.x
    4 112 ms 43 ms 43 ms te13-4.br02.ldn01.pccwbtn.net [195.66.236.167]
    5 * * * Request timed out.
    6 * * * Request timed out.
    7 * * * Request timed out.
    8 * 138 ms * vps.ludumdare.com [207.58.176.222]
    9 * * * Request timed out.
    10 * * 132 ms vps.ludumdare.com [207.58.176.222]

    Thank you :)

    • Mike K says:

      Strange. I’m geographically a few thousand miles from our US server (I’m in Canada), and it’s quite speedy for me. It takes about 13 hops for me to get there. Here are the last few:

      10 154.54.80.82 (te4-7.ccr01.iad04.atlas.cogentco.com)
      11 38.104.30.102 (38.104.30.102)
      12 209.50.237.7 (sc-sdv3144.servint.net)
      13 207.58.176.222 (vps.ludumdare.com)

  3. Peter says:

    Any ideas on the impact of Thunderbolt in all this, even though TVs are less likely to adopt it?

    • wrongquestion says:

      Thunderbolt video output is simply DisplayPort, so take a look at the DisplayPort numbers.

    • dr.no says:

      Thunderbolt just passes DisplayPort 1.2 through,
      well within the limit of TB2's 20 Gb/s channel.
      For DisplayPort 1.3, there will need to be a TB3 with double the bandwidth.

      “At the physical level, the bandwidth of Thunderbolt 1 and Thunderbolt 2 are identical, and Thunderbolt 1 cabling is thus compatible with Thunderbolt 2 interfaces. At the logical level, Thunderbolt 2 enables channel aggregation, whereby the two previously separate 10 Gbit/s channels can be combined into a single logical 20 Gbit/s channel.”

  4. Thunderbolt is not a video link says:

    Thunderbolt is not a video link. Thunderbolt is not a video link. Thunderbolt is not a video link.

    Got it? Thunderbolt can carry multiplexed DisplayPort signals, but Thunderbolt is not a video link.

  5. Duckie says:

    “As does the JPEG file format.”

    Subsampling is optional in JPEG; everything supports 4:4:4, 4:2:2, and 4:2:0.

    Anyway, few devices in my experience output 4:2:2 by default (do any even?), since 4:4:4 support is required and it’s easier to support. But, a little known fact is that many TVs subsample to 4:2:2 YCbCr in their internal processing, unless a game or graphics mode is selected.

  6. Petri says:

    “Most television viewers are watching video, and practically all video is 4:2:2 encoded anyway” – Over here in Scandinavia DVD, Blu-ray and all forms of broadcasting (terrestrial, satellite, cable, IPTV) have been 8-bit 4:2:0 since forever. You’re saying someone broadcasts 4:2:2 over there? Hmmmh. I kinda doubt it.

    xvYCC is not the same thing as using the full 0-255 range. You can already enable full range output from various sources (PS3, home computers, media players etc.), and it’s not xvYCC. It encodes additional colors by using both full range *and* negative RGB values. Also, to clarify, Deep Color and xvYCC are not the same thing at all.

    Finally, hate to burst your bubble, but the resolutions and bit depths listed in that HDMI 2.0 graph? Well, they’re not mandatory. Meaning TV manufacturers are allowed to release UHD displays that support only 8-bit 4:2:0 and they can still claim the display as HDMI 2.0 compliant. And since the requirements for an UHD logo include only 8-bit 4:2:0, it’ll be a long while before consumers are offered anything better that’s also moderately priced.

  7. Fantastic, in-depth article. Thank you for writing this. :)

  8. […] Terrifying. See also this article about color gamut over HDMI. […]

  9. Andy says:

    10bit color (1024 values per channel) is adequate for video display color w/r/t removing banding. 12bit is overkill, as the steps are too small for the human eye to notice.

  10. Bruce Dawson says:

    You never actually explain what 4:2:2 and 4:2:0 mean. Also, why do you care if chroma is subsampled? 4K displays are at the limit of the luminance resolution that the eye can see, and the eye’s chroma resolution is *way* lower. That’s why JPEG/video formats subsample chroma. To do otherwise would be inefficient.

    Deep color is good.

    • Bruce Dawson says:

      Oh wait — I missed your diagram. Ignore that part of my reply.

    • Mike K says:

      Also, why do you care if chroma is subsampled?

      Because a “monitor” is for accurate reproduction. Pro-audio speakers are called monitors for the same reason. Monitors are right in your face, unlike televisions, which are across the room. The physical distance makes a difference when it comes to what you can sense.

      There’s a lot of hearsay and marketing double-speak when it comes to this stuff. My so-called 1080p “retina screen” phone: yeah, the >300 DPI resolution is good, but I can totally see the difference when antialiasing is on and off.

      Same thing with Pentile Matrix screens. I may not be able to clearly see the exact visual error, but by golly with the 2 options side by side, I can always tell which one is better.

      We certainly wouldn’t be talking 8K and 16K if 4K was enough (“good enough” doesn’t mean we’re done).
