HDR10+: What You Should Know

HDR10+ is one of several HDR formats available in the TV landscape, but is it the best? Find out how HDR10+ compares to HDR10 and Dolby Vision.

Category: TV & Displays

If you’ve purchased a new TV recently or follow the industry in any capacity, you’ve probably come across the term HDR10+.

Introduced by Samsung and Amazon Video in April 2017, HDR10+ is a video format that builds on the HDR10 standard by adding dynamic metadata, allowing compatible displays to optimize brightness and color scene by scene. But in a market already crowded with HDR standards (including Dolby Vision and the baseline HDR10), do you need a TV that supports HDR10+?

This article will help you understand how HDR10+ works and how it differs from other HDR standards.

What Is HDR10+?

Before getting into HDR10+, it’s essential to understand what HDR is.

High dynamic range (HDR) is a video technology used across the television industry to produce video and still images with improved brightness, contrast, and color accuracy. Any HDR-capable TV is compatible with one or more HDR formats.

HDR10 is the most widely adopted format: an open, royalty-free standard that uses static metadata to tell your TV how to display the content’s brightness, contrast, and color. Because static metadata describes an entire piece of content (a movie, for instance) with a single set of values pegged to its brightest moments, tone mapping is fixed for the whole runtime, and HDR10 can deliver a disappointing viewing experience on less capable HDR TVs.

By contrast, HDR formats like Dolby Vision and HDR10+ use dynamic metadata, which adjusts tone mapping on a scene-by-scene basis, significantly improving the overall viewing experience. Dolby Vision and HDR10+ are competing HDR formats, each with its pros and cons.
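To make the difference concrete, here’s a minimal sketch of the idea (the tone-mapping function, scene peaks, and display peak are all made-up illustrations, not any vendor’s actual algorithm):

```python
DISPLAY_PEAK = 500.0  # nits -- a plausible peak for a mid-range HDR TV (assumed)

def tone_map(pixel_nits: float, content_peak: float) -> float:
    """Naive linear compression from the mastered peak down to the display peak."""
    return pixel_nits * min(1.0, DISPLAY_PEAK / content_peak)

scene_peaks = [200.0, 1000.0, 4000.0]  # hypothetical per-scene peak brightness

# Static metadata (HDR10): one peak value covers the whole movie, so the dim
# 200-nit scene is compressed as aggressively as the 4,000-nit one.
movie_peak = max(scene_peaks)
print([tone_map(p, movie_peak) for p in scene_peaks])  # [25.0, 125.0, 500.0]

# Dynamic metadata (HDR10+/Dolby Vision): each scene carries its own peak, so
# scenes the display can already show at full brightness are left untouched.
print([tone_map(p, p) for p in scene_peaks])           # [200.0, 500.0, 500.0]
```

With static metadata, the dim scene is crushed to 25 nits; with per-scene metadata, it displays at its full 200 nits.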

Dolby Vision is slightly more powerful thanks to its 12-bit color depth (compared to HDR10+’s 10-bit color) and is currently supported by a wider array of TV makers and content platforms. However, HDR10+ has the advantage of being an open standard, meaning content makers don’t need to pay a licensing fee to use it, as they do with Dolby Vision.
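That bit-depth gap is easy to quantify: each extra bit doubles the number of shades available per color channel. A quick illustration:

```python
# Shades per channel at each bit depth, and total colors (one factor per
# R/G/B channel). 10-bit covers HDR10/HDR10+; 12-bit covers Dolby Vision.
for bits in (8, 10, 12):
    levels = 2 ** bits
    print(f"{bits}-bit: {levels:,} shades per channel, {levels ** 3:,} colors")

# 8-bit:  256 shades per channel,   16,777,216 colors
# 10-bit: 1,024 shades per channel, 1,073,741,824 colors
# 12-bit: 4,096 shades per channel, 68,719,476,736 colors
```

In practice, few consumer panels are true 12-bit displays today, which is one reason the two formats look closer on real TVs than they do on paper.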

Is HDR10+ Better Than HDR?

HDR10+ is essentially a delivery method for HDR rather than a competing technology. The more important question is whether HDR10+ is better than HDR10, and in terms of overall quality the answer is yes. HDR10 is still considered the default HDR format due to its near-universal adoption, but as covered above, its use of static metadata makes it a less-than-ideal format.

HDR10+ delivers the same brightness level (maximum of 4,000 cd/m²) as HDR10, but its use of dynamic metadata provides enhanced HDR images. As more TV makers and content platforms adopt it, HDR10+ will likely replace HDR10 as the default HDR format.

Is HDR10+ Better Than Dolby Vision?

It’s easy to get the impression that the rivalry between HDR10+ and Dolby Vision is akin to the format war between Blu-ray and HD DVD in the mid-2000s. However, this rivalry is much more consumer-friendly, as TV and content makers can support multiple HDR formats simultaneously. But just because HDR formats aren’t mutually exclusive doesn’t mean there aren’t notable differences between them.

Although HDR10+ has the advantage of being royalty-free, Dolby Vision is generally regarded as the better format due to its improved brightness and color. That said, you’ll get incredible picture quality from either one.

What Equipment Do I Need to Get HDR10+?

To take full advantage of HDR10+, you’ll need the following equipment:

  • A TV that is HDR10+ compatible.
  • A device that can read HDR10+ encoded material, such as a media streamer or Blu-ray player.
  • An HDR10+ compatible video source, such as a UHD Blu-ray movie or streaming service (Amazon Prime Video supports HDR10, HDR10+, and Dolby Vision).
  • An HDMI connection that supports HDR10+. The format’s dynamic metadata adds little extra bandwidth, but the signaling for dynamic HDR metadata was formalized in the HDMI 2.1 specification, so older HDMI 2.0 cables and ports may not carry it.

What Is the Difference Between 4K and HDR10+?

Although both 4K and HDR improve image quality, they do so differently and are not competitors.

4K refers to screen resolution, corresponding to the number of pixels that fit on a TV screen or display. Used synonymously with Ultra HD (UHD), 4K represents a horizontal resolution of approximately 4,000 pixels (3,840 × 2,160 on consumer TVs).

As outlined above, HDR refers to contrast and color range on a TV screen or display. An HDR image has a higher contrast and brightness range than Standard Dynamic Range (SDR) and is frequently combined with a 4K resolution to deliver striking picture quality.
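A quick back-of-the-envelope calculation shows what the jump to 4K means in raw pixels (standard consumer resolutions; the snippet is just for illustration):

```python
# Pixel counts for the two most common consumer TV resolutions.
resolutions = {"Full HD (1080p)": (1920, 1080), "4K UHD": (3840, 2160)}
for name, (width, height) in resolutions.items():
    print(f"{name}: {width} x {height} = {width * height:,} pixels")

# Full HD (1080p): 1920 x 1080 = 2,073,600 pixels
# 4K UHD: 3840 x 2160 = 8,294,400 pixels
```

That’s four times the pixels of Full HD, which is why 4K and HDR are so often paired: one adds detail, the other adds contrast and color.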

Is HDR10+ the Future of the Format?

It isn't easy to know for sure what the future holds for HDR10+ and HDR formats as a whole. Although there is a good chance HDR10+ will become the de facto standard format, Dolby Vision isn't going anywhere anytime soon.

The good news is that since it’s relatively easy for TV manufacturers and content producers to support multiple HDR formats, you don’t need to choose between HDR10+ and Dolby Vision. Aim to buy a TV that supports both formats if you can, but you should be fine as long as you have HDR-compatible equipment in the first place.

Source: Lifewire
