What Is HDR 10+, and How Does It Differ from HDR 10 and Dolby Vision?
After HDR 10 from the UHD Alliance, Dolby Vision from Dolby Laboratories, Advanced HDR by Technicolor, and HLG (Hybrid Log Gamma) from NHK and the BBC, there is now a new HDR format called HDR 10+. Essentially, this format is based on HDR 10, but it brings improvements that are claimed to deliver a better HDR experience than its predecessor. HDR 10+ was first introduced by Samsung and Amazon in April 2017, and it has since gained the support of big names such as Panasonic and 20th Century Fox, with more companies joining over time. So, what improvements does HDR 10+ bring over HDR 10, and how does it perform against the other formats?
HDR 10+ vs HDR 10
As mentioned, HDR 10+ is essentially based on the HDR 10 format. Although HDR 10 is the most popular HDR format and is used by virtually every manufacturer and content provider that supports High Dynamic Range (HDR), it still has some limitations. For remapping color and brightness, HDR 10 uses static metadata that is sent once at the beginning of the video, which limits the information the HDR device receives. Metadata, in this context, is the information that tells a receiver how HDR content should be shown. Because that information is limited, tone mapping applies the same contrast, gradation, brightness, and color enhancement across the entire piece of content, which makes the remapping of color and brightness less precise and causes small unwanted deviations in the HDR images shown on screen. This is where HDR 10+ comes in: it remaps color and brightness more precisely, resulting in a better HDR experience than HDR 10.
Basically, HDR 10+ uses the same baseline specifications as HDR 10: 10-bit color depth, the DCI-P3 color space, and a peak brightness of up to 4,000 nits with a current target of 1,000 nits. Even so, HDR 10+ improves how the metadata is sent. For remapping colors, black level, and brightness, HDR 10+ no longer uses static metadata; instead, it uses dynamic metadata, which means the metadata is not sent only at the beginning of the video but scene by scene. This lets the display adjust its tone-mapping curve from scene to scene more precisely, so the HDR picture on the screen looks more true-to-life and closer to the content creator's original intent. This technique of sending metadata is reminiscent of the one used by Dolby Vision.
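The practical difference between static and dynamic metadata can be sketched in a few lines of code. This is an illustrative toy only: real HDR 10 and HDR 10+ tone mapping is defined by the SMPTE ST 2086 and ST 2094-40 specifications and the PQ transfer curve (ST 2084); the function name and the simple linear scaling below are our own assumptions, not how an actual TV processes the signal.

```python
def tone_map(scene_nits, metadata_max_nits, display_max_nits=1000.0):
    """Compress scene luminance into the display's range, guided by the
    brightness ceiling the metadata reports (toy linear scaling)."""
    scale = min(1.0, display_max_nits / metadata_max_nits)
    return [nits * scale for nits in scene_nits]

# Two scenes from the same movie: a dark interior and a bright exterior.
dark_scene = [5.0, 40.0, 120.0]          # highlights peak at 120 nits
bright_scene = [300.0, 2500.0, 4000.0]   # highlights peak at 4000 nits

# HDR 10 (static): one metadata value covers the whole movie, so the
# dark scene is dimmed as if it could also reach 4000 nits.
hdr10_dark = tone_map(dark_scene, metadata_max_nits=4000.0)

# HDR 10+ (dynamic): each scene carries its own metadata, so the dark
# scene's detail is preserved while the bright scene is compressed.
hdr10plus_dark = tone_map(dark_scene, metadata_max_nits=max(dark_scene))
hdr10plus_bright = tone_map(bright_scene, metadata_max_nits=max(bright_scene))
```

With static metadata the dark scene's 120-nit highlight is crushed down to 30 nits, while per-scene metadata leaves it untouched; that is the kind of deviation dynamic metadata is meant to avoid.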
HDR 10+ vs Dolby Vision
Both formats use dynamic metadata to remap colors and brightness levels, so what are the differences between Dolby Vision and HDR 10+? As mentioned above, HDR 10+ is a royalty-free HDR format developed by Samsung and other big-name companies as an enhancement of the HDR 10 standard. Dolby Vision, on the other hand, is a proprietary HDR format developed by Dolby Laboratories. Both use dynamic metadata to remap colors and brightness, making the HDR picture look true-to-life. Nevertheless, how their metadata is created differs. According to Dolby's SVP of Consumer Entertainment, Dolby Vision metadata is created by hand by colorists and editors at the movie studio, while HDR 10+ metadata is created by an upscaling algorithm.
Meanwhile, being based on the HDR 10 standard, HDR 10+ still uses 10-bit color depth, while Dolby Vision already supports 12-bit color depth. Although few TVs support 12-bit color, Dolby claims its signal can be down-sampled in a way that renders 10-bit color more accurately. As for color gamut coverage, HDR 10+ targets the DCI-P3 color space, while Dolby Vision supports the wider Rec. 2020 color space. The other difference between these formats concerns peak brightness: Dolby Vision supports up to 10,000 nits with a current target of 4,000 nits, while HDR 10+, like its predecessor HDR 10, supports up to 4,000 nits with a current target of 1,000 nits. Theoretically, with its higher specifications, Dolby Vision should be able to deliver a better HDR picture than HDR 10+. The catch is that, for now, no TV fully meets either specification.
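The bit-depth gap is easy to quantify, under the simple (and standard) assumption that each extra bit doubles the number of steps available per color channel:

```python
def shades_per_channel(bits):
    """Number of distinct levels one color channel can encode."""
    return 2 ** bits

hdr10_shades = shades_per_channel(10)    # 1024 levels per channel
dolby_shades = shades_per_channel(12)    # 4096 levels per channel

# Total representable colors across the three R, G, B channels.
hdr10_colors = hdr10_shades ** 3         # ~1.07 billion colors
dolby_colors = dolby_shades ** 3         # ~68.7 billion colors
```

Four times as many levels per channel means finer gradations over the same brightness range, which is why 12-bit signals show less visible banding in smooth gradients like skies, even when down-sampled for a 10-bit panel.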
All in all, this is really a "format war" among some of the world's leading tech companies. It may well produce a winner and a loser, but do not rule out the possibility that the formats coexist. Either way, each of them essentially wants to deliver a better HDR experience to consumers, and for us as consumers, who wins is not what matters; what matters is knowing the characteristics and specs of each HDR format available on the market. Keep in mind that every TV that supports Dolby Vision is certain to also support HDR 10, but the reverse is not true: not all HDR TVs that support HDR 10 also support Dolby Vision. For now, Samsung TVs aside, only mid-range to upper-range TVs support Dolby Vision. As for why no Samsung TVs support Dolby Vision, it seems Samsung is still reluctant to pay the Dolby Vision license fee for its TVs.
Meanwhile, the story with HDR 10+ is different: for now, no TV supports both HDR 10+ and Dolby Vision. Only Samsung has declared support for HDR 10+, while other manufacturers such as LG and Sony remain focused on Dolby Vision and have not stated they will join the HDR 10+ alliance, a group of manufacturers that share data points and HDR 10+ technology advancements with one another. As for which will win between HDR 10+ and Dolby Vision, we will just have to wait and see. But do not rule out that HDR 10+ and Dolby Vision can coexist, with a single TV supporting both. If that happens, it would certainly be an advantage for us as consumers, letting us enjoy movies in a wider choice of HDR formats.