Of late, 4K has been the big thing in display technology. The term refers to the number of pixels a television display can show.
4K = 3840 × 2160 pixels
It essentially represents 3840 pixels along the horizontal axis and 2160 down the vertical; in market terms, it's referred to as 2160p. A few years ago, the world went crazy over the shift from standard definition to High Definition (HD).
HD is the most commonly preferred display resolution today. It amounts to 1920 × 1080 pixels, or 1080p as it's commonly known. By this math, a 4K display contains four times as many pixels as a standard HD display, so it should look four times better, right? Well, not really.
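The "four times" claim is easy to verify; here's a quick Python sketch using the resolutions above:

```python
# Pixel-count comparison: 4K (2160p) vs. Full HD (1080p).
UHD_W, UHD_H = 3840, 2160
HD_W, HD_H = 1920, 1080

uhd_pixels = UHD_W * UHD_H   # 8,294,400 pixels
hd_pixels = HD_W * HD_H      # 2,073,600 pixels

print(f"4K pixels: {uhd_pixels:,}")
print(f"HD pixels: {hd_pixels:,}")
print(f"Ratio:     {uhd_pixels // hd_pixels}x")  # exactly 4x
```

Four times the pixels, but as the next section argues, not four times the perceived quality.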
HD and 4K are merely measures of resolution, which is just one of many characteristics of a high-quality display, alongside contrast ratio, motion smoothness, color reproduction, and frame rate. An increased pixel count does not necessarily mean the viewing experience will be better. Research suggests a viewer needs to sit within about 5 feet of an 84-inch TV to notice the difference 4K makes with the naked eye. Well, no thank you.
Let alone an 84-inch TV; the average American household display is between 42 and 50 inches. At those sizes, one needs to sit within about 3 feet to experience the difference. It therefore does not really make sense to purchase a 4K TV, unless it is meant for an ultra-large home theatre setup.
Although TV manufacturers and consumers have been going gaga over 4K technology, however advanced it may be, it fails to make a functional impact in the typical living room. Then what does? Time for HDR (High Dynamic Range).
1. Why HDR is the real deal.
HDR is a display technology borrowed from the world of photography that makes the darks darker and the lights lighter. It essentially optimizes the contrast ratio of the display, pushing peak brightness up to as much as 1,000 nits and the black level down to as little as 0.05 nits. The result is an extremely natural-looking picture.
It's not just about contrast ratio; HDR also aims to represent colors more realistically by using display panels that offer a much wider color palette than a standard HD TV. HDR is typically delivered on a 10-bit display, which can accommodate over 1 billion individual colors. The next best palette comes with an 8-bit display (the Blu-ray baseline), which amounts to only about 16.7 million individual colors.
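A quick sketch of where those color counts, and the contrast figure implied by the brightness numbers above, come from; the color math assumes three channels (R, G, B), each with the stated bit depth:

```python
# Distinct colors for a given per-channel bit depth, assuming R, G, B channels.
def color_count(bits_per_channel: int) -> int:
    return (2 ** bits_per_channel) ** 3

print(f"10-bit: {color_count(10):,} colors")  # 1,073,741,824 (~1.07 billion)
print(f" 8-bit: {color_count(8):,} colors")   #    16,777,216 (~16.7 million)

# Contrast ratio implied by the HDR peak-brightness / black-level figures above.
peak_nits, black_nits = 1000, 0.05
print(f"Contrast ratio: {peak_nits / black_nits:,.0f}:1")  # 20,000:1
```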
The most commonly used HDR standard is HDR10. It is topped by an advanced version called HDR10+, which was developed by Samsung and is largely exclusive to its devices.
2. Why the OTT industry should give a damn.
2160p (4K) video is being widely adopted by streaming services such as YouTube, Amazon, Vimeo, and Netflix. Netflix offers some blockbuster shows in 4K, including Jessica Jones, Daredevil, and House of Cards. But 4K alone does not necessarily enhance the viewing experience; HDR does.
With HDR, on-screen fire effects look much warmer, and images are lush and vibrant. It's a technology that's arguably more noticeable than the upgrade from 1080p to 4K. Beyond the HDR display itself, one needs HDR-ready content and an HDR-capable player to enjoy the magic of this superior display technology.
3. Effects of HDR on the OTT Industry
a. Effect on Content Production:
With the advent of HDR technology, display makers will manufacture and market HDR supported TVs on a large scale. Consumers will opt for HDR TVs as it gains popularity. That’s when content owners will realize the shift in viewer preference and drive HDR video production. Producers will have to make tactical changes in terms of shooting and processing in order to create HDR ready content.
At the moment, HDR content is in a nascent state. The pioneers in OTT delivery, Amazon Prime and Netflix, are both progressing towards acquiring more HDR videos for their libraries.
Alternatively, older videos can be remastered in HDR to deliver old content with a new experience. Only yesterday, Apple TV announced a free upgrade to HDR for most of the HD content in their library. So, we can expect a number of on-demand video services looking to acquire more HDR content.
b. Effect on Storage:
4K is highly data-intensive. One minute of 4K video consumes as much as 375 MB of space at 30 frames per second (fps). By comparison, one minute of 1080p video requires approximately 150 MB at 30 fps, and increasing the frame rate grows the storage requirement linearly. The HDR component of a given piece of content, on the other hand, takes up relatively little extra space on top of what the resolution (4K, 2K, or 1080p) consumes.
Interestingly, 2K or even 1080p resolution with HDR delivers quality comparable to what a non-HDR 4K display provides. So, as a content owner, you can save more than half of your valuable server space and still deliver equally promising content quality.
Let's say you run a video subscription service and intend to offer around 5,000 hours of high-quality content per month. At the per-minute figures above, that library would consume roughly 112 TB in non-HDR 4K, but only about 54 TB in 1080p HDR, a saving of well over 50 TB of storage. That's quite significant.
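As a back-of-the-envelope sketch using the per-minute figures above, and assuming HDR adds roughly 20% to the data size (an assumption borrowed from the delivery overhead discussed in the bandwidth section), the 5,000-hour monthly library works out as follows:

```python
# Storage estimate for a 5,000-hour monthly library, using 375 MB/min for 4K
# and 150 MB/min for 1080p (both at 30 fps), with an assumed ~20% HDR overhead.
MB_PER_TB = 1_000_000

def library_tb(mb_per_min: float, hours: int, hdr: bool = False) -> float:
    mb = mb_per_min * 60 * hours
    if hdr:
        mb *= 1.20  # assumed HDR data overhead
    return mb / MB_PER_TB

hours = 5_000
tb_4k = library_tb(375, hours)              # non-HDR 4K
tb_1080_hdr = library_tb(150, hours, True)  # 1080p with HDR

print(f"4K (no HDR): {tb_4k:.1f} TB")                 # 112.5 TB
print(f"1080p + HDR: {tb_1080_hdr:.1f} TB")           # 54.0 TB
print(f"Saved:       {tb_4k - tb_1080_hdr:.1f} TB")   # 58.5 TB
```

The exact savings depend on codec and bitrate choices, but the order of magnitude holds: dropping from 4K to 1080p HDR roughly halves the storage bill.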
c. Effect on Bandwidth:
Netflix says that delivering an HDR picture requires about 20% more bits than the equivalent non-HDR resolution. Thus, while 4K is normally delivered at about 15 Mbps, 4K HDR requires 18 Mbps. Likewise, normal 2K is delivered at 5-6 Mbps, while 2K HDR requires 8 Mbps.
It's clear that 2K HDR at 8 Mbps is much more economical than 4K HDR at 18 Mbps. Let's do some maths. Say your platform clocks a monthly average of 50,000 stream hours in high resolution; you could save more than 50% of your bandwidth cost, which is quite a perk of switching to HDR rather than 4K.
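That "more than 50%" can be sanity-checked with a short sketch, using the bitrates above and the hypothetical 50,000 stream-hours figure:

```python
# Monthly data delivered at a given bitrate over a given number of stream hours.
def monthly_gb(mbps: float, stream_hours: int) -> float:
    # Mbps -> GB: megabits/s * 3600 s/hour, / 8 bits per byte, / 1000 MB per GB
    return mbps * 3600 * stream_hours / 8 / 1000

hours = 50_000
gb_4k_hdr = monthly_gb(18, hours)  # 4K HDR at 18 Mbps
gb_2k_hdr = monthly_gb(8, hours)   # 2K HDR at 8 Mbps

saving = 1 - gb_2k_hdr / gb_4k_hdr
print(f"4K HDR: {gb_4k_hdr:,.0f} GB/month")  # 405,000 GB
print(f"2K HDR: {gb_2k_hdr:,.0f} GB/month")  # 180,000 GB
print(f"Bandwidth saved: {saving:.0%}")      # 56%
```

Since CDN bills scale roughly linearly with gigabytes delivered, that ~56% reduction in data translates almost directly into cost savings.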
Delivering HDR at resolutions lower than 4K significantly shrinks the data footprint without any visible impact on picture quality. It is, therefore, a good way to optimize your infrastructure costs and maximize profit.
d. HDR on Mobile devices:
More than half of on-demand video consumers watch on mobile devices. 4K, as we have established, requires ultra-large screens to make a difference. So mobile devices are not an ideal home for 4K displays, but integrating HDR is a credible alternative.
Mobile HDR brings the stunning experience of a 50-inch TV to a handheld display. While streaming on the go, a consumer may experience bit-rate variations, but those affect the resolution alone; the HDR effect remains even when they are not streaming at higher resolutions, because resolution and dynamic range are technically independent of each other. This ensures seamless delivery of quality-rich content on a mobile display, and it opens up a huge segment of consumers for HDR content.
The Ultra HD Alliance has announced a new standard for mobile devices called Mobile HDR Premium. It means a device adheres to certain standards that ensure a superior viewing experience for HDR content. Although most mobile devices cannot reach the 1,000-nit peak brightness level, they can still go up to 550 nits on a 10-bit color palette, which also delivers high-quality media.
Many smartphones, tablets, and laptops come with Mobile HDR Premium badging these days. The badge applies to devices that offer HDR content, though not necessarily at 4K resolution. Samsung claimed the first mobile HDR display with the Galaxy Note 7, followed by the LG G6, Sony Xperia XZ Premium, and a few others.
YouTube has already started offering HDR content in its app on select mobile devices. Both Netflix and Amazon have announced that they are bringing HDR content to mobile devices.
Clearly, content creators have more reasons to produce HDR content than 4K. HDR is here to stay and will definitely influence the OTT industry until the next breakthrough. A large library of HDR content is making its way to release in the near future. So it's also time for consumers to adopt HDR technology and enjoy a superior viewing experience.
Let us know what you think. Comment your opinion below.