What Is HDR, and What Does It Do?
You've probably heard that HDR (High Dynamic Range) imaging technology has changed the game for TVs, delivering richer, more lifelike images.
But HDR isn't just for TVs. Many modern PC monitors also feature this technology, which means enhanced contrast and color in some of your favorite PC games.
Whether you're shopping for a new TV or just bought one, you've probably seen the term "HDR" at some point. But what exactly is HDR, and what does it do?
What is HDR?
High Dynamic Range (HDR) refers to the ability of a display to reproduce a higher contrast ratio and a wider range of color than an SDR (Standard Dynamic Range) display. This means the display is capable of deeper blacks and brighter highlights, resulting in an image that is closer to what you see in real life.
As you might expect, this requires a panel capable of displaying that wider range of brightness and color, and that's where the HDR feature becomes a desirable one.
What You Need for HDR
First and foremost, you will need an HDR-compatible display. Televisions are currently at the forefront of incorporating HDR technology, and most modern televisions support some version of this feature (more on the different versions shortly).
In addition to the display, you’ll also need an HDR source, referring to the media that is providing the image to the display. The source of this image can vary from a compatible Blu-ray player or video streaming service to a game console or PC.
Keep in mind, HDR does not work unless the source is providing the extra color and brightness information required. You'll still see the image on your display, but you won't see the benefits of HDR, even if you have an HDR-capable display. It's similar to resolution in this way: if you're not providing a 4K image, you won't see a 4K image, even if you're using a 4K-compatible display.
Fortunately, HDR content is now widely available, including on several video streaming services, on UHD Blu-ray movies, and in many console and PC games.
HDR Formats
There are three main types of HDR technology you'll find in displays today: HDR10, HDR10+, and Dolby Vision. HDR10 is the most common standard in 4K displays these days, while Dolby Vision can get brighter, support more colors, and produce a more consistent image.
If you want to know more about what 4K is, check out our article on the difference between 4K and 1080p.
There are several HDR formats, but here are a few of the most popular you’re likely to encounter:
HDR10 is an open standard that doesn't require licensing fees, so it's the most commonly found. Thanks to that widespread adoption, HDR10 is essentially the baseline HDR standard.
HDR10+ is a newer extension of the HDR10 standard, and it supports a wider color and contrast range as well as dynamic metadata. Compared to HDR10's static metadata, which describes an entire piece of content with a single set of values, dynamic metadata can adjust color and brightness information scene by scene (a conceptual sketch follows at the end of this section), and HDR10+ supports brightness levels up to 4,000 nits.
Dolby Vision is an HDR standard created by Dolby Laboratories. There is a licensing fee to use this technology, and it usually demands a higher-end display, so this format is less widespread than HDR10. Dolby Vision supports brightness up to 10,000 nits, far higher than most televisions or monitors currently on the market. However, many streaming services and studios support Dolby Vision, and some displays support both HDR10 and Dolby Vision.
Each of these standards (and the other, less common HDR standards such as HLG) has strengths and weaknesses, but all of them are a significant step up from SDR. When choosing an HDR display, consider which version of HDR it uses, and make sure that version is in line with your expectations.
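Because the static-versus-dynamic metadata distinction comes up so often when comparing these formats, here is a purely conceptual sketch of the difference in Python. This is not the actual bitstream layout of HDR10, HDR10+, or Dolby Vision; the field names are illustrative, loosely modeled on the kinds of values (such as maximum content light level) that HDR metadata carries.

```python
# Conceptual sketch only -- NOT the real HDR10/HDR10+/Dolby Vision layout.
# It just illustrates the idea of static vs. dynamic metadata.
from dataclasses import dataclass
from typing import List

@dataclass
class StaticHDRMetadata:
    """One set of values describing the entire film (HDR10-style)."""
    max_content_light_level_nits: int        # brightest pixel anywhere in the film
    max_frame_average_light_level_nits: int  # brightest average frame

@dataclass
class SceneMetadata:
    """Per-scene values (HDR10+/Dolby Vision-style dynamic metadata)."""
    scene_start_frame: int
    peak_brightness_nits: int                # lets the TV tone-map each scene

@dataclass
class DynamicHDRMetadata:
    scenes: List[SceneMetadata]

# With static metadata, a dim cave scene and a bright desert scene share the
# same tone-mapping hints; with dynamic metadata, each scene gets its own.
movie_static = StaticHDRMetadata(4000, 400)
movie_dynamic = DynamicHDRMetadata([
    SceneMetadata(scene_start_frame=0,     peak_brightness_nits=120),   # cave
    SceneMetadata(scene_start_frame=14400, peak_brightness_nits=3000),  # desert
])
```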
What Does HDR Do?
HDR increases the range between the brightest whites and the darkest blacks on a display. With HDR video, content creators have more control over how bright and colorful certain parts of an image can be, so they can replicate what the eye sees in real life.
Traditional video cameras only use one exposure at a time to capture an image. That means the brightest parts of an image can be overexposed, while the darkest parts of an image can be underexposed. The underexposed parts of an image will lose detail in the shadows, while the overexposed parts of an image will lose detail in the highlights.
With HDR, video cameras can capture multiple exposures of the same scene, which can then be combined in post-production to increase the contrast ratio and color range. This produces a more lifelike image, because our eyes can see a greater range of light and color than a single camera exposure can capture.
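To make the multi-exposure idea concrete, here's a minimal single-frame sketch using OpenCV in Python. The file names are hypothetical placeholders, and real HDR video pipelines are far more sophisticated; this simply shows how several exposures of one scene can be fused so that both shadow and highlight detail survive.

```python
# Minimal sketch of merging multiple exposures with OpenCV.
# File names are hypothetical; real HDR video pipelines are far more involved.
import cv2

# Three exposures of the same static scene: underexposed, normal, overexposed.
exposures = [cv2.imread(name) for name in ("dark.jpg", "normal.jpg", "bright.jpg")]

# Mertens exposure fusion keeps the well-exposed regions of each shot:
# highlight detail comes from the dark exposure, shadow detail from the bright one.
fusion = cv2.createMergeMertens()
fused = fusion.process(exposures)          # floating-point image, roughly 0..1

# Convert back to 8-bit so the result can be saved and viewed normally.
cv2.imwrite("fused.jpg", (fused * 255).clip(0, 255).astype("uint8"))
```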
HDR requires a TV to have a high peak brightness; in fact, HDR TVs can be 10 times brighter than SDR TVs, or more. That doesn't mean the whole image will be brighter, though. Instead, content creators can control exactly how bright or dark different parts of the image appear.
With most HDR displays, you also get wide color gamut (WCG). This technology lets a TV display more vivid, more saturated colors, as well as more shades in between. The majority of SDR displays use 8-bit color, which can produce up to 256 shades per color channel, or around 16.7 million different colors. HDR, on the other hand, uses at least 10-bit color, which can produce over 1 billion colors.
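The arithmetic behind those figures is straightforward: an n-bit panel has 2^n shades per color channel, and the total number of colors is that value cubed across red, green, and blue. A quick Python check:

```python
# The color counts quoted above follow directly from the bit depth:
# an n-bit panel has 2**n shades per channel, cubed across R, G, and B.

def total_colors(bits_per_channel: int) -> int:
    shades_per_channel = 2 ** bits_per_channel
    return shades_per_channel ** 3

print(f"{total_colors(8):,}")   # 16,777,216    -> the "16.7 million" SDR figure
print(f"{total_colors(10):,}")  # 1,073,741,824 -> the "over 1 billion" HDR figure
```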
HDR Color and Brightness
Talking about color range can get complicated quickly, but suffice it to say: the wider the color range, the more accurate and nuanced the colors in the displayed image can be.
HDR uses Wide Color Gamut (WCG), which is just a way of saying that the display has the ability to show more colors. Take an onscreen plant, for example. With WCG, there are more shades of green for the display to work with, and the result is a plant whose colors are closer to reality. While even the best HDR display can't show quite as many colors as the human eye can see, it's still a noticeable step up from SDR (Standard Dynamic Range).
The brightness, or luminance, of a display is measured in candelas per square meter, more commonly referred to as nits. The higher the number of nits, the brighter the display can get. When it comes to HDR, nits matter because peak brightness affects the contrast ratio, which is the ratio between the brightest white and the darkest black the display can produce.
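For a concrete sense of how peak brightness and black level combine into a contrast ratio, here's a tiny worked example; the nit values are illustrative round numbers, not measurements of any particular display.

```python
# Illustrative contrast-ratio arithmetic. The nit values are round numbers
# chosen for the example, not measurements of any specific display.
sdr_peak_nits, sdr_black_nits = 300.0, 0.30    # a typical-ish SDR panel
hdr_peak_nits, hdr_black_nits = 1000.0, 0.05   # a bright HDR panel

print(f"SDR contrast ratio: {sdr_peak_nits / sdr_black_nits:,.0f}:1")  # 1,000:1
print(f"HDR contrast ratio: {hdr_peak_nits / hdr_black_nits:,.0f}:1")  # 20,000:1
```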
HDR in PC Gaming
While the HDR discussion is mostly centered around televisions and console gaming, HDR has an increasingly prominent place in PC gaming as well. There are HDR gaming displays, and many games take advantage of the wider range of color and brightness that HDR offers. Despite that, HDR in PC gaming isn't ubiquitous. Why not? Partly because LCD panels designed for higher refresh rates and lower response times aren't always compatible with high-end HDR, so it's not a feature commonly found in "fast" gaming displays.
That said, there are options available if you want a gaming display with HDR support.
As with a television, to use HDR with your PC you’ll need not only a display that can support it but also a GPU capable of rendering the game with the additional color and brightness information.
So then, which GPUs support HDR? The good news is almost all modern gaming GPUs do. As long as you have a compatible display and configure your operating system appropriately, you should be able to enjoy HDR on your gaming PC.
What Devices Support HDR?
There are lots of devices that support HDR, including smart TVs, Ultra HD Blu-ray players, and almost all 4K streaming devices, such as the Roku Streaming Stick 4K, Roku Ultra, Fire TV 4K, Apple TV 4K, Chromecast Ultra, Chromecast with Google TV, and more.
You can usually see whether your TV or streaming device supports HDR on the box. To get the most out of your HDR TV, you should also use a high-bandwidth HDMI cable, such as an Ultra High Speed HDMI 2.1 cable.
Conclusion
All in all, a well-performing HDR TV displaying HDR content will look better than an SDR TV from a few years ago. In many cases, it's noticeably brighter and shows a wider range of colors, which is a pretty sight to behold.