If you want the best experience from your media, whether it's video games or high-definition movies, you'll want a display that can deliver the best brightness and colours. HDR (High Dynamic Range) is a technology that dramatically improves the colour, brightness, and contrast ratio of a display, widening the range between the darkest and brightest parts of the image so it looks much closer to real life.
The problem is there are a host of HDR specifications, labels, and standards. There are also displays without any HDR label that perform as well as, or better than, HDR-compatible displays. When browsing for HDR-compatible monitors, you have probably come across labels such as HDR, HDR10, HDR10+, HDR 400 through HDR 1000, and so on.
What’s the difference between these HDR labels? Which is best for you? Do you even need an HDR monitor? Navigating the world of HDR monitors and displays can be a headache!
In this article, I’ll do a quick breakdown of HDR monitors and specifications standards to make it easier for you to find the best monitor (or TV) for your needs.
HDR 10 vs HDR 400: What’s The Difference?
HDR10 is an open media standard (unlike Dolby Vision, which is proprietary) indicating that a device can process HDR content with at least 10-bit colour. Virtually every HDR-compatible device supports HDR10, and it's often shortened to simply HDR. HDR 400, on the other hand, is a certification tier from VESA's DisplayHDR programme, meaning the display has been tested to reach a peak brightness of 400 nits.
In short, HDR10 and HDR 400 describe different things: HDR10 is the content format, while HDR 400 describes the display's performance. To put it another way, when you see a three-digit number next to the HDR abbreviation, that number is the peak brightness, in nits, that the display is certified to reach.
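To put "10-bit colour" in perspective, a quick back-of-the-envelope calculation shows why the jump from standard 8-bit panels matters (a minimal sketch; the function name is just for illustration):

```python
# Back-of-the-envelope: how many distinct colours can 8-bit vs 10-bit panels show?
# Each pixel has three channels (red, green, blue), each with 2**depth levels.

def colour_count(bits_per_channel: int) -> int:
    """Total distinct colours for a given per-channel bit depth."""
    levels = 2 ** bits_per_channel  # shades per channel
    return levels ** 3              # three channels per pixel

print(f"8-bit:  {colour_count(8):,} colours")   # 16,777,216 (~16.7 million)
print(f"10-bit: {colour_count(10):,} colours")  # 1,073,741,824 (~1.07 billion)
```

That 64-fold jump in available shades is what lets HDR content show smoother gradients in skies and shadows, rather than visible colour banding.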
These tiers are defined by VESA (the Video Electronics Standards Association), a standards body rather than a manufacturer, and the DisplayHDR badge appears on monitors from many different brands.
HDR 400 monitors are entry-level HDR displays. Generally, most people agree that 400 nits is not bright enough. Here's a quick list of the other DisplayHDR tiers:
- DisplayHDR 500, 600, 1000, and 1400
- DisplayHDR 400 True Black and 500 True Black (for OLED panels, which trade peak brightness for much deeper blacks)

One point of confusion worth clearing up: HDR10+ is not a brightness tier. It's a separate content format, a royalty-free rival to Dolby Vision, that adds scene-by-scene dynamic metadata on top of HDR10.
Generally, if you are determined to buy an HDR monitor, I recommend looking for one with at least 600 nits of peak brightness. The 1,000-nit HDR monitors are quite expensive, but those will deliver the best visual experience.
An alternative to expensive HDR displays would be a regular monitor that supports higher resolutions, good brightness levels, and a decent sRGB coverage, such as 96% or more.
Gaming on HDR Displays:
Undoubtedly, HDR monitors can dramatically enhance the experience of movies, games, and even simply browsing the internet. But there are some things you need to know.
You might be thinking that you can simply buy an HDR-compatible monitor and all the colours in your games will be crystal clear, the shadows darker, and everything will pop out like when watching a 3D movie.
While the colours will probably be a bit clearer and brighter, to benefit from true HDR, the title has to support the feature. While most modern titles do have an HDR option, not every game does, so you’ll need to check the graphics settings. The games that do support HDR will usually let you choose between a few types, such as HDR 10 or Dolby Vision.
In other words, your monitor might support HDR, but it won't use it unless there's an HDR source, whether that's a game or a movie. You also need to manually enable HDR in Windows 10, a step a lot of people seem to miss. HDR on PC can be hit or miss: sometimes it works well, while other times a host of factors can ruin the experience.
With consoles, HDR is usually automatically included, and there seem to be more (and cheaper!) HDR TVs than HDR monitors. At the moment, it seems like HDR gaming is easier on consoles than PC as there are fewer hurdles.
Problems with HDR:
Besides the compatibility issues mentioned earlier, there are a few other things you need to know about HDR. For example, if you do manage to get a game working with HDR, you might notice that some colours, particularly the blacks and whites, look a bit over-enhanced.
While it does make scenes look a bit more realistic, HDR can also make it harder to see clearly in-game. For example, dark rooms might be too dark, and bright areas might be too bright. Some people describe HDR as adding an exaggerated bloom effect that doesn't actually improve the picture much.
Once again, it all comes down to how the game uses HDR: some games are well optimized for it, while others are hit and miss. It can also take a toll on frame rate, especially at resolutions beyond 1080p. Sometimes HDR can add input lag, similar to V-sync. So those are a couple of downsides to gaming on HDR displays.
To summarize, HDR is a very interesting technology and it can dramatically improve visuals. However, for games and regular PC use, an HDR monitor doesn’t seem to be the best idea because Windows tends to have issues with HDR media. Not to mention there aren’t even that many games that support it or implement it well. For the most part, I would stick to regular monitors for now.