
"Fake HDR" TVs are giving HDR a bad name

By Ng Chong Seng - on 18 Feb 2018, 8:12am

Note: This article was first published on 17th December 2017.

For example, this Samsung Q8C QLED TV, which uses quantum dots for high brightness and wide color gamut support, is what I call a true HDR TV.

Have you heard about the “fake 4K” kerfuffle that happened several years ago? In short, one camp argued that if each pixel isn’t made up of three colored sub-pixels, it isn’t a true 4K TV. The other camp maintained that a strict RGB matrix is unnecessary, and that picture quality concerns can be overcome algorithmically. Personally, I think that as long as you’re happy with what you see onscreen and it’s appreciably better than 1080p, then sub-pixel layout be damned.

However, there’s now another “fake something” episode that I actually feel strongly about, because I think it’s harming unwitting 4K TV buyers. I’m referring to “fake HDR” TVs.

For the uninitiated, HDR (high dynamic range) is, in my opinion, the best thing that could happen to 4K TVs. Done properly and with the right content, an HDR picture “pops” way more than a normal picture. More “lifelike” images, if you will. But to do HDR properly - higher contrast, brighter highlights, wider colors - specific hardware is required. And this is where the marketing line starts to get fuzzy.

For one, brightness is a critical factor for effective HDR display. This is why most high-end 4K LCD TVs tout a peak brightness of at least 1,000 nits, and OLED TVs at least 600 nits. The former also use either an edge-lit or full-array local dimming system to control the screen backlight, ensuring that a bright section truly stands out from its darker surroundings.

Most HDR content also supports wide color gamut (WCG) for deeper colors, and that isn’t something any Tom, Dick, or Harry TV is capable of either. To realize WCG, LCD TVs have adopted new display/backlight tech: Samsung has QLED (quantum dots), LG has Nano Cell, and Sony has Triluminos; and in general, all 4K OLED TVs can do WCG. As you’d rightly expect, these premium TVs are all midrange models and up.

So the pet peeve I now have with TV makers is entry-level 4K TVs that claim to be HDR-compatible. In almost all cases, any sub-S$1K TV making that claim is just telling you that it will accept an HDR signal and won’t show you a garbled picture. By no means should you expect it to display HDR or WCG properly without verifying it. (Alas, that often requires you to drill down into the specs.)

Case in point: I was watching Netflix on a low-end S$700 4K TV the other day and the HDR label popped up on screen. So while Netflix was indeed supplying an HDR stream and the TV accepted it, the resulting picture quality was no better than a regular 4K SDR stream. I blame the TV (correctly) because I know the real reason, but most users are likely to point their finger (wrongly) at HDR for not making an impact. Come to think of it, no wonder many people I’ve talked to poo-poo HDR - they probably experienced it on a sub-par TV incapable of doing the HDR material justice.

Yes, one day all TVs will be capable of doing HDR properly, but that day isn’t today.
 

Read Next: Our 4K TV Buying Guide

This article first appeared in the Dec 2017 issue of HWM.

Ng Chong Seng / Deputy Editor

I write about tech. I also fix things.