
HDR was supposed to make everything look better — but do you even notice it?

If you can't tell, you're not alone (and you're not wrong)


[HDTV on a TV stand, up against a wall]

Credit: Samuel Regan-Asante / Unsplash

Rob LeFebvre

Mar 1, 2026, 1:30 PM EST

Rob LeFebvre is an editor and writer focusing on consumer and enterprise technologies for a broad range of outlets. He’s been writing online for more than 15 years; before that he was a special educator for kids with severe disabilities.

Rob has been an Editorial Director at Lifewire, a news writer at Engadget, and a senior contributor at Cult of Mac. He's written about PCs, Macs, mobile phones, and games, created newsrooms from the ground up, and has extensive experience reviewing hardware, software, and games across his career.

So, you’re watching Squid Game on Netflix, excited to see all the deep colors promised by the title’s HDR tag. Except, well, it looks like every other show or movie you pull up on your TV. There’s nothing magical about it.

HDR promises a revolution in how we see contrast and color on our screens, but with misleading certifications, inconsistent implementation, and less-capable hardware, most people are watching content labeled HDR that looks no different (or actively worse) than what came before. Here’s why that’s so, and why the promise is bigger than the reality.


### What HDR is actually supposed to do

#### The science of brightness, contrast, and color volume

Credit: Pexels / Pixabay

HDR stands for High Dynamic Range: the span between the darkest black and the brightest white your display can show. SDR, or Standard Dynamic Range, typically tops out at around 100 nits of brightness. HDR10 (one of the HDR specifications) aims at 1,000 nits, while Dolby Vision (another spec) targets up to 10,000 nits of brightness range.

HDR also uses wider color gamuts (like DCI-P3), which means more colors, not just brighter ones. Dolby Vision and HDR10 both require that displays and content use at least the DCI-P3 color space, though the more expansive Rec. 2020 specification is becoming a target for future devices and content.

Of course, HDR isn’t always used at the limit; content creators can choose how much of the format’s range they use. HDR10 content is typically mastered for a peak of 1,000 nits (while the spec supports up to 4,000 nits), and Dolby Vision content is commonly mastered at 4,000 nits, with the format supporting up to 10,000 nits.
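Those nit targets come from the PQ (SMPTE ST 2084) transfer function that both HDR10 and Dolby Vision are built on, which maps absolute luminance up to a 10,000-nit ceiling onto the video signal. A minimal sketch of the encoding curve (constants are straight from the ST 2084 spec; the function itself is a simplification that handles a single luminance value):

```python
# PQ (SMPTE ST 2084) inverse EOTF: maps absolute luminance in nits
# to a normalized signal value in [0, 1]. Constants from ST 2084.
M1 = 2610 / 16384        # ~0.1593
M2 = 2523 / 4096 * 128   # ~78.84
C1 = 3424 / 4096         # ~0.8359
C2 = 2413 / 4096 * 32    # ~18.85
C3 = 2392 / 4096 * 32    # ~18.69

def pq_encode(nits: float) -> float:
    """Encode absolute luminance (0-10,000 nits) as a PQ signal in [0, 1]."""
    y = max(nits, 0.0) / 10_000.0   # normalize to the 10,000-nit ceiling
    y_m1 = y ** M1
    return ((C1 + C2 * y_m1) / (1 + C3 * y_m1)) ** M2

# SDR-level white (~100 nits) already lands near the middle of the PQ
# signal; the entire upper half of the range is reserved for HDR highlights.
print(round(pq_encode(100), 2))   # ~0.51
print(pq_encode(10_000))          # 1.0
```

This is why HDR content looks dim or washed out when a display misinterprets the signal: roughly half of the encoded range describes brightness levels an SDR panel can’t reproduce at all.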

### Why the certification system is basically broken

#### How a $200 monitor can legally call itself HDR-ready

[amjed-omaf-xkG_QtMhTbo-unsplash]

Credit: Amjed Omaf / Unsplash

The standards are pretty clear; it’s how individual manufacturers build their monitors and TVs that changes things. For example, VESA’s DisplayHDR 400 specification requires only 400 nits of peak brightness and has no local dimming requirement (a feature that increases contrast). At this level, you’ll likely not see any vast improvement over SDR content.

Most budget laptops and monitors use the DisplayHDR 400 spec, and reviewers at sites like Rtings and Digital Trends note that these displays often look worse in HDR mode thanks to blown-out highlights and no improvements to black levels.

All this to say that the HDR badge on your streaming platform or game launcher doesn’t mean your monitor can manage the higher range or color space.

### The format war that left viewers behind

#### HDR10, Dolby Vision, HLG — and why none of them agree

[Screenshot of Dolby Vision website]

If you’ve been in the tech world for any length of time, you know that competing formats fight it out for dominance. From Betamax vs. VHS to Blu-ray vs. HD DVD, until a single format wins, it’s anyone’s game.

HDR10 and Dolby Vision implement HDR differently. HDR10 is an open, static specification: one set of metadata for the entire film. Dolby Vision is licensed and adjusts scene by scene. HLG, or Hybrid Log-Gamma, was designed for broadcast and live TV; it carries no metadata at all, which keeps it backward compatible with SDR broadcasts but gives displays less to work with when rendering it.
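The static-vs-dynamic distinction is easiest to see side by side. The field names below mirror what HDR10 actually carries (SMPTE ST 2086 mastering data plus MaxCLL/MaxFALL), but the dictionary layout is invented for illustration, not a real container format:

```python
# HDR10: one static metadata block describing the whole title. A dim
# interior scene and a sunlit exterior get tone mapped with the same
# assumptions about peak brightness.
hdr10_metadata = {
    "max_cll": 1000,   # MaxCLL: brightest single pixel in the title (nits)
    "max_fall": 400,   # MaxFALL: brightest average frame in the title (nits)
}

# Dolby Vision: dynamic metadata re-evaluated per scene (or per frame),
# so the display can tone map each scene against its own actual range
# instead of the whole film's worst case.
dolby_vision_metadata = [
    {"scene": 1, "peak_nits": 120},    # dim interior scene
    {"scene": 2, "peak_nits": 2900},   # sunlit exterior scene
]
```

With only the static block, a display tone maps the 120-nit scene as if a 1,000-nit highlight could appear at any moment; with per-scene values, it can use its full range for each scene.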

Many TVs support HDR10, fewer support Dolby Vision (Samsung TVs do not), and very few handle all formats perfectly. Netflix, Disney+, and Apple TV favor different HDR formats as well, which can create a fragmented experience. Even if your TV supports all the formats well, the content you watch may not look the same across shows or platforms.

### What "fake HDR" looks like in the real world

#### Tone mapping gone wrong and the crushed blacks problem

[An old TV on a park bench]

Credit: Anete Lusina / Pexels

When a display or TV can’t match the brightness targets in the HDR content, it has to tone map, or compress the signal to match what the screen can actually show. When it does this poorly, you’ll see artifacts like highlight clipping, crushed shadows, or even washed-out midtones. What that ends up looking like is an image with fewer details: you lose depth and contrast and end up with a blah, average picture.
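The difference between good and bad tone mapping can be sketched in a few lines. The numbers here are illustrative (a hypothetical 600-nit panel showing content mastered to 4,000 nits, with an assumed knee point), not any manufacturer’s actual algorithm:

```python
# Why tone mapping quality matters: both functions map scene luminance
# onto a panel that peaks at 600 nits. Values are illustrative.
DISPLAY_PEAK = 600.0   # what the panel can actually show (nits)
KNEE = 400.0           # where the soft curve starts compressing (assumed)

def hard_clip(nits: float) -> float:
    """Naive approach: everything above the panel's peak is clipped, so all
    highlight detail between 600 and 4,000 nits collapses into flat white."""
    return min(nits, DISPLAY_PEAK)

def soft_rolloff(nits: float) -> float:
    """Pass values through below the knee; above it, compress the remaining
    range so bright details stay distinguishable instead of clipping."""
    if nits <= KNEE:
        return nits
    headroom = DISPLAY_PEAK - KNEE
    excess = nits - KNEE
    return KNEE + headroom * excess / (excess + headroom)

# Two distinct highlights in the source, 1,000 and 3,000 nits:
print(hard_clip(1000), hard_clip(3000))      # 600.0 600.0 -> detail lost
print(soft_rolloff(1000), soft_rolloff(3000))  # still two different values
```

The hard clip is the “highlight clipping” artifact described above: two highlights that were clearly different in the master render as the same blown-out white. A similar compression applied near black, done badly, produces the crushed shadows.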

In fact, you might get a better picture by turning off HDR on these mid-range screens. YouTube and some streaming platforms can also offer different encoding options for HDR vs SDR, and sometimes the SDR encode is better mastered.

“YouTube's automated SDR down conversion is a convenient choice that can deliver good results with no effort,” says Google on its HDR support page for YouTube. “However, on challenging clips, it might not deliver the perfect result.”

If you have the option, try the SDR version of content that isn’t looking right in HDR.

### When HDR actually works — and what it takes

#### The hardware floor where HDR becomes genuinely impressive

[Apple Macbook Pro 14-inch in space black]

Credit: Apple, Inc.

If you have the higher-end display, though, you can see the meaningful improvements that HDR brings to the table. OLED displays, like those from Samsung and Sony, can achieve true per-pixel black levels, which makes the HDR contrast sing.

Mini LED panels with full-array local dimming, like Samsung’s Neo QLED (QN90 series) and Hisense’s U8 series, are more budget-friendly ways to get the real HDR experience. If you’re an Apple user, many of the company’s devices and displays support real HDR.

### How to check if HDR is actually working on your setup

#### The quick tests that reveal whether your display is doing anything

On a Windows system, you can head into Settings > System > Display > HDR to see if “Use HDR” is active. You can also look at “HDR/SDR brightness balance.”

If you’re on macOS, HDR is enabled by default. You can check via System Settings > Displays.

[...]

