Nits measured on HDR monitor 2x those reported on scopes!

Get answers to your questions about color grading, editing and finishing with DaVinci Resolve.

vastunghia

  • Posts: 12
  • Joined: Wed Dec 22, 2021 11:32 am
  • Real Name: Sergio De Zordo

Nits measured on HDR monitor 2x those reported on scopes!

Posted Thu Jun 05, 2025 11:41 am

Hi all,

I recently purchased a very cheap (and popular, AFAIK, among gamers) QHD HDR display -- the AOC Q27G3XMN. I know this is far from ideal for HDR color grading, so please, don't waste your time telling me so ;). I purchased it with different goals in mind, HDR color grading being only one item of the list -- and not even at the top of it.

However, this is what I'm experimenting with right now, and I have a bunch of questions that maybe some guru hanging around this forum may help me understand better.

First of all my grading environment:
  • Project Color Science set to DaVinci YRGB with Timeline = DaVinci WG/Intermediate [I like grading in this space / gamma], Output = Rec.2100 ST2084 and HDR mastering for 1000 nits [not sure exactly what this does TBH]
  • Footage is RAW (CinemaDNG format FWIW) with settings Color space = Blackmagic Design and Gamma = Blackmagic Design Film
  • Initial Node (pre-grading) Color Space Transform: from Col. Space = Blackmagic Design Film Gen 1 / Gamma = Blackmagic Design Film to Col. Space = 'Use timeline' / Gamma = 'Use timeline', no tone / gamut mapping, no Forward or Inverse OOTF
  • Final Node (post-grading) Color Space Transform: from Col. Space = 'Use timeline' / Gamma = 'Use timeline' to Col. Space = Rec.2100 / Gamma = Rec.2100 ST2084, no tone mapping, some kind of gamut mapping, no Forward or Inverse OOTF

FIRST QUESTION

Using my good old X-Rite ColorMunki Display colorimeter, I verified that there is a huge systematic mismatch between nits reported in DaVinci's scopes and those measured by the colorimeter. Like, 100 nits on scopes translate to 200 nits measured on monitor. And ~600 nits on scopes nearly saturate my monitor's max brightness, which should be around 1'200 nits -- as also verified with the colorimeter. Looks almost like a 2x multiplier :o
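For reference, the nit values on Resolve's HDR scopes correspond to the SMPTE ST 2084 (PQ) EOTF, so the expected luminance for any signal level can be checked independently of the monitor. Below is a minimal sketch of the PQ EOTF and its inverse using the standard ST 2084 constants (the function names `pq_to_nits` / `nits_to_pq` are just illustrative, not Resolve API):

```python
# SMPTE ST 2084 (PQ) constants
M1 = 2610 / 16384        # ~0.1593017578125
M2 = 2523 / 4096 * 128   # ~78.84375
C1 = 3424 / 4096         # ~0.8359375
C2 = 2413 / 4096 * 32    # ~18.8515625
C3 = 2392 / 4096 * 32    # ~18.6875

def pq_to_nits(signal: float) -> float:
    """Map a normalized PQ signal in [0, 1] to absolute luminance in nits."""
    p = signal ** (1 / M2)
    y = (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)
    return 10000.0 * y  # PQ is defined up to 10,000 nits

def nits_to_pq(nits: float) -> float:
    """Inverse EOTF: absolute luminance in nits to normalized PQ signal."""
    y = (nits / 10000.0) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

# nits_to_pq(100.0) is ~0.508, i.e. a 10-bit code value around 520
```

Feeding the display full-screen patches at known PQ levels and reading them with the colorimeter is one way to separate what Resolve outputs from what the monitor's own tone mapping does to it.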

Once I realized that, I added a node that applies a negative offset (still in the DaVinci WG color space / Intermediate gamma) so that the luminance values measured on the monitor are closer to the ones reported in the scopes -- lowering the offset from 25.00 to 13.20 seems to do the trick. I enable it for color grading and disable it when rendering -- hoping for the best.

The question is: am I doing something wrong? Is it possible that my monitor is that bad at displaying correct luminance? Honestly, I don't think so... I also checked Rtings' review (https://www.rtings.com/monitor/reviews/aoc/q27g3xmn): they test whether HDR brightness is in line with the expected value, and their finding is that, if anything, brightness is a bit below the expected value, not above -- and certainly not 2x.

So, what am I missing? Any hint will be most welcome!

SECOND (more general) QUESTION

Having found a way to somewhat calibrate my cheap HDR monitor for brightness (where calibration in this context just means that it now matches the values reported on the scopes), is there any way you're aware of to also make colors a bit more consistent? I can calibrate my monitor in SDR mode, but in HDR I understand it applies an internal LUT that cannot be tweaked in any way -- it seems to apply some sort of teal+orange look to all images. The result is that I first have to color grade in SDR, to get some minimum degree of color accuracy, and then switch to HDR to adjust 'only for brightness' -- though I know this is a huge simplification.

So the question is: using my colorimeter (and DisplayCAL, for instance), is it possible to create a LUT to feed into DaVinci in order to counteract my monitor's inclination to apply the s***y teal+orange look to my footage when in HDR mode? Once again, I would enable it only when grading and disable it for export.

Thank you all for sharing your experience and your valuable knowledge!!

mpetech

  • Posts: 894
  • Joined: Wed Sep 04, 2013 9:52 pm
  • Real Name: Dom Silverio

Re: Nits measured on HDR monitor 2x those reported on scopes

Posted Thu Jun 05, 2025 1:25 pm

1. Did you see this?

"These results are in the 'DisplayHDR' HDR Mode, which you can only select when sending an HDR signal to the monitor. Local Dimming was also set to 'Medium,' as 'Strong' has worse picture quality, and disabling local dimming results in a worse EOTF, as you can see here. We measured the Real Scene brightness differently than our regular methodology with a Blu-ray player, as the brightness was much lower with it. Instead, we used an HDFury Vertex Linker to force an HDR signal over HDMI from a PC, which resulted in a higher Real Scene brightness."

Also, check if there are profiles like "Vivid" or "Cinema". Often, they affect the response curve.

2. Can you adjust the RGB levels directly on the monitor? Are there separate RGB level profiles for HDR and SDR?

If not, creating a LUT is a decent workaround. I assume you are using a Decklink or Ultrastudio for video output?

vastunghia

  • Posts: 12
  • Joined: Wed Dec 22, 2021 11:32 am
  • Real Name: Sergio De Zordo

Re: Nits measured on HDR monitor 2x those reported on scopes

Posted Thu Jun 05, 2025 1:51 pm

mpetech wrote:1. Did you see this? [...] Also, check if there are profiles like"Vivid" or "Cinema". Often, they affect the response curve.

Yup! I matched rtings' settings. HDR mode is "DisplayHDR", not "Vivid" or "Cinema".
mpetech wrote:2. Can you adjust the RGB levels directly on the monitor? Are there RGB level profile separate for HDR and SDR?

Unfortunately, no :cry:. This cheap piece of hardware has all adjustments disabled when in HDR mode. Its internal LUT kicks in, and the only thing you can do is change HDR modes -- and the least miserable mode is "DisplayHDR".
mpetech wrote:If not, creating a LUT is a decent workaround. I assume you are using a Decklink or Ultrastudio for video output?

Actually, since I'm on a Mac, I enabled "Use 10-bit precision in viewers if available" and "Use Mac display color profiles for viewers" under DVR Preferences, which I understand gives me proper HDR monitoring on my monitor (connected to my eGPU via DisplayPort). So even though I own an UltraStudio Monitor 3G, I'm not using it. Do you think I should give it a try? Just to check whether issue 1 (monitor too bright wrt scopes) persists?

And, could anybody point me in the right direction on how to build the LUT to feed into DVR? As I said, I own a colorimeter, I installed DisplayCAL, and I calibrated in SDR mode, but I have no clue how to build a LUT to color-correct DVR in HDR mode...
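In case it helps anyone experimenting with the same idea: this is not the DisplayCAL workflow itself, just a rough sketch of the .cube 3D LUT format that Resolve can load as an output LUT. The `correct` function here is a placeholder; a real correction would come from profiling measurements, and the example gains at the end are purely made-up illustrative numbers:

```python
def write_cube_lut(path: str, size: int = 33, correct=lambda r, g, b: (r, g, b)):
    """Write a 3D LUT in the .cube format Resolve understands.

    `correct` maps an (r, g, b) triple in [0, 1] to the corrected triple.
    Red varies fastest, then green, then blue (the .cube convention).
    """
    with open(path, "w") as f:
        f.write(f"LUT_3D_SIZE {size}\n")
        for b in range(size):
            for g in range(size):
                for r in range(size):
                    rr, gg, bb = correct(r / (size - 1), g / (size - 1), b / (size - 1))
                    f.write(f"{rr:.6f} {gg:.6f} {bb:.6f}\n")

# Hypothetical correction that slightly tames a teal/orange cast by
# nudging red and blue toward green (illustrative numbers only).
write_cube_lut("monitor_correction.cube",
               correct=lambda r, g, b: (0.97 * r + 0.03 * g, g, 0.97 * b + 0.03 * g))
```

The resulting file could then be placed in Resolve's LUT folder and applied as a monitoring/output LUT while grading, and bypassed for export, as described above.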

Thanks a lot for your suggestions!

Uli Plank

  • Posts: 25457
  • Joined: Fri Feb 08, 2013 2:48 am
  • Location: Germany and Indonesia

Re: Nits measured on HDR monitor 2x those reported on scopes

Posted Thu Jun 05, 2025 2:03 pm

Older probes are not capable of measuring HDR screens correctly.
There's a reason why Calibrite has these now (and it's not only marketing):
https://calibrite.com/eu/product/display-pro-hl/
My disaster protection: export a .drp file to a physically separated storage regularly.
www.digitalproduction.com

Studio 19.1.3
MacOS 13.7.4, 2017 iMac, 32 GB, Radeon Pro 580 + eGPU
MacBook M1 Pro, 16 GPU cores, 32 GB RAM, MacOS 14.7.2
SE, USM G3

mpetech

  • Posts: 894
  • Joined: Wed Sep 04, 2013 9:52 pm
  • Real Name: Dom Silverio

Re: Nits measured on HDR monitor 2x those reported on scopes

Posted Thu Jun 05, 2025 2:06 pm

I would try the Decklink first and see if creating a LUT and then using it in DR as an output LUT works.
Using your Mac monitor output can also work, but you would not need a LUT. An ICC profile should be sufficient.

https://displaycal.net/#create-3dlut

mpetech

  • Posts: 894
  • Joined: Wed Sep 04, 2013 9:52 pm
  • Real Name: Dom Silverio

Re: Nits measured on HDR monitor 2x those reported on scopes

Posted Thu Jun 05, 2025 2:14 pm

Uli Plank wrote:Older probes are not capable of measuring HDR screens correctly.
There's a reason why Calibrite has these now (and it's not marketing only):
https://calibrite.com/eu/product/display-pro-hl/


The X-Rite ColorMunki Display can measure above 1000 nits. It's spec'd up to 2000 nits and covers Rec. 2020.
It's a colorimeter, not a high-end spectroradiometer, but it's more than sufficient for HDR 1000-nit Rec. 2020 calibration.

vastunghia

  • Posts: 12
  • Joined: Wed Dec 22, 2021 11:32 am
  • Real Name: Sergio De Zordo

Re: Nits measured on HDR monitor 2x those reported on scopes

Posted Thu Jun 05, 2025 2:56 pm

Uli Plank wrote:Older probes are not capable of measuring HDR screens correctly.
There's a reason why Calibrite has these now (and it's not marketing only):
https://calibrite.com/eu/product/display-pro-hl/

mpetech wrote:The X-Rite ColorMunki Display can measure nit above 1000 nits. Spec'd up to 2000 nits and covers Rec. 2020.
It is a colorimeter, not a high-end spectrometer. But it is more than sufficient to do HDR 1000 nit Rec 2020 calibration.

Thank you both for your points of view! I was skeptical about using my ColorMunki to measure 1000+ nits too, but I must say it has proved quite reliable -- it can read far higher than 1000 nits, and its measurement of the monitor's max brightness is pretty consistent with professional measurements (such as rtings').
mpetech wrote:I would try the Decklink first and see if creating a LUT, then using it in DR as an output LUT works.

Thank you! Not sure why the DeckLink should help in getting monitor brightness consistent with the values reported in DVR scopes, but I agree it's worth a try. Will proceed as soon as I find the time.

mpetech wrote:Using your Mac monitor output can also work, but you would not need a LUT. An ICC profile should be sufficient.

Hmm, I think I read somewhere that ICC profiles don't work well when HDR mode is enabled, even on macOS. For instance, this source (https://gregbenzphotography.com/hdr/) mentions that "ICC profiles do not currently support HDR and will break HDR" :?

Thanks for the link, I will study and report back!
