
- Posts: 12
- Joined: Wed Dec 22, 2021 11:32 am
- Real Name: Sergio De Zordo
Hi all,
I recently purchased a very cheap (and, AFAIK, popular among gamers) QHD HDR display -- the AOC Q27G3XMN. I know this is far from ideal for HDR color grading, so please don't waste your time telling me so. I purchased it with different goals in mind, HDR color grading being only one item on the list -- and not even at the top of it.
However, this is what I'm experimenting with right now, and I have a bunch of questions that maybe some guru hanging around this forum can help me understand better.
First of all, my grading environment:
- Project Color Science set to DaVinci YRGB with Timeline = DaVinci WG/Intermediate [I like grading in this space / gamma], Output = Rec.2100 ST2084 and HDR mastering for 1000 nits [not sure exactly what this does TBH]
- Footage is RAW (CinemaDNG format FWIW) with settings Color space = Blackmagic Design and Gamma = Blackmagic Design Film
- Initial Node (pre-grading) Color Space Transform: from Col. Space = Blackmagic Design Film Gen 1 / Gamma = Blackmagic Design Film to Col. Space = 'Use timeline' / Gamma = 'Use timeline', no tone / gamut mapping, no Forward or Inverse OOTF
- Final Node (post-grading) Color Space Transform: from Col. Space = 'Use timeline' / Gamma = 'Use timeline' to Col. Space = Rec.2100 / Gamma = Rec.2100 ST2084, no tone mapping, some kind of gamut mapping, no Forward or Inverse OOTF (see the PQ sketch right after this list for what that output encoding implies)
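For reference, Output = Rec.2100 ST2084 means the signal gets encoded with the SMPTE ST 2084 (PQ) curve, which ties code values to absolute luminance in nits; as far as I understand, this is also the mapping behind the nit readout on Resolve's HDR scopes. Here is a minimal Python sketch of that curve (Python is just my choice for illustration; the constants are the standard ST 2084 ones):

# Minimal sketch of the SMPTE ST 2084 (PQ) transfer function implied by the
# Rec.2100 ST2084 output setting above. Constants are the standard PQ ones.
m1 = 2610 / 16384          # 0.1593017578125
m2 = 2523 / 4096 * 128     # 78.84375
c1 = 3424 / 4096           # 0.8359375
c2 = 2413 / 4096 * 32      # 18.8515625
c3 = 2392 / 4096 * 32      # 18.6875

def pq_encode(nits: float) -> float:
    """Absolute luminance in cd/m^2 (nits) -> PQ code value in [0, 1]."""
    y = max(nits, 0.0) / 10000.0
    return ((c1 + c2 * y**m1) / (1.0 + c3 * y**m1)) ** m2

def pq_decode(code: float) -> float:
    """PQ code value in [0, 1] -> absolute luminance in nits."""
    p = max(code, 0.0) ** (1.0 / m2)
    return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1.0 / m1)

for nits in (100, 203, 600, 1000):
    print(f"{nits:5d} nits -> PQ code value {pq_encode(nits):.4f}")

For example, 1000 nits encodes to a PQ code value of roughly 0.75, which is where a 1000-nit mastering target tops out.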
FIRST QUESTION
Using my good old X-Rite ColorMunki Display colorimeter, I verified that there is a huge systematic mismatch between the nits reported in DaVinci's scopes and those measured by the colorimeter: 100 nits on the scopes translate to about 200 nits measured on the monitor, and ~600 nits on the scopes nearly saturate the monitor's max brightness, which should be around 1200 nits -- as also verified with the colorimeter. It looks almost like a 2x multiplier.
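Just to put numbers on that "almost 2x", here is the trivial arithmetic on the two readings quoted above (these are only my own measurements, nothing authoritative):

# Rough check of the "almost a 2x multiplier" observation, using only the
# scope-vs-colorimeter readings quoted above. Purely illustrative.
readings = [
    (100.0, 200.0),    # 100 nits on the scopes measured as ~200 nits
    (600.0, 1200.0),   # ~600 nits on the scopes nearly hits the ~1200-nit panel peak
]
ratios = [measured / scope for scope, measured in readings]
print("per-patch ratios:", [f"{r:.2f}" for r in ratios])
print(f"average multiplier: {sum(ratios) / len(ratios):.2f}x")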

Once I realized that, I included a node that applies some negative offset (still in the DaVinci WG color space / Intermediate gamma of the timeline) so that the luminance values measured on the monitor are closer to the ones reported in the scopes -- lowering the Offset from 25.00 to 13.20 seems to do the trick. I enable it for color grading and disable it when rendering -- hoping for the best.
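In case it helps explain why a single Offset tweak behaves like a global brightness scale: in a log-type working gamma such as DaVinci Intermediate, subtracting a constant from the encoded signal multiplies linear light by a roughly constant factor. The sketch below uses the DaVinci Intermediate curve constants as I remember them from Blackmagic's Wide Gamut white paper (treat them as an assumption), and I honestly don't know how the Offset wheel's 25.00/13.20 units map onto code values, so the delta is only a placeholder for "about one stop down":

import math

# DaVinci Intermediate log curve -- constants as I recall them from Blackmagic's
# "Wide Gamut Intermediate" white paper; treat them as an assumption.
DI_A, DI_B, DI_C = 0.0075, 7.0, 0.07329248
DI_M, DI_LIN_CUT, DI_LOG_CUT = 10.44426855, 0.00262409, 0.02740668

def di_encode(lin: float) -> float:
    """Linear scene light -> DaVinci Intermediate code value."""
    return lin * DI_M if lin <= DI_LIN_CUT else (math.log2(lin + DI_A) + DI_B) * DI_C

def di_decode(code: float) -> float:
    """DaVinci Intermediate code value -> linear scene light."""
    return code / DI_M if code <= DI_LOG_CUT else 2.0 ** (code / DI_C - DI_B) - DI_A

# Above the toe, subtracting delta from the code value scales linear light by
# roughly 2 ** (-delta / DI_C); halving the light therefore needs about one stop.
delta = DI_C  # placeholder: a one-stop shift expressed in code-value terms
for lin in (0.05, 0.18, 0.90):
    shifted = di_decode(di_encode(lin) - delta)
    print(f"{lin:.2f} -> {shifted:.3f}  (ratio {shifted / lin:.2f})")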
The question is: am I doing something wrong? Is it possible that my monitor sucks so much at displaying correct luminance? Honestly, I don't think so... I also checked the RTINGS review (https://www.rtings.com/monitor/reviews/aoc/q27g3xmn): they test whether HDR brightness is in line with the expected value, and their finding is that, if anything, brightness is a bit below the expected value, not above -- and certainly not 2x above.
So, what am I missing? Any hint will be most welcome!
SECOND (more general) QUESTION
Having found a way to somehow calibrate my cheap HDR monitor for brightness (where 'calibration' in this context just means that it now matches the values reported on the scopes), is there any way you are aware of to also make colors a bit more consistent? I can calibrate my monitor in SDR mode, but in HDR I understand that it applies some internal LUT that cannot be tweaked in any way -- it seems to apply some sort of teal+orange look to all images. The result is that I first have to do the color grading in SDR, to get some minimum degree of color accuracy, and then I switch to HDR to adjust 'only for brightness' -- though I know this is a huge simplification.
So the question is: using my colorimeter (and DisplayCAL, for instance), is it possible to create a LUT to feed into DaVinci in order to counteract my monitor's inclination to apply that s***y teal+orange look to my footage when in HDR mode? Once again, I would enable it only when grading and disable it for export.
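To be clear about the mechanics (and leaving aside whether my colorimeter and DisplayCAL can actually profile this panel in HDR mode), what I would feed Resolve is a plain 3D LUT in .cube format, which DisplayCAL can generate. Purely to illustrate the file format, here is a small Python sketch that bakes a hypothetical 3x3 correction matrix into a .cube file -- the matrix values are placeholders I made up, not a real profile of the Q27G3XMN:

# Minimal sketch: bake a hypothetical 3x3 colour correction into a .cube 3D LUT
# that Resolve can load. DisplayCAL would derive the real correction from
# actual measurements; the matrix below is a made-up placeholder.
SIZE = 33  # common 3D LUT resolution

MATRIX = [  # placeholder "anti teal+orange" nudge -- NOT measured
    [0.95, 0.03, 0.02],
    [0.02, 0.96, 0.02],
    [0.01, 0.02, 0.97],
]

def apply_matrix(r: float, g: float, b: float):
    out = []
    for row in MATRIX:
        v = row[0] * r + row[1] * g + row[2] * b
        out.append(min(max(v, 0.0), 1.0))  # clamp to [0, 1]
    return out

with open("hdr_monitor_correction.cube", "w") as f:
    f.write('TITLE "placeholder monitor correction"\n')
    f.write(f"LUT_3D_SIZE {SIZE}\n")
    # .cube convention: red varies fastest, then green, then blue.
    for bi in range(SIZE):
        for gi in range(SIZE):
            for ri in range(SIZE):
                r, g, b = (x / (SIZE - 1) for x in (ri, gi, bi))
                ro, go, bo = apply_matrix(r, g, b)
                f.write(f"{ro:.6f} {go:.6f} {bo:.6f}\n")

If I understand correctly, I would then point Resolve at a file like this only while grading (as a monitoring LUT) and leave it out of the render, exactly as described above.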
Thank you all for sharing your experience and your valuable knowledge!!