
- Posts: 44
- Joined: Fri Jul 21, 2023 1:17 pm
- Real Name: Adam Horshack
I'm trying to understand how best to match out-of-camera Rec.709 footage with the equivalent log footage converted to Rec.709-A using a CST in Resolve.
I've shot footage precisely metered to middle gray for both V-Log and N-Log. If I drop both on the timeline and do a log -> Rec.709-A conversion I get the expected 41% IRE per the specification.
I then shot the same metered scene using each camera's Rec.709 profile instead of log. When I examine the out-of-camera Rec.709 footage, the same middle gray sits at 50% IRE, which matches Rec.709 Gamma 2.4. I believe this is because the footage has the ~1.2 display/system ("simultaneous contrast") gamma adjustment encoded into it, which I understand is expected behavior for cameras.
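For anyone following along, the ~1.2 figure falls out of the spec transfer functions themselves: the BT.709 camera OETF composed with a gamma-2.4 display EOTF gives an end-to-end exponent of roughly 1.2 at middle gray. A quick sketch of the arithmetic (my own simplified model, pure power functions, nothing taken from Resolve):

```python
import math

def bt709_oetf(L):
    """BT.709 camera OETF: scene-linear light in [0,1] -> code value."""
    if L < 0.018:
        return 4.5 * L
    return 1.099 * L ** 0.45 - 0.099

mid_gray = 0.18
code = bt709_oetf(mid_gray)
print(f"BT.709 OETF of 18% gray: {code:.3f}")  # ~0.409, i.e. the expected ~41%

# Show that same code value on a gamma-2.4 monitor (BT.1886-style, zero black):
display_light = code ** 2.4

# Effective end-to-end exponent g at middle gray, where display = scene**g:
g = math.log(display_light) / math.log(mid_gray)
print(f"End-to-end exponent at middle gray: {g:.2f}")  # ~1.25, the "system gamma"/OOTF
```

So the "1.2 system gamma" is really the rendering intent baked into the OETF/EOTF mismatch, about 1.25 when measured exactly at middle gray.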
If I drop the OOC Rec.709 footage onto the timeline and add a Rec.709 2.4 -> Rec.709-A CST, Resolve enables the reverse OOTF by default, which keeps middle gray at 50%. If I uncheck the reverse OOTF then the IRE drops to the expected 41%.
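The two scope readings can be roughly reproduced with a toy model of that CST: decode the input with a 2.4 EOTF, optionally undo a pure 1.2 system gamma (a crude stand-in for Resolve's actual reverse OOTF), and re-encode with Rec.709-A approximated as a pure ~1.961 gamma (the ColorSync-style interpretation; that exponent is my assumption, not a documented Resolve constant):

```python
# Toy model of the Rec.709 2.4 -> Rec.709-A CST. All exponents are my
# assumptions (pure power functions), not Resolve's actual internals.
EOTF_GAMMA = 2.4    # decode of the incoming Gamma 2.4 code values
OOTF_GAMMA = 1.2    # approximate system gamma the reverse OOTF removes
OUT_GAMMA = 1.961   # assumed Rec.709-A encode exponent (ColorSync-style)

def cst(code_in, reverse_ootf):
    display = code_in ** EOTF_GAMMA               # code value -> display light
    light = display ** (1 / OOTF_GAMMA) if reverse_ootf else display
    return light ** (1 / OUT_GAMMA)               # re-encode for Rec.709-A

print(f"reverse OOTF on:  {cst(0.50, True):.3f}")   # ~0.49: stays near 50%
print(f"reverse OOTF off: {cst(0.50, False):.3f}")  # ~0.43: drops toward 41%
```

Within the limits of the pure-power approximation, that matches what the checkbox does on the scope: with the reverse OOTF the 1.2 is taken back out before the Rec.709-A encode, without it the signal simply re-encodes at a lower level.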
I presume the out-of-camera Rec.709 2.4 footage is intended to be played directly on a playback device, i.e. encoded for delivery/display. If so, why does the CST's Rec.709-A deliverable strip out the 1.2 display gamma via the forward OOTF, so that its IRE is 41% vs. the 50% of the out-of-camera footage, if both are intended as deliverables for playback? Maybe they're targeting different displays: perhaps the OOC footage targets TVs while the CST targets computer monitors/online delivery? Or perhaps I'm missing something fundamental about the intended deliverable of one or the other.