iPhone footage scaling incorrectly in Resolve
Hello!
-iPhone footage captured in Apple Log, 4K ProRes 422 HQ, using the Blackmagic Camera app on iOS.
-A 1.55x anamorphic mobile lens was used in front of the camera.
So, native clips are 3840x2160 (16:9). As a test, I shot two clips. The first uses the BMC App's 1.55x desqueeze option, which is embedded in the clip's metadata. The second was also shot with the anamorphic lens but without the in-app desqueeze, to see whether manually setting the pixel aspect ratio in Resolve would behave any differently.
I want to finish in Academy Scope, so the Resolve project is set to 4096x1716 (2.39:1). Image sizing input and output are set to "Centre crop with no resizing". Resolve detects the 1.55x pixel aspect ratio on the first clip and applies it appropriately. Great. The manual clip I adjusted to match under Clip Attributes. With both clips now matching, I do my scaling calculations.
Native clip size: 3840x2160 (1.78:1)
Finishing resolution: 4096x1716 (2.39:1)
3840 * 1.55 = 5952 (of note, the clip shot with the 1.55x desqueeze in-app shows up in the operating system as 5952x2160, but in Resolve as 3840x2160). That aspect ratio is 2.76:1, so I should crop to 2.39:1 and then scale to the 4K Scope finishing resolution.
2160 * 2.386946 = 5155.80336
So I expect the 2.39 extraction to be 5156x2160.
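(To make the arithmetic easy to follow, here's the same calculation as a quick plain-Python sketch; the numbers are all from above and the variable names are my own.)

```python
# All numbers straight from the calculation above; variable names are mine.
native_w, native_h = 3840, 2160        # clip size as Resolve reports it
squeeze = 1.55                         # anamorphic desqueeze factor
target_w, target_h = 4096, 1716        # finishing resolution (2.39:1)

desqueezed_w = native_w * squeeze                      # 5952.0 -> 5952x2160 is ~2.76:1
extraction_w = round(native_h * target_w / target_h)   # 2160 * 2.386946 = 5155.8 -> 5156

print(desqueezed_w, extraction_w)                      # 5952.0 5156
```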
Now to check the numbers to scale from the extraction to the final output:
4096 / 5156 = 0.794444
1716 / 2160 = 0.794444
I check using both width and height, and the numbers match, so I know my extraction calculation is correctly 2.39. If I take 0.794444, enter it as the Zoom value in the resizing module, and save this as a preset, I should be able to apply it as the "Input Presizing" under clip attributes of any clip, and it should scale the 5156x2160 extraction down to 4096x1716. However, that is not what the footage does, as seen in the attachment. The first image shows the 0.794444 Input Presizing applied: the image is smaller than the final output, yet the extraction should be larger than the final output and be scaled down, not up. The second image is how the footage is displayed with "Centre crop with no resizing" and no other modifications. The third is the expected result (I cheated by up-scaling manually to mimic what the Input Presizing should be doing).

If I were to speculate, it looks like when going from a 1.0 to a 1.55 pixel aspect ratio, Resolve is transforming the image while maintaining the 3840-pixel width. So it's compressing vertically and ending up with something like 3840x1390. This is not what happens when using this form of scaling on a 'real' cinema camera system.
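Here is the same check, plus the behaviour I suspect Resolve is applying, as another quick sketch; the width-locked squeeze at the end is purely my speculation, not confirmed behaviour:

```python
extraction_w, extraction_h = 5156, 2160   # the 2.39 extraction from above
target_w, target_h = 4096, 1716           # finishing resolution
squeeze = 1.55

# Expected: a single uniform zoom factor takes the extraction to the delivery frame.
zoom_w = target_w / extraction_w          # ~0.7944
zoom_h = target_h / extraction_h          # ~0.7944 -> matching factors confirm a 2.39 extraction

# Suspected (unconfirmed) behaviour: Resolve keeps the 3840-pixel width and squeezes
# the height instead of stretching the width, which would land near 3840x1390.
suspected_h = extraction_h / squeeze      # ~1393.5

print(round(zoom_w, 4), round(zoom_h, 4), round(suspected_h))   # 0.7944 0.7944 1394
```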
Please keep in mind that I have used this scaling workflow for 15 years on set, with all flavours of anamorphic lenses, on every cinema camera system from Alexa 65 and 35 to Sony Venice 1 and 2, RED, etc. This approach has been vetted through Deluxe/Technicolor (Co3/Picture Shop), so I know the math is sound and the method should achieve the results I expect. What I'm wondering is whether Apple is somehow modifying these clips automatically before they are transformed in Resolve. Has the BM Camera App embedded anything in the metadata that is making Resolve transform the image? Is it perhaps just a bug with this media? Has anyone else experienced this (not many people shoot anamorphic on mobile, so I know it's a long shot)? Maybe there is something I am overlooking?
Any insights would be greatly appreciated!
