- Posts: 29
- Joined: Mon Jan 17, 2022 10:16 pm
- Real Name: Danny Burns
I hope to better understand what's happening here, so if anyone knows, please advise.
I have some test footage shot on an iPhone 12 Pro Max with the standard lens. I tracked this footage in Fusion and entered the exact sensor size and focal length into the solve panel. Fusion ignores the focal length and determines its own, which is wrong. Still, everything looks OK: placed shapes seem to lock in and no sliding is visible.
Then I use an FBX Export node to get that track data into Blender. In Blender, other than the placed aligned shapes being tiny and the camera being upside down and backwards, it looks OK. BUT entering the focal length from the Fusion camera into the Blender camera causes a mismatch in the aligned shapes, and they now slide around as though they were not tracked properly. If I then enter the correct focal length into the Blender camera, the image seems to align itself properly, and 8 times out of 10 the tracking now works correctly. In the other 2 it's still misaligned and unfixable.
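One thing worth checking here: a solved focal length is only meaningful together with the sensor width it was solved against, because the two jointly define the field of view. If Blender's camera has a different sensor width than Fusion assumed, the "same" focal length produces a different FOV and the shapes slide. A minimal sketch of that relationship (the numbers below are hypothetical placeholders, not measured iPhone specs):

```python
import math

def hfov_deg(focal_mm, sensor_width_mm):
    """Horizontal field of view implied by a focal length / sensor width pair."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_mm)))

def focal_for_hfov(fov_deg, sensor_width_mm):
    """Focal length that reproduces a given horizontal FOV on a given sensor."""
    return sensor_width_mm / (2 * math.tan(math.radians(fov_deg) / 2))

# Hypothetical example values, for illustration only:
real_focal = 5.1    # mm, assumed physical focal length
real_sensor = 7.0   # mm, assumed sensor width
fov = hfov_deg(real_focal, real_sensor)

# On a camera set to a different sensor width (e.g. Blender's 36 mm default),
# a noticeably different focal length is needed to reproduce the same FOV:
equivalent_focal = focal_for_hfov(fov, 36.0)
print(f"FOV: {fov:.2f} deg -> equivalent focal on a 36 mm sensor: {equivalent_focal:.2f} mm")
```

So before comparing focal length numbers between Fusion, Blender, and After Effects, it may be worth confirming all three are using the same sensor width, otherwise identical FOVs will show up as different focal lengths.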
For comparison, I can track the same footage in After Effects, bring it into Blender via the AE2Blend add-on, and everything lines up properly: the focal lengths of the cameras match, there is no sliding of elements, etc. I had been under the impression that Fusion's camera tracker was considerably better than AE's, but this doesn't seem to be the case. Can anyone help me understand what I may be doing wrong?