I thought I’d share a quick motion tracking test I did with the BMC yesterday. I’ve spent a lot of time working with 3D animation, and I’ve been looking forward to seeing how well high-quality plates from the BMC would interact with CGI. It’s particularly exciting to use the BMC because DSLR footage comes with so many problems (line skipping, pixel interpolation, etc.) that can make getting a good track difficult.
For those unfamiliar with the workflow, here’s a quick run through of what I’m talking about.
• Master shot taken with the BMC. Preferably includes non-tripod mounted motion through 3D space.
• The footage is run through 3D motion tracking software (in this case, SynthEyes) to generate a virtual camera that mirrors the 3D movement of the real-world BMC.
• The virtual camera track is brought into 3D animation software (in this case, Maya) so that virtual objects or scenes will match the motion of the BMC master shot.
• The resulting objects are then composited with the original footage.
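The core idea behind steps two and three can be sketched in a few lines: once the solver recovers the real camera's per-frame position, rotation, and focal length, any virtual 3D point projects to the same 2D spot on the plate that a real object at that position would. This is a toy pinhole-camera sketch with made-up numbers, not anything exported from SynthEyes or Maya:

```python
# Toy sketch of why a solved virtual camera makes CG stick to the plate:
# project a world-space 3D point through a camera rotated about the Y axis.
# All positions, angles, and focal values here are invented for illustration.
import math

def project(point, cam_pos, cam_yaw_deg, focal_px, center):
    """Project a world-space point into pixel coordinates."""
    # Move into camera space: translate, then rotate by the inverse yaw.
    x, y, z = (p - c for p, c in zip(point, cam_pos))
    a = math.radians(-cam_yaw_deg)
    xc = x * math.cos(a) + z * math.sin(a)
    zc = -x * math.sin(a) + z * math.cos(a)
    # Perspective divide onto the image plane.
    u = center[0] + focal_px * xc / zc
    v = center[1] - focal_px * y / zc
    return u, v

# A point 10 units in front of a camera panned 5 degrees to the right
# drifts left in frame, exactly as a tracked feature on the plate would.
print(project((0.0, 0.5, 10.0), (0.0, 0.0, 0.0), 5.0, 1200.0, (1200, 675)))
```

If the solved camera's motion matches the real one frame by frame, this projection lands the CG object on the same pixels the real object would occupy, which is the whole trick.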
For this test, I did a deliberately quick and dirty shakycam shot down by the Bristol waterfront, using the Blackmagic handles and a Nikon 20mm AI-S lens. (The audio comes from the internal mic. I have to agree with the general consensus that while the built-in mic can be useful for syncing 2nd-source audio, you really don’t want to count on it for final delivery.)
All of the tracking, rendering, and compositing was done in full 2400x1350 resolution. The tracking software clearly *loves* this, and once I had good, solid objects in the scene, I got a good, solid track – better than anything I’ve ever pulled from a DSLR shot.
The only tracking problems you’ll notice come at the beginning of the shot, while I’m tilting up from rippling water. SynthEyes got confused by the shifting refractions on the water, but once the camera cleared it, the solve settled into a stable track.
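The usual remedy for a section like that is to cull trackers with high solve error before re-solving: a tracker stuck to a ripple won't reproject anywhere near where its 2D track says it should be. SynthEyes reports a per-tracker error figure you can filter on; the names and numbers below are invented to show the shape of the idea:

```python
# Hypothetical per-tracker solve errors, in pixels. Features on solid
# geometry reproject tightly; features locked to moving water do not.
trackers = {"dock_post": 0.4, "railing": 0.7, "ripple_a": 6.2, "ripple_b": 9.8}

MAX_ERROR_PX = 1.5  # invented cutoff; in practice you tune this per shot

# Keep only trackers whose error is under the cutoff, then re-solve.
kept = {name: err for name, err in trackers.items() if err <= MAX_ERROR_PX}
print(sorted(kept))  # ['dock_post', 'railing']
```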
Since this is primarily a motion tracking test, I didn’t spend much time on other issues. To save you all the trouble of pointing them out to me (aren’t I helpful?), they include:
• The reflection in the water. To do it right, I should have rendered one in Maya instead of faking it in FCP, where I did the final assembly.
• The lighting on the torus. It doesn’t match the scene; ideally you’d shoot HDR images on location and build a light environment from them in Maya to properly illuminate the object.
• The matte for the spar that moves in front of the torus. There are about 240 frames where the spar interacts with the torus, and for this exercise I wasn’t about to roto them all by hand. I made one matte and tracked it to move with the spar, so you’ll see artifacts where it doesn’t line up.
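That last shortcut amounts to drawing one roto shape and translating it by the spar's tracked 2D motion each frame, rather than redrawing 240 shapes. A minimal sketch, with invented tracker data and shape points:

```python
# One hand-drawn matte shape (polygon vertices in pixels) on a reference
# frame, plus a per-frame 2D track of the spar. Both are made up here.
base_shape = [(10, 10), (14, 10), (14, 40), (10, 40)]
track = {1: (100, 200), 2: (102, 198), 3: (105, 197)}

def matte_at_frame(frame, shape=base_shape, track=track, ref_frame=1):
    """Translate the reference-frame matte by the tracker's motion."""
    dx = track[frame][0] - track[ref_frame][0]
    dy = track[frame][1] - track[ref_frame][1]
    return [(x + dx, y + dy) for x, y in shape]

print(matte_at_frame(3))  # the frame-1 shape shifted by (+5, -3)
```

A pure translation like this is exactly why the artifacts show up: if the spar rotates or changes apparent size, a shape that only slides around can’t follow it.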
(HD, 68MB. Please right-click and save, instead of opening in a browser.)
http://kubrickwannabe.com/magic-donut.mov
While the shot is far from being a polished product, I think it shows that the BMC’s hi-res footage will be a real pleasure to use for anyone doing SFX or greenscreen work. I’m very excited about the BMC’s potential when used on something more than just a throwaway test.