Sat Aug 21, 2021 2:28 am
Hi! Gyroflow dev here. While I haven't used Blackmagic gear personally, I've been trying to add support for as many sources of motion data as possible, which allows for more hardware flexibility and post-processing possibilities, especially since the program is open source. As the BMPCC already has the sensor hardware, it could certainly be neat to support it in the future if IMU logging is enabled.
Furthermore, I have a proof-of-concept OpenFX Resolve plugin which, when finished, will be able to apply the relevant gyro-based image stabilization steps in the editor without re-encoding the footage.
If any Blackmagic people are reading this, here are some tips/info which could be handy for implementing gyro data logging successfully. Most of it is based on some ongoing collaboration with an action camera manufacturer, so things might differ for a larger setup.
* A gyro logging rate of at least 200 Hz captures all the motion data we're interested in. The required logging rate may increase depending on how well the data is filtered internally, due to the potential for aliasing (e.g. 500 Hz may be required if noise is particularly bad). Accelerometer (and magnetometer) rates can be lower, but it might also be simpler to log everything at the same rate.
* Low-pass filter to remove sensor noise/vibrations above 100 Hz. This both meets the Nyquist criterion and, if using higher sampling rates, filters some of the high-frequency vibrations encountered on drones. Typically the hardware low-pass filter available in MEMS sensors should do, but some have trouble with particularly strong vibrations.
* Gyro full scale of +/- 1000 degrees per second or +/- 2000 dps. Seems pretty fast, but a drone can easily max out the scale at +/- 500 dps.
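To illustrate the low-pass filtering point above, here's a minimal sketch of a single-pole IIR low-pass filter applied to a synthetic gyro signal (slow camera motion plus a drone-like high-frequency vibration). This is just a stand-in for the hardware filter in a MEMS sensor; the sampling rate, cutoff, and signal frequencies are illustrative, not a recommendation for any particular part.

```python
import math

def lowpass(samples, fs_hz, cutoff_hz):
    # Single-pole IIR low-pass: y += alpha * (x - y),
    # with alpha derived from the RC time constant of the cutoff.
    rc = 1.0 / (2.0 * math.pi * cutoff_hz)
    dt = 1.0 / fs_hz
    alpha = dt / (rc + dt)
    out, y = [], samples[0]
    for x in samples:
        y += alpha * (x - y)
        out.append(y)
    return out

fs = 500.0  # logging rate in Hz (the "noisy sensor" case above)
t = [i / fs for i in range(2000)]
motion = [math.sin(2 * math.pi * 5 * ti) for ti in t]        # 5 Hz camera motion
vibe = [0.5 * math.sin(2 * math.pi * 180 * ti) for ti in t]  # 180 Hz vibration
raw = [m + v for m, v in zip(motion, vibe)]

filtered = lowpass(raw, fs, cutoff_hz=100.0)

# How much of the unwanted vibration survives, before vs. after filtering:
rms = lambda xs: math.sqrt(sum(x * x for x in xs) / len(xs))
resid_raw = rms([r - m for r, m in zip(raw, motion)])
resid_filtered = rms([f - m for f, m in zip(filtered, motion)])
```

A single pole only attenuates 180 Hz content by roughly half with a 100 Hz cutoff, which is why particularly strong vibrations can still leak through a simple hardware filter and higher-order filtering (or a higher sampling rate) may be needed.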
Since the BRAW format supports per frame metadata, one possibility could be "batching" multiple log samples per frame to get a higher effective sampling rate.
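The batching idea could look something like the sketch below: gyro samples with their own timestamps are grouped into the frame interval they fall in, so each frame's metadata carries several samples and the effective rate is preserved. The function and field layout are hypothetical, not the actual BRAW metadata API.

```python
def batch_by_frame(samples, fps):
    """Group gyro samples into per-frame batches.

    samples: list of (t_us, (gx, gy, gz)) with integer microsecond
    timestamps. Returns {frame_index: [sample, ...]}, a hypothetical
    layout for per-frame metadata.
    """
    frames = {}
    for t_us, gyro in samples:
        idx = t_us * fps // 1_000_000  # frame containing this sample
        frames.setdefault(idx, []).append((t_us, gyro))
    return frames

# A 200 Hz gyro log (one sample every 5000 us) batched into 25 fps
# frames gives 8 samples per frame:
log = [(i * 5000, (0.0, 0.0, 0.0)) for i in range(40)]
batched = batch_by_frame(log, fps=25)
```

Keying by timestamp rather than by sample count keeps the batching correct even if the sensor drops a sample or the rates aren't an exact multiple.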
A totally different approach could be internally processing the raw gyro/acceleration/magnetometer data to calculate the absolute orientation through sensor fusion, or a relative orientation from gyro integration alone. This can then be saved as a quaternion on a frame-by-frame basis. The advantage here is the reduced amount of metadata, but it requires taking any data processing delays into account in order to perfectly synchronize the orientations. Furthermore, this wouldn't work well for rolling shutter correction in the future, which requires motion data within each frame's readout.
Hopefully this was useful, or at least interesting. Many new cameras have recently been getting gyro logging support, so it could be cool to see the feature on a proper cinema camera. As mentioned elsewhere, this can potentially be useful for VFX applications depending on orientation accuracy requirements, but at the very least post stabilization works.