There's been a surge of neural-engine / neural-processing AI tools that make life a lot easier: things like keying without a green screen, EBsynth, EmberGen, etc. They all use some kind of neural processing to both speed up the process and get you essentially one click from a result, with only minor adjustments.
But there's still one big area that never gets attention, and it's arguably the most important part of VFX and compositing work: camera tracking.
I'm still waiting for SteadXP to mature enough to make a version of their box that works with high-end cinema cameras for stabilization, and I'd also want them to release something that tracks the camera and hands you an FBX camera straight off the shoot day. But since we don't have that, why isn't anyone working on a neural engine camera tracker? Something that uses far more advanced methods to calculate not only the camera movement, but that can also detect and discard the movement of actors and anything else in the scene that doesn't relate to the camera move. Essentially an inverted neural object tracker: instead of tracking the subject, it tracks everything else. It should also automatically figure out perspective, the ground plane, and so on. A rough sketch of the idea is below.
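To make the "inverted tracker" idea concrete, here's a minimal Python sketch of the classical version of that pipeline, using OpenCV. The person_mask() function is a hypothetical stand-in for whatever neural segmentation model would supply the actor matte, and the intrinsics K are made-up numbers; a real solver would obviously do far more (bundle adjustment, lens distortion, ground-plane fitting).

```python
# Sketch: mask out moving subjects, track only the static background,
# and solve the camera motion between two consecutive frames.
import cv2
import numpy as np

# Assumed camera intrinsics (focal length and principal point).
K = np.array([[1200.0,    0.0, 960.0],
              [   0.0, 1200.0, 540.0],
              [   0.0,    0.0,   1.0]])

def person_mask(frame):
    """Hypothetical: a segmentation model would return 0 on moving
    subjects and 255 on the static scene. Placeholder: all static."""
    return np.full(frame.shape[:2], 255, dtype=np.uint8)

def solve_camera_step(prev_bgr, next_bgr):
    prev_gray = cv2.cvtColor(prev_bgr, cv2.COLOR_BGR2GRAY)
    next_gray = cv2.cvtColor(next_bgr, cv2.COLOR_BGR2GRAY)

    # Pick trackable features only on the static parts of the scene.
    mask = person_mask(prev_bgr)
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=500,
                                  qualityLevel=0.01, minDistance=8,
                                  mask=mask)

    # Follow those features into the next frame with optical flow.
    nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, next_gray,
                                              pts, None)
    good_prev = pts[status.flatten() == 1]
    good_next = nxt[status.flatten() == 1]

    # Solve the relative camera rotation/translation; RANSAC discards
    # any residual non-camera motion that slipped through the mask.
    E, inliers = cv2.findEssentialMat(good_prev, good_next, K,
                                      method=cv2.RANSAC, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, good_prev, good_next, K, mask=inliers)
    return R, t  # per-frame pose deltas, chainable into a full solve
```

The point of a neural version would be replacing the fragile parts, especially feature selection and the dynamic-object mask, with learned models, so the whole chain becomes one click instead of hours of supervising trackers.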
I keep checking research papers, and there's a lot of interesting stuff coming, but none of them cover a good camera tracker.
I would bet that if Blackmagic released Fusion with a neural engine camera tracker that works almost as a one-click solve, it would become the go-to tool for the entire industry. It's such untapped potential, and it would eliminate the biggest time sink we have at the moment.
I'm doing a lot of work in Blender, and its camera tracker is capable but performs poorly without a lot of manual input. So I usually track cameras in Fusion and export the camera to Blender, roughly as sketched below. But it's still time-consuming, and the result is almost always well off the mark.
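For reference, the hand-off itself is easy to script on the Blender side once Fusion has exported the solved camera as FBX; this is just a sketch (the file path is made up, and it relies on Blender's FBX importer leaving the newly imported objects selected, which it does by default):

```python
import bpy

# Import the camera that was solved and exported from Fusion.
bpy.ops.import_scene.fbx(filepath="/shots/sh010/fusion_camera.fbx")

# Make the imported camera the active scene camera.
cam = next(o for o in bpy.context.selected_objects if o.type == 'CAMERA')
bpy.context.scene.camera = cam
```

The scripting isn't the bottleneck; getting a solve accurate enough to be worth importing is.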
So, if a neural engine camera tracker were the ONLY update in the next version of Fusion, that alone would justify a whole version bump. Basically, if Fusion had something like EBsynth, Runway, and a camera tracker, it's game over for other compositing software. Doing a camera track, keying someone without a green screen, and adding features to that person in minutes rather than hours would revolutionize compositing workflows.
Here are the other tools I mentioned, for reference:
https://runwayml.com/
https://ebsynth.com/