- Posts: 36
- Joined: Thu Nov 21, 2019 2:45 pm
- Real Name: Frederick Destrem
fredenchine wrote:- My camera (Z-Cam K1) exports H264 VR180 SBS (stereoscopic) with a res. of 5760x2880 (2x2880x2880). What is the best workflow / conversion to work inside Davinci ?
For Resolve's native stereo 3D color-correction workflows, you will find the Stereo 3D system prefers separate left- and right-eye video clips. That way you can manage the stereo pairs in the Media Pool and pass the footage through an Edit/Color/Deliver page workflow.
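If you want to pre-split the SBS file into per-eye clips before import, the crop geometry is simple arithmetic. A minimal sketch (the ffmpeg commands in the comments are one assumed way to apply the same crop rectangles, if you have ffmpeg available):

```python
def sbs_eye_crops(width, height):
    """Return per-eye crop rectangles (w, h, x, y) for a side-by-side frame.

    For a 5760x2880 SBS VR180 source this yields two 2880x2880 square
    fisheye eyes. With ffmpeg (assumed) the same geometry would be, e.g.:
      ffmpeg -i sbs.mp4 -vf "crop=2880:2880:0:0"    left.mov
      ffmpeg -i sbs.mp4 -vf "crop=2880:2880:2880:0" right.mov
    """
    eye = width // 2
    return {
        "left":  (eye, height, 0, 0),    # left half of the frame
        "right": (eye, height, eye, 0),  # right half of the frame
    }

print(sbs_eye_crops(5760, 2880))
```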
Is your goal to keep the footage in a side by side fisheye format for all of the post-production?
Or is your goal to convert the footage into a panoramic image projection converted format like:
- Side by Side Stereo packed into a cropped 180°x180° LatLong format with a blending mask on the edge of the frame
- OverUnder Stereo 360°x180° LatLong format with black filling the unused frame area
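Either conversion comes down to a per-pixel direction remap between the fisheye and LatLong projections. A sketch of the core math, assuming an ideal equidistant fisheye lens (radius proportional to the angle off the optical axis — a real lens needs a calibrated distortion profile on top of this):

```python
import math

def latlong_to_fisheye(lon, lat, fov_deg=180.0):
    """Map a LatLong direction (radians) to normalized equidistant-fisheye
    coordinates in [-1, 1]. lon=0, lat=0 is the forward optical axis."""
    # LatLong angles to a 3D unit direction (z forward, x right, y up).
    x = math.cos(lat) * math.sin(lon)
    y = math.sin(lat)
    z = math.cos(lat) * math.cos(lon)
    theta = math.acos(max(-1.0, min(1.0, z)))  # angle off the optical axis
    phi = math.atan2(y, x)                     # rotation around that axis
    r = theta / math.radians(fov_deg / 2.0)    # equidistant model: r ~ theta
    return r * math.cos(phi), r * math.sin(phi)

# The forward axis lands in the image center; 90 degrees to the right lands
# on the right edge of the fisheye circle.
print(latlong_to_fisheye(0.0, 0.0))          # ~ (0.0, 0.0)
print(latlong_to_fisheye(math.pi / 2, 0.0))  # ~ (1.0, 0.0)
```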
Check out the trial versions of these VR tools that work in Resolve/Fusion:
Boris CONTINUUM VR UNIT
https://borisfx.com/products/continuum-units/continuum-vr/
RE:Vision Effects RE:Lens
https://revisionfx.com/products/relens/resolve/
You can also use UV Pass (ST Map) warping workflows in Resolve/Fusion too:
RE:Vision Effects RE:Map
https://revisionfx.com/products/remap/resolve/
CustomShader3D
(If you have a Fusion Studio dongle you could also check out CustomShader3D, which was mentioned previously in this thread. It's programmable, so any conversion imaginable is possible via fragment shaders; you just need coding skills to tap into the tool's full power.)
https://indicated.com/blackmagic-fusion-plugins/custom-shader-3d/
The Steakunderwater Reactor package manager for Resolve + Fusion also has free VR tools that can be of some assistance. The macro-, script-, and fuse-based content in Reactor is slower to render than compiled OFX plugins or Fusion native plugins, but it covers a wide range of immersive post-production needs. If you install KartaVR via the Reactor package manager, make sure to check out the KartaVR example media projects page. There are several stereo VR workflow example comps on that page that go over the basic node connections required in Fusion.
fredenchine wrote:- What nodes do you need in your fusion pipeline to render from a 3D text or graphic to a SBS 3D VR ? (180 or 360)
For VR180 workflows if you are using a parallel stereo style camera rig to film the footage you can use Fusion's Camera3D, SphericalCamera, and Renderer3D nodes to create the Fusion 3D rendered output needed.
If the footage moves and was filmed with a dolly or gimbal, you will likely need to track and match-move the stereo VR180 footage in an external camera tracking program like SynthEyes Pro if your goal is maximum flexibility and control. But you could always experiment with the Fusion Studio "Spherical Stabilizer" node to see what results you get after converting your fisheye SBS VR180 media into a pair of separate left and right views padded out to a LatLong 360°x180° format.
You will need to use the "Splitter" and "Combiner" nodes to break apart, and later merge back together, your left- and right-eye SBS footage when running media through the Fusion page.
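Conceptually, those two nodes just slice and stack the frame. A toy illustration in plain Python, treating a frame as a list of pixel rows (the helper names here are mine for illustration, not Fusion API calls):

```python
def split_sbs(frame):
    """Splitter-style: break a side-by-side frame into (left, right) eyes."""
    half = len(frame[0]) // 2
    left = [row[:half] for row in frame]
    right = [row[half:] for row in frame]
    return left, right

def combine_over_under(left, right):
    """Combiner-style: stack the eyes vertically (left eye on top)."""
    return left + right

# A tiny 2x4 "frame": L pixels in the left half, R pixels in the right half.
frame = [["L", "L", "R", "R"],
         ["L", "L", "R", "R"]]
left, right = split_sbs(frame)
stacked = combine_over_under(left, right)
print(left)     # [['L', 'L'], ['L', 'L']]
print(stacked)  # 4 rows: two rows of L pixels, then two rows of R pixels
```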
The "LatLong Patcher" node, used together with a "Paint" node with a clone multistroke active, can handle cloning/patching work if required.
It's possible to explore Stereo 6DoF VR disparity mapping + depthmap workflows in Fusion Studio and Resolve Studio via "Disparity", "DisparityToZ", and "CopyAux" nodes if you want to go that deep.
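The relationship those depth workflows rest on is standard stereo triangulation: depth is inversely proportional to disparity. A hedged sketch for an idealized parallel camera rig (Fusion's actual nodes do far more per-pixel work, such as smoothing and occlusion handling):

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Idealized parallel-rig triangulation: Z = f * b / d.

    focal_px     -- focal length expressed in pixels
    baseline_m   -- interaxial distance between the two cameras, in meters
    disparity_px -- horizontal pixel shift of a feature between the eyes
    """
    if disparity_px == 0:
        return float("inf")  # zero disparity means the point is at infinity
    return focal_px * baseline_m / disparity_px

# 65 mm baseline, 1000 px focal length, 10 px disparity -> about 6.5 m away.
print(depth_from_disparity(1000, 0.065, 10))  # ~ 6.5
```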
In the Nodes view, it helps to use the right-click "Paste Instance" option in the tool's contextual menu to keep the separate left- and right-eye image-processing nodes in sync. An instanced node shares the exact same settings as its source node, so a stereo workflow is much easier to manage.