
Fusion VR 3D 360 work flow

Learn about 3D compositing, animation, broadcast design and VFX workflows.

Joel Peck

  • Posts: 10
  • Joined: Wed Dec 11, 2013 2:56 pm

Fusion VR 3D 360 work flow

Posted: Mon Aug 28, 2017 9:15 pm

There are no tutorials or even basic workflow docs on a VR workflow, so I appreciate any help. I don't know if there is an issue or if the VR tools in Fusion just currently don't do what I would think they should.

I have been compositing for many years but am new to Fusion.

Windows 10
Dell Precision, 36 cores hyper-threaded to 72 virtual cores
96GB memory
GTX 1080 Ti
Oculus Rift CV1

I am beginning a VR project and am deciding on AE or Fusion, or both. My workflow is to use Maya output, which provides an over-under stereo LatLong (equirectangular) sequence. Currently I will be doing that with 4K x 4K 16-bit TIFF, but may increase the resolution and/or use 32-bit float EXR in the future. I will also be outputting images, geometry, etc. from Maya and other sources to composite with or without the 4K OU stereo LatLong as part of the composite. In other words, a typical advanced stereoscopic compositing workflow, except in VR.

I followed the basic tutorial for a 3D stereo composite and have logically tried to apply it to a VR 3D stereo workflow:

1. Starting with a Loader node, the 4K x 4K stereo OU LatLong is imported into the comp, which works fine. All the monitor selections at the bottom of the node work, I am able to view it in the viewers, and it is also available in the Rift in full 3D 360 as expected.
2. I tried a number of ways to go from #1, and what worked was a stereo Splitter node, which also appears to work fine and still shows the image in full 360 3D.
3. Then to an ImagePlane3D node, where the 360 view is no longer available.
4. Then to a Merge3D, where the image plane and the VR spherical camera are connected. The spherical camera is set to LatLong.
5. Out of the Merge3D to a Renderer3D node, which provides the 360 view, but now it is no longer viewable as 3D 360 in the viewer or the Rift.

I just learned from support that in node position #3 it should be a Shape3D node set to sphere. That did work to keep the VR 360 going into the render node. However, the image quality has now degraded a good bit and the whole scene is reversed. I can usually progress intuitively with these things, having been doing this type of work for so long, but I must say this is not at all intuitive and I need help. A good tutorial would be nice, but in the meantime, is there anyone out in the forum with this knowledge to pass on to a fellow traveler?

Please..
Last edited by Joel Peck on Tue Aug 29, 2017 1:36 pm, edited 1 time in total.

Hendrik Proosa

  • Posts: 3033
  • Joined: Wed Aug 22, 2012 6:53 am
  • Location: Estonia

Re: Fusion VR 3D 360 work flow

Posted: Tue Aug 29, 2017 7:24 am

It sounds like you are re-rendering your stereo backplate through the VR camera. It would be better to render the additional elements (geometry and whatnot) separately and then combine them with a Merge in equi form, first making sure that the render output is a proper 360 image and that you actually have two correct equi views. The VR viewer is nice, but to see what you actually have, it is better to check your views in equi format and do a difference/wipe on the views to confirm you get a correctly rendered stereo panorama. You can always manually split your stereo layout into two mono images, do stuff to them, and then combine them back into the stereo layout at the end.
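To make the split/recombine concrete, here is a minimal numpy sketch of that round trip outside Fusion (the file names and the left-eye-on-top convention are my assumptions; check your own layout):

```python
import numpy as np
import imageio.v3 as iio

# One over-under stereo lat-long frame, e.g. a 4096x4096 16-bit TIFF.
frame = iio.imread("ou_latlong_0001.tif")   # hypothetical file name
h = frame.shape[0] // 2

# Split into two mono equirectangular views (4096x2048 each).
# Assumed convention: left eye on top, right eye below.
left, right = frame[:h], frame[h:]

# ... per-eye corrections / comp work would happen here ...

# Restack into the over-under layout at the end.
iio.imwrite("ou_latlong_0001_out.tif", np.vstack([left, right]))
```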

I haven't fiddled with Fusion's VR tools much yet, so I'm not sure if any filters are supported for 360 images. This is usually one messy part of 360 comping, because you can't just blur an equi image and expect the blur to be even in the horizon and pole areas. Pixel-based ops (merges, grades and other similar non-spatial ops) should work fine though.
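To see why the blur goes wrong: a pixel's horizontal angular size in an equi image shrinks with cos(latitude), so a fixed-width kernel covers ever more of the scene toward the poles. A rough numpy sketch of the kernel width you would need per row to get a constant angular blur:

```python
import numpy as np

height = 2048                                  # rows of a 4096x2048 equi image
rows = np.arange(height)
lat = (0.5 - (rows + 0.5) / height) * np.pi    # +pi/2 (top) .. -pi/2 (bottom)

base_radius_px = 10.0                          # desired blur radius at the equator
# Per-row radius for an even angular blur, clamped near the poles
# where 1/cos(lat) blows up.
radius_px = np.minimum(base_radius_px / np.maximum(np.cos(lat), 1e-3), 4096)

print(radius_px[height // 2])   # ~10 px at the equator
print(radius_px[50])            # ~130 px near the top pole
```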
I do stuff.

David_Cox

  • Posts: 108
  • Joined: Thu Aug 03, 2017 11:20 pm
  • Location: London UK

Re: Fusion VR 3D 360 work flow

Posted: Tue Aug 29, 2017 10:03 am

Just to add to what Hendrik explained...

I agree that you didn't actually need to place your lat-long image into your Fusion 3D scene. This is because the lat-long image is a flat background, ready to be comped over by the rendered output of the spherical camera. If you ever did want to put a lat-long wrap-around image into a 3D scene, you would apply it as a texture to a sphere whose centre is at the same position as the camera. This way, whichever way we look, we see the inside of the sphere at an equal distance from the camera. The reason you noticed the image then looks flipped is that Fusion presumes you will be looking at the sphere from the outside. But with 360 video we are on the inside of the sphere looking out, so we need to flip the incoming image, which we can do with a Transform node attached to the Loader. We would also most likely not want the wrap-around image to be shadowed, so we would un-check the "affected by lights" option in the sphere's control panel.
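If it helps, the mapping here is just lat-long texture coordinates to directions on the sphere, and the inside-out viewing is a horizontal mirror. A small numpy sketch (the exact u/v conventions vary between packages, so treat the axes as assumptions):

```python
import numpy as np

def latlong_to_dir(u, v):
    # Map lat-long texture coords (u, v in 0..1) to a unit direction.
    # One common convention (it varies between packages):
    # u sweeps longitude 0..2*pi, v sweeps latitude +pi/2 (top) .. -pi/2.
    lon = u * 2.0 * np.pi
    lat = (0.5 - v) * np.pi
    return np.array([np.cos(lat) * np.sin(lon),
                     np.sin(lat),
                     np.cos(lat) * np.cos(lon)])

# Viewed from *inside* the sphere, left and right swap: the texel you
# see toward longitude lon is the one authored for 2*pi - lon, i.e.
# u -> 1 - u, which is exactly the horizontal flip the Transform node
# applies.
u_inside = 1.0 - 0.25   # e.g. what you see at u = 0.25 from inside
```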

But as Hendrik correctly pointed out, you probably don't need to put your lat-long image into your 3D scene. If you do, your lat-long image will be loaded, wrapped to a sphere, unwrapped from the sphere (by the spherical camera) and rendered back to where it started. At best it's a waste of CPU cycles; at worst, it will keep getting in the way. Also, because you have a stereo lat-long image, you would need two spheres, one for the left view and one for the right. All potentially a bit messy.

A better approach is to keep your pre-rendered lat-long background separate from your true 3D pipeline, as Hendrik suggested. If you think of it as a good old-fashioned 2D comp, the lat-long image is just the background, nothing more, and so can be comped over pretty much at the end of the flow.

So your flow would look like (see the scripted sketch below):
(1) 3D flow: FBX loader > spherical Camera3D > lights > Merge3D > Renderer3D
(2) Lat-long flow: just a Loader plus any corrections you want to apply to it.
(3) Merge 1 over 2 and view from there.
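If a script helps you see the shape of it, something like this builds that flow via Fusion's Python scripting (run from Fusion's Console, where `fusion` already exists; the tool and input identifiers are from memory and worth checking against the current docs):

```python
# Sketch of the flow above via Fusion's scripting API (names assumed).
comp = fusion.GetCurrentComp()
comp.Lock()

# (1) True 3D flow: geometry -> Merge3D (+ spherical camera) -> Renderer3D.
geo   = comp.AddTool("SurfaceFBXMesh", 0, 0)   # FBX mesh import
cam   = comp.AddTool("Camera3D", 0, 1)         # set to spherical/LatLong in the UI
mrg3d = comp.AddTool("Merge3D", 1, 0)          # lights would join here too
rnd3d = comp.AddTool("Renderer3D", 2, 0)
mrg3d.SceneInput1 = geo.Output
mrg3d.SceneInput2 = cam.Output
rnd3d.SceneInput  = mrg3d.Output

# (2) Lat-long flow: just a Loader plus any corrections.
bg = comp.AddTool("Loader", 0, 2)

# (3) Comp the rendered elements over the lat-long background.
mrg = comp.AddTool("Merge", 3, 0)
mrg.Background = bg.Output
mrg.Foreground = rnd3d.Output

comp.Unlock()
```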

Just in case you haven't found it, there is also a new tool called the LatLong Patcher. It unwraps a segment of a lat-long image so it is flat and undistorted, which means you can carry out comp work on it without being concerned about the effects of the lat-long distortion. You use a LatLong Patcher on either side of your comp work: the first to un-distort and the second to re-distort back to lat-long.
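The math behind that kind of patcher is a gnomonic (rectilinear) projection; re-distorting is the same mapping run in reverse. A hedged numpy sketch of pulling a flat patch out of an equi image (nearest-neighbour sampling, axis conventions assumed):

```python
import numpy as np

def equi_patch(equi, yaw, pitch, fov, size):
    # Sample a flat, undistorted patch from an equirectangular image:
    # build a virtual pinhole view along (yaw, pitch), then look each
    # ray up in the equi image. Nearest-neighbour keeps it short.
    h, w = equi.shape[:2]
    half = np.tan(fov / 2.0)
    x, y = np.meshgrid(np.linspace(-half, half, size),
                       np.linspace(half, -half, size))
    z = np.ones_like(x)                        # image plane at z = 1

    # Rotate rays by pitch (around X), then yaw (around Y); the sign
    # conventions here are assumptions.
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    y2, z2 = cp * y + sp * z, -sp * y + cp * z
    x3, z3 = cy * x + sy * z2, -sy * x + cy * z2

    lon = np.arctan2(x3, z3)                              # -pi .. pi
    lat = np.arcsin(y2 / np.sqrt(x3**2 + y2**2 + z3**2))  # -pi/2 .. pi/2
    u = ((lon / (2 * np.pi) + 0.5) * w).astype(int) % w
    v = ((0.5 - lat / np.pi) * h).astype(int).clip(0, h - 1)
    return equi[v, u]

# e.g. a 512x512, 90-degree patch looking 30 degrees left of centre:
# patch = equi_patch(img, np.radians(-30), 0.0, np.radians(90), 512)
```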

Hope that helps.

DC
Showreels: vimeo.com/davidcoxtv
Website: www.davidcox.tv

Joel Peck

  • Posts: 10
  • Joined: Wed Dec 11, 2013 2:56 pm

Re: Fusion VR 3D 360 work flow

Posted: Tue Aug 29, 2017 1:56 pm

Thank you much for your responses and suggestions. I am new to Fusion; I am much more familiar with AE. I started my dive into compositing this project in Fusion because I wanted to move from AE to Fusion anyway, or at least add Fusion to my toolset, and thought it might be better for the VR effort.

Again, not yet being at all proficient in Fusion, I will need to digest these suggestions, try to apply them, and see what this flow looks like in practice before I can tell whether I have any further questions about them.

I haven't started to see how AE would handle this project yet, but at first look it seems that AE would let me use the LatLong OU split into L and R as a background, then build the comp in front of it, using Z placement to create the parallax/separation to simulate being within the scene if needed. That would be a much less complicated approach.
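As I understand it, that Z trick boils down to an opposite horizontal offset per eye; roughly like this numpy sketch (the sign convention for "nearer" is an assumption, and a constant shift is only a fair approximation near the horizon of an equi image):

```python
import numpy as np

def comp_with_disparity(left_bg, right_bg, elem, alpha, x, y, disp):
    # Comp a flat element over both eyes of a stereo pair, offset
    # horizontally in opposite directions; that offset (disparity) is
    # what reads as depth in the headset. Which sign means "nearer"
    # depends on your stereo convention, so treat this as illustrative.
    def over(bg, ex):
        out = bg.astype(np.float32).copy()
        h, w = elem.shape[:2]
        a = alpha[..., None]                   # (h, w, 1) matte
        out[y:y + h, ex:ex + w] = (elem * a
                                   + out[y:y + h, ex:ex + w] * (1.0 - a))
        return out

    return over(left_bg, x + disp), over(right_bg, x - disp)
```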

Do David or Hendrik or anyone else have an idea of how AE (with Skybox, of course) compares for a VR 3D workflow?

David_Cox

  • Posts: 108
  • Joined: Thu Aug 03, 2017 11:20 pm
  • Location: London UK

Re: Fusion VR 3D 360 work flow

Posted: Tue Aug 29, 2017 2:08 pm

It's probably worth allowing yourself some play time with Fusion, just to get the hang of how it works. It perhaps isn't as intuitive as it could be, but it is very powerful and flexible. Follow some general tutorials: even though they won't be about 360 video, they will lead you through how Fusion thinks about things. It is worth the effort.

I haven't comped 360 video in AE, but I would expect the same issue to arise: you don't want to double-process your lat-long background. You have already rendered it from your CGI program, so all its stereo and lat-long parameters will be correct. If you bring it into a 3D environment and shoot it back out again, it's likely to cause issues somewhere.

DC
Showreels: vimeo.com/davidcoxtv
Website: www.davidcox.tv

Joel Peck

  • Posts: 10
  • Joined: Wed Dec 11, 2013 2:56 pm

Re: Fusion VR 3D 360 work flow

Posted: Tue Aug 29, 2017 11:43 pm

OK that all makes a lot of sense.

Thank you much. A very big help.

I see what you mean about the double-processing of the 3D lat-long.

This will require much experimenting to find the best app and workflow.

A nice tutorial from BM would be even nicer.

zcream

  • Posts: 36
  • Joined: Sun Apr 08, 2018 11:55 pm
  • Real Name: Anmol mishra

Re: Fusion VR 3D 360 work flow

Posted: Mon Oct 08, 2018 6:57 pm

Would one of you gents care to post a tutorial on the workflow?


Joel Peck

  • Posts: 10
  • Joined: Wed Dec 11, 2013 2:56 pm

Re: Fusion VR 3D 360 work flow

Posted: Mon Oct 08, 2018 7:20 pm

I would like to post a tutorial, but none was provided by BM and I was unable to work out a good workflow for 360 VR 3D stereo. I was able to find a good flow for mono VR, but not for stereo VR with all the different types of 3D stereo layers I would require. I was looking forward to using Fusion because I like it and it has many benefits, but unfortunately I am using another comp app for 360 VR 3D stereo comping.
