
Emulating Nuke's Point Cloud Function for Relighting


Ryan Bloomer

  • Posts: 903
  • Joined: Tue Jul 09, 2013 6:58 pm

Emulating Nuke's Point Cloud Function for Relighting

Posted: Wed Sep 28, 2016 1:40 pm

I'm trying to emulate the technique found in this video by Image Engine for District 9 inside of Fusion.



Basically, it takes the WPP render from 3D, displaces an image plane with a high subdivision level, and projects the Beauty render back onto the displacement. I can get a basic setup working, but I run into a few problems. After creating the displacement and importing the 3D scene camera, everything lines up, but when using Displace3D with a scale of 100 and Position as the source, the displacement comes from a single point, so the entire mesh originates from one point. Is there a way to stop the position pass from displacing to a single point? Reason being, in order to merge the Fusion 3D-space Render3D with the original Beauty render, I need to mask the render and use a Matte Control to control the edge.
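For reference, the core of the technique is just reading the world position stored in RGB back as vertex positions. Here is a minimal numpy sketch (illustrative only, not Fusion code; function and variable names are made up) that also shows where a single-point collapse can come from: empty pixels in the pass are black, so their vertices all land on the world origin.

```python
import numpy as np

def displace_plane_from_wpp(wpp, scale=1.0):
    """Build a vertex grid from a world-position pass (WPP).

    wpp: (H, W, 3) float array, RGB = world-space X, Y, Z per pixel.
    Returns an (H, W, 3) array of vertex positions. Pixels outside the
    rendered geometry are usually black (0, 0, 0), so their vertices
    all collapse to the world origin: the "single point" artifact.
    """
    return wpp.astype(np.float64) * scale

# Toy 2x2 pass: three covered pixels, one empty (black) pixel.
wpp = np.array([[[1.0, 2.0, 3.0], [1.5, 2.0, 3.0]],
                [[1.0, 1.5, 3.0], [0.0, 0.0, 0.0]]])
verts = displace_plane_from_wpp(wpp, scale=100.0)
print(verts[1, 1])  # the empty pixel lands at the origin: [0. 0. 0.]
```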

Included is an image of how the displace3d is displacing from a single point.
Attachments
displace3D_singlepoint.PNG

Chad Capeland

  • Posts: 3308
  • Joined: Mon Nov 10, 2014 9:40 pm

Re: Emulating Nuke's Point Cloud Function for Relighting

Posted: Wed Sep 28, 2016 5:49 pm

What are you expecting to happen? What are the position channel values for those pixels?
Chad Capeland
Indicated, LLC
www.floweffects.com

Ryan Bloomer

Re: Emulating Nuke's Point Cloud Function for Relighting

Posted: Wed Sep 28, 2016 10:05 pm

Hey Chad,

The position data at the front of the displacement is:
PX: 613.14.94
PY: -33.425
PZ: 6.13066

Ideally the displacement wouldn't continue back to a single point. If I add a spot light, the shadows fall between the mesh and the "geo" that stretches back to a single point (not sure how to describe it better).

Image below: I was expecting to have the mesh end where the red outlines are.
displace3D_singlepoint_marked.jpg


The end goal is to be able to add additional shading/texturing, in a render from 3D, with the ability to place and see the comp in Fusion's 3D space, without needing to import geometry.

Also included is the original 3D render, and the attempted relighting with the displaced WPP on an image plane with 1000 subdivisions.

3D render.PNG

Fusion_3Drender_relight_withWPP displacement.PNG

Sander de Regt

  • Posts: 4133
  • Joined: Thu Nov 13, 2014 10:09 pm

Re: Emulating Nuke's Point Cloud Function for Relighting

Posted: Wed Sep 28, 2016 10:09 pm

If I may take a guess: maybe the WPP is anti-aliased (or the displacement uses the wrong filter type), so that where the render crosses over into nothingness (which is what happens at the edge of your render) the depth values are somewhat corrupted.
Sander de Regt

ShadowMaker SdR
The Netherlands

Joël Gibbs

  • Posts: 97
  • Joined: Wed Nov 12, 2014 9:18 pm
  • Location: Nashville

Re: Emulating Nuke's Point Cloud Function for Relighting

Posted: Wed Sep 28, 2016 10:33 pm

The single point back there is most likely everything that is alpha, so it's pushing all those mesh points back to (0, 0, 0).

Chad Capeland

Re: Emulating Nuke's Point Cloud Function for Relighting

Posted: Thu Sep 29, 2016 12:57 am

It's not about being in alpha, it's about the position pass being [0,0,0], which means all of the points are exactly where they were told to be. :)

Ryan Bloomer

Re: Emulating Nuke's Point Cloud Function for Relighting

Posted: Thu Sep 29, 2016 1:05 pm

Thanks guys, it all seems to make sense. I just can't get it to work the way I imagined it would. The next step will be to bring the geo into Fusion, create the WPP from within Fusion itself, and check the differences between the 3D render and Fusion's render; maybe the WPP is being anti-aliased at render time.

Also, I'm not sure why there's a golf-ball effect happening to the displacement; I would suspect this to be smooth like the render from 3D was.

I've seen others use this technique with a particle system, but it sounds like that is more for reference to where things are located in 3D space, not for relighting purposes. Any other suggestions would be great, I'll keep exploring.

Chad Capeland

Re: Emulating Nuke's Point Cloud Function for Relighting

Posted: Thu Sep 29, 2016 3:25 pm

Ryan Bloomer wrote: I would suspect this to be smooth like the render was from 3D.


It wasn't smooth in 3D though. It only LOOKED smooth because of the way the normals were interpolated.

Ryan Bloomer wrote: I've seen others use this technique with a particle system, but it sounds like that is more for reference to where things are located in 3D space, not for relighting purposes. Any other suggestions would be great, I'll keep exploring.


For relighting, you might be better off doing that in a shader.

Ryan Bloomer

Re: Emulating Nuke's Point Cloud Function for Relighting

Posted: Thu Sep 29, 2016 3:29 pm

Thanks Chad,

When you say relighting with a shader, do you mean using the 2D method with the normals pass?

Chad Capeland

Re: Emulating Nuke's Point Cloud Function for Relighting

Posted: Thu Sep 29, 2016 4:46 pm

Ryan Bloomer wrote: Thanks Chad,

When you say relighting with a shader, do you mean using the 2D method with the normals pass?


There are two ways: you can either use the geometry to hold the textures you use for shading, or you can apply the textures to a pair of triangles and get your position from a texture. Position from a texture is a bit trickier, but it saves you from having to displace crazy geometry. Either way, you'll use normals, albedo, roughness, object masks, etc.

Klaus May

  • Posts: 15
  • Joined: Sat Sep 26, 2015 1:32 pm

Re: Emulating Nuke's Point Cloud Function for Relighting

Posted: Thu Sep 29, 2016 8:32 pm

Ryan Bloomer wrote: The end goal is to be able to add additional shading/texturing, in a render from 3D, with the ability to place and see the comp in Fusion's 3D space, without needing to import geometry.

Why don't you want to import your geometry? I got very good relighting results in Fusion with the mesh exported as FBX and a beauty pass rendered out and projected back onto the mesh. I also tried the displacement method, but you need a very high level of subdivision and still get artifacts on the edges. Projecting onto the real mesh gives far more accurate results.

Ryan Bloomer

Re: Emulating Nuke's Point Cloud Function for Relighting

Posted: Fri Sep 30, 2016 3:06 pm

I was trying to stay away from bringing in geometry for a few reasons. Most of my Fusion compositing is background replacement, usually interiors, and these get pretty high in poly count. I'm exploring ways of quickly retopologizing the mesh so I can bring it into Fusion and project the Beauty back onto the retopo mesh.

I'm interested in using this technique, but I was trying to find a faster way to do it by only needing the position pass and then displacing the Beauty. Seems like I'll have to re-explore the retopo approach.

The thing I usually run into with projection mapping on interior renders is getting reflections to move when I do a camera move on the projection. World reflections work well, but per-object reflections never seem to look right. I usually fall back to rendering image sequences of the move in 3D and then finessing those renders in Fusion with relighting and CC.

Kristof Indeherberge

  • Posts: 75
  • Joined: Fri Jul 01, 2016 8:15 pm

Re: Emulating Nuke's Point Cloud Function for Relighting

Posted: Tue Oct 04, 2016 9:14 pm

Hi Ryan,

Using the position pass as a source to regenerate the geometry is something I use on a daily basis; it is useful in so many ways. The position pass is nothing more than the x, y, z position of a pixel stored in RGB, so you want to avoid black pixels in your render: they put verts of your image plane at the origin, and that's what you are seeing.

To avoid it, I always include an infinity sphere (polys facing inwards, made big enough that your scene and camera fit inside) or a plane (parented to your camera, with an expression scaling it according to the FOV and the distance to the camera, pushed far enough back) in my 3D scene. That way there's always geo visible, and those pesky black pixels are gone too.
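The plane-scaling expression described above boils down to simple trigonometry. Here is a sketch under assumed pinhole-camera conventions (the function and parameter names are illustrative, not a Fusion expression):

```python
import math

def backing_plane_size(fov_h_deg, distance, aspect):
    """Size a camera-parented backing plane so it always fills the view.

    fov_h_deg: horizontal field of view in degrees
    distance:  how far down the camera axis the plane is pushed
    aspect:    image aspect ratio (width / height)
    Returns (width, height) in scene units. In practice you would pad
    slightly so filtering at the frame edge never reveals the void.
    """
    width = 2.0 * distance * math.tan(math.radians(fov_h_deg) / 2.0)
    return width, width / aspect

w, h = backing_plane_size(fov_h_deg=90.0, distance=1000.0, aspect=16 / 9)
print(round(w, 1), round(h, 1))  # 2000.0 1125.0
```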

As an alternative, you can also create a P render of such a sphere or plane in Fusion and merge your position pass on top of it (assuming you have a proper alpha). But that will slow things down.

I also believe that Raf added tools to Krokodove to take care of this, so you might give it a go if you have Studio running.

What else? Always render your technical AOVs unfiltered. And make sure that the plane you are displacing in comp has as many verts as you have pixels. You can streamline this with expressions.

The faceting you are seeing is due to the dense geometry: with so many pixels (points) describing each triangle in your mesh, you lose the smoothing you would get when there is enough room to interpolate (as is the case in your 3D app). Even though we have a ReplaceNormals3D node, it won't change anything at that density.
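The interpolation point can be illustrated outside Fusion. This is a minimal numpy sketch (illustrative, not Fusion code) of the flat geometric normal each tiny triangle ends up shading with, instead of the interpolated vertex normals a renderer uses for the smooth look:

```python
import numpy as np

def geometric_normal(p0, p1, p2):
    """Flat (geometric) normal of one triangle via the cross product.

    With one vertex per pixel, every tiny triangle in the displaced
    plane is shaded with a normal like this, quantized by the precision
    of the position pass, which reads as faceting.
    """
    p0, p1, p2 = (np.asarray(p, dtype=np.float64) for p in (p0, p1, p2))
    n = np.cross(p1 - p0, p2 - p0)
    return n / np.linalg.norm(n)

# Two adjacent triangles from a nearly flat surface: a tiny height
# difference on one vertex already yields two distinct flat normals,
# i.e. a visible facet edge.
n1 = geometric_normal([0, 0, 0], [1, 0, 0], [0, 1, 0])
n2 = geometric_normal([1, 0, 0], [1, 1, 0.02], [0, 1, 0])
print(n1, n2)  # both near [0 0 1], but not equal
```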

So what can you do? You can buy Chad's awesome Custom Shader 3D plugin and replace the normals in Fusion on a shader level by those coming from your renders. I have been experimenting with that and you can grab some code to get you started over here:

steakunderwater.com/wesuckless/viewtopic.php?f=16&t=802&start=45#p7663

Chad was kind enough to help me out. You could do something similar and replace the UVs too.

Having done that, you can now use 3D lights in Fusion to relight. But you have more options: you could also use the normals AOV to relight without using the 3D engine, or convert your 3D light's rotation to a vector that drives the 2D relight tool. This has the benefit of being smooth. There are macros around that do this, dating back to 2009-2011-ish on the old PigsFly forum. A friend of mine made that part of a plugin set he shares for free. Check out the UV unwrap plugin too while you're at it.
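The rotation-to-vector conversion mentioned above can be sketched like this (Python, purely illustrative: it assumes an unrotated light aims down -Z and an X-then-Y rotation order, so the signs may need flipping for a given setup):

```python
import math

def light_direction(rx_deg, ry_deg):
    """Convert a 3D light's X/Y rotation into the unit aim vector a 2D
    normals-based relight tool can use.

    Assumes the light points down -Z when unrotated; rx tilts it up or
    down, ry pans it left or right. Depending on the relight tool you
    may need the negated vector (surface-to-light rather than aim).
    """
    rx, ry = math.radians(rx_deg), math.radians(ry_deg)
    x = -math.sin(ry) * math.cos(rx)
    y = math.sin(rx)
    z = -math.cos(ry) * math.cos(rx)
    return x, y, z

d = light_direction(0.0, 0.0)
print(d)  # straight down -Z for an unrotated light
```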

Grab it here: aliendesigns.be?page_id=247

You need to mail him to get a compiled version for Fu8+. It's done, but there is no link yet.

And once you get used to this type of workflow, you will be hooked. Streamline it further by storing your camera info as metadata in your renders, so you can generate the cam on the fly and it will always be in sync without exporting and importing cameras. Believe me, this workflow outperforms Nuke in many, many ways. Try the displaceGeo node in Nuke and go to dinner.

The geometry is also useful for creating hold-out mattes combined with image planes in a 3D setup... particles... Have a look at Robocop's demo to get your creative juices flowing:

vimeo.com/14314572

Remember, it sucks less.

Ryan Bloomer

Re: Emulating Nuke's Point Cloud Function for Relighting

Posted: Thu Oct 06, 2016 1:34 pm

Thanks for all the detailed information Kristof!

Awesome to hear you're using this type of workflow daily; it's reassuring that this approach is viable. I'll explore the techniques you mention to try and solve some of my issues, as being able to see AND relight in 3D would offer a level of control that I don't currently have.

And make sure that your plane that you are displacing in comp has as many verts as you have pixels. You can streamline this by using expressions.


Does this mean that for a 1080 comp you would need 2,073,600 subdivisions? That seems like it would cripple Fusion, as even 2000 subdivisions really slowed things down when I tried this.

The linked video was really helpful in setting up the displacement originally. I've been displacing from Pos rather than RGB, and to get the Alembic camera to line up I've had to set the displace size to 100. I'll see if I can get the scene to re-align when using RGB to displace, as well as incorporating an infinity sphere to get rid of the black pixels.

I've been able to use the 2D normals approach with a lot of success for 3D inserts (mainly live action, placing 3D into the plate). The 2D approach works quite well for these types of shots.

For anyone following this thread, Gurt Gartner has a great tutorial on relighting with normals.


There is also a macro floating around called "3 point re-light" that works really well for insert type shots.
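For readers following along, the normals-pass relighting idea reduces to an N dot L (Lambert) product per pixel. A minimal numpy sketch (illustrative names only, not any particular macro or the Fusion relight tool):

```python
import numpy as np

def relight_lambert(normals, albedo, light_dir, light_color=(1.0, 1.0, 1.0)):
    """Minimal 2D relight from a normals pass.

    normals:   (H, W, 3) unit normals (world or camera space)
    albedo:    (H, W, 3) base color
    light_dir: direction toward the light; normalized internally
    Clamps negative N dot L to zero so back-facing pixels go dark.
    """
    l = np.asarray(light_dir, dtype=np.float64)
    l = l / np.linalg.norm(l)
    ndotl = np.clip(np.einsum('hwc,c->hw', normals.astype(np.float64), l),
                    0.0, None)
    return albedo * ndotl[..., None] * np.asarray(light_color)

# Flat, camera-facing surface fully lit by a light along +Z:
normals = np.tile([0.0, 0.0, 1.0], (4, 4, 1))
albedo = np.full((4, 4, 3), 0.5)
out = relight_lambert(normals, albedo, light_dir=(0, 0, 1))
print(out[0, 0])  # fully lit: [0.5 0.5 0.5]
```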

Ciro Cardoso

  • Posts: 33
  • Joined: Wed May 18, 2016 9:00 am

Re: Emulating Nuke's Point Cloud Function for Relighting

Posted: Thu Oct 06, 2016 2:45 pm

Thanks for sharing the video. I've always wondered how to use the normals pass.

Joël Gibbs

Re: Emulating Nuke's Point Cloud Function for Relighting

Posted: Thu Oct 06, 2016 3:56 pm

Ryan Bloomer wrote: Thanks for all the detailed information Kristof!


And make sure that your plane that you are displacing in comp has as many verts as you have pixels. You can streamline this by using expressions.


Does this mean that for a 1080 comp you would need 2,073,600 subdivisions? That seems like it would cripple Fusion, as putting 2000 subdivisions when trying this really slowed things down.




In an ImagePlane3D you can unlock the width/height subdivisions. For an HD image, put 1920 in the Subdivision Width slider and 1080 in the Subdivision Height slider.

Chad Capeland

Re: Emulating Nuke's Point Cloud Function for Relighting

Posted: Thu Oct 06, 2016 5:31 pm

Ciro Cardoso wrote: Thanks for sharing the video. I always wonder how to use the normal pass.


The problem with that technique is that it's so abstracted from the 3D scene. Placing lights, adjusting shading, or using different passes is really hard.

Chad Capeland

Re: Emulating Nuke's Point Cloud Function for Relighting

Posted: Thu Oct 06, 2016 5:33 pm

Joël Gibbs wrote:
Ryan Bloomer wrote: Thanks for all the detailed information Kristof!


And make sure that your plane that you are displacing in comp has as many verts as you have pixels. You can streamline this by using expressions.


Does this mean that for a 1080 comp you would need 2,073,600 subdivisions? That seems like it would cripple Fusion, as putting 2000 subdivisions when trying this really slowed things down.




In an ImagePlane3D you can unlock the width/height subdivisions. For an HD image, put 1920 in the Subdivision Width slider and 1080 in the Subdivision Height slider.


Position alone doesn't create accurate normals, though, no matter what the vertex-to-pixel ratio is. That's why you need a normals pass.

Eric Westphal

  • Posts: 215
  • Joined: Thu Nov 20, 2014 1:59 pm

Re: Emulating Nuke's Point Cloud Function for Relighting

Posted: Thu Oct 06, 2016 6:27 pm

Joël Gibbs wrote: In an ImagePlane3D you can unlock width/height subdivisions. For an HD image, 1920 in your Subdivision Width slider, and 1080 in your Subdivision Height slider.


In fact it's Width-1 and Height-1, so 1919 x 1079 subdivisions in this case.
N subdivisions give N+1 vertices along an axis, and you want each pixel to sit
exactly on a vertex, not between two vertices, for proper and accurate
displacement via the WPP.
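The Width-1 / Height-1 rule is easy to sanity-check (illustrative Python, not Fusion):

```python
def subdivisions_for(width_px, height_px):
    """Subdivision counts that put exactly one vertex on every pixel.

    N subdivisions along an axis produce N + 1 vertices, so to match
    the pixel grid you need (pixels - 1) subdivisions per axis.
    """
    return width_px - 1, height_px - 1

print(subdivisions_for(1920, 1080))  # (1919, 1079)
print((1919 + 1) * (1079 + 1))       # 2073600 vertices, one per pixel
```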

Cheers.

Eric.
my hovercraft is full of eels.

Joël Gibbs

Re: Emulating Nuke's Point Cloud Function for Relighting

Posted: Thu Oct 06, 2016 6:35 pm

AH! :D makes sense. Thanks for pointing that out Eric!
