Tracking for Modelers: How Nuke's New Point Cloud Generator Node Can Get You Pixel-Accurate Layouts
As someone who enjoys the long, tedious process of tracking, I've been dying for a program that takes the data of a solved virtual camera and creates high-density point clouds. As I'm sure our friends at The Foundry had been when they wrote the new Point Cloud Generator node in Nuke. And the best part is that you don't need to rely on The Foundry's camera solver for this. For those of you who want an accurate representation of a scene, inside and out, without busting out LIDAR, I recommend a few simple rules for capturing the scene you want.
Keep camera movements simple, but move around. I know you want to try a crazy 360° turn to prove how "badass" you are, but really, the solver will recognize and reinforce spatial locations much better if you keep it within 45°-60° of rotation.
Moving around is essential.
The camera solver's algorithm relies on what is called parallax.
What the computer is looking at is a flat, two-dimensional picture (most of the time); if you're not shooting in stereo, there is very little to clue the computer in on what's going on. How points of contrast move throughout the plate makes the difference between the computer being able to map out the depth and the shot being unsolvable.
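To make the idea concrete, here is a minimal sketch (my own illustration, not Nuke's internals) of why parallax encodes depth: for a simple sideways camera move, the apparent on-screen shift of a feature is inversely proportional to its distance, which is exactly what lets a solver turn 2D motion into 3D positions.

```python
# Minimal parallax illustration (not Nuke's solver): for a purely
# sideways camera move, a feature's on-screen shift (disparity) is
# inversely proportional to its depth. Features that shift by
# different amounts must sit at different distances.

def depth_from_disparity(baseline, focal_px, disparity_px):
    """Triangulate depth from the shift of a feature between two views.

    baseline     -- how far the camera moved sideways (scene units)
    focal_px     -- focal length expressed in pixels
    disparity_px -- how many pixels the feature appeared to move
    """
    if disparity_px == 0:
        raise ValueError("zero disparity: a nodal pan gives no depth")
    return baseline * focal_px / disparity_px

# A nearby feature shifts a lot, a distant one barely moves:
near = depth_from_disparity(baseline=0.5, focal_px=1000, disparity_px=50)  # 10.0 units
far = depth_from_disparity(baseline=0.5, focal_px=1000, disparity_px=5)    # 100.0 units
```

Note that when the disparity is zero, the depth is undefined, which is the mathematical reason nodal pans (covered below) can't produce a usable cloud.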
Do not move the camera too fast.
Especially with CMOS-sensor cameras.
Heavy movement (really, any movement) can invoke an evil called rolling shutter, where the bottom of the frame isn't updated as fast as the top. The Foundry makes a great plug-in to deal with this; I recommend you check it out. Other trackers also give you tools to deal with the issue, but if you're filming simply for the point cloud, the best policy is to just keep it steady, and at a reasonable speed.
Motion blur will also work aggressively to degrade the quality of the track, if you get a usable solve out of it at all.
Nodal pans (a pan attached to one single node/point in space, such as a pan from a tripod) give the solver no parallax: unless given more data, a parallax-based solve will return a spherical cluster of point cloud data. Even though the camera is solved correctly, the point cloud will not be. For documenting a scene, free-hand motion is recommended.
Getting the Point Cloud out of Nuke
Now you've got your data in Nuke, but as the modeler you are, it's not going to do you much good if you can't use it in your favorite modeling package. I'm going to focus on Maya for the sake of simplicity, but the concept remains the same.
The point cloud is a group of vertices that you will most likely want translated into locators.
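If you ever want to do this step by hand, the concept can be sketched in a few lines of Python. An OBJ file stores each vertex as a `v x y z` line, so you can read the positions out and hand each one to Maya. This is my own sketch, not the plug-in's code; the `maya.cmds` part is left as comments so the parsing half runs anywhere.

```python
def read_obj_vertices(obj_text):
    """Pull the vertex positions out of OBJ-format text.

    OBJ stores each vertex as a line 'v x y z'; faces, normals, and
    UVs use other prefixes, so we only keep lines starting with 'v'.
    """
    points = []
    for line in obj_text.splitlines():
        parts = line.split()
        if len(parts) >= 4 and parts[0] == "v":
            points.append(tuple(float(p) for p in parts[1:4]))
    return points

# Two example vertices, as they would appear in the exported OBJ:
obj_text = """\
v 0.0 1.0 2.0
v -1.5 0.25 3.0
"""
points = read_obj_vertices(obj_text)

# Inside Maya you would then turn each point into a locator:
# import maya.cmds as cmds
# for i, (x, y, z) in enumerate(points):
#     loc = cmds.spaceLocator(name="cloudPoint_%d" % i)[0]
#     cmds.xform(loc, translation=(x, y, z), worldSpace=True)
```

For a dense cloud you would not want one locator per point (Maya bogs down fast); a single particle or mesh object holding all the vertices is the usual workaround, which is part of why a dedicated importer is so handy.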
You can export point clouds through .fbx, but in all my experience it freezes every time due to the amount of data that needs to be exported.
But never fear, because an MIT kid named Dominic Drane has written an importer for you.
Place the .py file in your Maya plug-ins folder.
This plug-in is a one-two knockout, importing both the camera and the point cloud data.
First export the .chan file from the Nuke camera node. Import it, and the script will ask if you want to import the OBJ you wrote the vertex data to.
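For reference, a .chan file is just whitespace-separated text with one line per frame: typically the frame number, translate x/y/z, rotate x/y/z, and (in Nuke's export) the vertical FOV. A minimal parser, assuming that common column layout, looks like this:

```python
def parse_chan(chan_text):
    """Parse Nuke-style .chan camera animation data.

    Each non-comment line is one frame, assumed to be laid out as:
        frame  tx  ty  tz  rx  ry  rz  [vfov]
    Columns after the frame number are read as floats.
    """
    frames = {}
    for line in chan_text.splitlines():
        parts = line.split()
        if not parts or parts[0].startswith("#"):
            continue  # skip blank lines and comments
        frame = int(parts[0])
        frames[frame] = [float(p) for p in parts[1:]]
    return frames

# Two hypothetical frames of camera animation:
sample = """\
1 0.0 1.2 5.0 0.0 45.0 0.0 30.0
2 0.1 1.2 4.9 0.0 44.5 0.0 30.0
"""
cam = parse_chan(sample)
# cam[1][:3] is the camera translation on frame 1: [0.0, 1.2, 5.0]
```

Because it is plain text, the same file moves between Nuke, Maya, and most other trackers with no binary-format headaches, which is why .chan has survived as the lingua franca of camera export.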
And presto, you're done.