Nuke was made for stereo. Even before official support arrived in version 5, Nuke's flexible node system seemed like the perfect way to cobble together a rig that could make anything go whichever way you needed. But if you're in a pinch, or new to stereo, building your own custom rig can seem like a daunting task. So here is mine; I hope you have fun with it.
Under the Rig Control node
This node controls the rig's movement. Keyframe or parent a solved camera to the rig. For cameras with variable focus, you can parent the rig's focus controls under the Solved Import tab. Enjoy the ability to non-destructively adjust the camera on top of monoscopic solve data.
Toe slider – adjusts convergence. Leave it at 0 if you want a parallel rig.
Interaxial – also known as interocular when referring to the human eye; slides both cameras toward and away from each other horizontally.
Rig Move – a quick peek at where the axis is sitting; also gives you the ability to parent the rig's position.
— Indie Tab
Left Indie – moves the left camera independently of the right, non-destructively on top of the interaxial.
Right Indie – moves the right camera independently of the left, non-destructively on top of the interaxial.
Left Rotation Indie – non-destructively rotates the left camera.
Right Rotation Indie – non-destructively rotates the right camera.
This is the same as what you would find in the camera's Projection tab: adjust the focal length, aperture, etc.
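If you are curious about the math behind the Toe and Interaxial sliders, here is a minimal sketch in plain Python of how an interaxial distance and a convergence distance translate into a per-camera horizontal offset and toe-in angle. The function name and defaults are my own illustration, not part of the rig:

```python
import math

def stereo_offsets(interaxial, convergence_distance=None):
    """Return (x_offset, toe_angle_deg) for the LEFT camera.

    Negate both values for the right camera. A convergence_distance
    of None means a parallel rig (the toe angle stays at 0).
    """
    half = interaxial / 2.0
    if convergence_distance is None:
        return -half, 0.0
    # Toe each camera in so its axis crosses the rig's centre line
    # at the convergence distance.
    toe = math.degrees(math.atan2(half, convergence_distance))
    return -half, toe
```

With a 6.5 cm interaxial and no convergence distance the cameras stay parallel; add a convergence distance of 100 cm and each camera toes in by roughly 1.86 degrees.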
A quick tip useful for any heavy Nuke user: how to find a node you created in a complex node graph. Hitting "/" while hovering over the Node Graph brings up a search window where you can type and search to your heart's content. Combined with the Tab menu, you'll be blazing through your comps like a pro.
Nathan Westveer – NathanFx.com
In an earlier entry, I showed how to bake projections into geometry/cards to do paint fixes on them. Nuke, as powerful as it is, has quite a few bugs, or more likely was never designed around painting in the projection shader. You can do a blend matte to combine projections, but add a RotoPaint and suddenly the scale blows way off, even if you reformat the input. It's kind of pesky if you want to use a camera solve to make paint fixes a bit easier.
There is a way I found; it's not the sleekest, but it's effective. For the sake of example, let's say you have a wall from the shoot covered in cracks that have to be animated forming, and the plate also has a shifting gradient from a light source in the shot. Using a CameraTracker you've solved a 3D camera, and it's at your disposal. Place a Card where the wall is. For the texture, use a RotoPaint with an empty Constant in the format of your comp. Then attach Bg1 to your plate and use it as the clone source in your clone tool, with a lifetime of your choosing. Finally, take the ScanlineRender output (rendered through the solved camera) and merge it on top of the original plate, and voilà: your paint fixes have all the translations, rotations and perspective shifts that you need. Create that clean plate.
Nathaniel Westveer – www.nathanfx.com
Blogging tidbits about Nuke has been quite the theme for me these last couple of months. I recently ran into a job where the only tracker at hand was inside Nuke. No problem, I thought, I've exported cameras before. But, to my horror, it can be very difficult at times. Sure, there are a lot of MEL and Python scripts for Maya, but none of them were playing nice with the version I was using. And most of what you find when you look for alternatives is people telling you to upgrade to the newest Nuke, which exports .fbx. That was an issue for me: one, I didn't have that version, and two, FBX files can be very difficult to get playing nicely most of the time.
So, is there a way to get Nuke to write out a .ma file natively? With a slightly crude yet very effective plugin, absolutely (cue up the following link in a new window or tab).
After installing it into your plugin folder (make sure you put base.ma into the icons folder), the createMayaScene gizmo takes a bit to get used to. After attaching all of the handles, you still have to type the names of the nodes into the three fields with correct syntax (for example, Read1, CameraTracker3, etc.). Once you fill in the fields, you are ready to export that camera to .ma: pick the location you want to save to, remember to put .ma after the file name, hit Export, and kick back for the brief 0.2–0.3 seconds it takes to write out the file.
Open it in Maya and there you go, you made it. Your coworkers will be so impressed.
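If you are curious what a Maya ASCII camera actually looks like on disk, here is a rough, hand-rolled sketch of writing one from baked per-frame translate values. This is my own illustration of the file format, not what the plugin does internally; the node names and the structure of the header are placeholders:

```python
def write_camera_ma(path, tx_keys, ty_keys, tz_keys):
    """Write a minimal Maya ASCII scene with one animated camera.

    Each *_keys argument is a list of (frame, value) pairs.
    """
    def curve(name, keys):
        # animCurveTL = a translate (linear-unit) animation curve;
        # ktv holds alternating keyTime/keyValue pairs.
        ktv = " ".join("%d %.6f" % (f, v) for f, v in keys)
        return ('createNode animCurveTL -n "%s";\n'
                '\tsetAttr -s %d ".ktv[0:%d]" %s;\n'
                % (name, len(keys), len(keys) - 1, ktv))

    with open(path, "w") as f:
        f.write("//Maya ASCII scene\n")
        f.write('requires maya "2012";\n')
        f.write('createNode transform -n "trackedCamera";\n')
        f.write('createNode camera -n "trackedCameraShape" '
                '-p "trackedCamera";\n')
        for axis, keys in (("X", tx_keys), ("Y", ty_keys), ("Z", tz_keys)):
            f.write(curve("trackedCamera_translate" + axis, keys))
            f.write('connectAttr "trackedCamera_translate%s.o" '
                    '"trackedCamera.t%s";\n' % (axis, axis.lower()))
```

Rotation curves work the same way with angular curve nodes; the point is just that a .ma file is plain text you can inspect when an export misbehaves.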
Nathan – www.nathanfx.com
Nuke has a great 3D environment alongside solid 2D node-based compositing, but these tools can seem to live in different worlds: on one side 3D, on the other 2D. Getting things out of 3D into 2D can seem like a daunting task if you're not familiar with the Reconcile3D node. Let's take the GodRays node as an example. A lot of the time, the point you want to project from is not on a plane of parallax visible in the plate, but a point in 3D space off in the distance. You could animate it by hand, but if you can solve a camera it's easier than you think.
So fire up Nuke and hit that Tab key as you type R-e-c-o to pull up the node like a pro.
You'll notice there are a few handles attached to it: img, camera and axis. The most important handle to focus on right now is the axis: where it is observed by the solved (or animated) camera is where the node is going to back out the x,y coordinates. Click and drag the axis and orient it in space where you want the god rays (for this example) to project from, and attach the other handles to their assigned inputs. From there you'll have a few choices.
You can hit Calculate Keyframes to have the computer look through the camera, give you the x,y values and bake them out. Or you can have it update in real time and link an expression (Apple-drag; sorry, Windows folks, I can only tell you from a Mac at the moment) from the x,y transforms to the parented node.
Either way, this node is amazingly simple to use for all you x,y converts.
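If you want an intuition for what the node is doing under the hood, here is a toy version of that 3D-to-2D step in plain Python. This is a simplified pinhole model with an unrotated camera looking down -Z (a real solve obviously includes rotation); the function name and parameters are my own, not Nuke's API:

```python
def project_point(point, cam_pos, focal, haperture, width, height):
    """Project a 3D point to pixel coordinates through a simple
    pinhole camera at cam_pos, looking down -Z with no rotation."""
    x = point[0] - cam_pos[0]
    y = point[1] - cam_pos[1]
    z = point[2] - cam_pos[2]
    if z >= 0:
        raise ValueError("point is behind the camera")
    # Derive the vertical aperture from the image aspect ratio.
    vaperture = haperture * height / float(width)
    # Normalised image coordinates, 0.5 = centre of frame.
    u = 0.5 + (focal / haperture) * (x / -z)
    v = 0.5 + (focal / vaperture) * (y / -z)
    return u * width, v * height
```

A point straight down the lens axis lands dead centre of the frame, and off-axis points slide across the image as the camera moves, which is exactly the x,y animation Reconcile3D bakes out for you.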
The Reverse Process – Points to 3D (from xy to xyz)
Now you say, "Hey Nate, that's great, but I need to translate a point in 2D to 3D." This can be done a few ways; what comes to mind first is placing things by hand, which is not difficult in most cases, but The Foundry has your back with a node specialized for this: PointsTo3D. The minute you get this node on screen, you'll notice it has a camera handle like Reconcile3D, and the process is much the same: take your solved or animated camera and attach it. From there, you'll notice in the node options that it has Point A, Point B and Point C. These are the three points you need to feed into the node to find the point in 3D space. So scrub through your timeline to where you see your object, space or thingamabob, set the three points on three different frames, and voilà, there you have it: your 2D values mapped out in 3D space.
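Under the hood, this kind of 2D-to-3D step boils down to triangulation: each 2D point observed through the camera on a given frame defines a ray into the scene, and the 3D point sits where those rays (nearly) cross. Here is a small, self-contained sketch of that idea; the function is my own illustration, not PointsTo3D's actual code:

```python
def triangulate(p1, d1, p2, d2):
    """Midpoint of the shortest segment between two rays
    p1 + t*d1 and p2 + s*d2 (rays from two camera positions)."""
    def dot(a, b):
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]
    def sub(a, b):
        return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

    w0 = sub(p1, p2)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    denom = a * c - b * b
    if abs(denom) < 1e-12:
        raise ValueError("rays are parallel")
    # Closest-approach parameters on each ray (least-squares).
    t = (b * e - c * d) / denom
    s = (a * e - b * d) / denom
    q1 = [p1[i] + t * d1[i] for i in range(3)]
    q2 = [p2[i] + s * d2[i] for i in range(3)]
    return [(q1[i] + q2[i]) / 2.0 for i in range(3)]
```

With noisy tracks the rays never intersect exactly, which is why the node wants three observations on three different frames rather than two.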
Nathaniel Westveer – Nathanfx.com
Tracking for Modelers: A look at how Nuke's new PointCloudGenerator node can get you pixel-accurate layouts
As someone who enjoys the long, tedious process of tracking, I've been dying for a program to take the data of a solved virtual camera and create high-density point clouds, as I'm sure our friends at The Foundry had been when they wrote the new PointCloudGenerator node in Nuke. The best part is that you don't need to rely on The Foundry's camera solver for this. For those of you who want an accurate representation of a scene, inside and out, without busting out LIDAR, I recommend a few simple rules for capturing the scene you want.
Keep camera movements simple, but move around. I know you want to try a crazy 360° turn to prove how "badass" you are, but really, the solver will recognize and reinforce spatial location much better if you keep it within 45–60 degrees of rotation.
Moving around is essential.
The camera solver's algorithms rely on what is called parallax. What the computer looks at is a flat, two-dimensional picture; if you're not shooting in stereo, there is very little else to clue the computer in on what's going on. How points of contrast move throughout the plate makes all the difference between the computer mapping out the depth and the shot being unsolvable.
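To put a rough number on that, here is a back-of-the-envelope sketch of how far a point slides across the frame when the camera translates sideways. This is my own simplification (pinhole model, small angles), not the solver's actual math; notice that near points move much more than far ones, which is exactly the depth cue the solver feeds on:

```python
def parallax_shift(focal, haperture, width_px, baseline, depth):
    """Approximate horizontal image shift, in pixels, of a point at
    `depth` when the camera translates sideways by `baseline`
    (focal/haperture in the same units, e.g. millimetres; baseline
    and depth in the same scene units)."""
    return (focal / haperture) * width_px * (baseline / depth)
```

With a 35 mm lens on a 24.576 mm-wide back and a 1920-pixel plate, a half-metre sideways move shifts a point 2 m away ten times as far across the frame as a point 20 m away; that ratio of shifts is what encodes relative depth.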
Do not move the camera too fast, especially with CMOS-sensor-based cameras.
Heavy movement, or really any movement, can invoke an "evil" called rolling shutter, where the bottom of the frame isn't updated as fast as the top. The Foundry makes a great plug-in to deal with this; I recommend you check it out. Other trackers also give you some tools to deal with the issue, but the best policy, if you're filming simply for the point cloud, is to keep it steady and at a reasonable speed. Motion blur will also work aggressively to degrade the quality of the track, if you get a usable solve out of it at all.
Avoid nodal pans (a pan that pivots around one singular point in space, such as a pan from a tripod): unless given more data, a parallax-based solve will return a spherical cluster of point cloud data. Even though the camera solves correctly, the point cloud will not. For documenting a scene, it is recommended you use freehand motion.
Getting the Point Cloud out of Nuke
Now you've got your data out of Nuke. As the modeler you are, it's not going to do you much good if you can't use this data in your favorite modeling package. I'm going to focus on Maya for the sake of simplicity, but the concept remains the same.
The concept is that the point cloud is a group of vertices that you most likely want translated into locators.
You can export point clouds through .fbx, but in all my experience it freezes every time due to the amount of data that needs to be exported.
But never fear, because an MIT student named Dominic Drane has written an importer for you.
Place the .py file in your Maya plug-in folder.
This plug-in is a one-two knockout, importing both the camera and the point cloud data.
First, export the .chan file from the Nuke camera node. Import it, and it will then ask if you want to import the OBJ you wrote the vertex data to.
And presto, you're done.
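If you ever need to hand-roll the vertex half of that exchange yourself, the OBJ side is almost trivially simple: a point cloud is just a list of `v x y z` lines. Here is a tiny sketch with my own helper names:

```python
def write_point_cloud_obj(points, path):
    """Write a point cloud as bare OBJ vertices (one 'v x y z' line
    per point); most packages read this as vertex-only geometry."""
    with open(path, "w") as f:
        for x, y, z in points:
            f.write("v %.6f %.6f %.6f\n" % (x, y, z))

def read_point_cloud_obj(path):
    """Read the vertex lines back into a list of (x, y, z) tuples,
    ignoring faces, normals and anything else in the file."""
    points = []
    with open(path) as f:
        for line in f:
            parts = line.split()
            if parts and parts[0] == "v":
                points.append(tuple(float(v) for v in parts[1:4]))
    return points
```

Because the format is plain text, it is also an easy place to inspect or decimate a huge cloud before handing it to an importer.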
We've all had it happen: the track wasn't quite right, but we thought we could live with it. Nothing brings out those annoying fluctuations in a tracked camera's translation like trying to composite with it. Well, fear not, smart, handsome, intelligent reader of my blog, because fixing it is easier than you think.
Simply select the frames in the keyframe editor, right-click and go to Edit > Filter.
Choose an intensity value and there you go! You are on your way to smoothing your camera, and to sanity and peace of mind.
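That filter is essentially a small convolution over the keys. If you ever want to do the same thing outside the curve editor, here is a sketch of the idea as a simple box filter. This is my own function, not Nuke's implementation, and Nuke's actual filter kernel may differ:

```python
def smooth_keys(values, passes=1):
    """Smooth a list of keyframe values with a 3-tap box filter,
    applied `passes` times; the first and last keys are pinned."""
    out = list(values)
    for _ in range(passes):
        prev = list(out)
        for i in range(1, len(prev) - 1):
            out[i] = (prev[i - 1] + prev[i] + prev[i + 1]) / 3.0
    return out
```

Each extra pass behaves like a higher intensity in the filter dialog: more passes, smoother (and flatter) curve.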
We all love Nuke, and yes, with modo's shiny new 501 update we are buzzing more than ever about how this sub-D modeler is breaking paradigms left and right. We can get geometry out of Nuke just fine, but what about that camera solve you did in Nuke? How are you going to get it into a program like modo? Fear not, because exporting cameras from Nuke to modo is easier than you think! (see link)
Thanks to Modo.Stenson, you can now export a .chan file from the camera node in Nuke and feed it right into modo.
The workflow threw me for a loop at first, but it goes as follows. Once you install the script: in the Animation tab you'll find the interface for fs_NukeChanIO.py as "Nuke Channel IO". Select a camera or a piece of 3D geometry and click Import (the values are applicable to any keyframeable object), select the file, press Enter, and bingo, you have your Nuke camera imported right into the scene.
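The .chan format itself is refreshingly plain: whitespace-separated ASCII, one line per frame, typically frame number, translate x/y/z, rotate x/y/z, and often a focal or FOV value at the end. A minimal reader looks something like this (my own sketch; check your exported file's column order before trusting it):

```python
def read_chan(path):
    """Parse a .chan file into {frame: [values...]}.

    Assumes the common layout: frame tx ty tz rx ry rz [focal/fov].
    Lines starting with '#' are skipped as comments.
    """
    frames = {}
    with open(path) as f:
        for line in f:
            parts = line.split()
            if not parts or parts[0].startswith("#"):
                continue
            frames[int(float(parts[0]))] = [float(v) for v in parts[1:]]
    return frames
```

Because it is just text, .chan survives the trip between packages far more reliably than FBX ever did for me.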
Have fun Importing,
Nathan Westveer – Nathanfx.com