Nuke was made for stereo. Even before its official support in version 5, Nuke's flexible node system seemed like the perfect way to cobble together a rig that could make anything go any which way.
But if you're in a pinch, or new to stereo, making your own custom rig can seem like a daunting task.
So here is mine. I hope you have fun with it.
Under the Rig Control node
This node controls the rig's movement.
Keyframe or parent a solved camera to the rig.
For cameras with variable focus, you can parent the rig's focus controls under the Solved Import tab.
Enjoy the ability to nondestructively adjust the camera on top of monoscopic solve data.
Toe slider – adjusts convergence. Leave at 0 if you want a parallel rig.
Interaxial – also known as interocular when referring to the human eyes; slides both cameras toward and away from each other horizontally.
Rig move – a quick peek at where the axis is sitting, and gives you the ability to parent the rig's position.
— Indie Tab
Left Indie – moves the left camera independently of the right, nondestructively to the interaxial.
Right Indie – moves the right camera independently of the left, nondestructively to the interaxial.
Left Rotation Indie – nondestructively rotates the left camera.
Right Rotation Indie – nondestructively rotates the right camera.
This is the same as what you would find in a camera's Projection tab; adjust the focal, aperture, etc.
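If you ever want to sanity-check the toe slider against real numbers, the convergence distance falls straight out of the interaxial and the toe-in angle. Here is a minimal sketch in plain Python (the function name and the sample figures are mine, not part of the rig):

```python
import math

def convergence_distance(interaxial, toe_degrees):
    """Distance at which the two toed-in cameras' axes cross.
    Each camera is rotated toe_degrees toward the centre line."""
    return (interaxial / 2.0) / math.tan(math.radians(toe_degrees))

# e.g. a 6.5 cm interaxial with half a degree of toe per camera
dist = convergence_distance(0.065, 0.5)
```

At a toe of 0 the expression blows up, which is just the math's way of saying a parallel rig converges at infinity.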
A quick tip useful for any heavy Nuke user: how to find a node you created in a complex node graph.
Hitting "/" while hovering over the Node Graph
brings up a search window where you can type and search to your heart's content.
Combined with the Tab menu, you'll be blazing through your comps like a pro.
Nathan Westveer – NathanFx.com
Toxik, or should I say Composite, saves a lot of the user experience out to its preference file. This is great if you like it a certain way. But with all the ways you can change something that you can't figure out how to put back, this can have you up a wall trying to work out what went wrong.
For those of you frustrated compositors, here is the quick fix you've been craving.
For example, for OS X Toxik 2011 users:
It should be noted, for you studio artists, that you can interchangeably save out your preferences and link to them within the project preferences.
Within the UI you can find this under
Edit > Project Preferences, under the first tab displayed, the Information tab.
Anyone who uses 3DEqualizer on OS X knows that it runs through X11, and to its credit it runs pretty smoothly.
But there are a few issues with X11 that break the software right out of the box for some OS X users.
One is the ALT key.
Support at www.sci-d-vis.com tells Mac users to download and install an open source version of X11, but is there a simpler fix to this keyboard calamity?
It turns out Francis North, an Inkscape user, noticed the same setbacks in Inkscape.
He writes in a forum post (http://www.inkscapeforum.com/viewtopic.php?t=800&f=5):
Launch X11 from Applications/Utilities
Open the preferences window and enable the “Use the system keyboard layout” preference. This will force X11 to use its system default keyboard layout.
Open an xterm window (Command-N) and perform the following:
$ xmodmap -pke > ~/.Xmodmap
This captures the current xmodmap settings to a hidden file located in your home directory. It’s this ~/.Xmodmap file that X11 will by default use to override any system mappings.
Edit your .Xmodmap file by typing the following in the same xterm window:
$ open -e ~/.Xmodmap
Change the line: “keycode 66 = Mode_switch” to “keycode 66 = Meta_L”
Change the line: “keycode 69 = Mode_switch” to “keycode 69 = Meta_R”
Save the file and exit TextEdit.
Disable the X11 preferences “Emulate three button mouse” and “Use the system keyboard layout”, then close the preferences window.
These settings basically say “don’t treat ALT as a special key, and don’t override my .Xmodmap file with system defaults”.
Close the X11 preferences window, any opened xterm windows, and then the X11 application.
That’s it. Now when you launch Inkscape, the ALT key should work as expected, and the status-bar will correctly display any ALT-key-specific help when that key is pressed (e.g. when using the Selection tool).
One final note: from an xterm window, you can debug your key mappings by typing “xev”. This program will open an interactive test window that will echo information about current key/mouse presses, and is a handy utility for determining keycodes and what they’re currently mapped to.
Feel free to provide any feedback and/or request for further clarification.
Thanks to Francis North for this Inkscape tip. 3DEqualizer Mac users will also be thrilled to find out this fixes their ALT key woes as well.
Nathaniel Westveer – www.nathanfx.com
In an earlier entry, I gave away how to bake a projection into geometry/cards to do paint fixes on them. Nuke, as powerful as it is, has quite a few bugs, or more likely it just wasn't designed around painting in a projection shader. You can do a blend matte to combine projections, but add a RotoPaint and suddenly the scale blows way off, even if you reformat the input. It's kind of pesky if you want to use a camera solve to make paint fixes a bit easier.
There is a way I found; not the sleekest, but effective. For the sake of example, let's say you have a wall from the shoot covered in cracks that have to be animated forming, and the plate also has a shifting gradient from a light source in the shot. Using a CameraTracker you've solved a 3D camera, and it's at your disposal. Place a Card where the wall sits. For the texture, use a RotoPaint with an empty Constant in the format of your comp. Then attach Bg1 to your plate and use it as the clone source for your clone tool, with a lifetime of your choosing. Then merge the ScanlineRender from the camera on top of the original plate again, and voilà: your paint fix has all the translations, rotations and perspective shifts that you need. Create that clean plate.
Nathaniel Westveer – www.nathanfx.com
I'm writing this post today to tell everyone about a great MEL script that I'm sure you trackers out there will enjoy: the Survey Solver, as it's called (not to be confused with PixelFarm's new node in PFMatchit and PFTrack). This MEL script will take a track from various other trackers and configure it to work in another program, such as PFTrack to SynthEyes, or PFTrack to Maya Live, if you're into that kind of thing.
The script comes from none other than a fellow tracker known as Minco Marinov (check out his reel while you're on his site, too). Survey Solver is easy enough to install by dragging it into your Script Editor and saving it out to a designated spot on your shelf. Or, if you're feeling technical, there are great step-by-step instructions on where to save the files on your system (though it looks like the paths were written for Maya 7). The script runs great in Maya 2011.
Give it a shot,
Happy Tracking – Nathaniel Westveer – www.nathanfx.com
Well, blogging tidbits about Nuke has been quite the theme for me these last couple of months. I recently ran into a job where the only tracker I had at hand was inside Nuke. No problem, I thought, I've exported cameras before. But, to my horror, it can be very difficult sometimes. Sure, there are a lot of MEL and Python scripts for Maya, but none of them were playing nice with the version I was using. And most of what you get when you look for alternative ways is people telling you to upgrade to the newest Nuke, which exports .FBX. That was kind of an issue for me: one, I didn't have that version, and two, FBX files are very difficult to get playing nice most of the time.
So is there a way to get Nuke to write out a .MA file natively? With a slightly crude yet very effective plugin, absolutely (cue up the following link in a new window or tab).
After installing this into your plugin folder (make sure you put the base.ma into the icons folder), the createMayaScene gizmo takes a bit to get used to. After attaching all of the handles you still have to type the names of the nodes into the three fields with the correct syntax (for example Read1, CameraTracker3, etc.).
Once you fill in the fields you are ready to export that camera to .ma. Pick the location you want to save to, remember to put .ma after the file name, hit Export, and kick back for the brief 0.2–0.3 seconds as it writes out the file.
Open it in Maya and there you go, you made it. Your co-workers will be so impressed.
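To demystify what a plugin like this is writing, here is a rough sketch of a hand-rolled Maya ASCII file containing a single static camera. This is an assumption-heavy toy (real exporters, createMayaScene included, write much more per node and bake every frame), but it shows why a text-based .ma is so much friendlier to generate than FBX:

```python
def write_minimal_ma(path, cam_name, translate, focal_mm):
    """Write a bare-bones Maya ASCII scene containing one camera.
    Sketch only; a real exporter writes far more attributes."""
    tx, ty, tz = translate
    lines = [
        '//Maya ASCII scene',
        'requires maya "2011";',
        'createNode transform -n "%s";' % cam_name,
        '\tsetAttr ".t" -type "double3" %g %g %g ;' % (tx, ty, tz),
        'createNode camera -n "%sShape" -p "%s";' % (cam_name, cam_name),
        '\tsetAttr ".fl" %g;' % focal_mm,
    ]
    with open(path, 'w') as f:
        f.write('\n'.join(lines) + '\n')
```
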
Nathan – www.nathanfx.com
Nuke has a great 3D environment alongside solid 2D node-based compositing, but these tools can seem to live in different worlds: on one side 3D, on the other 2D. Getting things from 3D out into 2D can feel like a daunting task if you're not familiar with the Reconcile3D node. Let's take the GodRays node as an example. A lot of the time, the point you want to project from is not on a plane of parallax visible in the plate, but a point in 3D space off in the distance. You can animate it by hand, but if you can solve a camera, it's easier than you think.
So fire up Nuke and hit that Tab key as you type R-e-c-o to pull up the node like a pro.
You'll notice there are a few handles attached to it: Img, Camera, Axis. The most important handle to focus on right now is the Axis; where it is observed by the solved (or animated) camera is where the node will back out the X,Y coordinates. Click and drag the axis and orient it in space where you want (for this example) the god rays to project from, then attach the other handles to their assigned names. From here you'll have a few choices.
You can hit Calculate Keyframes to have the computer look through the camera, give you the X,Y values, and bake them out. Or you can have it update in real time and link an expression (Apple-drag; sorry Windows folks, I can only tell you from a Mac at this moment) from the X,Y transforms to the parenting node.
Either way this node is amazingly simple to use for all you xy converts.
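Under the hood, all Reconcile3D really does is project a world-space point through the camera onto the image plane. Here is a stripped-down sketch of that projection in plain Python; it assumes an unrotated camera looking down -Z and ignores distortion and window scaling, so treat it as an illustration of the idea rather than Nuke's exact math:

```python
def project_point(point, cam_pos, focal, haperture, img_w, img_h):
    """Pinhole projection of a world-space point into pixel coords,
    for a camera at cam_pos looking down -Z with no rotation."""
    px, py, pz = (point[i] - cam_pos[i] for i in range(3))
    if pz >= 0:
        raise ValueError("point is behind the camera")
    # perspective divide onto the film back, then film back -> pixels
    u = focal * (px / -pz) / haperture   # -0.5..0.5 across the frame
    v = focal * (py / -pz) / haperture
    return (img_w * (0.5 + u), img_h * 0.5 + img_w * v)
```

A point dead ahead of the camera lands at the centre of the frame, exactly as you'd see the axis behave in the viewer.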
The Reverse Process – Points to 3D (from xy to xyz)
Now you say, "Hey Nate, that's great, but I need to translate a point from 2D to 3D." This can be done a few ways; what comes to mind is placing things by hand, not a difficult process in most cases, but The Foundry has your back with a node specialized for this: PointsTo3D. The minute you get this node on screen you'll quickly observe that it has a camera handle like Reconcile3D, and it's essentially the same process: take your solved or animated camera and attach it. From there you'll notice in the node options it has Point A, Point B, Point C. These are the three points you need to feed into the node to find the point in 3D space. So scrub through your timeline to where you see your object, space, or thingamabob, set the three points on three different frames, and voilà, there you have it: your 2D values mapped out in 3D space.
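Conceptually, what PointsTo3D recovers from those samples is where the viewing rays from the camera on different frames cross. A toy two-ray version of that triangulation in plain Python (my own simplification; the node's actual solve handles three points and noisy tracks far more gracefully):

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def triangulate(o1, d1, o2, d2):
    """Closest point between two viewing rays (origin o, direction d).
    Toy two-frame version of what a 2D-to-3D solve recovers."""
    w0 = [o1[i] - o2[i] for i in range(3)]
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    denom = a * c - b * b            # zero when the rays are parallel
    t1 = (b * e - c * d) / denom
    t2 = (a * e - b * d) / denom
    p1 = [o1[i] + t1 * d1[i] for i in range(3)]
    p2 = [o2[i] + t2 * d2[i] for i in range(3)]
    return [(p1[i] + p2[i]) / 2 for i in range(3)]
```

With perfectly crossing rays the midpoint is the intersection itself; with real, noisy tracks the two closest points straddle the answer.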
Nathaniel Westveer – Nathanfx.com
Nathan here. Sometimes it's the small tricks of the trade that make all the difference in how a project gets done. In the world of 3D, keeping image planes at the resolution they were shot in is often unnecessary and adds pressure on your system while you work. This often leads to you resizing them one by one.
You could be into Automator scripts, but if you're like me you want a solid UI that you don't have to figure out how to use. A small program called Resize Me can work wonders.
Tracking for Modelers: A Look at How Nuke's New PointCloudGenerator Node Can Get You Pixel-Accurate Layouts
As someone who enjoys the long, tedious process of tracking, I've been dying for a program to take the data of a solved virtual camera and create high-density point clouds. As I'm sure our friends at The Foundry have been, since they wrote the new PointCloudGenerator node in Nuke. And the best part is that you don't need to rely on The Foundry's camera solver for this. For those of you who want an accurate representation of a scene without busting out LIDAR, I recommend a few simple rules for capturing the scene you want.
Keep camera movements simple, but move around. I know you want to try a crazy 360° turn to prove how "badass" you are, but the solver will recognize and reinforce spatial location much better if you keep it within 45°–60° of rotation.
Moving around is essential.
The camera solver's algorithm relies on what is called parallax.
What the computer looks at is a flat, two-dimensional picture; if you're not shooting in stereo, there is very little to clue the computer in on what's going on. How points of contrast move throughout the plate makes all the difference between the computer mapping out depth and the shot being unsolvable.
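To put a rough number on it: the amount a feature shifts between two camera positions is inversely proportional to its depth, and that shift is exactly the signal the solver feeds on. A back-of-the-envelope sketch in plain Python (an idealized side-step of the camera, with figures I've made up for illustration):

```python
def disparity_px(depth, baseline, focal_mm, haperture_mm, img_w):
    """Pixels a feature shifts when the camera side-steps `baseline`
    (same units as depth). Idealized parallel-camera model."""
    focal_px = focal_mm / haperture_mm * img_w   # focal length in pixels
    return focal_px * baseline / depth

# A feature 2 m away shifts far more than one 50 m away for the same move:
near = disparity_px(2.0, 0.5, 35, 24.576, 1920)
far = disparity_px(50.0, 0.5, 35, 24.576, 1920)
```

The near feature moves 25 times as far across the frame as the far one for the same camera move, which is why a shot with nearby detail solves so much better.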
Do not move the camera too fast.
Especially with CMOS-sensor-based cameras.
Heavy movement, or really any movement, can invoke an evil called rolling shutter, where the bottom of the frame isn't updated as fast as the top. The Foundry makes a great plug-in to deal with this; I recommend you check it out. Other trackers also give you some tools to deal with the issue, but the best policy, if you're filming simply for the point cloud, is to keep it steady and at a reasonable speed.
Motion blur will also definitely work aggressively to degrade the quality of the track, if you get a usable solve out of it at all.
Nodal pans (a pan attached to one singular node/point in space, such as a pan from a tripod) will, unless given more data, return a spherical cluster of point cloud data from a parallax-based solve. Even though the camera solves correctly, the point cloud will not. For documenting a scene, it is recommended you use free-hand motion.
Getting The Point Cloud out of Nuke.
Now you've got your data out of Nuke. As the modeler you are, it's not going to do you much good if you can't use this data in your favorite modeler. I'm going to focus on Maya for the sake of simplicity, but the concept remains the same.
The concept is that the point cloud is a group of vertices that you most likely want translated into locators.
You can export point clouds through .FBX, but in all my experience it freezes every time due to the amount of data being exported.
But never fear, because an MIT kid named Dominic Drane has written an importer for you.
Place the .py file in your Maya plug-in folder.
This plug-in is a one-two knockout, importing both the camera and the point cloud data.
First export the .chan file from the Nuke camera node. Import it, and it will ask if you want to import the OBJ you wrote the vertex data to.
And presto, you're done.
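If you ever need to roll your own version of that last step, the vertex data inside an OBJ is trivial to pull out. A minimal sketch in plain Python (the Maya half, turning each vertex into a locator, is only hinted at in the comment since it needs maya.cmds):

```python
def read_obj_vertices(path):
    """Collect every 'v x y z' line from an OBJ file as (x, y, z) tuples.
    Inside Maya you would then loop over these and call
    cmds.spaceLocator(p=vert) for each one (not shown here)."""
    verts = []
    with open(path) as f:
        for line in f:
            parts = line.split()
            if len(parts) >= 4 and parts[0] == 'v':
                verts.append(tuple(float(c) for c in parts[1:4]))
    return verts
```
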
We've all had it happen: the tracker we liked and thought we could live with. But nothing brings out those annoying fluctuations in a tracked camera's translation like trying to composite with it. Well, fear not, smart, handsome, intelligent reader of my blog, because it's easier than you think.
Simply select the frames in the keyframe editor and right-click, then go to Edit > Filter.
Choose a value of intensity and there you go! You are on your way to smoothing your camera, and to sanity and peace of mind.
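For the curious, the filter is doing something very close to a moving average over your keyframe values. A rough illustration in plain Python (a plain box filter of my own; Nuke's actual filter and its parameter names may differ):

```python
def smooth_curve(values, radius=1):
    """Box-filter a list of per-frame values; a larger radius means
    stronger smoothing, like a higher intensity in the dialog."""
    out = []
    for i in range(len(values)):
        lo, hi = max(0, i - radius), min(len(values), i + radius + 1)
        out.append(sum(values[lo:hi]) / (hi - lo))
    return out
```

One jittery frame gets pulled toward its neighbours, while long, deliberate moves survive almost untouched.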
Now, there comes a time and a place where we all shout obscenities at our computer screens: when we've rendered out 100+ frames and they are named wrong. I've definitely been there and back. At first blush on Google you might find a few solutions that are AppleScripts, but I don't know about you, I find their functionality lacking. Enter the SEO-challenged Mac OS X program Name Changer.
Name Changer has many great options in a simple design for batch renaming the most nightmare file sequences. (Tip: if you just want to delete parts of a sequence name, replace them with nothing in the fields in Name Changer.) Name Changer will preview the changes before you make them and swiftly carry out the new names in a snap.
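If you find yourself on a machine without Name Changer, the same find-and-replace rename is only a few lines of Python. A minimal sketch (no preview and no undo, so test it on a copy of your frames first):

```python
import os

def batch_rename(folder, find, replace):
    """Rename every file in `folder`, substituting `find` with `replace`.
    Passing replace='' deletes that part of the name, like the tip above."""
    for name in sorted(os.listdir(folder)):
        if find in name:
            os.rename(os.path.join(folder, name),
                      os.path.join(folder, name.replace(find, replace)))
```
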
We all love Nuke, and yes, with Modo's shiny new 501 update we are buzzing more than ever about how this Sub-D modeler is breaking every paradigm left and right. We can get geometry out of Nuke just fine, but what about that camera solve you did in Nuke? How are you going to get it out to a program like Modo? Fear not, because exporting cameras from Nuke to Modo is easier than you think! (See link.)
Thanks to Modo.Stenson, you can now export a .chan file from the camera node in Nuke and feed it right into Modo.
The workflow threw me for a loop at first, but once you install the script it goes like this. In the Animation tab you'll find the interface for fs_NukeChanIO.py as Nuke Channel IO. Select a camera or 3D geometry and click Import (the values are applicable to any keyframed object). Select, press Enter, and bingo, you have your Nuke camera imported right into the scene.
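If you're curious what's actually inside a .chan file, it's just whitespace-delimited columns per frame, which is why it travels between packages so easily. A small reader sketch in plain Python (the column meanings given in the comment are the common convention, but the format itself doesn't enforce them):

```python
def read_chan(path):
    """Parse a whitespace-delimited .chan file into {frame: values}.
    Columns are typically frame, tx, ty, tz, rx, ry, rz and often one
    lens value, but the format is loose, so we just keep all columns."""
    frames = {}
    with open(path) as f:
        for line in f:
            cols = line.split()
            if not cols or cols[0].startswith('#'):
                continue
            frames[int(float(cols[0]))] = [float(c) for c in cols[1:]]
    return frames
```
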
Have fun Importing,
Nathan Westveer – Nathanfx.com
The first post. Search, read more, and learn in this journey of discovery. My blog – enjoy 🙂