Smooth surfaces: to those of us who religiously do feature-tracked 3D camera solves, they are the Devil. Unless a smooth surface has pores we can boost the contrast on, it's pretty hard to track those suckers. Planar tracking, on the other hand, excels at exactly this. Planar tracking a smooth surface lets you recreate the shot with a texture mapped on top of that surface. But what should you put on it? A grid would seem to work out great, since it would give you a high density of trackers evenly distributed over the surface. In theory it sounds good. Yet grids are too self-similar for the tracking algorithm: it drives up the computational time needed to figure out which grid square a track belongs to, and features can jump from square to square within the search region, a setback that often can't be avoided if the track moves a significant distance. So what to do? An irregular texture turns out to be just what the doctor ordered. Perlin fractal noise is a fast way to generate a texture on the fly, and it complements most trackers' natural ability to look for high-contrast patterns throughout the enhanced plate.
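To make the idea concrete, here's a quick sketch of that kind of texture generator. It's plain-Python value noise rather than true gradient Perlin noise, and the function name and parameters are my own, but it produces the same flavor of irregular, non-repeating pattern the tracker loves:

```python
import random

def value_noise(size, octaves=4, seed=0):
    """Perlin-style fractal noise: sum of bilinearly interpolated
    random lattices, each octave at half the cell size and half
    the amplitude of the one before it."""
    rng = random.Random(seed)
    img = [[0.0] * size for _ in range(size)]
    amp, total = 1.0, 0.0
    for o in range(octaves):
        cell = max(1, size >> (o + 1))      # lattice spacing for this octave
        gs = size // cell + 2               # lattice points needed (+ margin)
        grid = [[rng.random() for _ in range(gs)] for _ in range(gs)]
        for y in range(size):
            gy, fy = divmod(y, cell)
            ty = fy / cell
            for x in range(size):
                gx, fx = divmod(x, cell)
                tx = fx / cell
                # bilinear blend of the four surrounding lattice values
                a, b = grid[gy][gx], grid[gy][gx + 1]
                c, d = grid[gy + 1][gx], grid[gy + 1][gx + 1]
                top = a * (1 - tx) + b * tx
                bot = c * (1 - tx) + d * tx
                img[y][x] += amp * (top * (1 - ty) + bot * ty)
        total += amp
        amp *= 0.5
    # normalize back into the 0..1 range
    return [[v / total for v in row] for row in img]
```

Map the result onto the planar-tracked surface (or render it out and corner-pin it) and you've got dense, irregular features to solve from.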
The Exception: Optical Flow Tracking
Where contrast-based feature tracking gets confused and dazed by a black-and-white checkerboard, optical flow behaves differently and tracks through with less of an issue, though it will still get tripped up a bit.
Nuke was made for stereo. Even before official support arrived in version 5, Nuke's flexible node system seemed like the perfect way to cobble together a rig that could make anything go whichever way you need. But if you're in a pinch, or are new to stereo, building your own custom rig may seem like a daunting task. So here is mine; I hope you have fun with it.
Under the Rig Control Node
This node controls the rig's movement.
Keyframe or parent a solved camera to the rig.
For cameras with variable focus, you can parent the rig's focus controls under the Solved Import tab.
Enjoy the ability to nondestructively adjust the camera on top of monoscopic solve data.
Toe slider – adjusts convergence. Leave at 0 if you want a parallel rig.
Interaxial – also known as interocular when referring to the human eye; slides both cameras toward and away from each other horizontally.
Rig Move – a quick peek at where the axis is sitting, and gives you the ability to parent the rig's position.
— Indie Tab
Left Indie – moves the left camera independently of the right, nondestructively to the interaxial.
Right Indie – moves the right camera independently of the left, nondestructively to the interaxial.
Left Rotation Indie – nondestructively rotates the left camera.
Right Rotation Indie – nondestructively rotates the right camera.
This is the same as what you would find in the Camera Projection tab; adjust the focal, aperture, etc.
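For the curious, the math behind the Toe and Interaxial sliders is simple trigonometry: each camera slides out by half the interaxial, and toes in until its axis crosses the convergence point. This is a hypothetical helper to show the idea, not code from the gizmo, and the toe sign convention is my own assumption:

```python
import math

def stereo_offsets(interaxial, convergence_dist=None):
    """Per-eye x-offset and toe-in angle (degrees) for a simple
    stereo rig. convergence_dist=None means a parallel rig (toe = 0).
    Sign convention (left eye rotates positive) is an assumption."""
    half = interaxial / 2.0
    if convergence_dist is None:
        toe = 0.0
    else:
        # angle needed for each camera axis to hit the convergence point
        toe = math.degrees(math.atan2(half, convergence_dist))
    return {"left": (-half, toe), "right": (half, -toe)}
```

With a 6.5 cm interaxial and no convergence distance you get a parallel rig; feed it a distance and both eyes toe in symmetrically.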
Most of the time I'm keen on sharing a visual effects tip that I had to rack my own brain over, but this time I'd like to give the airwaves for a brief moment to a Photoshop plugin that comes in really handy for me: Golden Crop. Golden Crop does the mathematical operations to give you complex composition models and algorithms as simple Photoshop guides.
For those of you who are new to installing Photoshop plugins: simply drop the plugin file into your Photoshop plugins folder. OS X users will find this inside the Photoshop folder in the Applications folder.
Once it's installed, you're going to search high and low in Photoshop and wonder what actually happened. Don't worry, I had the same reaction after installing Golden Crop; it's where you least expect it. In the File > Automate menu a new set of options for Golden Crop will appear. Simply select which operations you want it to execute and there you go: compositions no one can argue with you about, because you have science on your side. Or at least Golden Crop.
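If you want a feel for what Golden Crop is computing, the golden-section guides boil down to two lines per axis at phi-based positions. A rough sketch of the math only (my own function, not the plugin's internals):

```python
def golden_guides(width, height):
    """Pixel positions of golden-section guide lines, the kind a
    composition tool draws as Photoshop guides. Each axis is split
    at 1/phi (~0.618) and its mirror (~0.382)."""
    phi = (1 + 5 ** 0.5) / 2        # the golden ratio, ~1.618
    r = 1 / phi                      # ~0.618
    vertical = sorted({round(width * (1 - r)), round(width * r)})
    horizontal = sorted({round(height * (1 - r)), round(height * r)})
    return vertical, horizontal
```

For a 1618x1000 image the vertical guides land at roughly x = 618 and x = 1000, which is the classic golden section of the frame.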
Nathaniel Westveer – www.nathanfx.com
Anyone who uses 3DEqualizer on OS X knows that it runs through X11, and to its credit it runs pretty smoothly. But there are a few issues with X11 that break the software right out of the box for some OS X users.
One is the ALT key.
Support over at www.sci-d-vis.com tells Mac users to download and install an open source version of X11, but is there a simpler fix to this keyboard calamity?
Turns out Francis North, an Inkscape user, noticed the same setbacks in Inkscape. He writes in a forum post (http://www.inkscapeforum.com/viewtopic.php?t=800&f=5):
Launch X11 from Applications/Utilities
Open the preferences window and enable the “Use the system keyboard layout” preference. This will force X11 to use its system default keyboard layout.
Open an xterm window (Command-N) and perform the following:
$ xmodmap -pke > ~/.Xmodmap
This captures the current xmodmap settings to a hidden file located in your home directory. It’s this ~/.Xmodmap file that X11 will by default use to override any system mappings.
Edit your .Xmodmap file by typing the following in the same xterm window:
$ open -e ~/.Xmodmap
Change the line: “keycode 66 = Mode_switch” to “keycode 66 = Meta_L”
Change the line: “keycode 69 = Mode_switch” to “keycode 69 = Meta_R”
Save the file and exit TextEdit.
Disable the X11 preferences “Emulate three button mouse” and “Use the system keyboard layout”, then close the preferences window.
These settings basically say “don’t treat ALT as a special key, and don’t override my .Xmodmap file with system defaults”.
Close the X11 preferences window, any opened xterm windows, and then the X11 application.
That’s it. Now when you launch Inkscape, the ALT key should work as expected, and the status-bar will correctly display any ALT-key-specific help when that key is pressed (e.g. when using the Selection tool).
One final note: from an xterm window, you can debug your key mappings by typing “xev”. This program will open an interactive test window that will echo information about current key/mouse presses, and is a handy utility for determining keycodes and what they’re currently mapped to.
Feel free to provide any feedback and/or request for further clarification.
Thanks to Francis North for this Inkscape tip. 3DEqualizer Mac users will also be thrilled to find out this fixes their ALT key woes as well.
Nathaniel Westveer – www.nathanfx.com
In an earlier entry, I gave away how to bake projections into geometry/cards to do paint fixes on them. Nuke, as powerful as it is, has quite a few bugs here, or more likely wasn't designed around painting into a projection shader. You can do a blend matte to combine projections, but add a RotoPaint and suddenly the scale blows way off, even if you reformat the input. It's kind of pesky if you want to use a camera solve to make paint fixes a bit easier.
There is a way I found; not the sleekest, yet effective. For the sake of example, let's say you have a wall from the shoot covered in cracks that have to be animated forming, and the plate also has a shifting gradient from a light source in the shot. Using a CameraTracker you've solved a 3D camera and it's at your disposal. Place a Card where the wall is. For the texture, use a RotoPaint with an empty Constant at the format of your comp. Then attach the bg1 input to your plate and use it as the clone source for your clone tool, with a lifetime of your choosing. Then merge the output of the ScanlineRender (through the camera) on top of the original plate, and voilà, your paint fix has all the translations, rotations, and perspective shifts you need. Create that clean plate.
Nathaniel Westveer – www.nathanfx.com
Well, blogging tidbits about Nuke has been quite the theme for me these last couple of months. I recently ran into a job where the only tracker I had at hand was inside Nuke. No problem, I thought, I've exported cameras before. But, to my horror, it can be very difficult sometimes. Sure, there are a lot of MEL and Python scripts for Maya, but none of them were playing nice with the version I was using. And most of what you get when you look for alternative routes is people telling you to upgrade to the newest Nuke, which exports .fbx. That was kind of an issue for me: one, I didn't have that version, and two, FBX files are very difficult to get to play nice most of the time.
So is there a way to get Nuke to write out a .ma file natively? With a slightly crude yet very effective plugin, absolutely (cue up the following link in a new window or tab).
After installing this into your plugin folder (make sure you put the base.ma into the icons folder), the createMayaScene gizmo takes a bit to get used to. After attaching all of the handles you still have to type the names of the nodes into the three fields with the correct syntax (for example: Read1, CameraTracker3, etc.).
Once you fill in the fields you're ready to export that camera to .ma. Pick the path you want to save to, remember to put .ma after the file name, hit export, and kick back for those brief 0.2–0.3 seconds as it writes out the file.
Open it in Maya and there you go, you made it. And your coworkers will be so impressed.
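If you're curious what a tool like this is actually writing, a Maya ASCII file is just text. Here's a stripped-down sketch that builds a .ma scene containing one static camera. The attribute names are standard Maya short names, but the exact set a given Maya version expects may vary, and a real exporter also bakes per-frame animation curves, so treat this as illustrative only:

```python
def camera_ma_text(name, translate, rotate, focal=35.0):
    """Build a minimal Maya ASCII (.ma) scene with one static camera.
    A bare-bones sketch: transform node carries translate/rotate,
    the camera shape is parented under it and gets the focal length."""
    tx, ty, tz = translate
    rx, ry, rz = rotate
    lines = [
        '//Maya ASCII scene',
        'requires maya "2011";',          # version string is an assumption
        'createNode transform -n "%s";' % name,
        '\tsetAttr ".t" -type "double3" %g %g %g;' % (tx, ty, tz),
        '\tsetAttr ".r" -type "double3" %g %g %g;' % (rx, ry, rz),
        'createNode camera -n "%sShape" -p "%s";' % (name, name),
        '\tsetAttr ".fl" %g;' % focal,   # focal length in mm
    ]
    return "\n".join(lines) + "\n"

# Write it out and Maya can open the result like any other scene:
with open("exported_cam.ma", "w") as f:
    f.write(camera_ma_text("shotCam", (1, 2, 3), (0, 45, 0), focal=50))
```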
Nathan – www.nathanfx.com
Nuke has a great 3D environment alongside solid 2D node-based compositing, but these tools can seem to live in different worlds: on one side 3D, on the other 2D. Getting things out of 3D into 2D can feel like a daunting task if you're not familiar with the Reconcile3D node. Let's take the GodRays node as an example. A lot of the time the point you want to project from is not on a plane of parallax visible in the plate, but a point in 3D space off in the distance. You can animate it by hand, but if you can solve a camera it's easier than you think.
So fire up Nuke and hit that Tab key as you type R-e-c-o to pull up the node like a pro.
You'll notice there are a few inputs attached to it: img, camera, and axis. The most important one to focus on right now is the axis; where it is observed by the solved (or animated) camera is where the node is going to back out the x,y coordinates. Click and drag the axis to orient it in space where you want your (for this example) god rays to project from, and attach the other inputs to their assigned names. From here you'll have a few choices.
You can hit Calculate Keyframes to have the computer look through the camera, give you the x,y values, and bake them out. Or you can have it update in real time and link an expression (Apple-drag; sorry Windows folks, I can only tell you from a Mac at the moment) from the x,y output to the transforms of the node you're driving.
Either way, this node is amazingly simple to use for all you x,y converts.
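Under the hood, Reconcile3D is just projecting a 3D position through the camera into pixels. Here's the pinhole arithmetic for a camera sitting at cam_pos looking down -Z, Nuke-style. Rotation is omitted for brevity, so this is the idea, not the full node:

```python
def reconcile_point(point, cam_pos, focal, haperture, img_w, img_h):
    """Project a 3D point through an unrotated pinhole camera into
    pixel coordinates. focal and haperture share units (mm); the
    vertical aperture follows from the image aspect ratio."""
    x, y, z = (p - c for p, c in zip(point, cam_pos))
    if z >= 0:
        raise ValueError("point is behind the camera (it looks down -Z)")
    vaperture = haperture * img_h / img_w   # film back follows image aspect
    # similar triangles: on-film offset = focal * x / depth, then the
    # aperture maps film millimetres to normalised screen space
    u = img_w * (0.5 + focal * x / (-z * haperture))
    v = img_h * (0.5 + focal * y / (-z * vaperture))
    return u, v
```

A point dead ahead of the camera lands at the exact centre of the frame, and offsets scale down with distance, which is the parallax the node exploits.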
The Reverse Process – PointsTo3D (from x,y to x,y,z)
Now you say, "Hey Nate, that's great, but I need to translate a point in 2D to 3D." This can be done a few ways; what comes to mind is placing things by hand, not a difficult process in most cases, but The Foundry has your back with a node specialized to do exactly this: PointsTo3D. The minute you get this node on screen you'll quickly observe that it has a camera input like Reconcile3D, and it's relatively the same process: take your solved or animated camera and attach it. From there you'll notice in the node options it has Point A, Point B, and Point C. These are the three observations you need to feed the node to find the point in 3D space. So scrub through your timeline to where you can see your object, space, or thingamabob, set the three points on three different frames, and voilà, there you have it: your 2D values mapped out in 3D space.
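The math behind a multi-frame solve like this is ray intersection: each 2D point plus the camera at that frame defines a ray into the scene, and the 3D point is wherever those rays (nearly) meet. Here's the two-ray case as a least-squares closest-point midpoint, written from scratch rather than taken from the node:

```python
def triangulate(o1, d1, o2, d2):
    """Least-squares meeting point of two rays (origin o, direction d):
    find the closest point on each ray to the other, then average.
    Two-observation version of what a 2D-to-3D solve does with
    observations from several frames."""
    dot = lambda a, b: sum(p * q for p, q in zip(a, b))
    w = [p - q for p, q in zip(o1, o2)]
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w), dot(d2, w)
    denom = a * c - b * b
    if abs(denom) < 1e-12:
        raise ValueError("rays are parallel -- no parallax to solve from")
    s = (b * e - c * d) / denom             # parameter along ray 1
    t = (a * e - b * d) / denom             # parameter along ray 2
    p1 = [o + s * k for o, k in zip(o1, d1)]
    p2 = [o + t * k for o, k in zip(o2, d2)]
    return [(u + v) / 2 for u, v in zip(p1, p2)]
```

When the rays truly intersect, both closest points coincide and you get the exact 3D position back; with real, noisy tracks the midpoint splits the error.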
Nathaniel Westveer – Nathanfx.com
Nathan here. Sometimes it's the small tricks of the trade that get the job done and make all the difference in how a project comes together. In the world of 3D, keeping image planes at the resolution they were shot at is often unnecessary and adds pressure on your system while you work. This often leads to you resizing them one by one. You could be into Automator scripts, but if you're like me you want a solid UI that you don't have to figure out how to use. A small program called Resize Me can work wonders.
Tracking for Modelers: A Look at How Nuke's New PointCloudGenerator Node Can Get You Pixel-Accurate Layouts
As someone who enjoys the long, tedious process of tracking, I've been dying for a program to take the data of a solved virtual camera and create high-density point clouds. As, I'm sure, were our friends at The Foundry when they wrote the new PointCloudGenerator node in Nuke. And the best part is that you don't need to rely on The Foundry's camera solver for this. For those of you who want an accurate representation of a scene, inside and out, without busting out LIDAR, I recommend a few simple rules for capturing the scene you want.
Keep camera movements simple, but move around. I know you want to try a crazy 360° turn to prove how "badass" you are, but really, the solver will recognize and reinforce spatial locations much better if you keep it within 45–60 degrees of rotation.
Moving around is essential.
The camera solver's algorithm relies on what is called parallax. What the computer looks at is a flat, two-dimensional picture; if you're not shooting in stereo, there is very little to clue the computer in on what's going on. How points of contrast move throughout the plate makes all the difference between the computer mapping out the depth and the shot being unsolvable.
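The arithmetic here is the classic parallax relation: depth is focal length times baseline over the pixel shift. A toy version makes the stakes obvious, because zero shift means there's literally nothing to divide by:

```python
def depth_from_parallax(focal_px, baseline, disparity_px):
    """Back-of-envelope depth from parallax: how far away a feature
    is, given how many pixels it shifted between two camera
    positions a known baseline apart. Z = f * B / d."""
    if disparity_px == 0:
        raise ValueError("no parallax: depth is unsolvable")
    return focal_px * baseline / disparity_px
```

With a 1000 px focal length, a half-metre baseline, and a 10 px shift, the feature sits 50 metres out; halve the shift and the estimated depth doubles, which is why small, noisy disparities give such wobbly clouds.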
Do not move the camera too fast.
Especially with CMOS-sensor-based cameras. Heavy movement, or really any movement, can invoke an evil called rolling shutter, where the bottom of the frame isn't updated as fast as the top. The Foundry makes a great plug-in to deal with this; I recommend you check it out. Other trackers also give you some tools to deal with the issue, but the best policy, if you're filming purely for the point cloud, is to just keep it steady, and at a reasonable speed.
Motion blur will also definitely work aggressively to degrade the quality of the track, if you get a usable solve out of it at all.
Nodal pans (a pan attached to one singular node/point in space, such as a pan from a tripod) will, unless given more data, return a spherical cluster of point cloud data from a parallax-based solve. Even though the camera solves correctly, the point cloud will not. For documenting a scene, it's recommended you use free-hand motion.
Getting The Point Cloud out of Nuke.
Now you've got your data out of Nuke. As the modeler you are, it's not going to do you much good if you can't use this data in your favorite modeling package. I'm going to focus on Maya for the sake of simplicity, but the concept remains the same.
The concept is that the point cloud is a group of vertices that you most likely want translated into locators. You can export point clouds through .fbx, but in all my experience it freezes every time due to the amount of data it needs to export.
But never fear because an MIT kid named Dominic Drane has written one for you.
Place the .py file in your Maya plug-in folder. This plug-in is a one-two knockout, importing both the camera and the point cloud data. First export the .chan file from the Nuke camera node and import it; it will then ask if you want to import the OBJ you wrote the vertex data to.
And presto, you're done.
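And if the importer ever lets you down, remember the vertex data is just text. A quick fallback sketch (my own helper, not part of the plug-in) that turns the vertex lines of an OBJ point cloud into a MEL script full of locators:

```python
def obj_to_locator_mel(obj_text):
    """Convert the 'v x y z' lines of an OBJ point cloud into a MEL
    snippet that drops a spaceLocator at each point. Everything else
    in the file (comments, faces) is ignored."""
    mel = []
    for i, line in enumerate(obj_text.splitlines()):
        parts = line.split()
        if len(parts) == 4 and parts[0] == "v":
            x, y, z = parts[1:]
            mel.append('spaceLocator -n "pc_%d" -p %s %s %s;' % (i, x, y, z))
    return "\n".join(mel)
```

Paste the output into Maya's script editor (MEL tab) and the cloud appears as locators you can snap geometry to.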
We've all had it happen: the tracker lied, and we thought we could live with it, but nothing brings out those annoying fluctuations in a tracked camera's translation like trying to composite with it. Well, fear not, smart, handsome, intelligent reader of my blog, because it's easier than you think.
Simply select the frames in the curve editor and right-click, then go to Edit > Filter.
Choose an intensity value and there you go! You are on your way to smoothing your camera back to sanity and peace of mind.
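If you ever want to do the same thing outside the curve editor, the filter is essentially a box blur over the keyframe values. A rough stand-in for what Edit > Filter is doing (Nuke's exact kernel may differ, so this is the concept, not the implementation):

```python
def smooth_keys(values, radius=2):
    """Box-filter smoothing over a list of keyframe values: each key
    becomes the average of its neighbours within `radius` frames,
    with the window clamped at the ends of the range."""
    out = []
    for i in range(len(values)):
        lo = max(0, i - radius)
        hi = min(len(values), i + radius + 1)
        window = values[lo:hi]
        out.append(sum(window) / len(window))
    return out
```

Run it over a camera's translate curve and the frame-to-frame jitter averages out while the overall move survives; a bigger radius is the same as cranking up the filter intensity.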