www.nathanfx.com

Latest

3DEqualizer Alt Key OSX fix

Anyone who uses 3DEqualizer on OSX knows that it runs through X11, and to its credit it runs pretty smoothly.

But there are a few issues with X11 that break the software right out of the box for some OSX users.

One is the ALT key.

Support at www.sci-d-vis.com tells Mac users to download and install an open-source version of X11, but is there a simpler fix to this keyboard calamity?

Turns out Francis North, an Inkscape user, noticed the same setbacks in Inkscape.

He writes in a forum post (http://www.inkscapeforum.com/viewtopic.php?t=800&f=5):

1. Launch X11 from Applications/Utilities.

2. Open the preferences window and enable the “Use the system keyboard layout” preference. This will force X11 to use its system default keyboard layout.

3. Open an xterm window (Command-N) and perform the following:

$ xmodmap -pke > ~/.Xmodmap

This captures the current xmodmap settings to a hidden file located in your home directory. It’s this ~/.Xmodmap file that X11 will by default use to override any system mappings.

4. Edit your .Xmodmap file by typing the following in the same xterm window:

$ open -e ~/.Xmodmap

Change the line: “keycode 66 = Mode_switch” to “keycode 66 = Meta_L”
Change the line: “keycode 69 = Mode_switch” to “keycode 69 = Meta_R”

Save the file and exit TextEdit.
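If you'd rather not make the edit by hand in TextEdit, the same two substitutions can be scripted. This is a minimal sketch (my own, not from Francis's post) and it assumes your ~/.Xmodmap contains the stock keycode 66/69 lines quoted above:

```python
from pathlib import Path

def patch_xmodmap(path):
    """Rewrite the two Mode_switch lines so ALT acts as Meta in X11."""
    # Map the stock Mode_switch assignments to Meta, as described above.
    swaps = {
        "keycode 66 = Mode_switch": "keycode 66 = Meta_L",
        "keycode 69 = Mode_switch": "keycode 69 = Meta_R",
    }
    text = Path(path).read_text()
    for old, new in swaps.items():
        text = text.replace(old, new)
    Path(path).write_text(text)

# Usage: patch_xmodmap(Path.home() / ".Xmodmap")
```

If your dump uses different keycodes, check them with xev first (see the note below) before running this.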

5. Disable the X11 preferences “Emulate three button mouse” and “Use the system keyboard layout”, then close the preferences window.

These settings basically say “don’t treat ALT as a special key, and don’t override my .Xmodmap file with system defaults”.

6. Close the X11 preferences window, any open xterm windows, and then the X11 application.

That’s it. Now when you launch Inkscape, the ALT key should work as expected, and the status-bar will correctly display any ALT-key-specific help when that key is pressed (e.g. when using the Selection tool).

One final note: from an xterm window, you can debug your key mappings by typing “xev”. This program will open an interactive test window that will echo information about current key/mouse presses, and is a handy utility for determining keycodes and what they’re currently mapped to.

Feel free to provide any feedback and/or request for further clarification.

Happy ALT-keying

Thanks, Francis North, for this Inkscape tip. 3DEqualizer Mac users will also be thrilled to find out it fixes their Alt key woes as well.

Nathaniel Westveer – www.nathanfx.com

Camera Solved 3D RotoPainting – Nuke

In an earlier entry, I gave away how to bake projections into geometry/cards to do paint fixes on them. Nuke, as powerful as it is, has quite a few bugs, or more likely it wasn’t designed around painting in the projection shader. You can do a blend matte to combine projections, but add a RotoPaint and suddenly the scale blows way off, even if you reformat the input. It’s kind of pesky if you want to use a camera solve to make paint fixes a bit easier.

There is a way I found: not the sleekest, yet effective. For the sake of example, let’s say you have a wall from the shoot covered in cracks that have to be animated forming, and the plate also has a shifting gradient from a light source in the shot. Using a CameraTracker you’ve solved a 3D camera, and it’s at your disposal. Place a Card where the wall is. For the texture, use a RotoPaint with an empty Constant in the format of your comp. Then attach Bg1 to your plate and use it as the clone source for your clone tool, with a lifetime of your choosing. Then merge the ScanlineRender from the camera on top of the original plate again, and voilà, your paint fixes have all the translations, rotations, and perspective shifts that you need. Create that clean plate.

Nathaniel Westveer – www.nathanfx.com

SS Convert – Convert between PFTrack, SynthEyes, Boujou and more

I’m writing this post today to tell everyone about a great MEL script that I’m sure you trackers out there will enjoy. The Survey Solver, as it’s called (not to be confused with PixelFarm’s new node in PFMatchit and PFTrack), will take a track from various other trackers and configure it to work in another program, such as PFTrack to SynthEyes, or PFTrack to Maya Live, if you’re into that kind of thing.

http://minchomarinov.com/SS/

The script comes from none other than a fellow tracker known as Mincho Marinov (check out his reel while you’re on his site, too). Survey Solver is easy enough to install by dragging it into your Script Editor and saving it out to a designated spot on your shelf. Or, if you’re feeling technical, there are great step-by-step instructions on where to save the files on your system (though it looks like the paths were written for Maya 7). This script runs great in Maya 2011.

Give it a shot,

Happy Tracking – Nathaniel Westveer – www.nathanfx.com

You solved it!…now what – Exporting Nuke Cameras to .MA

Well, blogging about Nuke tidbits has been quite the theme for me these last couple of months. I recently ran into a job where the only tracker I had at hand was inside Nuke. No problem, I thought, I’ve exported cameras before. But, to my horror, it can be very difficult sometimes. Sure, there are a lot of MEL and Python scripts for Maya, but none of them were playing nice with the version I was using. And most of what you get when you look for alternatives is people telling you to upgrade to the newest Nuke, which exports .FBX. That was kind of an issue for me: one, I didn’t have that version, and two, FBX files are very difficult to get playing nice most of the time.

So is there a way to get Nuke to write out a .MA file natively? With a slightly crude yet very effective plugin, absolutely (cue up the following link in a new window or tab):

http://www.creativecrash.com/nuke/downloads/scripts-plugins/camera/c/nuke-camera-tracker-export

After installing this into your plugin folder (make sure you put base.ma into the icons folder), the createMayaScene gizmo takes a bit to get used to. After attaching all of the handles, you still have to type the names of the nodes into the three fields with the correct syntax (for example: Read1, CameraTracker3, etc.).

Once you fill in the fields you are ready to export that camera to .ma: pick where you want to save it, remember to put .ma after the file name, hit export, and kick back for those brief 0.2 to 0.3 seconds as it writes out the file.

Open it in Maya and there you go, you made it. Your coworkers will be so impressed.
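If you're curious why this is even possible: a Maya ASCII (.ma) file is plain text, so writing one from Nuke needs no Maya libraries at all. Below is a hypothetical sketch of the kind of minimal scene such an exporter could emit. The node name, attribute values, and the function itself are my own illustration; the real gizmo writes far more (per-frame animation curves, film back, and so on):

```python
def write_camera_ma(name, translate, rotate, focal_mm):
    """Build a minimal Maya ASCII scene containing one static camera.

    Sketch only: a real exporter also bakes keyframes and film-back
    settings. `.t`/`.r` are the transform's translate/rotate attributes,
    `.fl` is the camera shape's focal length in mm.
    """
    tx, ty, tz = translate
    rx, ry, rz = rotate
    return "\n".join([
        "//Maya ASCII 2011 scene",
        'requires maya "2011";',
        # The transform node carries position and orientation.
        'createNode transform -n "{0}";'.format(name),
        '\tsetAttr ".t" -type "double3" {0} {1} {2};'.format(tx, ty, tz),
        '\tsetAttr ".r" -type "double3" {0} {1} {2};'.format(rx, ry, rz),
        # The camera shape is parented under the transform.
        'createNode camera -n "{0}Shape" -p "{0}";'.format(name),
        '\tsetAttr ".fl" {0};'.format(focal_mm),
        "",
    ])

# Example: open(path, "w").write(write_camera_ma("trackedCam", (1, 2, 3), (0, 45, 0), 35))
```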

Nathan – www.nathanfx.com

3D in 2D (XY to XYZ and back again)

Nuke has a great 3D environment along with solid 2D node-based compositing, but these tools can seem to live in different worlds: on one side 3D, on the other 2D. Getting things from 3D out into 2D can feel like a daunting task if you’re not familiar with the Reconcile3D node. Let’s take the GodRays node as an example. A lot of the time, the point you want to project from is not on a plane of parallax visible in the plate, but a point in 3D space off in the distance. You can animate it by hand, but if you can solve a camera, it’s easier than you think.

So fire up Nuke and hit that Tab key as you type R-e-c-o to pull up the node like a pro.

You’ll notice there are a few handles attached to it: img, camera, and axis. The most important handle to focus on right now is the axis: where the solved (or animated) camera observes it is where the node will back out the X,Y coordinates. Click and drag the axis to orient it in space where you want (for this example) the god rays to project from, and attach the other handles to their assigned inputs. From here you’ll have a few choices.

You can hit Calculate Keyframes to have the computer look through the camera, give you the X,Y values, and bake them out. Or you can have it update in real time and link an expression (Apple-drag; sorry, Windows folks, I can only tell you from a Mac at the moment) from the X,Y transforms to the parenting node.

Either way, this node is amazingly simple to use for all your XY conversions.
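Under the hood, what Reconcile3D is doing is plain pinhole projection: the axis position gets pushed through the camera and lands on a 2D pixel. Here's a rough sketch of that math for a camera sitting at the origin looking down -Z (my own simplification; the real node also handles camera animation, window translate, and lens distortion):

```python
def project_point(point, focal, haperture, width, height):
    """Project a 3D point (camera space, -Z forward) to pixel coordinates.

    Simplified pinhole model: focal and haperture in mm, like the focal
    length and horizontal aperture knobs on a Nuke camera. Returns
    (x, y) in pixels with the origin at the bottom-left of the frame.
    """
    x, y, z = point
    if z >= 0:
        raise ValueError("point is behind the camera")
    # Perspective divide, then normalise by the film back width.
    u = (focal * x / -z) / haperture
    v = (focal * y / -z) / haperture
    aspect = width / float(height)
    # The vertical aperture is haperture / aspect, hence the extra factor.
    return (width * (0.5 + u), height * (0.5 + v * aspect))

# A point straight down the lens axis lands dead centre:
# project_point((0, 0, -10), 50, 36, 1920, 1080) -> (960.0, 540.0)
```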

The Reverse Process – PointsTo3D (from XY to XYZ)

Now you say, hey Nate, that’s great, but I need to translate a point in 2D to 3D. This can be done a few ways; what comes to mind is placing things by hand, not a difficult process in most cases, but The Foundry has your back with a node specialized to do this: PointsTo3D. The minute you get this node on screen you’ll notice it has a camera handle like Reconcile3D, and it’s roughly the same process: take your solved or animated camera and attach it. From there you’ll notice the node options have Point A, Point B, and Point C. These are the three points you need to feed the node to find the point in 3D space. So scrub through your timeline to where you see your object, space, or thingamabob, set the three points on three different frames, and voilà, there you have it: your 2D values mapped out in 3D space.
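What PointsTo3D is doing is triangulation: each 2D pick, together with the camera's position on that frame, defines a ray into the scene, and the node finds where the rays (nearly) meet. Here's a toy version of the idea using the midpoint of the closest approach between two rays. This is standard geometry, not The Foundry's implementation, and it assumes you've already converted each 2D point into a world-space ray:

```python
def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def add_scaled(p, d, t): return tuple(x + t * y for x, y in zip(p, d))

def triangulate(c1, d1, c2, d2):
    """Closest point between rays c1 + t*d1 and c2 + s*d2.

    c1/c2 are camera positions on two frames; d1/d2 are the ray
    directions through the picked 2D point on each frame.
    """
    r = sub(c1, c2)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, r), dot(d2, r)
    denom = a * c - b * b
    if abs(denom) < 1e-12:
        # Parallel rays mean no parallax: exactly the nodal-pan problem.
        raise ValueError("rays are parallel; pick frames with more parallax")
    # Solve the 2x2 normal equations for the closest-approach parameters.
    t = (b * e - c * d) / denom
    s = (a * e - b * d) / denom
    p1 = add_scaled(c1, d1, t)
    p2 = add_scaled(c2, d2, s)
    # Midpoint of the shortest segment between the two rays.
    return tuple((x + y) / 2.0 for x, y in zip(p1, p2))
```

This is also why the node wants picks on three different frames: more rays average out tracking noise.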

Nathaniel Westveer – Nathanfx.com

Interior Panoramas

Let’s face it, having blueprints for every piece of architecture you’ll come across isn’t always realistic, or practical. This leads to finding “unofficial” ways to transform what we see in front of us into an orthographic image. Since I have a camera on me more often than a measuring tape, as I’m sure all of us do these days, making a photo stitch is a great solution.

A few problems arise when you go about doing this.
Parallax. What normally isn’t a problem when you capture that wide mountain vista in your vacation photos suddenly becomes a jarring one. From one photo to the next, objects closer to the camera will jump further and further from shot to shot. So far, in fact, that stitching algorithms such as Photoshop’s throw up their hands in defeat, as they can’t figure out what is going on between the background of the photo and the foreground.

Photoshop isn’t the sharpest tool in the drawer here: face down in the mud, it hopelessly shouts, no, no, I can do it, I don’t want anyone’s help.

For a solution to this issue, a different program is required.
Thanks to those hardworking open-source enthusiasts and developers, we have Hugin. This affectionately named program knows the ins and outs of photo stitching while keeping human input at the heart of the matter. Tutorials for this program are well documented on its own website, but for this situation, how you shoot matters a lot.

http://hugin.sourceforge.net/

Always keep an equal distance from the furthest plane of parallax, and shoot in straight lines, literally: trace out a horizontal line and follow it to a T. This will make what could be a difficult solve easier.

(Only for reference panos.) Try to take each picture from the closest point to you to the next; the goal is to get the stitched image as orthographic (flat, without perspective) as we can make it. This means surfaces that come straight at you should be photographed head-on, with as little of their sides photographed as possible.

Once you’ve got that down, get the photos imported and loaded.
Only align user feature points on the furthest plane of parallax, or the plane you want to align to. This will create consistency in the stitch throughout the solve and give you the best results.

Batch Resizing

Nathan here. Sometimes it’s the small tricks of the trade that get the job done, and they can make all the difference in how a project gets done. In the world of 3D, keeping image planes at the resolution they were shot in is often unnecessary and adds pressure on your system while you work. This often leads to you resizing them one by one.
You could be into Automator scripts, but if you’re like me you want a solid UI that you don’t have to figure out how to use. A small program called ResizeMe can work wonders.

http://www.apple.com/downloads/macosx/imaging_3d/resizeme.html

Resize me 2.2
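And if you do feel like scripting it after all, macOS ships with sips, a built-in command-line image tool. A hedged sketch (the folder path and width below are placeholders) that builds one resize command per JPEG in a folder; sips's --resampleWidth scales an image down while keeping its aspect ratio, and it edits files in place, so copy your originals first:

```python
import glob
import os
import subprocess

def sips_commands(folder, width=1024):
    """Build one `sips` resize command per JPEG in `folder`."""
    cmds = []
    for path in sorted(glob.glob(os.path.join(folder, "*.jpg"))):
        # sips edits in place; work on copies of your plates.
        cmds.append(["sips", "--resampleWidth", str(width), path])
    return cmds

# To actually run them (macOS only):
# for cmd in sips_commands("/path/to/plates", 2048):
#     subprocess.run(cmd, check=True)
```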

Tracking for Modelers: A look at how Nuke’s new PointCloudGenerator node can get you pixel-accurate layouts

As someone who enjoys the long, tedious process of tracking, I’ve been dying for a program that takes the data of a solved virtual camera and creates high-density point clouds. As, I’m sure, were our friends at The Foundry when they wrote the new PointCloudGenerator node in Nuke. And the best part is that you don’t need to rely on The Foundry’s camera solver for this. For those of you who want an accurate inside-and-out representation of a scene without busting out LIDAR, I recommend a few simple rules for capturing the scene you want.

Keep camera movements simple, but move around. I know you want to try a crazy 360° turn to prove how “badass” you are, but really, the solver will recognize and reinforce spatial location much better if you keep it within 45° to 60° of rotation.

Moving around is essential.
The camera-solver algorithms rely on what is called parallax.
What the computer looks at is a flat, two-dimensional picture; if you’re not shooting in stereo, there is very little to clue the computer in on what’s going on. How points of contrast move throughout the plate makes all the difference between the computer mapping out the depth and the shot being unsolvable.

Do not move the camera too fast.
Especially with CMOS-sensor-based cameras, heavy movement (really, any movement) can invoke an evil called rolling shutter, where the bottom of the frame isn’t updated as fast as the top. The Foundry makes a great plug-in to deal with this; I recommend you check it out. Other trackers also give you some tools to deal with the issue, but if you’re filming simply for the point cloud, the best policy is to just keep it steady and at a reasonable speed.
Motion blur will also definitely work aggressively to degrade the quality of the track, if you get a usable solve out of it at all.

Nodal Pans
With a nodal pan (a pan around one singular node/point in space, such as a pan from a tripod), unless given more data, a parallax-based track solve will return a spherical cluster of point-cloud data. Even though the camera is solved correctly, the point cloud will not be. For documenting a scene, it is recommended you use freehand motion.

Nuke Dense Point Cloud

Getting the point cloud out of Nuke.
So you’ve got your data in Nuke. As the modeler you are, it’s not going to do you much good if you can’t use it in your favorite modeling package. I’m going to focus on Maya for the sake of simplicity, but the concept remains the same.

The concept is that the point cloud is a group of vertices that you most likely want translated into locators.

You can export point clouds through .FBX, but in all my experience it freezes every time due to the amount of data that needs to be exported.

But never fear, because an MIT kid named Dominic Drane has written an importer for you.
(www.reality-debug.co.uk)

http://www.creativecrash.com/maya/downloads/scripts-plugins/utility-external/import/c/-chan-file-exporter-importer-with-nuke6x-point-cloud-importing–2/download_page

Place the .py file in your Maya plug-in folder.

This plug-in is a one-two knockout, importing both the camera and the point-cloud data.
First export the .chan file from the Nuke camera node and import it; the script will then ask if you want to import the OBJ you wrote the vertex data to.

And presto, you’re done.
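If you ever need to get at the points without the plug-in, the OBJ Nuke writes is just text: every vertex is a "v x y z" line. Here's a minimal parser (my own illustration, not Dominic's script) that pulls the points out so you could build one locator per vertex:

```python
def read_obj_points(lines):
    """Extract (x, y, z) tuples from the `v` lines of an OBJ file."""
    points = []
    for line in lines:
        parts = line.split()
        # Vertex positions carry the `v` tag; skip faces, normals, comments.
        if parts and parts[0] == "v":
            points.append(tuple(float(n) for n in parts[1:4]))
    return points

# In Maya you would then do something like (hypothetical usage):
# import maya.cmds as cmds
# for x, y, z in read_obj_points(open("cloud.obj")):
#     cmds.spaceLocator(p=(x, y, z))
```

Fair warning: a dense cloud means thousands of locators, which is exactly the data volume that chokes the FBX route.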

Smoothing Camera Paths in Nuke

We’ve all had it happen: the track looked fine and we thought we could live with it, but nothing brings out those annoying fluctuations in a tracked camera’s translation like trying to composite with it. Well, fear not, smart, handsome, intelligent reader of my blog, because it’s easier than you think.

Simply select the frames in the keyframe editor and right-click, then go to Edit > Filter.

Choose a value of intensity and there you go! You are on your way to smoothing your camera and to sanity and peace of mind.
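Conceptually, that filter is averaging each keyframe with its neighbours. Here's a toy version of the idea in plain Python (not The Foundry's actual filter), where a larger window means heavier smoothing:

```python
def smooth_curve(values, window=3):
    """Box-filter a list of keyframe values; `window` should be odd."""
    half = window // 2
    out = []
    for i in range(len(values)):
        # Clamp the window at the ends so the curve keeps its length.
        lo = max(0, i - half)
        hi = min(len(values), i + half + 1)
        out.append(sum(values[lo:hi]) / float(hi - lo))
    return out

# A jittery translate curve flattens toward its trend:
# smooth_curve([0.0, 1.0, 0.0, 1.0, 0.0]) -> [0.5, ~0.33, ~0.67, ~0.33, 0.5]
```

The trade-off is the same one the Filter dialog gives you: too much intensity and a genuine camera bump gets smoothed away along with the noise.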


Baking out Projection in Nuke

Any Nuke artist loves projections. In fact, when you are a Nuke artist you start getting the urge to project everything onto geometry, and why not? It’s a lot better than multi-planing in Shake. But this wonderful tool has a few limitations: you can’t use multiple projection shaders, and with any roto item placed after the projected element in the node tree, the projection maps incorrectly. So, can we bake what we project onto geometry out to UVs? Absolutely. Notice how the render node says obj/scn; you have probably used the scn part, but today we are going to attach the obj part. Go ahead and link the input to the geometry node you want to bake out. Next, in the render node, under the scanline tab, go to the projection drop-down menu and change it to UV.

If your viewer is attached to the node tree to see the rendered results, you’ll see the UV texture rendered out and ready for you to write to a file. (Note: Cards and other Nuke geometry already have their UVs laid out; if you haven’t laid out your UVs, or you’re generating a PoissonMesh, you’ll need to set your UVs in an external 3D program.)