MocoMaya: Controlling a Virtual Camera Rig in Maya

Being able to control a real-life robotic armature/camera rig wasn’t enough… so I implemented the MocoMaya client Python module. Autodesk Maya 2010 can be scripted through its embedded Python interpreter. Python is pretty much a second-class citizen in Maya, since the original, native, and well-supported scripting language is MEL. Fortunately they have it set up so that the Python API fairly closely mirrors the MEL API… albeit quite nastily.

The MocoMaya implementation can connect to a MocoServer and receive live actuator movements, publish its own “viewfinder” (which is really just a live rendering of the scene as seen by the camera), and even import the CurveSet saved from the MocoMotionBuilder interface.

To run MocoMaya you start up Maya, open the Script Editor and, under the Python tab, enter about four lines of Python. Really only two lines count (import MocoMaya; MocoMaya.MocoMaya()); the two lines before them just append to sys.path so Maya can find the MocoMaya module… remember this is an embedded and quite stripped-down Python. When you execute it you get a nice GUI.
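Roughly what goes into the Python tab looks like this (the module path is just a placeholder, point it at wherever the MocoMaya module actually lives on your machine):

```python
import sys
sys.path.append('/path/to/moco/modules')  # example path; wherever MocoMaya.py is kept

import MocoMaya
MocoMaya.MocoMaya()  # brings up the MocoMaya GUI
```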

When you run MocoMaya a small window pops up asking for the server name and port of the MocoServer; it also lets you specify an existing virtual Maya camera to be controlled by the MocoServer, otherwise it just creates its own camera. After you enter the proper server and port and hit Connect, a skeletal rig (built from joints) is created and the camera is automatically parented to its neck… this rig represents all the joints that are movable on the actual real-life rig (I still need to add constraints/limits to the joints). From there you can go over to one of the other two tabs: Controller and Import/Export.
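For a feel of what gets built at connect time, here is an illustrative sketch only; the joint names, hierarchy and camera handling are my assumptions, not the exact code MocoMaya runs:

```python
import maya.cmds as cmds

# Build a simple joint chain standing in for the real rig's movable joints.
cmds.select(clear=True)
base = cmds.joint(name='moco_base', position=(0, 0, 0))
arm = cmds.joint(name='moco_arm', position=(0, 5, 0))
neck = cmds.joint(name='moco_neck', position=(0, 10, 0))

# Either reuse an existing camera or make a new one, then parent it to the neck joint.
camera = cmds.camera(name='moco_camera')[0]
cmds.parent(camera, neck)
```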

In the Controller tab there are several settings for live control. You can enable/disable the MocoServer’s live control of your camera (i.e. as the user does things in the MocoMotionBuilder interface, things start moving around in Maya). You can also disable the outgoing viewfinder, which can drastically speed up Maya’s reaction time since it no longer renders the scene on every single move/frame. Finally, you can turn down the resolution of the viewfinder output so it renders faster.

Note that when live control is turned on, the following are constantly updated in Maya as they change on the GUI: the armature positions and the “current frame”. The “current frame” is a relative concept: no matter what Begin/End Frames you set on the GUI, frames are represented internally (and on export) as a normalized value from 0 to 1. This means that if you’re creating an animation in Maya with a totally different frame rate/range, the live playhead moves relative to the Begin/End frame range in Maya… this is a feature. If you want the actual frame numbers to be identical, simply set both the GUI’s and Maya’s Begin/End frame values to be the same.
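The mapping itself is simple; here is a small sketch of the idea (my reconstruction, not MocoMaya’s actual code), using Maya’s playback range as the Begin/End values:

```python
import maya.cmds as cmds

def normalized_to_maya_frame(t):
    """Map a normalized 0-1 frame position onto Maya's own playback range."""
    begin = cmds.playbackOptions(query=True, minTime=True)
    end = cmds.playbackOptions(query=True, maxTime=True)
    return begin + t * (end - begin)

# Jump the playhead to the halfway point of whatever range Maya is set to.
cmds.currentTime(normalized_to_maya_frame(0.5))
```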

Additionally, for every actuator command MocoMaya receives, it also renders the image at the end of the motion from the rig camera and publishes it to the MocoServer as “viewfinder/maya”… this viewfinder, like all the other viewfinders, can be viewed in the browser or pulled in by anything else that consumes viewfinder images (e.g. MocoCompositor).
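Conceptually the viewfinder step is just “render one frame from the rig camera, then push the image”. A rough sketch of the render half follows; the publish call is left as a hypothetical placeholder since the MocoServer wire protocol isn’t shown here, and the camera name and resolution are just examples:

```python
import maya.cmds as cmds

def update_viewfinder(camera='moco_camera', width=320, height=240):
    # Render a single frame from the rig camera with the current renderer.
    image_path = cmds.render(camera, xresolution=width, yresolution=height)
    # moco_client.publish('viewfinder/maya', open(image_path, 'rb').read())  # hypothetical publish call
    return image_path
```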

In the Import/Export tab (only Import works for now) you can specify the filename you saved the CurveSet to in the MocoMotionBuilder interface. It will pull the CurveSet in from the MocoServer and apply all the keyframes and rotations to the virtual rig… you can see the curves in the Graph Editor by clicking on the rig/camera.
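Applying an imported curve to the rig boils down to setting keyframes on the joint attributes. This is illustrative only; the joint/attribute names and the curve data layout are assumptions, and the normalized 0-1 frame positions are mapped onto Maya’s range as described earlier:

```python
import maya.cmds as cmds

# Hypothetical imported data: attribute plug -> list of (normalized time, value) keys.
curve_data = {'moco_neck.rotateY': [(0.0, 0.0), (0.5, 45.0), (1.0, 90.0)]}

start = cmds.playbackOptions(query=True, minTime=True)
end = cmds.playbackOptions(query=True, maxTime=True)

for plug, keys in curve_data.items():
    node, attribute = plug.split('.')
    for t, value in keys:
        frame = start + t * (end - start)
        cmds.setKeyframe(node, attribute=attribute, time=frame, value=value)
```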

It’s worth mentioning that during live control from the MocoServer, nothing is actually keyframed or saved in your Maya file… it’s only used for previewing the framing and such. If you actually want the rig motion curves applied in Maya, you have to save them in the MocoMotionBuilder and import the CurveSet as mentioned above. This is also a feature (it avoids screwing with your Maya scene too much).

Also note that if your rig is already keyframed and animated, it’s probably a good idea to turn off live control; otherwise Maya and the MocoServer end up fighting over the rig as the frames are scrubbed. The easiest fix is to delete your keyframes for the rig so the MocoServer can get total control.
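A quick way to clear that animation off the rig joints (joint names are just the ones from the sketch above):

```python
import maya.cmds as cmds

for joint in ('moco_base', 'moco_arm', 'moco_neck'):
    if cmds.objExists(joint):
        cmds.cutKey(joint, clear=True)  # remove all animation curves on the node
```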

So what’s next? I’ll need to get around to implementing the Export feature (to export the CurveSet to the MocoServer so others can use it). Right now live control is one-way (MocoServer -> MocoMaya). I’ve been contemplating bi-directional live control, so any rig changes made in Maya would send actuator commands back to the MocoServer… but that’s another can of worms waiting to happen: trapping events in Maya, and the whole re-entrancy issue of receiving actuator commands you yourself published (the latter can probably be fixed by adding an ‘_initiator’ signature of some sort and filtering out actuator commands carrying your own signature). There are more important things to implement/fix before I get time to think about this, so I’ll just add it as a TODO here.
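The ‘_initiator’ filtering idea would look something like this; purely hypothetical field names and placeholder publish/apply steps, since none of this is implemented yet:

```python
CLIENT_ID = 'mocomaya-1'  # some unique signature for this client

def send(command):
    command['_initiator'] = CLIENT_ID  # stamp outgoing actuator commands
    # publish the command to the MocoServer (placeholder)

def handle(command):
    if command.get('_initiator') == CLIENT_ID:
        return  # ignore the echo of commands we originated ourselves
    # ...apply the actuator movement to the rig in Maya...
```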

“Oh, controlling Maya is great and all, but what about us poor people who can’t afford Maya?” you ask… yeah, I feel your pain. I originally wanted to implement a MocoBlender (for the awesome open-source Blender 3D program)… unfortunately, after much research I found out that they are in the middle of a major overhaul for their 2.50 release. Their Python API is being reworked and isn’t quite ready for prime time. That left me two options: implement it against the old API with a limited lifespan, or wait it out until 2.50 comes out and implement it then… yeah, I’m lazy, I’ll wait until it comes out… well, after graduation :P.

Here’s a quick little video I made by using MocoMaya to plan out the motion of the virtual rig, then importing the CurveSet, and finally doing a Maya batch render out to frames. I used ImageMagick’s convert command to make an MPEG out of it. Note: I found the house model and the background image somewhere online (no ownership claim, but fair use). Obviously I’m not a professional animator 🙂
