Working on Suicide Squad

In Part 5 of this blog series, I’ll be going over the work we did for Suicide Squad. One of 2016’s most hyped films, it landed with a critically derided meh.
I won’t go into the behind-the-scenes story of why the movie went from A+ to D- in a heartbeat; there are plenty of articles out there that talk about what happened. Instead, I’ll talk about the challenges we went through to bring some amazing visuals to the screen.

For the record, I honestly don’t think it was as bad as it was made out to be. It was definitely not great, but it was fun enough, and I’ve seen worse films get better reviews. I’ve certainly seen worse superhero films. That doesn’t excuse it, but I feel the hate was a direct response to the immense hype rather than a totally objective take. But then again, maybe I’m not objective myself, having worked on it.

Also my favorite review that I’ve read states: “If 2016 could be summed up in a movie, it would be Suicide Squad”. Ouch!

Challenges and Craziness

So for this movie, we were responsible for two primary characters.

Incubus (the big orange dude) and Enchantress (the lady in the green dress with the eyebrows).

 

We used the tentacle technology developed on Edge of Tomorrow to handle Incubus’s tentacles of destruction

Incubus

Incubus was of course fully CG. You can see him in the trailer as the guy who destroys the subway train.

This was lost in translation in the final movie, but he actually absorbs everything he destroys. There’s like a mini universe inside him.
If you were to pause on a frame of him and strip away his armor, there are floating heads, eyeballs, guns, and even an entire tank inside of him.

Unfortunately with all the other effects, and the armor, it totally gets lost.

He also fires tentacles outwards when destroying things. We made good use of the tentacle technology developed for Edge of Tomorrow to create them.

Enchantress

Played by Cara Delevingne, Enchantress is a semi-CG character. When she’s in her jade outfit, basically the only part of her that is real is her face, and even then, we replace her face in a few shots.

The rest of her body is all computer generated, a mixture of some great tracking, animation, simulation and shading.
It may not look as realistic in the final film, with all her glowing tattoos and other effects, but if you were to see the CG model without all of that, there are several shots where the only way we could tell the plate from the CG was to look for her eyebrows (our model didn’t have eyebrows for a while).

We made use of some new skin shading technology, a new muscle simulation technology and a lot of talented artist time to recreate Ms. Delevingne in CGI.

Enchantress in all her CG glory.
Forgive the low-res image, I had to find it via Google Images and there’s not much available yet.

Python For Suicide Squad

We made a lot of use of Python for the movie to make our lives easier and create new pipelines.

Our Muscle Simulation Pipeline used new technology invented by Ziva Dynamics

Muscle Simulation Pipeline

To get Enchantress to look as realistic as possible, we had to simulate the muscles under her skin.

About the same time we were doing this, Ziva Dynamics were running closed betas of their new muscle simulation technology.
These are the same people who did the amazing muscle work at Weta, and they now sell their systems for both feature film and interactive video game work. (Seriously, their realtime VR demo is mind-blowing.)

Our character artist who was doing the sims needed to work in stages.

  1. Take the animation and prepare it.
  2. Simulate the bones (some bones are flexible) and write out their geometry cache.
  3. Simulate the muscles on top of the bones, and cache them out.
  4. Simulate a fascia on top of the muscles.
  5. Simulate the fat and the skin sliding on top of this fascia.
  6. Simulate the tight clothing sliding on the skin.

While we used Ziva for the actual simulation, we still needed a new pipeline to procedurally handle all these stages.

So I built a framework where the artist could provide a list of stages and their dependencies, as well as a construction class that would set up the sims.
My tool would then figure out the dependency graph, generate all the data needed at each stage to feed the next, and finally send the result through to the lighting department.

The framework was completely built using Python and in fact does not rely on Ziva at all, but does support it where needed.

This became especially useful when we had to run through multiple shots at once, and it meant that setups could be reused across characters with little work.
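To give a sense of the shape of that framework, here is a minimal sketch of the core idea: each stage declares what it depends on, the framework topologically sorts them, and each stage hands its cache to the next one. The class names, paths and stages are illustrative only; the production code also handled things like farm submission and the handoff to lighting.

```python
# Minimal sketch of a stage-based sim framework. Stages declare dependencies,
# the framework resolves the run order and hands each stage the caches written
# by the stages it depends on.

class Stage(object):
    name = None
    dependencies = ()  # names of stages that must run first

    def run(self, shot, upstream_caches):
        """Run this stage's sim and return the path of the cache it wrote."""
        raise NotImplementedError


class BoneStage(Stage):
    name = "bones"

    def run(self, shot, upstream_caches):
        # ...simulate the flexible bones here and write out a geometry cache...
        return "/caches/%s/bones.abc" % shot


class MuscleStage(Stage):
    name = "muscles"
    dependencies = ("bones",)

    def run(self, shot, upstream_caches):
        bone_cache = upstream_caches["bones"]
        # ...simulate the muscles on top of the bone cache here...
        return "/caches/%s/muscles.abc" % shot


def resolve_order(stages):
    """Topologically sort stages so dependencies always run first."""
    by_name = {stage.name: stage for stage in stages}
    ordered, visited = [], set()

    def visit(stage, chain=()):
        if stage.name in chain:
            raise ValueError("Cyclic dependency involving %s" % stage.name)
        if stage.name in visited:
            return
        for dep in stage.dependencies:
            visit(by_name[dep], chain + (stage.name,))
        visited.add(stage.name)
        ordered.append(stage)

    for stage in stages:
        visit(stage)
    return ordered


def run_shot(shot, stages):
    """Run every stage for a shot, feeding each one the caches it depends on."""
    caches = {}
    for stage in resolve_order(stages):
        upstream = {dep: caches[dep] for dep in stage.dependencies}
        caches[stage.name] = stage.run(shot, upstream)
    return caches


# Example: run the first two stages of the list above for one shot.
print(run_shot("sq100_sh010", [MuscleStage(), BoneStage()]))
```

The nice part of a pattern like this is that nothing in it is Ziva-specific: a stage can wrap any simulation step, which is why the same setup could be reused across characters with little work.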

Ingesting Motion Capture Data

For this show, we had a lot of motion capture data that needed to be ingested.
But we had some big problems that made it slow to do so manually.

  • Our motion capture vendor had their own naming conventions for scene files.
  • The motion capture rigs weren’t directly compatible with our production rigs and required some manual work.
  • We needed to then playblast these and add them to our animation library.

Doing this manually would have taken roughly 20 minutes per captured clip, assuming no issues. We had a couple hundred clips, so that would be a full week’s work just for one iteration over all of them.

This is where Python was super useful. I figured out the steps and scripted it all up, so the whole set could be processed in a couple of hours instead.

Given a list of files, it would:

  • Convert it to our naming conventions
  • Create a tracking camera to playblast it
  • Transfer the animation to our production rigs
  • Add these to our animation library
  • Send out an email

That meant I could start the process, go to my meetings, come back and it would be done.

That’s a lot of time saved.
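For a rough idea of what the scripted version looked like, here is a cut-down sketch using Maya’s Python commands. The naming conversion, rig retargeting, library registration and email steps were all studio-specific, so they are reduced to comments and placeholders here.

```python
import os
from maya import cmds


def ingest_clip(clip_path, output_dir):
    """Import one vendor mocap clip, playblast it and return the movie path."""
    clip_name = os.path.splitext(os.path.basename(clip_path))[0].lower()

    cmds.file(new=True, force=True)
    cmds.file(clip_path, i=True)  # import the vendor's mocap scene

    # ...convert the vendor's naming to our conventions here...
    # ...transfer the mocap skeleton's animation onto the production rig here...

    # A simple camera to playblast from; the real tool tracked the character root.
    camera, _ = cmds.camera(name=clip_name + "_cam")
    cmds.xform(camera, translation=(0, 2, 10))

    movie = os.path.join(output_dir, clip_name + ".mov")
    cmds.playblast(filename=movie, viewer=False, percent=100, forceOverwrite=True)
    return movie


def ingest_all(clip_paths, output_dir):
    movies = [ingest_clip(path, output_dir) for path in clip_paths]
    # ...register the movies with the animation library and send a summary email...
    return movies
```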

Motion Capture lets us realistically capture animation from real world actors. This is often then used as a base to animate on top of, or even just as a reference.

Learn Python

If you're interested in learning more about Python, why not sign up for my course here?

This course takes you from learning the fundamentals of Python, to creating tools with advanced user interfaces, all while creating projects you can use immediately.
It will teach you many of the skills necessary to create tools like the ones in this article.

With over 700 students in its first week, including artists from major studios, it is sure to provide valuable knowledge to everyone.



Python For Maya: Artist Friendly Programming - $70 (regular price $90)

 

If you sign up from this link, it will save you $20 off the retail price using the coupon code: DGOVILCOM

Buy The Movie

Okay, I know I basically just talked about how bad the movie was, and I won’t lie, it’s not great, but maybe you’ll like it in the same campy way people like the original Evil Dead? It’s actually pretty fun and you might have a good time.


 

Buy the BluRay

 

Affiliate Link

Working on Hotel Transylvania 2

In Part 4 of my blog series, I’ll be going over what it was like to work on Hotel Transylvania 2 as well as how we used Python during its production.

This was the first project I’ve done where I was on it from the very beginning and it’s a really great experience being able to shape the foundations of the movie itself.

Additionally, I basically grew up watching Dexter’s Laboratory and Samurai Jack, so being able to work with Genndy Tartakovsky was something I was incredibly psyched to do.

In the end it turned out to be a very difficult film to work on, with many changes to the story, a lot of crowds, large environments and a team that was fairly new to the studio.

Oh, and it’s also the show where I met my wife. She’d just started in the industry as a cloth/hair artist and, well, I guess it’s a happy story from then on!

A Behind the Scenes look of working on Hotel T2

Challenges of The Project

Hotel Transylvania 2 was probably one of our most challenging animated features at the time.
This section goes over some of the challenges we had:

 

Animation Style

If you’re at all familiar with Genndy, you know he loves really over-the-top animation, where each frame is sculpted the way a 2D animator would work.
This meant we were often pushing the character rigs way past where they were designed to go, and animators would sculpt their characters on a per-frame basis to get the graphic style necessary.

 

Characters and Crowds

The movie had north of 200 hero characters, and a further 250+ variation characters. With close to 500 characters, versus a typical animated feature with under a hundred, we were breaking new ground.
Additionally, it wasn’t unheard of to have single shots with close to a hundred characters in them at a given time.

This made for a huge amount of work managing so many assets, and also heavy use of our crowd system, which helped animators push out shots even when there was so much going on.
Other movies since have continued to push past these numbers, with the upcoming Emoji movie trying to take our crown, while Storks had shots with close to 800 characters in a scene at once.

 

Cloth and Hair

We spent a lot of R&D upgrading our cloth and hair systems for this show.

We had Dennis, the child, who had a lot of red, curly hair.
We had characters transforming shapes while wearing multiple layers of clothing.
We had shots where animators would sculpt their characters into shapes that could never exist.

There had to be a way to get physically simulated cloth and hair while still maintaining the graphic style that Genndy loves and is known for.

Large Environments

Environments were both incredibly large but also very dense.
To tackle this, we had to make heavy use of instancing to keep our memory usage low.

We were also often traversing outside the region of floating point precision, where graphics calculations and simulation start breaking down. This required the development of a new way to efficiently handle scenes where we’d travel outside this region.

Python On Hotel Transylvania 2

To get through this show, we had to work as efficiently as possible. Here’s a sampling of the Python tools I developed on this show.

Working at Origin

There were several shots that involved our characters moving great distances.

The problem with this is that eventually you move outside the limits of what computers can accurately represent and you start getting floating point precision errors.

This manifests itself as little bugs in simulations and other calculations, or even the application crashing.
Traditionally, movies work around this by centering the scene at the center of the world at the beginning of each shot. Unfortunately for us, we had single shots, and many of them, where we’d travel too far for this to work.

To work around this, I came up with a system that would take the scene’s major character, find their world position, and offset the entire scene back to the center of the world.
This was implemented using Python to build a node network inside Maya, and today it’s implemented as a single custom C++ node.

This gives us the illusion that from the camera’s point of view, we’re still in the right spot, but really we’re working at the center of the world.
When the animators move their character forward, it actually just moves the whole scene backwards.

These offset values are recorded and can be used to move everything back to its actual world position at render time.
This lets all our departments work in a very comfortable range.

Animators don’t have to keep moving their cameras.
Cloth, Hair and FX can all work at the world center.
Everyone is happy.
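Here is a heavily simplified sketch of the idea. The production version was a live node network (and later a single custom C++ node), but a baked, two-pass script is enough to show the concept: record the hero’s world position per frame, then counter-animate the group that holds the scene. The node names are placeholders.

```python
from maya import cmds


def bake_origin_offset(hero_root, scene_group, start, end):
    """Counter-animate scene_group so hero_root effectively stays at the origin."""
    # Pass 1: record the hero's world-space position before anything is offset.
    positions = {}
    for frame in range(start, end + 1):
        cmds.currentTime(frame)
        positions[frame] = cmds.xform(hero_root, query=True,
                                      worldSpace=True, translation=True)

    # Pass 2: move the whole scene the opposite way, frame by frame.
    for frame, position in sorted(positions.items()):
        for axis, value in zip("XYZ", position):
            cmds.setKeyframe(scene_group, attribute="translate" + axis,
                             value=-value, time=frame)

    # The recorded positions can be exported and reapplied at render time
    # to put everything back at its true world location.
    return positions


# Example (hypothetical node names):
# offsets = bake_origin_offset("hero:root_ctrl", "scene_offset_grp", 1001, 1120)
```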

This was the sequence we developed this tool for.
The car is moving pretty fast and rapidly goes outside of the comfort zone on some of the longer shots…

Reusing animation is a tale as old as time…
because it’s smart! Movies take a lot of time, money and effort.
It’s wise to save yourself time that you can spend elsewhere.

Rebuilding Animation

So we’ve got a lot of shots, a lot of characters, basically a lot of work. So let’s be smart and reuse as much as we can from the previous Hotel Transylvania film.

Should be easy, right? Wrong.
In the meantime we’ve changed our toolsets so significantly that our old geometry data is no longer that useful.
We need to recreate all these old scenes so that we can use them: close to 400 in total.

So they ask me to rebuild it all. This is where Python comes in handy, because I was definitely not going to do it manually.
I have the names of the characters, so I can use those to look up which characters they map to on the new show.
I can also find their old cached animation curves.

So my tool takes a text file as input that tells it which characters are in the scene and where.
It then finds all the relevant data and rebuilds the scene using our new, shiny assets, and within a day we had 400 scenes recreated as if they’d been done from scratch.

If I did this manually, it would have taken a few months.
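The flow of that rebuild tool looked roughly like the sketch below. The manifest format, the asset mapping and the step that reattaches the old animation curves are all stand-ins for our pipeline-specific code, so treat this as an outline rather than the real thing.

```python
from maya import cmds

# Hypothetical mapping from characters on the old show to their rebuilt assets.
OLD_TO_NEW_ASSETS = {
    "dracula_old": "/assets/characters/dracula/rig_latest.ma",
}


def rebuild_scene(manifest_path):
    """Rebuild one legacy scene from a simple text manifest of characters."""
    cmds.file(new=True, force=True)
    with open(manifest_path) as manifest:
        for line in manifest:
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            # Each line: <old character name> <x> <y> <z>
            old_name, x, y, z = line.split()

            new_asset = OLD_TO_NEW_ASSETS.get(old_name)
            if not new_asset:
                cmds.warning("No new asset mapped for %s" % old_name)
                continue

            # Reference the new asset and place it where the old one sat.
            nodes = cmds.file(new_asset, reference=True, returnNewNodes=True,
                              namespace=old_name)
            roots = cmds.ls(nodes, assemblies=True)
            if roots:
                cmds.xform(roots[0], worldSpace=True,
                           translation=(float(x), float(y), float(z)))

            # ...find the character's old cached animation curves and attach
            # them to the new rig's controls here...
```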

Learn Python

If you're interested in learning more about Python, why not sign up for my course here?

This course takes you from learning the fundamentals of Python, to creating tools with advanced user interfaces, all while creating projects you can use immediately.
It will teach you many of the skills necessary to create tools like the ones in this article.

With over 700 students in its first week, including artists from major studios, it is sure to provide valuable knowledge to everyone.



Python For Maya: Artist Friendly Programming - $70 (regular price $90)

 

If you sign up from this link, it will save you $20 off the retail price using the coupon code: DGOVILCOM

Buy The Movie or Art Book

 

The movie is a good one for the kids, but the art book has some really gorgeous work in there from Genndy himself. If you’re interested, click on any of these Amazon Affiliate Links




Buy The Blu-Ray

 

 



Buy the Art Book

Working on The Amazing Spider-Man 2

In Part 3 of this blog series, I’ll be covering what it was like to work on The Amazing Spider-Man 2.

This is the 5th Spider-Man feature film from Sony, and is part of the reboot/alternate universe starring Andrew Garfield as Spidey.

This project was a last-minute one for me. Imageworks was light on work post Cloudy 2 and I was about to be let go, but a last-minute shift in plans on another show meant I got to stay. Fortunately, I’ve never been in that position since.

If you’re interested in Layout, or cinematography, check out these two great books! (Affiliate Links)


Setting The Scene
Primarily about 2D animation, but a fantastic resource on traditional layout.


Framed Ink
One of my favorite books about composition

Layout and Pipeline: Doing Double Duty

For a lot of our Visual Effects features, we tend to combine the Layout and Pipeline departments.
This is because layout on these shows can become quite technical and it’s to our advantage to have them combined.

Other studios do similar things, for example some studios combine their matchmove and layout departments.

Fortunately, I had been working in Layout at Rhythm and Hues, and these skills came in very handy for this.

I ended up doing a lot of layout on this show, and in fact the opening shots in that trailer are all mine. It really helped having both the artistic and technical grounding, because it let me work more efficiently than I normally could have.

What is Layout?

Layout may be a term unfamiliar to some of the people reading this.

We essentially are the equivalent of a virtual cinematographer. Our job involves:

  • Handling all camera motion
  • Staging the scene by placing the environment pieces and characters
  • Prelighting (depends on the studio)
  • Continuity of sequences

We basically take the storyboards, or increasingly the previs, and recreate them with our actual sets and characters.

For example, in the trailer above, the very first shot of him falling into the city is mine.
I animated the camera, placed the buildings in their respective positions and set up the pacing of the shot by roughly animating Spider-Man.

This is then handed off to an animator to actually animate it properly.

Tools Built With Python

Since I was more preoccupied with layout on this show, I didn’t build a ton of tools, but there were a few that came in handy.

Quick Layout

The final battle sequence with Electro required us to work in a very heavy environment where we had to swap out set pieces for versions in various stages of destruction.

For example, if Electro destroys a pylon, we then need to make sure that it stays destroyed in all the other shots after that.

Since they were really wrecking this environment, I built a simple UI using Python and PyQt that let all the layout artists (including me), simply choose the state of predefined elements in the scene.

Each element had a group of radio buttons that let them decide whether it was intact, partially destroyed, fully destroyed, etc.
This saved a ton of time and reduced potential errors, because layout artists didn’t need to manually find the assets and swap them out; they could just click a few buttons and, BAM!, they were done.
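A stripped-down version of that UI might look like the sketch below: one group of radio buttons per set piece, and an Apply button that hands each choice off to whatever swaps the asset. The element names, state list and swap function are made up for illustration, and it’s written against PyQt4, which is what we were using at the time.

```python
from PyQt4 import QtGui

STATES = ["Intact", "Partially Destroyed", "Fully Destroyed"]
ELEMENTS = ["pylon_01", "bridge_section_a", "transformer_02"]  # example set pieces


class SetStateUI(QtGui.QWidget):
    def __init__(self, parent=None):
        super(SetStateUI, self).__init__(parent)
        layout = QtGui.QVBoxLayout(self)
        self.groups = {}

        for element in ELEMENTS:
            box = QtGui.QGroupBox(element)
            box_layout = QtGui.QHBoxLayout(box)
            group = QtGui.QButtonGroup(box)
            for index, state in enumerate(STATES):
                button = QtGui.QRadioButton(state)
                if index == 0:
                    button.setChecked(True)
                group.addButton(button, index)
                box_layout.addWidget(button)
            self.groups[element] = group
            layout.addWidget(box)

        apply_button = QtGui.QPushButton("Apply")
        apply_button.clicked.connect(self.apply_states)
        layout.addWidget(apply_button)

    def apply_states(self):
        for element, group in self.groups.items():
            state = STATES[group.checkedId()]
            swap_element_state(element, state)


def swap_element_state(element, state):
    # Placeholder for the pipeline call that references the right version of
    # the asset (intact / partially destroyed / fully destroyed).
    print("%s -> %s" % (element, state))


# Usage inside the host application (roughly):
# window = SetStateUI(); window.show()
```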

 

A sequence that required a lot of layout work. Warning: Spoiler alert!

Lens Distortion

In Visual Effects movies, because we are shooting through a real camera lens, we pick up the imperfections of these lenses. Most importantly, we pick up the lens distortion.

This is non-ideal for us as our CG is undistorted, so when we ingest the plates, we use a calibration profile to undistort the images. This lets us work against a flattened plate.
When we output our final images, we then redistort them back to match the original camera.

However, clients are increasingly adamant that we present everything with the distortion, even earlier on in the process.
Even our animation playblasts need distortion these days, but those are just simple OpenGL captures from the viewport.

I set up a quick Python script that would do the following:

  • Find the lens distortion used by the shot
  • Generate a Nuke file that would read our images
  • Write out these newly distorted images

It’s pretty simple to do, and something a lot of shows are now using, even the animated features like Storks.
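The script itself is only a handful of nodes when sketched with Nuke’s Python API. The LensDistortion node below is just a stand-in for whatever distortion node and calibration profile a show actually uses, and loading that profile is left as a comment.

```python
import nuke


def redistort_playblast(input_path, output_path, first, last):
    """Read an undistorted image sequence, redistort it and write it back out."""
    read = nuke.nodes.Read(file=input_path, first=first, last=last)

    distort = nuke.nodes.LensDistortion()  # stand-in for the shot's distortion node
    distort.setInput(0, read)
    # ...load the shot's calibrated distortion profile onto the node here...

    write = nuke.nodes.Write(file=output_path)
    write.setInput(0, distort)

    nuke.execute(write, first, last)


# Example (paths and frame range are hypothetical):
# redistort_playblast("/shots/sq10/playblast.%04d.jpg",
#                     "/shots/sq10/distorted.%04d.jpg", 1001, 1096)
```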

Learn Python

If you're interested in learning more about Python, why not sign up for my course here?

This course takes you from learning the fundamentals of Python, to creating tools with advanced user interfaces, all while creating projects you can use immediately.
It will teach you many of the skills necessary to create tools like the ones in this article.

With over 700 students in its first week, including artists from major studios, it is sure to provide valuable knowledge to everyone.



Python For Maya: Artist Friendly Programming - $70 (regular price $90)

 

If you sign up from this link, it will save you $20 off the retail price using the coupon code: DGOVILCOM

Buy the BluRay

The movie was enjoyable. It’s definitely not one of the great superhero movies, but I enjoyed it as a Spider-Man fan. (Also, I got to do the shot that finishes one of Spider-Man’s most iconic story lines.)

 

Affiliate Link

 

Working on Cloudy With a Chance of Meatballs 2

In this second part of my blog series where I go over projects that I’ve worked on, with a focus on how I used Python, I’ll be analyzing Cloudy With A Chance of Meatballs 2.
This was my first animated feature film, and my first film at Sony Pictures Imageworks.

Before I continue I’d like to give a little history.
I’d just left Rhythm & Hues as my contract was expiring. Rhythm wanted to extend it, but Sony simply had a better offer: a more stable job, much higher pay, and the chance to work on the sequel to one of my favorite animated films.

I was hesitant to leave because Rhythm had been a great gig, but the opportunity was too good to pass up. In hindsight, this was a great decision, because only a few weeks later Rhythm fatefully filed for bankruptcy.

So begins my journey as a Pipeline TD, having transitioned from being a layout artist at Rhythm.
Imageworks had taken a chance hiring me, and so far it looks to be one that’s worked out.

Animation vs Visual Effects Films

Sony Pictures Imageworks is unique in that it’s one of the few studios that works on Animated Features as well as Visual Effects.
Seeing as I was changing from a visual effects film to my first animated feature, there were many differences to take note of.

Pros of Animation

Animated features have a lot going for them, and there’s a reason why many artists try and work on them.

  • It’s so much more relaxing, with a slower pace and less overtime.
  • There is no client, or rather, the client is on the same team as you.
    They understand better the struggles of creating the imagery because they’re in the trenches with you, and there are fewer mad crunch times.
  • Teams are larger. Just the Animation department alone can eclipse the size of an entire visual effects team.
    This means work is more spread out and crunch time is easier to deal with.
  • You can deal with tasks on a sequence level rather than a shot level most of the time.
    This is because entire sequences are cut from one source, whereas in VFX films, each shot is its own beast.
  • You really get to feel like you’re crafting the movie. Even in Pipeline, you can have some influence over the final result, rather than in Visual Effects where you often feel like a cog in the machine.

Cons of Animation

It’s not always peaches and sunshine though. There are some downsides to it as well.

  • You work on the project for much longer. It can get quite boring seeing the same shot on your screen two years later.
  • Teams are significantly larger, which means you don’t form as close bonds with your coworkers, and communication can be a real challenge. The show is a giant lumbering machine rather than an agile one.
  • As a Pipeline TD, there are fewer chances to do something really cool: the teams are so much larger that tasks are shared around a lot, and you may have little to do.
  • There’s less of a cool factor. You’re often relegated to working on just a “kids film”. The Visual Effects films are the ones that often get the oohs and ahs.

Similarities

At the end of the day though, it’s not really all that different:

  • Our pipeline at Imageworks is largely shared between Visual Effects and Animated movies. This means for the most part, you don’t have to consider them different at all.
  • Often you still have focus tests, marketing etc… on both that require crunch time. It’s not always smooth sailing, and I never go into a project thinking it’s going to be easy.

Python Tools for Cloudy With a Chance of Meatballs 2

There were quite a few major tools that I made for this show using Python. I’ll go over them here.

Deep Compositing for Animators

Cloudy 2 had a lot, and I mean a lot, of background characters.
This meant that shots couldn’t just be animated by a single artist and often had to be split up between multiple animators just to get them done in a realistic amount of time.

We have some great crowd tools that let us instance animation around the scene, but for many of these shots we needed unique, hero animation for (in some cases) 100+ characters in a shot.

To help with this, I came up with a tool that takes our playblasts (OpenGL captures from the animators’ scenes) alongside a depth output, and then uses them inside Nuke to combine everything by depth.
This is a fairly rudimentary use of deep compositing, but it’s quick, effective, and animators can see the combined results of their scenes in under five minutes.

Since playblasts are a natural byproduct of animators working, there was no overhead other than enabling depth write outs for all their playblasts if certain criteria were met.

This can go even further though. Using the same depth compositing, we can bring the data right back into Maya as an image plane.
Maya’s viewport supports a single depth-composited image plane. This means an animator can bring in either a single playblast or a combined output, and put it on an image plane.

From the shot camera, this 2D image is now properly composited in depth, and you can move around the objects in the image as if they’re in the scene. It’s really quite cool to see.

Again, this process requires very little extra data, and no new workflows for the animators. It just provides a very natural way to get quick, iterative feedback on their scenes.
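In Nuke terms, the combine step boils down to something like this sketch: read each playblast (which carries a depth channel), promote it to a deep image, merge, and flatten back out. The node choices here (DeepFromImage, DeepMerge, DeepToImage) are one way to express the idea in vanilla Nuke, not necessarily how our in-house template did it.

```python
import nuke


def combine_playblasts(playblast_paths, output_path, first, last):
    """Depth-combine several playblasts (each with a depth channel) into one movie."""
    deep_merge = nuke.nodes.DeepMerge()
    for index, path in enumerate(playblast_paths):
        read = nuke.nodes.Read(file=path, first=first, last=last)
        # Promote the flat image (colour + depth channel) to a deep image.
        deep = nuke.nodes.DeepFromImage()
        deep.setInput(0, read)
        deep_merge.setInput(index, deep)

    flat = nuke.nodes.DeepToImage()
    flat.setInput(0, deep_merge)

    write = nuke.nodes.Write(file=output_path)
    write.setInput(0, flat)
    nuke.execute(write, first, last)
```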

Here’s a video that goes over Deep Compositing on Planet of the Apes.
I didn’t work on this but it’s one of the best videos describing it.

This shot is an example of where we used the deep compositing, but also where we used the texture variation tool.
When Chester shows off his giant screens or his candy bars, each one is the same geometry and animators could pick what to show on them.

Texture Variations

Throughout the course of the movie we’d make constant reuse of the same geometry but with varying textures.
Traditionally, lighting would just choose which texture they wanted, but for Cloudy 2 we wanted animation to have control over it, because the textures fed into gags in the shots.
Rather than make these rigged assets or anything complex, we decided to keep it simple.

I built a tool that would show the animators any available textures for their assets, let them select which one they wanted, and then apply it. They could do this for several objects at once.
Once they chose the textures, the choice would be tagged onto the geometry as an attribute that was picked up by the lighting template, so lighters didn’t even have to give it a second thought.

We used this for a lot of objects, from candy bars to ships to random objects in the scene that needed a little breakup.
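The tagging half of that tool is tiny. Something like the sketch below stamps the chosen variant onto the geometry as a string attribute for the lighting template to read later; the attribute name and variant are invented for the example.

```python
from maya import cmds

VARIANT_ATTR = "textureVariant"  # hypothetical attribute name


def apply_texture_variant(nodes, variant_name):
    """Tag each node with the chosen texture variant as a string attribute."""
    for node in nodes:
        if not cmds.attributeQuery(VARIANT_ATTR, node=node, exists=True):
            cmds.addAttr(node, longName=VARIANT_ATTR, dataType="string")
        cmds.setAttr("%s.%s" % (node, VARIANT_ATTR), variant_name, type="string")


# Example: tag whatever is selected as the "grape" candy bar variant.
# apply_texture_variant(cmds.ls(selection=True), "grape")
```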

Sorting Characters In The Scene

So not all of the tools we build on a show are this complex.

An example of a simpler tool I built that was pretty useful involved a stadium scene in the movie, where we had hundreds of characters that needed to be organized into sections.

This was a simple system of:

  • Get a list of all the characters in the scene.
  • Find their x,y,z positions in the world.
  • Sort them into sections based on the seats around them and their position.

Like I said, something really simple but even that can prove to be really useful in production.
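A toy version of that sort might look like the following, with the section logic reduced to simple compass quadrants around the stadium center (the real seating logic was show-specific and is only hinted at here).

```python
import math
from maya import cmds


def sort_into_sections(character_roots, center=(0.0, 0.0, 0.0)):
    """Bin character roots into named sections based on their world position."""
    sections = {"north": [], "east": [], "south": [], "west": []}
    for root in character_roots:
        x, _, z = cmds.xform(root, query=True, worldSpace=True, translation=True)
        angle = math.degrees(math.atan2(x - center[0], z - center[2])) % 360
        if angle < 45 or angle >= 315:
            sections["north"].append(root)
        elif angle < 135:
            sections["east"].append(root)
        elif angle < 225:
            sections["south"].append(root)
        else:
            sections["west"].append(root)
    return sections
```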

Don’t really have an appropriate image for this one. So here’s a gif instead!

This is Pyblish, a similar publishing tool to the one we use in production.
There’s no shared code, but the fundamental design is similar. Developed by Marcus Ottosson.
Check it out here:

https://github.com/pyblish/pyblish-qml

Publishing Frontend

Like most studios, Imageworks has a very well defined publishing system to get data from one department to another.

Unfortunately, while the backend of our system was very well defined, the frontend system that was exposed to the artists was not.

The publishing flow consists of a few basic steps:

  • Artists select which assets they want to publish
  • They can configure a few options
  • The tool runs some validation tests
  • It then publishes the scene once all tests have passed

This gives us a reasonable safeguard against bad data making it to the next department, and lets us catch issues early.

Our old framework had a good underlying design, but the implementation made it very unfriendly for artists, and also really hard to maintain and to add new tests to. Additionally, a lot of it was in MEL.

So a coworker and I were tasked with coming up with a new framework, built from the ground up in Python.
We’d still use the same backend for publishing on our computer farm, but the frontend would be much more artist friendly and make it much easier for a TD on a show to add tests.
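Conceptually the frontend’s core loop is small, and a toy version of it looks something like the sketch below: show TDs register validator classes, the artist’s selection runs through them, and the publish only fires when every check passes. The class names and the publish hand-off are illustrative, not our actual API.

```python
# Toy publish loop: run every registered validator, only publish if all pass.

class Validator(object):
    label = "unnamed check"

    def check(self, assets, options):
        """Return a list of error strings; an empty list means the check passed."""
        raise NotImplementedError


class NoEmptySelection(Validator):
    label = "Something is selected"

    def check(self, assets, options):
        return [] if assets else ["Nothing selected to publish."]


def run_publish(assets, options, validators, publish_func):
    errors = []
    for validator in validators:
        for problem in validator.check(assets, options):
            errors.append("[%s] %s" % (validator.label, problem))

    if errors:
        # In the real tool these would show up in the UI next to each check.
        for error in errors:
            print(error)
        return False

    publish_func(assets, options)  # hand off to the backend/farm publish
    return True
```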

 

We’d built this towards the end of Cloudy 2, and we decided to beta test it on the ill-fated test for Popeye.

Learn Python

If you're interested in learning more about Python, why not sign up for my course here?

This course takes you from learning the fundamentals of Python, to creating tools with advanced user interfaces, all while creating projects you can use immediately.
It will teach you many of the skills necessary to create tools like the ones in this article.

With over 700 students in its first week, including artists from major studios, it is sure to provide valuable knowledge to everyone.



Python For Maya: Artist Friendly Programming - $70 (regular price $90)

 

If you sign up from this link, it will save you $20 off the retail price using the coupon code: DGOVILCOM

Buy the Movie and Artbook

The movie is pretty darn funny, and the art book has some incredibly gorgeous art behind it.


Buy the BluRay

Buy the Artbook

The links are affiliate links that will take you to your local Amazon store.

Working on Percy Jackson

I’m starting a new series of blog posts, where I basically do a post-mortem analysis of the projects I’ve worked on and show the kind of things that can be built using just Python.

Today’s blog post will be about Percy Jackson and the Sea of Monsters, which I worked on at Rhythm & Hues. This was my very first feature film, and it was a great experience because it was like being thrown into the deep end immediately. Having to learn to swim on this show was the best learning experience I’ve had.

I had taught myself a little Python a few months before this job, but it was on this show that I really learned it and decided to leverage it.

Embracing Linux

Up till this point I’d mostly been a Windows and macOS user. I’d only used Linux once as a kid, when I decided to dual boot and then realized I hated using it as a daily driver.

Unfortunately, like most major studios, Rhythm used Linux, and it’s not just Linux as an OS: you do almost everything from the command line.

 

There were no fancy desktop icons to launch the apps. You had to know how to type the commands in your Terminal and navigate to directories and files all via text. This was such a drastic change from what I was used to.

Fortunately it’s actually not that hard and you can get into it pretty quickly. Now I pretty much use the command line even when there are UI equivalents, just because I’m faster in it, and don’t have to take my hands off of my keyboard.

Another thing to note is that many studios use tcsh and not bash. This is important because bash is way easier, but oh well.

Resources

A good resource we were given to learn the command line was this book: The Linux Command Line. You can grab a free download here or buy the actual book here: The Linux Command Line Affiliate Link


Voodoo and Parsley

Rhythm had their own proprietary animation software, which had its own proprietary scripting language.
So after spending many years learning various 3D applications, I was now faced with one that you literally could not train for without joining the studio, and similarly a language that had never seen the light of day outside of those walls.

Voodoo was a delight to use. It was limited in what it could do; similar to Pixar’s Presto and DreamWorks’ Premo, it could only be used for rigging, animation, some basic lighting and matchmove. You literally could not model inside it beyond creating a cube or a sphere. But the things it could do, it did incredibly well. It was a fantastic animation tool, clearly built from the ground up for the artists at Rhythm.

Parsley was their in-house scripting language. It’s pretty similar to MEL mixed with a bit of BASIC. It wasn’t hard to learn, but again, it was a bit more of a challenge than other languages because you can’t just go online to look for help.

So how do you prepare to start a job with a software package you’ve never even heard of before?

Well here are my tips:

  • When you learn any 3D package, don’t just learn where the buttons are and what they do. Instead, learn the fundamentals of what you’re doing. This will let you adapt quickly to other applications, because once you understand the fundamentals, it doesn’t matter where the features live in an application; it’s easy to get it to do what you want. Many people make this mistake and are utterly confused when they have to change applications.
  • Learn more than one 3D program and more than one programming language. Try to do the same thing in each of them: make a simple animation or model in both applications, write a simple script in both languages. This way you really force yourself to abstract the concepts from their implementation.
  • Don’t be intimidated. There are people around you at work who are there to help you, not to judge you. Try to figure it out, but if after 15 minutes you cannot, then ask somebody. Nobody benefits from you staying confused, and it’s better to show the initiative to learn it now than to be struggling with it later on.

 

 

Python Tools For Percy Jackson

Here are a few of the tools I built for the show using Python.
I was still fairly new to Python at this time, so the tools were nothing to write home about, and I’ll just give a brief overview of them here.

Python for Voodoo

This was a huge hack, but I tried it out as a proof of concept.

The pipeline team at Rhythm had a way to send Parsley commands to Voodoo over a network port. I used that to create a wrapper for every single Parsley command that let you use Voodoo straight from Python. When you initialized the module, it would connect to the Voodoo session you initialized it from, and then any commands issued would call out to Voodoo and convert values between Parsley and Python types.

Performance sucked, but it was a cool proof of concept, and it could dynamically update when new Parsley commands were added.
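The wrapper itself was conceptually simple, and a rough sketch of the idea looks like this. The port, wire protocol and command formatting below are invented; the real hook was Rhythm’s internal Parsley-over-the-network mechanism.

```python
import socket


class VoodooSession(object):
    """Thin client that forwards Parsley commands to a running Voodoo session."""

    def __init__(self, host="localhost", port=9000):
        self.sock = socket.create_connection((host, port))

    def send(self, command):
        """Send one Parsley command and return Voodoo's raw text reply."""
        self.sock.sendall(command.encode("utf-8") + b"\n")
        return self.sock.recv(65536).decode("utf-8").strip()

    def __getattr__(self, name):
        # Expose every Parsley command as a Python method, so
        # session.select("hero") sends: select "hero"
        def wrapper(*args):
            formatted = " ".join('"%s"' % arg if isinstance(arg, str) else str(arg)
                                 for arg in args)
            return self.send("%s %s" % (name, formatted))
        return wrapper


# Usage (hypothetical Parsley command):
# voodoo = VoodooSession()
# voodoo.select("hero")
```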

Ticket Submitter

At Rhythm, we had a program called JobTracker that was used to track all our tasks and tickets. It was pretty great, but one limitation was that I could not submit tickets on behalf of other people. I was often getting messaged with requests and needed a way to submit tickets for specific users to speed up my day.

I built a custom UI using PyQt4 and the JobTracker API that let me submit tickets quickly, and it ended up saving me tons of time: rather than asking people to submit a ticket and then waiting, I could just do it myself much quicker.

Reference Reorder

Voodoo used its own really amazing referencing system, where each reference could override the previous one. This was incredibly powerful, but it meant you sometimes needed to reorder the references manually.

This used to either require a little Parsley inside Voodoo or editing the Voodoo ASCII file manually.

I built a PyQt UI that would let you reorder the references in the ASCII file, so you could quickly run through all these Voodoo files without opening them.

Parsley IDE

We didn’t really have an editor for Parsley. We had syntax highlighting inside an editor, but no real tools other than that. There was nothing like PyCharm that would show you errors or give suggestions on how to fix the code.

I built a development environment using PyQt to help us edit Parsley. It was rudimentary but it would do a live analysis of the script and build a DOM that could then be analyzed to see where errors were happening.

Learn Python

If you're interested in learning more about Python, why not sign up for my course here?

This course takes you from learning the fundamentals of Python, to creating tools with advanced user interfaces, all while creating projects you can use immediately.
It will teach you many of the skills necessary to create tools like the ones in this article.

With over 700 students in its first week, including artists from major studios, it is sure to provide valuable knowledge to everyone.



Python For Maya: Artist Friendly Programming - $70 (regular price $90)

 

If you sign up from this link, it will save you $20 off the retail price using the coupon code: DGOVILCOM

Buy The Movie

If you’re interested in Percy Jackson and the Sea of Monsters (terrible film…don’t recommend it), you can buy the movie here:

 

Amazon Affiliate Link

 
