
Supporting Roles: Spiderverse +

Dhruv Govil
December 14th, 2018 · 6 min read

This post was originally part of my series on Python for Feature Film on my personal site, but is being ported here with minor changes. The dates have been adjusted to match release dates for the projects.

In this supplemental post, Part 9 of this series on Python for Maya, I cover the work I contributed to other shows in production. Often, even though I wasn't part of a show, I'd be responsible for maintaining or developing tools that those shows would use, and generally I just enjoy helping out.

The reason I want to highlight these is that I want to encourage TDs to stick their noses into other projects if they can. There's a lot to be gained from collaborating, even when you might not be credited. Don't be belligerent, but also don't just keep your nose in your own projects.

Even if you personally can’t offer anything on those projects, you may see that they’re doing something cool that you can leverage for your own project or collaborate on towards a more general solution.

This will also be my final post on the technology developed for films I’ve worked on, since I’ve left to join the tech world to work on Augmented Reality.

However, I will continue to post about films that I haven’t worked on and try and provide insight on how technology and art can benefit each other.


Spider-Man: Into the Spiderverse

Spider-Verse was the rockstar show that everyone was vying to be on. It's probably the only show I regret not having a bigger role on, but I feel like I was still able to contribute in meaningful ways.

A trailer for Spider-Man: Into the Spiderverse

Let's go over some of the ways a TD can help on shows they're not crewed on.

Maintaining Old Tools

On Cloudy With a Chance of Meatballs 2, I’d written a tool called SplitComp that let artists collaborate with deep compositing tools.

This was also embraced by the cloth department, allowing multiple artists to simulate characters in a shot and then see their work merged with the background.

Spider-Verse had exposed a lot of flaws in the original tool, especially regarding color science, and it took a considerable amount of work to update it for this show. This was also tricky because their code was a fork of the original tool's code, so it hadn't been benefiting from the extensive work that other developers had put into it.

A cloth simulation of Miles using clothzcomp

Writing New Tools

At Imageworks, we had a really terrific Quality Control system for shots. Every shot would have publish renders that would highlight visual discrepancies (things like motion blur previews, lighting and so on), so we could approve or fix shots accordingly.

The issue, though, was that this system was collapsing under years of hacks that made it super slow to use. Some shots would take 10 minutes or more for a single query. That's unacceptable when you have the most important supervisors all in a room, waiting to get through many shots in a day.

We were literally burning money.

So I ended up rewriting the entire system from scratch:

  • A new database using Postgres. The trick here was using locally generated UUIDs as both primary keys and relational keys, which lowered the load on the database while keeping lots of relational data in there and made it super fast to query (there's a rough sketch of this after the list).
  • A new Python server and IDL based on ZeroC Ice. Where the old system made direct database queries, this provided a layer of abstraction, so the new system could optimize and cache queries. All of the queries were written using SQLAlchemy, which made it easy for TDs who weren't familiar with SQL to write efficient queries.
  • A Python based command line and GUI. These were faster, provided more intuitive APIs and were easier to use.
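
To make the first two points a bit more concrete, here's a rough sketch of what client-generated UUID primary keys look like with SQLAlchemy on Postgres. The Shot and Render tables below are made-up examples rather than our actual schema, but the idea is the same: the keys come from uuid4() on the client, so related rows can be built up without ever asking the database for an ID.

```python
# A minimal sketch of client-generated UUID primary keys with SQLAlchemy on
# Postgres. The Shot and Render tables are hypothetical examples, not the
# actual Imageworks schema.
import uuid

from sqlalchemy import Column, ForeignKey, String, create_engine
from sqlalchemy.dialects.postgresql import UUID
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import relationship, sessionmaker

Base = declarative_base()


class Shot(Base):
    __tablename__ = "shots"

    # Keys are generated locally with uuid4(), so rows and their relationships
    # can be created before ever talking to the database.
    id = Column(UUID(as_uuid=True), primary_key=True, default=uuid.uuid4)
    project = Column(String, nullable=False)
    name = Column(String, nullable=False)

    renders = relationship("Render", back_populates="shot")


class Render(Base):
    __tablename__ = "renders"

    id = Column(UUID(as_uuid=True), primary_key=True, default=uuid.uuid4)
    shot_id = Column(UUID(as_uuid=True), ForeignKey("shots.id"), nullable=False)
    status = Column(String, default="pending")

    shot = relationship("Shot", back_populates="renders")


if __name__ == "__main__":
    engine = create_engine("postgresql://user:password@localhost/qc")
    Base.metadata.create_all(engine)
    session = sessionmaker(bind=engine)()

    # Both rows already have their keys, so the relationship is valid
    # immediately and everything can be written in one transaction.
    shot = Shot(project="spiderverse", name="sq100_sh010")
    shot.renders.append(Render(status="approved"))
    session.add(shot)
    session.commit()
```

Because the client never has to round-trip to the database just to learn an ID, batches of related rows can be written in a single transaction, which is a big part of what kept the load on the database low.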

Spider-Verse was one of the first shows to adopt this, but today every show at Imageworks uses this system.

Queries that used to take 10 minutes were now taking milliseconds. It was so fast that we had to introduce visual elements to affirm to people that the query had in fact happened.

In addition, I developed several new review tools on top of this system so that Spider-Verse and Smallfoot could leave notes for each other.

Ironically, the older version of the system had so much cruft that it hid its biggest issue: it was querying every project from the database and then filtering after the fact. This meant that every show would get progressively slower.
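
To make that concrete, here's a hypothetical version of the two query paths, reusing the made-up Shot model from the earlier sketch. The real code looked nothing like this, but the shape of the mistake was the same:

```python
# Hypothetical illustration of the hidden bug: both functions return the same
# shots, but the first drags every show's rows over the wire and filters in
# Python, so it gets slower with every new project in the database.
def get_shots_slow(session, project_name):
    all_shots = session.query(Shot).all()  # fetches everything in the studio
    return [s for s in all_shots if s.project == project_name]


def get_shots_fast(session, project_name):
    # Let Postgres do the filtering so only the relevant rows come back.
    return session.query(Shot).filter(Shot.project == project_name).all()
```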

So the biggest lesson here is to make sure your code is clean so you can detect weird logic bugs like that.

Provide Guidance

One of the easiest things you can do for other shows is provide guidance. No one in the studio knows every aspect of the pipeline, and it's often the case that the expert on something isn't on the show that needs them.

So reach out and offer your expertise, and the TDs on that show will do the same. Everyone wins.

For example, the Spider-Verse team was doing a lot of work optimizing the crowd systems, which other shows benefited from as well. However, they'd hit limits on querying assets from the server, and since it was something we'd hit elsewhere, we were able to help them recover.

The Spider-Verse team did amazing work with the effects on Spider-Verse


Love, Death and Robots: Lucky 13

This was such an awesome show. Our studio was responsible for Lucky 13.

While this show was going on, a friend and I were trying to develop a Virtual Production setup on the mocap stage that we’d set up for Spider-Man: Homecoming.

A trailer for Love, Death and Robots

So we ended up building a system that used a shoulder-mounted camera rig with a viewer and visual tracking markers. Daniela developed a plugin to sync the Vicon tracking straight into Maya, and we could stream in mocap as well.

This allowed us to let the director move around the shot virtually and get the exact camera moves that they wanted.
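
I can't share the actual plugin, but conceptually the Maya side boils down to applying each incoming tracking sample to a camera's transform. Here's a heavily simplified sketch, where receive_tracking_sample() is a stand-in for whatever actually reads from the tracking system:

```python
# Heavily simplified sketch of driving a Maya camera from streamed tracking
# data. receive_tracking_sample() is a placeholder for the real data source
# (e.g. a socket fed by the tracking system); the actual plugin did far more.
import maya.cmds as cmds


def receive_tracking_sample():
    """Placeholder: return (tx, ty, tz, rx, ry, rz) for the latest sample."""
    raise NotImplementedError


def create_virtual_camera(name="virtualCamera"):
    # cmds.camera returns the transform and shape nodes; we drive the transform.
    transform, _shape = cmds.camera()
    return cmds.rename(transform, name)


def apply_sample(camera, sample):
    tx, ty, tz, rx, ry, rz = sample
    cmds.xform(camera, worldSpace=True,
               translation=(tx, ty, tz),
               rotation=(rx, ry, rz))


def stream_into_maya(num_samples=1000):
    camera = create_virtual_camera()
    for _ in range(num_samples):
        apply_sample(camera, receive_tracking_sample())
        cmds.refresh()  # redraw the viewport so the operator sees the move live
```

In practice the data arrives continuously and needs calibration and filtering, but the core of it really is just driving a camera transform from an external source.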

Unfortunately, this was the only show we could use it on, as we couldn't get the budget to continue further. That's a shame in hindsight, because Virtual Production has really taken off with shows like The Mandalorian.

Behind-the-scenes breakdowns of the team's great work on this project

This show also needed a few other tooling enhancements, like updating the Dynamic Origin tool from Hotel Transylvania to support the ships on this project. Reacting to the requests from this show meant that we enhanced the workflow on other shows as well, including our own. So again, it pays to be involved with other projects if possible.


Over the Moon

This is another project that could have been a really good target for Virtual Production.

The legendary animator Glenn Keane was directing his first feature film. He was also a fan of Tilt Brush and wanted to be able to leave annotations in VR.

A trailer for Over The Moon

So another developer and myself started taking on this exploration.

Using a Vive, Unreal Engine and Tilt Brush, we started building out a proof of concept where we could bring in our 3D scenes and allow the director to annotate and draw into the space.

There was a lot of cool stuff here, with directors being able to inhabit the space they would be crafting.

This was a project that neither of us saw finished, as we both left after a while, but it was really cool to set up the basic tooling for this and collaborate with some amazing people.


What is Virtual Production?

So for both Love, Death and Robots and Over The Moon, we were verging into the territory of Virtual Production.

Virtual Production is all the rage right now due to the success of The Mandalorian, where ILM employed their StageCraft setup to create many environments virtually and have them rendered on set on giant LED walls.


Virtual Production itself isn't new; many studios have used it since The Lord of the Rings and even before. It is effectively the merging of digital production tools into the on-set shooting experience.

This has taken many forms:

  • Tracked cameras with a live render on a screen showing what you would see in the virtual world
  • Rendering the virtual world onto giant screens to integrate it into the shoot
  • Allowing directors to direct or provide feedback in Virtual Reality.

What has happened recently is that:

  • Engines like Unreal Engine and Unity are capable of rendering much higher fidelity images
  • We have had a dramatic increase in rendering power with modern GPUs
  • There’s been a lot more collaboration between offline and realtime companies, leading to better workflows between the two.
  • Large displays have gotten cheaper and have much higher pixel density and brightness than before
  • VR devices like the Vive have made low budget tracking solutions possible
  • Workflows in general have become more accessible to people without crazy expensive hardware

The combination of all of these factors has led to the proliferation of Virtual Production.
