
Setting Up a Shot

18 March, 2015 |  13 Comments | by Pablo Vazquez


Pablo here. Sarah and I have been using our Latino powers on the jungle world pretty much since the beginning of the project. But for such a small team, it doesn’t really work so well to divide the work between two settings: the jungle and sheep island. So since last Monday the full team has been going full steam ahead with the island world.

Since Sarah moved to the animation department, almighty Andy is doing some epic rainbow-sheepnado tests while keeping an eye on Manu and myself as we set up the lighting of the shots. Luckily many of the shots in the first act happen in the same environment and lighting conditions, so setting this up should be a breeze! Right? Well, think of it as a breeze in a washing machine-looking tornado with the occasional stray pair of underwear slamming into your face.

The Workflow

The workflow for preparing shots has been improving over the last few projects, but it’s still not something you just sit down and execute. There’s a lot of thinking and planning behind everything. Back in 2006, while working on the feature film “Plumiferos”, I learned a lot about pipelines. Some of that I brought in and implemented during Sintel, then tweaked for Caminandes.

Blender’s library linking system is great, so much so that it can easily be abused.

Since we are a small team, we kinda need to do a bit of everything. This is more or less how it goes:

  1. Layout
    The layout artist (in our case, the director) makes the shotnumber.layout.blend file with a camera and simple models, based on the storyboards. The characters used here should be linked (and their rigs made proxy for posing). In their low-resolution form, if available.
  2. Animation
    Time for the animator to take over. They take this file and save it as shotnumber.anim.blend. Since nothing has changed, the characters are still linked. That’s good. Any specific model that’s needed for this scene is brought in and linked as well. Time to animate!
  3. Lighting
    Time to shine! Even when the animation has just begun (or before! — the pipeline should allow this), we create the shotnumber.comp.blend file. This could even be made from scratch, but it’s just easier to just open the .anim file and save it as .comp. This way the characters are already linked. But the actions are local, which is not good; we need to fix this. So we link the actions from the .anim file. Then bring the high-resolution models from their respective libraries. And we’re done!



Screenshot from 01_01_02_A comp file


A new step we’re doing now is creating a ‘set’ file per scene. This file contains the high-resolution models and some basic lighting and environment setup. This is great for setting up files faster, especially when you have many shots that take place in the same environment without much change in the lighting. A copy-paster’s dream!

For example, the new ‘set’ file lives at the scene level of the project hierarchy, one step above the individual shot files (.layout/.anim/.comp).
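A hypothetical project layout (the directory names are illustrative; only the shot/stage naming follows the convention described above) might look like this:

```
project/
  scenes/
    01_01/                          <- scene level
      01_01.set.blend               <- the new ‘set’ file
      shots/
        01_01_02_A/
          01_01_02_A.layout.blend
          01_01_02_A.anim.blend
          01_01_02_A.comp.blend
```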

In the shotnumber.comp.blend file, we append the lights from the set file (so we can tweak them easily), as well as the world settings. We then move the lights around and do *very* minimal compositing on them — no color corrections or anything. This should look good enough out of the box. It’s very tempting to go and add a vignette, bloom, lens distortion, or give it a nice look using color management tools, but that’s for later.

And that’s it! The shot is ready for the farm! …Only to find all kinds of weird glitches that need further tweaking (or even need to be fixed in Blender’s code). But that’s OK, that’s why we make open movies after all: to improve Blender.

What about simulations, you may ask? That’s yet to be solved, but it should happen somewhere after the animation phase, using mesh cache information (Alembic). For the lighting team, this would mean bringing in the caches instead of the actual character/proxy. Not much changes in terms of the pipeline. If it looks good without simulation, it will just be awesomer once simulated.


Keep in Mind…

1. Linked stuff should stay linked.

As much as possible, anyway. Making things local is OK, but keep in mind that if something changes in the original library, you need to migrate those changes by hand. Say you textured a rock, quite generic, but for a specific shot you need it to be slightly different. In this case you can make the rock local and tweak it to be awesome for that shot. But! If the art director decides all rocks should now be pink (great choice!), then you have to manually adjust the local copies of that rock. So it’s better to make things local only when you know they are at least 99% final.


2. Use overrides when you can.

Is it only a matter of making the rocks pink? Python is your friend! Leave your mouse over the setting you want to change and the tooltip should display a Python path. Go to the Python Console editor (Shift+F4), type in that path, and change it to whatever you want. It’s easier than it sounds; check out this 12-min tutorial (which I made some time ago, but is still relevant since this part of Blender hasn’t changed at all).
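Since bpy only exists inside Blender, here’s a stand-in illustration of the same idea with a plain dictionary (the material names and the `diffuse_color` values are made up):

```python
# Stand-in for bpy.data: the tooltip gives you a data path, and one
# small loop in the Python Console overrides the setting everywhere.
scene_data = {
    "materials": {
        "rock":  {"diffuse_color": (0.5, 0.5, 0.5)},
        "rock2": {"diffuse_color": (0.4, 0.4, 0.4)},
    }
}

PINK = (1.0, 0.4, 0.7)

# The equivalent of typing the tooltip's path in the Python Console
# and assigning a new value, once per rock material:
for name, mat in scene_data["materials"].items():
    if name.startswith("rock"):
        mat["diffuse_color"] = PINK  # the art director's pink, everywhere
```

Inside Blender the real line would look something like `bpy.data.materials["rock"].diffuse_color = (1.0, 0.4, 0.7)`, exactly as the tooltip spells it out.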

If you read up to here and are totally confused, that’s to be expected. It can get complicated! Hopefully, however, this small wall of text is helpful for those wondering how we do things over here. Feel free to ask questions or share your experiences in the comments below.

Have a nice day/night!

13 Responses

  1. Ambition says:

    This was fantastic, Pablo. I always love seeing the organisation & teamwork side of Blender work. For some reason you just don’t get many tutorials on that subject in particular… :P


    • Thanks Roxanne!

      Yeah this doesn’t get covered a lot. But it sort of makes sense, anybody can learn to model a house and then teach you how to do it. But learning pipeline stuff takes a lot of time making mistakes, working with a team, trying stuff, it’s a huge investment of time and money. I feel very lucky to be involved in projects like this. Especially working in open projects where you learn so much and can share absolutely everything. Fell in love with a yellow-cube-with-interesting-lighting you made? Share it!

  2. JG Loquet says:

    Great insight, thanks for sharing !

  3. Flavio says:

    That, sir, was a very instructive and interesting article.
    I must say I was very confused by how links work in Blender (coming from Maya). Lately I was completely dropping their use and just finding ways to append as much as possible (waiting for the “proxify everything” mentioned at the bcon). The fact that you can still tweak things with Python is great, even if not ideal. I just missed that. Thanks for (re)sharing that tutorial.

    I like the way you use those 3 files, and the use of the actions for the animation.
    I think this topic (pipelines for Blender) is a crucial point for any company, and there should be more of those discussions on how to set up team-work with Blender.
    Thanks for sharing !

    • Hi Flavio, thanks for writing!

      I haven’t used Maya so I don’t know how it works there, but it could be interesting to compare, since to me Blender’s way is clear (even though you stumble upon UI limitations).

      The functionality is there, I’ve been using overrides pretty much since they were added in 2009-2010. It’s mostly a UI issue, communication. It’s a bit alarming that an experienced user like you isn’t aware of these really useful kinds of things. They’re too hidden. The cookies, the gurus, and the masters should make tutorials about this!

      The 3 files are for convenience. Back when working on Plumiferos (the first feature animation movie made with Blender), we used to link a chain of up to 5 files: Camera > Set > Animation > Light > Comp. That gives a lot of freedom, since the director can change the cameras while the modelers update the set, and at the same time the animator works uninterrupted or the lighting is being done. But that was a big team; for a small group of people it’s a bit overkill. When I joined Sintel, they were using 2 files only; adding an extra one made splitting work a bit easier.


  4. Bintang Senja says:

    Thanks for sharing this Pablo. I wonder how you guys plan for compositing? If it’s just like Caminandes, you light and comp in the same file, render one frame, then start to comp, and the comp file becomes the final render. But that way I can’t see what’s happening in each frame. Or do you render each render layer with a bunch of passes to a File Output node, so later you can tweak the comp without re-rendering the shots and know exactly what happens in each frame?
    I want to know which is more efficient for this project.

    Thanks Pablo
    Have great day :)

    • Hi Bintang!

      We’ll try to do as much as possible without comp. Rendering a bunch of passes to 16/32-bit .EXR takes a lot of time/diskspace/moving stuff from farm to local via the network. It adds overhead, so it should be used only when needed.

      Sometimes a shot needs to be finished so fast that this workflow becomes kind of a luxury. Better to tweak a few things in comp and try to put as much as possible in the actual render. Cycles is very good at this (and its DoF is so beautiful/accurate you can’t get such a nice result only with Defocus/Bokeh node). We survived ED, BBB, Sintel, ToS and Caminandes this way, let’s try once more (until we get deep compositing :)

      Plus, computer render time is cheap, an artist’s time is expensive.


  5. Nico says:

    So you replace the linked files for high res models.
    That might be nice for some kind of automation in the GUI
    Set a level and pick models based upon level.
    Maybe models could be called xxxxxxx.ResX.blend
    Like : ResL ResM ResH (low medium high).

    Maybe have it as render option to say what kind of model type to use in the render.

    • Hi Nico!

      Yes, we have that, but we use groups instead. It’s easier to link/swap them, faster too, instead of reading whole new .blend files. We did this for previous projects and will do for Gooseberry too, although computers now are fast (and the viewport too), so there’s not a big need for super low res characters, just a matter of disabling SubSurf using Simplify. Once we have OpenSubdiv we will even be able to see it subdivided. But it’s not ready yet, perhaps not for Gooseberry pilot either.


      • Flavio says:

        Hi Pablo !
        Can you be more specific about that point (linking groups in Low/High quality) ?
        You say in the article “The characters used here should be linked (and their rigs made proxy for posing). In their low-resolution form, if available.”

        So I guess you have the file of the character somewhere
        Including the rig, the High version (with the Shading ?) and the low version (for animation, without shading?).
        Do you have two groups ? One with the setup+Low and one setup+High ?
        So you link to the low group. Work the animation and… and then what ?
        How do you make a clean switch ?
        Or do you link both groups but you just hide the high version ?

        Cheers ! :)

  6. Edgar says:

    Great information, thanks!

    I always wonder, how do you work with the character models’ clothes? As in, how do you bake the cloth animations/simulations per scene? As far as I understand, there is no way to access cloth physics for linked objects. Or do you make the clothes & rigs local first?

  7. Taiwofolu says:

    Good read. Thanks for the info!
