Last year I had the pleasure of working on The Commuter with Matt Tinsley, Paul Docherty and Visual Effects Supervisor Steve Begg. Here’s an interview on the excellent Art of VFX website with Steve about working on the project.
Imagine compositing a shot without seeing the result as you put it together. This is how it was for the Optical Compositing Team at ILM (Industrial Light & Magic) in the 1980s. They were often referred to as 'The Optical Dogs' as they were last in a very long line of technicians to work on a shot. One of their shots in 'Return of the Jedi', the 19th shot in the Space Battle scene (called SB19), consisted of 63 different space ships and was made up from 170 rolls of film.
SB19 in all its widescreen glory.
These rolls of film were layered over and behind one another using an Optical Printer. The Optical Printer at ILM meant the technicians could take previously shot elements and build up a new shot by exposing them on to a fresh piece of film, rewinding that film and then exposing another element on top.
Visual Effects Supervisor Ken Ralston and his Visual Effects Editor Bill Kimberlin produced the shot by first shooting all the ship elements using the Dykstraflex motion control camera. Kimberlin then choreographed the shot by projecting each of the ship elements on to a cel (a piece of transparent film) and drawing around the ships at various points to plot out their paths. This helped Kimberlin line up the ships' movements with one another and so make a dynamic and interesting shot.
Kimberlin plots the path of one of the TIE fighters and inspects the rolls of film.
Once this 'cel' version of the shot was signed off, a temp composite was created using the optical printer. It was a very rough black and white version of the shot (the mattes from each ship combined), but something the editors could work with for timings. This temp composite showed Kimberlin and Ralston how the shot was starting to look. What's interesting here is that the ships were choreographed after they were shot.
The temp composite. You can see the TIE fighters but also lighting stands!
The dope sheet for SB19. Every mark means a change to the shot.
SB19 took John Ellis (the Optical Printer Technician) 10 hours to composite. One mistake at any point would have rendered the shot unusable and the whole process would have had to start again. I wonder how many versions he went through?
Ralston and Ellis.
There is in fact an error in the shot. Towards the end of the shot, the TIE fighters in the background are seen on top of the Millennium Falcon, which is in the foreground. Ellis, talking about SB19 in the 1985 Horizon documentary 'How To Film The Impossible', said – 'We don't get them all perfect… most of them we do!'. I guess the error was deemed too small to go back and fix and so it stayed in the film… that is until the special edition of 'Return of the Jedi'.
So next time you're watching 'Return of the Jedi' and SB19 pops up, think of all the hard work that went into it. It's mind-blowing when you think of all the hours spent on it and all the talented people that created it at ILM. What's also amazing is that ILM continue to do incredible work to this day, even 30-plus years after 'Return of the Jedi' was released… and long may their astounding work continue.
- 1985 Horizon documentary ‘How To Film The Impossible’.
- fxphd: The Role of the Optical Printer.
- Industrial Light and Magic: The Art of Special Effects
- Images taken from the 1985 Horizon documentary ‘How To Film The Impossible’.
Since I’ve been working in-house as a compositor I’ve been rendering all my Nuke scripts in the Terminal on my Mac. All good… or so I thought.
I have recently discovered that Nuke does not use the GPU (Graphics Processing Unit) by default when rendering this way. I suspect this is because command-line rendering is typically used on render farms, which often don't have GPUs.
Luckily the fix is simple. Add '--gpu' to the terminal command and hey presto… the GPU is used when rendering. So now the GPU accelerated nodes in my scripts use the NVIDIA GeForce GT 750M in my MacBook Pro when rendering. Winner.
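For reference, a command-line render with the GPU flag looks something like this. The script name and frame range are just placeholders; '-x' executes the script and '-F' sets the frame range:

```shell
# Render frames 1-100 of a Nuke script from the Terminal.
# The --gpu flag enables GPU acceleration for GPU-capable nodes.
nuke -x --gpu -F 1-100 my_comp_v01.nk
```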
I have rewritten my Nuke AppleScript droplet to use the GPU when rendering, which you can download.
This is a great tip for all pro users of the Avid DNX codec.
From Mavericks onwards it has not been possible to use Quick Look to preview Avid DNX QuickTimes. This great tip from janusz, though, points out that you can transplant the necessary Quick Look files from a Mountain Lion machine and voila… you can preview to your heart's content! There is also a link in the forum to the Mountain Lion files in case you don't have access to them.
I recently discovered you can set your Mac to automatically mount a server when you start up. Just drag the mounted server from your desktop to your 'Login Items' in the 'Users & Groups' panel in 'System Preferences'… and voila!
However, it also opens a Finder window directed at the server, even when you tick the 'Hide' checkbox. I started thinking of a way to do this with AppleScript, a language that I'm slowly getting more and more comfortable with.
What’s nice is that you can add saved AppleScripts to your ‘Login Items’. Finder will then run the script at login.
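For this to run at login the script needs to be saved in a compiled form. You can do that in Script Editor, or from the Terminal with osacompile; the file names here are only examples:

```shell
# Compile a plain-text AppleScript into a double-clickable applet;
# the .app extension tells osacompile to build an application bundle
# that can then be added to Login Items (file names are examples).
osacompile -o ConnectToServer.app ConnectToServer.applescript
```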
I started by trying the following script, which connects to the server and then closes the window. The server path and server name are found by hitting cmd+i when your server is mounted and selected on your desktop. My path, for example, starts with 'afp://'.
# Connect to the server.
tell application "Finder" to open location "server path/server name"
# Wait until the window opens.
delay 10
# Close the window.
tell application "Finder" to close Finder window "server name"
This works perfectly, but only at work. When I'm at home and can't connect to the server, the script returns an error because it can't find the window (the server never having been connected). So yesterday, on the cycle home, I was thinking… 'What I need here is an IF statement.'
An 'IF' statement is a way of saying: if a certain set of circumstances is true, then do this command; otherwise do this other command instead. It's perfect for this script as I want Finder to first check whether the window is open. If it is open, close it; otherwise do nothing and don't return an error.
You can download my AppleScript here… MB_ConnectToServer_v01
And here’s my final code…
tell application "Finder" to open location "server path/server name"
# Wait until the window opens.
delay 10
# Close the window if it exists; if it doesn't, do nothing.
tell application "Finder"
	if Finder window "server name" exists then
		close Finder window "server name"
	end if
end tell
Matt Damon is back, Paul Greengrass is back, Chris Rouse is back – what’s not to get excited about!!
Yesterday saw the new Jason Bourne trailer being premiered at Super Bowl 50… and my word it looks fun.
Jason Bourne 5 Teaser
I’m currently working in London on the project as a Compositor with the editorial department.
In an effort to be more productive at work I’ve been looking at my focus and how I can improve it.
One of the big things that has helped me on this quest is meditating each morning. I use an app called Headspace which leads you through various types of meditation for a variety of things you might want to achieve. The pack on focus really helped me understand how focus works and that getting distracted is unavoidable, but that it can be lessened.
Surprisingly, the other thing that has helped my focus at work is having more breaks! An app called BreakTime runs on my Mac and tells me when I have been working at my computer for an hour. When it goes off I get up, walk to get a glass of water, make a cup of tea (my favourite kind of break), anything that isn't sitting at my desk. It not only means that I remember to take a break every hour (which is very good for your back, posture and eyesight), it also means that in that hour at the computer I do nothing but work… no Facebook, no email, no web browsing, no distractions.
I’ve also turned off all the notifications on my iPhone which is hugely liberating. Rather than being a slave to the thing as it lights up and tells me I have a message I now choose when to respond.
Understanding my focus has had the unforeseen consequence of improving my memory. By having set times to do certain tasks my brain seems less overwhelmed and is therefore much more willing to remember!
As with any lifestyle change it’s all about repeating it until it becomes habit… watch this space.
I have updated this script to use the GPU when rendering. Please see this post for more details – Rendering with the GPU in Nuke
Since working on Spectre I've started rendering all my Nuke scripts in terminal shells and not the GUI. This feels a lot tidier and also seems to run much faster.
I've created an AppleScript droplet to help with this. Simply drop your script on to the droplet and it will open a terminal, enter the necessary information and then start rendering.
Thanks to jweaks on Stack Overflow for the help on this!
Please note – the version of Nuke you are running must be set up in your .bash_profile file for this droplet to work. The Foundry have written an excellent piece on this if you need help.
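If it helps, the kind of .bash_profile entry this relies on looks roughly like the line below; the version number and install path are examples, so adjust them to match your own Nuke install:

```shell
# Example line for ~/.bash_profile (version and path are examples):
# point an alias at the Nuke binary inside the application bundle
# so 'nuke' can be launched from any Terminal window.
alias nuke='/Applications/Nuke9.0v8/Nuke9.0v8.app/Contents/MacOS/Nuke9.0v8'
```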