Movie Barcode

I was recently shown a fascinating way of displaying an entire movie in one image.  They have been nicknamed ‘movie barcodes’.  A barcode consists of the average colour of frames sampled across a movie, displayed left to right in a single image.

It’s a really interesting and unique representation of a movie.  You can instantly see how the colour and luminance choices within a film complement and contrast with each other.  When watching a movie we can’t see this bigger picture, but here it is all laid out to see.

I wanted to make a movie barcode for Aliens – Special Edition (1986) in Nuke.

alien_frames_v1
400 frames from Aliens – Special Edition reading left to right, top to bottom

Some challenges in Nuke:

  • Creating an automated way to set the frames.
  • Extracting the average colour of each chosen frame, ideally automatically.
  • How to display the frames.

Setting the Frames

I started by trying Python – scripting the creation of FrameHold nodes.  This turned out to be very fiddly and not easy to adjust later – not the way to go.
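For the record, a minimal sketch of what that scripting looked like – the Read node name and the frame range here are placeholders, not my actual script:

import nuke

NUM_FRAMES = 400            # how many frames to sample
FIRST, LAST = 1, 270000     # placeholder input range

step = (LAST - FIRST) / float(NUM_FRAMES - 1)
for i in range(NUM_FRAMES):
    # one FrameHold per sampled frame, all reading from 'Read1'
    fh = nuke.nodes.FrameHold(first_frame=int(FIRST + i * step))
    fh.setInput(0, nuke.toNode('Read1'))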

I also tried a FrameHold with two keyframes (frame 1 = 1, frame 2 = 270) and, in the Curve Editor, setting the curve’s ‘After’ extrapolation to ‘Linear’ so the time offset continues beyond the last keyframe.  It achieved the correct result but was tricky to control: changing the value of the second keyframe changed how many frames would be displayed.

I needed something that was easy to control and would let me set the number of frames I wanted to display, letting Nuke take care of the time offset automatically.

Enter the Retime node.

retime_node

With the Retime node you can set not only the output range but also the input range.  Being able to adjust the input range was great, as it meant I could easily remove the opening idents and the credits (not so fun to see… as they are mostly black).

For the output range I settled on 4000 frames.
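In Python the equivalent Retime setup looks something like this (a sketch: ‘Read1’ and the frame values are placeholders, and the input/output range boxes still need ticking in the node’s properties):

import nuke

retime = nuke.nodes.Retime()
retime.setInput(0, nuke.toNode('Read1'))

retime['input.first'].setValue(5000)     # skip the opening idents
retime['input.last'].setValue(220000)    # stop before the credits roll
retime['output.first'].setValue(1)
retime['output.last'].setValue(4000)     # one output frame per barcode column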

Extracting the Average Colour of a Frame

I looked into a number of ways of doing this.

I tried scaling the image down, blurring it, and then scaling back up to the full resolution.  This gave an average colour for the frame but was incredibly slow when trying to display 4000 frames!

scale

I also discovered a way of getting the average colour of a frame using a TCL script, from this Nuke forum post.  It uses a Rectangle node which samples the input image.  It worked well when moving the playhead to each frame, but would not display properly when plugged into a ContactSheet.

tcl

I’m guessing the TCL needs to be run for each frame individually, so displaying 4000 frames at once doesn’t work.  It could be useful in the future, but not for this project.

I finally settled on the simplest option: the CurveTool.  With this node you can sample the average intensities of an image.  Once the analysis has run, the average intensities can be pasted into a Constant.  This is super fast to display, but if anything changes upstream the CurveTool must be run again.  Not quite as automatic as I wanted, but an adequate solution.
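As a rough sketch of the CurveTool step (assuming the tree ends in a node called ‘Read1’):

import nuke

ct = nuke.nodes.CurveTool(operation='Avg Intensities')
ct.setInput(0, nuke.toNode('Read1'))

# Run the analysis; the per-frame averages land in the animated
# 'intensitydata' knob on the CurveTool.
nuke.execute(ct, 1, 4000)

print(ct['intensitydata'].valueAt(1))    # averaged RGBA for frame 1

Instead of pasting, the Constant’s colour could also be expression-linked to CurveTool1.intensitydata if you prefer a live link.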

And voilà!  Each frame’s average colour is displayed in the Constant!

Displaying the Frames

I used the ContactSheet node, set to display frames instead of inputs, to render my movie barcode.  I set each frame to be displayed 1 pixel wide and 1 pixel high, then used a Crop to stretch each of the 1×1 pixels up.
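Something like this, as a sketch (‘Constant1’ being the Constant holding the averages):

import nuke

cs = nuke.nodes.ContactSheet()
cs.setInput(0, nuke.toNode('Constant1'))

cs['splitinputs'].setValue(True)    # lay out frames rather than inputs
cs['startframe'].setValue(1)
cs['endframe'].setValue(4000)
cs['rows'].setValue(1)
cs['columns'].setValue(4000)        # one column per frame
cs['width'].setValue(4000)          # so each cell is 1 pixel wide...
cs['height'].setValue(1)            # ...and 1 pixel high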

I’ve uploaded my final script to GitHub.

script
part of the final script

The Final Product!

aliens_movie_barcode_v1
4000 frames from Aliens – Special Edition

Once the system was set up it was very easy to put other films through the process.  So… of course I had to do Alien and Alien 3!

alien_movie_barcode_v1
Alien
alien3_movie_barcode_v1
Alien 3

 

Many thanks to Lev Kolobov, Zissis Papatzikis and Cameron Smither for help with this project.


PIR Magic Mirror Motion Sensor

I’ve added a PIR (passive infrared) motion detector to my magic mirror.  Now the monitor turns off after 5 minutes if no one is in the kitchen.

Super simple to do.

I bought an Aukru HC-SR501 sensor from Amazon.

This excellent YouTube video explains how the sensor works and how to check it’s operational: HC-SR501 PIR Motion Detector – With Arduino & Raspberry Pi

This post covers setting the sensor up using Python and shell script: How to: Install a PIR motion sensor on your Raspberry Pi Magic Mirror.
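For the gist of it, here is a minimal Python sketch of the idea – the pin number and timeout are assumptions, and the real module from the post above does more:

import time
import subprocess
import RPi.GPIO as GPIO

PIR_PIN = 7        # PIR output wired to pin 7 (BOARD numbering) - an assumption
TIMEOUT = 5 * 60   # turn the screen off after 5 idle minutes

GPIO.setmode(GPIO.BOARD)
GPIO.setup(PIR_PIN, GPIO.IN)

def set_screen(on):
    # vcgencmd drives the Pi's display output: 1 = on, 0 = off
    subprocess.call(['vcgencmd', 'display_power', '1' if on else '0'])

last_motion = time.time()
screen_on = True

try:
    while True:
        if GPIO.input(PIR_PIN):                  # motion detected
            last_motion = time.time()
            if not screen_on:
                set_screen(True)
                screen_on = True
        elif screen_on and time.time() - last_motion > TIMEOUT:
            set_screen(False)
            screen_on = False
        time.sleep(1)
finally:
    GPIO.cleanup()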

Jobbed!

Recreating 2001’s Slit Scan Effect in Nuke’s 3D System

In 1968 a film was released that would change the sci-fi movie landscape forever.  ‘2001: A Space Odyssey’ is still considered the pinnacle of science fiction storytelling and visual design.

When the film was being produced the human race had yet to land on the moon, let alone live and work in space for long periods.  The science fiction films released around the same time were all of a B movie quality.  ‘2001’ treated its viewers with the utmost respect, and its visual style is anything but B movie!

2001-poster
The film’s climax sees astronaut David Bowman travel through the aptly named ‘Star Gate’ to another time and space.  When tasked with creating the visuals for this scene, visual effects genius Douglas Trumbull turned to a technique that had been used in many films before: Slit Scan.  Although John Whitney has often been credited with pioneering the Slit Scan effect (in films like Alfred Hitchcock’s ‘Vertigo’), it is Trumbull who refined the technique.  Trumbull’s real brilliance was how he built and ran his Slit Scan machine.

Trumbull’s Slit Scan rig consisted of a camera on a track that could move forwards and backwards, a slit about 4’ high, and illuminated artwork on a panel behind the slit.  The synchronous motors he used meant the camera, slit and artwork could all be programmed to run simultaneously, so he could do endless testing and repeat shots with great accuracy.  He could also layer up the exposures for maximum effect, rewinding the film in the camera and exposing another piece of artwork.

The camera’s shutter was open as it moved towards or away from the slit along the track.  The shutter would then close at the end of the move, the camera would return to its start position, and the artwork’s start position was moved slightly.  The whole process would then be repeated for the next frame.

exampleFrames
example frames of the ‘2001’ Slit Scan effect at Stanley Kubrick: The Exhibition
slitScanKubrick
a diagram of a Slit Scan machine at Stanley Kubrick: The Exhibition in London

Brief:

After watching ‘2001’ at The Prince Charles Cinema in London I felt inspired to understand more about the Slit Scan effect and recreate it in the digital age.  Using the 3D environment inside Foundry’s NukeX, I felt I would be able to build a virtual version of Trumbull’s Slit Scan rig, using a 3D camera and 3D cards to create the other parts of the setup.

giphy-1
my 3D setup in Nuke; the slit is in the front card, the artwork moves on a card behind the slit
Insights:
Some factors that change the effect created include:
  • The size of the slit.
  • The amount of movement in the start position of the artwork between each exposure.
  • The speed of the camera move.

Process:

The first version I created was good, but it looked more like a 1980s video effect – a little like a ‘howlaround’ (the effect used in the opening credits of the BBC’s Doctor Who).  The problem was that the artwork’s start position was not being adjusted between exposures.  By moving the artwork’s start position ever so slightly between exposures, the illusion of movement is created.  Thanks to Lev for this insight.

Each exposure in my setup was made up of 100 frames.  One disadvantage of Nuke’s 3D camera is that you cannot leave the shutter open as Trumbull would have done – everything must be done in frames.  I used the TimeEcho node to mimic the long exposure time, and a FrameHold after it to make sure I was only rendering the fully ‘exposed’ frame.

giphy-3
an example of how each frame is made up: first the slit moving over 100 frames, then the TimeEcho output of those 100 frames – which makes up one final frame

Next was how to set this up so it would render automatically.  Lev and I tested setting the curves of the camera and artwork movement to loop automatically, but this was very fiddly to control (and adjust later) and never worked perfectly.  Setting the loop even one frame out meant the slit would slowly drift out of sync with the camera and chunks of the image would be lost.

In the end I used Python to adjust the start position of the artwork after each frame was rendered.  In the Write node, Python commands can be set to run, for example, after each frame is rendered.  So after each frame, a TransformGeo node moved the artwork’s 3D card very slightly in X.  The next frame would then be rendered – again made up of 100 frames, but with a slightly new start position for the artwork.
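The callback itself boils down to something like this (a sketch – the node name and step size are placeholders for what is in the published script):

import nuke

STEP = 0.01   # how far the artwork advances per rendered frame (placeholder)

def advance_artwork():
    xform = nuke.toNode('TransformGeo1')       # the card carrying the artwork
    x = xform['translate'].value()[0]          # current X translate
    xform['translate'].setValue(x + STEP, 0)   # nudge it ready for the next frame

# run after every frame a Write node renders
nuke.addAfterFrameRender(advance_artwork, nodeClass='Write')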

After the frames were rendered I brought them back into Nuke, added a flopped version on the other side of the screen plus some blurs, retimes and glows to help the effect, and rendered out a QuickTime.

starGate_v1
the finished Nuke script
giphy
an example of the completed effect rendered from Nuke
giphy-2
another version using Kronos to slow down and blur the final output, tippy man!

Conclusion:

This project has improved my understanding of how to loop curves and how useful the generate keys function can be.  It’s also shown me how useful setting Python commands in the Write node can be.

This was a great project for understanding a process rarely used these days.  As with any project of this nature, delving into retro techniques has given me even more admiration for the technicians who worked on this effect.  I’m sure it was a lot of trial and error, but once they cracked it they really cracked it.  The Star Gate sequence still stands up today and, along with the music, really does transport Bowman (and the audience!) to another time and space.

GitHub Scripts:

Final Nuke Script – starGate_v1_to_publish.nk
Python to advance slit after each frame is rendered – afterEachFame_advanceSlitScan.py

References:

Doctor Who Slit Scan and Howlaround

Douglas Trumbull – Slit Scan Motion Picture Photography – 2009 SOC Technical Achievement Award

The History and Science of the Slit Scan Effect used in Stanley Kubrick’s 2001: A Space Odyssey

Making of Slit Scan Effect

Cinefex #85 (April 2001 of Course!) – 2001: A Space Odyssey | A Time Capsule

 

Many thanks to Lev Kolobov, Zissis Papatzikis, Angus Bickerton, Steve Begg, Matt Tinsley and Ed Plant for their help with this project.

First Man – Landing – An Apollo Story

This is my tribute to the film First Man and the incredible endeavours of the people who worked on the Apollo Program at NASA.

I thought First Man was a very beautiful film.  It contains some extremely novel visual effects methods – methods that won First Man the Academy Award for Best Visual Effects.

My video is made up of shots from the 1989 documentary ‘For All Mankind’, directed by Al Reinert.  Well worth checking out if you haven’t seen it.

This was my first project using Final Cut Pro X and I have to say… I quite enjoyed it.

rsync – The Clever Way to Copy

I was recently introduced to the very clever rsync terminal utility by my colleague Lev Kolobov.

Rsync is a fast, versatile, remote and local file-copying tool.  It is run from the command line and comes preinstalled on Unix-like operating systems such as Linux and macOS.  It has many (many!) different modes to help transfer data from one location to another.  I have been using it at work to copy large amounts of video and image data from one place to another.

The beauty of using rsync to copy data is that if you have to stop a copy halfway through, you can.  When you are ready to resume, simply run the command again and the transfer will pick up from where it left off.  Super useful if you’re copying using your laptop and want to take it home at night!

This is the command I have been running at work in Terminal on my Mac:
rsync -vuahr --progress "SOURCE" "DESTINATION"

Here is a breakdown of the command:

"rsync" calls the utility.
"-v" increases verbosity, so you get more feedback about what the utility is doing.
"-u" skips files that are newer at the destination – good if anyone has updated files since the last copy.
"-a" archive mode keeps timestamps, permissions etc. the same from source to destination.
"-h" outputs numbers in human-readable form.
"-r" recurses into directories, copying folders and subfolders (already implied by "-a", but harmless).
"--progress" shows progress during the transfer.
"SOURCE" is the path of what you want to copy.
"DESTINATION" is the path of where you want to copy to.

Here is an example:
rsync -vuahr --progress /Users/mrjack/Desktop/photos /Volumes/DRAKE/photos_bkup

All rsync flags and a description of what they do can be found here:
https://www.computerhope.com/unix/rsync.htm

And here is the Linux reference page for rsync:
https://linux.die.net/man/1/rsync

Also worth noting is the "-n" flag, which performs a dry run of the transfer and shows you in the terminal what would change – useful if you’re worried about deleting anything in the destination folder.  You can also copy only files with a certain extension; Lev has been using this to back up all of our Nuke scripts, for example.  Useful or what?!
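For example, a dry run of the photo backup from earlier:

rsync -vuahrn --progress /Users/mrjack/Desktop/photos /Volumes/DRAKE/photos_bkup

And a sketch of an extension-only copy using rsync’s include/exclude filters (keep the directories, keep .nk files, drop everything else):

rsync -vuahr --progress --include='*/' --include='*.nk' --exclude='*' "SOURCE" "DESTINATION"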

rSync1.0v1

As it’s a bit of a faff getting the paths and writing the command, I’ve built an AppleScript app that asks for the source and destination folders using a GUI and then runs the command in Terminal.

rSync1.0v4 can be downloaded here:
https://www.dropbox.com/s/s7ezo8z51p1yqgy/rSync.zip?dl=0

Moving forward – in v2 of this app it might be nice to add some if statements asking the user whether they would like to set extra flags.  One useful example is rsync’s "--delete" flag, which mirrors changes from the source in the destination – basically deleting whatever is no longer in the source folder.  Super useful for mirroring locations.

This has been a great project for coding (shell script and AppleScript) as well as understanding the very powerful and useful rsync. Happy transferring people!

Please note – I am not responsible for how this utility works on your system.  I’m posting it here for educational purposes only.  Use at your own risk.

Mirror Mirror on The Wall…

I recently found myself with a spare Raspberry Pi, and I came across this awesome project – turning your Pi into a mirror with a digital display.  Super futuristic and very useful, I thought.

IMG_7714

My Magic Mirror is made up of just a few components:

Raspberry Pi 3 Model B, case and power supply. £43.03
An old flat screen monitor from eBay. £56.99
A sheet of pre-cut two-way acrylic mirror. £55
Wood from B&Q to make the frame. £10

Total: £165.02

The most labour-intensive part of the project was making the wooden frame (which my Dad expertly executed). We made the frame to fit the monitor I bought from eBay. I removed the plastic casing from the monitor and sat it in the wooden frame.

IMG_7569
Installing the Magic Mirror 2 software is very simple. I did it all through a terminal on my MacBook Pro using SSH.

https://magicmirror.builders/

I also set Magic Mirror 2 to auto-start on boot using PM2.

https://github.com/MichMich/MagicMirror/wiki/Auto-Starting-MagicMirror
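The wiki boils down to something like this (a sketch – paths assume the default MagicMirror install):

sudo npm install -g pm2
pm2 startup    # prints a one-off command to run, enabling PM2 on boot
pm2 start mm.sh --name MagicMirror
pm2 save

where mm.sh just starts the app:

cd ~/MagicMirror
DISPLAY=:0 npm start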

I did have a few teething problems.  The monitor I purchased didn’t have an HDMI port, so I needed an HDMI-to-VGA converter. I foolishly bought the first one I found online. Big mistake. It turns out they are not all built to the same standard. The one I bought drew too much power from the Pi, so I got the lightning bolt symbol a lot – which signifies the Pi is low on power. The problem persisted for quite a while and even affected the WiFi adapter, which completely stopped working at one point!  (See UPDATE below.)

After a wipe, a reinstall and a little bit of Googling, I found another VGA adapter from The Pi Hut… which (touch wood) has given no such problems.

The Pi Hut HDMI to VGA Adapter
https://www.amazon.co.uk/dp/B01G9YQPZ4/ref=pe_3187911_189395841_TE_3p_dp_1

This was a great project… and an ongoing one! The Magic Mirror software is endlessly customisable with new modules and new layouts to download and try. Moving forwards I would love to write a module myself.

I’d recommend this project to anyone at any Pi skill level. As always I learnt a lot along the way – coding, JavaScript, Unix and even a bit of carpentry!

UPDATE 01/09/19:

The VGA issue never really sorted itself out, so I bought a new monitor WITH an HDMI port (ASUS VE228H).  Fingers (and toes) crossed, it’s currently stable.  I wouldn’t recommend using VGA with a Raspberry Pi 3.

IMG_7713

My shopping list…

Two Way Acrylic Mirror
https://www.cutplasticsheeting.co.uk/search/?search=Two+Way+Acrylic+Mirror+Sheet

Raspberry Pi 3 Model B
https://www.amazon.co.uk/dp/B01CD5VC92/ref=pe_385721_37986871_TE_item

iBetter®Raspberry Pi 3 Case
https://www.amazon.co.uk/dp/B01CPCMWWO/ref=pe_385721_37986871_TE_item

NorthPada Power Supply
https://www.amazon.co.uk/dp/B01DBZ49EI/ref=pe_385721_37986871_TE_item

 

Blender Doughnuts

I’ve recently been working with Blender.  Such a great program, and completely free!

If you would like to learn Blender I can recommend Blender Guru’s Beginner Series.  The series takes you right through from learning the interface to understanding lighting and rendering.  And you end up with a yummy image of doughnuts!

https://www.blenderguru.com/tutorials/blender-beginner-tutorial-series

Here is my render from the Beginner Series.  Tasty. 😉

blender_donuts