Dead Centre

To Be A Machine

Technical Direction
Camera Rig Design
Video Design

To Be A Machine is a live streaming theatre production that runs until Oct 10th 2020 from Project Arts Centre, Dublin as part of this year’s Dublin Theatre Festival.

An early release of a future project from Dead Centre. Adapted from the Wellcome Prize-winning book by Mark O’Connell, this event attempts to re-imagine theatre without the hindrance of the body.

Be the best audience member you can be by uploading yourself into the crowd for this exploration of technology, the race to defeat death, and the limits of live performance. Featuring Jack Gleeson trying to be or not to be a machine.

My work on this involved:
- developing and constructing a custom main camera rig with a motor-controlled slider that could accurately move the camera between the actor and an iPad Pro on stage. The whole rig also dollied on 6m of track.

- designing and implementing an audience ‘upload’ system that allowed ticket holders to upload three short clips of themselves to be displayed and cued live on 110 iPads in an audience seating bank. This included an automated processing pipeline that downloaded, resized, cropped and looped thousands of videos, matched them to ticket holders and assigned them seats / iPads for their particular night.

- designing and supervising the production of a custom iOS app that could display these audience videos and be controlled using OSC, allowing them to be integrated into the rest of our show-control system.

- producing the content that is intercut with the live material and on the stage iPad. 

- operating the main camera live for this initial run

The camera rig was designed to perform a specific set of shots for the live-streamed production: to smoothly and accurately track between an actor and an iPad Pro, and to dolly in and out simultaneously on 6m of track. The setup required the lens to be nearly wide open, so a wireless follow focus was also added to the mix. This was fun to operate live!

Based around a Blackmagic Micro Cinema Camera with 12mm Veydra Mini Prime, Tilta Nucleus-M wireless follow focus, Edelkrone SliderPlus Pro, all on a platform dolly on 6m of track.

Bulk video processing

As part of the ticket-buying process, our audience members were asked to upload three clips of themselves (watching, laughing, eyes closed). They were guided through this by actor Jack Gleeson and could use any device from phone to laptop. With nine performances and 110 seats / iPads, this meant at least 2970 videos. I found a near-perfect app for this in VideoAsk, a relatively new product from Typeform.

As part of my work, I developed an automated pipeline in Python, using FFmpeg, OpenCV and a handful of other libraries, that did a number of things:

- downloaded all clips from VideoAsk
- stored these videos under their ticket reference number
- checked videos for minimum length and dimensions
- cropped to iPad portrait, using OpenCV face detection to centre the face where possible
- made a seamlessly loopable clip
- compressed and renamed files
- stripped audio for separate use
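The face-centred crop was the fiddliest step. Here is a minimal sketch of how it can work, assuming OpenCV's bundled Haar cascade and FFmpeg on the PATH; the target dimensions, filenames and helper names are illustrative, not the production values:

```python
import subprocess

TARGET_W, TARGET_H = 768, 1024  # assumed iPad portrait target, not the real spec

def crop_window(frame_w, frame_h, centre_x, target_w=TARGET_W, target_h=TARGET_H):
    """Widest portrait crop that fits the frame height, clamped to the frame."""
    crop_w = frame_h * target_w // target_h
    x = min(max(centre_x - crop_w // 2, 0), frame_w - crop_w)
    return crop_w, x

def face_centre_x(video_path):
    """Horizontal centre of the first face found in the first frame, or None."""
    import cv2  # imported lazily so the crop maths above stays stdlib-only
    cap = cv2.VideoCapture(video_path)
    ok, frame = cap.read()
    cap.release()
    if not ok:
        return None
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    faces = cascade.detectMultiScale(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
    if len(faces) == 0:
        return None
    x, _, w, _ = faces[0]
    return x + w // 2

def crop_to_portrait(src, dst, frame_w, frame_h):
    """Crop around the detected face (or the frame centre) and scale via FFmpeg."""
    centre = face_centre_x(src)
    if centre is None:
        centre = frame_w // 2
    crop_w, x = crop_window(frame_w, frame_h, centre)
    subprocess.run([
        "ffmpeg", "-y", "-i", src,
        "-vf", f"crop={crop_w}:{frame_h}:{x}:0,scale={TARGET_W}:{TARGET_H}",
        dst,
    ], check=True)
```

Clamping the crop window means a face near the edge of frame still gets as much of the crop as the source allows, rather than a black border.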

Finally, another script gathered a particular performance’s audience using the latest ticket sales data from the box office and assigned each audience member an iPad for that night.
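That matching step reduces to something like the sketch below, assuming a list-of-dicts export from the box office; the field names and seat labels are my invention, not the real schema:

```python
def audience_for(sales, performance):
    """Ticket references for one performance, kept in sales order."""
    return [row["ticket_ref"] for row in sales if row["performance"] == performance]

def assign_seats(refs, seats):
    """Pair ticket references with iPad/seat names; surplus seats stay empty."""
    return dict(zip(refs, seats))

# e.g. one seat label per iPad in the seating bank
SEATS = [f"ipad-{n:03d}" for n in range(1, 111)]
```

Zipping in sales order keeps the assignment deterministic, so re-running the script against updated box-office data only changes seats for newly sold tickets at the end of the list.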

Custom iPad App

We wanted to be able to easily load content onto our fleet of 110 iPads, and to wirelessly trigger the different clips on each individual iPad, ideally using a standard show-control protocol like OSC.

Believe it or not, nothing suitable existed, so I proposed we build our own. With the help of talented developer Zac Davison, we put together a barebones media-player app and a small Node server that ran on the network. On launch, the app on each iPad would attempt to register with the server using the iPad’s name. If a folder with that name existed on the server, the app would pull whatever content the folder contained. This way, we could push a night’s audience to all 110 iPads in a couple of minutes.
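The production server was written in Node, but the lookup it performed on registration amounts to something like this Python sketch; the `content/<ipad-name>` folder layout is an assumption:

```python
from pathlib import Path

CONTENT_ROOT = Path("content")  # assumed layout: content/<ipad-name>/<clips>

def register(ipad_name, root=CONTENT_ROOT):
    """Return the clip filenames staged for this iPad, or None if there are none."""
    folder = root / ipad_name
    if not folder.is_dir():
        return None
    return sorted(p.name for p in folder.iterdir() if p.is_file())
```

Keying everything off the iPad's own name means the seating-assignment script only has to write files into the right folders; the iPads do the rest themselves on launch.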

Finally, the app on each iPad could be controlled over OSC, targeting each iPad’s IP address on a particular port. We developed a basic command set: play a file (with loop arguments), stop, and so on.
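For illustration, here is what cueing a single iPad looks like at the OSC wire level, using only the standard library; the /play and /stop addresses and port 9000 are stand-ins, not the app's documented command set:

```python
import socket
import struct

def _osc_str(s):
    """OSC-string: null-terminated, padded to a 4-byte boundary."""
    b = s.encode() + b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address, *args):
    """Encode an OSC message with string and int32 arguments."""
    tags = "," + "".join("i" if isinstance(a, int) else "s" for a in args)
    data = _osc_str(address) + _osc_str(tags)
    for a in args:
        data += struct.pack(">i", a) if isinstance(a, int) else _osc_str(a)
    return data

def cue(ip, address, *args, port=9000):
    """Fire one OSC message at a single iPad over UDP (port is an assumption)."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.sendto(osc_message(address, *args), (ip, port))

# e.g. cue("10.0.1.42", "/play", "watching.mp4", 1)   # play, looping
#      cue("10.0.1.42", "/stop")
```

Because each message is a single UDP datagram to one IP, any standard show-control software that speaks OSC can address the whole seating bank iPad by iPad.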