The robots are coming for us all. It’s not just the fast food workers and factory cogs that need to worry about being phased out by automation, but the arty types, too. Algorithms are assembling the news into roughly understandable review articles, and composing basic music based on pop trends. Still, it’s not in content creation but content management that robots truly excel, and this week MIT showed off some simple drone technology that could have far-reaching effects on film.
The video below shows a custom quadrotor drone reacting dynamically to the movement of both its subject and the photographer, ensuring perfect lighting without fuss — or large crews.
When you think about it, the various decision trees navigated by a movie lighting specialist are not all that difficult to automate. Once the director (or the cinematographer) chooses a color temperature, light intensity, angle, and movement profile for a project or scene, there is a fairly mechanistic process to achieving them — all while keeping lights and flares out of the shot. That sort of multi-variable calculation is exactly what software is good at — and its potential does not stop with lighting. The underlying logic of the lighting drone applies just as easily (if not more easily) to microphones and camera work; in that world, why would studios keep paying a guy to stand around holding a microphone taped to a stick?
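To illustrate how mechanistic that placement step can be, here is a minimal sketch in Python. The function name, coordinate convention, and parameters are all hypothetical, not anything from the MIT system: given a subject's position and a chosen angle and distance, it simply computes where the light should sit.

```python
import math

def light_position(subject_xy, angle_deg, distance):
    """Place a light at a chosen angle and distance from the subject.

    Hypothetical 2-D convention for illustration: angle is measured
    counter-clockwise from the camera axis, in degrees.
    """
    sx, sy = subject_xy
    theta = math.radians(angle_deg)
    return (sx + distance * math.cos(theta),
            sy + distance * math.sin(theta))
```

Once a setup like this is parameterized, tracking a moving subject is just re-running the same calculation every frame with the subject's new coordinates.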
Of course, the actual creative input would still have to come from a human being; most directors would almost certainly still make use of an expert in creating visuals, but that person’s skill-set might more resemble that of a professional DotA player than of a turtlenecked artiste. Writing a quick algorithm to control a drone’s responses to character movement might end up being more essential than anything else.
The MIT team chose “rim lighting” as their test specifically because it is a difficult effect to maintain while the subject moves, making it a good demonstration of the principle at work. Complex frame-by-frame analysis is done on a nearby computer, which feeds the drone instructions on how to adjust its behavior to maintain the photographer’s desired lighting. The tech was shown off at the International Symposium on Computational Aesthetics in Graphics, Visualization, and Imaging, which is cool merely because it exists.
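The feedback loop described above can be sketched as a simple proportional controller. Everything here is an illustrative stand-in, not MIT's algorithm: the plant model (rim width shrinking as the drone swings behind the subject) and the sign convention are made-up assumptions for demonstration.

```python
def rim_control_step(drone_angle, measured_width, target_width, gain=0.5):
    """One control tick: nudge the drone's angle around the subject in
    proportion to the rim-width error. Assumes (hypothetically) that
    rim width falls as the angle grows, so we move opposite the error."""
    error = target_width - measured_width
    return drone_angle - gain * error

def simulated_width(angle):
    """Toy stand-in for the per-frame image analysis: pretend the rim
    width shrinks linearly as the drone moves behind the subject."""
    return max(0.0, 10.0 - 0.1 * angle)

# Run the loop: the drone settles at the angle that yields the target width.
angle = 0.0
for _ in range(100):
    angle = rim_control_step(angle, simulated_width(angle), target_width=4.0)
```

The real system replaces `simulated_width` with actual frame-by-frame image analysis on the host computer, but the control structure — measure, compare to the photographer's target, correct — is the same.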
CEO of DroneLife.com and DroneRacingLife.com, and CMO of Jobfordrones.com. Principal at Spalding Barker Strategies. Proud father of two. Enjoys karate, Sherlock Holmes, and interesting things. Subscribe to all things drone at DroneLife here.