
Posted on Oct 4, 2015

Stanford research group develops tool for drone camera autonomy


Researchers and developers at Stanford University, along with Adobe, 3DR, and others, have created a tool to help both experienced and novice videographers get better aerial shots from their sUAS and camera drones.

From the project’s website: “Horus is an in-browser Javascript application with a custom Python backend. It relies on the Google Earth NPAPI plugin and Google Chromium. It has been exclusively tested on Mac OS X. Note that this is a research prototype, and contains experimental and untested code. It is released here to aid researchers in reproducing our results.”


I’ve been following this project for a couple of years now, as one of the aerial videographers the team interviewed while developing it. From the early examples I was shown, I was excited to see where this technology was going.

This was even before we heard about some of the latest features recently released for 3DR and DJI drones, such as cable cam, arc/POI, and waypoints. But Horus goes much further, offering a greater degree of autonomy.


The abstract from their website states:

Cameras attached to small quadrotor aircraft are rapidly becoming a ubiquitous tool for cinematographers, enabling dynamic camera movements through 3D environments. Currently, professionals use these cameras by flying quadrotors manually, a process which requires much skill and dexterity. In this paper, we investigate the needs of quadrotor cinematographers, and build a tool to support video capture using quadrotor-based camera systems. We begin by conducting semi-structured interviews with professional photographers and videographers, from which we extract a set of design principles. We present a tool based on these principles for designing and autonomously executing quadrotor-based camera shots. Our tool enables users to: (1) specify shots visually using keyframes; (2) preview the resulting shots in a virtual environment; (3) precisely control the timing of shots using easing curves; and (4) capture the resulting shots in the real world with a single button click using commercially available quadrotors. We evaluate our tool in a user study with novice and expert cinematographers. We show that our tool makes it possible for novices and experts to design compelling and challenging shots, and capture them fully autonomously.
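To get a feel for what points (1) and (3) of the abstract mean in practice, here is a minimal Python sketch of keyframe-based shot timing with an easing curve. All names and structure here are my own illustration; they are not taken from the Horus source code, which you can download from the project’s website.

```python
# Illustrative sketch: keyframed camera shot with eased timing.
# The Keyframe fields and the smoothstep easing are assumptions for
# demonstration, not the actual Horus data model.
from dataclasses import dataclass

@dataclass
class Keyframe:
    t: float    # normalized shot time in [0, 1]
    x: float    # camera position along one axis (others omitted for brevity)
    y: float
    z: float
    yaw: float  # camera heading in degrees

def ease_in_out(u: float) -> float:
    """Smoothstep easing: the shot starts and stops gently."""
    return u * u * (3.0 - 2.0 * u)

def sample_shot(keyframes, t):
    """Interpolate camera state between keyframes, with eased timing."""
    t = ease_in_out(max(0.0, min(1.0, t)))
    for a, b in zip(keyframes, keyframes[1:]):
        if a.t <= t <= b.t:
            u = (t - a.t) / (b.t - a.t)
            lerp = lambda p, q: p + (q - p) * u
            return (lerp(a.x, b.x), lerp(a.y, b.y),
                    lerp(a.z, b.z), lerp(a.yaw, b.yaw))
    last = keyframes[-1]
    return (last.x, last.y, last.z, last.yaw)

# A two-keyframe shot: rise from 10 m to 20 m while panning 90 degrees.
shot = [Keyframe(0.0, 0, 0, 10, 0), Keyframe(1.0, 50, 0, 20, 90)]
print(sample_shot(shot, 0.5))  # camera state at the shot's midpoint
```

Sampling the eased trajectory at a fixed rate and feeding each pose to an autopilot is, roughly, how a “single button click” capture could work; the real system, of course, also has to respect the quadrotor’s dynamics.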

Here’s a video explaining more about their system:

I will be doing a follow-up on this project with some hands-on testing of the technology in the next few months, and will post my results here.

From the project’s website, you can read the full technical paper, review and download the source code, and learn more about the team behind this study.

On that last note, my heart goes out to the team, and to the friends, family, and colleagues of their research partner, Floraine Berthouzoz, who recently passed away. Floraine was the person I was most in contact with; she interviewed me initially and was revered by so many of her colleagues. She is greatly missed by all who worked with her and knew her.