This project is currently being worked on. Check back for updates!

The Pitch 
JetEdit is designed to dramatically shorten the time it takes filmmakers to go from shooting footage to a rough cut.  It captures their impressions of takes while filming, then uses that semantic data to generate a rough cut automatically.

When choosing my Master’s project, I looked for a topic that would combine one of my hobbies, amateur filmmaking, with computing.  I started by surveying what digital enhancements could be made to the filmmaking process.
The Problem
Making films is a complicated process, as much technical as artistic.  One of the most common issues at every skill level is the need to “reshoot,” or film a portion of the movie again to fix some error.  Sometimes these errors stem from artistic choices, but often they are technical mistakes.  Looking at the filmmaking process as a whole, I noticed a long temporal gap between when scenes are filmed and when a rough cut is produced.  In a Hollywood setting, scenes usually aren’t assembled until the “dailies” are reviewed each night, and for a smaller amateur group the gap can be even longer.  This separation between the action and the feedback gives issues time to develop.

In a literature review, I found a few apps and research projects attempting to solve this problem.  Movie Slate 8 bills itself as a “digital clapper” with the goal of “reducing fruitless searching for footage”; essentially, it replaces traditional shot notes with a digital equivalent.  Researchers at Stanford have explored what they call AI filmmaking: organizing takes and shots, then editing them with “idioms” such as fast or slow pacing or emphasizing certain characters.  Their system is extremely promising and lets editors try different styles easily, but it is constrained to dialogue-driven scenes and still in its infancy.  Finally, Andy Quitmeyer investigated capturing what he calls “semantic impressions” in the field and using them to computationally create a rough cut.  His system is tailored specifically to documentaries and is less useful for the general user.

Given that research, I drew up the requirements and design for my own system.  Like Movie Slate, JetEdit will act as digital shot notes, collecting the same semantic data about each take.  Using that information, JetEdit will select the best takes and computationally build a rough cut.  In other words, filmmakers will be able to annotate their shots as they are being taken, and JetEdit will then intelligently edit a rough cut based on those annotations.  Specifically, it will take advantage of In and Out points (the start and end of a clip) as well as a shot’s quality rating (Good, Ok, Bad).
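
To make the selection step concrete, here is a minimal sketch of the kind of logic such a system could use, written in TypeScript to match the web-app prototype described below.  All the names (Take, buildRoughCut, and so on) are my own hypothetical choices, not JetEdit’s actual implementation.

```ts
type Quality = 'Good' | 'Ok' | 'Bad';

interface Take {
  shotId: string;   // which scripted shot this take belongs to
  file: string;     // the recorded clip
  inPoint: number;  // seconds into the clip where the usable action starts
  outPoint: number; // seconds where it ends
  quality: Quality; // the filmmaker's on-set impression
}

const RANK: Record<Quality, number> = { Good: 2, Ok: 1, Bad: 0 };

// Pick the best-rated take for each shot, then return them in script order.
function buildRoughCut(takes: Take[], shotOrder: string[]): Take[] {
  const bestByShot = new Map<string, Take>();
  for (const take of takes) {
    const current = bestByShot.get(take.shotId);
    if (!current || RANK[take.quality] > RANK[current.quality]) {
      bestByShot.set(take.shotId, take);
    }
  }
  return shotOrder
    .map((id) => bestByShot.get(id))
    .filter((t): t is Take => t !== undefined);
}

// Example: two takes of the same shot; the "Good" one wins.
const cut = buildRoughCut(
  [
    { shotId: 'scene1-shotA', file: 'A001.mp4', inPoint: 2.0, outPoint: 9.5, quality: 'Ok' },
    { shotId: 'scene1-shotA', file: 'A002.mp4', inPoint: 1.2, outPoint: 8.8, quality: 'Good' },
  ],
  ['scene1-shotA'],
);
console.log(cut.map((t) => `${t.file} [${t.inPoint}s-${t.outPoint}s]`));
```

A real implementation would also have to handle shots with no usable take and render the chosen clips into an actual timeline, but the core idea is that the on-set annotations already carry enough information to order and trim the footage.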

Next, I started to prototype what the app would look like.  I decided early on to separate the functionality into two apps: one to record the semantic data on set and one to convert the annotations into a rough cut.  I designed the layout in Adobe XD and showed it to my adviser, a video production professor, to get feedback.
After a few design passes, I decided to implement a working prototype in code so that I could test the process improvements with users.  I chose Meteor as my development framework to quickly bootstrap the prototype.  As a web app, it runs on iOS, Android, and desktop from a single codebase, reducing development complexity.
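
Since Meteor synchronizes MongoDB collections between client and server, a minimal sketch of how the annotation app might persist takes could look like the following.  The collection name, fields, and method name are hypothetical illustrations, not JetEdit’s actual schema.

```ts
import { Meteor } from 'meteor/meteor';
import { Mongo } from 'meteor/mongo';

// One document per recorded take.
interface TakeDoc {
  _id?: string;
  shotId: string;                 // which scripted shot this take belongs to
  inPoint: number;                // seconds into the clip where usable action starts
  outPoint: number;               // seconds where it ends
  quality: 'Good' | 'Ok' | 'Bad'; // the filmmaker's on-set impression
  createdAt: Date;
}

export const Takes = new Mongo.Collection<TakeDoc>('takes');

Meteor.methods({
  // Hypothetical method name; called once per take from the on-set app.
  'takes.annotate'(shotId: string, inPoint: number, outPoint: number,
                   quality: TakeDoc['quality']) {
    Takes.insert({ shotId, inPoint, outPoint, quality, createdAt: new Date() });
  },
});
```

In this arrangement, both apps could read the same collection, so the rough-cut generator would simply query the annotated takes rather than importing them separately.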
The Next Steps
After developing the functional prototype, I ran my first user test with a class at Georgia Tech.  As is usually the case, my participants had lots of feedback!  The main takeaway from the round of testing was that my system was too different from what participants were used to.  Adding the stress of learning it on top of their project requirements meant that many people struggled and ultimately gave up on it.  Going forward this semester, I plan to run a user test outside of a class to get feedback on the system itself and its on-boarding before trying to integrate it into users’ workflows.