Final Project - Info for Live Video Editor

Initially I was working with Yining and Gabriel on Singing Lamps, so I have user testing results for that. But I'm working with Sisa now on a live video editor.


LIVE VIDEO EDITOR - Sisa Bueno & Sweta Mohapatra

We are interested in creating a physical controller object capable of driving an interactive video. This gives users the ability to control a film by simulating live editing, which we think will provide a very engaging experience for users, as well as a new way to look at watching movies.

There are two principal elements to this project: a monitor, which will display a short film, and a controller, which will have certain commands to adjust the video. When the user makes certain movements with the controller, it produces corresponding results on screen. Move the controller to the left, and the camera's perspective shifts so that you explore the scene from a left angle. The same perspective change applies when the User moves the controller to the right, or angles it upward or downward. On the back end, this 3D effect is possible thanks to the use of depth-image applications such as DepthKit. This application helps sync a DSLR camera to a Kinect, and by stitching both images together, you create a 3D image that allows for perspective manipulation by the User.
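As a sketch of how tilt could map to that perspective shift, here's a minimal JavaScript function. The tilt range and offset scale are placeholder assumptions to be tuned against the real controller, not measured values:

```javascript
// Sketch: map a controller tilt reading (degrees) to a camera offset for
// the DepthKit scene. maxTilt and maxOffset are assumed defaults.
function tiltToOffset(tiltDegrees, maxTilt = 45, maxOffset = 1.0) {
  // Clamp the reading so an extreme tilt can't push the camera off-scene.
  const clamped = Math.max(-maxTilt, Math.min(maxTilt, tiltDegrees));
  // Linear mapping: -maxTilt..maxTilt -> -maxOffset..maxOffset.
  return (clamped / maxTilt) * maxOffset;
}
```

The same function would be called twice per frame, once with the left/right tilt for the horizontal angle and once with the up/down tilt for the vertical angle.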

In addition to exploring the perspective of the scene, the User will also be able to change the scene altogether. Give the controller a shake, and the video will cut to a completely different scene with a different character. In actuality, there are two short films playing concurrently, but the User is only able to see one film at a time on the monitor. The clips seem to be separate films that are independent of each other, with two distinct characters, but in reality the characters of both films conjoin at the end, which will be a payoff for the User. S/he will be using the controller not only to play with perspective, but also to intercut between two characters in separate scenes, only to discover that they actually come together at the end, and that s/he has just edited a short movie!
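The cutting logic above can be sketched as a small state machine: both films advance on one shared clock whether visible or not, and a shake only flips which film is shown, so cutting back reveals the other story mid-action rather than restarting it. (The class and method names here are our own illustration, not popcorn.js API.)

```javascript
// Sketch of the two-film intercutting logic.
class TwoFilmCutter {
  constructor() {
    this.visible = 0; // index of the film currently on the monitor (0 or 1)
    this.clock = 0;   // shared playback time in seconds for BOTH films
  }
  tick(dt) {
    this.clock += dt; // both films keep advancing, even the hidden one
  }
  shake() {
    this.visible = 1 - this.visible; // cut to the other film
    return this.visible;
  }
}
```

In the real build, `visible` would decide which of the two video elements is shown, while `clock` keeps their playback positions in lockstep.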

On the back end, all of the video assets will be controlled with JavaScript using popcorn.js (although we may initially prototype in Processing), and all of the components related to the controller will be managed via Arduino. On the front end, the User will interact with a website where the video plays, and the controller will respond in real time.
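One way to detect the shake gesture from the gyroscope/accelerometer is to watch for a sudden change in acceleration magnitude between successive readings. Here's a minimal sketch of that logic, shown in JavaScript for consistency with the rest of the stack, though it would ultimately run as an Arduino sketch; the threshold is a guess to be tuned against the real sensor:

```javascript
// Sketch: shake detection from raw accelerometer readings (m/s^2).
// A shake registers when the acceleration magnitude jumps by more than
// `threshold` between two successive samples. Threshold is an assumption.
function makeShakeDetector(threshold = 2.0) {
  let last = null; // magnitude of the previous sample
  return function sample(ax, ay, az) {
    const mag = Math.sqrt(ax * ax + ay * ay + az * az);
    const shaken = last !== null && Math.abs(mag - last) > threshold;
    last = mag;
    return shaken;
  };
}
```

At rest the magnitude sits near gravity (~9.8), so steady readings never trigger; a sudden jolt does.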

System Diagram 



Bill of Materials 


  • Video recorder - camera, Kinect, Kinect-to-DSLR mount, tripod, lights

    • Kinect-to-DSLR mount may need to be purchased - $100

    • Rest of equipment from the ER

  • Actors - pro bono work (pizza, beer, transportation) - $100

  • Sound recording - equipment from the ER (built into camera)

    Estimated max budget: $200


  • Arduino

  • Components

    • Gyroscope or accelerometer - track shaking and direction

    • Vibration motors (2) - vibrate when the controller bumps into something

    • Arduino

    • Breadboard

  • Case for components

  • Monitor to display (ER)

    Estimated max budget: $120



  • JavaScript

  • Arduino


Project Plan 

Week 1 (Nov 3)

  • Video script version 1

  • Technical analysis: components needed to build v1

  • Sketch of casing

  • Prototype: move around an image with an Arduino

Week 2 (Nov 10)

  • Finalize script

  • Find actors, setting, rehearsal

  • Purchase materials

  • Shake to change video

  • Playtest - paper prototype

Week 3 (Nov 17)

  • Finalize plan for filming and editing

  • Start filming

  • Vibration when bumping into something

  • Playtest - high fidelity

Week 4 (Nov 24) - Thanksgiving

  • Link together video and controller

  • Finalize filming and editing

  • Case for controller

  • Presentation script

Week 5 (Dec 1)

  • Final preparations

  • Panic!