PComp

Final Project Documentation - Stitching Stories

Here is the final documentation for our PComp final project. Our piece has a lovely name now, courtesy of Sisa!

Overview

Concept: Stitching Stories is the tale of a seamstress and her boss, but the user controls the non-linear narrative with a physical controller. Each of the controller's functions reveals a different part of the story that relates to that function in some way. 

Materials: Hardware - Bop It, Arduino Micro, ribbon cable, and other miscellaneous electronics

Software: Arduino and Processing

Hardware

Our initial concept was to fabricate a custom controller similar to a cube, but we found that it would have been expensive and limited in terms of interactions. I pitched the idea of hacking a Bop It - a popular handheld party game from the '90s with multiple controls - to control the video. We hacked one open and discovered that the Bop It is basically driven by a handful of individual push buttons; by resoldering those buttons to our own wires, we could hook them up to an Arduino that we could program ourselves.

At one point we thought about cutting the Bop It up to fit the Arduino Uno and breadboard we were using, but thanks to Xuedi's input we opted to keep the Bop It intact as the controller rather than build a custom one. She directed us to replace the Bop It's original contents with an Arduino Micro (thanks to Joe Mango for lending us his!) and a ribbon cable, which helped us contain all of the Bop It's wiring in a clean way. 

So I gutted the Bop It to give us a blank slate to work with: 

Sisa had created a prototype circuit to test the functions, which became the basic template for each function's circuit. Originally we planned to use all six functions - Bop It, Twist It, Pull It, Flick It, Spin It, and the mysterious purple button whose function I don't know - but since we had limited footage in our narrative, we opted to forgo the center button. I stuffed it with foam to maintain the integrity of the controller while rendering the button immovable, so the user can't press it by accident. 

I also hacked the purple pushbutton to prevent it from catching - as seen in the original picture above, the purple button stays pushed down once pressed, and you have to push it again to pop it back up. This caused some problems with the way the video event handlers were written in Processing, so I stuffed it with foam so that the user doesn't have to push twice to restore the button to its original state.

The foam forces the top to spring back so the metal hook doesn't catch

It took us a few days to put the hardware together (achievement unlocked: learned to solder!), and it was interesting trying to fit everything back inside the shell. Luckily the Bop It had built-in hooks to thread the cable through and hold it out of the way. 

Here is the final product. There are little tape tags labelled to show which button goes to which function, which made my life a lot easier when I reassembled the Bop It.

Additional photos and videos can be found in my Google Plus album, the link to which is in the Resources section of this post. 

Software

The software played the video: as the user played with the physical controller, the video on the screen switched to a different clip based on which function they triggered. 

After switching to the Bop It as our controller, we thought we'd have to use something like Unity or Max MSP to get this done, but the time limit forced me to write a Processing sketch to connect to the Arduino instead. This actually worked in our favor - Processing did the job quite well, with the exception of some performance issues where the video would stutter. I attribute this to a couple of things: lag from the serial communication with the Arduino, and the computer running too many things at once. I was able to minimize the effect by closing all other windows before running the program. Using smaller video files also helped. 

I used Processing's video library to load the videos, track which function was triggered on the Bop It, and change the video whenever that changed. I was also able to implement a 'Plan B' that Sam Slover recommended - the code also responds to keyboard input in case the Bop It failed. 
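As a rough sketch of how the pieces fit together - not the actual code (that's on GitHub), and assuming the Arduino Micro sends one character per Bop It function over serial (the characters, clip filenames, and port index below are hypothetical, not our real values) - the structure looked something like this:

    import processing.video.*;
    import processing.serial.*;

    Movie[] clips;      // one clip per Bop It function
    int current = 0;    // index of the clip now playing
    Serial port;

    void setup() {
      size(1280, 720);
      // Hypothetical clip names - one per function.
      String[] files = { "bop.mov", "twist.mov", "pull.mov", "flick.mov", "spin.mov" };
      clips = new Movie[files.length];
      for (int i = 0; i < files.length; i++) {
        clips[i] = new Movie(this, files[i]);
      }
      clips[current].loop();
      // Port index 0 is a guess - check Serial.list() for the Arduino Micro.
      port = new Serial(this, Serial.list()[0], 9600);
    }

    void switchTo(int i) {
      if (i == current) return;
      clips[current].stop();  // stop the off-screen clip to save CPU
      current = i;
      clips[current].loop();
    }

    // One character arrives per function, e.g. 'b' for Bop It.
    void serialEvent(Serial p) {
      char c = (char) p.read();
      if (c == 'b') switchTo(0);
      else if (c == 't') switchTo(1);
      else if (c == 'p') switchTo(2);
      else if (c == 'f') switchTo(3);
      else if (c == 's') switchTo(4);
    }

    // 'Plan B': the same switching mapped to keys in case the Bop It fails.
    void keyPressed() {
      if (key >= '1' && key <= '5') switchTo(key - '1');
    }

    void movieEvent(Movie m) {
      m.read();
    }

    void draw() {
      image(clips[current], 0, 0, width, height);
    }

In a sketch shaped like this, stopping the clip that is no longer on screen (rather than letting all five play at once) also helps keep the stuttering down.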

I've posted the code to GitHub; the link is in the Resources section below.

Lessons Learned

  • Worship the Residents - The residents are amazing. Bless the residents, in particular Sam and Xuedi. They were so helpful, and I don't think we could have done this without them. 
  • Keep Calm and Change On - this was the big takeaway for me. This project evolved enormously from the concept that Sisa pitched in class (and that got me interested), taking us down many paths in terms of both the technology used to implement it and the interaction model, since we changed the controller entirely. There was a lot of stress and pressure, but Sisa and I had a good deal of patience, which helped us get this done well and produce a working project.
  • Take Notes - documenting all the meeting notes in a shared Google Drive was so helpful because we could both see what was going on and share resources easily. Taking notes kept us on track and gave us something to go back to. 
  • Practice - going through this project made me much more comfortable with the concepts I learned in this class. Putting the circuit together was a great way to relearn the basics of electronics. I also got a good handle on serial communication via Processing, building on what I had done with Will for the midterm. 
  • Don't Fear the Solder - this was my first time soldering, so I was a little nervous about it. Even though we made a few mistakes, it worked out well in the end! I enjoyed it immensely.

Resources

Final Project Progress Report 2

We only have a week left? What?!

We scrapped the idea of coding in Max MSP/Unity and stuck to Processing. I was able to get the sketch working with help from Sam Slover. 

A video of the working Processing sketch can be found here.

Here's the list of things left to do - it's mostly around getting the container to work.

Things To Do

  • Sweta: Fade in/out transition when the user does an action 

  • Sisa: Video editing (4-5 videos about 2 minutes each)

  • Sweta: Audio - pick 3 choices from mobygratis.com (remote, EOD Thursday)

  • Container

    • Soldering the switches

    • Cutting up the Bop It

    • Cutting the container to put things in 

    • Putting things in

    • Beautification

Final Project Progress Report

For this week, we had planned to do the following: 

  • Video script version 1

  • Technical analysis: components needed to build v1

  • Sketch of casing

  • Prototype: move around an image with an Arduino

(items in bold are the ones I focused on)

I unfortunately didn't get much done on this - I looked at Popcorn.js and realized that I would need Node.js to do the serial communication no matter what. Since I hadn't had much luck with that in the previous lab, I just tried to get an accelerometer working in Arduino. I borrowed an ADXL335 from the ER and played around with it. It looks like it never reads 0,0,0 even on a flat surface - I was constantly getting nonzero readings. Reviewing my midterm helped because it also involved an accelerometer; I will have to use the map() function to recognize when the controller is flat. 
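Since the sensor never settles at exactly 0,0,0, the trick with map() is to rescale the raw readings into g's and then accept anything within a tolerance as 'flat'. Here's a minimal sketch of the idea in Processing - the calibration numbers and tolerance are made up and would have to be measured from the actual sensor:

    // Hypothetical ADXL335 calibration: raw analog readings at -1g and +1g.
    int RAW_MIN = 270;
    int RAW_MAX = 410;
    float TOLERANCE = 0.15;  // how far from 0g still counts as 'flat' - assumed

    // Rescale a raw axis reading to g's with Processing's map(). (Arduino's
    // map() is integer-only, so over there you'd map to e.g. -100..100.)
    float toG(int raw) {
      return map(raw, RAW_MIN, RAW_MAX, -1.0, 1.0);
    }

    boolean isFlat(int rawX, int rawY) {
      return abs(toG(rawX)) < TOLERANCE && abs(toG(rawY)) < TOLERANCE;
    }

    void setup() {
      // Quick check with made-up readings near the midpoint (about 0g).
      println(isFlat(341, 338));  // true: both axes within tolerance
      println(isFlat(400, 338));  // false: the x axis is tilted
    }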

I ordered an accelerometer as well as a few vibration motors to see if I could shake the accelerometer and register it as a shake. 

I also thought about the case a bit more and pitched the idea of a camcorder case to Sisa - since people already know the physical interactions of cameras and camcorders, and our idea involves video, the controller could take the shape of one. We could hollow out an actual device and put the Arduino and breadboard inside, or maybe 3D print one. We're still debating that. 

Sisa took a look at DepthKit for the video component and realized we would have to do everything inside it, so we may have more setbacks with the technology than we realized. Yeek. 

Current status: 


Final Project - Info for Live Video Editor

Initially I was working with Yining and Gabriel on Singing Lamps, so I have user testing results for that. But I'm now working with Sisa on a live video editor.

Description

LIVE VIDEO EDITOR - Sisa Bueno & Sweta Mohapatra

We are interested in creating a physical controller object that has the capability of producing an interactive video. This gives users the ability to control a film by simulating live editing, which we think will provide a very engaging experience as well as a new way to think about watching movies.

There are two principal elements to this project - a monitor, which will display a short film, and a controller, which will have certain commands to adjust the video. When the user makes certain movements with the controller, it produces certain results: move the controller to the left, and the camera's perspective changes so that you explore the scene from a left angle. The same perspective change applies when the user moves the controller to the right, or angles it upward or downward. On the back end, this 3D effect is possible thanks to depth-imaging applications such as DepthKit, which helps sync a DSLR camera to a Kinect; by stitching both images together, you create a 3D image that allows the user to manipulate the perspective.

In addition to exploring the perspective of the scene, the user will also be able to change the scene altogether. Give the controller a shake, and the video will cut to a completely different scene with a different character. In actuality, there are two short films playing concurrently, but the user is only able to see one film at a time on the monitor. The clips seem to be separate films, independent of each other with two distinct characters, but in reality the characters of both films conjoin at the end, which will be a payoff for the user: s/he will have used the controller not only to play with perspective but also to intercut between two characters in separate scenes, only to discover that they come together at the end - and that s/he has just edited a short movie!

On the back end, all of the video assets will be controlled with JavaScript using Popcorn.js (although we may initially use Processing), and all of the components related to the controller will be managed via Arduino. On the front end, the user will interact with a website where the video plays, and the controller will respond in real time.
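Since we may start in Processing, here is a minimal sketch of how the shake gesture could be detected there, assuming the Arduino streams an accelerometer magnitude over serial as newline-terminated numbers (the threshold, baud rate, and port index are assumptions, not tested values):

    import processing.serial.*;

    Serial port;
    float lastMag = 0;            // previous accelerometer magnitude
    float SHAKE_THRESHOLD = 2.5;  // jump (in g's) that counts as a shake - assumed
    int currentFilm = 0;          // which of the two concurrent films is showing

    void setup() {
      size(640, 480);
      // Port index 0 is a guess - list the ports with Serial.list() first.
      port = new Serial(this, Serial.list()[0], 9600);
      port.bufferUntil('\n');
    }

    void serialEvent(Serial p) {
      float mag = float(trim(p.readString()));
      // A sudden jump in magnitude reads as a shake: cut to the other film.
      if (!Float.isNaN(mag) && abs(mag - lastMag) > SHAKE_THRESHOLD) {
        currentFilm = 1 - currentFilm;
      }
      lastMag = mag;
    }

    void draw() {
      background(0);
      text("Now showing film " + currentFilm, 20, 20);
    }

The text() call is just a stand-in for the actual film playback.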


System Diagram 

Bill of Materials 

Video

  • Video recorder - camera, Kinect, Kinect-to-DSLR mount, tripod, lights

    • Kinect-to-DSLR may need to be purchased - $100

    • Rest of equip from ER

  • Actors - pro bono work (pizza, beer, transportation) - $100

  • Sound recording - equip from ER (within camera)

    Estimated max budget: $200

Hardware

  • Arduino

  • Components

    • Gyroscope or accelerometer - track shaking and direction

    • Vibration motor - vibrate when in contact with someone (2)

    • Arduino

    • Breadboard

  • Case for components

  • Monitor to display (ER)

    Estimated max budget: $120


Software

  • Javascript

  • Arduino


Project Plan 

Week 1 (Nov 3)

  • Video script version 1

  • Technical analysis: components needed to build v1

  • Sketch of casing

  • Prototype: move around an image with an Arduino

Week 2 (Nov 10)

  • Finalize script

  • Find actors, setting, rehearsal

  • Purchase materials

  • Shake to change video

  • Playtest - paper prototype

Week 3 (Nov 17)

  • Finalize filming and editing

  • Start filming

  • Vibration when bumping into something

  • Playtest - high fidelity

Week 4 (Nov 24) - Thanksgiving

  • Link together video and controller

  • Finalize filming and editing

  • Case for controller

  • Presentation script

Week 5 (Dec 1)

  • Final preparations

  • Panic! 

PRESENTATION: DEC 3



Midterm Documentation

I worked with Will Field on this project, a Make Your Own Jackson Pollock Masterpiece inspired by some of Will's work in Processing. The workflow went as follows: 

  • On the screen is a blank 'canvas' 
  • The user can fling paint at the screen using a 'paintbrush' - we had talked about attaching it to an actual paintbrush but used a breadboard as a substitute
  • Paint splatters onto the screen where the user flicked
  • To change the color, the user adjusts a 'palette' - another breadboard, with three potentiometers controlling the RGB value. An RGB LED shows the user an approximation of the color they selected. 

We divided this project into two parts: one centered on the splatter and the other on setting the color, and we then combined the two to get the whole thing functioning. Will worked on the physics of the splatter, and I tackled the color setting along with getting the serial communication to work.

We opted to use two Arduinos to make this - an Uno dedicated to the color and a Mini for the paintbrush. The paintbrush also carried an accelerometer to track movement; in Processing, the readings were converted to screen coordinates. 
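As an illustration of that conversion - not our exact code, and assuming the paintbrush Arduino sends comma-separated x,y readings over serial (the message format and port index are hypothetical) - mapping raw accelerometer values onto canvas coordinates might look like this:

    import processing.serial.*;

    Serial brush;
    float splatX, splatY;   // most recent mapped position
    boolean fresh = false;  // is there a new reading since the last frame?

    void setup() {
      size(800, 600);
      background(255);      // the blank 'canvas'
      // Port index 1 is a guess - with two Arduinos attached,
      // check Serial.list() to find the paintbrush.
      brush = new Serial(this, Serial.list()[1], 9600);
      brush.bufferUntil('\n');
    }

    void serialEvent(Serial p) {
      // Hypothetical message format: "rawX,rawY\n" with 0-1023 analog readings.
      String[] vals = split(trim(p.readString()), ',');
      if (vals.length == 2) {
        splatX = map(float(vals[0]), 0, 1023, 0, width);
        splatY = map(float(vals[1]), 0, 1023, 0, height);
        fresh = true;
      }
    }

    void draw() {
      // No background() call here, so the paint accumulates on the canvas.
      if (fresh) {
        noStroke();
        fill(0);                          // the palette Arduino would set this
        ellipse(splatX, splatY, 10, 10);  // stand-in for Will's splatter physics
        fresh = false;
      }
    }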

Here's what I learned: 

  • Spec sheets are not always the best documentation - I struggled to get the RGB LED to work; even though my prototype code worked perfectly, the LED just wouldn't light up! Even the resident couldn't figure out what was going on. It turns out the LED was common-anode - instead of going to ground, its shared leg had to go to power (which also inverts the PWM values you write to it). This was not made clear at all, even when I looked around the Adafruit website. Eek.
  • Prototype your circuit - I used 123D Circuits to check my circuit, which was a good way to verify my wiring and see where I could clean it up (I had built multiple versions with tangled wires, and it was much easier to see the clean configuration there).

Additional documentation and video can be found on Will's blog.