As computing becomes more ubiquitous in our objects, designers need to be more aware of how to design meaningful interactions into electronically enhanced objects. At the University of Washington, a class of junior Interaction Design majors is exploring this question. These pages chronicle their efforts.

Monday, April 4, 2016

1) Dillon Baker - Drawing with Motion Tracking

I wanted to play around with motion tracking for this assignment. I began with a pretty sweet drawing program I found online, which was definitely more advanced than what I'm familiar with. The first step was to break down how the program worked, so I could modify it without messing everything up. It uses momentum and a bunch of other variables that determine how everything slides around, and it basically draws a spectrum of rectangles along the path the cursor takes while the mouse is pressed.

The original drawing program I found.

Instead of focusing primarily on changing the graphical output, I wanted to try to make this program work through motion tracking instead of dragging the cursor around. I tried messing with the output more at first, but there's so much going on that I'm not familiar with that I ended up just screwing it up. The original visuals were already really cool, so I decided to focus on changing the input instead. This was a pretty daunting task since I've never worked with a webcam before, much less used it as input for drawing.

My program tracks the motion of a particular object: it calibrates to a color picked from the webcam feed, searches for that color within a small radius of where it was in the previous frame, and uses the resulting (x, y) coordinates as the position for the drawing.

My additions were mostly contained within two parts: a calibration section, and a motion detection section.

The first part of the program allows the user to calibrate the motion tracking to a specific color. This part was more straightforward to make: open the webcam, display the feed, and prompt the user to choose a color with a makeshift eye dropper tool. It then saves that color code, and uses it in the following drawing program.
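The eyedropper step boils down to reading the packed color under the cursor out of the frame's pixel array (Processing's webcam feed exposes a row-major `pixels[]` array the same way). Here's a minimal sketch of that logic in plain Java; the class and method names are my own, not from the actual program:

```java
// Minimal eyedropper: grab the packed 0xRRGGBB color at the cursor position
// from a frame's row-major pixel array (like Processing's pixels[]).
public class EyeDropper {
    static int pickColor(int[] pixels, int frameWidth, int mouseX, int mouseY) {
        return pixels[mouseY * frameWidth + mouseX];
    }

    // Unpack the channels, e.g. to fill the enlarged preview circle.
    static int[] channels(int packed) {
        return new int[] {
            (packed >> 16) & 0xFF,  // red
            (packed >> 8) & 0xFF,   // green
            packed & 0xFF           // blue
        };
    }
}
```

Once the user clicks, the picked color is stored and handed to the drawing section as the tracking target.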



The code for the color calibration section.

Calibration scene. An enlarged circle displays the color of the pixel the cursor is over.

Tracking this color was really tough to get working right. At first, it would either jump around, latching onto remnants of that color elsewhere on the screen, or lose the object entirely when the lighting on it changed. It also started running pretty slowly when it was searching the entire frame 30 times a second.

I had to balance a few variables to make it as smooth as I could. First was the leniency on how close a pixel's color needed to be to count as the intended color. I also made it only search within a 30px radius of the coordinates from the previous frame, which sped up the program and kept it from jumping to a different part of the screen. I ran into glitch after glitch trying to get this to work, which honestly made this project take a lot longer than I wanted, but I finally found a balance of all these variables that makes a reasonably functional motion tracking drawing program.
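The search step can be sketched roughly like this, in plain Java rather than Processing. The names and the specific tolerance value are illustrative, not the actual code; only the 30px radius comes from the program described above:

```java
// Sketch of the color-tracking search: scan a small window around the last
// known position and return the pixel closest to the calibrated color.
public class ColorTracker {
    static final int SEARCH_RADIUS = 30;   // px window around the last position
    static final double TOLERANCE = 60.0;  // max RGB distance to count as a match

    // Euclidean distance between two packed 0xRRGGBB colors.
    static double colorDist(int c1, int c2) {
        int dr = ((c1 >> 16) & 0xFF) - ((c2 >> 16) & 0xFF);
        int dg = ((c1 >> 8) & 0xFF) - ((c2 >> 8) & 0xFF);
        int db = (c1 & 0xFF) - (c2 & 0xFF);
        return Math.sqrt(dr * dr + dg * dg + db * db);
    }

    // Returns {x, y} of the best match inside the window, or null if nothing
    // within TOLERANCE was found (e.g. the object left the frame).
    static int[] findTarget(int[] pixels, int w, int h, int target,
                            int lastX, int lastY) {
        double best = TOLERANCE;
        int[] found = null;
        for (int y = Math.max(0, lastY - SEARCH_RADIUS);
             y < Math.min(h, lastY + SEARCH_RADIUS); y++) {
            for (int x = Math.max(0, lastX - SEARCH_RADIUS);
                 x < Math.min(w, lastX + SEARCH_RADIUS); x++) {
                double d = colorDist(pixels[y * w + x], target);
                if (d < best) {
                    best = d;
                    found = new int[] { x, y };
                }
            }
        }
        return found;
    }
}
```

Limiting the loop to the window around the previous frame's hit is what buys both the speedup and the stability: the tracker physically can't jump across the screen in one frame.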

The majority of the code that tracks the color's motion and returns its coordinates for each frame.
The mousePressed() function calls one of these two sections depending on which part of the program is active. If there are a bunch of colors in the background of the webcam feed, the tracked object gets lost pretty easily, so I made it so that pressing "R" resets to the calibration section when that happens. Besides some global variables and other small stuff, the rest of the code is devoted to the color effects when drawing.
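The flow between the two sections amounts to a tiny two-state machine. Here's a hedged sketch in plain Java (invented names; the real sketch routes this through Processing's `mousePressed()` and `keyPressed()` handlers):

```java
// Two-mode flow: CALIBRATE until a color is picked, then DRAW;
// pressing 'R' drops back to calibration when tracking gets lost.
public class ModeSwitcher {
    enum Mode { CALIBRATE, DRAW }

    Mode mode = Mode.CALIBRATE;
    int targetColor;

    void mousePressed(int pickedColor) {
        if (mode == Mode.CALIBRATE) {   // eyedropper click: lock in the color
            targetColor = pickedColor;
            mode = Mode.DRAW;
        }
        // in DRAW mode, the press instead starts laying down the trail
    }

    void keyPressed(char key) {
        if (key == 'r' || key == 'R') { // reset if the tracker loses the object
            mode = Mode.CALIBRATE;
        }
    }
}
```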



Drawing with motion tracking.

I also made some small changes to the visuals. I changed the trail to fly upward, just 'cause it looked kinda cool. All I did was adjust some of the existing variables.

It's still a bit choppy. I am 100% sure there are better ways to do this, since this was my first attempt at something like this. Existing motion tracking programs take much more into consideration than just color, which definitely helps make a smoother path with fewer glitches. Going forward, I'd definitely factor in more than just color by proximity. But this was a super fun and rewarding chance to dive into a new area of coding in Processing!
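One cheap fix for the choppiness, short of a fancier tracker, would be to low-pass filter the tracked point before drawing with it. This isn't part of the program described above, just a common smoothing trick, sketched here in plain Java:

```java
// Exponential moving average of the tracked point, to damp the
// frame-to-frame jitter from noisy color matches.
public class PointSmoother {
    double alpha;      // 0..1; lower = smoother but laggier
    double x, y;
    boolean started = false;

    PointSmoother(double alpha) { this.alpha = alpha; }

    // Feed in each frame's raw tracked position; get a smoothed one back.
    double[] update(double rawX, double rawY) {
        if (!started) {
            x = rawX; y = rawY;          // snap to the first reading
            started = true;
        } else {
            x += alpha * (rawX - x);     // move a fraction toward the new point
            y += alpha * (rawY - y);
        }
        return new double[] { x, y };
    }
}
```

With `alpha` around 0.3 to 0.5 the drawn trail lags the object slightly but stops stuttering on single-frame misreads.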
