Computational Design Lab


The Kinect sensor’s IR depth camera and RGB camera were accessed through Processing. The point closest to the sensor yields an x,y coordinate pair that controls the signals sent to the remote control. A red dot is drawn on screen to show the user the closest point, which is typically the hand or extended arm, as demonstrated.
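The closest-point scan described above can be sketched as a simple minimum search over the raw depth image. This is not the author's exact code; the method name and array layout are assumptions, based on the flat depth array (one value per pixel, row-major) that Shiffman's Kinect library exposes.

```java
public class ClosestPoint {
    // Scan a flat depth array (width * height, row-major) and return the
    // x,y of the nearest valid pixel. The Kinect reports 0 where it has
    // no reading, so zeros are skipped.
    public static int[] findClosest(int[] depth, int width, int height) {
        int bestX = -1, bestY = -1, best = Integer.MAX_VALUE;
        for (int y = 0; y < height; y++) {
            for (int x = 0; x < width; x++) {
                int d = depth[y * width + x];
                if (d > 0 && d < best) {
                    best = d;
                    bestX = x;
                    bestY = y;
                }
            }
        }
        return new int[] { bestX, bestY };
    }

    public static void main(String[] args) {
        // 3x2 toy depth image; the nearest valid pixel is at (2, 1).
        int[] depth = {
            900, 850, 0,
            700, 650, 400
        };
        int[] p = findClosest(depth, 3, 2);
        System.out.println(p[0] + "," + p[1]); // prints 2,1
    }
}
```

In the actual sketch the returned coordinates would be used both to draw the red dot and to derive the control signals.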

An Arduino microcontroller was used with an Adafruit Motor Shield to control two servos that physically turn the potentiometers your thumbs would normally move on the helicopter’s controller. The original handheld interface is replaced with motors that move in response to your gestures.
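One way to bridge the tracked hand position and the two servos is a linear remap from image coordinates to servo angles, in the style of Arduino's map() function. This is a hedged sketch, not the author's code: the 640x480 depth-image resolution and 0-180 degree servo range are standard defaults, and the axis assignments (x to yaw, y to throttle) are assumptions.

```java
public class GestureToServo {
    // Re-implementation of Arduino's map(): linearly rescale value from
    // the range [inLo, inHi] to [outLo, outHi].
    public static int map(int value, int inLo, int inHi, int outLo, int outHi) {
        return (value - inLo) * (outHi - outLo) / (inHi - inLo) + outLo;
    }

    // Assumed mapping: x drives the yaw servo, y drives the throttle
    // servo. The y axis is inverted so that raising the hand raises
    // the throttle.
    public static int[] toServoAngles(int x, int y) {
        int yaw = map(x, 0, 640, 0, 180);
        int throttle = map(480 - y, 0, 480, 0, 180);
        return new int[] { yaw, throttle };
    }

    public static void main(String[] args) {
        int[] a = toServoAngles(320, 240); // hand at the image center
        System.out.println(a[0] + "," + a[1]); // prints 90,90
    }
}
```

On the Arduino side, the two angles would simply be written to the servos each frame, centering both controls when the hand sits in the middle of the image.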

Both Arduino and Processing were used. The Processing library for the Kinect is by Daniel Shiffman and can be found here.

Find the project’s source code here.

Author: Shawn Sims
Category: Projects