MFA project at SVA // Concept + Coding (Processing)

A collaboration with Luke Stern. Merging location data with a modified Kinect camera image, we generated an ambient 'quantified self' visualization with a competitive twist. Data for each user (in the prototype, me on the left and Luke on the right) is pulled from the Moves app API, and the Kinect camera is accessed through the OpenNI library for Processing. The activity and location data modify the Kinect image to indicate progress towards a daily target of 8,000 steps or 5 kilometers traveled, with the top of the screen representing the target. By tracking the point closest to the camera, the app lets users switch between different days and between step and distance data. Stepping towards the camera brings up maps of their movement around New York City, and stepping closer still, an info screen displaying text data.
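The interaction logic described above can be sketched as two small pure functions: one that maps the closest depth reading to a display mode, and one that maps a step count to a fraction of the daily target (which would drive how far up the screen the visualization fills). This is a hedged illustration, not the project's actual code; the threshold values and the `DepthModes` class are assumptions for the example.

```java
// Illustrative sketch of the depth-driven interaction (not the project's code).
// Depth thresholds are in millimetres and are assumed values.
public class DepthModes {

    enum Mode { PROGRESS, MAP, INFO }

    // Map the closest point's depth to a display mode: far away shows the
    // ambient progress view, stepping in shows the map, closer still the info screen.
    static Mode modeForDepth(int closestMm) {
        if (closestMm < 900)  return Mode.INFO;
        if (closestMm < 1500) return Mode.MAP;
        return Mode.PROGRESS;
    }

    // Fraction of the 8,000-step daily target, capped at 1.0;
    // the top of the screen corresponds to 1.0.
    static float progress(int steps) {
        return Math.min(1f, steps / 8000f);
    }

    public static void main(String[] args) {
        System.out.println(modeForDepth(2200)); // user far from the screen
        System.out.println(progress(4000));     // half the daily target
    }
}
```

In the real sketch these values would come from the OpenNI depth image (closest point) and the Moves API response (step count) each frame.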

The underlying idea is to visualize activity data ambiently. The app would appear on a screen in the home or at a workplace, and by stepping up to it the user could check on their daily progress.

The prototype is a proof of concept with two main limitations: users are hard-coded in rather than able to authenticate themselves, and the visualization is constrained to two users.