Interactive digital landscape made for an art exhibit at Diode Gallery in Portland, August 2014.
This was a collaboration among a dozen digital artists, each of whom made an interactive installation. A single multi-touch monitor in the gallery broadcast touch events to every piece, so visitors controlled everything around them simultaneously.
My piece used terrain elevation data from the hills around Crooked Falls on the Missouri River, near Great Falls, Montana. A new layer of land was added with each touch; y-axis movement controlled the height of the drawing, and x-axis movement controlled zooming and a subtle day/night shift.
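The touch mapping can be sketched roughly like this. Everything here is hypothetical except the idea itself (y controls layer height; x controls zoom and the day/night blend); the parameter names, ranges, and screen size are assumptions for illustration:

```python
# Hypothetical sketch of the touch-to-terrain mapping. Only the mapping
# idea comes from the piece; names and ranges are assumptions.

def map_touch(x, y, screen_w=1920, screen_h=1080):
    """Map a touch point to drawing parameters, normalized to 0..1."""
    nx = x / screen_w
    ny = y / screen_h
    height = 1.0 - ny    # touches near the top of the screen draw taller land
    zoom = 0.5 + nx      # left-to-right movement zooms
    daylight = nx        # and subtly shifts the scene from day toward night
    return height, zoom, daylight

# A touch at the center of a 1920x1080 screen:
h, z, d = map_touch(960, 540)
```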
Collaboration with OMSI to create an exhibit for NASA about ICESat-2, March 2014
ICESat-2 is NASA's Ice, Cloud, and land Elevation Satellite-2, which uses a laser altimeter to measure, among other things, sea ice surface elevation and vegetation canopy height. This exhibit pairs a model of the ICESat-2 satellite with a custom altimeter that measures anything beneath it: visitors can pose as the model and altimeter move along a track overhead, then take home a printout of their personal elevation profile.
I enjoyed using the device below, which was quickly prototyped to send test data to my computer.
Augmented reality kiosk made for DorkbotPDX art show, September 2011. Built with Processing and the NyARToolkit. There were five markers on wooden blocks, and each one triggered a different visual result when recognized by the camera. My favorite was the one below, which moved like a fiber optic lamp.
Active from 2012 to 2016, this website complemented a new sustainability exhibit at OMSI. You could sign up and earn virtual badges for completing challenges, and hear local stories from people who had made changes for the better. The site was also available entirely in Spanish, and was built with Ruby on Rails.
Part of a renewable energy exhibit that opened at OMSI in November 2012. This interactive is paired with a physical turbine model, so you can move levers to adjust the blades and get a sense of how they work. The activity increases in difficulty as you learn about wind energy, and the screens are also available in Spanish.
One of seven software components in an exhibit about technologies that aid or assist human abilities. This one is a Kinect-based activity that allows visitors to jump or throw with augmented legs and arms. Built with Cinder and the Microsoft Kinect SDK.
Sisbro Studios, a wildlife filmmaking team, hired me to build this web activity to accompany their shark movie. You can upload a picture of yourself, move it into place, rotate and resize it to fit, and the result looks like you are diving with sharks.
Another of the seven software components in an exhibit about technologies that aid or assist human abilities. People with limited use (or absence) of their legs can ski with a device called a monoski or sit-ski, and this activity allows visitors to imagine that experience. Visitors sit in a custom-made seat and steer down the slope on-screen by leaning left or right.
This exhibit simulates the fast data transfer enabled by photonic chips, which use lasers instead of electricity to transmit data. Small videos are "downloaded" to a destination device quickly or slowly depending on how the visitor has manipulated the lasers. Built with Processing.
I helped with the Hand-Eye Supply float for the 2012 Starlight Parade by programming some sequences for the lights. The lights were already set up to turn on and off in response to MIDI input, so that was my starting point. I used Processing to mock up the grid of lights and keep track of which MIDI note controlled which light. Whenever an animation sequence turned a light on or off, a line was written to a text file specifying the light's ID (its MIDI note) and its new state. That text file could then be converted into a MIDI file with the handy csvmidi Unix command-line tool, which converts CSV (comma-separated values) to MIDI.
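The log-to-CSV step can be sketched like this. The event list, tick resolution, channel, and velocities are assumptions made for illustration; the CSV dialect is the one the csvmidi tool (from the midicsv package) consumes:

```python
# Sketch of the animation-log-to-MIDI pipeline described above.
# Assumed: 480 ticks per quarter note, MIDI channel 0, velocity 100.

PPQ = 480  # ticks per quarter note (assumed)

def events_to_midicsv(events):
    """events: list of (tick, midi_note, on) tuples, sorted by tick.
    Returns text in the CSV dialect that csvmidi converts to a .mid file."""
    lines = ["0, 0, Header, 0, 1, {}".format(PPQ), "1, 0, Start_track"]
    for tick, note, on in events:
        kind = "Note_on_c" if on else "Note_off_c"
        velocity = 100 if on else 0
        lines.append("1, {}, {}, 0, {}, {}".format(tick, kind, note, velocity))
    last_tick = events[-1][0] if events else 0
    lines.append("1, {}, End_track".format(last_tick))
    lines.append("0, 0, End_of_file")
    return "\n".join(lines)

# One light (MIDI note 60) blinking on at tick 0 and off at tick 480:
csv_text = events_to_midicsv([(0, 60, True), (480, 60, False)])
# The resulting file would then be fed to csvmidi, e.g.:
#   csvmidi lights.csv lights.mid
```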
Quick project made for a DorkbotPDX open mic night, December 2012. I used a Makey Makey and Processing to control sounds by turning cookies, loosely based on the idea of turntables. The left cookie changed samples, and the right cookie changed the direction of play. I had a working version going with Play-Doh by the time I made my first batch of cookies, and then discovered that sugar makes dough resistive rather than conductive! So I baked a batch of salt cookies and decorated them to look like monster cookies.
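Since a Makey Makey registers as an ordinary keyboard, each cookie arrives as a key event, and the control logic is a tiny state machine. This is a hypothetical sketch of that logic only; the key names and sample count are assumptions:

```python
# Hypothetical sketch of the cookie-turntable controls. A Makey Makey
# shows up as a standard keyboard, so touching a cookie produces a key
# event; the specific keys and number of samples are assumptions.

class CookieDecks:
    def __init__(self, num_samples=4):
        self.num_samples = num_samples
        self.sample = 0      # index of the sample currently playing
        self.direction = 1   # +1 plays forward, -1 plays reversed

    def on_key(self, key):
        if key == "left":    # left cookie: cycle to the next sample
            self.sample = (self.sample + 1) % self.num_samples
        elif key == "right": # right cookie: flip the playback direction
            self.direction = -self.direction

deck = CookieDecks()
deck.on_key("left")   # switch samples
deck.on_key("right")  # play the new sample in reverse
```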