Augmented reality kiosk made for DorkbotPDX art show, September 2011. Built with Processing and the NyARToolkit. There were five markers on wooden blocks, and each one triggered a different visual result when recognized by the camera. My favorite was the one below, inspired by a fiber optic lamp.
Launched in 2012, this website complements a new sustainability exhibit at OMSI. You can sign up and earn virtual badges for completing challenges, and hear local stories from people who have made changes for the better. The site is also available entirely in Spanish, and was built with Ruby on Rails.
Part of a renewable energy exhibit that opened at OMSI in November 2012. This interactive is paired with a physical turbine model, so you can move levers to adjust the blades and get a sense of how they work. The activity increases in difficulty as you learn about wind energy, and the screens are also in Spanish.
One of seven software components in an exhibit about technologies that aid or assist human abilities. This one is a Kinect-based activity that allows visitors to jump or throw with augmented legs and arms. Built with Cinder and the Microsoft Kinect SDK.
The excellent wildlife filmmaking team, Sisbro Studios, hired me to make a fun web activity to go with their shark movie. You can upload a picture of yourself, move it into place, then rotate and resize it to fit, so it looks like you are diving with sharks. Try it out.
Another of the seven software components in an exhibit about technologies that aid or assist human abilities. People with limited use (or absence) of their legs can ski with a device called a monoski or sit-ski, and this activity allows visitors to imagine that experience. Visitors sit in a custom-made seat and steer down the slope onscreen by leaning left or right.
This exhibit simulates the fast data transfer enabled by photonic chips, which use lasers instead of electricity to transmit data. Small videos are "downloaded" to a destination device quickly or slowly depending on how the visitor has manipulated the lasers. This is the first exhibit I made using Processing.
I helped with the Hand-Eye Supply float for the 2012 Starlight Parade by programming some sequences for the lights. The lights were already set up to turn on and off in response to MIDI input, so that was my starting point. I used Processing to mock up the grid of lights and keep track of which MIDI note controlled which light. Whenever an animation sequence called for a light to be turned on or off, a line was written to a text file, specifying the light ID (which was its MIDI note) and whether it was being turned on or off. That text file could then be used to create a MIDI file. The handy csvmidi Unix command line tool converted from CSV (comma-separated values) to MIDI.
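The text-file step above could be sketched roughly like this. This is an illustrative Python sketch, not the original Processing code; the row layout follows the midicsv text format that csvmidi reads, and the light IDs, tick values, and helper name are made up for the example:

```python
def light_events_to_csv(events, division=480):
    """Turn a list of (tick, light_id, on) tuples into midicsv-format text.

    Assumptions: each light ID is used directly as a MIDI note number,
    events are sorted by tick, and timing is in ticks at `division`
    ticks per quarter note.
    """
    rows = [
        "0, 0, Header, 1, 1, %d" % division,  # format 1, one track
        "1, 0, Start_track",
    ]
    last_tick = 0
    for tick, light_id, on in events:
        kind = "Note_on_c" if on else "Note_off_c"
        velocity = 127 if on else 0
        # channel 0; the note number is the light ID
        rows.append("1, %d, %s, 0, %d, %d" % (tick, kind, light_id, velocity))
        last_tick = tick
    rows.append("1, %d, End_track" % last_tick)
    rows.append("0, 0, End_of_file")
    return "\n".join(rows)

# Example: light 60 turns on at tick 0 and off one quarter note later.
csv_text = light_events_to_csv([(0, 60, True), (480, 60, False)])
```

The resulting text file would then be converted with something like `csvmidi lights.csv lights.mid`.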
In early 2010 I wanted a new web project, and my mom acquired many boxes of historical family documents, so we teamed up to create this family archives website. I made the site, and she types in the documents (and often scans and uploads pictures of them). She has entered over 1800! There are captain's logs from whaling ships, and letters by my grampa when he was a young dreamer, and much more.
Quick project made for a DorkbotPDX open mic night, December 2012. I used a Makey Makey and Processing to control sounds by turning cookies, loosely based on the idea of turntables. The left cookie changed samples, and the right cookie changed direction of play. I had a working version going with Play-Doh by the time I made my first batch of cookies, and then discovered that sugar makes dough resistive rather than conductive! So I made a batch of salt cookies and made them look like monster cookies.