Final project for Bridging Worlds, “Diffuse,” with Dan Melancon. A static physical object is “infected” with a virus that is projection-mapped onto its surface. The virus is a sketch mimicking chemical reaction-diffusion, written in Cinder by Dan. User interaction through a Kinect changes the parameters and behavior of the projection. Better video documentation coming soon.
For our final project in Bridging Worlds, Dan Melancon and I are making a reactive projection-mapping installation.
The basic ecosystem of our project is seen above: a Kinect monitors the environment around an abstract, polygonal shape constructed out of intricately folded paper. A chemical reaction-diffusion simulation, written in Cinder, will be projected onto this shape and will react to external stimuli in its environment. Seeming to ooze out of pores and crevices in the paper, the substance will change shape, size and color based on user interactions. Users will be invited to explore and determine whether they have control over the diffusion, or if it has a life of its own.
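For readers curious what a chemical reaction-diffusion simulation looks like in code, here is a minimal sketch of the Gray-Scott model, a common scheme for this kind of effect. This is plain C++ rather than Dan's actual Cinder sketch, and the grid size and parameter values are illustrative assumptions; the feed/kill parameters `f` and `k` are exactly the kind of knobs that Kinect input could drive to change the pattern's behavior.

```cpp
#include <array>

// Minimal Gray-Scott reaction-diffusion sketch (plain C++, not the
// actual Cinder code). Two chemicals U and V diffuse across a grid;
// V consumes U and slowly decays, producing organic, oozing patterns.
constexpr int N = 64;  // grid size (illustrative)
using Grid = std::array<std::array<double, N>, N>;

// 5-point Laplacian with wrap-around borders
double lap(const Grid& g, int x, int y) {
    int xm = (x + N - 1) % N, xp = (x + 1) % N;
    int ym = (y + N - 1) % N, yp = (y + 1) % N;
    return g[xm][y] + g[xp][y] + g[x][ym] + g[x][yp] - 4.0 * g[x][y];
}

// One simulation step. f (feed) and k (kill) reshape the pattern --
// these are the parameters an interactive input could modulate.
void step(Grid& u, Grid& v, double f, double k) {
    const double du = 0.16, dv = 0.08, dt = 1.0;  // diffusion rates, timestep
    Grid un = u, vn = v;
    for (int x = 0; x < N; ++x)
        for (int y = 0; y < N; ++y) {
            double uvv = u[x][y] * v[x][y] * v[x][y];  // reaction term
            un[x][y] = u[x][y] + dt * (du * lap(u, x, y) - uvv + f * (1.0 - u[x][y]));
            vn[x][y] = v[x][y] + dt * (dv * lap(v, x, y) + uvv - (f + k) * v[x][y]);
        }
    u = un;
    v = vn;
}
```

In a Cinder app, each step's V grid would be written into a texture and projected onto the paper form, with U = 1 everywhere and a small seeded blob of V to start the reaction.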
The Bridging Worlds “Morning, Noon and Night” assignment was to create an interaction that changes based on the time of day. Dan and I are continuing our work with servos and lasers to create a device that writes the time on the wall with a laser. As the day progresses, the motions of the servo motors get “sleepy,” drifting away from their initially rigid structure. Incrementing Perlin noise would likely be used to create this effect. Giving a robotic device personality through more realistic motion is a concept we are eager to explore in this project.
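The “sleepy” drift could be sketched like this. This is a hypothetical illustration, not our servo code: a simple 1-D value noise stands in for Perlin noise, and `sleepyAngle` blends a rigid target angle with a noise wobble whose amplitude grows with a `sleepiness` value derived from the time of day. The 30° maximum drift is an assumed tuning constant.

```cpp
#include <cmath>

// Deterministic integer hash mapped to (-1, 1]
double hashNoise(int i) {
    unsigned n = static_cast<unsigned>(i);
    n = (n << 13) ^ n;
    n = n * (n * n * 15731u + 789221u) + 1376312589u;
    return 1.0 - static_cast<double>(n & 0x7fffffffu) / 1073741824.0;
}

// Smoothly interpolated 1-D value noise (a stand-in for Perlin noise):
// sampling it with a slowly incrementing t gives a smooth wander.
double smoothNoise(double t) {
    int i = static_cast<int>(std::floor(t));
    double f = t - i;
    double s = f * f * (3.0 - 2.0 * f);  // smoothstep blend
    return hashNoise(i) * (1.0 - s) + hashNoise(i + 1) * s;
}

// sleepiness in [0, 1]: 0 = crisp morning writing, 1 = drowsy night scrawl.
// Returns a servo angle that drifts off the rigid target as sleepiness rises.
double sleepyAngle(double targetDeg, double t, double sleepiness) {
    double wobble = smoothNoise(t) * 30.0 * sleepiness;  // up to +/-30 deg of drift
    return targetDeg + wobble;
}
```

On each frame the device would advance `t` by a small increment and command the servo to `sleepyAngle(target, t, sleepiness)`, so the written digits stay legible in the morning and grow looser toward night.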
Inspirations include Reflectus and Matt Richardson:
The new concept assignment for Bridging Worlds involved a calendar/reminder system that functions without the use of screens.
The system will feature a small set of colored balls (whether lights or lasers is currently undecided) projected onto a small wall space, bouncing up and down. The lights or lasers will be individually mounted on servo motors. Each ball represents an event on the user's Google Calendar, up to the next 8 events. The Arduino Yun wirelessly pulls information from the Google Calendar API through Temboo, then parses the JSON data to find the time of each event. The difference between the current time and the event time is mapped to the height and rate of each ball's bounce: an event tomorrow will be a ball with a lower bounce height than an event 5 days from now. Currently, the data collection and parsing work, and one servo is correctly mimicking the bouncing motion. After our presentation tomorrow, we will clarify our idea and approach to completing the project in the next week.
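The mapping step can be sketched in a few lines. This is an illustrative stand-in for the arithmetic the Arduino sketch would do after parsing the JSON, not our actual code; `maxHours`, `maxDeg`, and the function names are assumptions.

```cpp
#include <cmath>

// Map hours-until-event onto a bounce height in degrees of servo travel.
// Closer events get a lower, "settling" bounce, like a ball running out
// of energy as the deadline approaches. maxHours/maxDeg are hypothetical
// tuning constants (here: a 5-day window, 60 degrees of travel).
double bounceHeight(double hoursUntil, double maxHours = 120.0, double maxDeg = 60.0) {
    if (hoursUntil < 0.0) hoursUntil = 0.0;          // event already started
    if (hoursUntil > maxHours) hoursUntil = maxHours; // clamp distant events
    return maxDeg * (hoursUntil / maxHours);          // linear map
}

// Servo angle at time t for a ball bouncing at the given height:
// the absolute value of a sine gives the repeated up-and-down of a bounce.
double bounceAngle(double t, double height, double rate = 1.0) {
    return height * std::fabs(std::sin(rate * t));
}
```

Each servo would then be driven with `bounceAngle(t, bounceHeight(hoursUntilEvent))` in its update loop, one ball per upcoming calendar event.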
Preliminary idea for art installation with Dan Melancon, conceived for our second assignment in Bridging Worlds. The user of the space will enter the room and interact with various sensors tracking their presence. These sensors will control servo-based mirrors, ultimately searching for alignment to connect one or multiple beams of light in the space.
VERY early prototypes
Assignment 1 – Create an experience that is driven by passive input gathered in a location. I am presenting a living landscape art exhibit.