Generative Graffiti

Draw with physics and biometrics

In this project, I designed a simple art installation that invites participants to co-create a piece of generative graffiti. When a participant touches the device with their hand, their finger pressure and skin reflectance are recorded and translated into a colored moving particle on the screen. Each particle encapsulates the individual character of its participant and interacts with the others through gravity. As more people interact with the device, they collectively paint the canvas with colorful traces of the particles.

Inspirations

Inspired by the famous painting "The Creation of Adam", I decided to get a bit poetic/metaphoric and invite participants to touch the device with their fingers. Every touch gives birth to a new lively particle (data point) that encapsulates their mind and body at the moment of touching. Interestingly, such a simple interaction already lets us infer a lot about the person. How hard they press reflects their mood, physical strength, and even personality, while their skin tone and reflectance tell us about their body condition, ethnic background, and indoor/outdoor habits.

I also wanted the graffiti to be unique, so that different people painting it would produce completely different particle trajectories. I admittedly geeked out a little bit: instead of relying on simple randomness, I made the particles interact with each other according to Newton's law of gravity. Chaos theory tells us that this kind of n-body system is chaotic (mathematically unpredictable), so a slight change to the input results in drastic changes in the output. As more and more people touch the artifact, their data points leave colorful yet unique traces of movement that resemble a graffiti drawing.

Implementation

The hardware setup is relatively straightforward. I used 3 LEDs as light emitters and 2 photocells as light receivers for measuring skin reflectance. There is also a force-sensing resistor attached to the fingertips of a 3D-printed hand. Each sensor is wired as a voltage divider, and their outputs are read on analog pins 0 to 2.
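As a rough sanity check on the expected readings (the supply voltage and the 10 kΩ fixed resistor below are illustrative assumptions, not necessarily the exact parts I used), each divider outputs Vout = Vin × R_fixed / (R_sensor + R_fixed). With Vin = 5 V, a force-sensing resistor dropping from roughly 1 MΩ at rest to roughly 10 kΩ under a firm press swings Vout from near 0 V to about 2.5 V, i.e. from near 0 to about 512 on a 10-bit ADC.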

On the software side, I set up a local Node.js server as an intermediary. The sensor readings are first sent over the wire to the server, which then forwards them over a WebSocket to a webpage written in p5.js.
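A minimal sketch of that relay, assuming the board streams comma-separated readings over USB serial and using the serialport and ws npm packages (the port path, baud rate, and WebSocket port are placeholders, not my exact configuration):

```javascript
// relay.js — forward serial sensor readings to browser clients over WebSocket
const { SerialPort } = require('serialport');
const { ReadlineParser } = require('@serialport/parser-readline');
const { WebSocket, WebSocketServer } = require('ws');

// Open the serial connection to the microcontroller (path/baud are placeholders).
const port = new SerialPort({ path: '/dev/ttyUSB0', baudRate: 9600 });
const parser = port.pipe(new ReadlineParser({ delimiter: '\n' }));

// WebSocket server that the p5.js page connects to.
const wss = new WebSocketServer({ port: 8080 });

parser.on('data', (line) => {
  // Expect lines like "512,330,290" => [force, photocell1, photocell2]
  const [force, light1, light2] = line.trim().split(',').map(Number);
  const message = JSON.stringify({ force, light1, light2 });

  // Broadcast the latest reading to every connected page.
  for (const client of wss.clients) {
    if (client.readyState === WebSocket.OPEN) client.send(message);
  }
});
```

On the p5.js side, the page simply opens a WebSocket to this server and spawns a new particle whenever a reading arrives.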

For the data mapping in the visualization, the skin reflectance measurement is mapped to the hue of the particle, while the force of the finger press determines its mass. Together these set the appearance and initial conditions of a particle when it enters the canvas.
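In p5.js terms, that mapping could look roughly like the sketch below; the raw sensor ranges, HSB bounds, and initial velocity are assumptions for illustration, not calibrated values:

```javascript
// Map raw readings to a particle's visual/physical properties (p5.js).
// Assumed raw ranges: reflectance and force are 0-1023 from a 10-bit ADC.
function particleFromReading(reflectance, force) {
  colorMode(HSB, 360, 100, 100);
  const hue = map(reflectance, 0, 1023, 0, 360); // skin reflectance -> hue
  const mass = map(force, 0, 1023, 1, 10);       // press force -> mass
  return {
    color: color(hue, 80, 100),
    mass: mass,
    pos: createVector(0, 0),   // particles enter from the top-left corner
    vel: createVector(2, 1),   // assumed initial "shot" velocity
    acc: createVector(0, 0),
  };
}
```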

I wrote a very basic physics engine to calculate the interactions among the particles. A "black hole" at the center confines the particles within the canvas, and all newly generated particles are shot in from the top-left corner. On every tick, the O(n²) algorithm computes all the forces acting upon every particle and updates its acceleration, velocity, and position accordingly.
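A minimal sketch of one tick of that loop, with the gravitational constant, the black hole's mass, and the softening term chosen arbitrarily for illustration rather than taken from my actual tuning:

```javascript
// One update step of the toy n-body engine (O(n^2) pairwise gravity, p5.js vectors).
const G = 0.5; // arbitrary gravitational constant for the simulation
const blackHole = { pos: createVector(width / 2, height / 2), mass: 500 };

function tick(particles) {
  for (const p of particles) {
    p.acc.set(0, 0);
    // Accumulate the pull from every other particle plus the central "black hole".
    for (const other of [...particles, blackHole]) {
      if (other === p) continue;
      const dir = p5.Vector.sub(other.pos, p.pos);
      const distSq = Math.max(dir.magSq(), 25); // softening to avoid huge forces up close
      // Newton's law: F = G * m1 * m2 / r^2, so the acceleration on p is G * m2 / r^2.
      const a = (G * other.mass) / distSq;
      p.acc.add(dir.normalize().mult(a));
    }
  }
  // Integrate velocity and position only after all forces are known.
  for (const p of particles) {
    p.vel.add(p.acc);
    p.pos.add(p.vel);
  }
}
```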