Living Pixels
Living Pixels takes a live camera feed and, frame by frame, replaces each pixel’s color with the color of a randomly chosen neighboring pixel. Each rendered frame is then fed back into the program in a loop, compounding the effect.
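The project’s source isn’t shown here, but the per-pixel rule can be sketched in plain Python. Everything below is an assumption for illustration: the function name `step`, a 2D grid of color values, and a 4-connected neighborhood with edge clamping.

```python
import random

def step(frame):
    """One Living-Pixels pass (illustrative sketch): every pixel takes the
    color of a randomly chosen neighboring pixel (4-connected, clamped at
    the image edges). Feeding the output back in as the next input
    compounds the distortion over time."""
    h, w = len(frame), len(frame[0])
    out = [[None] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # Pick one of the four neighbors at random; clamp at borders.
            dy, dx = random.choice([(-1, 0), (1, 0), (0, -1), (0, 1)])
            ny = min(max(y + dy, 0), h - 1)
            nx = min(max(x + dx, 0), w - 1)
            out[y][x] = frame[ny][nx]
    return out

# The feedback loop is simply: frame = step(frame), once per rendered frame.
```

Because each pixel copies a neighbor rather than averaging, colors migrate and cluster instead of blurring toward gray, which is what keeps the noise lively.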
My goal was to write a set of rules and let them run: to define the boundaries and the means by which my code would distort the images, but ultimately to let the code create an output I couldn’t fully predict or control.
Seeing Motion
The motion detection effect uses a secondary program that looks for pixel differences between the current frame and the previous frame that exceed a threshold set by the user. Pixels that meet the threshold are drawn directly on top of the current render, so moving objects remain visible through the noise. But as soon as those objects slow down or stop, their forms blend back into abstraction.
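The frame-differencing step described above can be sketched as follows. This is a minimal illustration, not the project’s code: the function name `overlay_motion` is invented, and pixels are assumed to be single grayscale values in 0–255 rather than full RGB.

```python
def overlay_motion(render, current, previous, threshold):
    """Frame differencing (illustrative sketch): wherever a pixel changed
    by at least `threshold` between consecutive camera frames, paint the
    current camera pixel directly onto the distorted render, so moving
    objects stay visible through the noise."""
    h, w = len(current), len(current[0])
    out = [row[:] for row in render]  # copy so the render isn't mutated
    for y in range(h):
        for x in range(w):
            if abs(current[y][x] - previous[y][x]) >= threshold:
                out[y][x] = current[y][x]
    return out
```

A low threshold makes even slight movement punch through the distortion; a high one means only fast motion is resolved, and stillness dissolves the viewer entirely.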
Interacting with Living Pixels
The live camera input and motion detection exist to make this effect visible and engaging in real time, giving viewers a way to interact with the distortion while still having no control over the final output. Imagine Living Pixels as a large screen installation with the camera pointed at the viewers, who can watch themselves become abstracted if they stay still too long.
Try it yourself below!
Created by Josh Hakman for ART-315: Art History Since 1945.