Walabasquiat is an interactive generative art installation using the Walabot 3D imaging sensor, Raspberry Pi, and Android.
Idea
In the mid-1990s, William Latham amazed the world with his Organic Art PC application and screensavers – introducing the public to bizarre, other-worldly forms rendered using cutting-edge genetic algorithms that continually mutate simple shapes into elaborate organic lifeforms. I’ve always been fascinated by generative art, and have long dreamt of creating an interactive installation where participants can influence the algorithms by their presence or movement. Walabasquiat uses Processing on Raspberry Pi and Android, with the Walabot sensor providing input parameters, to create a unique, ever-evolving tapestry of pixels in response to the movement of its viewers.
Getting Started
Getting the Walabot working on a Raspberry Pi is extremely straightforward: simply plug it into an available USB port via the included micro-USB cable (be sure you’re using a 2.5A+ power supply) and install the Walabot API. I like to use the CLI whenever possible, so from Terminal on the Pi itself, I ran:
sudo dpkg -i walabot_maker_1.0.34_raspberry_arm32.deb
to install the API, and then:
pip install WalabotAPI --no-index --find-links="/usr/share/walabot/python/"
Now that everything’s up and running, it’s time to make something cool with it!
Development Process
The first challenge was coming up with a way for Processing, which I wanted to use to create the generative art, to talk to the Walabot. I initially tried to integrate the Walabot API directly into my sketch using Processing’s Python Mode, but after experiencing difficulty with differing Python versions and other compatibility issues, I realized I should abstract the Walabot’s sensors via a RESTful API, which Processing (and any other network-enabled client!) could consume. I started putting together a Flask-based server, then stumbled upon @TheArtOfPour’s walabot-web-api, which was pretty much exactly what I was in the process of creating – although it was intended for use with Windows and the Developer version of Walabot, while I was using Linux and the Creator version, it was still quicker to modify it to work with my OS/hardware than to create my own from scratch! With a working RESTful API serving Walabot target data on my Raspberry Pi, I then switched over to the generative art portion of the project using Processing.
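The exact response schema of walabot-web-api isn’t shown here, so the following is a minimal sketch of the serialization step such a Flask endpoint performs, assuming each sensor target carries Cartesian coordinates and an amplitude. The field names and the `targets_to_json` helper are my own illustration, not the repo’s actual code:

```python
import json

def targets_to_json(targets):
    """Serialize a list of (x, y, z, amplitude) sensor targets into the
    kind of JSON payload a REST endpoint could return to Processing.
    Field names here are illustrative assumptions, not the actual
    walabot-web-api schema."""
    return json.dumps({
        "targets": [
            {"x": x, "y": y, "z": z, "amplitude": amp}
            for (x, y, z, amp) in targets
        ]
    })

# Example: two mock targets, roughly as the Walabot might report them (cm).
payload = targets_to_json([(10.0, -5.0, 42.0, 0.003),
                           (0.0, 12.5, 60.0, 0.001)])
print(payload)
```

Keeping serialization in a small pure function like this makes the endpoint trivial to unit-test without the sensor attached.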
I had been using the book Generative Art by Matt Pearson as a guide for harnessing Processing to create generative art, but in searching for examples I happened upon @hype’s HYPE Processing Library, which despite not being updated for over two years still worked perfectly, and provided exactly the kind of help I needed to create something that looked spectacular! I combined the generative functionality of HYPE with the JSON sensor data provided by the Flask-based RESTful API server to create beautiful representations of Walabot targets:
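The actual sketch is written in Processing, but the core idea of driving visuals from target data is a coordinate mapping: each target’s position in the Walabot’s arena is remapped into screen space, where HYPE elements can be seeded. As a hedged illustration (the arena bounds and function names below are assumptions, not values from the real sketch), here is that mapping in Python:

```python
def map_range(value, in_min, in_max, out_min, out_max):
    """Linear remap, like Processing's map() function."""
    t = (value - in_min) / (in_max - in_min)
    return out_min + t * (out_max - out_min)

def target_to_screen(x_cm, y_cm, width, height, arena=(-100.0, 100.0)):
    """Map a Walabot target's x/y position (cm, assumed arena bounds)
    to pixel coordinates where generative elements would be drawn."""
    lo, hi = arena
    px = map_range(x_cm, lo, hi, 0, width)
    py = map_range(y_cm, lo, hi, 0, height)
    return px, py

# A target at the arena center lands mid-screen:
print(target_to_screen(0.0, 0.0, 800, 600))  # (400.0, 300.0)
```

In the sketch, values like a target’s z distance or amplitude can be remapped the same way onto size, color, or mutation rate of the generated forms.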
Since Walabasquiat is intended as an art installation, with the Processing sketch being displayed on a large screen or projected, I thought it would be fun to provide a “souvenir” that would allow visitors to continue to enjoy the project even after they left. I created Walabasquiandroid, an Android live wallpaper, again using Processing for visuals and the same RESTful API to obtain the Walabot sensor values. The visualization is simpler in the Android app, so as not to use unreasonable amounts of CPU just to provide a pretty background, but it presents an attractive, generative display of the same targets that are informing the main piece, which can be enjoyed long after viewing the primary installation:
Walabasquiat Live Wallpaper
Steps to Repro
To recreate this project, simply connect the Walabot to the Raspberry Pi and install the API as outlined in Getting Started above; then, from Terminal on the Raspberry Pi, download and run the server:
sudo wget https://raw.githubusercontent.com/ishotjr/walabot-web-api/rpi/app.py
You can use curl to ensure that everything’s working:
curl --include http://192.168.1.69:5000/walabot/api/v1.0/sensortargets
HTTP/1.0 200 OK
Server: Werkzeug/0.11.15 Python/3.5.3
Date: Tue, 11 Sep 2018 04:06:12 GMT
In this example, the local IP address of the Raspberry Pi on my network is 192.168.1.69 – you can find yours using ip addr show.
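Any network-enabled client can consume the same endpoint. As a minimal sketch (the `targets` field name is an assumption based on the endpoint above, not the repo’s documented schema), a Python client might fetch and decode the target list like this:

```python
import json
from urllib.request import urlopen

# Substitute your own Pi's local IP address here.
API_URL = "http://192.168.1.69:5000/walabot/api/v1.0/sensortargets"

def parse_targets(body):
    """Decode a JSON response body into a list of target dicts.
    The 'targets' key is an assumed schema, not confirmed from the repo."""
    data = json.loads(body)
    return data.get("targets", [])

def fetch_targets(url=API_URL, timeout=5):
    """GET the sensor-targets endpoint and return the parsed targets."""
    with urlopen(url, timeout=timeout) as resp:
        return parse_targets(resp.read().decode("utf-8"))

# Offline example with a mock response body:
sample = '{"targets": [{"x": 4.2, "y": 0.0, "z": 55.0, "amplitude": 0.002}]}'
print(parse_targets(sample))
```

Separating parsing from fetching lets you exercise the client logic with mock payloads before the Walabot server is even running.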
Now for the art! If you don’t already have Processing installed on your Raspberry Pi, grab that first (again, I like using the CLI, but there’s an easier way if that’s not your thing!):
curl https://processing.org/download/install-arm.sh | sudo sh
Next, clone the Walabasquiat and HYPE library repos, and install the latter by unzipping it into the libraries folder in your sketchbook:
git clone https://github.com/ishotjr/Walabasquiat.git
git clone https://github.com/hype/HYPE_Processing.git
unzip HYPE_Processing/distribution/HYPE.zip -d ~/sketchbook/libraries/HYPE
Open Processing from under Graphics in the Raspberry Pi’s application menu, and use File > Open to load the sketch from your sketchbook: