Inspired by the amazing images and brilliant write-up by Phillip Burgess at Adafruit, I decided to give my Raspberry Pi something interesting and colourful to do – light painting. A happy day of geeking saw my brother and I pull together a wireless remote-controlled, battery-powered, Raspberry Pi light painting robot. It’s not perfect, but we think the initial results are pretty good for a day’s work and I thought I’d share some notes and photos in case anyone else is interested.
A few days earlier I had downloaded the PDF guide from Adafruit and used it to work out a kit list for mail order (LED strip, connectors, Raspberry Pi GPIO cable etc). I already knew I could run my Raspberry Pi from a USB battery and figured I could keep things simple by powering the Pi and LED strip from the same battery (the two-port Tecknet iEP387 battery is fantastic). The Edimax Wi-Fi dongle in the Pi would also let me control it without wires (although admittedly within the confines of my home wireless network to start with). After downloading/installing the Occidentalis operating system image for the Pi and setting up Wi-Fi, we had a battery-powered Pi which we could administer from my Windows laptop using PuTTY (for terminal sessions using SSH) and WinSCP (to copy files to/from the Pi). It was then easy to copy across and run the Python script that does the magic of converting image files into “RGB slices” and sending the necessary control signals to the Raspberry Pi GPIO ports to drive the LED strip.
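The heart of that script is the image-to-"RGB slices" conversion, which can be sketched in a few lines of pure Python. To be clear, this is my reconstruction of the idea rather than Adafruit's actual code: the function name is made up, and I'm assuming the LPD8806-style strip format the guide describes (7-bit colour values with the high bit set, sent in green-red-blue order).

```python
def to_slices(pixels):
    """Convert a grid of (r, g, b) rows into per-column LPD8806-style byte slices.

    pixels: list of rows, each row a list of (r, g, b) tuples (0-255),
    one row per LED on the strip. Returns one bytearray per image column;
    the script clocks out one slice per step as the strip moves.
    """
    height = len(pixels)
    width = len(pixels[0])
    slices = []
    for x in range(width):
        out = bytearray()
        for y in range(height):
            r, g, b = pixels[y][x]
            # LPD8806 expects 7-bit values with the top bit set, in G-R-B order
            out += bytes([(g >> 1) | 0x80, (r >> 1) | 0x80, (b >> 1) | 0x80])
        slices.append(out)
    return slices
```

Each slice then just needs to be written out to the strip (via SPI or bit-banged GPIO), followed by a latch, before moving on to the next column.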
For the LEDs, we soldered a 4-pin JST connector onto the input side of the LED strip as shown in the Adafruit guide (only we mixed up the yellow and green wires, which caused a bit of confusion until we figured that out). On the opposite side of the JST connector, we connected the red and black power wires to a standard USB A plug (via a cannibalised mobile phone charging connector) and the green (data) and yellow (clock) wires into the Raspberry Pi GPIO connector via jumper leads.
Test time! I used GIMP to create a 32-pixel-high image – the word “Finventing” in white on a black background (with an extra 1-pixel black column on the right so that the light painting “ended” with all LEDs off). Here’s the image I created and copied over to the Pi:
After diagnosing and fixing the swapped green and yellow leads, we ran the script (sudo python lightpaint.py) and saw the LEDs flash into life. Keen to try out light painting, I grabbed my digital SLR camera, stuck it on a tripod and set it to Manual mode with a 5-ish second exposure and a 2-second delayed shutter release (to let me get into position before the shutter opened). I timed the shutter press to anticipate the start of the sequence and walked across the living room holding the Raspberry Pi and battery pack at shoulder height with the LED strip hanging down. It worked better than expected – you’ve seen the result at the start of this post.
Excited by this progress, we raced to get a robot doing the painting before my brother had to catch his last train home. To get our robot platform up and running, we assembled the DFRobot “Pirate” kit and lashed up a remote control system for it, reusing stuff from previous projects (an E-Sky ET6 001726 remote control transmitter and EK2-0246 receiver connected to a Pololu TReX Jr motor controller). This did the job, provided we orientated the transmitter 90 degrees clockwise (so that the robot direction and control stick direction were aligned) – good enough for a first attempt.
But how to mount the LED strip vertically from the robot? Although it’s quite light, the strip is a full metre tall and nothing to hand would obviously do the job of a mast. After much head scratching, we settled on cannibalising a telescopic antenna from an old remote control toy and extended it further by attaching a length of coat hanger using an electrical terminal connector block. We lashed on a couple of guy ropes made of string to help brace the mast, tidied the battery and Raspberry Pi into the robot chassis and we were ready to try painting.
But what to paint?
“Let’s do Pac-Man” said my brother, so a Google Images search found this classic Pac-Man image (resized to 32-pixels tall):
With bro on robot remote control duty and me acting as photographer, we set about overlaying the living room with a ghostly Pac-Man image. Here’s how it turned out:
Pretty cool I reckon! In hindsight the ghosts are upside down, but for next time we could simply flip the image (or adjust the Python script) to solve that. For a one-day project we were happy enough.
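Fixing the orientation in the script rather than in GIMP would only take a couple of lines. A minimal sketch, assuming the image is held in memory as a list of pixel rows (the function names are mine, not the script's):

```python
def flip_vertical(rows):
    """Flip a list-of-rows pixel grid top-to-bottom, so an image that
    paints upside down comes out the right way up."""
    return rows[::-1]

def flip_horizontal(rows):
    """Flip left-to-right – handy if the robot drives the opposite way."""
    return [row[::-1] for row in rows]
```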
I’m so taken with this concept I’m looking forward to doing more of this stuff. Key changes planned for the future:
- Get the robot running faster in a straight line. It can hold a straight line at slow speeds, but veers off course at higher speeds.
- Link motion control with image display control and possibly image capture (currently they’re all independent and require manual synchronisation).
- Establish an out-of-home control approach for the Raspberry Pi, so that I can do light painting “on location” at London landmarks (might involve using my phone as a portable Wi-Fi hotspot).
- Create a web interface for the Pi (to select the image, choose orientation/direction/speed, turn LED display on and off and shutdown the Pi safely once finished – feels weird issuing the text command “sudo shutdown -h now” to a robot!). Possibly use a web interface to control the robot motion too.
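That last item could start out very simply with Python's built-in http.server, mapping a few URLs to the commands I currently type over SSH. This is just a sketch of the idea – the paths and command list are hypothetical, and a real version would want some authentication before exposing shutdown to the network:

```python
import subprocess
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical URL-to-command map; a real interface would add
# image selection, orientation and speed options
COMMANDS = {
    "/paint": ["sudo", "python", "lightpaint.py"],
    "/shutdown": ["sudo", "shutdown", "-h", "now"],
}

def command_for(path):
    """Look up the command for a request path, or None if unknown."""
    return COMMANDS.get(path)

class PiHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        cmd = command_for(self.path)
        if cmd is None:
            self.send_error(404)
            return
        subprocess.Popen(cmd)  # fire and forget
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"OK\n")

# On the Pi, run the server and browse to http://<pi-address>:8000/paint
# from a phone on the same Wi-Fi:
#     HTTPServer(("", 8000), PiHandler).serve_forever()
```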
Hope this stuff is of interest to people. Did this help or inspire you? Have you done other cool light painting or robotics stuff? I’d love to hear from you in the comments.