High Power RGB LED Controller Shield

Subtle Emergences was a massive undertaking and, as often happens, a lot of work went into features that didn’t quite make it into the final project. One such side project was the development of an Arduino shield to control 1-watt-per-channel (3 W total) RGB LEDs.

The Problem

LED control is a relatively solved problem: LEDs need PWM controllers in order to fade in and out, and a number of solutions exist. Both Sparkfun (https://www.sparkfun.com/products/10615) and Adafruit (https://www.adafruit.com/products/1411) make PWM shields that can control 16 PWM channels (more, actually, because they can be daisy-chained) from a subset of the available Arduino pins. The problem is power. The TLC5940, the chip upon which the Sparkfun shield is based, can only drive a maximum of 130 mA per channel, which is fine if you are driving a servo motor or a small LED, but not if you have a 1 W (400 mA @ 2.5 V) per channel monster LED.
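The current figures above are simple power-law arithmetic, and a quick sketch makes the mismatch explicit. The 1 W and ~2.5 V numbers come from the text; the function name is mine:

```python
def channel_current(power_w, forward_v):
    """Current drawn by one LED channel: I = P / V."""
    return power_w / forward_v

led_current = channel_current(1.0, 2.5)  # 1 W at a ~2.5 V forward voltage
print(led_current)                       # 0.4 A, i.e. 400 mA per channel

TLC5940_MAX_A = 0.130                    # per-channel limit quoted above
print(led_current <= TLC5940_MAX_A)      # False: the TLC5940 alone can't drive it
```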

Higher-power PWM can be achieved by using MOSFETs to rapidly switch a high-current source, but mounting 16 MOSFETs on a shield would be tedious as well as time- and space-consuming.

The Solution

Fortunately, TI makes the ULN2803A, an 8-channel Darlington transistor array with a rated collector current of 500 mA, well above our 400 mA requirement. Two of those arrays, attached to the outputs of the aforementioned TLC5940, would give us 16 high-powered PWMed slots, enough to drive 5 RGB LEDs (3 channels × 5 LEDs = 15 channels). I designed a breadboard circuit to test the design and then moved on to a shield layout, designed for SMD components. At this point, I still have to solder the components onto the shield board, but since the breadboard test circuit worked, I’m relatively confident that the shield will be good to go as soon as I get the components in the mail!
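To see how 16 channels break down across 5 RGB LEDs, here is a minimal channel-allocation sketch. The sequential assignment is an assumption for illustration, not the shield's actual routing:

```python
def rgb_channels(led_index):
    """TLC5940 channel numbers (r, g, b) for one of the 5 RGB LEDs,
    assuming LEDs are wired to consecutive channels (an illustrative
    layout, not the real board routing)."""
    if not 0 <= led_index < 5:
        raise ValueError("only 5 RGB LEDs fit in 16 channels")
    base = led_index * 3
    return (base, base + 1, base + 2)

print(rgb_channels(4))  # (12, 13, 14) -- channel 15 is left spare
```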

The nice thing is that because the shield uses the same chip as the SparkFun PWM Shield, the library for that shield works with this one as well. That said, since this shield is specifically designed for LEDs, I will likely extend the library to deal specifically with RGB LEDs.

I’ll post more pics and results as soon as I get the final shield together.

The Making of Subtle Emergences

Subtle Emergences was shown at the Alternator Centre for Contemporary Art in Kelowna, BC in April and May 2015. The show featured a number of technical innovations in addition to the artistic content and I wanted to take some time to go over them here. For more information on the artwork itself, check out the main project page.

Arduino Setup

Overall, the project involved actuating 7 hanging fabric elements (with a total of 9 independent SMA actuators), 13 halogen lightbulbs, and 3 speakers. Additionally, 8 sensors fed data about the space into the control system, which modulated the behaviour of the elements. The code that ran both the Arduino microcontrollers and the Python (PyGame) control software is available at https://bitbucket.org/dkadish/subtle-emergences.

The software that drives the installation runs on a Mac Mini, and the Arduinos act as intermediaries in the process, driving the control circuits and collecting sensor data. A total of 4 Arduinos were used in the installation – 3 to control the lights and the hanging SMA sculptures and to collect light-level data from sensors embedded in the sculptures (fritzing file), and 1 to sense ambient room conditions (noise, light, temperature, humidity).

Gestural Control

One of the technical/procedural innovations of the work was the use of gestural control to create the different behaviours for the hanging elements. I had begun by trying to create their motion graphically/numerically, designing input curves in an attempt to elicit certain affects from them. I quickly realised that this was relatively futile: each element had different motion characteristics, derived from differences in the fabric and acrylic arrangements as well as slight shifts in the quality of the SMA annealing.

A much better approach, I realised, was to adopt a more artistic and gestural tactic. I knew how I wanted them to move, and I could use my own internal feedback to accomplish this. A simple Arduino circuit to read a potentiometer input and control the drive signal accordingly, combined with a record button and a Python script to read the data over a serial connection and transcribe it for posterity, did the trick. With that in place, I could turn a dial until I got the kind of motion I was looking for and then press a button to record it.
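In outline, the recording script looked something like the sketch below. It assumes the Arduino prints one potentiometer reading per line and sends a marker when recording stops; the function and marker names are mine rather than from the actual codebase, and the line source is injectable so the sketch works with a pyserial port or plain strings:

```python
import time

def record_gesture(lines, stop_marker="STOP"):
    """Collect (elapsed_seconds, value) samples from an iterable of
    serial lines until the stop marker arrives.

    `lines` can be a pyserial port read line-by-line, or any iterable
    of strings -- which keeps the sketch testable without hardware."""
    start = time.monotonic()
    samples = []
    for raw in lines:
        text = raw.strip()
        if text == stop_marker:
            break
        samples.append((time.monotonic() - start, int(text)))
    return samples

# Example with canned readings standing in for the serial port:
gesture = record_gesture(["512", "640", "700", "STOP"])
print([value for _, value in gesture])  # [512, 640, 700]
```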

A circuit that allows for controlling a load (a motor, light, shape memory alloy, etc) using gesture (turning a dial), and then a record button to save the motion.

Dynamic Arduino Identification

One final technical innovation is worth noting. I ran into difficulty as soon as I began to use more than one Arduino at a time to run the exhibition. With a single Arduino attached to the computer, it is easy to determine from Python which serial connection to open (on Unix with Leonardo boards, it was the one with ttyACM# in the name, where # is any number). However, once more boards were attached, the numbers simply incremented, and they were not guaranteed to point to the same board each time. So one light might be found on pin 3 of /dev/ttyACM0 one minute and pin 3 of /dev/ttyACM1 the next time the installation started. Since I wanted the boot process to be automatic, this wouldn’t do.

The solution was relatively simple. I used the EEPROM to permanently store a unique ID on each board (code at https://codebender.cc/sketch:196271). The Python code then connects to each available board and queries its ID number. Each controlled element is associated with a pin and a board ID, which is mapped to a serial connection each time the connection is made, so the Python code can find the correct board every time. The Python code is not terribly well-documented (hopefully I’ll fix that soon enough), but it is available at https://bitbucket.org/dkadish/subtle-emergences/src/c2e7832e50fa/python/emergences/serial/?at=master
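The ID-to-port mapping amounts to a small discovery loop. Here is a hedged sketch (the names are illustrative; the real code lives in the repository linked above), with the port-query step injectable so it can run without hardware:

```python
def map_boards(ports, query_id):
    """Build a {board_id: port} map by asking each attached Arduino
    for the unique ID stored in its EEPROM.

    `ports` is a list of device paths (e.g. /dev/ttyACM0, /dev/ttyACM1);
    `query_id` opens a port and returns that board's ID. In the real
    installation it would send a query over the serial connection and
    read the reply; here any callable will do."""
    mapping = {}
    for port in ports:
        board_id = query_id(port)
        if board_id is not None:
            mapping[board_id] = port
    return mapping

# Fake query standing in for real serial traffic:
ids = {"/dev/ttyACM0": 2, "/dev/ttyACM1": 1}
print(map_boards(ids.keys(), ids.get))  # {2: '/dev/ttyACM0', 1: '/dev/ttyACM1'}
```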

Textural Music Glove

The textural music glove interprets the texture of a material as music. A participant wears the glove and headphones and drags their left index finger across a surface. A piezoelectric element vibrates with the motion of the finger, and the vibrations are filtered by the circuit and then processed by the Arduino microcontroller, which plays the resulting music through the headphones.

The notes played by the system are based on a Markov chain generated from a piece of MIDI music (any MIDI file can be used to generate the transition table). As signals are generated by the piezoelectric material, the Arduino code aggregates their strength. Once the aggregate reaches a certain threshold, the next transition in the Markov chain is triggered. The result is music that is more staccato on a rougher surface and more melodic on a smoother one.
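The mechanism can be sketched in a few lines of Python (the actual implementation runs on the Arduino; the class name and threshold value here are illustrative, not from the project code):

```python
import random
from collections import defaultdict

def build_transitions(notes):
    """Markov transition table from a sequence of MIDI note numbers."""
    table = defaultdict(list)
    for current, following in zip(notes, notes[1:]):
        table[current].append(following)
    return table

class TexturePlayer:
    """Aggregate piezo signal strength and step the Markov chain each
    time the running total crosses the threshold."""

    def __init__(self, table, start_note, threshold=100):
        self.table = table
        self.note = start_note
        self.threshold = threshold
        self.total = 0

    def feed(self, strength):
        """Add one piezo reading; return the next note when triggered."""
        self.total += strength
        if self.total >= self.threshold and self.table.get(self.note):
            self.total = 0
            self.note = random.choice(self.table[self.note])
            return self.note
        return None

# Rough surfaces produce strong, frequent pulses, so notes come quickly:
player = TexturePlayer(build_transitions([60, 62, 64, 60, 62]), 60)
print(player.feed(120))  # 62 (threshold crossed; 60 only ever leads to 62)
print(player.feed(40))   # None (below threshold, strength keeps accumulating)
print(player.feed(70))   # 64 (total of 110 crosses the threshold again)
```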

The Textural Music Glove was used as part of Materiality.

The code for the Textural Music Glove is available on github at https://github.com/dkadish/textural-music/.