Subtle Emergences was shown at the Alternator Centre for Contemporary Art in Kelowna, BC, in April and May 2015. The show featured a number of technical innovations in addition to the artistic content, and I wanted to take some time to go over them here. For more information on the artwork itself, check out the main project page.
Arduino Setup
Overall, the project involved controlling 7 hanging fabric elements (driven by a total of 9 independent SMA actuators), 13 halogen lightbulbs, and 3 speakers. Additionally, 8 sensors fed data about the space into the control system, which modulated the behaviour of the elements. The code that ran both the Arduino microcontrollers and the Python (PyGame) control software is available at https://bitbucket.org/dkadish/subtle-emergences.
The software that drives the installation runs on a Mac Mini, with the Arduinos acting as intermediaries: they drive the control circuits and collect sensor data. A total of 4 Arduinos were used in the installation: 3 to control the lights and the hanging SMA sculptures and to collect light-level data from sensors embedded in the sculptures (fritzing file), and 1 to sense ambient room conditions (noise, light, temperature, humidity).
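To give a sense of that division of labour, here is a minimal sketch (in Python, using pyserial) of how the Mac-side software might send a command to one of the boards and read a sensor value back. The "SET"/"READ" line protocol, baud rate, and pin numbers are illustrative assumptions, not the installation's actual protocol, which lives in the Bitbucket repository linked above.

```python
# Hypothetical sketch of Mac-to-Arduino serial traffic; the command format is
# an assumption for illustration, not the project's real protocol.
import serial  # pyserial


def set_output(conn, pin, value):
    """Ask the board to drive `pin` (a light or SMA channel) at `value` (0-255)."""
    conn.write("SET {0} {1}\n".format(pin, value).encode("ascii"))


def read_sensor(conn, pin):
    """Request a light-level reading from a sensor on analog pin `pin`."""
    conn.write("READ {0}\n".format(pin).encode("ascii"))
    return int(conn.readline().strip())


if __name__ == "__main__":
    board = serial.Serial("/dev/ttyACM0", 115200, timeout=1.0)
    set_output(board, 3, 128)      # half brightness on pin 3
    print(read_sensor(board, 0))   # light level from analog pin 0
    board.close()
```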
Gestural Control
One of the technical/procedural innovations of the work was the use of gestural control to create the different behaviours for the hanging elements. I had begun by trying to create their motion graphically/numerically, drawing input curves in an attempt to elicit certain affects from them. I quickly realised that this was relatively futile. Each element had different motion characteristics, derived from differences in the fabric and acrylic arrangements as well as slight shifts in the quality of the SMA annealing.
A much better approach, I realised, was to adopt a more artistic, gestural tactic. I knew how I wanted the elements to move, and I could use my own internal feedback to accomplish it. The trick was a simple Arduino circuit that read a potentiometer and drove the control signal from that input, combined with a record button and a Python script that read the data over a serial connection and transcribed it for posterity. With that in place, I could turn a dial until I got the kind of motion I was looking for and then press a button to record it.
A circuit that allows a load (a motor, light, shape memory alloy, etc.) to be controlled by gesture (turning a dial), with a record button to save the motion.
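As a rough sketch of the recording half of that setup, the following assumes the Arduino streams potentiometer samples over serial as lines like "REC <value>" while the record button is held and sends "STOP" on release. The framing, baud rate, and CSV output format are assumptions for illustration; the actual script is in the repository.

```python
# Hypothetical recorder sketch: log timestamped potentiometer values streamed
# by the Arduino while the record button is held.
import time

import serial  # pyserial


def record_gesture(port="/dev/ttyACM0", outfile="gesture.csv"):
    conn = serial.Serial(port, 115200, timeout=1.0)
    start = None
    with open(outfile, "w") as f:
        while True:
            line = conn.readline().decode("ascii", errors="ignore").strip()
            if line.startswith("REC"):
                if start is None:
                    start = time.time()           # first sample marks t = 0
                value = int(line.split()[1])      # potentiometer reading, 0-1023
                f.write("{0:.3f},{1}\n".format(time.time() - start, value))
            elif line == "STOP" and start is not None:
                break                             # button released: gesture complete
    conn.close()


if __name__ == "__main__":
    record_gesture()
```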
Dynamic Arduino Identification
A final technical innovation is worth noting. I ran into difficulty as soon as I began to use more than one Arduino at a time to run the exhibition. With a single Arduino connected to the computer, it is easy to determine from Python which serial connection to open (on Unix with Leonardo boards, it was the one with ttyACM# in the name, where # is any number). Once more boards were attached, however, the numbers simply incremented and were not guaranteed to point to the same board each time. A light might be found on pin 3 of /dev/ttyACM0 one minute and on pin 3 of /dev/ttyACM1 the next time the installation started. Since I wanted the boot process to be automatic, this wouldn't do.
The solution was relatively simple. I used the EEPROM to permanently store a unique ID on each board (code at https://codebender.cc/sketch:196271). The Python code then connects to each available board and queries its ID number. Each controlled element is addressed by a pin and a board ID, and the board ID is mapped to a serial connection each time a connection is made, so the Python code can find the correct board every time. The Python code is not terribly well documented (hopefully I'll fix that soon enough), but it is available at https://bitbucket.org/dkadish/subtle-emergences/src/c2e7832e50fa/python/emergences/serial/?at=master
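Here is a condensed sketch of that mapping idea, assuming each board answers an "ID?" query over serial with the number stored in its EEPROM. The query string, baud rate, and startup delay are assumptions for illustration; the real implementation is in the repository linked above.

```python
# Hypothetical ID-mapping sketch: open every /dev/ttyACM* port, ask each board
# for its stored ID, and build a dictionary keyed by that ID rather than by
# the OS-assigned port number.
import glob
import time

import serial  # pyserial


def map_boards(pattern="/dev/ttyACM*"):
    """Return a dict of {board_id: open serial connection}."""
    boards = {}
    for port in glob.glob(pattern):
        conn = serial.Serial(port, 115200, timeout=2.0)
        time.sleep(2.0)              # give the board a moment after the port opens
        conn.write(b"ID?\n")
        board_id = int(conn.readline().strip())
        boards[board_id] = conn
    return boards


if __name__ == "__main__":
    # Elements are addressed by (board ID, pin), so the same physical board is
    # found no matter which /dev/ttyACM# number the OS assigned it this boot.
    for board_id, conn in map_boards().items():
        print(board_id, conn.port)
```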