Control system version 1

Almost all non-human figures are equipped with an Adafruit HUZZAH32 - ESP32 Feather1 microcontroller, with figure #11 paired with a Teensy 4 board for its higher number of hardware serial ports. These microcontrollers control the activities of all actuators in all animated figures. The actuators are generally controlled either by motion controller code I have developed that runs on the ESP32, or by the motor drivers' built-in motion controllers. A few actuators receive direct instructions, for example setting the speed of a pump.

The ESP32 also provides Wi-Fi, allowing for reasonably convenient wireless connectivity, and all of the figures receive instructions over a wireless network.2 The transport layer used is UDP, and the communication protocol is based on OSC.3

The ESP32s receive OSC messages and bundles with instructions for actuator movement. There are three main types of instructions:

  1. Continuous change instructions, for example controlling the vacuum pumps' speed for the figures that use them, or dimming LED lights.

  2. Instructions meant to instigate physical movement caused by motorized actuators such as servos, smart servos, stepper motors or brushless DC motors. Since these actuators are mostly controlled by a motion controller running either on the ESP32 or on the motor driver, they generally receive messages like the following: [figure type, number of that type, which actuator, target destination, and the time to reach that destination.]

  3. Instructions modifying how the motion controller software responds to movement messages, for example the acceleration rate, speed and power.

An OSC message sent to figure #13 could look like this: /joint/1/motor/setAmp/axis/2 6.614.

This translates to a message for figure Joints number one, with the instruction in the category motor. The message will set the peak amperage (power) of the brushless motor actuating axis 2 of figure Joints number 1.

Here is one that causes movement: /joint/1/motor/axis/1 1.13.

This will cause the motor driving axis one of figure Joints number 1 to rotate from its current position to a position that is 1.13 revolutions from the zero point of the motor encoder.
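On the wire, such a command is an ordinary OSC packet. The sketch below is not the project's actual code; it is a minimal encoder following the OSC 1.0 binary layout (NUL-terminated address padded to four bytes, a type-tag string, then big-endian 32-bit floats), shown here only to make the packet structure concrete:

```python
import struct

def pad4(b: bytes) -> bytes:
    """Pad with NULs to a multiple of 4 bytes, as OSC alignment requires."""
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address: str, *args: float) -> bytes:
    """Encode an OSC message with float arguments.
    Layout: padded address, padded type-tag string, big-endian float32s."""
    packet = pad4(address.encode("ascii") + b"\x00")          # address + NUL, padded
    packet += pad4(("," + "f" * len(args)).encode("ascii") + b"\x00")  # e.g. ",f"
    for a in args:
        packet += struct.pack(">f", a)                        # big-endian float32
    return packet

# The move command above: axis 1 of Joints number 1 to 1.13 revolutions.
msg = osc_message("/joint/1/motor/axis/1", 1.13)
```

A receiver on the ESP32 side would walk the same layout in reverse: read the address up to the first NUL, skip the padding, then unpack one float per `f` in the type tags.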

Most actuator control uses ease-in ease-out acceleration implemented either in the programs running on the microcontrollers, or in the case of the brushless motors and smart servos, available as options provided by their drive boards.
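The exact easing curve used in the project is not specified here, but the principle can be illustrated with a standard smoothstep profile, which starts and ends a timed move with zero velocity. This is a generic sketch, not the code running on the microcontrollers:

```python
def ease_in_out(t: float) -> float:
    """Smoothstep easing: 0.0 at t=0, 1.0 at t=1, zero slope at both ends."""
    t = max(0.0, min(1.0, t))
    return t * t * (3.0 - 2.0 * t)

def position_at(start: float, target: float,
                elapsed_ms: float, duration_ms: float) -> float:
    """Interpolated position for a timed move with ease-in/ease-out,
    matching the 'target position plus time to get there' message style."""
    if duration_ms <= 0:
        return target
    return start + (target - start) * ease_in_out(elapsed_ms / duration_ms)
```

Run on every control-loop tick, this turns a single (target, duration) message into a smooth trajectory without any further network traffic.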

In developing compositions featuring the figures I have generally made use of a centralized control unit; in other words, a control computer that sends out messages to all figures taking part in an exhibition or concert. Both the control computer and all figures are connected to the same local area network, with the control computer assigned a static IP address. In this version of the control system, each of the figures connects using DHCP. After connecting to the local area network, each figure reports its IP and port number to the control computer. This automates the addressing of each figure, and the control computer will automatically register the presence of each figure taking part in the composition. After initially connecting, each figure periodically reports its motor positions, port, and IP, allowing monitoring of whether or not it is still connected. The reporting of IP and port number means that every time an exhibition or project is booted up, the destinations to which the control computer will send commands are verified and, if necessary, automatically updated.
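On the control-computer side, the self-reporting scheme amounts to a small registry keyed by figure name, refreshed on each periodic report. The class below is an illustrative sketch of that bookkeeping, not the project's implementation; the names and the timeout value are assumptions:

```python
import time

class FigureRegistry:
    """Track which figures are on the network and where to send commands.
    Figures report (name, ip, port) periodically; entries older than
    `timeout_s` are treated as offline."""

    def __init__(self, timeout_s: float = 30.0):
        self.timeout_s = timeout_s
        self._entries = {}  # name -> (ip, port, last_seen)

    def report(self, name: str, ip: str, port: int, now: float = None):
        """Called whenever a figure announces or re-announces itself."""
        self._entries[name] = (ip, port, time.time() if now is None else now)

    def destination(self, name: str):
        """Return (ip, port) for a figure, or None if it has never reported."""
        entry = self._entries.get(name)
        return entry[:2] if entry else None

    def online(self, now: float = None):
        """Names of figures heard from within the timeout window."""
        now = time.time() if now is None else now
        return [n for n, (_, _, seen) in self._entries.items()
                if now - seen <= self.timeout_s]
```

Because the registry is rebuilt from live reports at every boot, a figure that has received a new DHCP lease is addressed correctly without any manual reconfiguration.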

Speed

The main priority in developing the control system was to allow quick experimentation with movement sequences and patterns. The system is designed around a digital audio workstation, in this version Reaper4, as its primary tool to create and organize movement sequences. This is possible because Reaper has basic support for the OSC protocol. Reaper allows relatively quick creation and composition of movement sequences using the highly developed organizational methodology of a DAW. The choice of a digital audio workstation to organize the movement material is based on the need for agility when distributing movement material, sound and other artistic material in time. The graphical user interface of such software is geared towards efficiently manipulating material in time.

Reaper layout

Each Reaper track is named with its OSC destination. This destination corresponds to the name of each figure and, in most cases, the target actuator. For example, "/prickly/2/X". This means that the target figure is Prickly number 2, and the data is intended to control the motor moving the X-axis. The control data is generated using a dummy JS plug-in5 that is simply a large bank of faders ranging from 0.0 to 1.0. In the case of "/prickly/2/X," the first of the dummy plug-in's faders determines the position to move to, while the second specifies the time it will take the motor to move from its current position to the new one. This makes it possible to automate the target positions and the time to reach those positions. The ease-in and ease-out acceleration and deceleration is calculated and executed on the ESP32.

A MIDI note for each parameter is sent via a loopback MIDI port to an OSC translation program created in Max/MSP to trigger the transmission of the target position and the time to get there.

Reaper does not allow custom OSC messages but provides its own set of OSC addresses. I have written a minimal OSC configuration file for the Reaper OSC module that instructs Reaper to transmit the JS dummy plug-in's fader positions but no other parameters. On loading the Reaper session, it also performs a dump of track names and initial fader positions. The translation program receives this data, automatically matches the names of the tracks in the session, and uses keyword matching to route the parameters to a subpatcher that in turn relays the information to the correct figure.
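The keyword matching itself is simple: the first path segment of a track name selects the figure-specific subpatcher, and the rest of the path travels on unchanged. The sketch below illustrates the idea in Python rather than Max/MSP; the keyword list is hypothetical, standing in for the actual list in the translation patch:

```python
# Hypothetical keyword list; the real list lives in the Max/MSP translation patch.
FIGURE_KEYWORDS = ["prickly", "joint", "bones"]

def route(track_name: str):
    """Match a Reaper track name like '/prickly/2/X' against the figure
    keywords. Returns (keyword, remainder) for dispatch to the matching
    subpatcher, or None if no figure in the session matches."""
    parts = track_name.strip("/").split("/")
    if parts and parts[0] in FIGURE_KEYWORDS:
        return parts[0], "/".join(parts[1:])
    return None
```

A track whose first segment matches no keyword simply produces no routing, which is what makes muting or deleting tracks a safe way to take figures out of a session.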

Example keyword list from the translation program.

The commands are then transmitted via the Wi-Fi network as encoded OSC messages or bundles to the target figure, which parses the incoming message to control the relevant actuator as indicated by the OSC path. Using the automatically reported IP addresses, each command is sent only to the figure for which it is intended. When a figure receives a message, the program running on its ESP32 dispatches the OSC-encoded message to the relevant subfunction of the program running on its microcontroller. For example, a message prepended by /prickly/2/X is destined for the servo driving movement on figure #3: Prickly's X-axis. The advantage of this approach is that the translation program automatically updates depending on which figures are listed in the Reaper session. If a session only has track names related to a couple of figures, no data will be handled or transmitted to any other. It also makes it easy to switch off figures in an exhibition by merely muting the Reaper tracks relevant to those figures. In the Max/MSP patch, the OSC data is scaled to the appropriate value for each figure and transmitted over the network when triggered by the MIDI notes sent over internal MIDI ports from Reaper to Max/MSP.

There is some variation in how the control messages for the various figures are structured. For figures driven by RC servos or stepper motors, the structure is generally as follows: target figure, target motor, target position, and time to get there, for example /prickly/1/servo/1 0 <goalPosition> <timeToGetThere>.

For actuators that have a built-in motion controller, such as those in figures actuated by brushless DC motors, commands are generally sent as OSC bundles including instructions for target position, speed, acceleration and deceleration.

For example a message triggering movement by figure #13 would normally be an OSC bundle consisting of the following OSC messages:

/motor/trajSpeed/axis/0 2.725806 //setting the speed

/motor/trajDecel/axis/0 5.7 //setting the deceleration rate

/motor/trajAccel/axis/0 2.5 //setting acceleration rate

/motor/axis/0 0.708661 //setting target position and triggering the move
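An OSC bundle like the one above is a single UDP packet: a "#bundle" header, a time tag, and each message prefixed with its byte length. The encoder below is an illustrative sketch following the OSC 1.0 bundle layout, not the project's code, using an "immediate" time tag so the receiver executes all four messages on arrival:

```python
import struct

def pad4(b: bytes) -> bytes:
    """Pad with NULs to a multiple of 4 bytes (OSC alignment)."""
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address: str, value: float) -> bytes:
    """Single-float OSC message: padded address, ',f' type tag, big-endian float32."""
    return (pad4(address.encode("ascii") + b"\x00")
            + pad4(b",f\x00")
            + struct.pack(">f", value))

def osc_bundle(*messages: bytes) -> bytes:
    """OSC bundle: '#bundle' header, time tag 1 (= immediately),
    then each element prefixed with its int32 size."""
    packet = b"#bundle\x00" + struct.pack(">Q", 1)
    for m in messages:
        packet += struct.pack(">i", len(m)) + m
    return packet

# The figure #13 move above: speed, decel, accel, then the position
# message that actually triggers the motion.
bundle = osc_bundle(
    osc_message("/motor/trajSpeed/axis/0", 2.725806),
    osc_message("/motor/trajDecel/axis/0", 5.7),
    osc_message("/motor/trajAccel/axis/0", 2.5),
    osc_message("/motor/axis/0", 0.708661),
)
```

Bundling matters here because the trajectory parameters and the position command arrive atomically, so the motion controller can never start a move with stale speed or acceleration settings.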


A few commands are implemented as network broadcast messages. These are messages received by all figures irrespective of their IP address. One message requests all figures on the network to report their IP and port number to the control station; another requests all figures to report their current motor position and status; and lastly there is a global "off" message for figures that have actuators that can be energized without it being apparent in their behaviour, such as those using solenoid valves, to avoid them overheating.

Further possibilities

The realization of performances and exhibitions featuring the figures as part of this project has utilized the control system as implemented: a centralized control computer from which all figures receive their instructions and to which they report back their presence and status. However, since it is a networked system, nothing hinders the control logic from being distributed, or even running on one or more of the microcontrollers used to drive the figures' actuators. It is easy to imagine the figures communicating with each other, telling each other what they are doing and reacting to each other's actions. Such possibilities are inherent in the methodology of the control infrastructure, and may be pursued further in projects to come.

Priorities for composition

Given this project's focus on compositional control, the main reason for using a centralized control station and Reaper to generate control messages for the figures is ease of editing. I have found that ease of control and the ability to quickly edit movement sequences are far more critical than implementing technical features allowing more fine-tuned control of each actuator. To experiment with the movement material of one figure together with material from another, combined with audio, it is essential to be able to quickly manipulate the timewise relation between the various materials. In order to realize artistic content using a technically demanding infrastructure, avoiding as far as possible being bogged down in technical troubleshooting is paramount. DAWs are designed for quick editing of a timeline, so using one to organize movement material in addition to audio allows quick changes in compositional structure.

Beyond the agility of timewise organization of material, the system's main advantage is its expandability: more animated figures can be added simply by connecting them to the network. Secondly, since the motion control system (meaning the functions that regulate acceleration, deceleration, speed and power) is mainly implemented on each figure's microcontroller, minimal amounts of data have to be sent over the network. The bottlenecks I have experienced have not been on the Wi-Fi network but in the internal OSC traffic on the control computer, from Reaper to the Max/MSP translation patch. Still, because of the system's networked nature, there is no technical obstacle hindering the use of more than one control computer, or even having one of the figures control some or all of the others.

The exception

Up to this point, the only exception to this technical implementation is Figure #11: Bones. This figure is intended to be used by the performer Alwynne Pritchard in a solo performance of the composition Nether, allowing performances to take place without me being present. Therefore, this figure's control system is implemented as a standalone system that automatically starts a sequence of preprogrammed movements for the figure when electrical power is provided.

This figure uses a Teensy 4, initially chosen to allow the use of the Teensy audio shield for synthesis and audio playback. I wrote a simple sequencer program that runs on the Teensy and reads a list of JSON-encoded instructions, stepping through them in a timed manner, together with a matching Max/MSP patch that allows me to "record" movement instructions so that the material can be developed using Reaper. During development I could remotely transmit messages using SLIP-encoded serial communication, working on the movement material in the same system used for the rest of the figures.

After the movement sequence is recorded and uploaded to the Teensy, the JSON is parsed on the Teensy and transmitted internally on the microcontroller as OSC messages to the relevant motor subfunctions, maintaining compatibility with the technical ecosystem shared by all figures. (This figure can easily be put under Wi-Fi control by hooking it up to an ESP32 that forwards OSC commands over a serial connection to the Teensy. There is also a version running only under remote control, with an ESP32 controlling the actuators.)
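The source does not specify the exact JSON schema, so the sketch below assumes a hypothetical one: each step carries a time offset in milliseconds plus an OSC-style address and value, mirroring the messages used over Wi-Fi. The scheduling logic is the part worth illustrating; a sequencer tick collects every step that has come due since the previous tick:

```python
import json

# Hypothetical instruction format (the actual schema on the Teensy may differ):
# time offset in ms, OSC-style address, and a value for that actuator.
SCORE = json.loads("""
[
  {"t": 0,    "addr": "/servo/1", "value": 0.2},
  {"t": 500,  "addr": "/servo/1", "value": 0.8},
  {"t": 1200, "addr": "/servo/2", "value": 0.5}
]
""")

def due_steps(score, last_ms, now_ms):
    """Return the steps whose time offset falls in (last_ms, now_ms],
    i.e. what the sequencer should dispatch on this tick."""
    return [s for s in score if last_ms < s["t"] <= now_ms]
```

On the Teensy, each due step would be re-encoded as an internal OSC message and handed to the same motor subfunctions that handle network commands, which is what keeps the standalone figure compatible with the rest of the ecosystem.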

Additionally, this figure has a small built-in synthesizer and sample playback, implemented using a Teensy audio shield. This internal sound source was not used in the final composition.

 

A segment of the score running on the Teensy in Dog/God