Overview

This post documents a project undertaken in Spring ’19 as an Associate Musician for Drake Music – leaders in music, disability and technology.

I’ve been involved in the DMLab North West – a monthly meet-up open to anyone with an interest in making music more accessible via new technology – since its inception in Summer 2016.

Tim Yates (Drake Music Program Leader, Research and Development) invited participation in a new collaborative project between DM and Brighter Sound – a pioneering music charity based in central Manchester – and specifically their Swan Street Collective project – a musical ensemble of young people with disabilities. The project ran over a series of ~12 sessions from March 2019 and involved experimentation, improvisation and composition, culminating in a special performance in the Barbirolli Room at the Bridgewater Hall on Sunday 23rd June 2019.

Billy Payne and I from DMLab North West joined the project to adapt and develop accessible instruments for the Collective that could be integrated into the sessions and final performance.

Initial Ideas

We took part in an initial planning meeting in early March and then met up a couple of times (before attending project sessions from early May) to discuss ideas and develop possible approaches.

We agreed:

  • we’d each independently develop but share resources for a hand-held, wireless MIDI, distance sensor based device – an alternative to the usual ‘air-harp’ type instrument where the sensor is mounted in the device and then controlled by moving a hand above it. This alternative approach could potentially be far more flexible – using movement between the device and the other hand (in an accordion-like movement), the body or any other surface, while also freeing the interaction from cables and the tabletop;
  • I would explore approaches for converting voice to MIDI – Tim Chatterton & Kenton Mann from Brighter Sound had flagged up interest by potential participants in using vocalisation.

Billy and I also made a wish list of interface issues we’d like to try and address:

  • integrating a visual indicator for notes in the scale against distance – likely via a NeoPixel Ring;
  • integrating haptic feedback for a more subtle but also potentially more intuitive indication of approaching/reaching notes in the scale – via a mini exciter or an LRA motor and DRV2605L haptic controller module;
  • realising more musical expressivity by controlling portamento – the glide between notes;
  • making our devices musically ‘scaleable’ by integrating Chris Ball’s Scale Manager Arduino library.

At an initial session, project lead Kenton Mann also suggested developing an interface based on gestural control – which I thought could use older but still very usable technologies such as the Wii Remote + Wii MotionPlus or Leap Motion controllers.

Development

I subsequently researched, prototyped and developed three different devices/interfaces:

  • a gesturally controlled MIDI harp – using openFrameworks v0.10.1 and a Leap Motion controller;
  • a voice to MIDI converter – using a Teensy 3.6, Audio Adaptor Board, Audio Library and a cheap microphone headset;
  • and a hand-held, wireless MIDI, distance sensor instrument – integrating an Adafruit Huzzah ESP8266 breakout, a Pololu VL6180X Time-of-Flight distance sensor, a Pimoroni Haptic BZZZ DRV2605L linear actuator haptic breakout, an Adafruit LIS3DH triple-axis accelerometer, a NeoPixel 24 x LED ring, a 1000mAh LiPo battery and an Adafruit Powerboost 1000C charger module + other passive components.

Each of these is documented in more detail below.

Leap Motion MIDI Harp

The Leap Motion controller promised much for the future of gestural control on its public launch circa 2014… but subsequently languished (IMO it was always a bit ‘clunky’ and never really lived up to its hype). Acquired by UltraHaptics in 2019 and rebranded as Ultraleap, it has found a new lease of life in the burgeoning VR market… although the latest V4 SDK is Windows only.

You can still pick one up on eBay for under £50 and the older V2 (Legacy Desktop Apps) software and SDK for macOS is still available from https://developer.leapmotion.com – you have to sign up for a developer account.

I coded the project in the creative C++ toolkit openFrameworks (oF) v0.10.1 on a MacBook Pro 15” (Late 2013) running macOS 10.14.6 + an Xcode version pre 11.3.1 (I’ve since updated my laptop and am not quite sure which Xcode version I used at the time). 

I found and tested a couple of now relatively old, user-contributed, Leap Motion V2 SDK oF addons listed in the ofxAddons directory and used Gene Kogan’s ofxLeapMotion2 (last updated 2 years ago) – “A wrapper for the Leap Motion SDK compatible with Leap 2.0 Beta with skeletal tracking” along with ofxMIDI & ofxDatGui.

The project also integrates a (pretty much as-is) port into oF of Chris Ball’s Scale Manager and Rob Tillaart’s StopWatch Arduino libraries – which, being written for Arduino, are close enough to plain C++ to port almost directly.
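To give a sense of the core mapping, here’s a minimal sketch of the idea: hand height from the Leap mapped onto MIDI notes and sent out via ofxMidi. The ofxMidi calls are as documented; the Leap accessors (class and member names) are as I remember them from ofxLeapMotion2 and may differ between addon versions, and the height-to-note mapping here is a simplified stand-in for the ScaleManager-quantised version in the actual project.

```cpp
// Minimal sketch of the hand-height-to-MIDI mapping.
// Leap accessor names assumed from ofxLeapMotion2 - check against the addon version you use.
#include "ofMain.h"
#include "ofxLeapMotion2.h"
#include "ofxMidi.h"

class ofApp : public ofBaseApp {
public:
    ofxLeapMotion leap;
    ofxMidiOut midiOut;
    int lastNote = -1;

    void setup() {
        leap.open();
        midiOut.openVirtualPort("ofxMidiOut");   // shows up as a MIDI input in the soft synth
    }

    void update() {
        vector<ofxLeapMotionSimpleHand> hands = leap.getSimpleHands();
        if (hands.empty()) { stopNote(); return; }

        // hypothetical mapping: hand height (~100-500 mm above the controller) -> one octave above middle C
        float y   = ofClamp(hands[0].handPos.y, 100, 500);
        int  note = 60 + (int)ofMap(y, 100, 500, 0, 12);  // ScaleManager quantises this in the real project

        if (note != lastNote) {
            stopNote();
            midiOut.sendNoteOn(1, note, 100);   // channel, pitch, velocity
            lastNote = note;
        }
    }

    void stopNote() {
        if (lastNote >= 0) { midiOut.sendNoteOff(1, lastNote, 0); lastNote = -1; }
    }

    void exit() { stopNote(); midiOut.closePort(); }
};
```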

Returning to the project a year later, the oF project (and the addon example project) failed to compile in Xcode 11.5 on my now macOS 10.15.5 MacBook Pro, giving a ‘Command CodeSign failed with a nonzero exit code’ error. I found a solution in vanderlin’s post to the ‘Can’t run examples in Xcode, signing issue with libfmodex.dylib?’ thread on the oF forum. Old sketches failing to compile is a frequent issue in creative coding projects, where updates to the OS and/or programming IDE/frameworks ‘break’ earlier working versions. The oF project can be found on my GitHub (link at the bottom of this post) – specifically the ‘Leap_Midi_DatGui_working’ folder.

You don’t actually need to compile the oF project to see it in action – the runtime binary works (as does a debug binary of the addon example) – albeit it’s unstable and crashes fairly regularly (I suspect this is an issue with the version of Chris Ball’s Scale Manager library I’m using, which he admits wasn’t optimised). Both are in the oF project’s ‘bin’ folder – you need to leave them in there as they reference the included ‘data’ folder.

Install the Leap Motion software; plug in the controller; double-click to run the binary; open a soft synth – I used NI’s Massive and found the ‘Prepared Piano’ patch from the ‘Massive Factory’ bank worked well (this patch on GitHub too); enable the ‘ofxMidiOut’ MIDI input – and you should be good to go.

Here’s a video demo of it in action…

Vocal to MIDI Interface

I’ve previously developed an audio to MIDI patch using the fiddle~ object in MaxMSP (with help from creative collaborator Ben Lycett) and used it successfully in a couple of audiovisual projects including Stravinsky Rose – a John Whitney Sr. inspired visualisation of Igor Stravinsky’s ‘Three Pieces for Clarinet’ as performed by Fiona Cross of the Manchester Camerata.

Interested in finding a standalone solution that didn’t require a laptop, I wondered if it might be possible to use a microcontroller – specifically the Teensy 3.6 + Audio Adaptor Board + Audio Library (Billy Payne suggested this approach too). The library features an AudioAnalyzeNoteFrequency object by Collin Duffy which can “detect with fairly good accuracy the fundamental frequency ‘f0’ of musical notes, such as electric guitar and bass”.

Using the default mic input on the Audio Adaptor Board I started out by testing several different microphones – including the Adafruit Electret Microphone Amplifier – MAX4466 with Adjustable Gain – but settled on a cheap microphone headset sourced via eBay which empirically had the best signal to noise ratio.

I then tried to optimise the audio to MIDI processing within the Arduino sketch:

  • initially by optimising input gain levels; 
  • then via Damien Clarke’s ResponsiveAnalogRead – “an Arduino library for eliminating noise in analogRead inputs without decreasing responsiveness” – which, though intended for sensors, pots etc., can also be used for any sketch variable; 
  • then via the Audio Library’s AudioAnalyzeRMS object to set a threshold for the incoming audio signal to trigger the analysis; 
  • and finally – since the functionality seemed to warrant it – by organising the Arduino code as a state machine implementation – “an abstract concept or system that helps you systematically design and implement the logic behaviour of an embedded system”. This device’s specific ‘states’ are: checking incoming audio amplitude against a ‘start’ threshold; triggering the frequency analysis and calculating the MIDI Note On and pitchbend data; sending these MIDI messages; and finally checking the incoming audio amplitude against an ‘end’ threshold to complete the cycle, send a MIDI Note Off message and drop back into the first state (a stripped-down sketch of this structure follows below).

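To illustrate the structure, here’s a stripped-down sketch of the state machine using the Teensy Audio Library objects named above. The thresholds, mic gain and analysis settings are placeholders rather than the values I settled on, and the pitchbend stage is omitted for brevity.

```cpp
// Stripped-down vocal-to-MIDI state machine for Teensy 3.6 + Audio Adaptor Board.
// Threshold and gain values are placeholders, not the ones used in the actual device.
#include <Audio.h>

AudioInputI2S             audioIn;
AudioAnalyzeNoteFrequency notefreq;
AudioAnalyzeRMS           rms;
AudioControlSGTL5000      sgtl5000;
AudioConnection           c1(audioIn, 0, notefreq, 0);
AudioConnection           c2(audioIn, 0, rms, 0);

enum State { WAIT_FOR_ONSET, ANALYSE_PITCH, NOTE_PLAYING };
State state = WAIT_FOR_ONSET;
int currentNote = -1;

const float START_THRESHOLD = 0.05;  // RMS level that triggers the analysis
const float END_THRESHOLD   = 0.02;  // RMS level that ends the note

void setup() {
  AudioMemory(30);
  sgtl5000.enable();
  sgtl5000.inputSelect(AUDIO_INPUT_MIC);
  sgtl5000.micGain(40);          // dB - adjust for the headset mic
  notefreq.begin(0.15);          // analysis threshold
}

void loop() {
  switch (state) {
    case WAIT_FOR_ONSET:
      if (rms.available() && rms.read() > START_THRESHOLD) state = ANALYSE_PITCH;
      break;

    case ANALYSE_PITCH:
      if (notefreq.available()) {
        float freq  = notefreq.read();
        currentNote = round(69 + 12.0 * log2f(freq / 440.0));  // Hz -> MIDI note number
        usbMIDI.sendNoteOn(currentNote, 100, 1);
        state = NOTE_PLAYING;
      }
      break;

    case NOTE_PLAYING:
      if (rms.available() && rms.read() < END_THRESHOLD) {
        usbMIDI.sendNoteOff(currentNote, 0, 1);
        state = WAIT_FOR_ONSET;
      }
      break;
  }
}
```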
I added MIDI pitchbend to try and realise a bit more musical expressivity (though it’s difficult to actually control) using functionality from Chris Ball’s Scale Manager library to calculate the frequency of a given MIDI note and compare this to the Audio Library analysis.

The project only requires the hardware listed above – though I made up a 3.5mm mono jack socket to 2-way female Dupont cable to get the mic signal into the Audio Adaptor Board. The Arduino sketch is on my GitHub along with the working version of Chris Ball’s Scale Manager library. Other required libraries are available from the links above or are part of the Teensyduino installation.

Latency is noticeable (to be expected) but actually fairly minimal… and it works reasonably well… but it does require a non-legato vocal technique – demo video below:

I also looked for alternative solutions – and found MIDI Guitar 2 by Jam Origin for macOS, Windows and iOS devices. It claims to offer “state-of-the-art guitar tracking” – and it’s actually pretty good – though it requires an in-app purchase for the MIDI Output module (~£30) to send out data via its own virtual output or to the iOS MIDI Network Session. I managed to get it to trigger Moog’s Animoog on my iPad easily enough using the ‘External MIDI Output’ preset, which routes MIDI via its own virtual output – Animoog lists this as ‘MIDI Guitar out’ in its inputs. The main issue I found is that because it’s using the iPad’s built-in microphone (my Apogee One for iPad/Mac isn’t fully compatible with the iPad Pro 2018+ – even using Apogee’s own USB-C cable – so I couldn’t use its studio quality mic), the audio output from Animoog also triggers MIDI Guitar 2’s input and you get caught in a crazy audio to MIDI to audio to MIDI etc. loop. An alternative mic-in solution allowing more isolation and careful adjustment of the input levels should overcome this – but I didn’t explore it further.

Hand-held, wireless MIDI, distance sensor instrument

While this was the most ambitious of the devices/interfaces I developed – both in terms of the complexity of the build and the coding – it’s still only a working prototype and ‘proof of concept’. However, it does show potential and it allowed me to respond to and find working solutions for most of the ‘Initial Ideas’ outlined above.

Since the build is more complicated than the two projects above I’ve organised it into sections:

MIDI over WiFi

I managed to implement MIDI over WiFi (Billy implemented MIDI over Bluetooth) using an Adafruit Huzzah ESP8266 breakout, lathoub’s Arduino-AppleMIDI-Library – “enables an Arduino with IP/UDP capabilities (Ethernet shield, ESP8266, ESP32, …) to participate in an AppleMIDI session” – and a macOS MIDI Network Session. 
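For context, the setup side of this boils down to a few lines: join the WiFi network, start an AppleMIDI session, and keep it serviced in loop(). The sketch below shows the general shape against the v1.x-era API of lathoub’s library (the macro and method names have changed in later releases, and the SSID/credentials are placeholders), so treat it as indicative rather than copy-paste.

```cpp
// Rough shape of the ESP8266 WiFi + AppleMIDI session setup (v1.x-era API of
// lathoub's Arduino-AppleMIDI-Library - names differ in later releases).
#include <ESP8266WiFi.h>
#include <WiFiUdp.h>
#include "AppleMidi.h"

const char* SSID = "GL-AR300M";      // the travel router's dedicated network (placeholder)
const char* PASS = "xxxxxxxx";       // placeholder

APPLEMIDI_CREATE_DEFAULT_INSTANCE(); // creates the 'AppleMIDI' session object on UDP port 5004

void setup() {
  WiFi.begin(SSID, PASS);
  while (WiFi.status() != WL_CONNECTED) delay(250);

  AppleMIDI.begin("ESP8266-DistanceSensor");  // session name shown in Audio MIDI Setup
}

void loop() {
  AppleMIDI.run();                   // keep the rtpMIDI session alive

  // ...read the distance sensor, work out the note, then e.g.:
  // AppleMIDI.noteOn(note, velocity, 1);
  // AppleMIDI.noteOff(note, 0, 1);
}
```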

Rather than use a macOS ad-hoc WiFi network (I’ve found them a bit temperamental in the past) I bought a compact, 5V powered wireless router – the GL.iNet GL-AR300M Series Mini Smart Router for ~£25. While this made setup a bit more cumbersome it provided a dedicated WiFi network for the ESP8266 to join, an Ethernet LAN port to connect to my MacBook Pro 15” (and so minimise latency) and the ability – via the router’s admin interface – to configure fixed DHCP allocated IP addresses for both devices as well as port forwarding to ensure the relevant port (5004) was open. Once configured it worked without issue.

One upshot of MIDI over WiFi I hadn’t initially considered was that MIDI messages can be sent both ways – not just from the ESP8266 module to the MacBook Pro but vice versa too. I capitalised on this functionality by coding a rough and ready MIDI monitor and control GUI in openFrameworks v0.10.1 – to display the incoming note and pitchbend data from the device but also to send MIDI (using a few of the lowest pitch Note On messages with stepped variations in velocity) back to the ESP8266 module to trigger and change various options in the device (this also saved me having to implement input/navigation buttons and a screen on the device itself). You can see this implemented in the oF project and Arduino sketch on my GitHub and demoed in the video above.
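On the device side that control scheme reduces to a small handler: low note numbers select an option, the (stepped) velocity selects its value. The handler below is a hypothetical illustration of that mapping – the option variables and the specific note/velocity assignments are placeholders, and you’d hook it up to whichever note-received callback your version of the AppleMIDI library exposes.

```cpp
// Hypothetical handler for control messages coming back from the laptop GUI:
// the lowest few note numbers select an option, the (stepped) velocity its value.
// Attach to the library's note-received callback (e.g. AppleMIDI.OnReceiveNoteOn(...) in the v1.x-era API).
#include <Arduino.h>

uint8_t currentScaleIndex = 0;    // placeholder device options
bool    easingEnabled     = false;
uint8_t hapticLevel       = 0;

void onControlNoteOn(byte channel, byte note, byte velocity) {
  switch (note) {
    case 0:  currentScaleIndex = velocity / 16;   break;  // 8 scales stepped across 0-127
    case 1:  easingEnabled     = (velocity > 64); break;  // portamento/easing on or off
    case 2:  hapticLevel       = velocity;        break;  // haptic click strength
    default: break;  // playable notes live much higher up the keyboard - ignore everything else
  }
}
```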

Some MIDI issues remain unresolved – the ESP8266 module occasionally resets itself and automatically rejoins the MIDI Network Session without dropping the previous connection first which messes things up. In principle this could be managed via AppleMIDI but it requires more research into the macOS MIDI Network Session protocol.

Time-of-Flight Distance Sensor

For determining distance I used a Pololu VL6180X Time-of-Flight Distance Sensor – a short-range infrared lidar sensor that measures a default range of ~20cm with up to 1mm resolution. It’s very compact and also far better than simpler optical sensors at making accurate readings regardless of ambient lighting conditions or the target’s colour or reflectivity. The sensor communicates with the microcontroller via I2C (likewise the Pimoroni Haptic BZZZ and Adafruit LIS3DH) and I used the Pololu Arduino library (it has some additional functionality not featured in the Adafruit library, which I tested but subsequently commented out) to configure it and implement functionality within the Arduino sketch.
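The Pololu library keeps the sensor side of this very simple – a minimal read loop looks something like the following, where the distance-to-note mapping is a simplified stand-in for the ScaleManager-based version in the actual sketch.

```cpp
// Minimal VL6180X read loop using the Pololu library, with a simplified
// distance-to-note mapping standing in for the ScaleManager-based version.
#include <Wire.h>
#include <VL6180X.h>

VL6180X tof;

void setup() {
  Wire.begin();
  tof.init();
  tof.configureDefault();
  tof.setTimeout(500);
}

void loop() {
  uint16_t range = tof.readRangeSingleMillimeters();   // ~0-200 mm usable range
  if (!tof.timeoutOccurred()) {
    // crude stand-in: the 200 mm span divided across one octave above middle C
    int note = 60 + map(constrain(range, 0, 200), 0, 200, 0, 12);
    // ...quantise 'note' to the current scale and send it over the AppleMIDI session
  }
  delay(20);
}
```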

Power

For power I used a LiPo 1000mAh battery and an Adafruit Powerboost 1000 Charger – a compact DC/DC boost converter module that can be powered by any 3.7V LiIon/LiPoly battery and convert the output to 5.2V DC (as well as charge the battery via a 5V 2A PSU). I used this to power the Adafruit Huzzah, LIS3DH accelerometer, Pimoroni Haptic BZZZ and the NeoPixel Ring directly (all can accept a 5V power supply). I also added a switch to power down and disable the Powerboost to prevent draining the battery when not in use.

While the Pololu VL6180X module has an on-board voltage regulator which allows power input from a 2.7–5.5V supply, it also shifts the I2C clock and data lines to the same logic voltage level as the supplied VIN. Since the Adafruit Huzzah GPIO pins are 3.3V logic, I added a compact 5–3.3V DC/DC converter module sourced via eBay and used its output to power the Pololu VL6180X and the thumb momentary push button, so I wouldn’t need to level shift those clock and data lines.

However I did add an Adafruit 4-channel I2C-safe Bi-directional Logic Level Converter to shift the 3.3V logic level of the Adafruit Huzzah to 5V for the NeoPixel ring.

Note: Although I2C is working without issue for all the modules in my prototype – in retrospect I didn’t add the usual 3.3k (for 3.3V logic) pull-up resistors on the I2C lines and didn’t check whether the LIS3DH accelerometer and/or Haptic BZZZ, which are powered at 5V, also shift the I2C logic to 5V (which could potentially damage the Adafruit Huzzah). I should have used the logic level converter for this (it’s bi-directional). While my actual PCB doesn’t include this, I have updated the schematic accordingly.

NeoPixel Ring

One key usability issue with distance-based controllers that Billy and I had discussed is the difficulty of accurately judging how far your hand needs to be positioned above the sensor (or in this case the hand-held device from a target) to trigger a specific note. We thought using a 24 x 5050 RGB LED NeoPixel Ring as a visual indicator to display distance against notes in the scale might help. I implemented this within the device, driving the NeoPixel Ring’s 24 LEDs via the FastLED library and mounting it so it was positioned over the back of the hand on the acrylic sleeve (a basic hand-sized box with the front and back faces open).

It kinda works… but more refinement is needed to accurately calibrate the notes in a given scale distributed around the 24 LEDs (using Chris Ball’s Scale Manager library) with the actual distance measurements. You can see they don’t quite align when triggered in the demo video below.
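For reference, the distance-to-LED mapping itself is only a few lines with FastLED – roughly as below. This is a simplified stand-in: the real sketch gets the note positions from ScaleManager rather than spacing them evenly, the data pin is a placeholder, and the ranges need exactly the calibration just mentioned.

```cpp
// Rough shape of the distance display on the 24-LED ring using FastLED.
// Note positions come from ScaleManager in the real sketch; here they're evenly spaced.
#include <FastLED.h>

#define NUM_LEDS 24
#define DATA_PIN 15            // placeholder GPIO on the Huzzah

CRGB leds[NUM_LEDS];

void setupRing() {
  FastLED.addLeds<NEOPIXEL, DATA_PIN>(leds, NUM_LEDS);
  FastLED.setBrightness(40);
}

void showDistance(uint16_t rangeMm, uint8_t notesInScale) {
  FastLED.clear();
  // mark where each note in the scale sits around the ring
  for (uint8_t n = 0; n < notesInScale; n++) {
    leds[(n * NUM_LEDS) / notesInScale] = CRGB::Blue;
  }
  // light the LED corresponding to the current device-to-target distance
  int pos = map(constrain(rangeMm, 0, 200), 0, 200, 0, NUM_LEDS - 1);
  leds[pos] = CRGB::Red;
  FastLED.show();
}
```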

Pimoroni Haptic BZZZ

To supplement the NeoPixel Ring visual indicator Billy and I had also discussed integrating additional haptic feedback – supplemental clicks or buzzes as you approached and/or triggered notes in a given scale. 

Billy had found the Dayton Audio DAEX-13-4SM Skinny Mini Exciter – “an extremely compact speaker which can be used to add audio or haptic (tactile) feedback to projects”. I’d previously used compact LRA (Linear Resonant Actuator) motors and DRV2605 IC based haptic controllers and thought they offered more subtle and varied vibrational feedback (albeit with less amplitude) than the usual ERM (Eccentric Rotating Mass) motors. These two types of motors work in different ways – LRAs use an AC signal to generate vibration in the Z plane (akin to a speaker driver) while ERMs use an offset counterweight and a DC signal to adjust the speed of the motor. The DRV2605 IC features ~120 preset clicks and buzzes which can be used to drive both types.

Looking for a larger and potentially more powerful haptic feedback module I found the Pimoroni Haptic BZZZ DRV2605L linear actuator haptic breakout and integrated it into the device, fixing it onto the underside of the acrylic sleeve. While an I2C scanner sketch found on the Arduino Playground confirmed the module was active, I couldn’t get it to work – until I discovered the useLRA() and useERM() functions in the Adafruit DRV2605 library header file (these don’t appear in any of the examples) and the comment that the default was ERM. Once I switched to LRA it worked well enough – though it could likely be improved further still with a better design – perhaps fixing the module inside the acrylic sleeve so that it directly touches either the palm or back of the hand.
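The fix is essentially two extra lines. Something like the snippet below is enough to get a click out of the Haptic BZZZ via the Adafruit DRV2605 library – the effect number here is an arbitrary choice, not the one I settled on.

```cpp
// Triggering a click on the Haptic BZZZ (DRV2605L + LRA) via the Adafruit library.
// The key step is switching the driver out of its ERM default with useLRA().
#include <Wire.h>
#include <Adafruit_DRV2605.h>

Adafruit_DRV2605 drv;

void setupHaptics() {
  drv.begin();
  drv.useLRA();                       // default is ERM - nothing happens on an LRA without this
  drv.selectLibrary(6);               // waveform library 6 is the LRA set
  drv.setMode(DRV2605_MODE_INTTRIG);  // fire effects from an I2C trigger
}

void clickOnNote() {
  drv.setWaveform(0, 1);              // effect 1: a strong click (arbitrary choice)
  drv.setWaveform(1, 0);              // end of sequence
  drv.go();
}
```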

Also the test clicks I programmed on reaching a note in a scale had similar calibration issues to the NeoPixel Ring.

Adafruit LIS3DH

Finally I added an Adafruit LIS3DH Triple-Axis Accelerometer to detect movement of the device in the X & Y axes (as opposed to the Z axis of the VL6180X ToF distance sensor), the idea being to use this data to add musical expressivity – e.g. tremolo in the MIDI output while shaking the device side-to-side (Billy had added a similar accelerometer to several of his previous instruments to good effect). Though it’s integrated into the Arduino sketch via the Adafruit LIS3DH library and is sending data (processed via the ResponsiveAnalogRead library), it doesn’t actually control anything yet.
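As a sketch of the intent (again, not something that’s wired up in the actual sketch yet), side-to-side shake on the X axis could be scaled onto pitchbend something like this – the scaling factor and clamp range are hypothetical.

```cpp
// Sketch of the unimplemented tremolo idea: side-to-side shake (X axis) scaled onto pitchbend.
#include <Wire.h>
#include <Adafruit_Sensor.h>
#include <Adafruit_LIS3DH.h>

Adafruit_LIS3DH lis;

void setupAccel() {
  lis.begin(0x18);                    // default I2C address
  lis.setRange(LIS3DH_RANGE_2_G);
}

int readShakeAsPitchbend() {
  sensors_event_t event;
  lis.getEvent(&event);               // acceleration in m/s^2
  // map +/- ~10 m/s^2 of lateral shake onto a modest pitchbend offset (hypothetical scaling)
  float x = constrain(event.acceleration.x, -10.0, 10.0);
  return (int)(x * 200);              // +/- 2000 either side of the pitchbend centre value
}
```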

Adafruit Huzzah ESP8266 breakout

While the ESP8266 module on the Adafruit Huzzah can be flashed easily enough from the Arduino IDE, as well as use existing Arduino libraries and output to the Serial Monitor, it takes a bit more setup than a regular Arduino board. The first step is to install the ESP8266 Arduino core via the Arduino IDE Boards Manager, which adds a list of different ESP8266 boards to the Arduino IDE ’Tools’ > ‘Boards’ menu. I used the ‘Adafruit Feather Huzzah ESP8266’ option.

Note: This documentation has taken a while to complete – it’s over a year since I developed these projects – and in a clean update to macOS Catalina on my MacBook Pro 15” in the interim I lost my previous Arduino IDE installs (I used to maintain several versions of the IDE and various sketchbook folders with associated libraries for backwards compatibility of older sketches). While my current working version is 1.8.10 (with Teensyduino v1.48) it didn’t have the ESP8266 Arduino core installed – so I looked through the Releases history and selected release 2.5.0 (06/02/19) which is the version I likely used. I wanted to avoid breaking functionality in the Arduino sketch – and it’s working OK. There may well be no issues with installing a current version of the ESP8266 Arduino core (it’s now up to v2.7.1 released 07/05/20) but I haven’t tried it.

Another issue with the Adafruit Huzzah is that it doesn’t have a USB socket or on-board USB to TTL conversion such as the FTDI IC in most Arduinos/Teensys. You need a console or FTDI cable – though I used the Adafruit FTDI Friend, which is a usefully tweaked FTDI adapter. Its pinout matches the Adafruit Huzzah so you just need a straight 6-way cable between them.

Finally, the Adafruit Huzzah has no automatic bootload mode – you have to manually press and hold the GPIO0 button, then the reset button, then release the reset button and then the GPIO0 button. The red on-board LED will go dim, telling you it’s now in bootload mode, and you can upload the sketch from the Arduino IDE. Upload speeds are relatively slow – and while you can increase the baud rate up to 921600 via the ’Tools’ > ‘Upload Speed’ menu, in my experience uploads fail more than occasionally at the highest speeds.

Having said all that, for the money the Adafruit Huzzah is a compact and capable microcontroller with WiFi functionality that is well supported by the ESP8266 Community – though it’s since been superseded by the more capable ESP32. The excellent Adafruit tutorial explains all you need to know…

Schematic and PCB

I frequently use the RkEducation Prototyping PCBs sourced via eBay. They’re slightly smaller than a half-sized breadboard and come in three flavours with different track layouts – really useful for making up small prototypes without having to cut the tracks of standard Veroboard.

The photos below show my ‘make it up as you go along’ layout on the PCB and the schematic (I’ve recently started using the free EasyEDA online PCB design tool) is on my GitHub as a PDF.

Code

The project was originally coded in Arduino v1.8.5 (with Teensyduino 1.40) – though as noted above the sketch still compiles OK in Arduino v1.8.10 (with Teensyduino 1.48) and the ESP8266 Arduino core v2.5.0 by ESP8266 Community on a MacBook Pro 15” (Late 2013), macOS 10.15.5.

I incorporated a fair number of libraries – you’ll need to download and install all of these in your sketchbook ‘libraries’ folder if you want to try and compile the sketch yourself. Some are already described and linked to above – I’ve added URLs and short descriptions for those not already mentioned:

  • Arduino-AppleMIDI-Library;
  • Pololu vl6180x-arduino;
  • FastLED;
  • ResponsiveAnalogRead – listed in the ‘Vocal to MIDI Interface’ above;
  • ScaleManager – likewise;
  • Adafruit_DRV2605;
  • Adafruit_LIS3DH;
  • Bounce2 – Thomas Fredericks’ debouncing library – for the thumb momentary push button;
  • ArduinoEasing – Tobias Toft’s simple easing library for Arduino – “an implementation of Robert Penner’s easing functions”. Easing functions are used in motion graphics to create more naturalistic movement of position over time; I applied them to the movement between notes to try and create a more controllable portamento (it kinda works – but it’s not very musical; see the sketch after this list). I switched to it as an alternative to Andy Brown’s EasingAB library, which I’ve used in the past but couldn’t get to work here (that code is also included in the sketch but commented out);
  • StandardCplusplus – maniacbug’s “straight port of uClibc++ for Arduino”. While the advice is to “use with care” (most microcontrollers just don’t have the headroom to cope with the full range of C++ features) – it enables C++ functionality I’m used to in oF within the Arduino IDE – though I actually only used it to enable and work with vectors;
  • Adafruit_Sensor – Adafruit’s “unified sensor abstraction layer… that enables you to switch sensor models with very little impact on the rest of the system”. It was copied over from the Adafruit_LIS3DH example code but isn’t actually used and is commented out.

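The portamento idea mentioned in the ArduinoEasing entry boils down to easing the pitchbend value from the previous note to the new one over a short window. Stripped of the library specifics it looks roughly like this – an illustrative hand-rolled cubic ease rather than the library’s own functions:

```cpp
// Illustration of the eased-portamento idea: glide pitchbend from the previous
// note to the new one over a short window. The actual sketch uses Tobias Toft's
// ArduinoEasing library; this is a hand-rolled cubic ease-in-out for illustration.
float easeInOutCubic(float t) {          // t runs 0.0 -> 1.0
  return (t < 0.5f) ? 4.0f * t * t * t
                    : 1.0f - powf(-2.0f * t + 2.0f, 3.0f) / 2.0f;
}

// Called every loop() while a glide is in progress.
// 'elapsedMs' is time since the new note was triggered, 'glideMs' the glide length.
int glidePitchbend(unsigned long elapsedMs, unsigned long glideMs,
                   int fromBend, int toBend) {
  float t = (elapsedMs >= glideMs) ? 1.0f : (float)elapsedMs / (float)glideMs;
  return fromBend + (int)((toBend - fromBend) * easeInOutCubic(t));
}
```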
The sketch is messy for sure… I try to comment as I go and add debugging output to the Serial Monitor (which I then comment out once I get things working)… but there’s plenty of rejected test code (since commented out) that I haven’t deleted and it’s certainly in need of rationalisation.

The sketch also uses a state machine implementation as per the ‘Vocal to MIDI Interface’ above – but with the state enumerators: CONNECT_WIFI, INIT_SENSOR, SET_MIDI, TRIGGER_MIDI, EASE_MIDI and PANIC_MIDI. While this coding technique requires a specific (and initially unfamiliar) sketch structure and syntax and took some time to get to grips with, once I had I found that it really helped to organise the code – switching between these states once a specific function had been called or a condition met.
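In outline the switch looks something like the skeleton below – the helper functions are illustrative stand-ins for the real sketch’s WiFi, sensor, LED, haptic and MIDI code, not its actual function names.

```cpp
// Outline of the hand-held device's state machine.
// Helper functions are illustrative stand-ins for the real sketch's WiFi/sensor/MIDI code.
enum DeviceState { CONNECT_WIFI, INIT_SENSOR, SET_MIDI, TRIGGER_MIDI, EASE_MIDI, PANIC_MIDI };
DeviceState state = CONNECT_WIFI;

bool joinNetworkAndSession()  { return true; }   // WiFi + AppleMIDI session
bool initToFAndPeripherals()  { return true; }   // VL6180X, LIS3DH, DRV2605, NeoPixel ring
void chooseNoteFromDistance() {}                 // distance reading -> ScaleManager note
void sendNoteAndFeedback()    {}                 // MIDI out + LED/haptic feedback
bool glideFinished()          { return true; }   // eased portamento complete?
void allNotesOff()            {}                 // MIDI panic

void setup() {}

void loop() {
  switch (state) {
    case CONNECT_WIFI: if (joinNetworkAndSession()) state = INIT_SENSOR;  break;
    case INIT_SENSOR:  if (initToFAndPeripherals()) state = SET_MIDI;     break;
    case SET_MIDI:     chooseNoteFromDistance();    state = TRIGGER_MIDI; break;
    case TRIGGER_MIDI: sendNoteAndFeedback();       state = EASE_MIDI;    break;
    case EASE_MIDI:    if (glideFinished())         state = SET_MIDI;     break;
    case PANIC_MIDI:   allNotesOff();               state = SET_MIDI;     break;
  }
}
```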

I could write more… but I think it’s time to stop there… phew.

All the working code for the projects above is on my GitHub – https://github.com/Prodical/DMLabNorthWest – in the Swan Street Collective folder.

If you have any specific questions I haven’t addressed drop me a line or use the ‘Contact’ page form.