Musical Interactions – part 1

Design, Interaction Design, Music Technology, Opinion, Physical Computing, Presentation, UX

I’ve known for a while about the Strategy and Innovation MA that John Boult runs at Brunel. As well as collaborating with John as part of Big Potatoes, I was recently invited by him to speak at the Digital Brand Jam event at Brunel (30th of May 2012).

When he learned that I was deeply interested in music technology, John suggested it might be relevant to some conversations he was having around experience design and brand. Either he was incredibly insightful and in tune with some of my raw thinking, or he was just looking for some random spice. Either way, I thought it would be good to do a rapid re-hash of one of my music tech interaction design talks. I did the first one at MEX in November 2010 as a 20-minute rollercoaster ride through physical music tech interaction, and extended it to a one-hour geekfest at London IA in January 2012.

Getting even 20 minutes down to 10 minutes was always going to be hard for me – I have too much to say on most subjects! It wasn’t made any easier by having to juggle this with exciting new business activities, large account management challenges, my ever-tiring recruiting mission and the need to solidify a plan for my UX practice with my team.

Even so, I gave it a shot, using some of the videos and imagery I had dug out previously and giving thought to a new framework of discussion which I thought was pretty interesting to share.

My main premises:

  • The ways in which we interact with technology are always changing, but some things stay the same: our physiology, our need to show off, and our need to control multiple parameters at once
  • Music technology innovates and lasts in this space, standing the test of time and forging new paths

I showed how this is demonstrated through four areas of transition for interaction design, relating to analogue, physical and digital electronics and systems design.

Physical Analogue: Electronics with fluid paths

Technics 1210 turntable – the DJ’s instrument.

The theremin – an early example of free-space gestural interaction, foreshadowing the imprecision we still see in Kinect, Wii and similar free-space gestural systems.

The Roland TB-303, without which there would be no acid house or techno. After 20+ years these units sell for £1500+ despite their flaws. They have lasted well and retain a set of avid fans.


From tape machines to Fairlight sequencing and sampling, the cumbersome workflows and sonic flaws get replaced by more convenient, but arguably colder, interactions and sound.

As we move to fully software-based powerful Digital Audio Workstations (DAWs) like Ableton Live or Logic, we can use our laptops to make orchestral masterpieces or ear-piercing dubstep soundclashes wherever and whenever we like.


Moving from Technics 1210 turntables to CDJs meant that DJs could take their whole collection with them without fear of luggage handlers at Heathrow nicking their most prized vinyl.

When synthesisers began to hide their power behind small-screen menus with limited controls, others re-exposed the innards in software editors controlled by a mouse. These days a VST software editor is almost a de facto complement to the hardware synthesiser, allowing more flexibility in programming.

As new instruments like the Tenori-On and the Monome (see my talk from a few years back, here) emerged they brought with them newer and more flexible ways to compose and create music, without having to know much musical theory or go through the rigmarole of learning a new instrument.

At the same time new products like the Teenage Engineering OP-1 combine aircraft-quality industrial design and engineering with powerful digital synthesis and sequencing to give us a powerful audio workstation in a unit smaller than half a MacBook Air.


Synthesizers are starting to expose hardware controls again, but these controllers manipulate digital electronic signal paths rather than analogue ones. We are empowered by this hands-on control.

Increasingly we are seeing the use of cheaper and dumber control surfaces and devices. It’s not uncommon to see banks of linear faders or rotaries (knobs) that can be easily programmed to control a number of parameters on other physical devices or within musical software. You can even get plugins for controlling Adobe’s Lightroom these days.
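The mapping these cheap control surfaces rely on is simple: each fader or knob sends a MIDI Control Change message carrying a controller number and a 7-bit value, which the receiving software scales into a parameter range. A minimal sketch of that idea in Python (the parameter names and CC assignments here are illustrative, not from any particular device; a real setup would read messages via a MIDI library):

```python
# Sketch of MIDI CC-to-parameter mapping, as used by fader/knob control
# surfaces. Parameter names and ranges below are hypothetical examples.

def scale(value, lo, hi):
    """Scale a 7-bit MIDI value (0-127) into the range [lo, hi]."""
    return lo + (value / 127.0) * (hi - lo)

# Map MIDI CC numbers to (parameter name, min, max).
CC_MAP = {
    1:  ("filter_cutoff_hz", 20.0, 20000.0),
    7:  ("channel_volume",   0.0,  1.0),
    10: ("pan",             -1.0,  1.0),
}

def handle_cc(cc_number, cc_value):
    """Translate one Control Change message into a (parameter, value) pair."""
    if cc_number not in CC_MAP:
        return None  # unmapped controller: ignore
    name, lo, hi = CC_MAP[cc_number]
    return name, scale(cc_value, lo, hi)

# A fader pushed to full travel sets channel volume to 1.0:
print(handle_cc(7, 127))  # ('channel_volume', 1.0)
```

Because the mapping table is just data, the same bank of knobs can be repointed at a synth, a DAW or indeed Lightroom without touching the hardware – which is exactly why these dumb surfaces have proliferated.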

And people are making their own controllers, including the biggest knob I have ever seen, and some of the most gorgeous and polished knobs ever to be created.


Scratch DJs always struggled with the idea of CDJs; they just don’t have the tactility of vinyl on turntables. Serato, Final Scratch and Traktor have been working hard to fill this gap by using real vinyl pressed with time-coded tones to control MP3s. Mind-blowingly cool as this is (even 5+ years on), you need bulky turntables to play with it.
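Under the hood, timecode vinyl presses a continuous tone onto the record: the platter’s speed scales the tone’s frequency, and the phase relationship between the left and right channels reveals direction. A toy sketch of the speed-estimation half, assuming a quadrature tone pair at an illustrative 1 kHz nominal frequency (real systems use their own frequencies and far more robust decoding):

```python
import math

SAMPLE_RATE = 44100    # samples per second
NOMINAL_HZ = 1000.0    # tone frequency at 1x playback speed (an assumption
                       # for illustration; commercial systems differ)

def make_tone(speed, n=4410):
    """Generate a quadrature tone pair as it might come off the deck."""
    freq = NOMINAL_HZ * speed
    left = [math.sin(2 * math.pi * freq * i / SAMPLE_RATE) for i in range(n)]
    right = [math.cos(2 * math.pi * freq * i / SAMPLE_RATE) for i in range(n)]
    return left, right

def estimate_speed(channel):
    """Estimate platter speed from the zero-crossing rate of one channel."""
    crossings = sum(
        1 for a, b in zip(channel, channel[1:]) if (a < 0) != (b < 0)
    )
    duration = len(channel) / SAMPLE_RATE
    freq = crossings / (2 * duration)  # two crossings per cycle
    return freq / NOMINAL_HZ

left, _ = make_tone(speed=1.5)
print(estimate_speed(left))  # close to 1.5
```

Scrubbing the record forwards or backwards, fast or slow, just shifts that tone – which is why the software can follow a scratch with such uncanny fidelity.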

Recently, Numark created CDJs with real vinyl platters to better reflect the tactility of vinyl, while keeping the power and convenience of hooking up to an MP3 collection on a laptop, or to CDs carrying MP3s or higher-quality recordings.

Meanwhile increasing numbers of cheaper and dumber controllers are helping to better control the digital brains behind the glass screens of iPads and laptops. For a mindblowing example of this check out Korg’s iMS-20 iPad app, which uses the Korg MS20ic MIDI controller keyboard with patch cables. Just move the patch cables on the physical device and it will connect them on the iPad. I bought one straight away on eBay when I saw this.

Then you get things like the Reactable, where the digital brain complements physical objects, allowing collaborative Simultaneous Multi User Interface (SMUI) interaction.

Expanding this topic for forthcoming IxDA London

I am looking to curate an event pretty soon for IxDA London on Music Tech interaction design. I believe this subject needs some full-on airing. I am, however, very biased.

In the meanwhile, please check out some of the presentations and videos around this subject: