Composing for controllerism

My first set of attempts at controllerism used samples of the Beatles and Michael Jackson. For the next round, I thought it would be good to try to create something completely from scratch. So this is my first piece of music created specifically with controllerism in mind.

The APC40 has forty trigger pads. You can use more than forty loops, but it’s a pain. I created eight loops that fit well together, and then made four additional variations of each one. That gave me a set of loops that fit tidily onto the APC40 grid. The instruments are an 808 drum machine, Latin percussion, wood blocks, a blown tube, synth bass, bells, an arpeggiated synth, and an ambient pad.

40 loops

I tried to design my loops so that all of them would be mutually musically compatible. I didn’t systematically test them, because that would have required trying thousands of combinations. Instead, I decided to randomly generate a song using Ableton’s Follow Actions to see if anything obviously unmusical leapt out at me. The first attempt was not a success — hearing all eight loops all the time was too much information. I needed a way to introduce some space. Eventually I hit on the idea of adding empty clips to each column that would be randomly sprinkled in.

40 loops plus blanks
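The random generation scheme above can be sketched in a few lines of Python. This is a hypothetical simulation of the idea, not Ableton's actual Follow Actions implementation; the track names come from the instrument list above, while the number of blank clips per column is an assumption for illustration.

```python
import random

TRACKS = ["808", "latin perc", "wood blocks", "blown tube",
          "synth bass", "bells", "arp synth", "pad"]
VARIATIONS = 5   # five clips per track, filling the APC40 grid
BLANKS = 2       # assumed: empty clips sprinkled into each column

def generate_section():
    """Pick one slot per track, Follow Actions-style: each column
    independently chooses either a variation or an empty clip."""
    section = {}
    for track in TRACKS:
        slot = random.randrange(VARIATIONS + BLANKS)
        section[track] = f"clip {slot + 1}" if slot < VARIATIONS else "(silence)"
    return section

# "Watch the song write itself": generate eight random sections and
# report how dense each one is.
for bar in range(8):
    section = generate_section()
    playing = [t for t, c in section.items() if c != "(silence)"]
    print(f"section {bar + 1}: {len(playing)} tracks playing")
```

The blanks are the key design move: without them, every section plays all eight loops at once; with them, the texture thins out at random and the music gets some air.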

It is exceptionally relaxing watching a song write itself while you sit there drinking coffee.

The computer plays itself

The result was a mix of pleasing and not-so-pleasing. I edited the random sequence into a more coherent shape:

Edited randomness

Even with my editing, the result was not too hot. But it was useful to have something to react against. Finally, with all the prep behind me, it was time to play all this stuff live on the APC. Here’s the very first take of improv I did.

raw improv

I let it sit for a couple of days while I was preoccupied with other things, and when I finally listened back, I was pleasantly surprised. Here it is, minimally edited:

The piece has a coherent shape, with lifts and lulls, peaks and valleys. It’s quite different from the way I’d structure a piece of music by my usual method of drawing loops on the screen. It’s less symmetrical and orderly, but it makes an intuitive sense of its own. I’ve been looking for a way to reconcile my love of jazz with my love of electronic dance music for many years now. I think I’ve finally found it. For my next controllerist opus, I’m going to blend samples and my own MIDI loops, and have more odd-length loops. And maybe I’ll play these things for an audience too.

How should we be teaching music technology?

This semester, I had the pleasure of leading an independent study for two music students at Montclair State University. One was Matt Skouras, a grad student who wants to become the music tech teacher in a high school. First of all, let me just say that if you’re hiring for such a position in New Jersey, you should go right ahead and hire Matt; he’s an exceptionally serious and well-versed musician and technologist. But the reason for this post is a question that Matt asked me after our last meeting yesterday: What should he be studying in order to teach music tech?

Matt is a good example of a would-be music tech teacher. He’s a classical trumpet player by training who has found little opportunity to use that skill after college. Wanting to keep his life as a musician moving forward, he started learning guitar, and, in his independent study with me, has been producing adventurous laptop music with Ableton Live. Matt is a broad-minded listener and a skilled audio engineer, but his exposure to non-classical music is limited in the way typical of people who came up through the classical pipeline. It was at Matt’s request that I put together this electronic music tasting menu.

So. How to answer Matt’s question? How does one go about learning to teach music technology? My first impulse was to say, I don’t know, but if you find out, please tell me. The answer I gave him was less flip: that the field is still taking shape, and it evolves rapidly as the technology does. Music tech is a broad and sprawling subject, and you could approach it from any number of different philosophical and technical angles. I’ll list a few of them here.

Teach the technology itself

NYU’s Music Technology program takes this approach. You learn the foundations of audio engineering and signal processing from the ones and zeroes up. The production of actual music is a secondary concern. The one required electronic composition class is rooted squarely in the modernist Euroclassical tradition (though since I took it, pop music has made some inroads as well). If you want to learn about the culture, history and aesthetics of non-academic music, NYU’s Music Tech program is not the place to do it.

Use new tools to teach traditional repertoire and concepts

Most music teachers in the US are operating in the Euroclassical tonal tradition. Notation software and the DAW can make teaching and learning that material a lot more engaging. I have my NYU music ed students read Barb Freedman’s excellent book, Teaching Music Through Composition. If you want to teach the basics of Western common-practice era composition and theory in an interactive, creativity-oriented way, Barb’s method is a great one.

Teaching Music Through Composition by Barb Freedman

The big problem here is not in Barb’s execution, but rather the philosophical assumptions underlying it. I don’t believe that Euroclassical tradition is the right way to bring most kids into active music-making. Barb’s methods are battle-tested and effective, but I think we should be using those methods in the service of different musical ends.

Use technology as a transmission vector for Afrocentric dance music

You can use the computer to make any kind of music, and people do, but there is a particular set of practices most naturally suited to it: hip-hop, techno, and their various pop derivatives. I put this music front and center in my music tech classes, for a couple of reasons. The big one is its systematic neglect by music education. The African diaspora is a more salient influence on American music at this point than Euroclassical, but you’d never guess it from looking at our syllabi or standards.

The other reason I use an Afrofuturistic approach is that this is the music that sounds the best when you make it with computers. Classical music sounds dreadful in synthesized form. Hip-hop and EDM sound terrific. Copy and paste is the defining gesture of digital audio editing, and it fits the loop-centrism of Afrocentric pop perfectly.

With due respect to my music tech professors, I don’t believe that most musicians need to know the details of timestretching algorithms or MP3 encoding. The kids don’t really need to be taught how to use a DAW or a mic or a preamp; all of those things are amply documented for the curious. What musicians need to be taught is how to use these tools for expressive purposes. They need to know how to use recordings as raw material for new music, how to program synths and drums in a way that sounds good, and the aesthetic best practices for loop-based structures. I believe that these practices are valuable for musicians working in any idiom, not just pop. I have my students remix each other’s tracks, so they can discuss each other’s musical ideas in the language of music itself. It works much better than any verbal discourse possibly could.

Take a historical view of music technology

My Montclair State colleague Adam Bell shares my musical values, but he puts them into practice somewhat differently. Rather than focus on the musical present, he likes to bring his students on a journey through the last hundred years of technology, taking in audio recording and manipulation, electronic and electroacoustic music, film scoring, and yes, rock and pop. For example, he has students explore musique concrète by recording environmental sounds on their phones, and then editing them in the DAW. He’s an enthusiastic proponent of maker culture, and has the kids create DIY custom electronic music interfaces using the Makey Makey and littleBits. He wants the students to explore the expressive possibilities of technology, not just as users of tools, but as designers of them as well. I’m working on absorbing more of this approach into my own.

Examine all of the above methods critically

My mentor figure, Alex Ruthmann, is an expert on many music tech pedagogies and philosophies, and he takes a thirty-five-thousand-foot overview of them all. While he teaches music technology and methods for teaching music technology, his main mission is to look critically at all of the myriad ways that people teach it to find out what they conceal and reveal, what their unstated values and goals are, and how the various methods have emerged and interacted throughout history. After all, while music technology is a new subject, it isn’t completely new, and forward thinkers have been teaching it for many decades.

NYU’s music education program has been taking steps recently to make technology more of a priority. NYU brought Alex on board with the express goal of bridging the gap between music tech and music ed. My own NYU class is another step toward preparing future music teachers to do music tech.

There is no single best approach

So where does all this leave Matt, and other would-be teachers of music tech? The bad news is that there is no clearly defined set of practices to learn, no equivalent to Orff or Suzuki or Kodály. The good news is that we’re left with a lot of freedom to define our mission in our own terms. It’s a freedom that few music teachers enjoy, and we might as well take advantage of the opportunity to innovate.

Music theory on Hacker News

This fascinating thread about music theory on Hacker News showed up recently in my blog pingbacks.


Two posts in particular caught my eye. First, kev6168 had this eminently reasonable request:

I wish there is a lecture in the format of “One hundred songs for one hundred music concepts”. The songs should be stupidly simple (children’s songs, simple pops, short classical pieces, etc.). Each lesson concentrates on only _one_ concept appeared in the piece, explains what it is, how it is used, why it is used in that way, and how its application makes nice sound, etc. Basically, show me the _effect_ for each concept in a simple real world setting. Then more importantly, give me exercises to write a few bars of music using this concept, no matter how bad my writings are, as long as I am applying the new knowledge…

[A]rmed with only a few concepts, a newbie [coder] can start to write simple programs from the very beginning, in the forms of piecemeal topic-focused little exercises. The result of each exercise is a fully functioning program. I wish I can find a similarly structured music theory course that uses the same approach. Also, are there projects in music which are similar to or the likes, where you can do focused practice on specific topic? I would be happy to pay for those services.

This represents a pedagogical opportunity, not to mention a market opportunity. The NYU Music Experience Design Lab is hard at work on creating just such a resource. It’s going to be called Play With Your Music: Theory, and we hope to get it launched this summer. If you want to help us with it, get in touch.

Deeper in the thread, TheOtherHobbes has a broader philosophical point.

Pop has become a massive historical experiment in perceptual psychology. The most popular bands can literally fill a stadium – something common practice music has never done.

While that doesn’t mean pop is better in some absolute sense, it does suggest it’s doing something right for many listeners.

If your training is too rigidly classical it actively stops you being able to hear and understand what that right thing is, because you’re too busy concentrating on a small subset of the many details in the music.

This is a point that I spend a lot of energy pursuing, but I hadn’t explicitly framed it in terms of perceptual psychology. It gets at some bigger questions: Why do people like music at all? Even though pop can indeed draw huge crowds, it’s mostly a recorded art form. How does that work? What does it mean that we’re so attracted to roboticized voices? A lot to think about.

Prototyping Play With Your Music: Theory

I’m part of a research group at NYU called the Music Experience Design Lab. One of our projects is called Play With Your Music, a series of online interactive music courses. We’re currently developing the latest iteration, called Play With Your Music: Theory. Each module presents a “musical simple,” a short and memorable loop of melody or rhythm. Each simple is a window into one or more music theory concepts. Users can learn and play with the simples using a new interface called the aQWERTYon, which maps scales and chords to the regular computer keyboard.

aqw screengrab
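The core idea behind the aQWERTYon can be sketched like so. This is a hypothetical illustration of the mapping concept, not the lab's actual code: the scale definitions and the choice of the home row as the playing surface are assumptions for the sake of the example.

```python
# Sketch of the aQWERTYon's core idea: map a row of QWERTY keys onto
# the notes of a scale, so no key can play a "wrong" note.
SCALES = {
    "major":            [0, 2, 4, 5, 7, 9, 11],
    "minor pentatonic": [0, 3, 5, 7, 10],
}

HOME_ROW = "asdfghjkl"  # one keyboard row becomes a run of scale tones

def key_to_midi(key, scale="major", root=60):
    """Return the MIDI note for a key press, walking up the chosen
    scale from the root (60 = middle C). Keys past one octave's worth
    of scale degrees wrap upward into the next octave."""
    degrees = SCALES[scale]
    i = HOME_ROW.index(key)
    octave, degree = divmod(i, len(degrees))
    return root + 12 * octave + degrees[degree]

print(key_to_midi("a"))                             # root: middle C (60)
print(key_to_midi("d", scale="minor pentatonic"))   # third pentatonic tone
```

Because every key is looked up through the scale, the whole keyboard stays diatonic: the "can't play a wrong note" guarantee falls out of the data structure rather than requiring any error checking.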

We’re presenting the simples in a variety of formats, including YouTube videos, standard music notation, MIDI, data visualization, and our custom aQWERTYon notation.

Get Ur Freak On compound simple - notation



The goal is to teach theory through creative engagement with meaningful real-world music. We also want to put more emphasis on rhythm, which traditional music theory pedagogy tends to neglect. I’ve put some prototypes up, and I invite you to take a look and play around.

There’s a lot of work to do to make this vision a reality, and we’re looking for partners to help us do it. Specifically, here’s what we’d like to do in the coming year:

  • Create more musical simple modules, music theory
    concept pages, and instructional videos.
  • Implement a drum programming and rhythm pedagogy interface.
  • Add guitar tabs.
  • Create a database front end enabling us to offer multiple points
    of entry.
  • Build a community platform, including a system for
    crowdsourcing simples and concept pages.
  • Create course pathways for specific audiences: AP Music Theory students, lead guitarists, bedroom producers, and so on.
  • Design more interactive functionality.
  • Develop content and business partnerships.
  • Profit!

If you’d like to get involved, or you want to offer some feedback, please let me know.

Here’s what’s cooking with the NYU MusEDLab

I’m a proud member of the NYU Music Experience Design Lab, a research group that crosses the disciplines of music education, technology, and design. Here’s an overview of our many ongoing projects.

MusEDLab logo

Performance interfaces

My personal baby is the Groove Pizza, an outgrowth of my NYU master’s thesis. It’s a prototype circular rhythm interface that we’re shaping into a powerful math teaching tool. Try it yourself:

Bembe pizza with lines
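The math-teaching angle comes from the geometry: each step of the pattern is a point on a circle, so drum patterns become shapes. Here is a hypothetical sketch of that mapping (the 16-step grid and the "step 0 at 12 o'clock, clockwise" convention are assumptions, not the app's actual source).

```python
import math

# Sketch of the Groove Pizza's core geometry: a 16-step pattern laid
# around a circle, one "slice" per step.
STEPS = 16

def step_angle(step, steps=STEPS):
    """Angle in degrees for a step, with step 0 at 12 o'clock and
    time moving clockwise."""
    return (360 * step / steps) % 360

def step_xy(step, radius=1.0, steps=STEPS):
    """Cartesian coordinates of a step on the unit circle."""
    theta = math.radians(90 - step_angle(step, steps))  # 90° = top
    return (radius * math.cos(theta), radius * math.sin(theta))

# A four-on-the-floor kick hits every fourth step, forming a square:
kick = [0, 4, 8, 12]
for s in kick:
    x, y = step_xy(s)
    print(f"step {s:2d}: angle {step_angle(s):5.1f}°, point ({x:+.2f}, {y:+.2f})")
```

Connecting the hits of a pattern this way turns rhythmic ideas into polygons: four-on-the-floor is a square, and more syncopated patterns make lopsided shapes, which is exactly the kind of visual hook that makes the interface useful for math class.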

My other major personal involvement is in the aQWERTYon, which turns any computer keyboard into a futuristic MIDI controller.


You can choose from a variety of scale and chord mappings, and then jam or compose with the confidence that you can’t play a wrong note. You can use our built-in sound library, or you can play software instruments from Logic, GarageBand, Ableton, and so on.

Music theory

The aQWERTYon and Groove Pizza are core components of a new learning tool called Play With Your Music: Theory, part of the Play With Your Music series. They were originally conceived as MOOCs, but have since evolved into online learning communities. All of the recording, mixing, editing and performance interfaces run in the web browser, so you don’t need any additional hardware or software to participate.

Conferences and workshops

The MusEDLab hosts regular meetups, hackathons, and the annual IMPACT conference. Here’s the sizzle reel for last summer’s conference, in which you can see me awkwardly breakdancing!

Hip-hop education

The lab has a close relationship with the Urban Arts Partnership. We’re creating web tools in support of Fresh Ed, an amazing initiative that teaches various humanities subjects using hip-hop. We’re talking to them about incorporating the Groove Pizza into their work as well. And we play host to Smartbomb Labs; last summer a kid there designed a biology game starring a character named Homie O. Stasis.

Chamber music

Other folks in the lab are working with the Chamber Music Society of Detroit to create a set of chamber music engagement tools. I’m particularly fond of the extreme POV string quartet videos, giving you the chance to see and hear a Haydn performance from the players’ perspectives. There’s also a cool musical puzzle card game.

Finally, lab director Alex Ruthmann has been consulting on a new multitrack audio format called OOID, which lets you hear a song’s instruments and vocals in isolation or the mix of your choosing, and also layers in video, lyrics, and even guitar chords. It’s sadly not available in the US yet, but if you’re in Europe, you can download away.

How can I help?

If you are an educator, coder, or designer, and you want to get involved, be in touch. If you’re a philanthropist or grantmaker and you want to support us, definitely be in touch. Also, we’ll be spinning off a business arm this winter; if you’d like to become an investor, be in touch as well.