Inside the aQWERTYon

The MusEDLab and Soundfly just launched Theory For Producers, an interactive music theory course. The centerpiece of the interactive component is a MusEDLab tool called the aQWERTYon. You can try it by clicking the image below.

aQWERTYon screencap

In this post, I’ll talk about why and how we developed the aQWERTYon.

One of our core design principles is to work within our users’ real-world technological limitations. We build tools in the browser so they’ll be platform-independent and accessible anywhere there’s internet access (and where there isn’t internet access, we’ve developed the “MusEDLab in a box.”) We want to find out what musical possibilities there are in a typical computer with no additional software or hardware. That question led us to investigate ways of turning the standard QWERTY keyboard into a beginner-friendly instrument. We were inspired in part by GarageBand’s Musical Typing feature.

GarageBand musical typing

If you don’t have a MIDI controller, Apple thoughtfully made it possible for you to use your computer keyboard to play GarageBand’s many software instruments. You get an octave and a half of piano, plus other useful controls: pitch bend, modulation, sustain, octave shifting and simple velocity control. Many DAWs offer something similar, but Apple’s system is the most sophisticated I’ve seen.

Handy though it is, Musical Typing has some problems as a user interface. The biggest one is the poor fit between the piano keyboard layout and the grid of computer keys. Typing the letter A plays the note C. The rest of that row is the white keys, and the one above it is the black keys. You can play the chromatic scale by alternating A row, Q row, A row, Q row. That basic pattern is easy enough to figure out. However, you quickly get into trouble, because there’s no black key between E and F. The QWERTY keyboard gives no visual reminder of that fact, so you just have to remember it. Unfortunately, the “missing” black key happens to be the letter R, which is GarageBand’s keyboard shortcut for recording. So what inevitably happens is that you’re hunting for E-flat or F-sharp and you accidentally start recording over whatever you were doing. I’ve been using the program for years and still do this routinely.
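For the curious, here's roughly what that mapping looks like in code. This is my own sketch of the Musical Typing idea, not GarageBand's actual implementation; the numbers are semitones above the C that the A key plays.

```typescript
// Sketch of a Musical Typing-style piano mapping (illustrative, not GarageBand's code).
// Values are semitones above the C played by the "a" key.
const whiteRow: Record<string, number> = { a: 0, s: 2, d: 4, f: 5, g: 7, h: 9, j: 11, k: 12 };
const blackRow: Record<string, number> = { w: 1, e: 3, t: 6, y: 8, u: 10 };
// Note the hole in the top row: there is no black key between E and F,
// so "r" gets no note, and in GarageBand that key triggers recording instead.

function keyToMidi(key: string, cBase = 60): number | undefined {
  if (key in whiteRow) return cBase + whiteRow[key]; // 60 = middle C
  if (key in blackRow) return cBase + blackRow[key];
  return undefined; // keys like "r" simply don't play a note
}

console.log(keyToMidi("d")); // 64, the note E
console.log(keyToMidi("r")); // undefined: the "missing black key" problem
```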

Rather than recreating the piano keyboard on the computer, we drew on a different metaphor: the accordion.

The accordion: the user interface metaphor of the future!

We wanted chords and scales arranged in an easily discoverable way, like the chord buttons under the accordionist's left hand. The QWERTY keyboard is really a staggered grid four keys tall and between ten and thirteen keys wide, plus assorted modifier and function keys. We decided to use the columns for chords and the rows for scales.

For the diatonic scales and modes, the layout is simple. The bottom row gives the notes of the scale starting on scale degree 1. The second row has the same scale shifted over to start on scale degree 3. The third row starts the scale on scale degree 5, and the top row starts on scale degree 1 an octave up. If this sounds confusing when you read it, try playing it; your ears will immediately pick up the pattern. Notes in the same column form the diatonic chords, with their Roman numerals conveniently matching the number keys. There are no wrong notes, so even mashing keys at random will sound at least okay. Typing your name usually sounds pretty cool, and picking out melodies is a piece of cake. Playing diagonal columns, like Z-S-E-4, gives you chords voiced in fourths. The same layout approach works great for any seven-note scale: all of the diatonic modes, plus the modes of harmonic and melodic minor.
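If you like to think in code, here's a minimal sketch of that row-offset idea for seven-note scales. The key grid, names, and numbers are mine and simplified; the real aQWERTYon surely differs in the details.

```typescript
// Minimal sketch of an aQWERTYon-style layout for seven-note scales.
const QWERTY_ROWS = [
  ["z", "x", "c", "v", "b", "n", "m", ",", ".", "/"], // scale from degree 1
  ["a", "s", "d", "f", "g", "h", "j", "k", "l", ";"], // scale from degree 3
  ["q", "w", "e", "r", "t", "y", "u", "i", "o", "p"], // scale from degree 5
  ["1", "2", "3", "4", "5", "6", "7", "8", "9", "0"], // scale from degree 1, up an octave
];

// Row offsets in scale steps: 0 = degree 1, 2 = degree 3, 4 = degree 5, 7 = next octave.
const ROW_OFFSETS = [0, 2, 4, 7];

// The major scale as semitone intervals above the root.
const MAJOR = [0, 2, 4, 5, 7, 9, 11];

function pitchFor(row: number, column: number, rootMidi = 60, scale = MAJOR): number {
  const step = ROW_OFFSETS[row] + column;         // position within the scale
  const octave = Math.floor(step / scale.length); // wrap into higher octaves
  const degree = step % scale.length;
  return rootMidi + 12 * octave + scale[degree];
}

// A column is a diatonic triad with the root doubled an octave up:
// column 0 over C major gives C4, E4, G4, C5.
console.log([0, 1, 2, 3].map((row) => pitchFor(row, 0))); // [60, 64, 67, 72]
```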

Pentatonics work pretty much the same way as seven-note scales, except that the columns stack in fourths rather than fifths. The octatonic and diminished scales lay out easily as well. The real layout challenge lay in one strange but crucial exception: the blues scale. Unlike other scales, you can’t just stagger the blues scale pitches in thirds to get meaningful chords. The melodic and harmonic components of blues are more or less unrelated to each other. Our original idea was to put the blues scale on the bottom row of keys, and then use the others to spell out satisfying chords on top. That made it extremely awkward to play melodies, however, since the keys don’t form an intelligible pattern of intervals. Our compromise was to create two different blues modes: one with the chords, for harmony exploration, and one just repeating the blues scale in octaves for melodic purposes. Maybe a better solution exists, but we haven’t figured it out yet.
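The melody-oriented blues mode fits the same row framework if you treat the row offset as a full octave rather than a couple of scale steps. Here's a rough sketch under that assumption; the chord-oriented blues mode isn't sketched, since its voicings aren't spelled out here.

```typescript
// Rough sketch of the melody-oriented blues mode: every row is the same
// blues scale, with each row assumed to sit an octave above the one below.
const BLUES = [0, 3, 5, 6, 7, 10]; // C blues: C Eb F Gb G Bb

function bluesMelodyPitch(row: number, column: number, rootMidi = 60): number {
  const octave = row + Math.floor(column / BLUES.length);
  return rootMidi + 12 * octave + BLUES[column % BLUES.length];
}

console.log(bluesMelodyPitch(0, 0)); // 60: C4
console.log(bluesMelodyPitch(1, 0)); // 72: the same C, one octave up
```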

When you select a different root, all the pitches in the chords and scales are transposed automatically. Even if the aQWERTYon had no other features or interactivity, this alone would make it an invaluable music theory tool. But root selection raises a bigger question: what do you do about all the real-world music that uses more than one scale or mode? Totally uniform modality is unusual, even in simple pop songs. You can access notes outside the currently selected scale by pressing the shift keys, which transpose the entire keyboard up or down a half step. What would be really great, though, is if the scale settings could change dynamically. Imagine listening to a jazz tune with the scale always set to match whatever chord was going by at that moment; you could blow over complex changes effortlessly. We've discussed manually placing markers in YouTube videos that tell the aQWERTYon when to change its settings, but that would be labor-intensive. We're hoping to discover an algorithmic method for placing the markers automatically.
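Under the hood, root selection and the shift keys can be thought of as simple addition on MIDI pitch numbers. The sketch below is my own simplified model, not the aQWERTYon's actual code.

```typescript
// Sketch of root selection plus the shift-key transpose. The selected root
// moves every pitch, and holding shift nudges the whole keyboard by a semitone.
const ROOT_MIDI: Record<string, number> = {
  C: 60, Db: 61, D: 62, Eb: 63, E: 64, F: 65,
  Gb: 66, G: 67, Ab: 68, A: 69, Bb: 70, B: 71,
};

const MAJOR_SCALE = [0, 2, 4, 5, 7, 9, 11]; // semitones above the root

function notePitch(rootName: string, scaleDegree: number, shift: -1 | 0 | 1 = 0): number {
  return ROOT_MIDI[rootName] + MAJOR_SCALE[scaleDegree] + shift;
}

console.log(notePitch("C", 2));     // 64: E, the third of C major
console.log(notePitch("Eb", 2));    // 67: G, same key shape, new root
console.log(notePitch("C", 2, -1)); // 63: Eb, a chromatic note outside C major
```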

The other big design challenge we face is how to present all the different scale choices in a way that doesn’t overwhelm our core audience of non-expert users. One solution would just be to limit the scale choices. We already do that in the Soundfly course, in effect; when you land on a lesson, the embedded aQWERTYon is preset to the appropriate scale and key, and the user doesn’t even see the menus. But we’d like people to be able to explore the rich sonic diversity of the various scales without confronting them with technical Greek terms like “Lydian dominant”. Right now, the scales are categorized as Major, Minor and Other, but those terms aren’t meaningful to beginners. We’ve been discussing how we could organize the scales by mood or feeling, maybe from “brightest” to “darkest.” But how do you assign a mood to a scale? Do we just do it arbitrarily ourselves? Crowdsource mood tags? Find some objective sorting method that maps onto most listeners’ subjective associations? Some combination of the above? It’s an active area of research for us.
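One "objective sorting method" we could try is the familiar trick of ranking scales by how high their degrees sit above the root, summed in semitones; for the diatonic modes it reproduces the usual Lydian-to-Locrian brightness ordering. The sketch below is just an illustration of that idea, not something the aQWERTYon currently does.

```typescript
// Rank scales from "brightest" to "darkest" by summing their intervals
// (in semitones above the root). A rough heuristic, not a finished design.
const SCALES: Record<string, number[]> = {
  Lydian:           [0, 2, 4, 6, 7, 9, 11],
  Major:            [0, 2, 4, 5, 7, 9, 11],
  Mixolydian:       [0, 2, 4, 5, 7, 9, 10],
  Dorian:           [0, 2, 3, 5, 7, 9, 10],
  "Natural minor":  [0, 2, 3, 5, 7, 8, 10],
  Phrygian:         [0, 1, 3, 5, 7, 8, 10],
  Locrian:          [0, 1, 3, 5, 6, 8, 10],
};

const brightness = (intervals: number[]) =>
  intervals.reduce((sum, semitones) => sum + semitones, 0);

const sorted = Object.entries(SCALES).sort(([, a], [, b]) => brightness(b) - brightness(a));
console.log(sorted.map(([name]) => name));
// ["Lydian", "Major", "Mixolydian", "Dorian", "Natural minor", "Phrygian", "Locrian"]
```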

This issue of categorizing scales by mood has relevance for the original use case we imagined for the aQWERTYon: teaching film scoring. The idea behind the integrated video window was that you would load a video clip, set a mode, and then improvise some music that fit the emotional vibe of that clip. The idea of playing along with YouTube videos of songs came later. One could teach more general open-ended composition with the aQWERTYon, and in fact our friend Matt McLean is doing exactly that. But we're attracted to film scoring as a gateway because it's a more narrowly defined problem. Instead of just "write some music", the challenge is "write some music with a particular feeling to it that fits into a scene of a particular length."

Would you like to help us test and improve the aQWERTYon, or to design curricula around it? Would you like to help fund our programmers and designers? Please get in touch.

Space Oddity: from song to track

If you’ve ever wondered what it is that a music producer does exactly, David Bowie’s “Space Oddity” is a crystal clear example. To put it in a nutshell, a producer turns this:

Into this:

It’s also interesting to listen to the first version of the commercial recording, which is better than the demo, but still nowhere near as majestic as the final version. The Austin Powers flute solo is especially silly.

Should we even consider these three recordings to be the same piece of music? On the one hand, they’re all the same melody and chords and lyrics. On the other hand, if the song only existed in its demo form, or in the awkward Austin Powers version, it would never have made the impact that it did. Some of the impact of the final version lies in better recording techniques and equipment, but it’s more than that. The music takes on a different meaning in the final version. It’s bigger, trippier, punchier, tighter, more cinematic, more transporting, and in general about a thousand times more effective.

The producer's job is to marshal the efforts of songwriters, arrangers, performers and engineers to create a good-sounding recording. (The producer might also be a songwriter, arranger, performer, and/or engineer.) Producers are to songs what directors are to movies, or showrunners are to television.

When you’re thinking about a piece of recorded music, you’re really talking about three different things:

  1. The underlying composition, the part that can be represented on paper. Albin Zak calls this “the song.”
  2. The performance of the song.
  3. The finished recording, after overdubbing, mixing, editing, effects, and all the rest. Albin Zak calls this “the track.”

I had always assumed that Tony Visconti produced “Space Oddity,” since he produced a ton of other Bowie classics. As it turns out, though, Visconti was underwhelmed by the song, so he delegated it to his assistant, Gus Dudgeon. So what is it that Gus Dudgeon did precisely? First let’s separate out what he didn’t do.

You can hear from the demo that the chords, melody and lyrics were all in place before Bowie walked into the studio. They’re the parts reproduced by the subway busker I heard singing “Space Oddity” this morning. The demo includes a vocal arrangement that’s similar to the final one, aside from some minor phrasing changes. The acoustic guitar and Stylophone are in place as well. (I had always thought it was an oboe, but no, that droning sound is a low-tech synth.)

Gus Dudgeon took a song and a partial arrangement, and turned it into a track. He oversaw the addition of electric guitar, bass, drums, strings, woodwinds, and keyboards. He coached Bowie and the various studio musicians through their performances, selected the takes, and decided on effects like echoes and reverb. He supervised the mixing, which not only sets the relative loudness of the various sounds, but also affects their perceived location and significance. In short, he designed the actual sounds that you hear.

If you want to dive deep into the track, you’re in luck, because Bowie officially released the multitrack stems. Some particular points of interest:

  • The bassist, Herbie Flowers, was a rookie. The “Space Oddity” session was his first. He later went on to create the staggeringly great dual bass part in Lou Reed’s “Walk On The Wild Side.”
  • The strings were arranged and conducted by the multifaceted Paul Buckmaster, who a few years later would work with Miles Davis on the conception of On The Corner. Buckmaster’s cello harmonics contribute significantly to the psychedelic atmosphere–listen to the end of the stem labeled “Extras 1.”
  • The live strings are supplemented by Mellotron, played by future Yes keyboardist Rick Wakeman, he of the flamboyant gold cape.
  • Tony Visconti plays some flute and unspecified woodwinds, including the distinctive saxophone run that leads into the instrumental sections.

You can read a detailed analysis of the recording on the excellent Bowiesongs blog.

The big difference between the sixties and the present is that the track has assumed ever-greater importance relative to the song and the performance. In the age of MIDI and digital audio editing, live performance has become a totally optional component of music. The song is increasingly inseparable from the sounds used to realize it, especially in synth-heavy music like hip-hop and EDM. This shift gives the producer ever-greater importance in the creative process. There is really no such thing as a “demo” anymore, since anyone with a computer can produce finished-sounding tracks in their bedroom. If David Bowie were a kid now, he’d put together “Space Oddity” in GarageBand or FL Studio, with a lavish soundscape part of the conception from the beginning.

I want my students to understand that the words “producer” and “musician” are becoming synonymous. I want them to know that they can no longer focus solely on composition or performance and wait for someone else to craft a track around them. The techniques used to make “Space Oddity” were esoteric and expensive to realize at the time. Now, they’re easily within reach. But while the technology is more accessible, you still have to have the ideas. This is why it’s so valuable to study great producers like Tony Visconti and Gus Dudgeon: they’re a goldmine of sonic inspiration.

See also: a broader appreciation of Bowie.