How to record from the aQWERTYon

People ask us a lot if there’s a way to record the output of the aQWERTYon. We might introduce recording functionality some day, but in the meantime, there are two methods for recording your aQW performances.

Theory aQWERTYon

Method one: Record the audio

Don’t hold a mic or your phone up to the computer speakers! You can record the sound coming from the web browser directly inside the computer. On a Mac, my preferred tool is Audio Hijack–the free version is fine. On Windows, you can use Stereo Mix and/or Audacity.

Method two (recommended!): Record the MIDI

Thanks to the miracle of web MIDI, you can use the aQW to play software instruments in any DAW. The computer will think the aQW is a regular MIDI keyboard. This method gives you a lot more flexibility than recording the audio directly, because you can edit and quantize your MIDI, play it back on different software instruments, and import it into notation software.

In order to send MIDI from the aQWERTYon, you will need to open it in Google Chrome. You will also need to set up a virtual MIDI (IAC) bus. On a Mac, this is trivially easy; just follow steps one and two on this tutorial. On Windows, you will need to install MIDI-OX or loopMIDI; both are free.
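If you are curious what the browser side of "web MIDI" looks like, here is a minimal sketch using the standard Web MIDI API. It is only an illustration of the mechanism, not the aQWERTYon's actual code, and the port-matching logic is an assumption (on a Mac the IAC bus usually has "IAC" in its name).

```javascript
// Minimal sketch of sending a note from the browser over a virtual MIDI bus
// via the Web MIDI API (Chrome). Illustrative only; not the aQWERTYon's source.
navigator.requestMIDIAccess().then((midi) => {
  const outputs = [...midi.outputs.values()];
  // Assumption: pick the IAC bus on macOS, or fall back to the first available port.
  const output = outputs.find((port) => port.name.includes("IAC")) || outputs[0];
  if (!output) {
    console.log("No MIDI output ports available");
    return;
  }
  const middleC = 60;
  output.send([0x90, middleC, 100]);                      // note on, velocity 100
  setTimeout(() => output.send([0x80, middleC, 0]), 500); // note off half a second later
});
```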

Once you have your web MIDI going, the rest is easy. Open the aQW in a browser window. Then open GarageBand, Ableton, FL Studio, or whatever DAW you are using. Put a software instrument on a MIDI track. Switch back over to the aQW and play. You should hear both the sound of the aQW and the sound of your software instrument. Turn the volume on the aQW to zero, and you are ready to rock.

When you record MIDI, whether it’s from the aQW or a regular controller, it’s a good idea to do it over the metronome or (better yet) a drum loop. That way, everything you play will be lined up to the bars and beats. Then you can easily quantize, edit, copy and paste, and add more loops. If your MIDI is not lined up to the grid, it will be very difficult to do these things. Happy recording!

Announcing the Theory aQWERTYon

A few years ago, the NYU Music Experience Design Lab launched a web application called the aQWERTYon. The name is short for “QWERTY accordion,” because the idea is to make the computer keyboard as accessible for novice musicians as the chord buttons on an accordion. The aQWERTYon maps scales to the keyboard so that there are no “wrong notes,” and so that each column of keys plays a chord. Yesterday, we launched a new version of the app, the Theory aQWERTYon. It visualizes the notes you’re playing on the chromatic circle in real time. Click the image to try it! (Be sure to whitelist it on your ad blocker or it won’t work.)

Theory aQWERTYon

In addition to playing the built-in instruments, you can also use the aQWERTYon as a MIDI controller for any DAW or notation program via the IAC bus (Windows users will need to install MIDI-OX). Turn the aQWERTYon’s volume to zero if you’re doing this.

The color scheme on the pitch wheel is intended to give you some visual cues about how each scale is going to sound. Green notes are “bright”–i.e., major, natural, sharp, or augmented. Blue notes are “dark”–i.e., minor, flat, or diminished. Purple notes are neither bright nor dark, i.e. perfect fourths, fifths and octaves. Grey notes are outside the selected scale. Finally, orange notes are the ones that are currently being played. If you play two notes at a time, they will be connected by an orange line. If you play three or more notes at a time, they will form an orange shape. These geometric visualizations are meant to support and complement your aural understanding of intervals and chords, the way that they do with rhythms on the Groove Pizza.
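As a rough illustration of the mapping just described, here is how the color logic might look in code. The function and the category keywords are my own sketch, not the Theory aQWERTYon's actual implementation.

```javascript
// Sketch of the pitch-wheel color scheme described above (my illustration only).
function noteColor(quality, inScale, isSounding) {
  if (isSounding) return "orange";                 // currently played notes
  if (!inScale) return "grey";                     // notes outside the selected scale
  const bright = ["major", "natural", "sharp", "augmented"];
  const dark = ["minor", "flat", "diminished"];
  if (bright.some((word) => quality.includes(word))) return "green";
  if (dark.some((word) => quality.includes(word))) return "blue";
  return "purple";                                 // perfect fourths, fifths, octaves
}

console.log(noteColor("minor third", true, false));   // "blue"
console.log(noteColor("perfect fifth", true, false)); // "purple"
```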

This idea has been in the pipeline for a while, but the impetus to finally push it to completion was my Fundamentals of Western Music class at the New School. I have been drawing scales and chords on the chromatic circle by hand for a long time, and I wanted to be able to produce them automatically. You can read about the design process here, and read about the pitch wheel specifically here.

Eventually we would like the aQWERTYon to show other real-time information as well: notes on the staff, chord symbols, and the like. We want to do for the web browser what Samuel Halligan’s pop-up piano does for Ableton Live Suite: turn it into a visual and aural Rosetta stone that translates in real time between different visual and aural representations of music.

If you use the aQWERTYon in your classroom, or for your own personal exploration (and we hope you do), please let us know!

The Groove Pizza now exports MIDI

Since its launch, you’ve been able to export your Groove Pizza beats as WAV files, or continue working on them in Soundtrap. But now, thanks to MusEDLab developer Jordana Bombi, you can save your beats as MIDI files as well.

Groove Pizza MIDI export

You can bring these MIDI files into your music production software tool of choice: Ableton Live, Logic, Pro Tools, whatever. How cool is that?

There are a few limitations at the moment: your beats will be rendered in 4/4 time, regardless of how many slices your pizza has. You can always set the right time signature after you bring the MIDI into your production software. Also, your grooves will export with no swing–you’ll need to reinstate that in your software as well.
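For anyone wondering what "reinstating swing" actually involves, one common convention is to delay every other sixteenth-note step by a fraction of the step length. The little function below is my own sketch of that arithmetic, not part of the Groove Pizza's export code.

```javascript
// Sketch of one common swing convention (illustrative; not the Groove Pizza's code):
// every other sixteenth-note step is delayed by a fraction of the step length.
function swungTimeMs(stepIndex, bpm, swingAmount = 0.5) {
  const stepMs = 60000 / bpm / 4;            // one sixteenth note, e.g. 125 ms at 120 bpm
  const straight = stepIndex * stepMs;
  const isOffbeat = stepIndex % 2 === 1;     // the "e" and "a" of each beat
  return straight + (isOffbeat ? swingAmount * stepMs * 0.5 : 0);
}

console.log(swungTimeMs(1, 120)); // 156.25 instead of a straight 125
console.log(swungTimeMs(2, 120)); // 250: downbeats stay on the grid
```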

We have some more enhancements in the pipeline, aside from fixing the limitations just mentioned. We’re working on a “continue in Noteflight” feature, real-time MIDI input and output, and live performance using the QWERTY keyboard. I’ll keep you posted.

Design for Real Life – QWERTYBeats research

Writing assignment for Design For The Real World with Claire Kearney-Volpe and Diana Castro – research about a new rhythm interface for blind and low-vision novice musicians

Definition

I propose a new web-based accessible rhythm instrument called QWERTYBeats. Traditional instruments are highly accessible to blind and low-vision musicians. Electronic music production tools are not. I look at the history of accessible instruments and software interfaces, give an overview of current electronic music hardware and software, and discuss the design considerations underlying my project.

QWERTYBeats logo

Historical overview

Acoustic instruments give rich auditory and haptic feedback, and pose little obstacle to blind musicians. We need look no further for proof than the long history of iconic blind musicians like Ray Charles and Stevie Wonder. Even sighted instrumentalists rarely look at their instruments once they have attained a sufficient level of proficiency. Standard music notation is not accessible, but Braille music notation has existed since the Braille system’s inception. Also, a great many musicians, both blind and sighted, play entirely by ear anyway.

Most of the academic literature around accessibility issues in music education focuses on wider adoption of and support for Braille notation. See, for example, Rush, T. W. (2015). Incorporating Assistive Technology for Students with Visual Impairments into the Music Classroom. Music Educators Journal, 102(2), 78–83. For electronic music, notation is rarely if ever a factor.

Electronic instruments pose some new accessibility challenges. They may use graphical interfaces with nested menus, complex banks of knobs and patch cables, and other visual control surfaces. Feedback may be given entirely with LED lights and small text labels. Nevertheless, blind users can master these devices with sufficient practice, memorization and assistance. For example, Stevie Wonder has incorporated synthesizers and drum machines in most of his best-known recordings.

Most electronic music creation is currently done not with instruments, but rather using specialized software applications called digital audio workstations (DAWs). Keyboards and other controllers are mostly used to access features of the software, rather than as standalone instruments. The most commonly-used DAWs include Avid Pro Tools, Apple Logic, Ableton Live, and Steinberg Cubase. Mobile DAWs are more limited than their desktop counterparts, but are nevertheless becoming robust music creation tools in their own right. Examples include Apple GarageBand and Steinberg Cubasis. Notated music is commonly composed using score editing software like Sibelius and Finale, whose functionality increasingly overlaps with DAWs, especially in regard to MIDI sequencing.

DAWs and notation editors pose steep accessibility challenges due to their graphical and spatial interfaces, not to mention their sheer complexity. In class, we were given a presentation by Leona Godin, a blind musician who records and edits audio using Pro Tools by means of VoiceOver. While it must have taken a heroic effort on her part to learn the program, Leona demonstrates that it is possible. However, some DAWs pose insurmountable problems even to very determined blind users because they do not use standard operating system elements, making them inaccessible via screen readers.

Technological interventions

There are no mass-market electronic interfaces specifically geared toward blind or low-vision users. In this section, I discuss one product frequently hailed for its “accessibility” in the colloquial rather than blindness-specific sense, along with some more experimental and academic designs.

Ableton Push

Push layout for IMPACT Faculty Showcase

Ableton Live has become the DAW of choice for electronic music producers. Low-vision users can zoom in to the interface and modify the color scheme. However, Live is inaccessible via screen readers.

In recent years, Ableton has introduced a hardware controller, the Push, which is designed to make the software experience more tactile and instrument-like. The Push combines an eight by eight grid of LED-lit touch pads with banks of knobs, buttons and touch strips. It makes it possible to create, perform and record a piece of music from scratch without looking at the computer screen. In addition to drum programming and sampler performance, the Push also has an innovative melodic mode which maps scales onto the grid in such a way that users cannot play a wrong note. Other comparable products exist; see, for example, the Native Instruments Maschine.

There are many pad-based drum machines and samplers. Live’s main differentiator is its Session view, where the pads launch clips: segments of audio or MIDI that can vary in length from a single drum hit to an entire song. Clip launching is tempo-synced, so when you trigger a clip, playback is delayed until the start of the next measure (or whatever the quantization interval is). Clip launching is a forgiving and beginner-friendly performance method, because it removes the possibility of playing something out of rhythm. Like other DAWs, Live also gives rhythmic scaffolding in its software instruments by means of arpeggiators, delay and other tempo-synced features.
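The quantized launching behavior is easy to express as arithmetic. Here is a minimal sketch of the idea as I understand it (not Ableton's actual code): a clip triggered partway through a measure waits for the next boundary.

```javascript
// Sketch of tempo-synced clip launching (my illustration, not Ableton's code):
// a clip triggered mid-measure waits until the next quantization boundary to start.
function msUntilLaunch(nowMs, bpm, beatsPerBoundary = 4) {
  const beatMs = 60000 / bpm;                   // 500 ms per beat at 120 bpm
  const boundaryMs = beatMs * beatsPerBoundary; // one 4/4 measure by default
  const positionInMeasure = nowMs % boundaryMs;
  return positionInMeasure === 0 ? 0 : boundaryMs - positionInMeasure;
}

// Trigger a clip 1.3 seconds into a measure at 120 bpm: it starts 700 ms later, on the downbeat.
console.log(msUntilLaunch(1300, 120)); // 700
```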

The Push is a remarkable interface, but it has some shortcomings for blind users. First of all, it is expensive: $800 for the entry-level version and $1400 for the full-featured software suite. Much of its feedback is visual, in the form of LED screens and color-coded lighting on the pads. It switches between multiple modes, which can be challenging to distinguish even for sighted users. And, like the software it accompanies, the Push is highly complex, with a steep learning curve unsuited to novice users, blind or sighted.

The aQWERTYon

Most DAWs enable users to perform MIDI instruments on the QWERTY keyboard. The most familiar example is the Musical Typing feature in Apple GarageBand.

GarageBand musical typing

Musical Typing makes it possible to play software instruments without an external MIDI controller, which is convenient and useful. However, its layout counterintuitively follows the piano keyboard, which is an awkward fit for the computer keyboard. There is no easy way to distinguish the black and white keys, and even expert users find themselves inadvertently hitting the keyboard shortcut for recording while hunting for F-sharp.

The aQWERTYon is a web interface developed by the NYU Music Experience Design Lab specifically intended to address the shortcomings of Musical Typing.

aQWERTYon screencap

Rather than emulating the piano keyboard, the aQWERTYon draws its inspiration from the chord buttons of an accordion. It fills the entire keyboard with harmonically related notes in a way that supports discovery by naive users. Specifically, it maps scales across the rows of keys, staggered by intervals such that each column forms a chord within the scale. Root notes and scales can be set from pulldown menus within the interface, or preset using URL parameters. It can be played as a standalone instrument, or as a MIDI controller in conjunction with a DAW. Here is a playlist of music I created using the aQWERTYon and GarageBand or Ableton Live:

The aQWERTYon is a completely tactile experience. Sighted users can carefully match keys to note names using the screen, but more typically approach the instrument by feel, seeking out patterns on the keyboard by ear. A blind user would need assistance loading the aQWERTYon initially and setting the scale and root note parameters, but otherwise, it is perfectly accessible. The present project was motivated in large part by a desire to make exploration of rhythm as playful and intuitive as the aQWERTYon makes exploring chords and scales.

Soundplant

The QWERTY keyboard can be turned into a simple drum machine quite easily using a free program called Soundplant. The user simply drags an audio file onto a graphical key to have it triggered by the corresponding physical key. I was able to create a TR-808 kit in a matter of minutes:

Soundplant with 808 samples

Once it is set up, Soundplant can be as effortlessly accessible as the aQWERTYon. However, it does not give the user any rhythmic assistance. Drumming in perfect time is an advanced musical skill, and playing drum machine samples out of time is not much more satisfying than banging out of time on a metal bowl with a spoon. An ideal drum interface would offer beginners some of the rhythmic scaffolding and support that Ableton provides via Session view, arpeggiators, and the like.

The Groove Pizza

Drum machines and their software counterparts offer an alternative form of rhythmic scaffolding. The user sequences patterns in a time-unit box system or piano roll, and the computer performs those patterns flawlessly. The MusEDLab’s Groove Pizza app is a web-based drum sequencer that wraps the time-unit box system into a circle.

Groove Pizza - Bembe

The Groove Pizza was designed to make drum programming more intuitive by visualizing the symmetries and patterns inherent in musical-sounding rhythms. However, it is totally unsuitable for blind or low-vision users. Interaction is only possible through the mouse pointer or touch, and there are no standard user interface elements that can be parsed by screen readers.

Before ever considering designing for the blind, the MusEDLab had already considered the Groove Pizza’s limitations for younger children and users with special needs: there is no “live performance” mode, and there is always some delay in feedback between making a change in the drum pattern and hearing the result. We have been considering ways to make a rhythm interface that is more immediate, performance-oriented and tactile. One possible direction would be to create a hardware version of the Groove Pizza; indeed, one of the earliest prototypes was a hardware version built by Adam November out of a pizza box. However, hardware design is vastly more complex and difficult than software, so for the time being, software promises more immediate results.

Haenselmann-Lemelson-Effelsberg MIDI sequencer

This experimental interface is described in Haenselmann, T., Lemelson, H., & Effelsberg, W. (2011). A zero-vision music recording paradigm for visually impaired people. Multimedia Tools and Applications, 5, 1–19.

Haenselmann-Lemelson-Effelsberg MIDI sequencer

The authors create a new mode for a standard MIDI keyboard that maps piano keys to DAW functions like playback, quantization, track selection, and so on. They also add “earcons” (auditory icons) to give sonic feedback when particular functions have been activated that normally only give graphical feedback. For example, one earcon sounds when recording is enabled; another sounds for regular playback. This interface sounds promising, but there are significant obstacles to its adoption. While the authors have released the source code as a free download, that requires a would-be user to be able to compile and run it. This is presuming that they could access the code in the first place; the download link given in the paper is inactive. It is an all-too-common fate of academic projects to never get widespread usage. By posting our projects on the web, the MusEDLab hopes to avoid this outcome.

Statement

Music education philosophy

My project is animated by a constructivist philosophy of music education, which operates by the following axiomatic assumptions:

  • Learning by doing is better than learning by being told.
  • Learning is not something done to you, but rather something done by you.
  • You do not get ideas; you make ideas. You are not a container that gets filled with knowledge and new ideas by the world around you; rather, you actively construct knowledge and ideas out of the materials at hand, building on top of your existing mental structures and models.
  • The most effective learning experiences grow out of the active construction of all types of things, particularly things that are personally or socially meaningful, that you develop through interactions with others, and that support thinking about your own thinking.

If an activity’s challenge level is beyond your ability, you experience anxiety. If your ability at the activity far exceeds the challenge, the result is boredom. Flow happens when challenge and ability are well-balanced, as seen in this diagram adapted from Csikszentmihalyi.

Flow

Music students face significant obstacles to flow at the left side of the Ability axis. Most instruments require extensive practice before it is possible to make anything that resembles “real” music. Electronic music presents an opportunity here, because even a complete novice can quickly produce music with a high degree of polish. It is empowering to use technologies that make it impossible to do anything wrong; it frees you to begin exploring what you find to sound right. Beginners can be scaffolded in their pitch explorations with MIDI scale filters, Auto-Tune, and the configurable software keyboards in apps like Thumbjam and Animoog. Rhythmic scaffolding is rarer, but it can be had via Ableton’s quantized clip launcher, MIDI arpeggiators, and the Note Repeat feature on many drum machines.

QWERTYBeats proposal

My project takes the Note Repeat feature found on drum machines as its jumping-off point. When Note Repeat is activated, holding down a drum pad triggers the corresponding sound at a particular rhythmic interval: quarter notes, eighth notes, and so on. On the Ableton Push, Note Repeat automatically syncs to the global tempo, making it effortless to produce musically satisfying rhythms. However, this mode has a major shortcoming: it applies globally to all of the drum pads. To my knowledge, no drum machine makes it possible to simultaneously have, say, the snare drum playing every dotted eighth note while the hi-hat plays every sixteenth note.

I propose a web application called QWERTYBeats that maps drums to the computer keyboard as follows:

  • Each row of the keyboard triggers a different drum/beatbox sound (e.g. kick, snare, closed hi-hat, open hi-hat).
  • Each column retriggers the sample at a different rhythmic interval (e.g. quarter note, dotted eighth note).
  • Circles dynamically divide into “pie slices” to show rhythmic values.

The rhythm values are listed below by column, each described and followed by its duration as a fraction of a beat (a quarter note = 1); a minimal code sketch of the resulting timing math follows the list.

  1. quarter note (1)
  2. dotted eighth note (3/4)
  3. quarter note triplet (2/3)
  4. eighth note (1/2)
  5. dotted sixteenth note (3/8)
  6. eighth note triplet (1/3)
  7. sixteenth note (1/4)
  8. dotted thirty-second note (3/16)
  9. sixteenth note triplet (1/6)
  10. thirty-second note (1/8)
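Here is that timing math sketched in code. The row and sound assignments are hypothetical placeholders, playSample() is a stand-in for real audio playback, and a production version would schedule against the Web Audio clock rather than setInterval; this is only an illustration of the proposal, not the finished app.

```javascript
// Sketch of the QWERTYBeats mapping proposed above (illustration only):
// rows choose a drum sound, columns choose a retrigger interval as a fraction of a beat,
// and holding a key repeats that sound at that interval.
const KEY_ROWS = ["qwertyuiop", "asdfghjkl", "zxcvbnm"];    // hypothetical three-row layout
const ROW_SOUNDS = ["closedHat", "snare", "kick"];          // hypothetical sound assignment
const COLUMN_FRACTIONS = [1, 3/4, 2/3, 1/2, 3/8, 1/3, 1/4, 3/16, 1/6, 1/8]; // from the list above
const BPM = 120;

function playSample(name) { console.log("trigger", name); } // stand-in for real audio playback

const held = {}; // key -> timer id, so a held key keeps retriggering
document.addEventListener("keydown", (event) => {
  const key = event.key.toLowerCase();
  if (held[key]) return;                                    // ignore auto-repeat events
  const rowIndex = KEY_ROWS.findIndex((row) => row.includes(key));
  if (rowIndex === -1) return;                              // not a mapped key
  const colIndex = KEY_ROWS[rowIndex].indexOf(key);
  if (colIndex >= COLUMN_FRACTIONS.length) return;          // unused keys at the right edge
  const intervalMs = (60000 / BPM) * COLUMN_FRACTIONS[colIndex]; // eighth-note column: 250 ms
  playSample(ROW_SOUNDS[rowIndex]);
  held[key] = setInterval(() => playSample(ROW_SOUNDS[rowIndex]), intervalMs); // drifts; a real app would use the Web Audio clock
});
document.addEventListener("keyup", (event) => {
  const key = event.key.toLowerCase();
  clearInterval(held[key]);
  delete held[key];
});
```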

By simply holding down different combinations of keys, users can attain complex syncopations and polyrhythms. If the app is synced to the tempo of a DAW or music playback, the user can perform good-sounding rhythms over any song that is personally meaningful to them.

The column layout leaves some unused keys in the upper right corner of the keyboard: “-”, “=”, “[”, “]”, and so on. These can be reserved for setting the tempo and other UI elements.

The app defaults to Perform Mode, but clicking Make New Kit opens Sampler mode, where users can import or record their own drum sounds:

  • Keyboard shortcuts enable the user to select a sound, audition it, record, set start and end point, and set its volume level.
  • A login/password system enables users to save kits to the cloud where they can be accessed from any computer. Kits get unique URL identifiers, so users can also share them via email or social media.

It is my goal to make the app accessible to users with the widest possible diversity of abilities.

  • The entire layout will use plain text, CSS and JavaScript to support screen readers.
  • All user interface elements can be accessed via the keyboard: tab to change the keyboard focus, menu selections and parameter changes via the up and down arrows, and so on.

Perform Mode:

QWERTYBeats concept images - Perform mode

Sampler Mode:

QWERTYBeats concept images - Sampler mode

Mobile version

The present thought is to divide up the screen into a grid mirroring the layout of the QWERTY keyboard. User testing will determine whether this will produce a satisfying experience.

QWERTYDrum - mobile

Prototype

I created a prototype of the app using Ableton Live’s Session View.

QWERTYBeats - Ableton prototype

Here is a sample performance:

There is not much literature examining the impact of drum programming and other electronic rhythm sequencing on students’ subsequent ability to play acoustic drums, or to keep time more accurately in general. I can report anecdotally that my own time spent sequencing and programming drums improved my drumming and timekeeping enormously (and mostly inadvertently.) I will continue to seek further support for the hypothesis that electronically assisted rhythm creation builds unassisted rhythmic ability. In the meantime, I am eager to prototype and test QWERTYBeats.

Rohan lays beats

The Ed Sullivan Fellows program is an initiative by the NYU MusEDLab connecting up-and-coming hip-hop musicians to mentors, studio time, and creative and technical guidance. Our session this past Saturday got off to an intense start, talking about the role of young musicians of color in a world of police brutality and Black Lives Matter. The Fellows are looking to Kendrick Lamar and Chance The Rapper to speak social and emotional truths through music. It’s a brave and difficult job they’ve taken on.

Eventually, we moved from heavy conversation into working on the Fellows’ projects, which this week involved branding and image. I was at kind of a loose end in this context, so I set up the MusEDLab’s Push controller and started playing around with it. Rohan, one of the Fellows, immediately gravitated to it, and understandably so.

Indigo lays beats

Rohan tried out a few drum sounds, then some synths. He quickly discovered a four-bar synth loop that he wanted to build a track around. He didn’t have any Ableton experience, however, so I volunteered to be his co-producer and operate the software for him.

We worked out some drum parts, first with a hi-hat and snare from the Amen break, and then a kick, clap and more hi-hats from Ableton’s C78 factory instrument. For bass, Rohan wanted that classic booming hip-hop sound you hear coming from car stereos in Brooklyn. He spotted the Hip-Hop Sub among the presets. We fiddled with it and he continued to be unsatisfied until I finally just put a brutal compressor on it, and then we got the sound he was hearing in his head.

While we were working, I had my computer connected to a Bluetooth speaker that was causing some weird and annoying system behavior. At one point, iTunes launched itself and started playing a random song under Rohan’s track, “I Can’t Realize You Love Me” by Duke Ellington and His Orchestra, featuring The Harlem Footwarmers and Sid Garry.

Rohan liked the combination of his beat and the Ellington song, so I sampled the opening four bars and added them to the mix. It took me several tries to match the keys, and I still don’t think I really nailed it, but the hip-hop kids have broad tolerance for chord clash, and Rohan was undisturbed.

Once we had the loops assembled, we started figuring out an arrangement. It took me a minute to figure out that when Rohan refers to a “bar,” he means a four-measure phrase. He’s essentially conflating hypermeasures with measures. I posted about it on Twitter later and got some interesting responses.

In a Direct Message, Latinfiddler also pointed out that Latin music calls two measures a “bar” because that’s the length of one cycle of the clave.

Thinking about it further, there’s yet another reason to conflate measures with hypermeasures, which is the broader cut-time shift taking place in hip-hop. All of the young hip-hop beatmakers I’ve observed lately work at half the base tempo of their DAW session. Rohan, being no exception, had the session tempo set to 125 bpm, but programmed a beat with an implied tempo of 62.5 bpm. He and his cohort put their backbeats on beat three, not beats two and four, so they have a base grid of thirty-second notes rather than sixteenth notes. A similar shift took place in the early 1960s when the swung eighth notes of jazz rhythm gave way to the swung sixteenth notes of funk.

Here’s Rohan’s track as of the end of our session:

By the time we were done working, the rest of the Fellows had gathered around and started freestyling. The next step is to record them rapping and singing on top. We also need to find someone to mix it properly. I understand aspects of hip-hop very well, but I mix amateurishly at best.

All the way around, I feel like I learn a ton about music whenever I work with young hip-hop musicians. They approach the placement of sounds in the meter in ways that would never occur to me. I’m delighted to be able to support them technically in realizing their ideas; it’s a privilege for me.

Project-based music technology teaching

I use a project-based approach to teaching music technology. Technical concepts stick with you better if you learn them in the course of making actual music. Here’s the list of projects I assign to my college classes and private students. I’ve arranged them from easiest to hardest. The first five projects are suitable for a beginner-level class using any DAW–my beginners use GarageBand. The last two projects are more advanced and require a DAW with sophisticated editing tools and effects, like Ableton Live. If you’re a teacher, feel free to use these (and let me know if you do). Same goes for all you bedroom producers and self-teachers.

The projects are agnostic as to musical content, style or genre. However, the computer is best suited to making electronic music, and most of these projects work best in the pop/hip-hop/techno sphere. Experimental, ambient or film music approaches also work well. Many of them draw on the Disquiet Junto. Enjoy.

Tristan gets his FFT on

Loops

Assignment: Create a song using only existing loops. You can use these or these, or restrict yourself to the loops included with your DAW. Do not use any additional sounds or instruments.

For beginners, I like to split this into two separate assignments. First, create a short (two- or four-bar) phrase using four to six instrument loops and beats. Then use that set of loops as the basis of a full-length track, by repeating, and by having sounds enter and exit.

Concepts:

  • Basic DAW functions
  • Listening like a producer
  • Musical form and song structures
  • Intellectual property, copyright and authorship

Hints:

  • MIDI loops are easier to edit and customize than audio loops.
  • Try slicing audio loops into smaller segments. Use only the front or back half of the loop. Or rearrange segments into a different order.

final song

MIDI

Assignment: Create a piece of music using MIDI and software instruments. Do not record or import any audio. You can use MIDI from any source, including: playing keyboards, drum pads or other interfaces; drawing in the piano roll; importing scores from notation programs; downloading MIDI files from the internet (for example, from here); or using the Audio To MIDI function in your DAW. 

I don’t treat this as a composition exercise (unless students want to make it one.) Feel free to use an existing piece of music. The only requirement is that the end result has to sound good. Simply dragging a classical or pop MIDI into the DAW is likely to sound terrible unless you put some thought into your instrument choices. If you do want to create something original, try these compositional prompts.

Concepts:

  • MIDI recording and editing
  • Quantization, swing, and grooves
  • “Real” vs “fake” instruments
  • Synthesized vs sampled sounds
  • Drum programming
  • Interfaces and controllers

Hints:

  • For beginners, see this post on beatmaking fundamentals.
  • Realism is unattainable. Embrace the fakeness.
  • Find a small segment of a classical piece and loop it.
  • Rather than playing back a Bach keyboard piece on piano or harpsichord, set your instrument to drums or percussion, and get ready for joy.

Montclair State Music Tech 101

Found sound

Assignment: Record a short environmental sound and incorporate it into a piece of music. You can edit and process your found sound as you see fit. Variation: use existing sounds from Freesound.

Concepts:

  • Audio recording, editing, and effects
  • The musical potential of “non-musical” sounds

Hints:

  • Students usually record their sounds with their phones, and the resulting recording quality is often poor. Try using EQ, compression, delay, reverb, distortion, and other effects to mitigate (or deliberately exaggerate) the low fidelity and background noise.

pyt stems

Peer remix

Assignment: Remix a track by one of your classmates (or friends, or a stranger on the internet.) Feel free to incorporate other pieces of music as well. Follow your personal definition of the word “remix.” That might mean small edits and adjustments to the mix and effects, or a radical reworking leading to complete transformation of the source material.

There are endless variations on the peer remix. Try the “metaremix,” where students remix each other’s remixes, to the nth degree as time permits. Also, do group remix activities like Musical Shares or FX Roulette.

Concepts:

  • Collaboration and authorship
  • Sampling
  • Mashups
  • Evolution of musical ideas
  • Musical critique using musical language

Hints:

  • A change in tempo can have dramatic effects on the mood and feel of a track.
  • Adding sounds is the obvious move, but don’t be afraid to remove things too.

Self remix

Assignment: Remix one of your own projects, using the same guidelines as the peer remix. This is a good project for the end of the semester/term.

Song transformation

Assignment: Take an existing song and turn it into a new song. Don’t use any additional sounds or MIDI.

Concepts:

  • Advanced audio editing and effects
  • Musical form and structure
  • The nature of originality

Hints:

  • You can transform short segments simply by repeating them out of context. For example, try taking single chords or lyrical phrases and looping them.

Serato

Shared sample

Assignment: Take a short audio sample (five seconds or less) and build a complete piece of music out of it. Do not use any other sounds. This is the most difficult assignment here, and the most rewarding one if you can pull it off successfully.

Concepts:

  • Advanced audio editing and effects
  • Musical form and structure
  • The nature of originality

Hints:

  • Pitch shifting and timestretching are your friends.
  • Short bursts of noise can be tuned up and down to make drums.
  • Extreme timestretching produces great ambient textures.

Mobile music at IMPACT

Writing assignments

I like to have students document their process in blog posts. I ask: What sounds and techniques did you use? Why did you use them? Are you happy with the end result? Given unlimited time and expertise, what changes would you make? Do you consider this to be a valid form of musical creativity?

This semester I also asked students to write reviews of each other’s work in the style of their preferred music publication. In the future, I plan to have students write a review of an imaginary track, and then assign other students to try to create the track being described.

The best way to learn how to produce good recordings is to do critical listening exercises. I assign students to create musical structure and space graphs in the spirit of William Moylan.

Further challenges

The projects above were intended to be used for a one-semester college class. If I were teaching over a longer time span or needed more assignments, I would draw from the Disquiet Junto, Making Music by Dennis DeSantis, or the Oblique Strategies cards. Let me know in the comments if you have additional recommendations.

Inside the aQWERTYon

The MusEDLab and Soundfly just launched Theory For Producers, an interactive music theory course. The centerpiece of the interactive component is a MusEDLab tool called the aQWERTYon. You can try it by clicking the image below.

aQWERTYon screencap

In this post, I’ll talk about why and how we developed the aQWERTYon.

One of our core design principles is to work within our users’ real-world technological limitations. We build tools in the browser so they’ll be platform-independent and accessible anywhere there’s internet access (and where there isn’t internet access, we’ve developed the “MusEDLab in a box.”) We want to find out what musical possibilities there are in a typical computer with no additional software or hardware. That question led us to investigate ways of turning the standard QWERTY keyboard into a beginner-friendly instrument. We were inspired in part by GarageBand’s Musical Typing feature.

GarageBand musical typing

If you don’t have a MIDI controller, Apple thoughtfully made it possible for you to use your computer keyboard to play GarageBand’s many software instruments. You get an octave and a half of piano, plus other useful controls: pitch bend, modulation, sustain, octave shifting and simple velocity control. Many DAWs offer something similar, but Apple’s system is the most sophisticated I’ve seen.

Handy though it is, Musical Typing has some problems as a user interface. The biggest one is the poor fit between the piano keyboard layout and the grid of computer keys. Typing the letter A plays the note C. The rest of that row is the white keys, and the one above it is the black keys. You can play the chromatic scale by alternating A row, Q row, A row, Q row. That basic pattern is easy enough to figure out. However, you quickly get into trouble, because there’s no black key between E and F. The QWERTY keyboard gives no visual reminder of that fact, so you just have to remember it. Unfortunately, the “missing” black key happens to be the letter R, which is GarageBand’s keyboard shortcut for recording. So what inevitably happens is that you’re hunting for E-flat or F-sharp and you accidentally start recording over whatever you were doing. I’ve been using the program for years and still do this routinely.

Rather than recreating the piano keyboard on the computer, we drew on a different metaphor: the accordion.

The accordion: the user interface metaphor of the future!

We wanted to have chords and scales arranged in an easily discoverable way, like the way you can easily figure out the chord buttons on the accordion’s left hand. The QWERTY keyboard is really a staggered grid four keys tall and between ten and thirteen keys wide, plus assorted modifier and function keys. We decided to use the columns for chords and the rows for scales.

For the diatonic scales and modes, the layout is simple. The bottom row gives the notes in the scale starting on scale degree 1. The second row has the same scale shifted over to start on degree 3. The third row starts the scale on degree 5, and the top row starts on degree 1 an octave up. If this sounds confusing when you read it, try playing it; your ears will immediately pick up the pattern. Notes in the same column form the diatonic chords, with their Roman numerals conveniently matching the number keys. There are no wrong notes, so even just mashing keys at random will sound at least okay. Typing your name usually sounds pretty cool, and picking out melodies is a piece of cake. Playing diagonals, like Z-S-E-4, gives you chords voiced in fourths. The same layout approach works great for any seven-note scale: all of the diatonic modes, plus the modes of harmonic and melodic minor.
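Here is a minimal sketch of that layout logic, assuming a C major scale and middle C as the root. It is my reconstruction of the mapping described above, not the aQWERTYon's actual source.

```javascript
// Sketch of the row/column mapping described above (my reconstruction, not the actual source).
// Rows start the scale on degrees 1, 3, 5, and 1 an octave up, so the notes stacked
// in any one column form a diatonic chord.
const C_MAJOR = [0, 2, 4, 5, 7, 9, 11];            // semitone offsets of a seven-note scale

function midiNoteForKey(rowIndex, columnIndex, root = 60, scale = C_MAJOR) {
  const rowStartDegrees = [0, 2, 4, 7];            // scale degrees 1, 3, 5, and 1 + octave
  const degree = rowStartDegrees[rowIndex] + columnIndex;
  const octave = Math.floor(degree / scale.length);
  return root + scale[degree % scale.length] + 12 * octave;
}

// The bottom row walks up the scale: C, D, E, F...
console.log([0, 1, 2, 3].map((col) => midiNoteForKey(0, col))); // [60, 62, 64, 65]
// One column read bottom to top is a chord: C, E, G, C (the I chord).
console.log([0, 1, 2, 3].map((row) => midiNoteForKey(row, 0))); // [60, 64, 67, 72]
```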

Pentatonics work pretty much the same way as seven-note scales, except that the columns stack in fourths rather than fifths. The octatonic and diminished scales lay out easily as well. The real layout challenge lay in one strange but crucial exception: the blues scale. Unlike other scales, you can’t just stagger the blues scale pitches in thirds to get meaningful chords. The melodic and harmonic components of blues are more or less unrelated to each other. Our original idea was to put the blues scale on the bottom row of keys, and then use the others to spell out satisfying chords on top. That made it extremely awkward to play melodies, however, since the keys don’t form an intelligible pattern of intervals. Our compromise was to create two different blues modes: one with the chords, for harmony exploration, and one just repeating the blues scale in octaves for melodic purposes. Maybe a better solution exists, but we haven’t figured it out yet.

When you select a different root, all the pitches in the chords and scales are automatically changed as well. Even if the aQWERTYon had no other features or interactivity, this would still make it an invaluable music theory tool. But root selection raises a bigger question: what do you do about all the real-world music that uses more than one scale or mode? Totally uniform modality is unusual, even in simple pop songs. You can access notes outside the currently selected scale by pressing the shift keys, which transposes the entire keyboard up or down a half step. But what would be really great is if we could get the scale settings to change dynamically. Wouldn’t it be great if you were listening to a jazz tune, and the scale was always set to match whatever chord was going by at that moment? You could blow over complex changes effortlessly. We’ve discussed manually placing markers in YouTube videos that tell the aQWERTYon when to change its settings, but that would be labor-intensive. We’re hoping to discover an algorithmic method for placing markers automatically.
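As a sketch of what those markers could look like, here is one possible data structure and lookup; the times, keys, and field names are hypothetical, just to illustrate the concept.

```javascript
// Sketch of the marker idea discussed above (hypothetical data and field names):
// timestamped root/scale settings that the aQWERTYon could follow during playback.
const markers = [
  { time: 0,  root: "C", scale: "mixolydian" },
  { time: 16, root: "F", scale: "mixolydian" },
  { time: 24, root: "C", scale: "mixolydian" },
  { time: 32, root: "G", scale: "mixolydian" },
];

function settingsAt(seconds) {
  // Most recent marker at or before the current playback time.
  return markers.filter((m) => m.time <= seconds).pop() || markers[0];
}

console.log(settingsAt(18)); // { time: 16, root: "F", scale: "mixolydian" }
```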

The other big design challenge we face is how to present all the different scale choices in a way that doesn’t overwhelm our core audience of non-expert users. One solution would just be to limit the scale choices. We already do that in the Soundfly course, in effect; when you land on a lesson, the embedded aQWERTYon is preset to the appropriate scale and key, and the user doesn’t even see the menus. But we’d like people to be able to explore the rich sonic diversity of the various scales without confronting them with technical Greek terms like “Lydian dominant”. Right now, the scales are categorized as Major, Minor and Other, but those terms aren’t meaningful to beginners. We’ve been discussing how we could organize the scales by mood or feeling, maybe from “brightest” to “darkest.” But how do you assign a mood to a scale? Do we just do it arbitrarily ourselves? Crowdsource mood tags? Find some objective sorting method that maps onto most listeners’ subjective associations? Some combination of the above? It’s an active area of research for us.

This issue of categorizing scales by mood has relevance for the original use case we imagined for the aQWERTYon: teaching film scoring. The idea behind the integrated video window was that you would load a video clip, set a mode, and then improvise some music that fit the emotional vibe of that clip. The idea of playing along with YouTube videos of songs came later. One could teach more general open-ended composition with the aQWERTYon, and in fact our friend Matt McLean is doing exactly that. But we’re attracted to film scoring as a gateway because it’s a more narrowly defined problem. Instead of just “write some music”, the challenge is “write some music with a particular feeling to it that fits into a scene of a particular length.”

Would you like to help us test and improve the aQWERTYon, or to design curricula around it? Would you like to help fund our programmers and designers? Please get in touch.

Theory for Producers

I’m delighted to announce the launch of a new interactive online music course called Theory for Producers: The Black Keys. It’s a joint effort by Soundfly and the NYU MusEDLab, representing the culmination of several years’ worth of design and programming. We’re super proud of it.

Theory for Producers: The Black Keys

The course makes the abstractions of music theory concrete by presenting them in the form of actual songs you’re likely to already know. You can play and improvise along with the examples right in the web browser using the aQWERTYon, which turns your computer keyboard into an easily playable instrument. You can also bring the examples into programs like Ableton Live or Logic for further hands-on experimentation. We’ve spiced up the content with videos and animations, along with some entertaining digressions into the Stone Age and the auditory processing abilities of frogs.

So what does it mean that this is music theory for producers? We’re organizing the material in a way that’s easiest and most relevant to people using computers to create the dance music of the African diaspora: techno, hip-hop, and their various pop derivatives. This music carries most of its creative content outside of harmony: in rhythm, timbre, and repetitive structure. The harmony is usually static, sitting on a loop of a few chords or just a single mode. Alongside the standard (Western) major and minor scales, you’re just as likely to encounter more “exotic” (non-Western) sounds.

Music theory classes and textbooks typically begin with the C major scale, because it’s the easiest scale to represent and read in music notation. However, C major is not necessarily the most “basic” or fundamental scale for our intended audience. Instead, we start with E-flat minor pentatonic, otherwise known as the black keys on the piano. The piano metaphor is ubiquitous both in electronic music hardware and software, and pentatonics are even easier to play on piano than diatonic scales. E-flat minor pentatonic is more daunting in notated form than C major, but since dance and hip-hop producers tend not to be able to read music anyway, that’s no obstacle. And if producers want to use keys other than E-flat minor (or G-flat major), they can keep playing the black keys and then transpose the MIDI later.

The Black Keys is just the first installment in Theory For Producers. Next, we’ll do The White Keys, otherwise known as the modes of C major. We’re planning to start that course not with C major itself, but with G Mixolydian mode, because it’s a more familiar sound in Afrodiasporic music than straight major. After that, we’ll do a course about chords, and one about rhythm. We hope you sign up!

Update: oh hey, we’re on Lifehacker