Making it Easier to be Musical in Scratch

One of our summer research projects has focused on the refinement of audio, sound, and music blocks and strategies for Scratch 2.0, the visual programming environment for kids developed by the Lifelong Kindergarten Group at the MIT Media Lab. Out of the box, Scratch provides some basic sound and audio functionality via the following blocks, shown on the left-hand side:

Scratch Sound Blocks

These blocks allow the user to play audio files selected from a built-in set of sounds or from user-imported MP3 or WAV files, play MIDI drum and instrument sounds and rests, and change and set the musical parameters of volume, tempo, pitch, and duration. Most Scratch projects that involve music utilize the “play sound” blocks for triggering sound effects or playing MP3s in the background of interactive animation or game projects.

This makes a lot of sense. Users have sound effects and music files that have meaning to them, and these blocks make it easy to insert them into their projects where they want.

What’s NOT easy in Scratch for most kids is making meaningful music with a series of “play note”, “rest for”, and “play drum” blocks. These blocks provide access to music at the phoneme rather than the morpheme level of sound. Or, as Jeanne Bamberger puts it, at the smallest musical representations (individual notes, rests, and rhythms) rather than the simplest musical representations (motives, phrases, sequences) from the perspective of children’s musical cognition. To borrow a metaphor from chemistry, it is the difference between the atomic/elemental and molecular levels of music.

Working at the level of individual notes, rests, and rhythms requires quite a lot of musical understanding and fluency. It can often be hard to “start at the very beginning.” One needs to understand and be able to dictate proportional rhythm, as well as to divine musical metadimensions such as key, scale, and meter by ear. Additionally, one needs to be fluent in chromatic divisions of the octave and know that in MIDI, “middle C” is note number 60. In computer science parlance, one could describe the musical blocks included with Scratch as “low level,” requiring a lot of prior knowledge and understanding to work with.
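For example, the MIDI note-number arithmetic a user must internalize can be expressed in a few lines of Python (a sketch for illustration only; Scratch users have to do this mental math without code):

```python
# MIDI numbers every semitone: middle C = 60, and each octave adds 12.
NOTE_OFFSETS = {"C": 0, "D": 2, "E": 4, "F": 5, "G": 7, "A": 9, "B": 11}

def midi_note(name, octave, sharp=False):
    """Convert a note name and octave (middle C = C4) to a MIDI note number."""
    return 12 * (octave + 1) + NOTE_OFFSETS[name] + (1 if sharp else 0)

print(midi_note("C", 4))  # middle C -> 60
print(midi_note("A", 4))  # concert A -> 69
```

A kid who just wants to play a melody has to carry this entire mapping in their head before the “play note” block makes sense.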

To help address this challenge within Scratch, our research group has been researching ways of making it easier for users to get musical ideas into Scratch, exploring what musical data structures might look like in Scratch, and developing custom blocks for working at a higher, morpheme level of musical abstraction. The new version of Scratch (2.0) enables power users to create their own blocks, and we’ve used that mechanism for many of our approaches. If you want to jump right into the work, you can view our Performamatics @ NYU Scratch Studio and play with and remix our code.

Here’s a quick overview of some of the strategies/blocks we’ve developed:

  • Clap Engine – The user claps a rhythm live into Scratch using the computer’s built-in microphone. If the claps are loud enough, Scratch samples the time each clap occurred and stores it in one list, and the intensity of the clap in a second list. These lists are then available to the user as a means for “playing back” the claps. The recorded rhythm and clap intensities can be mapped to built-in drum sounds, melodic notes, or audio samples. The advantage of this project is that human performance timing is maintained, and we’ve provided the necessary back-end code to make it easy for users to play back what they’ve recorded.
  • Record Melody in List – This project is a presentation of a strategy developed by a participant in one of our interdisciplinary Performamatics workshops for educators. The user can record a diatonic melody in C major using the home row of the computer keyboard. The melody performed is then added to a list in Scratch, which can then be played back. This project (as of now) records only the pitch information, not the rhythm. It makes it easier for users to get melodies into a computational representation (i.e., a Scratch list) for manipulation and playback.
  • play chord from root pitch block – This custom block enables the user to input a root pitch (e.g., middle C = 60), a scale type (e.g., major, minor, dim7), and a duration to generate a root-position chord above the chosen root note. Playing a chord now takes only one “play chord” block, rather than eight or nine blocks.
  • play drum beats block – This block enables the user to input a string of symbols representing a rhythmic phrase. Modelled after the drum notation in the Gibber Javascript live coding environment, the user works at the rhythmic motive or phrase level by editing symbols that the Scratch program interprets as rhythmic sounds.
  • play ‘ya’ beats block – This block is very similar in design to the ‘play drum beats’ block in that it works with short strings of text, but it instead triggers a recorded sound file. The symbols used to rhythmically trigger audio samples in this block are modelled after Georgia Tech’s EarSketch project for teaching Python through Hip-Hop beats.
  • Musical Typing with Variable Duration – This project solves a problem faced by our group for a long time. If one connects a computer keyboard key to a play note block, an interesting behavior happens: The note is played, but as the key is held down the note is restarted multiple times in rapid-fire succession. To help solve this, we needed to write some code that would “debounce” the computer key inputs, but keep sustaining the sound until the key is released. We did this with a piece of Scratch code that “waits until the key is not pressed” followed by a “stop all” command to stop the sounds. It’s a bit of a hack, but it works.
  • MIDI Scratch Alpha Keyboard – This project implements the new Scratch 2.0 Extension Mechanism to add external MIDI functionality. The project uses a new set of custom MIDI blocks to trigger sounds in either the browser’s built-in Java synthesizer, or any software or hardware synthesizer or sampler in or connected to your computer. With these blocks, you now have access to full-quality sampled sounds, stereo pan control, MIDI continuous controllers and pitch bend, and fine-grained note on/note off. Read more about this on our research page.
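The interval arithmetic that the ‘play chord from root pitch’ block encapsulates can be sketched in a few lines of Python. This is an illustration of the standard chord-interval tables, not the block’s actual Scratch implementation:

```python
# Semitone intervals above the root for a few common chord qualities.
# (Standard music-theory tables; illustrative of what the block encapsulates.)
CHORD_INTERVALS = {
    "major": [0, 4, 7],
    "minor": [0, 3, 7],
    "dim7":  [0, 3, 6, 9],
}

def chord_from_root(root, quality):
    """Return the MIDI note numbers of a root-position chord above `root`."""
    return [root + i for i in CHORD_INTERVALS[quality]]

print(chord_from_root(60, "major"))  # C major: [60, 64, 67]
```

Without the custom block, a Scratch user would have to place and tune a separate “play note” block for each of these chord tones.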
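To give a flavor of the Gibber-style string notation that the ‘play drum beats’ and ‘play ya beats’ blocks interpret, here is a minimal Python sketch. The symbol vocabulary (‘x’ for a hit, ‘.’ for a rest) is an assumption for illustration, not necessarily the blocks’ actual syntax:

```python
# Interpret a rhythm string one symbol per step: 'x' = hit, '.' = rest.
# (Illustrative vocabulary only, not the blocks' actual symbol set.)
def parse_beats(pattern):
    """Return (step, is_hit) pairs for a rhythm string like 'x..x..x.'."""
    return [(i, ch == "x") for i, ch in enumerate(pattern)]

hits = [step for step, on in parse_beats("x..x..x.") if on]
print(hits)  # steps carrying a drum hit: [0, 3, 6]
```

The point of the string representation is that the user edits a whole rhythmic motive at once, instead of assembling it note-by-note from “play drum” and “rest for” blocks.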
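The “wait until the key is not pressed” strategy from Musical Typing with Variable Duration can be sketched as a polling loop. This is a hedged Python illustration, not the actual Scratch code; `is_pressed`, `note_on`, and `note_off` are hypothetical callbacks standing in for Scratch’s key sensing and sound commands:

```python
import time

def play_while_held(is_pressed, note_on, note_off, poll=0.01):
    """Trigger a note once, sustain while the key is held, stop on release.

    This 'debounces' the key: the note is NOT retriggered while held.
    """
    note_on()                # start the note exactly once
    while is_pressed():      # sustain: keep waiting, do not restart the note
        time.sleep(poll)
    note_off()               # key released -> stop the sound
```

The Scratch version described above achieves the same effect with a “wait until key not pressed” block followed by “stop all”; as the post says, it’s a bit of a hack, but it works.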

I hope you find these strategies & blocks useful in your own Scratch/Computing+Music work.

Hack Days as a Model for Innovation in Schools of Music

This past weekend saw the first Music Education Hack event hosted by the Spotify streaming music service and the NYC Department of Education’s iZone/InnovateNYC program. I’ve been to several music-themed Hack Days in the past, but this was the first event focusing specifically on hacking new technologies in service to music education.

This post is the first of several reflecting on the Music Ed Hack experience. Since the concept of a Hack Day may be foreign to many of my readers, I will start this post off with a description of what a Hack Day actually is, and put forward a vision of how collegiate schools of music (and even K-12 schools) could adopt this model as a way of building community among their students and reinforcing that music is a living, creative art. I’d love to hear what you think about that.

What is a Hack Day?

Hack Days and Hackathons are now common events within large technology companies like Google, at technology startups, and in major technology innovation hubs like New York City, Austin, Boston, and Silicon Valley. The purpose of these events is to spawn innovation by giving coders/programmers 24 hours to a couple of days to work in teams to create a new product or technology, often around a specific theme or problem. These events are often sponsored by a single host company or a group of companies. The structure of these events is pretty similar: interested coders assemble at a particular time and are introduced to the theme/challenge of the hack. The coders then often listen to short presentations/demos from sponsor companies around their Application Programming Interfaces (APIs). Most of the “hacking” that happens at these events is in the web-based and online realms, rather than the hardware space. However, every Hack Day I’ve attended around music has always had some people playing with hardware such as Arduino boards, Microsoft Kinect controllers, & most recently Leap Motion, for building new physical interfaces.

After the API presentations finish, there is often an “open call” for collaboration where attendees can get up in front of the group and float their idea in hopes of recruiting other interested attendees to join their team. Once that’s finished, the newly formed teams have approximately 24 hours to create their “hack.” Many teams work through the night, are well fed, and also have opportunities to meet with developers and technical experts from the sponsor companies to get advice on how to build their designs.

These events are not only for pure coders and developers. Graphic and website designers, entrepreneurs, musicians, and other interested people often show up and join teams to lend their expertise in user experience, marketing spin, or knowledge of the application context. After about 24 hours of hacking, the deadline arrives and teams submit their “hacks” and present them as live demonstrations in front of the audience of programmers and other interested people. The demo sessions are often also open to the general public for those interested parties who don’t want to pull an all-nighter with the programming teams. There is palpable excitement during these demo sessions (and throughout the whole Hack Day, really). The audience gets to see brand-new, emerging technologies, and the teams finally get a release of energy in sharing their ideas with the crowd.

The sponsors of the Hack Day, along with companies that provide API support for the event, often give out prizes to the teams that create the best hack or make the best use of their APIs. These prizes range from cash sums upwards of $10,000, to iPad minis, to web credits, to concert tickets. Yes, there is a corporate/competitive context that surrounds these Hack Days, but as a participant in a few of them, I can also say that there is a strong intrinsic reward for creating something new that solves a challenge or puts forward a new idea. Prizes aside, most hacks never directly turn into a marketable product or service. However, they do influence future product design, and a few do make it to the startup phase.

Hack Days as a Model for Innovation in Schools of Music?

I often wonder what a parallel event might look like in the formal music school space. Would it be a 24-hour challenge to bring together composers, producers, and performers to create/improvise/produce new chamber works? What could be gained from such an approach as an alternative to traditional band/choir/orchestra/chamber music festivals and competitions in high schools and in schools of music? I think it would be very cool to structure a new music festival hack day in every collegiate school of music as a way of building community and reinforcing music as a living, creative art. Students enrolled in the school across all music majors could compete for scholarships, or even sponsored prizes from publishers, instrument manufacturers, digital equipment companies, and music services – or participate just for the intrinsic fun of the event. Students would have 24 hours to form teams, create, rehearse, and refine their pieces. The demo sessions would be in the form of a concert of the newly created pieces. As happens at a technology-based Hack Day, some demos fail to come together and others blow the audience away. “Failure” is seen as a necessary, positive learning experience within the tech/startup world. In order to have big rewards, big risks need to be taken, and these Hack Days are small, semi-controlled, safe settings for those failures to occur. Sure, not every piece created would be a masterpiece, but isn’t that ok? Isn’t there a lot to be learned through trying and putting your ideas out there? Will the “musical academy” allow for this kind of disruptive innovation within their walls? Can they afford not to?

My favorite hacks from Music Education Hack 2013

Music Education Hack 2013 saw the presentation of 44 hacks created by around 200 participants throughout the event. As explained in my first post about Hack Days, not every hack (or presentation of the hack!) is successful. However, there is never a shortage of cool, new, and innovative ideas. Some appear at the demo session fully realized, yet others remain mere glimpses of what might come in the future. Nonetheless, it’s an exhilarating experience to just attend one of these events, let alone engage as a participant.

Here’s a list of my 14 favorites, in no particular order: (Note: Not every hack has a working demo).

  • Exemplify – (FIRST PRIZE – $10,000) – An online tool for teaching students around a streaming piece of music. Exemplify uses a variety of APIs to automatically provide historical-context articles about the piece of music or its composer, provides a built-in comment or quiz tool that can be tied to specific times within a song, and enables the teacher to pause the song, etc.
  • Poke-a-Text – (EchoNest API PRIZE – iPad Mini) – Teach grammar while listening to music. The user selects a favorite song and the app presents streaming phrases from the song’s lyrics with varying degrees of grammatical correctness. The user selects the version of each lyric line they think is grammatically correct and their choices are graded. Scores can be sent back to the teacher to monitor progress.
  • Rock Steady – Mobile phone app for pulse/rhythm training in the context of your favorite song. Using the built-in accelerometer in your phone, control the tempo of your favorite song or try to follow along. Keeps an accuracy score. A cool way of practicing pulse in a contextually meaningful way.
  • JamAlong – This is a Spotify app that creates a simple diatonic xylophone interface that automatically maps onto the key of your favorite song within Spotify. It queries the key and mode of the song using a variety of APIs, and maps out the diatonic scale that best matches the song. The user can “jam” with their favorite tunes automatically through playing a virtual diatonic xylophone mapped to the solfege of the particular mode with their computer mouse.
  • Spotifact – This app enables a teacher to create affinity groups based on musical preferences. Have multiple friends go to the demo and enter “hack” as the class code. The app links to Facebook and joins users together into groups based on listening preferences as identified in Facebook. Use this to form groups within large gatherings of people. The app can run on mobile devices in the web browser.
  • Map That Music – App for learning geography through music and vice versa. Listen to a Spotify song and guess the country of origin. Also, explore a world map to hear songs from that particular country. A concept similar to a prior Music Hack Day hack by Paul Lamere: Roadtrip Mixtape.
  • RosettaTone – (THIRD PRIZE – $1000 in Amazon AWS credits) – Teaching a foreign language through music videos. Users watch foreign language music videos with live lyric translation in original and second language.
  • Kashual – Trigonometry functions mapped to music synthesis, with interactive performance controls. See the actual functions for various musical samples, and adjust the mathematical function values to create new tones. Play those tones on a virtual keyboard. Inspired by a direct request from a NYC high school math teacher.
  • Parrot Lunaire – (Peachnote API PRIZE – $100 gift certificate to Carnegie Hall) – Search the classical musical score corpus by singing or playing in the theme.
  • AirTrainer – Leap Motion kinaesthetic tone-matching program created with Max/MSP. Move your finger up or down in the air to play and match tones by ear and hand.
  • SuperTonic – Active listening app with Noteflight. Students click a button during “interesting” parts of a song. Creates an interactive graph for the teacher to use to document listener engagement.
  • Teach Beats – App for linking buskers in NYC with students who want to take lessons from them.
  • TapeTest – (NYU SPECIAL PRIZE to an Educator/Hacker – Nick Jaworski – a MaKey MaKey kit) – a simple web-based app for teachers to assign students to record and submit playing tests for individual playing assessments.
  • Remixing Your Musical World: The MaKey MaKey Musical Construction Kit  – (SECOND PRIZE – $2000 in Amazon AWS Credits & 25 hours of mentoring from NYC Dev Shop) – A musical construction kit based on the MaKey MaKey and MIT’s Scratch visual programming language. The hack completed at Music Ed Hack was the MaKey MaKey Chord Board – a demo project for exploring chords and their inversions creatively.
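The key-and-mode-to-scale mapping that a hack like JamAlong performs can be sketched by rotating the major-scale step pattern. Here is a minimal Python illustration, assuming standard mode rotations; this is not JamAlong’s actual code:

```python
# Build the diatonic scale for a given key and mode by rotating the
# major-scale step pattern (illustrative; not JamAlong's actual code).
MAJOR_STEPS = [2, 2, 1, 2, 2, 2, 1]  # whole/half steps of the major scale
MODES = ["ionian", "dorian", "phrygian", "lydian",
         "mixolydian", "aeolian", "locrian"]

def diatonic_scale(root, mode):
    """Return the 7 MIDI note numbers of the diatonic scale on `root`."""
    shift = MODES.index(mode)
    steps = MAJOR_STEPS[shift:] + MAJOR_STEPS[:shift]  # rotate the pattern
    notes = [root]
    for step in steps[:-1]:
        notes.append(notes[-1] + step)
    return notes

print(diatonic_scale(60, "aeolian"))  # C natural minor: 60 62 63 65 67 68 70
```

Map those seven notes onto seven on-screen xylophone bars and the user can “jam” in the right key without knowing any of the underlying theory.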

My role at the Music Ed Hack event was first as a prize sponsor. My research group at NYU along with MaKey MaKey sponsored an educator/hacker prize of a MaKey MaKey kit awarded to the educator(s) who served as active collaborators on a great hack. I wanted to especially encourage and award educators who got their hands dirty in providing active input into the development and realization of a hack. Congratulations again to Nick Jaworski for his involvement in the Tape Test app! In addition, all educator participants were invited to participate in my research group’s Music Technology Educator Meetup (link coming soon) to be held monthly at NYU starting in October. Each educator who actively participated in Music Education Hack and attends the monthly meetup will also receive a MaKey MaKey for use in their classrooms.

I also attended as an observer and as an informal mentor at the event. Spotify and the NYC Department of Education assembled a great team of formal mentors, including music educators Barbara Freedman from the Greenwich, CT schools, Robert Lamont from Gramercy Arts High School in NYC, and Darla Hanley from Berklee. A full list of mentors can be found by scrolling to the bottom of this page: http://musicedhack.com/.

I am also extremely proud of my summer co-op scholar research students Graham Allen and Matt Cohen from UMass Lowell! Their hack won 2nd prize for their MaKey MaKey Chord Board, a part of the MaKey MaKey Musical Construction Kit that my research group is currently developing. What’s significant about their hack from my perspective is that they did all of their development using a MaKey MaKey kit ($50) and $12 worth of common household items, and did all of the software programming using Scratch 2.0 – a free web-based visual programming environment developed for kids by the Lifelong Kindergarten Group at the MIT Media Lab. All of the materials they assembled for the hack are meant to be remixed and reused by students. All of the code and hardware they created can be viewed and customized by students freely, encouraging users to think and create musically, computationally, and mathematically while exploring engineering design. Graham and Matt competed against professional developers from all around the country using professional tools, and came in 2nd with their project and great presentation.

Their project is as much STEM (Science, Technology, Engineering, & Math) as it is Music and Design. I think it’s a great interdisciplinary model for educators of all levels. The Scratch environment enables K-12 educators to bring the process of Hack Days to their own students. Not only are students exploring creating, performing, responding, and connecting (the new Arts Standards framework), but they are also working as instrument builders, designers, and engineers. If you are interested in other ideas for exploring expanded and reformed visions of music education pedagogy and curriculum, check out Evan Tobias’s work exploring ways of teaching popular music through producing, songwriting, and composing. Our Experiencing Audio research group plans on releasing the MaKey MaKey Musical Construction Kit plans as a completely open-source, open-hardware project, while also making it available for purchase in the near future. We’re also working with various Music API providers to create custom Scratch 2.0 blocks enabling Scratch users to “hack” their own music apps using commercial APIs.

I really hope that Music Education Hack will become an annual event.

Designing Technology & Experiences for Music Making, Learning, & Engagement

This Fall I will be teaching a graduate course at NYU called Designing Technologies and Experiences for Music Making, Learning, and Engagement. This course is heavily inspired by the Hack Day process, but applied over the span of a semester-long course. Students from across the many programs within the NYU Department of Music and Performing Arts Professions will work individually and in teams to develop a technology and/or experience that they will iterate at least twice over the course of the semester with a specified audience/group of stakeholders. Students will read articles about and case studies of best practices in music education, meaningful engagement, experience design, technology development, and entrepreneurialism, and meet regularly with guest presenters from industry and education. At the end of the course, students will present their projects to a panel of music educators and industry representatives for feedback. Selected students will have the opportunity to compete for scholarships to work within my research group and some of the industry sponsors during the Spring 2014 semester to potentially license and commercialize their ideas and projects.

In this course, we will be implementing a research & development process designed by Andrew R. Brown called Software Development as (Music Education) Research (SoDaR). This process was piloted and used throughout the development of the Jam2Jam networked media jamming software project led by the late Steve Dillon. This process actively involves the end users of a particular piece of software in the design process at all stages. The field of music education technology is just now starting to move toward this end, where in the past educators were often marketed music technologies designed for professional musicians (e.g., professional keyboard synthesizers, Finale, Sibelius, Reason, ProTools, Ableton, etc.). It’s notable that relatively new technologies Noteflight, MusicFirst, and MusicDelta have engaged educators in the design and refinement of their tools, and see music educators and students as their primary user audience.

Music, Creativity and Technology: An Example from Actual Practice

Step 1: Find music notation scribbled on a bathroom wall.

Step 2: Take a picture of it with your cell phone camera and post it online to ImageShack:

http://img120.imageshack.us/my.php?image=picture1ix9.jpg

Step 3: Post a link to that photo on Reddit for others to see and discuss:

http://www.reddit.com/r/pics/comments/7bqjc/music_found_in_the_toilet/

Step 4: Transcribe melody into Noteflight and create a custom arrangement of “Toilet Melody”:

click on Noteflight logo to launch score at Noteflight.com

Step 5: Repost Noteflight arrangement back on Reddit for others to discuss:

http://www.reddit.com/r/pics/comments/7bqjc/music_found_in_the_toilet/

Credits:
Photo = TrippingChilly
Music Arrangement = Skynare

If this is how young people are using technology in their lives, how can we draw on this in the classes we teach?

Bonus points for identifying the origin of the tune in “Toilet Melody.” :)

Using Noteflight in and outside of the music classroom

Earlier this month Evan Tobias posted about Noteflight, a new online Flash-based notation application available at http://www.noteflight.com/. Over the past few weeks I have been exploring this software with college students in my Technology in Music Education course and with high school students enrolled in a beginning piano class at Lowell High School (LHS). Those of you who know me know how apprehensive I am when it comes to using notation software with students in general music or other technology classes in K-12 schools. Most of my concern centers on the common conflation of “notation software” with “composing software.” All too often I see teachers using notation software as a technological endpoint rather than as a means to the musical end of live performance. However, Noteflight is not your ordinary notation software.

What interests me about Noteflight is not the notation component. Instead, it is the social tools that surround the notation engine. When you sign up at Noteflight.com (currently free) you create a personal profile, just like you would at a social networking site like Facebook, MySpace, or custom sites created at Ning.com. Once signed in, you can create a new score, view your existing scores, or browse scores created by other users.

Built into the web application is the ability to share your scores with other users. These scores can be easily embedded in a class website, just like a YouTube video. The embedded score can be played back by clicking the play button, and additional interactive functions are being planned that could be helpful in guided listening activities. Coming from a constructivist perspective, this functionality enables teachers to give students the opportunity to share their musical understanding in interactive ways within and beyond class time. For example, a band director could post a Noteflight score without added articulation. Students could then be assigned to add their own articulations to the score. During the next class, the students and director could choose a few scores to play through. This approach gives students the opportunity to make creative articulation decisions as composers, rather than learning articulation only through listening and performing.

A variation on this assignment could be to post an audio file of a musical line performed with different articulations. Below the audio file, a director could post the notation for that performance, but again without articulation added. As an assessment, students could then open the score and add articulations that in their mind matched the recorded performance.

Right now, there are some limitations to accomplishing this, but I’ve been assured by Joe Berkovitz, CEO of Noteflight, that these functions are currently in development.

Picture 1
This screenshot shows the “version” function for Noteflight. As you work on a score in Noteflight, it periodically saves a snapshot of your piece and gives you access to it as a different “Version.” If you open your score up to be added to by others, their versions show up in this box as well. At any point you can go back (revert) to a prior version. This is a cool function, not only because you can go back, but as a window into your students’ compositional processes. Though not a full account of their process, these snapshots can provide an opportunity to have discussions with your students about the changes they made in their composition and are great starting points for assessment.

Here’s a short piece I notated in Noteflight:

Right now, the interactivity is limited to simple whole piece playback and playback within measures (click above the measure). Soon, functions will be added that will enable the composer to add additional interactivity through scripting. Very cool. :)

My students and Noteflight

My college students have been using Noteflight with beginning piano students at Lowell High School (LHS) for the past few weeks. Students in my class created incomplete duets to be co-composed and performed with their partner students at LHS. The music teacher at LHS has for the most part been using Alfred’s Adult Beginner Piano book to structure the curriculum. My college students wanted to add a composing/creativity aspect to the lessons. To do this, they created simple piano scores with either a chord progression in the bass clef or a melody in the treble clef (or some combination of the two) as a compositional frame to help scaffold the LHS students. Because the scores are online and viewable by the LHS students and my college students, both can practice alone and make edits to their duet scores. Tomorrow, they will meet again in person for a final run-through and performance for the class. I’ll post some of the pieces and performances here soon.

Because Noteflight is an online application, the potential for collaborative work and learning with other students is high. I’m in the middle of planning a distance composing project with another school later in the term through Noteflight. Facilitated by a custom Ning.com social network, students at LHS will notate compositions in Noteflight and share them with other students at a distance site. Ning will enable them to post their files and provide peer comment and critique. This use is inspired in part by the work at the Vermont MIDI Project, but instead centers on the students as providers of compositional critique and feedback, rather than professional composers.

I’m very excited to see how this technology develops. If you are interested in collaborative projects using Noteflight with your students, drop me an email.

Copyright: Ben Stein vs. Yoko Ono – Implications for “fair use” in music education?

Caveat # 1: I am not a lawyer and do not pretend to be one.

Today, I read an article posted on Ars Technica written by Timothy Lee detailing a recent “fair use” copyright decision by Judge Sidney Stein of the U.S. District Court for the Southern District of New York.

From the article:

Judge Stein’s task wasn’t to critique the dubious logic of this segment, but to evaluate the narrower question of whether the film’s use of “Imagine” is fair under copyright law. He noted that the film was focused on a subject of public interest, and that the film was commenting on Lennon’s anti-religious message. The excerpting of copyrighted works for purpose of “comment and criticism” is explicitly protected by the Copyright Act, and Judge Stein ruled that this provision applied in this case.

The decision quotes extensively from Bill Graham Archives v. Dorling Kindersley, a 2006 decision that allowed the reprinting of reduced-size versions of several historical posters used in a coffee-table book about the Grateful Dead. In that case, as in this one, the alleged infringers had used the works in a commercial product, but the US Court of Appeals for the Second Circuit found that “courts are more willing to find a secondary use fair when it produces a value that benefits the broader public interest.” Whatever the merits of its argument, Expelled is clearly commentary on an issue of public concern, and the use of “Imagine” was central to its argument. Those facts weighed heavily in favor of a finding of fair use.

Stein and company were defended by lawyers from Stanford’s Fair Use Project. In a blog post announcing their decision to take the case, executive director Anthony Falzone wrote that “The right to quote from copyrighted works in order to criticize them and discuss the views they represent lies at the heart of the fair use doctrine,” and argued that Ono’s actions threaten free speech.

This decision and the 2006 decision referenced above cause me to ask a few questions regarding the implications for music education:

In the 2006 decision, the use of reduced sized Grateful Dead posters was upheld as “fair use” within a commercial product because “courts are more willing to find a secondary use fair when it produces a value that benefits the broader public interest.”

In the Sidney Stein decision, the use of an excerpt from John Lennon’s “Imagine” in a commercial film was upheld as “fair use” because it served the purpose of criticizing and commenting on issues that “benefit the broader public interest.”

So, what are the implications of using copyrighted samples or excerpts of commercial music or videos as part of our students’ educational pursuits? Is careful musical and educational use of commercial music and video in school projects of “benefit to the broader public interest?” If our students are utilizing these materials (including YouTube videos) for the purpose of artistic, musical “comment and criticism,” would that not also be considered “fair use” in light of these decisions?

What is particularly interesting to me is that both of the approved uses described above – using a copyrighted image in reduced resolution and using an excerpt of a copyrighted and performance-righted musical recording – were found to be “fair use” in two commercial settings. Also, both uses of copyrighted material seem to have been interpreted by the judges as “transformative” uses (see the Wikipedia entry on Fair Use). It would seem to me (again, I am NO lawyer) that similar uses and creation of original multimedia using music and popular commercial and non-commercial video for “comment and criticism” of “benefit to the broader public interest,” where the work has been “transformed” and not wholly duplicated within the non-profit educational setting of a school, would now be permissible as documented by the above case law.

Let’s take a look at the “fair use” section of the Copyright Act of 1976, 17 U.S.C. § 107:

Notwithstanding the provisions of sections § 106 and § 106A, the fair use of a copyrighted work, including such use by reproduction in copies or phonorecords or by any other means specified by that section, for purposes such as criticism, comment, news reporting, teaching (including multiple copies for classroom use), scholarship, or research, is not an infringement of copyright. In determining whether the use made of a work in any particular case is a fair use the factors to be considered shall include:

  1. the purpose and character of the use, including whether such use is of a commercial nature or is for nonprofit educational purposes;
  2. the nature of the copyrighted work;
  3. the amount and substantiality of the portion used in relation to the copyrighted work as a whole;
  4. the effect of the use upon the potential market for or value of the copyrighted work.

The fact that a work is unpublished shall not itself bar a finding of fair use if such finding is made upon consideration of all the above factors.

Since the above uses were found to be “fair use” within commercial settings, factor #1 in the Copyright Act of 1976 would seem to provide students and teachers working in an educational context even more protection under “fair use.” I find the Sidney Stein ruling of particular importance to music educators because it provides case law that extends the “fair use” of images to copyrighted and performance-righted musical recordings.

In light of the cases described here, I feel more comfortable letting my students use copyrighted images and musical excerpts in the creative and educational work they do in my K-College music and music ed courses, with the following caveats:

  1. The use of the works is in part, and not in whole (e.g., reduced resolution or size)
  2. The use of the works is for the purpose of “criticism and commentary”
  3. The use and creation of the works results in a “value that benefits the public interest”
  4. The use of the works is “transformative,” such as in a parody or for “criticism and commentary”
  5. The use of the works does not devalue or negatively impact the market for the original copyrighted works

And, I might even be inclined to allow them to put together a compilation CD or DVD and sell them as a fundraiser….

What do you think?