One of our summer research projects has focused on the refinement of audio, sound, and music blocks and strategies for Scratch 2.0, the visual programming environment for kids developed by the Lifelong Kindergarten Group at the MIT Media Lab. Out of the box, Scratch provides some basic sound and audio functionality via the following blocks, shown on the left-hand side:
These blocks allow the user to play audio files selected from a built-in set of sounds or from user-imported MP3 or WAV files, play MIDI drum and instrument sounds and rests, and change and set the musical parameters of volume, tempo, pitch, and duration. Most Scratch projects that involve music utilize the “play sound” blocks for triggering sound effects or playing MP3s in the background of interactive animation or game projects.
This makes a lot of sense. Users have sound effects and music files that have meaning to them, and these blocks make it easy to insert them into their projects where they want.
What’s NOT easy in Scratch for most kids is making meaningful music with a series of “play note”, “rest for”, and “play drum” blocks. These blocks provide access to music at the phoneme rather than morpheme level of sound. Or, as Jeanne Bamberger puts it, at the smallest musical representations (individual notes, rests, and rhythms) rather than the simplest musical representations (motives, phrases, sequences) from the perspective of children’s musical cognition. To borrow a metaphor from chemistry, it is the difference between the atomic/elemental and molecular levels of music.
To work at the level of individual notes, rests, and rhythms requires quite a lot of musical understanding and fluency. It can often be hard to “start at the very beginning.” One needs to understand and be able to dictate proportional rhythm, as well as to divine musical metadimensions such as key, scale, and meter by ear. Additionally, one needs to be fluent in chromatic divisions of the octave, and to know that in MIDI, “middle C” equals note number 60. In computer science parlance, one could describe the musical blocks included with Scratch as “low level,” requiring a lot of prior knowledge and understanding to work with.
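To make that “low level” concrete, here is a small sketch of MIDI pitch arithmetic (plain Python for illustration, not Scratch code): middle C is note number 60, and each semitone adds 1, so the twelve chromatic divisions of the octave map directly onto integers.

```python
# MIDI pitch arithmetic: middle C = 60, each semitone step adds 1,
# so an octave spans 12 note numbers.
MIDDLE_C = 60

def midi_note(semitones_above_middle_c):
    """Return the MIDI note number a given number of semitones from middle C."""
    return MIDDLE_C + semitones_above_middle_c

# The A above middle C (concert A, 440 Hz) is 9 semitones up:
print(midi_note(9))    # 69
# The C one octave below middle C:
print(midi_note(-12))  # 48
```

This is exactly the arithmetic a user must carry in their head when wiring numbers into a “play note” block.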
To help address this challenge within Scratch, our research group has been researching ways of making it easier for users to get musical ideas into Scratch, exploring what musical data structures might look like in Scratch, and developing custom blocks for working at a higher, morpheme level of musical abstraction. The new version of Scratch (2.0) enables power users to create their own blocks, and we’ve used that mechanism for many of our approaches. If you want to jump right into the work, you can visit our Performamatics @ NYU Scratch Studio, play with our projects, and remix our code.
Here’s a quick overview of some of the strategies/blocks we’ve developed:
- Clap Engine – The user claps a rhythm live into Scratch using the computer’s built-in microphone. If the claps are loud enough, Scratch samples the time each clap occurred and stores it in one list, and stores the intensity of the clap in a second list. These lists are then available to the user as a means of “playing back” the claps. The recorded rhythm and clap intensities can be mapped to built-in drum sounds, melodic notes, or audio samples. The advantage of this project is that human performance timing is maintained, and we’ve provided the necessary back-end code to make it easy for users to play back what they’ve recorded.
- Record Melody in List – This project is a presentation of a strategy developed by a participant in one of our interdisciplinary Performamatics workshops for educators. The user can record a diatonic melody in C major using the home row on the computer keyboard. The melody performed is then added to a list in Scratch, which can then be played back. This project (as of now) only records pitch information, not rhythm. It makes it easier for users to get melodies into a computational representation (i.e., a Scratch list) for manipulation and playback.
- play chord from root pitch block – This custom block enables the user to input a root pitch (e.g., middle C = 60), a scale type (e.g., major, minor, dim7, etc.), and a duration to generate a root-position chord above the chosen root note. Playing a chord now takes only one “play chord” block, rather than 8–9 blocks.
- play ‘ya’ beats block – This block is very similar in design to the ‘play drum beats’ block in that it works with short strings of text, but it triggers a recorded sound file instead. The symbols used to rhythmically trigger audio samples in this block are modeled after Georgia Tech’s EarSketch project for teaching Python through Hip-Hop beats.
- Musical Typing with Variable Duration – This project solves a problem our group faced for a long time. If one connects a computer keyboard key to a play note block, an interesting behavior happens: the note is played, but as the key is held down the note is restarted multiple times in rapid-fire succession. To help solve this, we needed to write some code that would “debounce” the computer key inputs, but keep sustaining the sound until the key is released. We did this with a piece of Scratch code that “waits until the key is not pressed” followed by a “stop all” command to stop the sounds. It’s a bit of a hack, but it works.
- MIDI Scratch Alpha Keyboard – This project implements the new Scratch 2.0 Extension Mechanism to add external MIDI functionality. The project uses a new set of custom MIDI blocks to trigger sounds in either the browser’s built-in Java synthesizer, or any software or hardware synthesizer or sampler in or connected to your computer. With these blocks, you now have access to full-quality sampled sounds, stereo pan control, MIDI continuous controllers and pitch bend, and fine-grained note on/note off. Read more about this on our research page.
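The Clap Engine’s two parallel lists can be replayed by computing the inter-onset intervals between successive clap times. Here is a minimal Python sketch of that bookkeeping (the list contents and function name are hypothetical illustrations, not the actual Scratch back-end code):

```python
# Hypothetical sketch of the Clap Engine's playback logic: two parallel
# lists record when each clap occurred and how loud it was.
clap_times = [0.0, 0.5, 0.75, 1.5]    # seconds at which claps were detected
clap_intensities = [90, 60, 55, 100]  # loudness of each clap

def playback_schedule(times, intensities):
    """Pair each clap's intensity with the wait before the *next* clap,
    preserving the human performance timing of the recording."""
    schedule = []
    for i, loudness in enumerate(intensities):
        wait = times[i + 1] - times[i] if i + 1 < len(times) else 0.0
        schedule.append((loudness, wait))
    return schedule

print(playback_schedule(clap_times, clap_intensities))
# [(90, 0.5), (60, 0.25), (55, 0.75), (100, 0.0)]
```

Each (intensity, wait) pair can then drive a drum sound, a melodic note, or an audio sample, exactly as the block set allows.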
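The “Record Melody in List” strategy boils down to a key-to-pitch lookup table. A Python sketch, assuming a home-row mapping onto the C major scale starting at middle C (the workshop project’s exact key assignments may differ):

```python
# Hypothetical home-row mapping onto the C major scale (MIDI numbers):
# 'a' = middle C (60), then up the diatonic scale D E F G A B C.
HOME_ROW_TO_PITCH = {
    'a': 60, 's': 62, 'd': 64, 'f': 65,
    'g': 67, 'h': 69, 'j': 71, 'k': 72,
}

def record_melody(keystrokes):
    """Append the pitch of each recognized key to a melody list,
    ignoring other keys. Pitch only -- no rhythm is captured."""
    melody = []
    for key in keystrokes:
        if key in HOME_ROW_TO_PITCH:
            melody.append(HOME_ROW_TO_PITCH[key])
    return melody

print(record_melody("asdfg"))  # [60, 62, 64, 65, 67]
```

The resulting list is the “computational representation” mentioned above: once the melody lives in a list, it can be transposed, reversed, or replayed at any tempo.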
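The “play chord from root pitch” block is essentially a lookup table of interval patterns. A sketch of the idea in Python (these are standard chord spellings in semitones; the block’s internal vocabulary may differ):

```python
# Semitone offsets above the root for a few common chord qualities.
CHORD_INTERVALS = {
    "major": [0, 4, 7],
    "minor": [0, 3, 7],
    "dim7":  [0, 3, 6, 9],
}

def chord_from_root(root, quality):
    """Return the MIDI note numbers of a root-position chord."""
    return [root + interval for interval in CHORD_INTERVALS[quality]]

print(chord_from_root(60, "major"))  # C major: [60, 64, 67]
print(chord_from_root(60, "dim7"))   # C dim7:  [60, 63, 66, 69]
```

Hiding this table behind one block is what collapses an 8–9 block chord into a single “play chord” step.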
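A beat-string block like “play ‘ya’ beats” can be understood as a tiny parser over a rhythm string. The sketch below uses hypothetical symbols (‘x’ = trigger the sample, ‘-’ = rest, one symbol per slot); the block’s actual symbols follow EarSketch’s conventions and may differ:

```python
# Hypothetical beat-string parser in the spirit of the 'play ya beats'
# block: each character is one rhythmic slot, 'x' triggers the recorded
# sound and '-' rests.
def parse_beats(pattern):
    """Return a list of (slot, action) pairs for a beat string."""
    actions = []
    for slot, symbol in enumerate(pattern):
        actions.append((slot, "play" if symbol == "x" else "rest"))
    return actions

print(parse_beats("x-x-"))
# [(0, 'play'), (1, 'rest'), (2, 'play'), (3, 'rest')]
```

A playback loop would step through these pairs at a fixed subdivision of the current tempo, triggering the sound file on each “play”.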
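The “wait until the key is not pressed” debounce in Musical Typing can be modeled as a small state machine over a stream of sampled key states. A Python illustration of the idea (not the Scratch blocks themselves):

```python
# Model of the debounce/sustain trick: emit one note-on when a key goes
# down, hold while it stays down, and one note-off when it is released,
# instead of restarting the note on every polled "pressed" sample.
def debounce(key_states):
    """key_states is a sequence of booleans sampled over time."""
    events = []
    previously_down = False
    for down in key_states:
        if down and not previously_down:
            events.append("note_on")   # key just went down: start the note
        elif not down and previously_down:
            events.append("note_off")  # key just released: stop the sound
        previously_down = down
    return events

# A key held for four polling ticks produces one sustained note, not four:
print(debounce([True, True, True, True, False]))  # ['note_on', 'note_off']
```

In Scratch terms, the “wait until the key is not pressed” block plays the role of the `previously_down` check, and “stop all” plays the role of the note-off.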
I hope you find these strategies & blocks useful in your own Scratch/Computing+Music work.