mostlynoise


A blog following my musical activities


The Graphic Score as Instrument

March 31, 2012

Sound Art project 2012, Loyola High School Music Appreciation

Because music is ephemeral, invisible, and abstract, it is peculiarly difficult to talk or write about. These complexities are never more apparent than when trying to compose. Devising a way to efficiently and explicitly communicate sonic intention to another person borders on being a black art! In my music appreciation classes, we examined how scores convey the invisible abstractions of music. We also examined the limitations of traditional notation and studied other composers’ graphic solutions for unique compositional problems.

We spent a week doing some “Deep Listening” exercises as crafted by Pauline Oliveros, reading Michel Chion’s essay on the nature of sound, and examining the notion of soundscape as defined by R. Murray Schafer.

To synthesize these concepts, I had the students make graphic scores of one of their “Deep Listening” experiences. Creating these scores was an opportunity for them to consider compositional issues. They asked themselves questions like “What sounds are demanding my attention and what sounds are in the background?” “How do I notate the Doppler effect of a city bus driving by?” “What sounds are analogs to the musical notion of key?”

As we proceed, the kids blog about their experiences and expressive processes. The projects in this post’s embedded videos represent an important leap: the move from perception to intention. I asked them to score a piece for pencil theremin and electronics where the graphic score is also the instrument.

A small circuit board with a battery and speaker is attached to a regular wooden pencil. Copper tape is attached to one end of the pencil and wrapped around the pencil where it is grasped. More tape connects the top end of the board to the top of the pencil. A metal pushpin is stuck through the tape and into the pencil lead itself. Therefore, when one draws with the pencil and touches the graphite on the paper, the circuit is completed and a sound is emitted from the speaker. The greater the distance between the contact points, the lower the pitch!
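The inverse relationship between drawn distance and pitch makes sense if the circuit is a simple relaxation oscillator: the graphite line acts as a resistor whose resistance grows with its length, and the oscillator’s frequency falls as resistance rises. A rough sketch of the idea in Python, where the component values and the oscillator constant are all assumptions, not measurements of the actual kit:

```python
# Hypothetical model of the pencil theremin's pitch behavior.
# Assumes a simple RC relaxation oscillator, f ≈ 1 / (k * R * C),
# and that the graphite trace's resistance grows linearly with length.

OHMS_PER_CM = 20_000.0   # assumed resistance per cm of pencil line
CAPACITANCE = 1e-8       # assumed timing capacitor (10 nF)
K = 1.4                  # assumed oscillator constant (≈1.44 for a 555 timer)

def pitch_hz(distance_cm: float) -> float:
    """Estimated frequency for a given distance between contact points."""
    resistance = OHMS_PER_CM * distance_cm
    return 1.0 / (K * resistance * CAPACITANCE)
```

With these made-up values, doubling the distance halves the frequency, which matches the behavior described above: more graphite between the contacts, lower the pitch.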

Using the Pure Data patch host provided by the folks at RJDJ, a reactive sound app company, the students composed reactive recorder/distorters to augment the variety of sounds as well as the reality of the live performance.

In spite of the pencil theremin’s limited sonic palette, some students devised interesting modes of interaction with their scores and apps.

One of the interesting challenges this project presented was the video creation. The composers could choose whether they would play their own piece or be one of two cameramen filming the performance. If they played, the nature of the movie was left in someone else’s hands, and vice versa. They had to abandon complete control of their piece one way or another.

Fiskabur Samples released into the wild via Creative Commons

October 27, 2011

After the annual “Physics of Music” lecture/demonstration I give each fall to our science students, some of the kids create musical instruments based on the acoustic principles it covers.

This year, my music students borrowed some of those instruments and recorded improvisations with contact mics they built from piezo discs.


These loops, some distorted, some not, were uploaded to my SoundCloud account with a Creative Commons license that allows derivative works on the condition that subsequent works are similarly licensed.

My students are writing music inspired by the creatures living at the Aquarium of the Pacific using these loops as source material.

Quote of the Week: John Cage and some Fiskabur Thoughts

October 15, 2011

Photo by Lars Ploughman, Creative Commons

    Ideas are one thing and what happens is another.

    -John Cage

Within the RJDJ scene creation software, RJC-1000, I can make four pages for my iPhone, each hosting samples or Pure Data patches. I am reconsidering exactly what to assign my students to compose for each of their pages. At first, I gave them free rein to compose whatever they wanted. After some consideration, I think there is more artistic and pedagogical value to stipulating an exact structure for each section while leaving the subject of the composition open to interpretation.

I have decided to do more work with them on metaphor, Cagean aesthetics of chance, and some of the music-installation work of David Tudor. Perhaps we will go to the Getty to see David Tudor’s “Sea Tails”, which is on permanent display there. I believe in using this kind of intermedia work when teaching such abstraction because it gives my students multiple opportunities to enter into the aesthetic.

    My current plan for the pages is as follows:

    Page 1- Close view of a sea creature. Through-composed soundtrack sample in MIDI inspired by animal locomotion and/or behavior.

    Page 2- Sonic-portrait of the animal created entirely from samples recorded and digitally manipulated by the students using homemade piezo microphones, digital recorders, and computers.

    Page 3- General portrait of the entire display including plant life, rocks, light, etc… Students may use MIDI, samples, anything at the student’s disposal. Especially want them to search for metaphor in their expression.

    Page 4- Student’s choice. Students may express anything they like about the creature in sound.

    For each of the pages, the level of interactivity or augmented reality is left to the student, as long as they can explain their choice.

I am excited to see the results of their work. Next, I will meet with the student filmmakers to explain the project and develop some ideas with them for a documentary about the process and art films.

Aquarium Reactive App Project Overview: 2011

October 13, 2011


One of my music appreciation classes is beginning a new project: writing music inspired by sea creatures, then embedding those songs in an audio augmented-reality app.

The project began with students going to the Long Beach Aquarium of the Pacific to shoot photos and gather video footage of some of the creatures living there. The footage serves as reference and inspiration when composing.


Link to the YouTube Aquarium Playlist

I have assembled the clips into short movies in iMovie, removing the sound. The films are uploaded to our YouTube channel so they are available to my students at school and at home.

The photos are edited and loaded to the app so the user can match the visual on the phone to the actual display in the aquarium.

I am building on the compositional skills learned in the “Krapp’s Last Tape” project in the creation of the apps. Using the RJC-1000 software by the RJDJ team, each app can host up to four samples as well as little reactive modules written in Pure Data.

RJDJ, as hosted on an iPhone or iPad, allows programmable access to the microphone, gyroscope, etc…, to modify sounds and filters written in Pure Data. Each “page” of a “scene” can be labeled with an image. Four pages are possible per scene. The project I assigned my students is to compose four reactive compositions/soundscapes, each of which is inspired by a different aspect of a single exhibit at the Aquarium of the Pacific, then perform and record the piece at the aquarium in collaboration with the animals.
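The sensor-to-sound mappings these Pure Data patches perform can be sketched in ordinary code. The following Python sketch is only an illustration of the idea, not RJDJ’s actual API; the ranges, names, and functions are all assumptions:

```python
# Illustrative sketch of the kind of mapping a Pure Data patch inside an
# RJDJ scene might perform: device tilt controls a filter cutoff, and
# microphone level controls how much processed sound is mixed in.

def tilt_to_cutoff(tilt: float, lo: float = 200.0, hi: float = 4000.0) -> float:
    """Map a gyroscope tilt value in [-1, 1] to a filter cutoff in Hz."""
    t = (tilt + 1.0) / 2.0          # normalize to [0, 1]
    return lo + t * (hi - lo)

def mic_to_mix(rms: float, threshold: float = 0.05) -> float:
    """Map microphone RMS level to a wet/dry mix in [0, 1]."""
    return min(1.0, max(0.0, (rms - threshold) / (1.0 - threshold)))
```

In the real patches this logic lives in Pd objects rather than functions, but the design question the students face is the same: which sensor drives which sound parameter, and over what range.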

Once the apps are written and loaded onto devices, we will return to the aquarium, film the composer using the app in front of the display, and record the resulting reactive composition. The composition will then become the soundtrack for the film. We plan to make a documentary as well as art films.

I have been in touch with Marilyn Padilla of the aquarium. She has been very helpful in negotiating releases for the educational uses of our filming and photography. The Aquarium has been very open to our project and approval for the work came in one day! Extraordinary!

Krapp’s Last Tape: Soundwalk 2011 Installation

October 3, 2011



Go to 20’21” to hear our contribution to the Audio Catalog.

The Soundwalk festival happened this past Saturday. The installation created by my students was very well-received. It was particularly interesting to watch the variety of interactions with the piece; I documented some of them in the above video.


Krapp’s Last Tape: Mapping Fiducials to Ableton with ReacTIVision and some Links

September 29, 2011

With our Soundwalk installation date rapidly approaching, my students have just finished their soundscapes, hosted them in Ableton Live 8, and are learning to MIDI map them to ReacTIVision fiducials via OSCulator.
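The daisy chain works roughly like this: ReacTIVision reports each fiducial’s ID, position, and rotation over the network as OSC messages (the TUIO protocol), OSCulator translates those into MIDI, and Ableton Live responds to the MIDI. Here is a hedged Python sketch of the kind of translation involved; the specific note and controller mapping is invented for illustration, not what any student’s project uses:

```python
# Toy translation from a ReacTIVision fiducial event to MIDI values,
# standing in for the role OSCulator plays in the chain. A fiducial's ID
# chooses which note/clip to trigger, and its rotation (0..2π radians)
# becomes a 0..127 controller value.

import math

def fiducial_to_midi(fiducial_id: int, angle_rad: float) -> tuple[int, int]:
    """Map a fiducial's ID and rotation to (MIDI note, CC value)."""
    note = 36 + fiducial_id                               # e.g. C1 + ID
    cc = round((angle_rad % (2 * math.pi)) / (2 * math.pi) * 127)
    return note, cc
```

Placing an object on the table then becomes a note-on, and rotating it becomes a continuous controller sweep, which is exactly the kind of physical-to-sonic metaphor the students have to think through.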

What I have enjoyed about this kind of project is seeing my students not only grapple with the abstractions of composition but also consider multiple modes of interactivity. I think the important lesson is not just learning to execute the project by daisy-chaining software, but dealing with the metaphors of the objects and their associated sounds.

I am especially appreciative of Martin Kaltenbrunner and Ross Bencina for their open-source gift of ReacTIVision, the software that allows fiducial-tracking. This has opened up a creative use of technology to my high school students that otherwise would be well beyond their capabilities, not to mention mine!

Even the simplistic implementation we use has far-reaching implications for metaphoric and abstract thought.

Links:

ReacTIVision

OSCulator

Ableton Live

Krapp’s Last Tape: Proof of Concept Number 1

September 26, 2011

Our Soundwalk entry this year is inspired by Beckett’s “Krapp’s Last Tape”. In the play, Krapp interacts with his recorded self separated by some thirty years.

Our installation seeks to explore the experience of memories through an interactive soundscape.

Participants will visit a table with a tape player and objects of childhood. Placing cassette tapes on the table triggers soundscapes designed by my students. The soundscapes recreate a memory from their past in the way they remember it.

At the same time, the installation itself is a soundscape. The tape player sets the keynote sounds establishing the time and place of the table in the late 20th century. An omnidirectional microphone picks up the recording as well as any ambient sound. Projected in front of the table is a ghosted and delayed video capture of the installation itself.

The participants will see and hear the experience in realtime and delayed by three to five seconds. Our hope is to convey the distortion of time upon the memory of one’s experience.

This proof of concept is to check the delay capabilities of the computer and software over several hours, and to set basic parameters for this part of the project.
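The delay itself is conceptually simple: a fixed-length buffer of frames, so that what plays out is what was captured a few seconds earlier. A minimal Python sketch of the technique, with the frame type left abstract; this is an illustration of the idea, not Zach Poff’s implementation:

```python
# Ring-buffer frame delay: the frame that comes out is the one that went
# in N frames ago (e.g. 30 fps * 4 s = 120 frames for a four-second delay).

from collections import deque

class FrameDelay:
    def __init__(self, fps: int = 30, delay_seconds: float = 4.0):
        self.size = int(fps * delay_seconds)
        self.buffer = deque(maxlen=self.size)

    def process(self, frame):
        """Push a live frame in; return the delayed frame, or None while
        the buffer is still filling at the start."""
        out = self.buffer[0] if len(self.buffer) == self.size else None
        self.buffer.append(frame)     # maxlen evicts the oldest frame
        return out
```

The memory cost is the whole point of the proof of concept: every frame in the delay window has to be held at once, which is why long delays at high resolution stress the machine over several hours.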

The software used for this portion of the installation includes Macam and Live Video Delay by Zach Poff.

From the Macam website:

    macam is a driver for USB webcams on Mac OS X. It allows hundreds of USB webcams to be used by many Mac OS X video-aware applications. The aim is to support as many webcams as possible.

    In addition, macam also supports downloading of images from some dual-mode cameras. macam especially tries to support those cameras not supported by Apple or by their manufacturers.

From Zach Poff’s Website:

    This app was made to answer a question from a student: “How come there’s no free software to delay a live video feed?” The original version used the GPU for fast video processing and storage, but VRAM is too limited for long delays, so version 2010-04-08 uses system memory to store the recorded frames. Stereo audio is also delayed, with independent delay-time so you can tweak the synchronization between picture and sound. Additionally, you can mix between the live and delayed video feeds with a crossfade slider. DV-NTSC runs at a solid 30fps on a MacBook Pro laptop, but HD resolutions might be faster using the older (more limited) version of the app.

Free ebook on Sound Art Creation in Processing: ComputerMusicBlog by Evan Merz

September 19, 2011

I discovered a wonderful computer music/sound art blog by Evan Merz. Today, he posted about his newly published book integrating the Processing language with sound art creation. This opens up sound art possibilities in the digital visual arts without having to learn yet another language.

On his blog, Merz writes:

    Over the past year, I’ve had the pleasure of discovering Oliver Bown’s wonderful sound art library, Beads. Beads is a library for creating and analyzing audio in Processing or Java, and it is head-and-shoulders above the other sound libraries that are available for Processing. From the ground up, Beads is made for musicians and sound artists. It takes ideas popularized by CSound, Max and other popular sound art environments, and intuitively wraps them in the comfort of the Processing programming language.

Click here for a link to his post with links to the various libraries and books!

More Piano Music Videos Using Archive.org

September 10, 2011

Satie Gymnopedie #3

Bach Concerto in f-minor, mvt. 2

I created a music video by editing out clips of planes, then slowing them down by 75%. I think it mirrors the stillness of Satie’s harmonies.

For the Bach, I used a video made for a film class and posted on Archive.org with a CC license. Specifics are cited in the YouTube video comments.

Krapp’s Last Tape Project: Asking My Students to Compare Schafer and Cage

September 8, 2011


In my music appreciation classes, we have been learning about the aesthetics and construction of sound-art and soundscapes as models for their individual contributions to “Krapp’s Last Tape.” Today we watched a short and fascinating video portrait of R. Murray Schafer, the composer and environmentalist.

Schafer coined the term “soundscape”. According to him, there are three main elements of a soundscape:

From the Wikipedia article on soundscapes:

    Keynote sounds
    This is a musical term that identifies the key of a piece, not always audible… the key might stray from the original, but it will return. The keynote sounds may not always be heard consciously, but they “outline the character of the people living there” (Schafer). They are created by nature (geography and climate): wind, water, forests, plains, birds, insects, animals. In many urban areas, traffic has become the keynote sound.
    Sound signals
    These are foreground sounds, which are listened to consciously; examples would be warning devices, bells, whistles, horns, sirens, etc.
    Soundmark
    This is derived from the term landmark. A soundmark is a sound which is unique to an area.

It is interesting to note how Schafer speaks about sound as music and uses the word “noise” pejoratively.

In the following video, John Cage also speaks of sound as music, but without judgement.

His philosophy about listening to sound seems to be codified by Michel Chion’s three modes of listening and Pierre Schaeffer’s reduced listening. Cage derives pleasure from the transient qualities of sounds themselves.

My students watched and discussed these videos as they prepare to fabricate their contribution to “Krapp’s Last Tape”. They will make sound-compositions about a personal memory in a way that gives the listener a window into how they uniquely experience memories. They will consider Schafer’s aesthetic and structure when composing the piece as well as reduced-listening skills when evaluating the sound-material and placement for the composition.
