mostlynoise

A blog following my musical activities

The Graphic Score as Instrument

March 31, 2012

Sound Art project 2012, Loyola High School Music Appreciation

Because music is ephemeral, invisible, and abstract, it is peculiarly difficult to talk or write about. These complexities are never more apparent than when trying to compose. Devising a way to efficiently and explicitly communicate sonic intention to another person borders on being a black art! In my music appreciation classes, we examined how scores convey the invisible abstractions of music. We also examined the limitations of traditional notation and studied other composers’ graphic solutions to unique compositional problems.

We spent a week doing some “Deep Listening” exercises as crafted by Pauline Oliveros, reading Michel Chion’s essay on the nature of sound, and examining the notion of soundscape as defined by R. Murray Schafer.

To synthesize these concepts, I had the students make graphic scores of one of their “Deep Listening” experiences. Creating these scores was an opportunity for them to consider compositional issues. They asked themselves questions like “What sounds are demanding my attention and what sounds are in the background?” “How do I notate the Doppler effect of a city bus driving by?” “What sounds are analogs to the musical notion of key?”

As we proceed, the kids blog about their experiences and expressive processes. The projects in this post’s embedded videos represent an important leap: the move from perception to intention. I asked them to score a piece for pencil theremin and electronics where the graphic score is also the instrument.

A small circuit board with a battery and speaker is attached to a regular wooden pencil. Copper tape is attached to one end of the pencil and wrapped around it where it is grasped. More tape connects the top end of the board to the top of the pencil. A metal pushpin is stuck through the tape and into the pencil lead itself. When one draws with the pencil and touches the graphite on the paper, the circuit is completed and a sound is emitted from the speaker. The greater the distance between the contact points, the lower the pitch!
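
For the technically inclined: pencil synths like this are commonly built around a 555-style astable oscillator, with the drawn graphite line acting as a variable resistor. Below is a minimal sketch of the pitch behavior in Python, assuming that circuit and using made-up component values (our boards may differ).

    # Rough model of a pencil theremin, assuming a 555-style astable oscillator.
    # The drawn graphite line acts as a variable resistor: a longer line between
    # the contact points means more resistance, hence a lower pitch.

    def astable_frequency(r1_ohms, r_graphite_ohms, c_farads):
        # Standard 555 astable formula: f = 1.44 / ((R1 + 2*R2) * C)
        return 1.44 / ((r1_ohms + 2 * r_graphite_ohms) * c_farads)

    R1 = 10_000  # fixed resistor, ohms (hypothetical value)
    C = 1e-8     # timing capacitor, 10 nF (hypothetical value)

    # Graphite resistance grows roughly with the length of the drawn line.
    for length_cm, r_graphite in [(2, 50_000), (10, 250_000), (30, 750_000)]:
        f = astable_frequency(R1, r_graphite, C)
        print(f"{length_cm:>3} cm line (~{r_graphite} ohms) -> {f:6.0f} Hz")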

Using the Pure Data patch host provided by the folks at RJDJ, a reactive sound app company, the students composed reactive recorder/distorters to expand the variety of sounds and to augment the reality of the live performance.
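
The students’ patches are graphical Pure Data and hard to reproduce here, but the heart of a recorder/distorter is easy to sketch. Here is a conceptual illustration in Python/NumPy (mine, not theirs): record a buffer, then push it through a soft-clipping waveshaper.

    import numpy as np

    def waveshape_distort(samples, drive=4.0):
        # Soft-clip distortion: scale the signal up, then squash it with tanh.
        # Higher 'drive' pushes more of the wave into the flat part of the curve.
        return np.tanh(drive * samples) / np.tanh(drive)

    # Stand-in for a recorded pencil-theremin phrase: a decaying 440 Hz tone.
    sr = 44100
    t = np.linspace(0, 1.0, sr, endpoint=False)
    recording = 0.8 * np.sin(2 * np.pi * 440 * t) * np.exp(-3 * t)

    distorted = waveshape_distort(recording, drive=6.0)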

In spite of the pencil theremin’s limited sonic palette, some students devised interesting modes of interaction with their scores and apps.

One of the interesting challenges this project presented was the video creation. The composers could choose whether to play their own piece or to be one of two cameramen filming the performance. If they played, the nature of the movie was left in someone else’s hands, and vice versa. Either way, they had to give up complete control of their piece.

Loyola High School Music Gathers STE(A)M

January 1, 2012

STEM, a government acronym for studies in Science, Technology, Engineering, and Mathematics, has gained another letter in education circles: A, for Arts. I join John Maeda as a proponent of a STEAM curriculum. Maeda, a former student and faculty member at MIT and the current president of the Rhode Island School of Design, writes:

And so I’ve begun to wonder recently whether STEM needs something to give it some STE(A)M—an “A” for art between the engineering and the math to ground the bits and bytes in the physical world before us, to lift them up and make them human. What if America approached innovation with more than just technology? What if, just like STEM is made up of science, technology, engineering and math, we had IDEA, made of intuition, design, emotion, and art—all the things that make us humans feel, well, human? It seems to me that if we use this moment to reassess our values, putting just a bit of our humanity back into America’s innovation engines will lead to the most meaningful kind of progress. By doing so, we will find a way back to integrating thinking with making and being and feeling and living so that left- and right-brained creativity can lift our economy back into the sky.

– John Maeda in Seed Magazine

I recently discovered that my YouTube video of Steve Reich’s Pendulum Music was embedded in a Scientific American blog post about “cyborg yeast”. Pendulum Music is a “process piece” that combines the acoustic phenomenon of a feedback loop with the randomness of pendulum swings to create a landscape of slowly shifting pitches and timbres.
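
As a back-of-the-envelope illustration of why the piece evolves as it does (my own model, not anything in Reich’s score): each microphone swings over a speaker, feedback sounds whenever the mic passes near the speaker, and as friction damps the swing, the feedback bursts lengthen until they merge into a continuous tone.

    import numpy as np

    # Toy model of Pendulum Music: a mic on a cable of length L swings over a
    # speaker; feedback sounds whenever the mic is nearly overhead.
    g, L = 9.81, 1.5                 # gravity (m/s^2), cable length (m)
    omega = np.sqrt(g / L)           # small-angle angular frequency
    theta0, damping = 0.4, 0.02      # initial swing (rad), decay rate (1/s)

    t = np.linspace(0, 120, 12000)   # two minutes of swinging
    theta = theta0 * np.exp(-damping * t) * np.cos(omega * t)

    # Feedback is "on" when the mic is within a small angle of the speaker.
    feedback_on = np.abs(theta) < 0.05

    # As the swing decays, the fraction of time spent feeding back grows toward
    # 100% -- the pulses merge into a continuous tone.
    for start in (0, 40, 80):
        window = (t >= start) & (t < start + 40)
        print(f"{start:>3}-{start + 40}s: feedback {feedback_on[window].mean():.0%} of the time")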

Even though my video is only tangentially related to the topic of Christina Agapakis’s post, I believe it serves as one model of how easily the arts may be integrated into STEM topics.

I am grateful that Ms. Agapakis used our music video to illustrate a scientific concept. I think she uses it effectively. Its inclusion introduces an artform and aesthetic to students and audiences who might otherwise not encounter it.

Because art so easily illustrates science concepts, I fear that general artistic misunderstanding may make this particular example the default STE(A)M curriculum model simply because it is the path of least resistance. To be pithy, I imagine well-intentioned teachers assigning crafts, not arts. Art engages the heart and the mind. Students deal with metaphor and with expressing ideas. Artists grapple with technique, communicating the ephemeral without the semantics of science while simultaneously applying scientific principles.

My colleagues at Loyola High School are very open to substantive STE(A)M work, letting our students get their hands dirty in both the arts and the sciences. My recent Fiskabur project and our joint Physics of Music lectures are great examples of such work.

An Unwitting Collaboration: Props from Video Artist David Montgomery

December 13, 2011

While checking my email this morning, I found a nice note and blog post from artist David Montgomery. He wrote about a video on which he unknowingly collaborated with me: I took audio from a concert where I played John Cage’s “In A Landscape” and, by applying “Cagean” procedures to David’s Dandelion loops, created a music video.

I sent the short film to David. He was very complimentary!

Steven Speciale is an inspired music teacher and choir director at Loyola High School in Los Angeles. When I post my work to the Internet Archive I think of it being used for VJing or video remix though I always hold out hope that it could be used in education. Steven Speciale managed to combine all of the above in brilliant fashion. I guess it shouldn’t be a surprise that someone from LA, home to LAVA (one of the most established video art coalitions in the US), who is a mainstay at SoundWalk in Long Beach would make the most innovative use of the Dandelion Free Culture video loops to date.

Personally, it is very satisfying and validating to have one’s work acknowledged by colleagues. The reality is that none of this would have happened without the culture of sharing promoted by the open-source movement and, by extension, social media. David Montgomery appears to live in Florida; I live in California. Our paths crossed only in this virtual realm. Because they did, we both benefited, and because we benefit, my students benefit.

Fiskabur 2011: A Milestone for Collaboration

December 7, 2011

Fiskabur 2011

With the export of Movement of Jellyfish by Loyola High School student Leopoldo Magana, Fiskabur 2011 has fulfilled my original intent: to create and execute a cross-curricular project in which students across disciplines contribute to the final product, and in which that product has “legs”, furthering other educational objectives.

Fiskabur Ontogenesis

After a lecture on the Physics of Music given to both the music and physics classes, the music students constructed piezoelectric microphones while some of the physics students built musical instruments.

The music students borrowed the newly made instruments, sampled them with their homebrewed microphones, and composed loops. A small group of the music students went to the Aquarium of the Pacific in Long Beach and filmed many exhibits, which I put on our YouTube channel.

Drawing inspiration from these videos, the music students composed soundscapes and music from their loops. The loops were loaded onto reactive apps of their own design through the RJC-1000 software. Paired with still pictures of the fish that inspired them, the compositions gained a layer of sonic augmented reality.

The apps were loaded onto iPhones, iPads, and iPods, taken back to the Aquarium of the Pacific, and “performed” and recorded in front of the fish that inspired the music. A team of student filmmakers accompanied the composers to gather footage of the project and the creatures themselves.

All of the resulting loops, videos, samples, and RJDJ pieces were posted on the internet with Creative Commons licenses so all of the musicians and filmmakers could share resources.

The Movement of Jellyfish video embedded above is the first all-student product to emerge from Fiskabur. It will be hosted on our YouTube channel and on a website dedicated to Fiskabur 2011, and it will be offered to the AP Biology classes as a resource on animal locomotion, one of the curricular points for the course.

Special thanks to my Loyola colleagues: Lance Oschner, Fr. John Quinn, & Craig Bouma for their participation and support. Thanks to the Long Beach Aquarium of the Pacific, and Marilyn Padilla, in particular, for their permission and support of our filming and recording on the premises.

Fiskabur Samples released into the wild via Creative Commons

October 27, 2011

After the annual “Physics of Music” lecture/demonstration I give to our science students each fall, some of the kids create musical instruments based on the acoustic principles it covers.
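
To give a flavor of the acoustics involved, here is a worked example using the standard formulas for a stretched string and an open pipe; the dimensions are hypothetical, not measurements from the students’ instruments.

    import math

    def string_fundamental(length_m, tension_n, mass_per_length_kg_m):
        # Fundamental of a stretched string: f = (1 / 2L) * sqrt(T / mu)
        return math.sqrt(tension_n / mass_per_length_kg_m) / (2 * length_m)

    def open_pipe_fundamental(length_m, speed_of_sound=343.0):
        # Fundamental of a pipe open at both ends: f = v / 2L
        return speed_of_sound / (2 * length_m)

    # Hypothetical dimensions for a homemade string and a homemade pipe.
    print(f"65 cm string: {string_fundamental(0.65, 60.0, 0.0011):.0f} Hz")
    print(f"40 cm pipe:   {open_pipe_fundamental(0.40):.0f} Hz")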

This year, my music students borrowed some of those instruments and recorded improvisations with contact mics they built from piezo discs.


These loops, some distorted, some not, were uploaded to my SoundCloud account with a Creative Commons license that allows derivative works on the condition that subsequent works are similarly licensed.

My students are writing music inspired by the creatures living at the Aquarium of the Pacific using these loops as source material.

Quote of the Week: John Cage and some Fiskabur Thoughts

October 15, 2011

(Photo by Lars Ploughman, Creative Commons)

    Ideas are one thing and what happens is another.

    -John Cage

Within RJC-1000, the RJDJ scene-creation software, I can make four pages for my iPhone, each hosting samples or Pure Data patches. I am reconsidering exactly what to assign my students to compose for each of their pages. At first, I gave them free rein to compose whatever they wanted. After some consideration, I think there is more artistic and pedagogical value in stipulating an exact structure for each section while leaving the subject of the composition open to interpretation.

I have decided to do more work with them on metaphor, the Cagean aesthetics of chance, and some of the music-installation work of David Tudor. Perhaps we will go to the Getty to see Tudor’s “Sea Tails”, which is on permanent display there. I believe in intermedia work when teaching this kind of abstraction because it gives my students multiple opportunities to enter into the aesthetic.

    My current plan for the pages is as follows:

    Page 1- Close view of a sea creature. Through-composed soundtrack sample in MIDI inspired by animal locomotion and/or behavior.

    Page 2- Sonic-portrait of the animal created entirely from samples recorded and digitally manipulated by the students using homemade piezo microphones, digital recorders, and computers.

    Page 3- General portrait of the entire display including plant life, rocks, light, etc… Students may use MIDI, samples, anything at their disposal. I especially want them to search for metaphor in their expression.

    Page 4- Student’s choice. Students may express anything they like about the creature in sound.

    For each of the pages, the level of interactivity or augmented reality is left to the student, as long as they can explain their choice.

I am excited to see the results of their work. Next, I will meet with the student filmmakers to explain the project and develop some ideas with them for a documentary about the process and for art films.

Aquarium Reactive App Project Overview: 2011

October 13, 2011


One of my music appreciation classes is beginning a new project: writing music inspired by sea creatures, then embedding those songs in an audio augmented-reality app.

The project began with students going to the Long Beach Aquarium of the Pacific to shoot images and gather film footage of some of the creatures living there. The footage serves as reference and inspiration when composing.


Link to the YouTube Aquarium Playlist

I have assembled the clips into small movies in iMovie, removing the sound. The short films are uploaded to our YouTube channel so they are available to my students at school and at home.

The photos are edited and loaded into the app so the user can match the visual on the phone to the actual display in the aquarium.

I am building on the compositional skills learned in the “Krapp’s Last Tape” project in the creation of the apps. Using the RJC-1000 software by the RJDJ team, each app can host up to four samples as well as little reactive modules written in Pure Data.

RJDJ, as hosted on an iPhone or iPad, gives programmable access to the microphone, gyroscope, etc., for driving sounds and filters written in Pure Data. Each “page” of a “scene” can be labeled with an image, and four pages are possible per scene. The project I assigned my students is to compose four reactive compositions/soundscapes, each inspired by a different aspect of a single exhibit at the Aquarium of the Pacific, then to perform and record the piece at the aquarium in collaboration with the animals.
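
The patches themselves are graphical Pure Data, but the kind of mapping a reactive page performs is easy to sketch. Here is a conceptual stand-in in Python (my illustration; the sensor names and ranges are hypothetical): sensor readings in, sound parameters out.

    # Conceptual stand-in for one "tick" of an RJDJ reactive page: map device
    # sensors to sound parameters. The real modules are written in Pure Data.

    def map_range(value, in_lo, in_hi, out_lo, out_hi):
        # Linearly rescale a sensor reading into a sound-parameter range.
        value = max(in_lo, min(in_hi, value))
        span = (value - in_lo) / (in_hi - in_lo)
        return out_lo + span * (out_hi - out_lo)

    def react(mic_level, tilt_radians):
        return {
            # Louder ambient sound at the tank -> louder sample playback.
            "sample_volume": map_range(mic_level, 0.0, 1.0, 0.2, 1.0),
            # Tilting the phone sweeps a filter, as if waving it at the fish.
            "filter_cutoff_hz": map_range(tilt_radians, -1.5, 1.5, 200, 8000),
        }

    print(react(mic_level=0.6, tilt_radians=0.3))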

Once the apps are written and loaded onto devices, we will return to the aquarium, film the composer using the app in front of the display, and record the resulting reactive composition. The composition will then become the soundtrack for the film. We plan to make a documentary as well as art films.

I have been in touch with Marilyn Padilla of the aquarium. She has been very helpful in negotiating releases for the educational uses of our filming and photography. The Aquarium has been very open to our project and approval for the work came in one day! Extraordinary!

Krapp’s Last Tape: Soundwalk 2011 Installation

October 3, 2011



Go to 20’21” to hear our contribution to the Audio Catalog.

The Soundwalk festival happened this past Saturday. The installation created by my students was very well-received. It was particularly interesting to watch the variety of interactions with the piece; I documented some of them in the above video.

Krapp’s Last Tape: Mapping Fiducials to Ableton with ReacTIVision and some Links

September 29, 2011

With our Soundwalk installation date rapidly approaching, my students have just finished their soundscapes, hosted them in Ableton Live 8, and are learning to MIDI-map them to ReacTIVision fiducials via OSCulator.
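
OSCulator handles the translation for us, but the idea is simple enough to sketch. The fragment below (an illustration using the third-party python-osc and mido libraries, not our actual setup) listens for ReacTIVision’s TUIO messages and fires a MIDI note when a fiducial reports its position; a real mapping would also debounce the repeated “set” updates TUIO sends while an object sits on the table.

    from pythonosc import dispatcher, osc_server  # pip install python-osc
    import mido                                   # pip install mido python-rtmidi

    midi_out = mido.open_output()                 # first available MIDI port

    def on_tuio_object(address, *args):
        # ReacTIVision sends /tuio/2Dobj messages; "set" updates carry the
        # fiducial's session id, class id, and position.
        if args and args[0] == "set":
            fiducial_id = int(args[2])
            note = 36 + fiducial_id % 48          # hypothetical note mapping
            midi_out.send(mido.Message("note_on", note=note, velocity=100))

    disp = dispatcher.Dispatcher()
    disp.map("/tuio/2Dobj", on_tuio_object)

    # ReacTIVision sends TUIO over OSC to port 3333 by default.
    server = osc_server.BlockingOSCUDPServer(("127.0.0.1", 3333), disp)
    server.serve_forever()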

What I have enjoyed about this kind of project is seeing my students not only grapple with the abstractions of composition but also consider multiple modes of interactivity. The important lesson is not just learning to execute the project by daisy-chaining software, but dealing with the metaphors of the objects and their associated sounds.

I am especially appreciative of Martin Kaltenbrunner and Ross Bencina for their open-source gift of ReacTIVision, the software that makes fiducial tracking possible. It has opened up a creative use of technology to my high school students that would otherwise be well beyond their capabilities, not to mention mine!

Even the simple implementation we use has far-reaching implications for metaphoric and abstract thought.

Links:

ReacTIVision

Osculator

Ableton Live

Krapp’s Last Tape: Proof of Concept Number 1

September 26, 2011

Our Soundwalk entry this year is inspired by Beckett’s “Krapp’s Last Tape”. In the play, Krapp interacts with his recorded self separated by some thirty years.

Our installation seeks to explore the experience of memories through an interactive soundscape.

Participants will visit a table with a tape player and objects of childhood. Placing cassette tapes on the table triggers soundscapes designed by my students. Each soundscape recreates a memory from a student’s past in the way they remember it.

At the same time, the installation itself is a soundscape. The tape player sets the keynote sounds, establishing the time and place of the table in the late 20th century. An omnidirectional microphone picks up the recording as well as any ambient sound. Projected in front of the table is a ghosted and delayed video capture of the installation itself.

The participants will see and hear the experience both in real time and delayed by three to five seconds. Our hope is to convey the distorting effect of time upon the memory of one’s experience.

This proof of concept checks the delay capabilities of the computer and software over several hours and sets basic parameters for this part of the project.

The software used for this portion of the installation includes Macam and Live Video Delay by Zach Poff.

From the Macam website:

    macam is a driver for USB webcams on Mac OS X. It allows hundreds of USB webcams to be used by many Mac OS X video-aware applications. The aim is to support as many webcams as possible.

    In addition, macam also supports downloading of images from some dual-mode cameras. macam especially tries to support those cameras not supported by Apple or by their manufacturers.

From Zach Poff’s Website:

    This app was made to answer a question from a student: “How come there’s no free software to delay a live video feed?” The original version used the GPU for fast video processing and storage, but VRAM is too limited for long delays, so version 2010-04-08 uses system memory to store the recorded frames. Stereo audio is also delayed, with independent delay-time so you can tweak the synchronization between picture and sound. Additionally, you can mix between the live and delayed video feeds with a crossfade slider. DV-NTSC runs at a solid 30fps on a MacBook Pro laptop, but HD resolutions might be faster using the older (more limited) version of the app.
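
For anyone curious how such a delay works, here is a minimal sketch of the same frame-buffer idea in Python with OpenCV (an illustration, not the software we actually ran): store incoming frames in a fixed-length buffer and display the oldest one.

    import collections
    import cv2  # pip install opencv-python

    DELAY_SECONDS, FPS = 4, 30  # our installation targets a 3-5 second delay
    frames = collections.deque(maxlen=DELAY_SECONDS * FPS)

    cap = cv2.VideoCapture(0)   # default webcam
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        frames.append(frame)
        # Until the buffer fills, this shows a recent frame; once full, the
        # leftmost frame lags the present by DELAY_SECONDS.
        cv2.imshow("delayed", frames[0])
        if cv2.waitKey(int(1000 / FPS)) & 0xFF == ord("q"):
            break

    cap.release()
    cv2.destroyAllWindows()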