mostlynoise

A blog following my musical activities

Streaming Experimental Music on Earmeal

September 30, 2012

Camera 2 on me playing toy piano music for the video podcast Earmeal on LA Artstream, September 2012.

Because of my participation in a local Sound Art festival called Soundwalk, I was invited to be part of a video podcast called Earmeal, one of a number of great video podcasts about the Los Angeles Art scene.

The host and producer, Alan Nakagawa, offered me a 30-minute slot to perform anything I wished. His aim is to document the experimental music scene in Los Angeles. I decided to show some of the ways I incorporate experimental music and sound art into my music appreciation class curriculum at Loyola High School of Los Angeles.

After a quick setup and sound check, the cameras went live and I was on for half an hour. Alan was wonderful about setting me at ease, but the two-camera shoot still made me a bit nervous. Plus, I was using technology in my performance. Despite the best-laid plans, gremlins often make appearances when you least want them.

I was really honored to be able to share and document some of my work with Alan. I am not teaching Electronic Music or Sound Art per se; rather, I use these projects in a constructivist manner for Music Appreciation, introducing students who are not necessarily musicians to experimental music composition and performance.

Alan’s work on Earmeal inspires me. It is an important service he is providing for current and future artists, scholars, and enthusiasts. I hope to replicate Alan’s sense of community and altruism in the work I do with students at Loyola High.

The Graphic Score as Instrument

March 31, 2012

Sound Art project 2012, Loyola High School Music Appreciation

Because music is ephemeral, invisible, and abstract, it is peculiarly difficult to talk or write about. These complexities are never more apparent than when trying to compose. Devising a way to efficiently and explicitly communicate sonic intention to another person borders on being a black art! In my music appreciation classes, we examined how scores convey the invisible abstractions of music. We also examined the limitations of traditional notation and studied other composers’ graphic solutions for unique compositional problems.

We spent a week doing some “Deep Listening” exercises as crafted by Pauline Oliveros, reading Michel Chion’s essay on the nature of sound, and examining the notion of soundscape as defined by R. Murray Schafer.

To synthesize these concepts, I had the students make graphic scores of one of their “Deep Listening” experiences. Creating these scores was an opportunity for them to consider compositional issues. They asked themselves questions like “What sounds are demanding my attention and what sounds are in the background?” “How do I notate the Doppler effect of a city bus driving by?” “What sounds are analogs to the musical notion of key?”

As we proceed, the kids blog about their experiences and expressive processes. The projects in this post’s embedded videos represent an important leap: the move from perception to intention. I asked them to score a piece for pencil theremin and electronics where the graphic score is also the instrument.

A small circuit board with a battery and speaker is attached to a regular wooden pencil. Copper tape is attached to one end of the pencil and wrapped around the pencil where it is grasped. More tape connects the top end of the board to the top of the pencil. A metal pushpin is stuck through the tape and into the pencil lead itself. Therefore, when one draws with the pencil and touches the graphite on the paper, the circuit is completed and a sound is emitted from the speaker. The greater the distance between the contact points, the lower the pitch!
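
The post does not specify which oscillator the pencil theremin uses, but one rough way to see why a longer graphite line means a lower pitch is to model it as a Drawdio-style 555 timer running in astable mode, with the drawn line acting as extra series resistance. The sketch below is mine, not the kit’s schematic, and every component value is an assumption.

    # Rough sketch of pitch vs. line length, ASSUMING a Drawdio-style
    # 555 timer in astable mode; the post does not specify the circuit,
    # and every component value here is a guess for illustration.
    #
    # 555 astable frequency: f = 1.44 / ((R1 + 2*R2) * C)
    # The graphite line is treated as extra resistance in series with R2,
    # so a longer drawn line -> larger resistance -> lower pitch.

    def pencil_theremin_freq(trace_ohms, r1=10_000, r2=47_000, c=1e-8):
        """Estimated oscillator frequency (Hz) for a given graphite-trace resistance."""
        return 1.44 / ((r1 + 2 * (r2 + trace_ohms)) * c)

    # Pencil graphite is quite resistive, and the value swings wildly with
    # pencil hardness and pressure; 50 kOhm per centimeter is only a stand-in.
    for cm in (1, 5, 10, 20):
        r_trace = cm * 50_000
        print(f"{cm:>2} cm of line -> roughly {pencil_theremin_freq(r_trace):4.0f} Hz")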

Using the Pure Data patch host provided by the folks at RJDJ, a reactive sound app company, the students built reactive recorders and distorters to expand the variety of sounds and to augment the reality of the live performance.
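
For readers who have never used RJDJ, here is a conceptual sketch, in plain Python rather than Pure Data, of what one of these reactive distorters does: the level of the live microphone signal drives how hard a stored loop gets clipped. It is not any student’s actual patch, and all of the names and numbers are illustrative.

    import numpy as np

    # Conceptual analog, in Python, of a "reactive distorter": the louder
    # the live input, the harder a stored loop is waveshaped. The students'
    # real patches were written in Pure Data; every name here is made up.

    def reactive_distort(mic_block, loop_block):
        """Drive the distortion of a stored loop with the live mic level."""
        level = np.sqrt(np.mean(mic_block ** 2))   # RMS of the incoming audio block
        drive = 1.0 + 40.0 * level                 # louder room -> harder clipping
        return np.tanh(drive * loop_block) / np.tanh(drive)

    # Toy demonstration: a block of noisy "mic" input reshaping a 220 Hz loop.
    sr = 44_100
    t = np.arange(sr // 10) / sr
    loop = 0.5 * np.sin(2 * np.pi * 220 * t)       # the stored loop
    mic = 0.2 * np.random.randn(t.size)            # stand-in for live input
    out = reactive_distort(mic, loop)
    print("output peak:", round(float(np.max(np.abs(out))), 3))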

Despite the pencil theremin’s limited sonic palette, some students devised interesting modes of interaction with their scores and apps.

One of the interesting challenges this project presented was creating the videos. The composers could choose whether they would play their own piece or be one of two cameramen filming the performance. If they played, the nature of the movie was left in someone else’s hands, and vice versa. Either way, they had to give up complete control of their piece.

Loyola High School Music Gathers STE(A)M

January 1, 2012

STEM, a government acronym for studies in Science, Technology, Engineering, and Mathematics, has gained another letter in education circles: A for Arts. I join John Maeda as a proponent of STEAM curriculum. Maeda, a former student and faculty member of MIT and current president of the Rhode Island School of Design, writes:

And so I’ve begun to wonder recently whether STEM needs something to give it some STE(A)M—an “A” for art between the engineering and the math to ground the bits and bytes in the physical world before us, to lift them up and make them human. What if America approached innovation with more than just technology? What if, just like STEM is made up of science, technology, engineering and math, we had IDEA, made of intuition, design, emotion, and art—all the things that make us humans feel, well, human? It seems to me that if we use this moment to reassess our values, putting just a bit of our humanity back into America’s innovation engines will lead to the most meaningful kind of progress. By doing so, we will find a way back to integrating thinking with making and being and feeling and living so that left- and right-brained creativity can lift our economy back into the sky.

– John Maeda in Seed Magazine

I recently discovered that my YouTube video of Steve Reich’s Pendulum Music was embedded in a Scientific American blogpost about “cyborg yeast”. Pendulum Music is a “process piece” that combines the acoustic phenomenon of a feedback loop with the randomness of pendulum swings to create a landscape of slowly shifting pitches and timbres.
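
For readers who don’t know the piece: microphones are swung like pendulums over upturned loudspeakers, and a burst of feedback occurs each time a mic passes over its speaker. A back-of-the-envelope model of the timing (mine, not anything in Reich’s score) falls out of the simple pendulum period, T = 2π√(L/g); slightly different cable lengths give burst patterns that slowly drift out of phase.

    import math

    # Back-of-the-envelope timing model for Pendulum Music (my sketch, not
    # Reich's score): each suspended mic passes over its speaker, and
    # triggers a feedback burst, twice per swing period T = 2*pi*sqrt(L/g).

    G = 9.81  # m/s^2

    def burst_interval(cable_length_m):
        """Approximate seconds between feedback bursts for one mic."""
        period = 2 * math.pi * math.sqrt(cable_length_m / G)
        return period / 2   # two passes over the speaker per full swing

    # Mics hung on slightly different cable lengths drift out of phase,
    # which is where the slowly shifting texture comes from.
    for length in (1.00, 1.05, 1.10):
        print(f"cable {length:.2f} m -> burst every {burst_interval(length):.3f} s")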

Even though my video is only tangentially related to the topic of Christina Agapakis’ post, I believe it serves as one model of how easily the arts may be integrated into STEM topics.

I am grateful that Ms. Agapakis used our music video to illustrate a scientific concept, and I think she used it effectively. Its inclusion introduces an art form and aesthetic to students and audiences who might otherwise never encounter it.

Because art so easily illustrates science concepts, I fear that a general misunderstanding of the arts may make this particular example the default STE(A)M curriculum model simply because it is the path of least resistance. To be pithy, I imagine well-intentioned teachers assigning crafts and not arts. Art engages the heart and mind. Students deal with metaphor and with expressing ideas. Artists grapple with technique and with communicating the ephemeral, outside the semantics of science, while simultaneously applying scientific principles.

My colleagues at Loyola High School are very open to substantive STE(A)M work, letting our students get their hands dirty in both the arts and the sciences. My recent Fiskabur project and our joint Physics of Music lectures are great examples.

An Unwitting Collaboration: Props from Video Artist David Montgomery

December 13, 2011

While checking my email this morning, I found a nice note and blogpost from artist David Montgomery. He wrote about a video on which he unknowingly collaborated with me: I took audio from a concert where I played John Cage’s “In A Landscape” and, by applying “Cagean” procedures to David’s Dandelion loops, created a music video.

I sent the short film to David. He was very complimentary!

Steven Speciale is an inspired music teacher and choir director at Loyola High School in Los Angeles. When I post my work to the Internet Archive I think of it being used for VJing or video remix though I always hold out hope that it could be used in education. Steven Speciale managed to combine all of the above in brilliant fashion. I guess it shouldn’t be a surprise that someone from LA, home to LAVA (one of the most established video art coalitions in the US), who is a mainstay at SoundWalk in Long Beach would make the most innovative use of the Dandelion Free Culture video loops to date.

Personally, it is very satisfying and validating to have one’s work acknowledged by colleagues. The reality is that none of this would have happened without the culture of sharing promoted by the open-source movement, and by extension, social media. David Montgomery would appear to live in Florida; I live in California. Our paths only crossed in this virtual realm. Because they did, we both have benefited, and because we benefit, my students benefit.

Fiskabur 2011: A Milestone for Collaboration

December 7, 2011

Fiskabur 2011

With the export of Movement of Jellyfish by Loyola High School student Leopoldo Magana, Fiskabur 2011 has fulfilled my original intent: to create and execute a cross-curricular project where students across disciplines contribute to the final product, and where that final product has “legs,” its existence furthering other educational objectives.

Fiskabur Ontogenesis

After a lecture on the Physics of Music to both the music and physics classes, the music students constructed piezoelectric microphones while some of the physics students built musical instruments.

The music students borrowed the newly made instruments, sampled them with their homebrewed microphones, and composed loops. A small group of the music students went to the Aquarium of the Pacific in Long Beach and filmed many of the exhibits, footage I then put on our YouTube channel.

Drawing inspiration from these videos, the music students composed soundscapes and music from their loops. These loops were loaded into reactive apps of their own design through the RJC-1000 software. Paired with still pictures of the fish that inspired the compositions, the music was given a layer of sonic augmented reality.

The apps were loaded onto iPhones, iPads, and iPods, taken back to the Aquarium of the Pacific, and “performed” and recorded in front of the fish that inspired the music. A team of student filmmakers accompanied the composers to gather footage of the project and the creatures themselves.

All of the resulting loops, videos, samples, and RJDJ pieces were posted on the internet with Creative Commons licenses so all of the musicians and filmmakers could share resources.

The Movement of Jellyfish video embedded above is the first all-student product to emerge from Fiskabur. It will be hosted on our YouTube channel, a website dedicated to Fiskabur 2011, and offered to the AP Biology classes as a resource for animal locomotion, one of the curricular points for the course.

Special thanks to my Loyola colleagues: Lance Oschner, Fr. John Quinn, & Craig Bouma for their participation and support. Thanks to the Long Beach Aquarium of the Pacific, and Marilyn Padilla, in particular, for their permission and support of our filming and recording on the premises.

Fiskabur Samples Released into the Wild via Creative Commons

October 27, 2011

After the annual “Physics of Music” lecture/demonstration I give each fall to our science students, some of the kids create musical instruments based on the acoustic principles we cover.

This year, my music students borrowed some of those instruments and recorded improvisations with contact mics they built from piezo discs.


These loops, some distorted, some not, were loaded onto my SoundCloud account with a Creative Commons license that allows derivative works on the condition that subsequent works are similarly licensed.

My students are writing music inspired by the creatures living at the Aquarium of the Pacific using these loops as source material.

Quote of the Week: John Cage and some Fiskabur Thoughts

October 15, 2011

by Lars Ploughman, Creative Commons

    Ideas are one thing and what happens is another.

    – John Cage

Within the RJDJ scene creation software, RJC-1000, I can make four pages for my iPhone, each hosting samples or Pure Data patches. I am reconsidering exactly what to assign my students to compose for each of their pages. At first, I gave them free rein to compose whatever they wanted. After some consideration, I think there is more artistic and pedagogical value in stipulating an exact structure for each section while leaving the subject of the composition open to interpretation.

I have decided to do more work with them on metaphor, the Cagean aesthetics of chance, and some of the music-installation work of David Tudor. Perhaps we will go to the Getty to see David Tudor’s “Sea Tails,” which was on permanent display. I believe in this kind of intermedia work when teaching this level of abstraction because it offers my students multiple points of entry into the aesthetic.

    My current plan for the pages is as follows:

    Page 1- Close view of a sea creature. Through-composed soundtrack sample in MIDI inspired by animal locomotion and/or behavior.

    Page 2- Sonic-portrait of the animal created entirely from samples recorded and digitally manipulated by the students using homemade piezo microphones, digital recorders, and computers.

    Page 3- General portrait of the entire display, including plant life, rocks, light, etc. Students may use MIDI, samples, anything at their disposal. I especially want them to search for metaphor in their expression.

    Page 4- Student’s choice. Students may express anything they like about the creature in sound.

    For each of the pages, the level of interactivity or augmented reality is left to the student, as long as they can explain their choice.

I am excited to see the results of their work. Next, I will meet with the student filmmakers to explain the project and develop some ideas with them for a documentary about the process, as well as art films.

Promoting my Aquarium App Project (code-named Fiskabur)

October 14, 2011

I have decided that if a tree falls in the woods and no one was around to hear it, it still fell. The important part is that no one was around to hear it!

I am learning how to promote my work. While I am not in it for profit, creating marketing-like materials helps me to build a portfolio/diary of the work I do, builds pride and community within the team of people or students sharing the project workload, and increases awareness and support of our activities.

I have begun to create posters, whether real or virtual, for as many projects as possible. I currently do not know how to use Photoshop or any of the Adobe design products, so I have made all of my posters in Pages. I shoot pics with my phone or a point-and-shoot and cobble together cartoon drawings with the shape tools.

Aquarium Reactive App Project Overview: 2011

October 13, 2011


One of my music appreciation classes is beginning a new project: writing music inspired by sea creatures, then embedding those songs in an audio augmented-reality app.

The project began with students going to the Long Beach Aquarium of the Pacific to shoot images and gather film footage of some of the creatures living there. The footage serves as reference and inspiration when composing.


Link to the YouTube Aquarium Playlist

I have assembled the clips into small movies in iMovie, removing the sound. The short films are uploaded to our YouTube channel so they are available to my students at school and at home.

The photos are edited and loaded into the app so the user can match the visual on the phone to the actual display in the aquarium.

In creating the apps, I am building on the compositional skills learned in the “Krapp’s Last Tape” project. Using the RJC-1000 software by the RJDJ team, each app can host up to four samples as well as little reactive modules written in Pure Data.

RJDJ, as hosted on an iPhone or iPad, allows programmable access to the microphone, gyroscope, and other sensors to modify sounds and filters written in Pure Data. Each “page” of a “scene” can be labeled with an image, and four pages are possible per scene. The project I assigned my students is to compose four reactive compositions/soundscapes, each inspired by a different aspect of a single exhibit at the Aquarium of the Pacific, then perform and record the piece at the aquarium in collaboration with the animals.
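
To make the assignment concrete, here is a hypothetical outline of one student scene, written as a plain Python structure rather than in the actual RJC-1000 project format (which I am not reproducing here); every file name and label is invented for illustration.

    # Hypothetical outline of one student "scene" -- plain Python, not the
    # real RJC-1000 project format, and every name below is invented.
    # Each page pairs an image of the exhibit with a sample or Pure Data
    # patch and the sensor input that drives it.
    scene = {
        "title": "Moon Jelly Exhibit",
        "pages": [
            {"image": "jelly_closeup.jpg", "source": "locomotion_theme.mid",
             "reactive_input": None},           # Page 1: fixed, through-composed
            {"image": "jelly_tank.jpg", "source": "piezo_portrait.pd",
             "reactive_input": "microphone"},   # Page 2: room sound reshapes samples
            {"image": "tank_wide.jpg", "source": "habitat_soundscape.pd",
             "reactive_input": "gyroscope"},    # Page 3: tilting the phone sweeps a filter
            {"image": "student_choice.jpg", "source": "free_piece.pd",
             "reactive_input": "microphone"},   # Page 4: student's choice
        ],
    }

    for number, page in enumerate(scene["pages"], start=1):
        print(f"Page {number}: {page['source']} (reacts to: {page['reactive_input']})")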

Once the apps are written and loaded onto devices, we will return to the aquarium, film the composer using the app in front of the display, and record the resulting reactive composition. The composition will then become the soundtrack for the film. We plan to make a documentary as well as art films.

I have been in touch with Marilyn Padilla of the aquarium. She has been very helpful in negotiating releases for the educational uses of our filming and photography. The Aquarium has been very open to our project and approval for the work came in one day! Extraordinary!

“Krapp’s Last Tape” on ReacTIVision Vimeo Channel

October 7, 2011

One of our “Krapp’s Last Tape” videos has been featured on the ReacTIVision Vimeo channel. I encourage you to surf around this channel and see the astounding variety of projects made possible by the open-source release of the ReacTIVision software.
