Graphic notation evolved in the mid-20th century as composers sought to convey musical ideas which could not be effectively communicated via traditional musical notation. Composers such as Karlheinz Stockhausen, George Crumb, and John Cage incorporated new symbols, shapes, and visual devices into their scores.
György Ligeti composed the electronic piece “Artikulation” in 1956. In 1970, designer Rainer Wehinger revisited the piece with his own “Hörpartitur” (“listening score”). Wehinger transcribed the piece, mapping the x-axis to time, the y-axis to pitch, and shape and color to timbre.
Between 1963 and 1967, composer and draughtsman Cornelius Cardew created his 193-page graphic score, “Treatise.” Unlike many other graphic scores, “Treatise” does not include any explicit instructions as to how its multitude of symbols are to be interpreted. Consequently, every performance of the piece is completely different.
In 2002, I created an electronic realization of this piece. My aim was, in a sense, to let the beautiful visual design speak for itself, using a very straightforward interpretive paradigm. Sine waves are generated from the black areas of the score as it scrolls from right to left, with the y-axis corresponding to pitch. An imaginary vertical line in the center of the screen is the “sounding membrane.”
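The core of that interpretive paradigm can be sketched in a few lines: at each animation frame, read the column of pixels currently under the membrane and sum a sine wave for every black pixel, with vertical position mapped to frequency. The pitch range and the logarithmic frequency mapping below are my illustrative assumptions, not the original app’s exact values.

```python
import numpy as np

SAMPLE_RATE = 44100
F_LO, F_HI = 55.0, 1760.0  # assumed pitch range (low to high), for illustration

def column_to_audio(column, duration=0.05):
    """Render one pixel column at the "sounding membrane" as summed sines.

    `column` is a 1-D boolean array, True where the score is black;
    index 0 is the top of the image (highest pitch).
    """
    n = len(column)
    t = np.arange(int(SAMPLE_RATE * duration)) / SAMPLE_RATE
    out = np.zeros_like(t)
    for row, black in enumerate(column):
        if black:
            # Map row position to frequency: top of the image = high pitch.
            frac = 1.0 - row / (n - 1)
            freq = F_LO * (F_HI / F_LO) ** frac  # logarithmic pitch axis
            out += np.sin(2 * np.pi * freq * t)
    peak = np.abs(out).max()
    return out / peak if peak > 0 else out   # normalize to avoid clipping
```

Calling this once per frame as the score scrolls, and crossfading successive buffers, would approximate the continuous scanning behavior described above.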
In 2002, I also created an app to generate an animated graphic score from musical input (reminiscent of Wehinger’s approach, creating a score from Ligeti’s music, rather than music from a score). “Correspondance” was made using Director and Flash to accept input from MIDI (Musical Instrument Digital Interface). It generates animated visuals from realtime MIDI note input (specifically pitch, velocity, and duration). “Correspondance” has a number of interpretation paradigms, mapping the MIDI input to various visual characteristics (x/y position, size, color, etc).
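One of those interpretation paradigms might look like the following sketch: each incoming MIDI note becomes a visual glyph, with pitch mapped to vertical position, velocity to size, duration to color, and elapsed time to horizontal position. The specific mappings, ranges, and the `Glyph` structure are hypothetical stand-ins for whatever Correspondance actually used.

```python
from dataclasses import dataclass

@dataclass
class Glyph:
    x: float     # normalized 0..1 horizontal screen position
    y: float     # normalized 0..1 vertical position (0 = top)
    size: float  # drawn radius in pixels
    hue: float   # 0..360 color wheel angle

def note_to_glyph(pitch, velocity, duration, elapsed):
    """Map one MIDI note event to a glyph (one hypothetical paradigm).

    pitch: MIDI note number 0..127; velocity: 1..127;
    duration and elapsed: seconds.
    """
    return Glyph(
        x=elapsed % 1.0,                      # wrap across the screen over time
        y=1.0 - pitch / 127.0,                # high notes appear near the top
        size=velocity / 127.0 * 40.0,         # louder notes draw larger shapes
        hue=min(duration, 4.0) / 4.0 * 360.0, # longer notes shift the hue
    )
```

Swapping in a different `note_to_glyph` function is all it would take to define a new paradigm, which is roughly how a menu of interpretation modes could be organized.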
Correspondance also has a networked option where two performers can link to each other. Each performer sees the animation generated by the other player’s musical performance. An audiovisual feedback loop can be created if each performer treats the visuals as an animated graphic score. Unfortunately, the app only works on Mac OS 9, so it’s not currently available.
In 2006, I created “Fixed and Fallen” using Flash. This app detects audio input via microphone and generates a graphic score which builds and decays simultaneously. It can generate a kind of map of the overall rhythm and dynamics of a piece of music.
Every fraction of a second, the loudness of the incoming sound is analyzed and a new circle is added to a grid. The size and color of each circle are determined by the loudness at the moment it is generated (large & red = loud; small & blue = quiet). The entire grid also pulsates in realtime in response to loudness, and each circle, fixed at first, eventually falls or drifts off the screen. I used an excerpt of Brian Eno’s “Ambient 1: Music for Airports” in this example because of its long stretches of quiet punctuated by piano notes.
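The loudness-to-circle mapping described above can be sketched as a simple linear interpolation between the two extremes. The pixel sizes and the pure red-to-blue color ramp are illustrative guesses, not the values Fixed and Fallen actually used.

```python
def loudness_to_circle(rms):
    """Map an RMS loudness value in [0, 1] to (radius_px, rgb_color).

    Loud -> large & red; quiet -> small & blue, interpolating linearly.
    The 4..40 px range and pure red/blue endpoints are assumptions.
    """
    level = max(0.0, min(1.0, rms))            # clamp out-of-range input
    radius = 4.0 + level * 36.0                # 4 px when quiet, 40 px when loud
    red = int(round(level * 255))
    blue = int(round((1.0 - level) * 255))
    return radius, (red, 0, blue)
```

Driving the whole grid’s pulsation is then just a matter of scaling every circle’s drawn radius by the current `level` each frame.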
Visual design and art can silently convey aspects of music by letting its architecture exist across time, rather than just the ever-changing present. This can provide insight and a kind of synesthetic experience; however, music’s power is also in its patterns of vibration felt through the ears, skin, and bones.