Audiovisual Live Performance

Laptop/Performativity

Visual performers who put themselves into the performance,[8] such as VJ Miixxy, prompt one of the questions that have arisen as laptop-based computer performance has gained in popularity: because the visual and sonic output of laptop performance often does not correlate with the performer’s visible physical gestures, what is the performer’s visual role? Musical performance on more traditional instruments might be considered audiovisual by nature, owing to the audience’s experience of watching the performers’ gestures integrated with the music. Laptop performers are now beginning to address the question of performativity. Some laptop musicians and visual performers are making their performances more gestural by using traditional computer music interfaces, like MIDI keyboards, or less conventional performance interfaces, such as Wii controllers and Tagtools.[9] Other performers, such as those who practice livecoding,[10] choose to project their screen interface as part of the show so that the audience can observe their actions on the screen. Still others, particularly visual artists, prefer that their performance actions remain invisible so as to focus audience attention on the images and sound. Many VJs, DJs, and laptop musicians feel that the projected visuals themselves take the place of watching a performer make music through gesture. Although most discussion of audiovisual integration in contemporary performance currently focuses on the relationship between the sounds and images generated by the performers, the visual relationship between the performer and the performance is likely to emerge in the near future as another important consideration in the audiovisual experience.

The author of this text performs as VJ Übergeek, a VJ who puts herself into the performance onstage rather than onscreen.  
Commercial product developers have begun to recognize electronic musicians’ and visual performers’ growing interest in gestural interfaces. JazzMutant’s Lemur, for example, is a multitouch interface that lets performers slide their fingers along a touchpad rather than click and scroll with a computer mouse.
Musical instrument digital interface (MIDI) is an industry-standard digital protocol, defined in 1982, that enables electronic musical instruments such as keyboards and controllers to transmit control signals (event messages) to compatible software or hardware. MIDI signals can be used to control the pitch of an audio synthesizer or the intensity of an electronic image by means of physical interfaces such as buttons, knobs, and sliders. Livecoding is the name given to the practice of writing software in real time as part of a performance.
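For readers unfamiliar with how such event messages look in practice, the sketch below builds the raw bytes of two common MIDI 1.0 messages in Python: a Note On, as produced by a key press on a MIDI keyboard, and a Control Change, as produced by a knob or slider. It illustrates only the standard message format; the channel, note, and controller numbers are arbitrary examples, not drawn from any particular performer’s setup.

```python
# A minimal sketch of MIDI 1.0 event messages, for illustration only.
# Each message is one status byte (message type + channel) followed by
# two 7-bit data bytes; these are the "control signals" a keyboard,
# knob, or slider transmits to compatible software or hardware.

def note_on(channel: int, note: int, velocity: int) -> bytes:
    """Build a 3-byte Note On message (e.g. a key press on a MIDI keyboard)."""
    assert 0 <= channel < 16 and 0 <= note < 128 and 0 <= velocity < 128
    return bytes([0x90 | channel, note, velocity])

def control_change(channel: int, controller: int, value: int) -> bytes:
    """Build a 3-byte Control Change message (e.g. a knob or slider position)."""
    assert 0 <= channel < 16 and 0 <= controller < 128 and 0 <= value < 128
    return bytes([0xB0 | channel, controller, value])

if __name__ == "__main__":
    # Middle C at moderate velocity on channel 1, then a mod wheel (CC 1) at maximum.
    print(note_on(0, 60, 100).hex())        # -> 903c64
    print(control_change(0, 1, 127).hex())  # -> b0017f
```

In a live set, bytes like these would be delivered through a MIDI output port, for example via a library such as mido or python-rtmidi, and mapped by the receiving software to a synthesizer parameter or a visual effect.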