Audiovisual Parameter Mapping in Music Visualizations
5 The Live Factor
The original area of application for digital real-time visualizations encompassed both club visuals (VJing) and live performances in the context of electronic music concerts (live cinema). While a club VJ often works alone alongside a DJ, in live cinema visual artists explore specific connections between sound and image in a production collective with audio artists. This close collaboration between musicians and visual artists leads to a particularly favorable alignment of music and visual system, especially in the live context. The use of visuals has since been extended to events of all kinds, from concerts to installations to the design of professional theater performances.
As a rule, the sets for the musical performance and the visuals are carefully prepared in the run-up to a live performance. The esthetics of the visual layer is determined largely by the careful choice of source material, such as photo stills, film sequences, text, geometric objects, and abstract elements; the controllability of these elements and a high degree of interaction play a special role here. Many artists change their sets for each concert and thereby compile a personal toolbox over time, which in practice, and after repeated performances, leads to the development of a whole repertoire of effects. This individual method results in a personal and, in the best cases, innovative style and thus has the added value of artistic exploration.
The individual parameters can be controlled via mouse or keyboard (virtual buttons or actuators on the screen, use of the mouse position data, or key combinations) or with the aid of external devices (to name just a few of the many possibilities: MIDI controllers, joysticks, the Wii Remote, iPhone applications sending OSC, etc.).
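Whatever the device, such control data usually arrives as a raw value that has to be normalized and mapped onto a visual parameter. The Python sketch below shows one plausible way to do this; the parameter names and value ranges ("zoom", "blur") are hypothetical examples chosen for illustration, not taken from any particular VJ software.

```python
# Minimal sketch: normalizing controller input and mapping it to visual parameters.
# The parameters ("zoom", "blur") and their ranges are hypothetical examples.

def midi_cc_to_param(cc_value, lo, hi):
    """Map a 7-bit MIDI CC value (0-127) linearly onto the range [lo, hi]."""
    return lo + (cc_value / 127.0) * (hi - lo)

def osc_to_param(osc_value, lo, hi):
    """Map a normalized OSC float (0.0-1.0) linearly onto the range [lo, hi]."""
    return lo + max(0.0, min(1.0, osc_value)) * (hi - lo)

# Example: a MIDI fader at value 96 drives a zoom factor (1.0-4.0),
# while an OSC float from a phone app (0.73) drives a blur radius (0-20 px).
zoom = midi_cc_to_param(96, 1.0, 4.0)   # ~3.27
blur = osc_to_param(0.73, 0.0, 20.0)    # 14.6
print(zoom, blur)
```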
Not every type of sound lends itself to every visual system and its inherent sound analysis. The complexity of a visual result based on audio data depends on the quantity and quality of the musical parameters analyzed. If only the volume parameter is analyzed, for example, a constant, low, noisy sound with short interposed high-frequency sounds will be difficult to differentiate at the visual level. A more precise frequency analysis of the sound enables partitioning of the sounds into different frequency bands, so that data values for high, mid, and low ranges can be determined and used for the generation of images, as sketched below.
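A minimal sketch of such a band split, using NumPy's FFT on a single mono audio frame, might look as follows; the sample rate and the band boundaries (200 Hz and 2 kHz) are illustrative assumptions rather than fixed conventions.

```python
import numpy as np

# Sketch of a band-energy analysis for one mono audio frame.
# Sample rate and band boundaries are illustrative assumptions.

def band_energies(frame, sample_rate=44100, low_cut=200.0, high_cut=2000.0):
    """Return relative low / mid / high energies of one audio frame."""
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sample_rate)
    low = spectrum[freqs < low_cut].sum()
    mid = spectrum[(freqs >= low_cut) & (freqs < high_cut)].sum()
    high = spectrum[freqs >= high_cut].sum()
    total = low + mid + high + 1e-12          # avoid division by zero
    return low / total, mid / total, high / total

# Example: a low rumble plus a high-frequency tone. A volume-only analysis
# would register roughly constant loudness; the band split separates the two.
t = np.arange(2048) / 44100.0
frame = 0.5 * np.sin(2 * np.pi * 80 * t) + 0.3 * np.sin(2 * np.pi * 6000 * t)
print(band_energies(frame))
```

The three returned values can then be mapped onto visual parameters in the same way as the controller data above.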
During a live performance, the actual interpretation of the audio only takes place once the performer acts as filter and interpreter. Decisions about the use of the control options are made spontaneously and thus lead to a result that is unique and cannot be repeated: visual artists improvise live with their set. Audiovisual performances in electronic music call for a high degree of concentration on the part of the recipient and therefore often last only 35 to 45 minutes.
Classic VJing in a club context is frequently only the first step for artists working with audiovisual design. An increasing differentiation of genres has occurred in recent years. The term sound sculpture, for instance, is also used to describe artistic works that consist of generated sounds and images.[15]
By no means all contemporary artists exploring the connections between sound and image perform in live contexts. Many instead produce multimedia works that are presented at festivals or released on DVD, and/or create installations that unfold their effect in public space. Some institutions and organizations commission generative works of art that are presented on purpose-built objects with screens and loudspeakers.
An increasing use of artificial surfaces as potential screens can be observed, particularly in public space. With the spread of LED technology, numerous buildings and billboards now offer playable surfaces that are also used for artistic interventions.
Works: Advanced Beauty, iPhone, MIDI controller, Wii
People: Matt Pyke
Social Bodies: Universal Everything