Below are resources from the fifth session of the Digital Music Production Workshops at Carnegie Hall's Weill Music Institute provided in partnership with Building Beats.
Audio mixing is the process by which multiple sounds are combined into one or more channels. In the process, the source signals' level, frequency content, dynamics, and panoramic position are manipulated to produce a mix that is more appealing to listeners. Adding audio effects such as reverb, delay, and compression is often a large part of this process.
Audio mixing is practiced for music, film, television and live sound. The process is generally carried out by a mixing engineer operating a mixing console or digital audio workstation.
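The level and panning moves described above can be sketched in a few lines of Python. This is a minimal illustration, not a production mixer: the equal-power pan law used here is one common convention, and all names and parameters are illustrative.

```python
import math

def mix(tracks, gains, pans):
    """Sum mono tracks into one stereo pair.

    tracks: equal-length lists of samples in the range -1.0..1.0
    gains:  linear gain per track (level adjustment)
    pans:   -1.0 = hard left, 0.0 = center, +1.0 = hard right
    """
    n = len(tracks[0])
    left, right = [0.0] * n, [0.0] * n
    for track, gain, pan in zip(tracks, gains, pans):
        # Equal-power pan law: roughly constant loudness across the field.
        angle = (pan + 1.0) * math.pi / 4.0
        left_gain = gain * math.cos(angle)
        right_gain = gain * math.sin(angle)
        for i, sample in enumerate(track):
            left[i] += sample * left_gain
            right[i] += sample * right_gain
    return left, right
```

A DAW does the same summing under the hood, with per-track effect chains inserted before the gains and pans are applied.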
Mastering, a form of audio post-production, is the process of preparing and transferring recorded audio from a source containing the final mix to a data storage device (the master), the source from which all copies will be produced (via methods such as pressing, duplication, or replication). Mastering is a crucial gateway between production and consumption and, as such, involves technical knowledge as well as specific aesthetics. Mastering engineers may also apply corrective equalization and dynamic compression to optimize how the sound translates across playback systems.
An audio engineer is concerned with the recording, manipulation, mixing and reproduction of sound. Many audio engineers creatively use technologies to produce sound for film, radio, television, music, electronic products and computer games. Alternatively, the term audio engineer can refer to a scientist or professional engineer who designs, develops and builds new audio technologies working within the field of acoustical engineering.
Reverberation, or reverb, is the persistence of sound after the original sound is produced. It is created when a sound or signal is reflected, causing a large number of reflections to build up and then decay as the sound is absorbed by the surfaces of objects in the space, including furniture, people, and the air. Reverb is most noticeable when the sound source stops but the reflections continue, decreasing in amplitude until they reach zero.
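The build-up and decay of reflections can be imitated digitally by feeding a signal through several feedback delay loops in parallel. The sketch below is a toy, Schroeder-style reverberator, assuming arbitrary delay lengths and mix levels chosen only for illustration:

```python
def simple_reverb(signal, decay=0.7):
    """Toy reverberator: four parallel feedback comb filters whose
    delayed, decaying outputs stand in for the reflections of a room."""
    delays = [1422, 1491, 1557, 1617]  # delay lengths in samples (illustrative)
    buffers = [[0.0] * d for d in delays]
    positions = [0] * len(delays)
    out = []
    for x in signal:
        wet = 0.0
        for i, d in enumerate(delays):
            reflected = buffers[i][positions[i]]
            # Feed the input plus the attenuated reflection back in, so each
            # pass around the loop decays toward zero amplitude.
            buffers[i][positions[i]] = x + reflected * decay
            positions[i] = (positions[i] + 1) % d
            wet += reflected
        out.append(0.7 * x + 0.25 * wet)  # blend dry signal with reflections
    return out
```

Real reverb plugins add many more stages (allpass diffusion, damping filters), but the principle is the same: dense, decaying copies of the input.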
Delay is an audio effect that records an input signal to an audio storage medium and then plays it back after a period of time. The delayed signal may be played back multiple times, or fed back into the recording, to create the sound of a repeating, decaying echo.
Equalization is the process of altering the frequency response of an audio system using linear filters. Most hi-fi equipment uses relatively simple filters to make bass and treble adjustments; graphic and parametric equalizers offer much more flexibility in tailoring the frequency content of an audio signal. An equalizer is the circuit or equipment used to achieve equalization. Since equalizers "adjust the amplitude of audio signals at particular frequencies," they are, in other words, "frequency-specific volume knobs."
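The simplest frequency-specific volume knob is a first-order low-pass filter, which gently turns down everything above a cutoff frequency (a basic treble control). The sketch below uses the standard discretized RC-filter coefficient; the function name is illustrative:

```python
import math

def one_pole_lowpass(signal, cutoff_hz, sample_rate):
    """First-order low-pass filter: a gentle treble cut above cutoff_hz."""
    # Smoothing coefficient from the standard discretized RC filter.
    alpha = 1.0 - math.exp(-2.0 * math.pi * cutoff_hz / sample_rate)
    y = 0.0
    out = []
    for x in signal:
        y += alpha * (x - y)  # move part of the way toward each new sample
        out.append(y)
    return out
```

Graphic and parametric equalizers combine many sharper, frequency-selective stages of this kind, each with its own gain control.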
Dynamic range compression (DRC), or simply compression, reduces the volume of loud sounds or amplifies quiet sounds by narrowing, or "compressing," an audio signal's dynamic range. Compression is commonly used in sound recording and reproduction, in broadcasting, and on instrument amplifiers. Audio compression reduces loud sounds above a certain threshold while quiet sounds remain unaffected. The dedicated electronic hardware unit or audio software used to apply compression is called a compressor. In recorded and live music, compression parameters may be adjusted by an audio engineer to change the way the effect sounds.
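The threshold-and-ratio behavior can be shown with a deliberately simplified, sample-by-sample sketch. Real compressors track the signal level with an envelope follower and attack/release times; this instantaneous, hard-knee version (with illustrative default parameters) only demonstrates the core math:

```python
def compress(signal, threshold=0.5, ratio=4.0):
    """Hard-knee compression: the portion of each sample's level above
    the threshold is divided by the ratio; quieter samples pass through."""
    out = []
    for x in signal:
        level = abs(x)
        if level > threshold:
            # Only the excess over the threshold is reduced.
            level = threshold + (level - threshold) / ratio
        out.append(level if x >= 0 else -level)
    return out
```

For example, with a threshold of 0.5 and a 4:1 ratio, a sample at 0.9 comes out at 0.6, while a sample at 0.3 is untouched.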
EXERCISES / EXPLORATION: