The Mixdown Landscape

Contributor: Mike Overlin, Manager of Worship Resources
Yamaha Corporation


When it comes to recorded music, the mixdown is the crucial step where the individual recorded tracks of each instrument or vocal are combined to make the finished master.

Much of what we do when we're tracking a song is designed to give us as much flexibility and control as possible in the mixdown stage. For example:

  • We close-mic instruments and isolate them from one another to minimize their interaction.

  • We record our vocals separately so that only the voice (and nothing else) is recorded.


The end result is more control during mixdown, but all of that freedom can be overwhelming.

Think Visual
In addition to being a musician and an audio engineer, I'm also a photographer, and it sometimes helps me to approach the mixdown process visually, imagining a virtual stage where I can place all the elements of my recording.

One of the most powerful aspects of the multitrack recording process is the control we have during mixdown. Isolation of the individual tracks allows us to place them almost anywhere we want. In fact, this freedom makes much of what we do in contemporary music pretty contrived. If you sat in a performance venue and listened to a drummer onstage, the sound would have originated from one spot. But in recording, the way we place the individual elements of our drum kit on our virtual stage makes that kit seem so much larger.

Three Dimensions
So let's look at this virtual stage. When I approach a mixdown, I see a structure where I can place elements of my recording.

First: The X Axis

[Image: X Axis Structure for Recording]


Aspect: Panorama

The most obvious is the left-to-right panorama (that's where we get the term 'pan'). This is the easiest one to understand because it corresponds so directly to the real world.

Using our pan control, we can place our recorded tracks anywhere on this virtual stage with little regard for the physical constraints of reality. Let's look at our drum kit example again. We will typically place the kick and snare in the center of our stage, but will often place the hi-hat to the left or right. This can certainly make for a more interesting listening experience, but for this to occur in the real world, our drummer would have to be playing a hi-hat 30 feet off to the side. The same thing is true with the toms. It is very common to hear a set of toms go from our left speaker to our right speaker in a recording, but to experience that in the live world, the kit would have to be 100 feet across. We are actually duplicating the drummer's experience of playing the kit more closely than what the audience hears. We take the same approach with acoustic piano when we record it in stereo and pan the instrument left and right. This results in a listening experience much closer to that of the pianist than what his or her audience would hear in the real world.
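To make the panning idea concrete, here is a minimal Python sketch of one common pan law, constant-power panning, which keeps a track's apparent loudness steady as it sweeps across the stage. The formula is a standard one, but the track names and pan positions below are purely illustrative, not settings the article prescribes.

```python
import numpy as np

def constant_power_pan(pan):
    """Return (left_gain, right_gain) for a pan position in [-1.0, 1.0].

    -1.0 is hard left, 0.0 is center, +1.0 is hard right. Constant-power
    panning keeps left**2 + right**2 equal to 1, so the apparent loudness
    stays steady as a track moves across the virtual stage.
    """
    angle = (pan + 1.0) * np.pi / 4.0  # map [-1, 1] onto [0, pi/2]
    return np.cos(angle), np.sin(angle)

# Illustrative placements only: kick and snare in the center, hi-hat and tom off to the sides.
for name, pan in [("kick", 0.0), ("snare", 0.0), ("hi-hat", 0.3), ("floor tom", -0.7)]:
    left, right = constant_power_pan(pan)
    print(f"{name:>9}: L={left:.2f}  R={right:.2f}")
```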

Second: The Y Axis

[Image: The Y Axis for Recording]


Aspect: Frequency

I visualize frequency vertically, from bottom to top. All of our recorded tracks have frequency characteristics that allow us to orient them up and down, in addition to the left-to-right placement that panning provides. Every engineer has their own approach to placing instruments and vocals during mixdown. I usually start with the kick drum and bass, with the goal of wrapping the bass around the core of the kick drum; this is the driving force that defines a lot of pop music. I then work my way up through the rest of the kit, placing the toms, snare, hats, cymbals and so on vertically, and horizontally as discussed earlier, panning the different drums wherever I desire on my stage. As you work through the rest of your tracks, you will begin to "see" how instruments and vocals can be placed horizontally and vertically across the virtual canvas of mixdown.
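As a rough illustration of carving out vertical space, the sketch below uses a simple high-pass filter to keep a non-bass track out of the bottom of the spectrum where the kick and bass live. It assumes the numpy and scipy libraries are available; the 120 Hz cutoff and the synthetic "guitar" signal are placeholders chosen for illustration, not values the article recommends.

```python
import numpy as np
from scipy import signal

def high_pass(track, cutoff_hz, sample_rate=48000):
    """Roll off content below cutoff_hz so a track sits higher on the Y axis,
    leaving the bottom of the spectrum to the kick drum and bass."""
    sos = signal.butter(2, cutoff_hz, btype="highpass", fs=sample_rate, output="sos")
    return signal.sosfilt(sos, track)

# A synthetic stand-in "guitar" track: a low A (110 Hz) plus a higher partial (880 Hz).
sample_rate = 48000
t = np.linspace(0.0, 1.0, sample_rate, endpoint=False)
guitar = np.sin(2 * np.pi * 110 * t) + 0.5 * np.sin(2 * np.pi * 880 * t)

# Thin out the low end so it doesn't fight the kick and bass.
guitar_thinned = high_pass(guitar, cutoff_hz=120, sample_rate=sample_rate)
```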

Third: The Z Axis

[Image: The Z Axis for Recording]


Aspect: Depth

The third dimension is the subtler concept of depth, or front to back on our stage. There are a number of tools available to us to give depth to our recordings. One of the most common and obvious tools is reverb. Its main purpose is to give a sense of space to a recording. In the real world, the closer we are to a sound source, the drier and more "in your face" it sounds. As we move away from it, it will begin to take on more of the characteristics of the space around it. You can define how reverberant your virtual stage is. How much reverb you apply to each sound source helps place it on the stage by varying its perceived distance from the listener.

Another subtle way to approach distance and depth is EQ, as sounds very close to us are distinct, with a lot of presence, while those more distant take on a bit of a muffled sound. By varying the tone of a sound source, we can give it the character of close proximity with well-defined presence, or pull down some of the high-end content to give it that muffled, more distant sound. Then, of course, there's volume, with voices and instruments closer to us louder than those farther away.
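Putting those three depth cues together, here is a minimal Python sketch, again assuming numpy and scipy, that pushes a track toward the back of the virtual stage by lowering its level, dulling its top end, and raising its share of reverb. The specific gain and cutoff curves are arbitrary placeholders for illustration, not values from the article.

```python
import numpy as np
from scipy import signal

def place_at_distance(track, distance, sample_rate=48000, reverb_ir=None):
    """Push a track toward the back of the virtual stage.

    distance is a 0.0-1.0 knob: 0.0 sits right at the listener, 1.0 sits at
    the rear of the stage. Three cues are combined: lower level, a duller
    top end, and a bigger share of reverb (if an impulse response is given).
    """
    dry_gain = 1.0 - 0.6 * distance               # quieter as it moves back
    cutoff_hz = 18000.0 - 14000.0 * distance      # duller as it moves back
    sos = signal.butter(2, cutoff_hz, btype="lowpass", fs=sample_rate, output="sos")
    dry = dry_gain * signal.sosfilt(sos, track)

    if reverb_ir is None:
        return dry
    wet = signal.fftconvolve(track, reverb_ir)[: len(track)]  # simple convolution reverb
    return dry + (0.1 + 0.5 * distance) * wet     # more room sound farther back
```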

The Virtual Stage

Recording, like so many of the arts, has rules that are really much more useful as loose guidelines. Some of the most important advances in any art form have come about when someone took a "rule" and bent it, if not completely broke it. You may find that an entirely different approach to mixdown works for you, but this virtual stage works well for me. It's easy for me to approach a recording in this way, placing my tracks like chess pieces on a board. Some of these positioning tricks are especially visible, such as placing instruments to the far right or far left in the mix. Others, like EQ and reverb, are much subtler.

If nothing else, I hope this is a starting point to help you "see" your mixdown as you place the building blocks of your recording and structure your finished master exactly the way you envisioned it.

MIKE OVERLIN is the Manager of Worship Resources for Yamaha Corporation of America, and a worship leader with the Free Methodist Church in Southern California. A professional musician all his life, he served two tours of duty with the United States Air Force Band, earned two gold records for audio engineering and is also an accomplished professional photographer. You can see and hear Mike in action at upcoming Worship Arts Technology Summit (WATS) events, where he leads sessions in the Musician, Audio and Media Tracks.