PHASE

by Rachel Field

 

 

As a mastering engineer, I see the final result of a lot of decisions made over the course of a recording and mix. One really common issue I come across is cancellation and filtering caused by phase incoherence. A lot of times, when discussing this with the recording and/or mix engineer, it becomes apparent that it wasn’t on their radar as something to watch for, and many folks do not know how to effectively manage phase. I’ve seen this come up with DIYers, hobbyists, and professionals alike. I am truly amazed at the sheer number of people who do not know about phase. I’ve decided to crusade against this lack of knowledge and, hopefully, abate this far too common issue! So please read this, and if you learn something from it, share it with your audio friends! I want your mixes to sound better, and I hope I can help!

 

What is Phase? 

Let’s start by digging into the concept of phase. What is it? Why do I need to think about it? What are the problems caused by inattention to phase coherence?

 

In audio, phase is one of the properties of a waveform. It describes where the waveform is in its cycle and is measured in degrees; one full cycle is 360 degrees. If you display a sine wave across a timeline, which is the typical visual representation of a waveform, you will see that it swings positive (above the zero line) and negative (below the zero line) by equal amounts on each cycle. If two signals with coherent phase alignment are added together, the amplitude increases.

 

(Image ©2017-18 Prof. Jeffrey Hass)

 

If you take the same waveform and flip the phase, you will then have negative phase where positive used to be, and vice versa.

 

If you add the original waveform to the flipped waveform, all totals across the timeline will equal zero. This is silence. 
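If you want to see that arithmetic for yourself, here’s a minimal sketch in plain Python (assuming a 48 kHz sample rate and a 440 Hz test tone, both just illustrative choices) that builds a sine wave, flips it, and sums the two:

```python
import math

SAMPLE_RATE = 48000  # samples per second (assumed for illustration)
FREQ = 440.0         # test tone frequency in Hz
N = 480              # 10 ms of audio

# A 440 Hz sine wave and a polarity-flipped (180-degree) copy of it.
original = [math.sin(2 * math.pi * FREQ * n / SAMPLE_RATE) for n in range(N)]
flipped = [-s for s in original]

# Summing the two sample by sample: every total is exactly zero.
summed = [a + b for a, b in zip(original, flipped)]
peak = max(abs(s) for s in summed)
print(peak)  # 0.0 -- complete cancellation, i.e. silence
```

The original tone peaks near full scale, yet the sum never leaves zero, which is exactly the silence described above.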

 

(Image ©2017-18 Prof. Jeffrey Hass)

 

Phase cancellation occurs with any degree of phase incoherence and can manifest as a filtering effect, a strange pulling sensation for the listener, or total cancellation. If the phase-incoherent audio is panned out to the sides, the engineer may not notice, since the two signals are not summed together in that case.
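That filtering effect is a comb filter: a signal summed with a slightly delayed copy of itself boosts some frequencies and notches others. A quick sketch (the 1 ms delay is just an assumed example, roughly a foot of extra air path) shows the gain at different frequencies:

```python
import math

def comb_gain_db(freq_hz, delay_s):
    # Gain of (signal + the same signal delayed by delay_s) at freq_hz.
    # The two copies sum to 2*cos(pi * f * delay) times the original amplitude.
    mag = abs(2 * math.cos(math.pi * freq_hz * delay_s))
    return 20 * math.log10(mag) if mag > 0 else float("-inf")

DELAY = 0.001  # 1 ms between the two paths (illustrative)

print(round(comb_gain_db(250, DELAY), 1))  # in phase at 250 Hz: ~ +3 dB boost
print(comb_gain_db(500, DELAY))            # 180 degrees out at 500 Hz: deep notch
```

Notches land wherever the delay puts the two copies 180 degrees apart, which is why the result sounds like a filter sweeping across the spectrum rather than a simple volume loss.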

 

Common Ways Phase Incoherence Can Happen

 

If you capture two signals of the same source while tracking, it is extremely important to check for phase coherence BEFORE you begin tracking, because phase incoherence happens at the time of recording. For example, let’s say you are recording a kick drum using a combination of two microphones at the head. If those signals are phase aligned with each other, adding one to the other will sound louder and fuller. If they are out of phase, the sum could sound thinner. The closer to 180 degrees out of phase they are, the more of the signal will be canceled when they are added together. Where the waveform is in its cycle (its phase, in degrees) at each microphone is determined by the distance from the source to the diaphragm (and the phase of the source itself).

 

Sound travels through air at a known speed of about 1100 ft/sec, meaning a difference of even a fraction of an inch in distance from the source will affect the relative phase between two microphones. Start thinking about phase while placing your microphones, and make them equidistant from the source whenever you can.
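To put rough numbers on that, here’s a back-of-the-envelope sketch (plain Python, using the ~1100 ft/sec figure above) of how a small placement difference translates into relative phase at a given frequency:

```python
SPEED_OF_SOUND_FT_S = 1100.0  # approximate speed of sound in air

def phase_shift_degrees(path_difference_inches, freq_hz):
    # Extra time the sound takes to reach the farther microphone...
    delay_s = (path_difference_inches / 12.0) / SPEED_OF_SOUND_FT_S
    # ...expressed as a fraction of the waveform's cycle at this frequency.
    return (delay_s * freq_hz * 360.0) % 360.0

print(round(phase_shift_degrees(1.0, 1000.0), 1))  # 1 inch off at 1 kHz: ~27 degrees
print(round(phase_shift_degrees(6.6, 1000.0), 1))  # 6.6 inches at 1 kHz: a full 180 out
```

Note that the same placement error produces more degrees of shift at higher frequencies, which is why phase problems from mic placement so often show up as high-end filtering first.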

 

Another scenario in which to watch for phase coherence is when you are tracking the same guitar part through two different tube amps, or a solid-state amp alongside a tube amp. Different tube amps can run 180 degrees out of phase from each other, depending on how many gain stages are in the amp’s circuitry. In these cases, if your mic diaphragms are the same distance from the cones, flipping the phase of one guitar track will help reduce filtering and/or cancellation.

 

A challenging phase situation arises when close-mic’ing an acoustic grand piano with two microphones. No matter what, some frequencies will hit each diaphragm at different points in their cycles, since you cannot get both mics equidistant from every source (hammers on strings). When this is really bad, you will hear the piano sort of come and go as the player moves up and down the scale, depending on which notes are in or out of phase. Some of this can be mitigated by using each microphone’s polar pattern to reject some of the notes that will be picked up by the other mic. Otherwise, keep playing with the placement until you find a combination that feels acceptable and suitable to you. You can even tailor this to the part (notes) that will be played, if you have that luxury. With the mics panned wide in the stereo image, you may not notice any cancellation (unless on a mono playback system), but there will still be a shift in the feel of the piano for the listener in those places.

 

Consider a speaker. The speaker pushes air out during the positive portion of the sound wave’s cycle and pulls in during the negative part. If two speakers in the same stereo system are wired out of phase from each other, a good portion of the signal gets lost (most notably the low end). And it just sounds… weird and uncomfortable.

 

 

Methods for Checking Your Phase Coherence

 

  1. Bring both signals to center pan. With the faders down, solo both signals. Bring one of the faders up. Listen to how it sounds. Feel how loud it is. Note how much low end is in your signal. Next, bring the other fader up in level. As you bring that fader up, your signal should be getting louder and fuller, not quieter and thinner. 

  2. With both tracks centered, determine each fader position that gives you approximately the same monitoring volume in your speakers, one at a time. Solo both tracks together. Flip the phase on ONE of the tracks and determine which position sounds clearer, fuller, and louder. Again, note the low-end content of your signal before and after summing the two tracks together. If you notice a partial loss, you may want to play with microphone positioning until you notice a more distinct difference, indicating you are either closer to phase coherent or closer to 180 out (if flipped sounds better). If you find that flipped sounds better, record it flipped. Flip it on the preamp if you can, or at some point before your tape machine or DAW.

  3. RECORD a sample of your two tracks of the same source into your DAW and literally LOOK at your two waveforms and see if they follow each other as they rise and fall across the timeline. Of course, you can’t do that if you’re tracking to tape, but it is a good visual confirmation of where you’re at regarding phase. 

  4. Use a phase meter! There are enough free plugins out there that it isn’t that hard to find a good phase meter. If you’re working in analog, some consoles have onboard phase meters. Use them. 
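If you’re curious what a correlation-style phase meter actually computes, its core is just a normalized cross-correlation. Here’s a simplified sketch (real meters window the audio and smooth the reading over time; this one reads a whole buffer at once):

```python
import math

def correlation(a, b):
    # Normalized cross-correlation of two equal-length signals:
    # +1.0 = fully in phase, -1.0 = fully out of phase, near 0 = unrelated.
    num = sum(x * y for x, y in zip(a, b))
    den = math.sqrt(sum(x * x for x in a) * sum(y * y for y in b))
    return num / den if den else 0.0

# A 440 Hz test tone at a 48 kHz sample rate (illustrative values).
tone = [math.sin(2 * math.pi * 440 * n / 48000) for n in range(4800)]

print(round(correlation(tone, tone), 3))                # 1.0  -- coherent
print(round(correlation(tone, [-s for s in tone]), 3))  # -1.0 -- flipped
```

A reading pinned near +1 means your two tracks will reinforce each other when summed; a reading hovering near or below 0 is exactly the warning sign the methods above are listening for.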

 

Best Practices for Phase Coherence

There is a common guideline known as the 3-to-1 rule. It states that if your microphones are at least three times as far from each other as each is from its own source, the spill each mic picks up from the other’s source arrives quiet enough that phase interference becomes negligible. While this is true and a good starting point, I wouldn’t rely solely on that rule. I recommend using your ears as much as possible, as well as the visual metering methods already described.
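The arithmetic behind the rule, assuming simple free-field inverse-square spreading, looks like this:

```python
import math

def level_drop_db(distance_ratio):
    # Free-field inverse-square law: level falls 6 dB per doubling of distance.
    return 20 * math.log10(distance_ratio)

# With mics spaced three times their source distance, the spill path is
# roughly three times longer, so the bleed arrives this much quieter:
print(round(level_drop_db(3.0), 1))  # ~9.5 dB down
```

At roughly 9.5 dB down, the spill is quiet enough that its comb-filter ripple against the direct signal stays small, which is why the rule works as a starting point rather than a guarantee.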

 

Issues Created by Phase Incoherence

Mono Incompatibility

Your mix might sound fine in stereo, but on mono playback systems you could lose key elements of your mix: entire guitar parts, or a substantial portion of your overheads, for example. 

 

Audio Feels Pulled

Another side effect is when an element of a mix feels like it “pulls” the listener’s ears from side to side or feels like a weird pressure in the ear. I don’t know how else to describe it but you’ll know it when you feel it.

 

A Literal Broken Record

If you are planning to put your album on vinyl, phase issues can affect how well a cutting lathe can cut grooves into a lacquer. If the phase incoherence is in the low end, the cutting stylus can sometimes be forced to cut so deep that the groove becomes wide enough to intersect an adjacent groove, which basically creates an endless repeating loop (a literal broken record). Or the stylus can be forced to cut very shallow in places, which kicks the playback needle out of the groove and causes a skip in playback. 
 

Loss in Mastering Quality

Finally, substantial phase incoherence can impose limitations on, you guessed it, your Mastering Engineer! Sometimes we implement tools that allow us to process side information separately from center information. This is much harder to do successfully when there are a lot of phase problems in the recording. Phase incoherence can make it all but impossible to retain any “punch” in drums, and sometimes can even impact how loud your mastered audio will sound. Once phase is locked into your mix, we can’t correct it, we can only mitigate it, which comes with its own set of compromises. It just creates an unnecessary obstacle in allowing mastering to really make your mixes come to life.


 

It’s important to note that there are times when phase incoherence is used intentionally to manipulate a sound or create an effect in the mix. That is NOT what this post is discussing, and if you’re using phase in a deliberate way that makes you stoked on your mixes, then rock on! Let us know about it in the comments, and we can discuss it more in an upcoming post.

 

Meanwhile, give these tips a try, and let me know how it goes! Reach out if you have any questions.

 

Happy Phase Coherent Recording!