Phase

xiaoken

New member
I've done a couple of recordings before, and in terms of phase, I didn't hear anything wrong.

That could just be my inexperienced ears.

So what are the methods you use to detect phase in your recordings?
 
My understanding is that phase is a consideration when:

- using multiple microphones to record a source

- duplicating a track, adding effects, then adding the new track to the mix


When using more than one microphone to record a single source, there is bound to be more time delay (time delay = phase shift) in one of the signal paths, and when the two signals are mixed there can be an audible "thinning" or "thickening" of the sound as the signals either subtract from or add to each other. That's why, when using multiple microphones, folks recommend moving one of the mics around while listening for phase cancellations (the sound starts to thin out).
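
If you want to put rough numbers on that, here's a quick back-of-the-envelope sketch in plain Python (the 343 m/s speed of sound and the path differences are just assumed example figures, not measurements): a path-length difference between the mics becomes a delay, and the first deep cancellation lands at the frequency whose half-period equals that delay.

# Rough numbers for the two-mic case: an extra path length becomes a time
# delay, and mixing the two signals puts the first deep notch at the
# frequency whose half-period equals that delay.
SPEED_OF_SOUND = 343.0  # m/s -- assumed value for room-temperature air

for extra_path_cm in (5, 10, 30, 60):
    delay_s = (extra_path_cm / 100.0) / SPEED_OF_SOUND
    first_notch_hz = 1.0 / (2.0 * delay_s)   # frequency that ends up 180 degrees out
    print(f"{extra_path_cm:3d} cm extra path -> {delay_s * 1e3:.3f} ms delay, "
          f"first notch near {first_notch_hz:,.0f} Hz")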

The second situation I mentioned is similar to the first in that the time delay induced by signal processing gear can result in two signals that, when mixed, either reinforce or subtract from each other. Once again, let your ears be your guide.

I've probably only skimmed the surface...
 
You can't "detect phase," but you can hear phase differences between two signals that are basically the same. When two or more such signals are out of phase with one another, they cause destructive interference, which amounts to a reduction in the amplitude of the combined signal.

If you have a software app like Sound Forge or SONAR, try this: record a track of something -- almost anything will do. Make a copy of it to a second track. Now take one of them and slide it forward in the track a little bit. Listen to it. Repeat.

As the tracks get out of phase with each other, you should hear what you recorded start to sound washed out, thin, less present, not as loud and crisp... If you can get them very close to 180 degrees out of phase, you should hear virtually nothing if the tracks are truly identical -- complete destructive interference. Better yet, if you can invert waveforms in your audio software (I'm sure you can in Sound Forge, not sure about SONAR), you can do this by leaving the two copies lined up perfectly and inverting one of them.
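
If you'd rather cook this up outside of Sound Forge or SONAR, here's a minimal sketch of the same experiment using Python with numpy and the standard-library wave module -- the noise test signal and the output file names are placeholders I've made up, not anything from a real session: duplicate a signal, slip the copy by a few samples, sum the two, and write the result out so you can listen for the washed-out sound.

import wave
import numpy as np

SR = 44100                                   # sample rate in Hz
rng = np.random.default_rng(0)
dry = rng.uniform(-0.5, 0.5, SR * 2)         # 2 s of noise standing in for a recorded track

def write_wav(path, signal, sr=SR):
    """Write a mono float signal in the -1..1 range as 16-bit PCM."""
    pcm = (np.clip(signal, -1.0, 1.0) * 32767).astype(np.int16)
    with wave.open(path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(sr)
        w.writeframes(pcm.tobytes())

for shift in (0, 8, 32, 128):                # slide the copy by this many samples
    copy = np.roll(dry, shift)               # crude slide (wraps at the ends -- fine for a quick listen)
    mix = 0.5 * (dry + copy)                 # sum the original and the slipped copy
    write_wav(f"mix_shift_{shift}.wav", mix)
    print(f"shift {shift:4d} samples (~{shift / SR * 1000:.2f} ms) -> mix_shift_{shift}.wav")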
 
Actually, if your DAW/software is properly designed, only inverting POLARITY will completely cancel any sound from the two identical (but one channel inverted) tracks.

Sliding one track in time will NEVER cancel out ALL the sound, because each frequency requires a DIFFERENT amount of time slip in order to be 180 degrees out of phase. As you time-slip one track, some frequencies will get louder due to addition of in-phase sounds, while other frequencies will get quieter due to cancellation of out-of-phase sounds. The term normally used for this phenomenon is comb filtering, because the graph of the resultant frequency response looks like the teeth of a comb.
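
If you want to see those teeth as numbers, the gain of "a signal plus a copy of itself delayed by tau" works out to |1 + e^(-j*2*pi*f*tau)| = 2|cos(pi*f*tau)|: nulls wherever f*tau is an odd multiple of 1/2, peaks where it's a whole number. A tiny numpy sketch, assuming an example 1 ms delay:

import numpy as np

tau = 0.001                                  # 1 ms between the two copies (example value)

# Summing a signal with a copy delayed by tau gives |H(f)| = |1 + e^(-j*2*pi*f*tau)|
freqs = np.array([100.0, 250.0, 500.0, 1000.0, 1500.0, 2000.0, 3000.0, 5000.0])
gain = np.abs(1.0 + np.exp(-2j * np.pi * freqs * tau))

for f, g in zip(freqs, gain):
    print(f"{f:6.0f} Hz -> x{g:.3f} ({20 * np.log10(max(g, 1e-9)):+7.1f} dB)")

# The "teeth" of the comb -- the nulls -- sit at odd multiples of 1/(2*tau):
print("notches near:", ", ".join(f"{(2 * k + 1) / (2 * tau):.0f} Hz" for k in range(4)))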

The higher the frequency, the less change in either mic location or track slip is required to cause a noticeable phasing effect. This is because higher frequencies have shorter wavelengths, so it takes less shifting in time to cause that particular frequency to go through a 180 degree phase shift.
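
To put numbers on that: a 180-degree shift at frequency f needs a slip of half a period, 1/(2f) seconds -- which, assuming sound in air at about 343 m/s, is only a couple of centimetres of mic movement by the time you're up around 10 kHz.

SPEED_OF_SOUND = 343.0                       # m/s, assumed for room-temperature air

for f in (100, 1000, 5000, 10000):
    half_period_ms = 1000.0 / (2.0 * f)                       # time slip for a 180-degree shift at f
    half_wavelength_cm = 100.0 * SPEED_OF_SOUND / (2.0 * f)   # the same shift expressed as a mic move
    print(f"{f:6d} Hz: {half_period_ms:7.3f} ms slip, or roughly {half_wavelength_cm:6.1f} cm of mic travel")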

The confusion on this subject is rampant, fed at least partly by the fact that some console manufacturers and software manufacturers, as well as some seasoned pros, STILL say "phase" when they really mean "polarity" - I still see consoles with the polarity switch labeled "phase" -

The POLARITY switch on an input channel WILL change the PHASE of every frequency at the input by 180 degrees, BUT time-slipping the input in order to cause a 180 degree phase change at a SPECIFIC frequency will NOT change the phase of ALL frequencies by 180 degrees, because the POLARITY switch changes NOTHING in TIME, only POLARITY. Each event in a complex audio signal is STILL happening at exactly the same TIME when you flip POLARITY, it's just going negative instead of positive. Re-read this last paragraph and think about it -
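
Here is that point as a quick numpy sketch (a made-up test signal, not anybody's console or plugin code): flipping polarity and summing leaves exactly nothing, while a time slip chosen to put one particular frequency 180 degrees out leaves plenty of signal behind.

import numpy as np

SR = 44100
t = np.arange(SR) / SR
rng = np.random.default_rng(1)
# A "complex" signal: a few sine components plus a little noise
sig = (np.sin(2 * np.pi * 220 * t)
       + 0.5 * np.sin(2 * np.pi * 1000 * t)
       + 0.25 * np.sin(2 * np.pi * 3300 * t)
       + 0.1 * rng.standard_normal(SR))

def rms(x):
    return float(np.sqrt(np.mean(x ** 2)))

# Polarity flip: every sample negated, nothing moved in time -> exact cancellation
print("polarity flip residue:", rms(sig + (-sig)))

# Time slip: roughly half a period of 1000 Hz, so the 1000 Hz component is ~180 degrees
# out, but every other frequency is shifted by some other angle and survives the sum
shift = round(SR / (2 * 1000))               # about 22 samples at 44.1 kHz
print("time slip residue:   ", rms(sig + np.roll(sig, shift)))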

There, that aughta bring on some spirited discussion... Steve
 
knightfly,

I was going to say, "well, if any arbitrary waveform can be represented by an infinite sum of sine waves, then you could slide them left or right relative to each other and..."

and of course then it hits me that the component sine waves are all different frequencies...

You are right, of course. Inverting the data -- switching the polarity -- is the only way to get the interference to be completely destructive unless you're recording a simple sine wave...

Sorry, guys, for the faulty suggestion... but you can still perform the experiment and listen to how a shift between the files sounds...

One more thing to mention -- of course, you might even be seeking the thin, odd comb-filtered sound of a simple phase difference as a special effect... and we're all familiar with what happens if you shift these copies in time so that the phase shift keeps changing -- you get some very neat and desirable swirling effects -- flanging, phase shifting...
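
That last effect is easy to rough out yourself: sweep the delay between the two copies over a few milliseconds and you have a bare-bones flanger. A minimal sketch (nearest-sample delay only -- a real flanger would interpolate, and the noise input is just a stand-in for an actual track):

import numpy as np

SR = 44100
n = SR * 2
t = np.arange(n) / SR
rng = np.random.default_rng(2)
dry = 0.5 * rng.standard_normal(n)                    # noise standing in for a real track

max_delay_s = 0.003                                   # sweep the copy over 0..3 ms
lfo_hz = 0.25                                         # how fast the sweep moves
delay = (0.5 * max_delay_s * SR *
         (1.0 + np.sin(2 * np.pi * lfo_hz * t))).astype(int)

read_idx = np.clip(np.arange(n) - delay, 0, n - 1)    # where the moving copy reads from
flanged = 0.5 * (dry + dry[read_idx])                 # dry + constantly-slipping copy
# write `flanged` out with a WAV writer (like the one a few posts up) to hear the sweep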
 
Damn, that was the shortest argument about phase I've seen here in a while - gonna have to start posting drunk again... Steve
 
My current favorite targets are people who have jobs for years and years yet cannot actually do what they supposedly are paid for, but nobody in the management chain is willing to deal with it... and people who get cracked audio software and then expect the folks here to tell them how to do every little thing
 