I don't understand phase distortion

DM1

New member
Specifically, when introduced by an EQ, what does it sound like?

I know what a phaser or phase-shift effect sounds like.

I know how music sounds through my monitors when I flip the polarity on one of them.

I even understand the basic principle behind EQ, that a signal is mixed with an out-of-phase version of itself to affect certain frequencies.

But for the life of me, I can't get my head around how phase distortion occurs. The EQ only spits out a single signal. Isn't that signal just the result of mixing the original version with the phase-shifted version? Is phase distortion a quality of that sound? Or is it something that happens because the EQ on a track has time-shifted its input relative to the rest of the tracks? (Which, really, I know it isn't, but I'm trying to get my head around it here.)

Can anyone shed some light on this for me? Or maybe better yet, is there an EQ operation that's particularly sensitive to phase distortion? Maybe if I could try a contrived example myself, the effect would become clear.
 
Any waveform can be constructed as a sum of sine waves of different amplitudes, frequencies, and phases. Phase distortion occurs when some frequencies get through a system at a different time than other frequencies, modifying the phase relationship between the frequencies.

Consider a plucked guitar string. The fundamental tone and its harmonics have a phase relationship of 0°, i.e., they act like they all started at the same time. If you do some processing like EQ, it may be that some harmonics get phase-shifted a bit (probably delayed), so the resulting waveform has some differences that are a result of the phase changes. That would be phase distortion.
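
If it helps, here's a quick sketch of that idea in Python (NumPy/SciPy assumed, purely for illustration, not anything from a real EQ): run a harmonic-rich tone through a first-order allpass filter, which leaves every frequency at the same level but shifts each one by a different amount, and the waveform shape changes even though nothing was boosted or cut.

```python
import numpy as np
from scipy.signal import freqz, lfilter

fs = 48000
t = np.arange(0, 0.02, 1 / fs)

# Stand-in for a plucked string: fundamental plus harmonics, all starting in phase
f0 = 220.0
x = sum((1.0 / k) * np.sin(2 * np.pi * k * f0 * t) for k in (1, 2, 3, 4, 5))

# First-order allpass H(z) = (c + z^-1) / (1 + c z^-1): flat magnitude response,
# but a frequency-dependent phase shift (i.e., a frequency-dependent delay)
c = -0.6
y = lfilter([c, 1.0], [1.0, c], x)

w, h = freqz([c, 1.0], [1.0, c], worN=8, fs=fs)
print("allpass magnitude at a few frequencies:", np.round(np.abs(h), 6))  # all 1.0
print("input waveform peak: ", round(float(np.max(np.abs(x))), 3))
print("output waveform peak:", round(float(np.max(np.abs(y))), 3))  # shape has changed
```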
 
I imagine digital plugins and such have accounted for phase distortion in the way they work?
Or perhaps not, since the goal of a lot of plugins is to sound as analog as possible. But then there are the plugins that attempt to be as technically correct as possible.

But yeah, as apl said: phase distortion caused by time delay. The time it takes for the filtered and inverted signal to be added back to the output waveform will cause a tiny bit of phase distortion on that band.
How long this takes (and therefore how severe the distortion is) is not consistent among different equalizers. In fact, different bands on the same EQ might have different amounts of delay, causing different phase distortions.
Not to mention that different amounts of boost/cut cause more or less extreme phase distortion; a band set to ±0 dB should cause none.
Also, there are parallel-filter EQs that have less phase distortion than series-filter EQs.
This is partly why some high-quality equalizers sound much better than cheap equalizers.
Finally, keep in mind that while it is distortion, it is not like harmonic distortion, nor does it cause the extreme non-linearities that harmonic distortion does (commonly caused by a clipped signal, an overdriven preamp, etc.). It's not usually "bad" sounding; it's more a lack of the "punch", "clarity", or "luster" that a high-quality EQ could bring out. Or simply start with a good signal at the instrument/microphone level.
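
As a rough illustration of the boost/cut point (this uses the well-known "RBJ cookbook" peaking biquad, not any particular product's filter, so treat it as a sketch): the phase shift a peaking band introduces grows with the amount of boost and disappears at 0 dB.

```python
import numpy as np
from scipy.signal import freqz

def peaking_biquad(f0, q, gain_db, fs):
    """Biquad coefficients for a peaking EQ band (RBJ Audio EQ Cookbook)."""
    A = 10 ** (gain_db / 40.0)
    w0 = 2 * np.pi * f0 / fs
    alpha = np.sin(w0) / (2.0 * q)
    b = [1 + alpha * A, -2 * np.cos(w0), 1 - alpha * A]
    a = [1 + alpha / A, -2 * np.cos(w0), 1 - alpha / A]
    return np.array(b) / a[0], np.array(a) / a[0]

fs = 48000
for gain_db in (0, 3, 6, 12):
    b, a = peaking_biquad(1000.0, 1.0, gain_db, fs)   # 1 kHz band, Q = 1
    w, h = freqz(b, a, worN=4096, fs=fs)
    worst = np.degrees(np.max(np.abs(np.angle(h))))   # worst-case phase shift
    print(f"{gain_db:+3d} dB: worst-case phase shift {worst:5.1f} deg")
```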
 
In MATLAB we occasionally used a function called FILTFILT that would run a signal through a digital filter forwards and then backwards so that there would be zero phase distortion. Not a concept that can be applied on the fly.
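
For anyone who wants to try it outside MATLAB, SciPy has an equivalent filtfilt. A minimal sketch (NumPy/SciPy assumed) of why the forward-backward trick is zero-phase but can't run in real time:

```python
import numpy as np
from scipy.signal import butter, lfilter, filtfilt

fs = 48000
b, a = butter(4, 500, btype="low", fs=fs)  # 4th-order low-pass at 500 Hz

x = np.zeros(2001)
x[1000] = 1.0  # impulse in the middle of the buffer

y_causal = lfilter(b, a, x)   # ordinary (real-time) filtering
y_zero = filtfilt(b, a, x)    # forward + backward filtering, needs the whole signal

print("impulse at sample 1000")
print("causal low-pass peaks at sample   ", int(np.argmax(np.abs(y_causal))))  # later
print("zero-phase output peaks at sample ", int(np.argmax(np.abs(y_zero))))    # about 1000
```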
 
D,

> I can't get my head around how phase distortion occurs. <

First, phase shift is not the same thing as distortion. Distortion implies non-linearity, such as overdriving an amplifier or some other mechanism that creates new frequency components not present in the original signal. Phase shift is entirely linear, so it's not distortion even if some people use that term.

Here's a mini-article I wrote about EQ and phase shift, and maybe you'll find it useful:

www.ethanwiner.com/EQPhase.html

--Ethan
 
apl,

> Consider a plucked guitar string. The fundamental tone and its harmonics have a phase relationship of 0°, i.e., they act like they all started at the same time. If you do some processing like EQ, it may be that some harmonics get phase-shifted a bit <

You obviously have never looked at a plucked guitar string on an oscilloscope! :D

The phase relationship between a vibrating string's fundamental and harmonics is constantly changing. This is why it's so difficult to make a reliable guitar pitch-to-MIDI converter. The reason the relationship changes is that the string stretches slightly toward the extremes of each fundamental excursion, and that raises the pitch a bit for the harmonics. Each harmonic does likewise (to a lesser and lesser extent), which further shifts all subsequent higher harmonics.
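
A rough numerical illustration of that drift (toy numbers, nothing measured): if a string's second partial sits slightly sharp of exactly twice the fundamental, its phase relative to the fundamental keeps sliding for as long as the note rings.

```python
import numpy as np

f0 = 110.0        # fundamental (Hz)
f2 = 2.003 * f0   # 2nd partial, made slightly sharp to mimic string stretch (made-up amount)

for t in (0.0, 0.1, 0.2, 0.5, 1.0):  # seconds after the pluck
    # how far the 2nd partial has rotated relative to "exactly 2x the fundamental"
    drift_deg = np.degrees((2 * np.pi * (f2 - 2 * f0) * t) % (2 * np.pi))
    print(f"t = {t:3.1f} s: relative phase has drifted {drift_deg:6.1f} deg")
```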

--Ethan
 
Ethan Winer said:
apl,

> Consider a plucked guitar string. The fundamental tone and its harmonics have a phase relationship of 0°, i.e., they act like they all started at the same time. If you do some processing like EQ, it may be that some harmonics get phase-shifted a bit <

You obviously have never looked at a plucked guitar string on an oscilloscope! :D

The phase relationship between a vibrating string's fundamental and harmonics is constantly changing. This is why it's so difficult to make a reliable guitar pitch-to-MIDI converter. The reason the relationship changes is that the string stretches slightly toward the extremes of each fundamental excursion, and that raises the pitch a bit for the harmonics. Each harmonic does likewise (to a lesser and lesser extent), which further shifts all subsequent higher harmonics.

--Ethan

Thanks for the info, Ethan. I was trying to think of something simple to explain the phenomenon. If I had all the livelong day, I'd build up a square wave and start sliding the upper harmonics back in time, and the phase distortion would be obvious.

Always an honor to get your attention!
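
For anyone who wants to try that experiment, here's a minimal sketch (NumPy assumed, with an arbitrary 0.5 ms slide): build a square wave from its odd harmonics, then delay everything above the 3rd harmonic. The harmonic amplitudes are untouched, but the waveform no longer looks square.

```python
import numpy as np

fs = 48000
t = np.arange(0, 0.02, 1 / fs)
f0 = 100.0

def partial(k, delay=0.0):
    # k-th odd harmonic of an ideal square wave, optionally slid back in time
    return (4 / (np.pi * k)) * np.sin(2 * np.pi * k * f0 * (t - delay))

harmonics = (1, 3, 5, 7, 9, 11, 13, 15)
square = sum(partial(k) for k in harmonics)
slid = sum(partial(k, delay=0.0005 if k > 3 else 0.0) for k in harmonics)

# Same harmonic amplitudes either way, but the re-timed version has a
# visibly different shape and a different peak level.
print("original square-wave peak:", round(float(np.max(np.abs(square))), 3))
print("slid-harmonics peak:      ", round(float(np.max(np.abs(slid))), 3))
```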
 
BRIEFCASEMANX said:
isn't any change in the signal a "distortion"? Maybe not a harmonic distortion.....
Yes.
Dictionary.com has this to say about distortion:

1. An undesired change in the waveform of a signal.
2. A consequence of such a change, especially a lack of fidelity in reception or reproduction.

So yes, distortion does not have to be harmonic distortion.

BUT when someone refers to distortion, it is really common that they actually mean harmonic distortion of some sort.
 
That's the problem with using the term 'phase distortion'. To people who don't know what it is, it sounds like you have overdriven the phase into harmonic distortion, when all you are doing is changing the phase relationship between different parts of the waveform.

Same thing with crossover distortion.
 
TS,

Not to be argumentative, just to keep things in perspective:

> 1. An undesired change in the waveform of a signal. <

Phase shift may change the waveform, but not in an undesirable way. Why? Because phase shift alone is not audible. It becomes audible only when you combine the original and shifted versions, and then what's audible is the change in frequency response.
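
A quick back-of-the-envelope sketch of that point (NumPy assumed, with a hypothetical 1 ms shift): a delayed copy on its own sits at 0 dB at every frequency, but summed with the original it produces the familiar comb-filtered frequency response.

```python
import numpy as np

delay = 0.001  # 1 ms shift between the two copies (hypothetical value)

for freq in (250, 500, 750, 1000):
    # level of 0.5 * (original + delayed copy) at this frequency
    mag = abs(0.5 * (1 + np.exp(-2j * np.pi * freq * delay)))
    if mag < 1e-9:
        print(f"{freq:4d} Hz: complete cancellation (null)")
    else:
        print(f"{freq:4d} Hz: {20 * np.log10(mag):5.1f} dB")
```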

Orban is a company that caters to the broadcast market, and one of their products is a "phase rotator" that shifts the phase intentionally for announcers to allow transmitters to put out a louder signal without distorting. So that's one application of phase shift that doesn't affect the sound quality.

> 2. A consequence of such a change, especially a lack of fidelity in reception or reproduction. <

Likewise, phase shift will not change fidelity, so to me this excludes it from that definition.

--Ethan
 
Ethan Winer said:
Why? Because phase shift alone is not audible.

You should definitely not work in Monster Cable's marketing department!

Hidden Agenda: To poke fun at Monster Cable's claims.
 
BRIEFCASEMANX said:
What are their claims?

That there are huge fidelity gains to be made by using schmantzy speaker wires.

HA: Dodging the question because I don't really want to get into it; I grew up diggin' stereos back in the day, when the best amps didn't have a way to hook up cables bigger than 12 gauge anyway.
 
BRIEFCASEMANX said:
What are their claims?
My favorite is how they label the input and output of the cable because the signal flows better in one direction than the other. Audio is AC, it 'moves' in both directions. Complete BS.
 
Farview said:
My favorite is how they label the input and output of the cable because the signal flows better in one direction than the other. Audio is AC, it 'moves' in both directions. Complete BS.
I also label my socks L and R :D

G.
 
Ethan Winer said:
TS,

Not to be argumentative, just to keep things in perspective:

> 1. An undesired change in the waveform of a signal. <

Phase shift may change the waveform, but not in an undesirable way. Why? Because phase shift alone is not audible. It becomes audible only when you combine the original and shifted versions, and then what's audible is the change in frequency response.

Orban is a company that caters to the broadcast market, and one of their products is a "phase rotator" that shifts the phase intentionally for announcers to allow transmitters to put out a louder signal without distorting. So that's one application of phase shift that doesn't affect the sound quality.

> 2. A consequence of such a change, especially a lack of fidelity in reception or reproduction. <

Likewise, phase shift will not change fidelity, so to me this excludes it from that definition.

--Ethan
Ethan, not to be argumentative here either. :)
Yeah, that was my only beef with that definition. Distortion can be desirable. Like... umm, a Marshall stack, maybe?
And yeah, a lot of distortions are impossible to detect by ear.

I always thought distortion (or to distort something) means to change, skew, or stretch something to make it different from the original. Whether it's desirable is irrelevant.

So yeah, I agree with you, but kind of not, I suppose.
 
> I always thought distortion (or to distort something) means to change, skew, or stretch something to make it different from the original. Whether it's desirable is irrelevant. <

Right. But if a change is inaudible, even if you can see the waveform squiggle in your DAW shift a little, then it's not really "different." Often a specialty field, like audio, will adopt a more specific meaning for a word that's already in common use. I think that's the case here.

--Ethan
 
apl,

> You should definitely not work in Monster Cable's marketing department! <

No kidding!

This is one of my all-time favorite lines:

The most important person in a company that sells audiophile cables is the head of marketing.

:D

--Ethan
 
Farview said:
My favorite is how they label the input and output of the cable because the signal flows better in one direction than the other. Audio is AC, it 'moves' in both directions. Complete BS.
Actually, if you consider that the RED active wire is doing all the pushing and pulling of the signal, having the wires back to front will invert the sound. Even worse, having them right on one speaker and wrong on the other will give you out-of-phase playback... We test masters with such configurations for the benefit of the unknowing consumer...
 