Question About Low/High Pass Filters

Help Help

Hello friends!

A little background: I've been mixing a somewhat busy mix. Everything sounds great, or good enough, on the monitors I'm using, but when I check it on other sources there's sometimes a lack of clarity in the vocals, as the song calls for a fairly bright piano (I've EQ'd the vocals and the high end, but I don't want any more vocal high end unless it's my only option). The bass seems to be the thing in the way, even though everything else is nicely placed and the bass isn't particularly loud. So one thing I did was put a low-pass filter on the bass guitar. The bass guitar is maybe the least integral part of this particular song; it's more a background thing, and it wasn't in the original mix. I kept pushing the filter down until I had it low-passed at 2 kHz.

One thing I've just realized is that when I take the low-pass filter off the bass, or bring it up to 6 kHz from 2 kHz, the vocals somehow become much brighter. This is strange to me, as I assumed it would be the other way around. Is what's happening that, by bringing the LPF too low, all the bass energy is being pushed into the main frequencies instead of being spread out? Does the energy somehow remain the same, so that without a low-pass filter there isn't an excess of bass energy or signal smothering the higher frequencies? I can't seem to find this specific thing talked about much, and it's long confused me and been something I've wondered about.

I've always been a little confused by filters. Another example, from the other side: a high-pass filter can make something seem louder, and at times even make an instrument actually meter louder with the fader in the same place. The only explanation I've come up with is that by taking out some of the anchoring low-end mud, there's more room for the other, more prominent frequencies. But that doesn't really make sense. Wouldn't the loss of lower frequencies always lower the volume being read, even if it made the track seem louder?

I use Cubase, in case that has anything to do with it.

Please forgive my ignorance! Or if I made it sound more confusing than it is. I greatly value the knowledge that is shared in these parts. Thank you for any moment you have to spare!
 
Are you looking at the frequencies when applying EQ? I primarily use ReaEQ (in Reaper), which has a graphic interface, so I can see the EQ curve as well as the track's frequency spectrum. Some low-pass or high-pass filters actually have a 'bump' where the volume of a certain frequency range is boosted nearest the cut-off point as the Q is changed.
 
Filters change the phase of different frequencies so that the various harmonics add up differently. That will change the peak levels somewhat, and can often make them higher. If you use something like a clipper or saturator to guarantee some maximum peak level, and then put a filter after it, you will very likely see peaks higher than the clipper's limit.
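A toy sketch of this in plain Python (not any particular plugin): take a fundamental plus its 3rd harmonic and shift only the harmonic's phase, the way a minimum-phase filter would. The peak level changes noticeably even though the RMS energy stays exactly the same.

```python
import math

N = 1000  # samples over one fundamental period

def signal(n, phase):
    """Fundamental plus a 3rd harmonic whose phase is shifted by `phase` radians."""
    t = 2 * math.pi * n / N
    return math.sin(t) + 0.5 * math.sin(3 * t + phase)

def peak(phase):
    return max(abs(signal(n, phase)) for n in range(N))

def rms(phase):
    return math.sqrt(sum(signal(n, phase) ** 2 for n in range(N)) / N)

# Same two harmonics, different relative phase:
print(peak(0.0))                   # ~1.08
print(peak(math.pi / 2))           # ~1.39 -- higher peak from phase shift alone
print(rms(0.0), rms(math.pi / 2))  # both ~0.79 -- energy unchanged
```

So a clipped signal whose peaks sat exactly at the ceiling can come out of a downstream filter with peaks well above it, purely from the phase rotation.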

But it really is removing energy, not redistributing it the way the OP seems to imply. Individual frequencies in the pass band should be at the same level they were before the filter, so we would expect the overall average level to go down.
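A quick way to convince yourself of this (a plain-Python sketch; the one-pole filter here is far cruder than any EQ plugin): run broadband noise and a low-frequency tone through a simple low-pass. The tone in the pass band keeps essentially its level, while the noise loses the energy that sat above the cut-off.

```python
import math
import random

def one_pole_lowpass(x, a=0.1):
    """y[n] = a*x[n] + (1 - a)*y[n-1]: a crude low-pass with unity gain at DC."""
    y, prev = [], 0.0
    for s in x:
        prev = a * s + (1 - a) * prev
        y.append(prev)
    return y

def rms(x):
    return math.sqrt(sum(s * s for s in x) / len(x))

random.seed(0)
noise = [random.uniform(-1.0, 1.0) for _ in range(20000)]          # broadband
tone = [math.sin(2 * math.pi * n / 1000) for n in range(20000)]    # well inside the pass band

print(rms(one_pole_lowpass(noise)) / rms(noise))  # well under 1: broadband energy removed
print(rms(one_pole_lowpass(tone)) / rms(tone))    # ~1.0: pass-band level unchanged
```

Nothing gets "pushed" anywhere; the stopped band is simply gone, and everything below the cut-off measures the same as before.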

Now, that's all assuming a linear path after the filter. If you're slamming into a compressor, then yes, removing energy (especially in the low end) means the compressor doesn't see as much input level, so it doesn't compress as much, and the remaining frequencies can come out louder.

I wonder, though, if the OP isn’t just seeing the effects of masking and what we might call timbral relativity. Bright things always sound brighter next to something a little duller, and if there’s just a bunch of treble everywhere, it can start to be tough to recognize the contribution of any individual source.
 
use linear phase eq's wherever possible,
and especially at the mix bus or mastering stage.
 
Thank you guys so much for taking the time to respond, it is greatly appreciated!

Yeah, I've thought about much of what you're saying here. I added some extra examples, but I guess my main question is the one about the bass. Can anyone explain why, given the exact same everything going into an EQ post-compression and all other processing, leaving the low-pass higher up (6 kHz and above) makes the vocals brighter and, to my ears, gives a wee bit more room for the piano? Because what you say here, ashcat, is how I've always worked:

I wonder, though, if the OP isn’t just seeing the effects of masking and what we might call timbral relativity. Bright things always sound brighter next to something a little duller, and if there’s just a bunch of treble everywhere, it can start to be tough to recognize the contribution of any individual source.

When I make the bass duller in this instance (low pass around 2 kHz), the other things get less bright and the bass seems to take up more space. When I make it brighter by moving the low pass to 6 kHz, or taking it off entirely, the bass seems to take up less space, and the vocals and other instruments more. This is what's confusing me. Of course, I can work with this and just use my ears, but I'm about to send something to mastering and hate feeling like I'm guessing, only to hear later what was really going on. In general, I'd just really like to know for the future. The only theory I've come up with is that by removing the higher frequencies from the bass, the low frequencies are somehow more forward and muddying up the mix a bit. But I don't know whether that's a real thing, and so I was wondering if anyone else did.

You guys have a lot more experience than me, and if this doesn't gel with anything you know of, then perhaps there's some other detail I'm missing and there's no real answer anyone can give me. But, as the EQ is last in the chain, it's hard for me to see what that could be. If anyone has any guesses, I'd be much obliged.

Thanks guys!
 
use linear phase eq's wherever possible,
and especially at the mix bus or mastering stage.

Interesting, I don't know much about that. I've been using the stock EQ in Cubase 10 Elements; I'm not sure whether that is a linear-phase EQ. For slight additional cuts and stuff, I'm also using the Renaissance EQ a bit earlier in the chain, but it's the Cubase stock EQ where I'm experiencing the confusion with the low-pass filter. That could be an important detail for me to know; I'm just learning as I go over here. Thanks!
 
the Renn EQ is a great EQ,
but not linear phase.


some EQs' sonic fingerprint is created by the USE of phase shift...
so there's that....

the pultecs are famous for this.

but if you are trying to avoid phase shifts,
especially if you are EQ'ing across the stereo bus,
the Linear Phase is the only way to go.

some people can't hear the difference,
i think that's because of poor monitoring or untrained ears.

you could mix an entire album, with eq's on every track, no linear phase,
and still have a cool sounding mix...
but for mastering, i'd only ever use linear phase eq's.
 
use linear phase eq's wherever possible,
and especially at the mix bus or mastering stage.

While I agree that linear-phase EQs are very beneficial when mastering or doing surgical stuff... non-linear-phase EQs are certainly NOT to be avoided, because it's the small phase shifts they cause that provide the EQ character, and why so many analog EQs just seem to sound good.

Same thing with miking... some folks get too preoccupied with phase-aligning everything in the DAW... surgical precision... but often they cut the life out of tracks, kind of like what happens when you quantize your drums to a grid: everything is perfect, but the organic feel is gone.

Also, a linear-phase EQ requires some look-ahead delay in order to process all the frequencies without shifting their relative phase... and that adds latency, not to mention more CPU processing load... like convolution reverbs and some other types of FX/processing can. So if you load up your tracks with those plugins, you can introduce noticeable total latency and CPU load.
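The latency point can be seen with a toy example in plain Python (a hypothetical 101-tap filter, far shorter than what real linear-phase EQs use): a linear-phase filter has a symmetric FIR impulse response, and that symmetry forces a fixed delay of (taps - 1) / 2 samples, with one multiply-add per tap per sample as the CPU cost of a naive implementation.

```python
def fir(x, h):
    """Direct-form FIR convolution of signal x with kernel h."""
    return [sum(h[k] * x[n - k] for k in range(len(h)) if 0 <= n - k < len(x))
            for n in range(len(x))]

taps = 101
# Symmetric (triangular) kernel -> linear phase by construction
h = [min(k + 1, taps - k) for k in range(taps)]
scale = sum(h)
h = [v / scale for v in h]

impulse = [1.0] + [0.0] * 255
out = fir(impulse, h)
delay = max(range(len(out)), key=lambda n: out[n])
print(delay)  # 50 == (taps - 1) // 2 samples of unavoidable latency
```

A sharp linear-phase cut at bass frequencies needs thousands of taps, which is where the noticeable plugin latency and CPU load come from (real plugins use FFT tricks rather than this direct loop, but the delay is inherent either way).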

(See #7 in this link you provided)
12 Common EQ Mistakes Mixing Engineers Make
 
While I agree that linear-phase EQs are very beneficial when mastering or doing surgical stuff... non-linear-phase EQs are certainly NOT to be avoided, because it's the small phase shifts they cause that provide the EQ character, and why so many analog EQs just seem to sound good.
I already said all that, in my previous post #7.
 
I already said all that, in my previous post #7.

I was responding to post #4... :) ...but in either case, you seem to insist that linear-phase EQ is the preferred way to go... which is what I am disputing.

All the great analog hardware EQs and the analog consoles that everyone always references when talking about their great EQ sections... are not linear-phase EQs.
I wouldn't suggest that linear phase would generally be preferred between the two... it's situation-dependent, and there are many times where a non-linear-phase EQ would sound better... or vice versa.

[EDIT]
When I first got into the DAW world and plugins...and I discovered linear-phase EQ...I thought "DUH! I need to use this on everything...it's the 'perfect' EQ!"...
...and then I found out why that wasn't the right perspective, and that there was nothing bad/worse about the non-linear EQs...which I actually use like 75% of the time, even these days, and even in the DAW, but certainly outside the DAW.

YMMV...... ;)
 