fixing phase problems

sniixer

New member
Hi

I recently did a live recording of a jazz quartet (sax, cello, piano, upright bass). I used two omni microphones (KM183) placed quite close to the musicians, maybe 60 cm to 1 meter apart. In addition, three of the instruments were close-miked (but not the saxophone), with a DI for the bass.
When I use a phase meter in Adobe Audition, I can see that there are serious phase problems between the two omni mics. I probably should have placed them closer together. The stereo image they give is also too wide. (I really want to build a Jecklin disk and experiment with stereo techniques someday.) When I play back in mono some of the fidelity is lost, but it's actually difficult to pinpoint exactly what is missing, and the instruments seem to keep the same balance as they have in stereo.

My question is: how do you fix phase problems, say if you are a mastering engineer? Are there certain frequencies that get lost, and can you compensate for this with EQ?
Regards.

(I sometimes spend more time at homerecording than recording, but thanks for all the valuable information!)
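For anyone curious what that phase meter and the mono fold-down are actually measuring, here is a minimal sketch, not tied to Audition: it computes the correlation coefficient between the two omni tracks (the quantity a correlation/phase meter displays, +1 mono, 0 uncorrelated, -1 polarity-inverted) and the RMS level after summing to mono. The file names and the soundfile library are assumptions for illustration; any WAV reader works.

```python
import numpy as np
import soundfile as sf  # assumed available; any WAV reader would do

# Hypothetical file names for the two omni (KM183) tracks
left, sr = sf.read("omni_left.wav")
right, _ = sf.read("omni_right.wav")

# Trim both channels to the same length
n = min(len(left), len(right))
left, right = left[:n], right[:n]

# Correlation coefficient -- the same quantity a phase meter displays
corr = np.corrcoef(left, right)[0, 1]

# Level when the pair is folded down to mono
mono = 0.5 * (left + right)

def rms_db(x):
    return 20 * np.log10(np.sqrt(np.mean(x ** 2)) + 1e-12)

print(f"L/R correlation: {corr:+.2f}")
print(f"mono {rms_db(mono):.1f} dBFS  vs  L {rms_db(left):.1f} / R {rms_db(right):.1f} dBFS")
```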
 
"fixing" poor phase relationships is a whole can of worms....
I have had to mix stuff that came along with really weird phase relationships, and I have found that I can sort of make it work (on drums in particular) by having a couple of different compressors, with drastically different time constants, help in this matter. This keeps both tracks in question (or more than two, whatever) sort of "dynamically swimming" the whole time they are active in any given part of the song. It is really easy to hear a static (not moving) phase relationship, but it is tough to pick out an ever-changing one (a moving target). When the compressors actually cross in their time constants it is, of course, just like the static "null" you hear with the tracks playing along unencumbered, but with very slow time constants and a VERY low ratio (1.5:1 is where I start for this purpose) that static null is only reached once in a while, and only for a moment.

Think of it this way: it is like taking two completely polarity-opposite sources, setting the faders exactly to the point where they cancel, then moving each fader up and down slightly in time with the performances in the source. You would rarely sit at the null point for more than a fraction of a second...

I hope this makes sense, I am tired... long week of mixing...
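Not an actual mix chain, just a toy sketch of what the post above describes: two polarity-opposite copies of the same signal cancel completely when their gains are static, but give only a momentary null when each copy rides a slightly different, slowly moving gain envelope (a crude stand-in for two compressors recovering at different rates). Signal, envelope rates, and depths are made up for illustration.

```python
import numpy as np

sr = 48000
t = np.arange(sr * 2) / sr           # 2 seconds
sig = np.sin(2 * np.pi * 220 * t)    # any source material will do

a = sig
b = -sig                             # polarity-opposite copy

# Static gains: a deep, sustained null
static_sum = 1.0 * a + 1.0 * b

# Slowly moving gain envelopes with different "time constants"
env_a = 1.0 + 0.15 * np.sin(2 * np.pi * 0.7 * t)
env_b = 1.0 + 0.15 * np.sin(2 * np.pi * 2.3 * t)
moving_sum = env_a * a + env_b * b

def rms_db(x):
    return 20 * np.log10(np.sqrt(np.mean(x ** 2)) + 1e-12)

print(f"static sum: {rms_db(static_sum):7.1f} dB  (constant null)")
print(f"moving sum: {rms_db(moving_sum):7.1f} dB  (null only crossed momentarily)")
```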
 
Have you tried sliding one of the mic tracks along a little bit?
 
When you slide tracks, you still wind up with a static relationship.
You may be able to get a more pleasing static relationship, but unless your chosen DAW can slip by less than 1 ms, you are just going to get comb filtering and cancellation at a DIFFERENT frequency. The period of a cycle at 12 kHz is stupid fast (I suck at math, but obviously it is 1/12,000 of a second), so if you are hearing "phasey" sounds even above 1 kHz you would need to be able to slip in microseconds, and then you have messed with the phase relationships at other frequencies anyway!
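To put numbers on that: summing a track with a copy of itself delayed by a fixed time tau gives a comb filter with magnitude 2|cos(pi*f*tau)|, so the notches only move when you change the slip, they never go away. A quick sketch (plain Python; the 1 ms slip value is arbitrary):

```python
import numpy as np

sr = 48000           # sample rate
slip_samples = 48    # a 1 ms slip at 48 kHz, chosen arbitrarily
tau = slip_samples / sr

# Summing x(t) + x(t - tau) has magnitude response |1 + e^{-j*2*pi*f*tau}|
f = np.linspace(0, 20000, 20001)            # 1 Hz steps up to 20 kHz
mag = np.abs(1 + np.exp(-2j * np.pi * f * tau))

# Notches fall at odd multiples of 1/(2*tau): 500 Hz, 1.5 kHz, 2.5 kHz, ...
notches = [(2 * k + 1) / (2 * tau) for k in range(5)]
print("first notch frequencies (Hz):", [round(n) for n in notches])
print(f"depth at 500 Hz: {20 * np.log10(mag[500] + 1e-12):.0f} dB")
```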

Obviously, the real answer is to learn to be ALLERGIC to poor phase relationships when you are setting up microphones. Checking for good relationships is MORE important than mic choice. Really. I promise!

Bad phase relationships will kill a recording made with 47s, 251s and C12s EVERY TIME. Conversely, you can record anything with anything and have it sound a lot like the original gesture and source if you have good phase relationships.

Phase is one of the BIG factors in a professional recording environment, so much so that we would take it for granted that the tracks exhibit good phase coherency, or at least deliberate and interesting relationships!
 
sniixer said:
... The stereo image they give is also too wide. (I really want to build a Jecklin disk and experiment with stereo techniques someday.) When I play back in mono some of the fidelity is lost, but it's actually difficult to pinpoint exactly what is missing...
The stereo pair is likely stuck with whatever combing the placement gave you, but even if mono sounds poor, have you tried panning it in only partially?
The spot mics might respond to being time-shifted to line up with the front pair, however.
Wayne
 
Thanks for the replies and the good tips. Actually it doesn't sound poor in mono, just different. I tried making the stereo image from the omni mics a bit narrower, adding some high-frequency EQ, and sliding the DI bass track so that its waveform lines up with one of the omni microphones.
There were also two microphones on the piano and one on the cello, and even though those microphones pick up leakage from the other instruments, I think it sounds OK when I blend them in (at a low level) with the omni microphones. So all in all I am happy with the recording (compared to other recordings I have done), even though the saxophone sounds a bit too much like a flute.
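On sliding the DI track to match one of the omnis: if you would rather not eyeball waveforms, one way is to cross-correlate the DI against an omni track and shift by the lag with the highest correlation. A rough sketch, assuming mono WAV files with made-up names and that SciPy is available:

```python
import numpy as np
import soundfile as sf             # assumed; any WAV reader works
from scipy.signal import correlate # assumed SciPy is available

# Hypothetical file names
omni, sr = sf.read("omni_left.wav")
di, _ = sf.read("bass_di.wav")

# Use a short excerpt with a clear bass attack to keep it cheap
n = min(len(omni), len(di), sr * 10)
omni, di = omni[:n], di[:n]

# Lag of maximum cross-correlation = how far the DI leads/lags the omni
xcorr = correlate(omni, di, mode="full")
lag = np.argmax(xcorr) - (n - 1)

print(f"shift DI by {lag} samples ({1000 * lag / sr:.2f} ms) to line up with the omni")
```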
 
