monitoring +4 and -10


bethanyb321

Member
Hello, I see switches marked +4 and -10 on both my monitors and my monitor controller. Should they both be set to +4?
Thanks
 
I'm not sure I'm really qualified to answer this, and I'm not sure the answer to this question is always "+4".

I'm going to assume that's +4dBu and -10dBV, btw, and give you my personal simple answer: yes.


But it depends on a few different things, the main one being the soundcard you're using (what soundcard are you using?), and it isn't as simple as one setting just being louder. (Interestingly enough, +4dBu on my monitors and interface is actually quieter than -10dBV.)
Signal-to-noise ratio is apparently better at +4, and EDIT: ...actually, I'm just going to share a link:

-10dBV and +4dBu voltage levels
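
Just to put rough numbers on what that link covers, here's a quick Python sketch (assuming the standard references: 0.7746 V RMS for dBu, 1 V RMS for dBV):

import math

DBU_REF = 0.7746   # dBu reference: 0.7746 V RMS (1 mW into 600 ohms)
DBV_REF = 1.0      # dBV reference: 1 V RMS

def dbu_to_volts(level_dbu):
    # Convert a level in dBu to RMS volts.
    return DBU_REF * 10 ** (level_dbu / 20)

def dbv_to_volts(level_dbv):
    # Convert a level in dBV to RMS volts.
    return DBV_REF * 10 ** (level_dbv / 20)

plus_four = dbu_to_volts(4)       # ~1.228 V RMS
minus_ten = dbv_to_volts(-10)     # ~0.316 V RMS
gap_db = 20 * math.log10(plus_four / minus_ten)   # ~11.8 dB

print(f"+4dBu  = {plus_four:.3f} V RMS")
print(f"-10dBV = {minus_ten:.3f} V RMS")
print(f"gap    = {gap_db:.1f} dB")

So for the same noise picked up on a cable, the +4dBu signal starts roughly 12dB further above it.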
 
Hello, I see switches marked +4 and -10 on both my monitors and my monitor controller. Should they both be set to +4?
Thanks

Yes, if the source of signal to the monitors is set to +4dBu then so should the monitors. This assumes, of course, that the feed to the monitor controller is in turn capable of driving the correct levels.

And yes again, operating at +4dBu will give a theoretical improvement in noise performance, but this is mainly because a signal of roughly 1 volt is less influenced by any given interference field than one of 316 millivolts.
Those home recordists who cannot operate at +4 for whatever reason need not be overly concerned. The differences really only become apparent with long cables, say in excess of 15 metres. VASTLY more important in keeping noise of all sorts down is equipment having a very low output impedance. It is trivial to get output impedances of 100 Ohms, and less than 30 Ohms can be expected of top-class gear (along with the capability to drive very long lines, which is not quite the same thing).

Then of course, +4dBu outputs are usually balanced, neg-ten jobbies rarely so, but once again, in the tight confines of the project studio this should not matter a great deal.

Note that the difference between +4dB"yoo" and -10dB"Vee" is actually about 12dB, not 14dB as it looks. However, many -10dBV devices have minimal headroom and much "pro" gear can go very hot, so if attenuating between them, err on the side of too much attenuation, not too little!
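
To spell out that 12dB-not-14dB arithmetic, a quick sketch (again assuming the standard 0.7746 V and 1 V references):

import math

# dBu and dBV use different reference voltages (0.7746 V vs 1 V),
# so the two scales are offset by about 2.2 dB rather than lining up.
offset_db = 20 * math.log10(1.0 / 0.7746)   # ~2.2 dB

plus_four_in_dbv = 4 - offset_db            # +4dBu is about +1.8dBV
gap_db = plus_four_in_dbv - (-10)           # ~11.8 dB, not 14 dB

print(f"+4dBu is about {plus_four_in_dbv:+.1f} dBV")
print(f"gap between +4dBu and -10dBV: about {gap_db:.1f} dB")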

Dave.
 
(Interestingly enough, +4dBu on my monitors and interface is actually quieter than -10dBV.)
This is usually true for a given source into a switchable input. Usually the -10dBV setting adds gain because the rest of the circuit is (basically) designed to run at +4dBu. More precisely, the power supply rail (which determines the maximum level of anything going through) and the noise floor stay in the same place. An actual -10dBV input would sit almost 12dB closer to the noise floor and have an absolutely absurd amount of headroom. So they gain it up at the input to take better advantage of the dynamic range of the circuit.
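
A rough way to picture that (the +24dBu clip point and -92dBu noise floor below are made-up illustration figures, not any particular box's specs):

# The supply rails and the noise floor don't move when you flip the
# -10/+4 switch; only the gain applied at the input changes.
CLIP_POINT_DBU = 24.0     # assumed maximum level before clipping
NOISE_FLOOR_DBU = -92.0   # assumed circuit noise floor

def margins(nominal_dbu):
    # Headroom above nominal, and distance from nominal down to the noise floor.
    headroom = CLIP_POINT_DBU - nominal_dbu
    above_noise = nominal_dbu - NOISE_FLOOR_DBU
    return headroom, above_noise

# +4dBu nominal vs an un-gained -10dBV nominal (about -7.8dBu)
for label, level in [("+4dBu nominal", 4.0), ("-10dBV nominal", -7.8)]:
    headroom, above_noise = margins(level)
    print(f"{label}: {headroom:.1f} dB headroom, {above_noise:.1f} dB above the noise floor")

# Adding ~12dB of gain at the -10 input puts the signal back where the
# rest of the circuit expects it, which is why that switch position
# usually ends up louder for the same source.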

If you left the inputs alone and switched the output, I think you'd find the opposite: the -10dBV signal is either attenuated or gained up less compared to the +4dBu one, and should be quieter. This is because at this point we're more worried about being polite to whatever device comes next.

The OP needs the input of the monitor controller to agree with the output of his interface, and the output of the monitor controller to agree with the input of the monitors. If given the option, go with +4dBu all the way through. At this point the extra noise isn't getting recorded, but distortion in the monitor chain will have you chasing crackles in the mix for weeks! If you have the bad habit of running your mixes close to 0dBFS, you might actually save yourself some fatigue, WTF moments, and possibly even speaker damage by setting the interface output to -10dBV and the monitor controller to +4dBu.
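
To put numbers on the matching game (the -7.8dBu equivalent and the comments are just illustration, not any particular gear):

# Treat each -10/+4 switch position as a nominal operating level in dBu.
# (-10dBV is roughly -7.8dBu, so the two settings sit about 11.8dB apart.)
LEVELS_DBU = {"+4": 4.0, "-10": -7.8}

def stage_error(output_setting, input_setting):
    # Positive = the next stage sees the signal hot; negative = it sees it low.
    return LEVELS_DBU[output_setting] - LEVELS_DBU[input_setting]

print(stage_error("+4", "+4"))    #  0.0  -- matched all the way through
print(stage_error("-10", "+4"))   # -11.8 -- quieter, with extra safety margin
print(stage_error("+4", "-10"))   # +11.8 -- runs hot: the crackle/tweeter-killer combo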
 
The point of the switch is to allow you to match the input sensitivity of the monitors to the output level of the monitor controller. If the controller has a +4dBu output, use the +4dBu setting on the monitors.
 