Question about Line in settings: +4 vs. -10

Thread starter: NathanPonzar (New member)
I am confused. I use a Digi 002 Rack interface. I have primarily just used inputs 1-4, which have pres. I have been toying with channels 5-8 as a way to bypass the pres in the 002, and there is a switch for each input on the back that lets you switch between -10 dBV and +4 dBu. What is confusing me is that when I have it set to -10, the signal is louder. All I can figure is that the switch's labels refer to the level of the incoming signal. Like, moving the switch to the -10 dBV position adjusts the gain to a setting that is more compatible with softer sound sources.

Is this correct? I did do a good half hour of searching for this answer before I resorted to posting here, but figured it wasn't a biggy since it was the newbie forum.

Thanks
 
You match the line level to that of the device that you're plugging into it.

For example, you might plug your mic into a separate preamp, and the preamp will have a line out. The line out is usually labelled as +4 or -10, so you match the 002 to that.
 
Basically +4 is used by pro gear while -10 is used by consumer gear (such as "stereo" gear).

Check the manual for the equipment you intend to plug in for the proper operating level...
 
Okay, so...

Thanks for the replies. That's kinda what I suspected. Anyways, my preamp (presonus bluetube) manual says the following: "The Output XLR Connector is servo balanced and operates at +4dBu" and "The Output 1/4" TS Connector is unbalanced and operates at -10dBv" (and yes, it does use a lower case v).

Anyways, which setting would I use? My patch cable from the bluetube to the interface line in is XLR to TRS (balanced). I would think this would mean using the +4dBu setting because it's XLR and balanced; however, the signal I'm getting in my DAW is not very useable because it is so low. This is with the gain at about 4-4:30 on the preamp with an LDC on vocals. Using my interface's preamps, I get the same signal with the dial around 11-12.

Going back to the bluetube, when I switch the line in on the interface to -10, I get a response from the gain knob that is similar to my Digi 002's pre gain knob (the max gain on the bluetube is +54dB and the 002's is +65dB).

So does this mean I should not expect to get as much signal from the bluetube as from my 002? Or should I be using the -10dBV setting? From the info you have given and the specs of the equipment, it would seem that the +4dBu setting is correct, but in that case, I don't think the bluetube is a very useable piece of equipment.
 
Nathan, I apologize... I can't answer your question, but I have a related question myself so I'm jacking your thread (gently, of course).

My Focusrite preamp gets quite noisy when I crank the input gain beyond the 3 o'clock position, but I have to max out the gain to get a decent level out of my Shure SM7B. Max gain = too much noise.

I use an Echo Mia interface, and in the software console I can switch the input level between -10 and +4. The level is hotter when I select -10, so I only have to turn the gain knob to 3 o'clock to get a usable level, and as a result the preamp noise is much more manageable. My question is:

Is it OK to lie to Miss Mia about the magnitude of what's entering her, or am I creating other problems by not matching the preamp output (+4) to the interface input?

Thank you very much for your help.
 
...the signal I'm getting in my DAW is not very useable as it is so low.

How low? You should be leaving plenty of headroom during tracking. In Pro Tools, that means peaking barely over halfway up the meters. In DAWs that show dB on the meter, you set the gain so your peaks are in the -12 dBFS range.
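If it helps to see the arithmetic: dBFS is just 20*log10 of the peak relative to full scale, so peaks around a quarter of full scale land near -12 dBFS. A minimal sketch in Python, assuming floating-point samples normalized to +/-1.0:

```python
import math

def peak_dbfs(samples):
    # Peak level in dBFS, assuming floating-point samples where
    # full scale is +/-1.0.
    peak = max(abs(s) for s in samples)
    return 20 * math.log10(peak) if peak > 0 else float("-inf")

# A sine wave peaking at 0.25 of full scale sits right around -12 dBFS:
sine = [0.25 * math.sin(2 * math.pi * n / 48) for n in range(480)]
print(peak_dbfs(sine))   # ~ -12.04 dBFS
```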
 
Original Post said:
Is this correct?
Yes, that is pretty much exactly correct. That is: "the switch's labels refer to the level of the incoming signal." And, as well, it is the case that "moving the switch to the -10dBV position adjusts the gain to a setting that is more compatible with softer sound sources." At least, once I edited the original post to change that little "u" to a big "V," which is the right symbol.

In particular, moving it to the -10dBV setting probably inserts (or increases the gain of) an amplification stage that the input passes through before it goes on. If you look at the block diagram in your manual (I'd expect there is one for this type of gear), it should be reasonably clear what's going on.

Later Post said:
Anyways, which setting would I use?
The easy answer: use the one that works.

It sounds, from your description of what's happening, like the bluetube - whatever claims it may make - is incapable of amplifying the signal from your mic sufficiently to output a +4dBu signal without creating more noise than you like. This does seem a bit surprising: while it depends on the mic, most large diaphragm condensers have a relatively high output level. Whether it means the bluetube isn't usable depends on what use you're trying to put it to, I suppose.

Anyway, the main thing to keep in mind about -10dBV and +4dBu is that they're just levels. One's louder/hotter/higher than the other, that's all. There's no magical characteristic that's different between them. It's just that the +4dBu signal is about 12dB higher than the -10dBV.*

Quick comparison of two ways you could hook stuff up (for the purpose of explanation, using somewhat arbitrary dB figures for gain, though they're probably in the ballpark, and assuming I was right that the switch on the Digi input inserts a gain stage):

1) Crank up the bluetube, set the Digi input at +4: the mic signal comes into the bluetube, gets amplified by 50 dB (say), then goes into the Digi input and on to the Digi's circuitry, converted to digital, etc.

2) Turn down the bluetube a bit, set the Digi input at -10: the mic signal comes into the bluetube, gets amplified by 38 dB, then goes into the Digi input, gets amplified by another 12 dB, and goes on to the Digi's circuitry, gets converted to digital, etc.

Basically, what you've done is partially substitute the amp on the Digi's line input for the bluetube's amp, by having the Digi input amp provide 12 dB of gain and reducing the gain the bluetube is supplying by the same amount. If that produces a quieter signal, it's presumably because the Digi's amp is quieter, at least in this particular application.
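If numbers help, here's the same comparison as a quick back-of-the-envelope sketch in Python, using the example gains above and an assumed ballpark mic level of -40 dBu (that figure is just for illustration, not a spec for any particular LDC):

```python
# Rough gain-staging arithmetic for the two hookups described above.
# The -40 dBu mic level is an assumed ballpark figure, not a spec.
mic_level_dbu = -40

# 1) Bluetube cranked, Digi line input on +4 (no extra gain stage):
path_1 = mic_level_dbu + 50 + 0

# 2) Bluetube backed off 12 dB, Digi line input on -10 (adds ~12 dB):
path_2 = mic_level_dbu + 38 + 12

print(path_1, path_2)   # both land at +10 dBu going into the converter
```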

The highjacker said:
Is it OK to lie to Miss Mia
Well, you're not actually lying. When you turned down the Focusrite preamp, you reduced the level of the output until it was at (or in the vicinity of) -10dBV.

And anyway, if you were lying, I think Mia would find it refreshing, as she is probably more accustomed to having people lie by telling her what's entering her is bigger than it really is.

____
* I know. You'd think it would be 14 dB, but it's not ... because one has a "V" at the end and the other a "u." Anyway, that's a level of detail that's not important here.
 
Thank you, sjjohnston. I appreciate the clarification.

Maaaa-aate

At a stretch, you can name your guitars and instruments, but only if you absolutely must. And nothing electronic. You cannot, for instance, name a keyboard.

And you can ABSOLUTELY NOT go naming your goddamn interface!!!!!!!!! It's just not done.

Give yourself an uppercut and go sit in the naughty corner for a while.... :laughings:

Thems the rules... everybody knows but you... :eek: :cool:
 
Previous post said:
And you can ABSOLUTELY NOT go naming your goddamn interface!!!!!!!!! It's just not done.

You should meet her sisters, Gina and Layla. ;)
 
Mine just went through a sex change! So... I'm confused.
 
When you change the level of signal - by moving a fader, turning up the gain on the preamp, adjusting the trim pot on an input, or whatever - you change the voltage. Hotter/louder/turned-up/bigger = more voltage.

"Matching" voltages is exactly what we're trying to help the OP to do. Though: it's not really "matching," so much as getting the level (voltage) of his signal in the range he wants it to be in going into the AD converter, and doing so in the manner that makes his signal sound as close to way he wants it as possible (i.e. without noise and distortion, or at least without distortion he doesn't want).

The trouble that occurs from "mismatching" levels isn't particularly surprising, and (at least in this application) not wildly dangerous. Some examples, which are probably obvious:
- If you - say - turn up a preamp or something too much, the voltage coming out will be too high, and when the signal flows into the next box it'll clip and distort.
- Same thing, approximately: if you turn up the gain stage of a preamp too much, the signal will be too high going into later circuitry within the same box, and it'll clip and distort.
- If you turn a preamp or something down too low, the voltage coming out will be too low, and when you have to raise it up at a later point in your signal chain you'll also raise up the noise, hurting your signal-to-noise ratio.

Unless you do something crazy, you won't blow anything up (there's another recent thread on that topic). "Something crazy" would be, say: hooking the output of your power amp (that's supposed to go to passive speakers) to the input of an expensive mic preamp. Or, even crazier I guess: running the power from your wall socket directly into an audio input, or even a speaker. Fortunately, they don't sell two-prong (or even more exciting, three-prong) to XLR adapters. I don't think.

Simplifying (i.e. skipping) a few details:
-10 dBV = 10 dB below 1 volt = 10^(-10/20) volts = 1/sqrt(10) volts ~ 0.32 volts
+4 dBu = 4 dB above 0.775 volts = 10^(4/20) * 0.775 volts ~ 1.2 volts

The second one is about 4 times the first one, which works out to a difference of about 12 dB.
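Same arithmetic as a little Python sketch, using the standard 1 V and 0.775 V references:

```python
import math

def dbv_to_volts(dbv):
    # dBV is referenced to 1 V RMS
    return 10 ** (dbv / 20)

def dbu_to_volts(dbu):
    # dBu is referenced to 0.775 V RMS
    return 0.775 * 10 ** (dbu / 20)

consumer = dbv_to_volts(-10)             # ~0.316 V
pro = dbu_to_volts(4)                    # ~1.228 V
print(20 * math.log10(pro / consumer))   # ~11.8 dB, i.e. the "about 12 dB"
```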

When you turn a pot so that your converter is peaking at -12 dBFS, instead of hitting the top of the meter, you're making the same change in the voltage going into the converter.
 
Mine just went through a sex change! So... I'm confused.
Mine has Jacks. Weird thing is, they're female. So I guess I'm also confused.
 
Damage would come from heat, which comes from power, which would require more current than is likely in a signal circuit.

Probably not much of an issue at these voltages. But too much power on a circuit for too long can shorten its life span. Way too much power can result in the magical blue smoke. Not very likely at these levels, but in other situations it could permanently disable a device.

I remember running a Y-splitter cable into a line input (x2?) on a SoundBlaster sound card (SB AWE32, late '90s), and after about a year I burned out one of the 8 MIDI channels on the card. Not exactly related, just an example of what could happen on some gear. Not that the damage stopped the card from working entirely: it still worked fine in Linux, but Windows refused to boot because it had expectations that weren't met, I guess. Pulling the card let Windows boot normally. But it was Win95 at the time.
 
Previous post said:
Too much power on a circuit for too long can shorten its life span. Way too much power can result in the magical blue smoke.

In the link you provided there's a section on impedance.

Line level - Wikipedia said:
Impedance bridging is employed to ensure that very little power is transferred

It's not so much the voltages that matter; it's the ratio between the output and input impedance. That keeps power transfer very low, even with a level mismatch. But you wouldn't want to push your luck and connect a power amp to a line input.
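A rough sketch of why bridging keeps the power so low, using typical ballpark impedance and voltage figures (assumptions for illustration, not specs for any particular gear):

```python
def power_into_input(v_source_rms, z_out, z_in):
    # Watts dissipated in the receiving input, using a simple
    # source-impedance / input-impedance voltage-divider model.
    v_in = v_source_rms * z_in / (z_out + z_in)
    return v_in ** 2 / z_in

# +4 dBu line output (~1.23 V RMS, ~100 ohm source) bridged into a
# ~10 kohm line input: microwatt territory.
print(power_into_input(1.23, 100, 10_000))   # ~0.00015 W

# A 100 W power amp swings roughly 28 V RMS into 8 ohms:
print(power_into_input(28, 0.1, 8))          # ~96 W into the speaker it's built for

# Bridged into a 10 kohm line input, the modeled power is still under 0.1 W,
# but 28 V is far more voltage than a line input stage is built to accept,
# which is why you still don't push your luck.
print(power_into_input(28, 0.1, 10_000))     # ~0.08 W
```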
 