Why 18?

  • Thread starter: SouthSIDE Glen
Is there any reason why you can't use digital zero as the reference, and line everything else up against that?
 
SonicAlbert said:
Is there any reason why you can't use digital zero as the reference, and line everything else up against that?

I think the real question is how much headroom do you want to allow for?

That's your "reference level".
 
SonicAlbert said:
Is there any reason why you can't use digital zero as the reference, and line everything else up against that?
Well, if I understand you correctly, it's because digital zero is not an absolute, nor does it lend itself well to displaying a common gain stage reference through the signal chain.

I'm using dBu as the reference scale, like the SOS chart did, and then lining up the bars for VU, PPM and dBFS to that scale.

What I'm trying to work on now, instead of displaying a gazillion different dBFS bars for the different calibrations, is a way of creating a dynamic display that will alter the position of the dBFS bar based upon user input of the calibration value.

For example, I may start out with a default calibration value and display of +4dBu = 0VU = -18dBFS, but there will be a few input fields: one for +4dBu calibration, one for 0dBu calibration, and one for maximum digital output value in dBu (the three main ways that factory calibration might be specified in actual gear specifications). The user can choose whichever one he/she pleases, select the calibration value (-20, -12, whatever), and the chart will dynamically change to reflect that calibration.

I'm still in the design stage of that idea, figuring out the best way to implement that dynamic display. But I figure that beats a billion confusing (and somewhat redundant) dBFS bars on the chart, and it also puts the idea of flexible calibration standards front and center right off the bat. Plus it's a nice way of calculating the translation between printed specs.
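
For what it's worth, the arithmetic behind those three input fields boils down to a single offset between the dBu and dBFS scales. Here's a minimal sketch of it in Python (the function names are just made up for illustration, not anything from the actual chart):

```python
# All three calibration specs pin the dBu and dBFS scales together at one point,
# so each one reduces to "where does 0dBFS sit on the dBu scale?"

def zero_dbfs_from_plus4_cal(dbfs_at_plus4dbu):
    # e.g. +4dBu = -18dBFS  ->  0dBFS sits at +22dBu
    return 4 - dbfs_at_plus4dbu

def zero_dbfs_from_0dbu_cal(dbfs_at_0dbu):
    # e.g. 0dBu = -22dBFS  ->  0dBFS sits at +22dBu
    return -dbfs_at_0dbu

def zero_dbfs_from_max_output(max_output_dbu):
    # e.g. maximum digital output reached at +22dBu  ->  0dBFS sits at +22dBu
    return max_output_dbu

def dbu_to_dbfs(level_dbu, zero_dbfs_in_dbu):
    # position of any dBu level on the dBFS bar
    return level_dbu - zero_dbfs_in_dbu

# Default chart setting: +4dBu = 0VU = -18dBFS
zero_dbfs_in_dbu = zero_dbfs_from_plus4_cal(-18)   # 22
print(dbu_to_dbfs(4, zero_dbfs_in_dbu))            # -18
print(dbu_to_dbfs(0, zero_dbfs_in_dbu))            # -22
```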

G.
 
masteringhouse said:
I think the real question is how much headroom do you want to allow for?

That's your "reference level".
I agree 100%.

This is where I've been coming around to over the last year or so, but I'm finding it hard to articulate in a way that gets through to the "track it hot" brigade. It's one of the biggest things I get from K-oolaid [ :) ]. The K number is the amount of headroom you want/need for the application. When tracking you need to allow enough headroom for the dynamics of the tracks you are recording, and -18 or -20 dBFS gives you bags of headroom for all but the most extremely peaky signals. It goes without saying that I think you should work with the same amount of headroom for all the tracks even though most won't use it.

Recording in 16-bit means you have to compromise to minimise the effects of quantization distortion. In that case it would seem sensible to ride faders and/or compress and/or limit the signal in the analogue domain and go for a less generous headroom of, say, 12-15 dB, depending on whether you subscribe to the old rule-of-thumb or Alesis' recommendations. Again I'd use the same nominal headroom for all the tracks. Unfortunately I don't have a good explanation of "why" other than that it's consistent behaviour that relates to the loudness of the track rather than the peaks, and it will give a consistent loudness across all the tracks if you set their faders to the same point. In fact, is that it? Does it make the fader positions relate to the loudness of the track rather than being inversely related to the peak signal of each track?
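
To put some very rough numbers on that 16-bit trade-off, here's a back-of-the-envelope sketch using the textbook figure of roughly 6dB of theoretical dynamic range per bit (illustration only; real converters and source noise will be worse):

```python
def theoretical_range_db(bits):
    # ~6.02dB of theoretical dynamic range per bit
    return 6.02 * bits

def room_below_nominal_db(bits, headroom_db):
    # dB between the nominal (average) level and the theoretical quantization floor
    return theoretical_range_db(bits) - headroom_db

print(round(room_below_nominal_db(16, 18)))   # ~78dB under a -18dBFS nominal level
print(round(room_below_nominal_db(16, 12)))   # ~84dB with the tighter 12dB of headroom
print(round(room_below_nominal_db(24, 18)))   # ~126dB at 24-bit, so generous headroom costs little
```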

When it comes to editing and mixing I try to plan ahead for the amount of headroom I'll need to reduce to later on. I'll limit the peaky tracks to keep their peaks below -6 to -9 dBFS (post-fade). I haven't quite decided how much I should chop at this stage, but definitely not so far that the sound quality is affected.

When I mix I use the same headroom as when I track because I drink K-oolaid so the monitors are still at the same volume (6 dB quieter than Bob Katz likes his).
 
iqi616 said:
It goes without saying that I think you should work with the same amount of headroom for all the tracks even though most won't use it.
How would you adjust your digital headroom without recalibrating your converter(s)?

G.
 
SouthSIDE Glen said:
How would you adjust your digital headroom without recalibrating your converter(s)?

G.
By controlling the level of the analogue signal I feed in. I use a Korg D1600 so direct calibration isn't an option - it is what it is.

I set the level either visually so that the average signal is somewhere around -18 dBFS (a guesstimate from the meter but a consistent guesstimate) or by ear - how loud the sound is coming out of my monitors when the volume knob is set to my standard operating level.

To change to a different headroom I would adjust my monitor levels and/or adjust the input trims and "target" a different average dBFS level.
 
iqi616 said:
By controlling the level of the analogue signal I feed in.
Ah, OK, that makes sense.

I misunderstood; I thought you were saying something about governing the signal on the digital side, which wouldn't make sense. :)

Never mind. Carry on. Nothing to see here. Move along. :D

G.
 
I almost never look at the meters in my DAW. I use the meters on my preamps and set the gain to 0dBVU.

The confusing thing is when you have all-in-one interfaces with no analog meters.
 
Farview said:
The confusing thing is when you have all-in-one interfaces with no analog meters.
If they have digital meters, then all you have to do is check the maximum output spec, usually rated in dBu, subtract 4 and reverse the sign. That'll give you the +4dBu calibration level on the digital side.

Of course, if it has no meters at all, then you just gotta fly by the seat of your ears :).
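
As a quick worked example of that rule (the max output figures here are just illustrative; check your own unit's spec sheet):

```python
def plus4_calibration_dbfs(max_output_dbu):
    # where +4dBu lands on the digital meters: subtract 4, reverse the sign
    return -(max_output_dbu - 4)

print(plus4_calibration_dbfs(22))   # -18  ->  +4dBu = -18dBFS
print(plus4_calibration_dbfs(24))   # -20  ->  +4dBu = -20dBFS
```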

G.
 
iqi616 said:
By controlling the level of the analogue signal I feed in. I use a Korg D1600 so direct calibration isn't an option - it is what it is.

I set the level either visually so that the average signal is somewhere around -18 dBFS (a guesstimate from the meter but a consistent guesstimate) or by ear - how loud the sound is coming out of my monitors when the volume knob is set to my standard operating level.

To change to a different headroom I would adjust my monitor levels and/or adjust the input trims and "target" a different average dBFS level.

Trimming that little beast is tricky, innit? I have one too, but I only use it for remote stuff now...
 
SouthSIDE Glen said:
If they have digital meters, then all you have to do is check the maximum output spec, usually rated in dBu, subtract 4 and reverse the sign. That'll give you the +4dBu calibration level on the digital side.
G.
That is assuming that it isn't a preamp built into an interface. The only output is digital, so there would not be an analog maximum output.
 
Farview said:
That is assuming that it isn't a preamp built into an interface. The only output is digital, so there would not be an analog maximum output.
Yeah, I should have said maximum analog level or maximum voltage level, not maximum output level. Freudian slip on my part. The wording changes from manufacturer to manufacturer, but as I understand it, the spec retains (for most gear) pretty much the same meaning: the maximum voltage level out of the analog stage equates to 0dBFS out of the digital stage (and vice versa?). So when a converter is spec'd with a maximum voltage of, say, +24dBu, that would equate to a calibration of +4dBu = -20dBFS. Is there something I'm missing in that understanding?

G.
 
Lots of good info here. I always assumed (and was told) that the -18dBFS reference level was written into an AES/EBU spec/document somewhere as an agreed-upon standard for TV post.

I looked all over the place and could only find multiple standards. Confusing stuff.
 
This is a very good discussion/topic. Good one G!

Now, how come my mixes don't sound as loud as commercial CDs? Should I change my reference level to -6 dBFS?
 
The Audio Cave said:
Lots of good info here. I always assumed (and was told) that the -18dBFS reference level was written into an AES/EBU spec/document somewhere as an agreed-upon standard for TV post.

For Europe... yes...
For SMPTE (US)... -20dBFS...
Again, refer to my links on the first page.
 
masteringhouse said:
Now, how come my mixes don't sound as loud as commercial CDs?

You mean you don't smash your mixes with a hard limiter on steroids to get rid of the 90 or so dB of dynamic range? Dude (may I call you dude?), that's so last century. Smash it all!

masteringhouse said:
Should I change my reference level to -6 dBFS?
No. Higher resolution comes from more pixels. You want your audio to have as much resolution as possible, so you need to run it through a digital camera. I think MCI stopped making cameras, but Sony still makes good ones. You need one with AES/EBU or maybe Lightpipe. (Or some other kind of pipe. :eek: ) Makes it sound more "optical"... :confused: :)


I am teh mastring enginer,

sl
 
SouthSIDE Glen said:
So when a converter is spec'd with a maximum voltage of, say +24dBu, that would equate to a calibration of +4dBu = -20dBFS. Is there something I'm missing in that understanding?

G.
Nope, you have it right. My point was that you won't find a spec for just the analog side of an all-in-one unit. Like a Roland 2480 for example. The entire studio is one box. There are many newbs who don't realise that there is a separate preamp, converter, recorder and mixer in there, and there are no specs to follow. Only the advice from the manual that tells you to 'record as hot as you can without clipping'. (Isn't that nice?)

When I tell someone to average around -18dBFS, I pick that number because it seems to be an average of all the standards and a safe level for everyone.

Anyway, 2dB in either direction won't make much difference.
 
Farview said:
My point was that you won't find a spec for just the analog side of an all-in-one unit. Like a Roland 2480 for example.
Well, some do have them. For example, the Yamaha AW series does have "Max before Clip" specs for both the analog in and analog out, and in that case the "analog out" specs do indeed match the converter calibration.

But you're right, not all have that spec to work with. When it's there, though, it can be used to translate the readings on a digital meter into their analog equivalent. Of course, unless you're using a test tone or something like that, you'll be seeing quasi-peak levels and not the lower average readings of a VU meter.
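
For instance, here's a minimal sketch of that translation, assuming a hypothetical +22dBu "Max before Clip" figure rather than any actual AW spec:

```python
MAX_BEFORE_CLIP_DBU = 22          # analog level that corresponds to 0dBFS

def meter_dbfs_to_dbu(reading_dbfs):
    # translate a digital meter reading into its analog equivalent
    return reading_dbfs + MAX_BEFORE_CLIP_DBU

print(meter_dbfs_to_dbu(-18))     # +4dBu, i.e. nominal +4 line level
print(meter_dbfs_to_dbu(-6))      # +16dBu quasi-peak
```
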
Farview said:
When I tell someone to average around -18dBFS, I pick that number because it seems to be an average of all the standards and a safe level for everyone.

Anyway, 2dB in either direction won't make much difference.
Agreed. I was never really questioning the validity of the -18 spec; I was just looking for an "official" handle with which to grab hold of the subject.
masteringhouse said:
This is a very good discussion/topic. Good one G!
Well, I occasionally have questions of my own too ;) :D . Just soldiering on in the Internet S/N wars ... Yet another unfortunate case where both the industry rags and Wikiality - not to mention about a thousand other websites - provide (at best) misleading information.

G.
 
You've kind of touched on this, but after reading the thread to this point I think your reference should be the voltage level at +4dBu. Line up all the various standards to that.

But does it really matter which standard you choose as your "base" standard? What's important here is how they line up relative to each other.
 