If the user keeps an eye on both, but especially doesn't overcook the fast peak-reading meter level, why isn't that good enough?
Tim,
You are right: it is important to emphasize the differences between analog VU metering and digital dBFS metering, not only in the ballistics (quasi-averaging vs. peak reading), but also in the decibel scales used.
Your above quote is extremely close to the absolute truth, if I understand your meaning correctly. However, I think its emphasis on the peak meter still misses what the real fundamental key behind this whole thing is, and it can still lead to an "improper" gain operation.
SHIFTING THE PERSPECTIVE
The key, IMHO, is in understanding the jump from the VU meter scale to the dBFS meter scale, and the fact that there is a calibration, a specific conversion factor, between the two. Once one understands that 0VU = x dBFS, and that "x" is determined by the individual A/D converter, then they can easily understand three important things:
- that since 0VU = line level, there is therefore a level on the dBFS peak meter that is also equal to line level.
- that if they are pushing a good level into the analog side of the converter, they'll usually get a good level coming out the digital side as well.
- that modern-day converters are purposely designed to set line level on the digital side at a point that balances headroom against S/N ratio, and that this line level is several dB lower on the dBFS scale than most initially imagine.
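The three points above all hang on that one conversion factor. Here is a minimal sketch of the idea, assuming a hypothetical converter calibrated so that 0VU lands at -18dBFS (the actual "x" varies by unit):

```python
# Hypothetical calibration: this converter maps 0 VU (line level) to -18 dBFS.
# The real value of "x" depends on the individual A/D converter.
CALIBRATION_DBFS = -18.0

def vu_to_dbfs(vu_db):
    """Map a quasi-average VU reading to the dBFS level it lands at."""
    return vu_db + CALIBRATION_DBFS

print(vu_to_dbfs(0.0))  # -18.0 -> line level also exists on the dBFS meter
print(vu_to_dbfs(3.0))  # -15.0 -> push 3 dB hotter analog, get 3 dB hotter digital
```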
Put this all together, and it really sums up this way: in most cases the ideal digital recording level is set simply by pushing the ideal line level into the analog front of the ADC, letting the converter do its thing, and recording the natural digital levels it puts out without any further gain modification. The only exception would be audio with a crest factor so large that it exceeds the digital headroom in the converter calibration, in which case we get digital clipping. In that case we simply turn down the gain into the analog front of the ADC the dB or three that we may need to make room in the converter.
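That exception case is just arithmetic on the same conversion factor. A rough sketch, again assuming a hypothetical converter calibrated at 0VU = -18dBFS (so about 18 dB of digital headroom above line level):

```python
# Sketch of the headroom arithmetic, assuming a hypothetical converter
# calibrated so that 0 VU (line level) lands at -18 dBFS.
CALIBRATION_DBFS = -18.0  # 0 VU maps here; the real value varies by converter

def needed_analog_trim_db(average_vu_db, crest_factor_db):
    """Return the analog input trim (in dB, negative = turn down)
    needed so the signal's peaks stay at or below 0 dBFS.

    average_vu_db:   average level read on the VU meter (0 = line level)
    crest_factor_db: how far the peaks ride above that average, in dB
    """
    peak_dbfs = CALIBRATION_DBFS + average_vu_db + crest_factor_db
    # If the peaks would exceed 0 dBFS, trim the analog gain by the overshoot.
    return min(0.0, -peak_dbfs)

# Typical material: 0 VU average, 14 dB crest factor -> peaks at -4 dBFS, no trim.
print(needed_analog_trim_db(0.0, 14.0))  # 0.0
# Very spiky material: 21 dB crest factor -> peaks at +3 dBFS, so pull the
# analog input back 3 dB to make room in the converter.
print(needed_analog_trim_db(0.0, 21.0))  # -3.0
```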
By working from the basic concept of the conversion factor itself, a whole ton of simple slap-one's-head information pops out that makes the whole process quite simple, understandable and de-mystified, while ensuring the best possible tracking levels and gain structure at those final stages in the chain.
LOSING THAT PERSPECTIVE
What you said in the above quote does fit that procedure under one particular reading, but there are other readings that also fit your wording and are not correct.
That quote still implicitly treats the VU meter (and therefore the analog line level) and the dBFS meter (and therefore the digital levels) as two rather discrete entities, and the connection to optimum line levels on the dBFS side of the equation is never made. It therefore reduces to "simply watch to make sure you don't clip," with little-to-no reference as to where the rest of the signal should be lying.
With no line level guidance there, it's a small step to the erroneous concept of "as long as you're not clipping, you're OK, so track as hot as you can without clipping." This idea is really a holdover from an old analog truism related to S/N ratio, but for reasons already discussed in this thread, it doesn't actually apply the same way at all to digital. And when we use it, we're left with a gain structure strategy that is not at all the same as the one described from the conversion factor angle.
LOOPHOLES IN GAIN STRATEGY
Additionally, when the line level link is not carried through between the metering systems and they are instead considered separately, the correct idea of adjusting one's final digital level by adjusting the analog signal going into the converter is no longer implied or intuitive. Many folks might be tempted (and, based upon the history of questions on this board, many real folks really are tempted) to reduce any clipping they find by instead pulling back on the digital input levels in their computer interface driver or in their recording software. This may darken the clipping lights on their dBFS meters, but it does not eliminate the actual source of the clipping, namely the converter itself. The levels will be pulled back in the computer after the clipping has already happened, which is of course too late; the damage has already been done.
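A toy demonstration of why that "fix" doesn't work (this is only a crude model of hard clipping, not any real converter's behavior): pulling the level down after the converter has clipped does not restore the waveform, while pulling it down before the converter does.

```python
# Toy model of an ADC input: anything beyond full scale is hard-clipped.
def convert(analog, full_scale=1.0):
    """Model the converter: hard-clip samples beyond +/- full scale."""
    return [max(-full_scale, min(full_scale, s)) for s in analog]

hot_signal = [0.5, 1.4, -1.2, 0.8]   # peaks exceed the converter's range
clipped = convert(hot_signal)         # -> [0.5, 1.0, -1.0, 0.8]

# "Fix" attempt: -6 dB of digital gain in the software, AFTER conversion...
in_daw = [s * 0.5 for s in clipped]   # flat tops are quieter, but still flat

# The real fix: -6 dB on the ANALOG side, BEFORE the converter.
fixed = convert([s * 0.5 for s in hot_signal])

# The two results are not the same signal; the digital trim kept the damage.
print(in_daw != fixed)  # True
```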
And a similar misunderstanding can happen in the other direction: if they see the levels are too low, even based upon a line level RMS reference, they might be (and often are) tempted to rectify that by boosting the digital input gain in the computer. And we all know that all that does is boost the overall digital volume, raising the composite noise floor and reducing the headroom available for the mixing and mastering processes yet to come. Nor is there any increase in resolution when boosting digitally that way, even though we are using more bits; all the digital boost does is add more zeros to the end of the value, it does not increase the precision of that value at all. So there are several reasons why that's not a good idea.
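A minimal illustration of the "adding zeros" point (an integer toy example, not a real DSP chain): a +6 dB digital boost on already-quantized audio is just a doubling, i.e. a left bit-shift, and any noise captured with the signal is boosted identically.

```python
# A +6 dB digital boost on quantized samples is a doubling (left bit-shift).
sample = 91          # an already-quantized sample value
noise = 1            # pretend noise floor of 1 LSB captured along with it

boosted_sample = sample << 1   # +6 dB of digital gain
boosted_noise = noise << 1     # ...which boosts the recorded noise identically

# More bits are used, but the new bit is a trailing zero: no new precision.
print(format(sample, 'b'))          # 1011011
print(format(boosted_sample, 'b'))  # 10110110

# The signal-to-noise ratio is exactly what it was before the boost.
print(boosted_sample / boosted_noise == sample / noise)  # True
```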
AVOIDING THOSE PROBLEMS
All those potential traps are elegantly avoided, and the levels are automatically kept on a reasonably optimum track, when one approaches it from the perspective of averaging the analog input signal into the converter at somewhere around its expected line level, and then letting the converter do its thang naturally.
THE WHOLE -18 THING
And that is where the idea of digitally tracking with an average RMS level somewhere in the mid-to-late negative teens dBFS comes from. Most folks on these boards pick -18dBFS not so much as a magic number, but as a common example that is usually a "close enough" oversimplification for the short, easy answers given to those with hobbyist-level recording interests.
PRO vs. CONSUMER LEVELS
Tim Gillett said:
Still, I think that to generalize about this and make a general prohibition on the top, say, 6 dB of digital recording room simply because it may be a problem in a professional environment due to amplifier equipment limitations is unwise. Many users on this forum don't use pro levels at all.
Don't think of it so much as a "prohibition" on the top dBs as an explanation of a technique which renders impotent most needs and desires for intentionally using them.
And it doesn't matter if one is using a converter input designed for +4dBu line level ("pro" or "commercial" level) or for -10dBV ("consumer" level). Either way, the analog side of the converter considers that the designed 0VU line level for that device and the "sweet spot" for signal operation, and either way the converter will still be calibrated to convert that particular line level to its designated digital dBFS level. It's calibrated around the 0VU level for that device's input. For example, there are some Soundblaster-class cards out there that convert their expected -10dBV input (0VU, as far as THAT card is concerned) to -18dBFS the same way that a given prosumer or pro interface will convert +4dBu (0VU for THAT unit) to -18dBFS.
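To make the point concrete, here is a sketch of the electrical arithmetic, using the standard 0dBu and 0dBV voltage references; the -18dBFS calibration target is an assumption for the example, not a universal figure:

```python
import math

# Two different electrical line levels, each treated as 0 VU by its own
# converter, and each assumed here to be calibrated to land at -18 dBFS.
DBU_REF_VOLTS = 0.7746   # 0 dBu reference voltage (sqrt(0.6) V)
DBV_REF_VOLTS = 1.0      # 0 dBV reference voltage

def dbu_to_volts(dbu):
    return DBU_REF_VOLTS * 10 ** (dbu / 20)

def dbv_to_volts(dbv):
    return DBV_REF_VOLTS * 10 ** (dbv / 20)

pro_line = dbu_to_volts(4)         # "pro" +4 dBu line level, ~1.228 V RMS
consumer_line = dbv_to_volts(-10)  # "consumer" -10 dBV line level, ~0.316 V RMS
print(round(pro_line, 3), round(consumer_line, 3))

# The voltages differ by roughly 11.8 dB, yet each converter's input stage
# is scaled so that ITS OWN line level hits the same digital mark.
difference_db = 20 * math.log10(pro_line / consumer_line)
print(round(difference_db, 1))  # ~11.8
```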
G.