Massive Master, this one's for you

bozmillar

New member
I just read your article on tracking levels, and it makes sense. Here's the link for anyone else to read. http://www.massivemastering.com/blog/index_files/Proper_Audio_Recording_Levels.php

But I have a couple of questions. This is all speculation because I haven't tried it yet, but I feel like there are some exceptions to that advice, especially for home recording guys. I'm going to go out on a limb and speculate that it only really applies to "vintage" type gear and other outboard preamps. I mean, let's take a look at what most people recording at home are using: audio interfaces with a built-in preamp.

The preamp is built into the device and connected directly to the A/D converter. Whoever designs the interface gets to choose what voltage level maps to 0dBFS. Whether 0dBFS is 0.5V or 20V, the user doesn't know, because there's no direct output from the preamp. Since the preamp doesn't have to play nicely with anything but the A/D converter, neither the preamp nor the A/D converter has to be designed to work optimally at +4dBu.

Since people are generally under the assumption that hotter without clipping is better, it doesn't seem unlikely to me that the designers of the interfaces most people use would design their products to actually sound best at higher levels, since that's how people are using them.

Any thoughts?
 
OK, so I read through it again and I see where you mentioned that on most devices you've tested, 0dBVU = -18dBFS. But that still leaves the question of whether or not most modern devices are designed to function best at +4dBu.

I would assume that the reason to stay away from hotter signals is that distortion and frequency response may be different at higher levels, but all the gear I've tested tends to stay pretty linear and flat even at levels that drive my A/D converters close to 0dBFS. If it's still linear, and the frequency response is still flat, then I can't think of anything else that might be contributing to it sounding worse.
 
OK, so I'm most certainly not as qualified to answer as Massive Mastering, but I would advise you to take a look at this:

http://www.sengpielaudio.com/calculator-db-volt.htm

This'll get you up to speed on the relationship between dBV and dBu, which are basically both about voltages in analog circuits, and how neither of them bears any direct relationship to dBFS, which is about the maximum level that can be described numerically in a digital system.

Devices putting out +4 dBu are still nowhere near a level that would register as 0 dBFS in a DAW. +4 dBu is more like +1.8 dBV, which will still probably land at least 12-14 dB below 0dBFS.
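For anyone who wants to sanity-check those numbers, here's a quick Python sketch of the voltage math (dBu is referenced to 0.7746 V RMS, dBV to 1 V RMS; the dBFS part is deliberately left out because it depends entirely on the converter):

import math

DBU_REF_VOLTS = 0.7746  # 0 dBu = 0.7746 V RMS (1 mW into 600 ohms)
DBV_REF_VOLTS = 1.0     # 0 dBV = 1.0 V RMS

def dbu_to_volts(dbu):
    return DBU_REF_VOLTS * 10 ** (dbu / 20)

def volts_to_dbv(volts):
    return 20 * math.log10(volts / DBV_REF_VOLTS)

nominal = dbu_to_volts(4.0)  # the +4 dBu "pro" line level
print(f"+4 dBu = {nominal:.3f} V RMS = {volts_to_dbv(nominal):+.2f} dBV")
# prints roughly: +4 dBu = 1.228 V RMS = +1.78 dBV
# Where that voltage lands in dBFS is entirely up to the converter's calibration.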

Sorry for the hijack, and welcome to the wonderful world of incompatible measurement and metering tools.
 
This'll get you up to speed on the relationship between dBV and dBu, which are basically both about voltages in analog circuits, and how neither of them bears any direct relationship to dBFS, which is about the maximum level that can be described numerically in a digital system.

That's exactly my point. There's no fixed relationship between dBVU or dBu and dBFS; however, Massive did say in his article that with most products he tested, +4dBu landed somewhere between -20 and -12 dBFS. I have no reason to believe that this is not true.

However, there is an assumption made in the article that doesn't have any sort of reference other than "trust me," and that is the part about the analog components having the least distortion and the flattest frequency response at that level. I'm not saying it's not true, but I've never seen anything to support it. I have done tests showing that my gear is just as flat and has lower THD+N closer to 0dBFS than it does at -20, though.
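A rough sketch of one way to estimate THD+N from a loopback capture of a sine tone, for anyone who wants to run the same kind of test (numpy only; the capture arrays, levels, and sample rate are hypothetical):

import numpy as np

def thd_n_db(capture, fs, fundamental_hz, notch_width_hz=20.0):
    # Rough THD+N estimate: window, FFT, notch out the fundamental,
    # then compare the leftover (harmonics + noise) power to the fundamental's.
    windowed = np.asarray(capture, dtype=float) * np.hanning(len(capture))
    power = np.abs(np.fft.rfft(windowed)) ** 2
    freqs = np.fft.rfftfreq(len(capture), 1.0 / fs)
    fund = np.abs(freqs - fundamental_hz) < notch_width_hz
    resid = ~fund
    resid[0] = False  # ignore the DC bin
    return 10 * np.log10(power[resid].sum() / power[fund].sum())

# Hypothetical use: loopback-record the same 1 kHz tone at -20 dBFS and at -3 dBFS,
# then compare thd_n_db(quiet_take, 48000, 1000) against thd_n_db(hot_take, 48000, 1000).
# More negative = cleaner.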
 
Sorry for the hijack, and welcome to the wonderful world of incompatible measurement and metering tools.

And that's no hijack. It's very related to the question.

I should state that I didn't start this thread to try to butt heads with Massive or tell him that his article is not true. I'm just trying to get a better low-level understanding of this idea, which I've heard on more than one occasion, but from the (minimal, mind you) tests I've done, my conclusions don't line up with his.
 
That's exactly my point. There's no fixed relationship between dBVU or dBu and dBFS; however, Massive did say in his article that with most products he tested, +4dBu landed somewhere between -20 and -12 dBFS. I have no reason to believe that this is not true.

However, there is an assumption made in the article that doesn't have any sort of reference other than "trust me," and that is the part about the analog components having the least distortion and the flattest frequency response at that level. I'm not saying it's not true, but I've never seen anything to support it. I have done tests showing that my gear is just as flat and has lower THD+N closer to 0dBFS than it does at -20, though.

Take a look at the manufacturers' specs. It will vary from unit to unit, and that is the crux of the problem.


My Black Lion converters are spec'd "with lots of headroom" to go to +10 dBu before they clip and thus distort the signal, so I try to never go above peaks of -10dBFS on my DAW meters to ensure I don't clip the converters. If I throw an RNC into the mix, which is rated to run optimally (according to the specs) at 0dBV, I have to drop to peaking even lower, even though I can get a much hotter signal out of my preamps.

When I looked at my audio interface, it claims to have no problem with inputs above +20 dBu.

When you mix and match, you have to go with levels that are not going to clip anything anywhere along the signal chain.

If I use the audio interface only, I could theoretically go to 0dBFS (although I find the noise from the pres is non-linear and is much more pronounced in the last 15% or so of added gain), but as soon as I start with the aftermarket converters, compressors, etc., I have to make sure that I drop to peaking at whatever level will cause no clipping anywhere along the chain.

If you shoot for -18 dBFS RMS (which is tough considering most DAWs don't have RMS metering, and even then the ballistics of digital meters versus analog ones make that a moving target too, and the general wisdom seems to be that the meters in most DAWs are awful), I guess you will always be safe and have plenty of mixing headroom. There is also the argument about what level analog-modeled plugins should receive their input at.
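To put that "clip nothing anywhere" idea into a toy calculation (all the device numbers and the converter calibration here are made up, and it assumes unity gain between boxes):

# Toy gain-staging check. Every figure below is hypothetical.
chain_max_dbu = {
    "preamp": 24.0,       # hypothetical max output level
    "compressor": 12.0,   # hypothetical unit that runs out of headroom early
    "converter": 22.0,    # converter full scale: +4 dBu = -18 dBFS means 0 dBFS = +22 dBu
}
ZERO_DBFS_AT_DBU = 22.0   # assumed converter calibration

for name, max_dbu in chain_max_dbu.items():
    print(f"{name:10s} runs out of headroom around {max_dbu - ZERO_DBFS_AT_DBU:+.1f} dBFS on the DAW meter")

safe_peak = min(max_dbu - ZERO_DBFS_AT_DBU for max_dbu in chain_max_dbu.values())
print(f"so keep peaks below about {min(safe_peak, 0.0):+.1f} dBFS, plus a safety margin")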
 
Yeah, but this gain staging takes a lot more thought and effort than most people are generally willing to put into it. Bristol, you made some good points, in that it varies from device to device.

I've always actually wondered if my preamp noise was non-linear with respect to gain. I know it always feels like it is, but I wonder if the noise just gets more noticeable when I crank the gain or if the SNR actually gets worse. I guess that would be a simple test.
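If anyone wants to try that test, here's a minimal sketch of the comparison (numpy; the capture arrays are whatever you record at each gain setting, so the names are hypothetical):

import numpy as np

def rms_db(x):
    return 20 * np.log10(np.sqrt(np.mean(np.asarray(x, dtype=float) ** 2)))

def snr_db(tone_capture, silence_capture):
    # SNR = level of the recorded tone minus level of the recorded noise floor,
    # both captured at the same gain setting.
    return rms_db(tone_capture) - rms_db(silence_capture)

# Hypothetical test: feed the preamp the same source at low gain and at high gain
# (padding the source down so the recorded tone lands at the same dBFS), and record
# a few seconds of input-terminated silence at each setting.
# If snr_db(tone_lo, silence_lo) is about equal to snr_db(tone_hi, silence_hi), the
# noise is just being turned up along with the signal; if the high-gain number is
# clearly worse, the preamp's SNR really does degrade as you add gain.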
 
Yeah, but this gain staging takes a lot more thought and effort than most people are generally willing to put into it. Bristol, you made some good points, in that it varies from device to device.


Yup, so you either have to know every piece of gear you have inside and out, or, if you don't want to make the effort, go for -18dBFS and you should pretty much always be covered, no matter what weird mix-and-match gear you have.
 
Whether 0dBFS is 0.5V or 20V, the user doesn't know, because there's no direct output from the preamp.
Sometimes they can know, sometimes not. Unfortunately, there is no standard for just what specs are "supposed to" be published. Because of this, many of these entry-level interfaces leave important specs out of their published sheets, or, if they do put them in, they don't use standardized terms for some of them. This does indeed make it less than easy to figure out just what the calibration between analog dBu/VU and digital dBFS actually is for many of these devices.

But all is not entirely lost. On many devices there is a spec worded something along the lines of "Maximum Output Level" that is given in dBu. Assuming that the maximum output level corresponds to a conversion to 0dBFS digital - which not only makes sense (it would make no sense for it to mean anything else) but has been verified on a couple of models at least - and assuming that the unit is set to accept a typical +4dBu line level (which usually is properly spec'd), one can take that "Maximum Output" spec, subtract 4dB from it, throw a minus sign on the front of the result, and you have the dBFS calibration level for +4dBu on that converter.

For example, if you have an interface that's rated at a "Maximum Output Level" of +24dBu, that would mean that an input of +24dBu to the converter would convert to 0dBFS. If you subtract 4 from that number, that gives you a 20dB difference between +4dBu and the maximum level, which means there are 20dB between +4dBu and 0dBFS - i.e., +4dBu converts to -20dBFS on that particular converter.
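In code form, that same back-of-the-envelope arithmetic (keeping the same assumption that the published maximum level corresponds to 0dBFS) is just:

def plus4_calibration_dbfs(max_level_dbu):
    # Assuming the published maximum level corresponds to 0 dBFS,
    # this is where a +4 dBu signal should land on the digital meters.
    return -(max_level_dbu - 4.0)

print(plus4_calibration_dbfs(24.0))  # -20.0, the example above
print(plus4_calibration_dbfs(22.0))  # -18.0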

It's also interesting to note that the calibration has gotten lower as the technology has improved. While -14dBFS has been a European standard for years (one of about a half-dozen different "standards", BTW), -15dBFS was the DAT (and ADAT) "standard", -18dBFS the AMPEX-proclaimed "standard" common to much American-market gear over the past decade or so, and there is now an increasing tendency, especially among stand-alone DAWs (but not limited to them) coming out of the Pacific Rim, to go as low as -20 or -22dBFS.

Instead of having to explain all that to everyone every time it comes up in a forum post, it's usually pretty safe (and much easier) to just average them out at -18dBFS, which works well enough for just about any converter.

I'll avoid the analog side of the maximum advisable level argument for the moment, because there's almost no end to that carousel ;). But on the digital side, in these days of 24-bit recording handling what is probably, at very best, 70dB of analog range between noise floor and peak, there is not only no harm in lower conversion levels like that, it's most likely preferable, because it gives one that much larger a digital palette to work with once they do get into the digital domain.
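Rough numbers on that, using the theoretical ~6dB per bit (nothing real-world about the figures, just the arithmetic):

BITS = 24
theoretical_digital_range_db = 6.02 * BITS  # roughly 144 dB for 24-bit
tracking_peak_dbfs = -18.0                  # a conservative conversion level
analog_range_db = 70.0                      # the "at very best" analog range above
# Even with peaks way down at -18 dBFS, there is still ~126 dB of digital range
# underneath them - roughly 56 dB more than the analog chain can deliver anyway.
print(theoretical_digital_range_db + tracking_peak_dbfs)
print(theoretical_digital_range_db + tracking_peak_dbfs - analog_range_db)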

G.
 
Ignoring specs -- Don't get me wrong - I love specs. But specs are such a pest sometimes...

Plenty of preamps will go up to ridiculous levels without clipping with a simple sine wave (as they are tested). But we're talking about signals of extreme complexity, timbre and dynamics when recording a guitar or a human voice, the thwap & thud of a drum, etc., etc.

The problem of course, is that measuring 'distortion' in such complex signals is nearly impossible to do effectively. And it doesn't really sound like *distortion* until the circuit is on the verge of failure.

And that's where I start using atypical descriptives like "clarity" and "focus" and such. But I think they're good descriptives...

I also have to admit that a lot of this *is* on a "trust me" basis - I've not done a lot of scientific experimentation on this... Much of it, though, seems very 'common sense' if I may be so bold. The rest is based on dozens and dozens - most likely hundreds - of e-mails, phone calls, letters with recordings (etc.) I receive on a regular basis from people who are suddenly making rather nice sounding recordings by doing nothing different than allowing for some headroom. And they're always pumped about how much easier it is to mix, how much more responsive their plugs are, etc., etc. Of course, I can back that up with experience there myself ("trust me").

And no doubt - the difference is far more dramatic with "budget-friendly" gear than with "really nice" gear (that truly does distort less at higher levels). Not that it doesn't help some with the "really nice" gear also...
 
OK, that makes sense. I'm willing to accept that the vast majority of products will be calibrated close enough to +4dBu = -18dBFS that I don't have to worry about that point.

OK, after going back through my old EE books, I've decided that I'm sold on the idea. I may, if I don't lose interest first, create some test tones and post some pictures to show why this is the case - partly because I think it would help a lot of people to see pictures, and partly because I want to see for myself what my op-amps do when I throw tough signals at them.
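Generating the tones themselves is the easy part - a minimal Python sketch (numpy/scipy; the file name, frequency, and level are arbitrary):

import numpy as np
from scipy.io import wavfile

def sine_tone(freq_hz, peak_dbfs, seconds=5.0, fs=48000):
    # A sine test tone whose digital peak sits at peak_dbfs (0 dBFS = 1.0 full scale).
    t = np.arange(int(seconds * fs)) / fs
    amplitude = 10 ** (peak_dbfs / 20)
    return (amplitude * np.sin(2 * np.pi * freq_hz * t)).astype(np.float32)

# e.g. a 1 kHz tone peaking at -18 dBFS, written out as a float WAV for a loopback test
wavfile.write("tone_1k_minus18.wav", 48000, sine_tone(1000.0, -18.0))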
 
Eh, I'd almost save the time. I've had a bunch of people (of the dozens/hundreds or whatever) send in these "OMG!" screen caps of the distortion developing far before the circuit fails. And no doubt, that "inaudible" (sort of) distortion can add up to a whole lot of mush (and loss of "clarity and focus") when you have 20 tracks lumped together.

But still - They're doing these experiments with sine waves. It's ridiculously simple to show distortion in a sine wave. Not so with a complex tone.

But don't let me stop you -- I suppose "seeing" it with such a simple tone might really drive it home when you consider a complex sound...
 
There are plenty of test signals you can use other than a simple sine wave: multitones, swept sine waves, AM multitones, pink noise, impulses. A lot of these are far harder on the gear than music is.
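A couple of those are easy enough to knock up in Python, for anyone curious - a sketch only, nothing calibrated:

import numpy as np

def multitone(freqs_hz, peak_dbfs=-12.0, seconds=5.0, fs=48000):
    # Sum of equal-amplitude sines with random phases, scaled so the combined
    # peak sits at peak_dbfs. The high crest factor is what makes this tougher
    # on gear than a single sine.
    t = np.arange(int(seconds * fs)) / fs
    rng = np.random.default_rng(0)
    x = sum(np.sin(2 * np.pi * f * t + rng.uniform(0, 2 * np.pi)) for f in freqs_hz)
    return (x / np.max(np.abs(x)) * 10 ** (peak_dbfs / 20)).astype(np.float32)

def swept_sine(f_start, f_end, seconds=10.0, fs=48000, peak_dbfs=-12.0):
    # Logarithmic sine sweep from f_start to f_end.
    t = np.arange(int(seconds * fs)) / fs
    k = np.log(f_end / f_start) / seconds
    phase = 2 * np.pi * f_start * (np.exp(k * t) - 1) / k
    return (10 ** (peak_dbfs / 20) * np.sin(phase)).astype(np.float32)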
 
There are plenty of test signals you can use other than a simple sine wave: multitones, swept sine waves, AM multitones, pink noise, impulses. A lot of these are far harder on the gear than music is.

But none of them tell you how a preamp or interface is going to react to instrument A at X decibels.

Don't make the mistake of thinking that because you're recording in the digital realm, everything is perfect until clip. That's not entirely true. Remember, your analog-to-digital converters are analog devices, and analog devices do not always behave in the same manner at every gain level.
 
What does pink noise or a frequency sweep tell you about tonality, dynamics, detail, soundstage, or clarity?

OK, I'm willing to start a discussion about this, as long as it doesn't turn into a "numbers don't mean anything" argument. I guess I'll start out by asking what tonality, dynamics, detail, soundstage, and clarity mean. What do they actually mean?
 
Just chiming in as one more voice to say I completely agree with Massive's position on this stuff. Over the years, having worked on several hundred albums, I have seen so many recordings harmed or even made unusable by the quest for hot levels, and I have never seen one ruined by recording with conservative levels.

I have seen hot levels wreck tracks on Mboxes and on SSL and Neve consoles running into Pro Tools HD systems.
 