question about tracking too hot......

  • Thread starter: dastrick
Also, some sound cards have a software input level control, which is not useful because you can run into exactly the problem you describe, where the signal distorts below digital zero.
That may be the issue. There are no hardware pots on my interface, only software controls for gain/attenuation.



If your synth can distort at intermediate stages, that is totally unrelated to how hot people should record in a DAW.
The point I was making is that I can set the gain within the instrument so that it's hot, without any digital clipping or analog distortion, and still distort the inputs on my audio interface.

Glen, the K2600's outputs are +4dBu line level. Nothing esoteric, at least as far as the printed specs in the manual indicate.
 
That may be the issue. There are no hardware pots on my interface, only software controls for gain/attenuation.

Yes, this most likely is the problem.

Again, and to keep this focused, people often suggest setting up gear so that 0 VU on the mixer's output meter equals -20 on the DAW's record meter. But this is exactly the sort of level calibration that leads to the troubles you report. If you can't record signals cleanly right up to Digital Zero, then something is very wrong.

My mixer and sound card are set up as shown in this article from EQ Magazine:

Using a Mixer with a DAW

My M-Audio Delta 66 inputs are set for +4, which hits digital zero somewhere around +8 as I recall. My Mackie 1202 direct outs are rated up to +22. So getting a good clean signal is as simple as adjusting the mixer's preamp trim for a good level that doesn't clip. It's a total no-brainer that always works perfectly. This is why I don't understand the obsession with "calibration," at least in the context of a home studio using a small format mixer and a computer sound card.

--Ethan
 
My M-Audio Delta 66 inputs are set for +4, which hits digital zero somewhere around +8 as I recall. My Mackie 1202 direct outs are rated up to +22. So getting a good clean signal is as simple as adjusting the mixer's preamp trim for a good level that doesn't clip. It's a total no-brainer that always works perfectly. This is why I don't understand the obsession with "calibration," at least in the context of a home studio using a small format mixer and a computer sound card.
Plus 8 what? +8dBu or +8VU?

If it's +8 dBu, I'd bet the Delta is ignoring your input level selection and is still actually assuming -10dBV internally, in which case a +8dBu summit would pretty much make sense, with about a 16dB spread from -10dBV to +8dBu. Either that, or there's something else seriously wrong with the Delta (which wouldn't surprise me either).

If you mean +8VU, then I don't see an issue.

What's a self-described gear snob doing with an M-Audio Delta 66 to begin with? That's as much of a non sequitur as someone who claims to eat only the finest gourmet food getting caught scarfing down a Big Mac.

G.
 
Just to clarify, by "ignoring the input selection" what I'm thinking is that the selection simply sets the analog input stage to expect +4dBu and provides the proper high-Z input impedance for it, but the switch doesn't actually change the internal converter calibration, which by default expects consumer-level signals and is set to somewhere around -10dBV = -16dBFS. This would cut the "headroom" between 0VU and clipping on a +4dBu signal by almost 12dB, putting the clip point at only about +8dBu.
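To put rough numbers on that, here's a minimal sketch of the arithmetic (the 0.7746V and 1.0V references are the standard dBu/dBV definitions; the -16dBFS calibration is just the figure claimed above):

```python
import math

# 0 dBV = 1.000 V RMS and 0 dBu = 0.7746 V RMS, so 0 dBV = +2.21 dBu
def dbv_to_dbu(dbv):
    return dbv + 20 * math.log10(1.0 / 0.7746)

nominal_dbu = dbv_to_dbu(-10)      # -10 dBV is about -7.8 dBu
clip_dbu = nominal_dbu + 16        # -10 dBV = -16 dBFS puts 0 dBFS near +8.2 dBu
room_above_plus4 = clip_dbu - 4    # only ~4 dB left above a +4 dBu signal

print(round(nominal_dbu, 1), round(clip_dbu, 1), round(room_above_plus4, 1))
# -7.8 8.2 4.2 -- which lines up with the "clips around +8" reports
```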

G.
 
What's a self-described gear snob doing with an M-Audio Delta 66 to begin with? That's as much of a non sequitur as someone who claims to eat only the finest gourmet food getting caught scarfing down a Big Mac.

:D

I think he's got you on this one Ethan! ;)

Now I may never give up my analog tape! :p



Somebody rep SouthSIDE. I'm all repped out at the moment.

Got it. :)
 
Plus 8 what? +8dBu or +8VU?

I don't remember. :D I spent half an hour measuring voltages, etc., last year for a Gearslutz post. I thought I saved it but now I can't find it. The upshot is my preamp can put out 10 to 15 dB more than the sound card needs to reach Digital Zero, which gives me a strong clean signal with no noise beyond what the microphone picks up in the room.

What's a self-described gear snob doing with an M-Audio Delta 66 to begin with? That's as much of a non sequitur as someone who claims to eat only the finest gourmet food getting caught scarfing down a Big Mac.

I'm not a gear snob, if that's what you mean. But there's nothing wrong with my Delta 66. Did you watch my AES Audio Myths video?

This would cut the "headroom" between 0VU and clipping on a +4dBu signal by almost 12dB, putting the clip point at only about +8dBu.

I'm not sure what you mean, but headroom in a DAW is irrelevant. Unlike analog tape where distortion creeps up slowly, digital is clean right up to hard clipping. So any "headroom" over hard clipping is never used anyway.

--Ethan
 
I'm not sure what you mean, but headroom in a DAW is irrelevant. Unlike analog tape where distortion creeps up slowly, digital is clean right up to hard clipping. So any "headroom" over hard clipping is never used anyway.
Sometimes I just don't understand the disconnect we have in communication, Ethan. For the record, I think you're a pretty smart guy, and I like the products you put out at Real Traps (and recommend them often).

But on some of these subjects I feel like I (and others) must be talking in Pandoran, because some simple concepts that should be pretty easy and obvious to a guy of your education just don't seem to be coming across.

Your converter clips at +8dBu (which is the only measurement that makes sense in this situation, actually). OK, we agree on that. That's because the Delta 66 converter is designed to expect a consumer-class input level revolving around -10dBV. When that is the case, the converter "calibration", or exchange rate if you prefer, is that a 0VU signal at -10dBV will convert to about -16dBFS on the digital side. This leaves a range of 16dB between 0VU analog and digital peak clipping, which is about average (maybe just a tad low) for converter designs out there these days; those run anywhere from 14dB for the EBU standard, to 15dB for the DAT/ADAT standard, to 18dB for the AMPEG standard, to as much as 20-22dB for many of the newest designs coming mostly out of the Pacific Rim. The 16dB on the Delta 66 falls fairly nicely in that range.
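As a quick hypothetical sketch of where 0dBFS would land in analog terms for those headroom figures, assuming a +4dBu/0VU reference at the converter input (the labels just follow the standards as named above):

```python
# clip point in dBu = nominal level + headroom between 0 VU and full scale
standards_db = {"EBU": 14, "DAT/ADAT": 15, "AMPEG": 18, "newest designs": 22}

for name, headroom in standards_db.items():
    print(f"{name}: 0 dBFS at about {4 + headroom:+d} dBu")
# EBU: 0 dBFS at about +18 dBu ... newest designs: 0 dBFS at about +26 dBu
```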

The lower ranges tend to be the older "standards" and/or on machines where 16-bit (~96dB) digital canvases are used or assumed, and as 24-bit and higher canvases have become more the standard, the 0VU-to-clipping range has increased to take advantage of the increased canvas size.

The wild card in your situation - and what I consider to be a poor compromise design on M-Audio's part - is that the Delta 66 offers a switchable input pad between -10dBV and +4dBu, which seems on the surface to be a nice feature. The problem is that it only changes the design parameters of the analog input itself, but apparently does not change the nature of the converter. The converter's calibration or exchange rate stays the same.

This means that if one is sending the 66 a +4dBu signal that happens to be RMSing anywhere near 0VU (we won't argue a few dB here or there), which is common in analog chains with good gain structure, one is left with a possible crest factor of only 4dB before clipping in the 66's converter.

Now your argument, if I understand it right, is "What's the big deal? I just throttle the output busses on my 1202 by some 12dB or more, so that the input to the 66 is basically the same voltage level as if I were feeding it a -10dBV signal or less instead, and I have plenty of room before I clip that way."

And you're right, that will work. But it's an extraneous step that does nothing musical, only manages the signal, and is made necessary only by the halfway design of the 66. If you had a true +4dBu interface, you'd never have to do that and the signal levels would flow much more naturally.

And beyond that, the 66 forces you to "change the rules" (for lack of a better term offhand); it creates confusion by interrupting the continuity that is otherwise built into the entire analog/digital signal chain. I'm starting to think that may be what's happening here: that you're not seeing the bigger gain structure continuity picture I described because the halfway design of your Delta 66 is clouding your vision. I don't blame you, I blame the Delta 66 for that.

The Delta may sound fine to you, and honestly I'm happy that you are happy with it. But it does throw a wrench into the beautiful symmetry and continuity that has been pretty purposely engineered into general gear and signal level design on both sides of the converter. This symmetry/continuity is not just there to look pretty to geeks like me; it has purpose. It provides a path of least resistance for delivering a signal from the microphone or instrument all the way through conversion and digital mixing to the final digital 2mix, and that usually results in the best possible sounding 2mix the gear can deliver (assuming a human pilot that understands all that, anyway).

There is an operational synergy there that is not immediately apparent from the technical specs of the individual links in the chain alone. The physical evidence confirms it on a regular basis: clients and students of folks like me, Jay, or John (amongst many others) constantly report back that paying attention to that path - by not pushing their levels on either the analog *or* digital side - has indeed improved the sound of their mixes and allowed for better mastering results (not that any of us really needed that confirmation once we figured it out in our own work).

HTH,

G.
 
My M-Audio Delta 66 inputs are set for +4, which hits digital zero somewhere around +8 as I recall. My Mackie 1202 direct outs are rated up to +22. So getting a good clean signal is as simple as adjusting the mixer's preamp trim for a good level that doesn't clip. It's a total no-brainer that always works perfectly. This is why I don't understand the obsession with "calibration," at least in the context of a home studio using a small format mixer and a computer sound card.

--Ethan
That explains why you aren't experiencing the same thing a lot of other people are. If your converters are clipping at +8, you couldn't possibly be running the preamps anywhere near 0VU.

If you were using a set of converters that were actually designed to receive a +4 signal, the whole world would be different. I'm not even talking high-end stuff; I use MOTU interfaces. 0VU = -18dBFS, meaning my converters clip at +22dBu. That makes a big difference, and that's probably why you are not noticing the same problems a lot of other people do.
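The arithmetic behind that, as a minimal sketch (the -18dBFS calibration is the MOTU figure quoted above; the -4dBFS line uses the Delta 66 translation discussed earlier in the thread):

```python
def clip_point_dbu(nominal_dbu, cal_dbfs):
    # a 0 VU signal at nominal_dbu converts to cal_dbfs, so analog full
    # scale sits abs(cal_dbfs) dB above the nominal level
    return nominal_dbu + abs(cal_dbfs)

print(clip_point_dbu(4, -18))  # 22 -> the MOTU runs out at +22 dBu
print(clip_point_dbu(4, -4))   # 8  -> the Delta 66 runs out at +8 dBu
```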
 
The wild card in your situation - and what I consider to be a poor compromise design on M-Audio's part - is that the Delta 66 offers a switchable input pad between -10dBV and +4dBu, which seems on the surface to be a nice feature. The problem is that it only changes the design parameters of the analog input itself, but apparently does not change the nature of the converter. The converter's calibration or exchange rate stays the same.


Glen,
You say the Delta 66 "only" changes the parameters of its analog input circuit. But that's what it needs to do! It's the analog input that must deal with the real world signal sent to it and so it's the analog input parameters that need to be changed!
It's the same as a "trim" control on a pre. The converter's "exchange rate" WRT its input side HAS been changed!

IMHO

Tim
 
Glen,
You say the Delta 66 "only" changes the parameters of its analog input circuit. But that's what it needs to do! It's the analog input that must deal with the real world signal sent to it and so it's the analog input parameters that need to be changed!
It's the same as a "trim" control on a pre. The converter's "exchange rate" WRT its input side HAS been changed!

IMHO

Tim
Well, yeah, Tim, you're kind of right, I suppose. But the "exchange rate" has not changed: he's still going to clip at +8dBu, -10dBV is still going to translate to ~-16dBFS, and +4dBu is still going to translate to ~-4dBFS no matter what.

It still means that if you actually try feeding a nominal +4dBu signal into it, the converter is going to run out of room way too fast. It requires extra signal management, either on the output of the upstream device or on the input of the 66, on the order of a good 12dB of padding or more, in order to keep away from converter clipping. That should not be necessary. It's like requiring the operator to make three left turns instead of a simple right turn. Sure, you wind up on the same street going in the same direction, but it's still kind of a waste.

Not to mention the confusion in understanding the forest of gain flow that the trees of such signal gymnastics seem to cause in the minds of many.

I could see it making more sense if they simply hard-wired a 12dB pad into the 66 between the input and the converter on the +4 side, making it all transparent to the user. But again, that's just lengthening the signal chain.

I'd much prefer an integrated interface that defaulted to +4 and allowed a direct pass-through at nominal +4 levels, treating -10 as the optional setting. Or better yet, just stick to a +4-only device, as most of us will never have a -10 feed to worry about unless forced to use the tape outs on an inexpensive mixer as the main feed.

G.
 
Glen, either you or I am right on this. There's no wiggle room here.

Again, why is it possible on a pre to adjust the trim control and accommodate a huge range of inputs, from mic levels to well beyond that? It doesn't compromise the performance. Surely you know this from experience. It's basic.

The higher pro level voltages are conducive to the wider dynamic 'canvas' you speak of. Always have been. But once you have the pro level circuitry, it's dead easy from a design point of view to add a boost of analog input sensitivity to accommodate the lower consumer level coming in. It's not a compromise!

Just because a piece of audio hardware also accommodates consumer input levels and sockets doesn't mean it's "a lesser being" compared to one which doesn't. In fact it's more flexible.
 
This is going in an awful lot of confusing directions.

Let's go with a car analogy -

You've got a motor that will blow up at 10,000 RPM (sure, that's high but just for the sake of simplicity and the metric system, 10kRPM).

That's the CLIP level - The moment of failure. The spot the engine cannot exceed. And let's equate this to an ANALOG failure - This entire thing has NOTHING TO DO WITH DIGITAL LEVELS FROM THE START. The only line being drawn is the whole "Digitally, that'll fall around -(xx)dBFS" analogy.

That engine is a preamp. [SELF-CENSORED] digital for the moment - Let's assume you have a perfectly decent converter that will run perfectly linear and clean up to full-scale (although I've had several that won't).

Now does it sound like a good idea to ANYONE to run that engine at 9,999 RPM consistently? Of course not. Just because it's right on the edge of failure, it hasn't been running efficiently for several kRPM now.

It's probably running most efficiently at maybe 3kRPM. That's where the power and fuel consumption find a 'harmony' of sorts. Nothing is 'complaining' - You're cruising on the highway at 65MPH while the motor is humming along nicely at around 3kRPM, your gas mileage is as good as it will get, the horsepower required to keep you at this speed is nominal and typical, there's no worry about overheating - "It's got a lot of miles, but they're all highway miles" if you know what I mean. Lower than that, you're not getting the best mileage, but it's still all fine. Higher than that, you're going to start losing mileage, building heat and straining the parts. When you get up to 10kRPM, it starts liberating parts on the roadway.

3kRPM is analogous to line level here. That's where the motor really runs well. Line level is going to be about 1.25 volts. That's where the gear is spec'd, that's where it's designed to run.

10kRPM is where the motor fails - It's analogous to the clip level (where the circuit fails). Figure it to be around 4.5 volts. Nearly 400% of the voltage the system is designed to run at before complete failure of the circuit.
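For anyone who wants to check those voltages, a small sketch (0.7746V is the standard 0dBu reference; the 1.25V and 4.5V figures are the ballpark numbers from this analogy):

```python
import math

def volts_to_dbu(volts):
    return 20 * math.log10(volts / 0.7746)

print(round(volts_to_dbu(1.25), 1))  # ~ +4.2 dBu: nominal line level
print(round(volts_to_dbu(4.5), 1))   # ~ +15.3 dBu: the example clip point
# 4.5 V / 1.25 V is about 3.6x the design voltage -- the "nearly 400%" above
```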

Now maybe some people's preamps sound exactly the same whether running at 100% or 300% of their spec'd voltage. I've heard very few that do. I don't want to point any fingers, but I'll go ahead and do it - Roland 2480's. Listen to those preamps at -18dB(FS)RMS and they sound decent enough. Push them up to -6 -- FAR short of the PREAMP clipping and HALF the voltage before the CONVERTER starts clipping -- and they sound like complete garbage. The analog circuit is far from failing, but it isn't running efficiently anymore. CLIPPING and DISTORTION are two completely different things.

Another? M-Audio DMP3's. Really, a pretty nice sounding preamp - until you push it a little too hard and it gets very harsh & edgy and "spectrally compromised" (and before someone asks, what I mean is that the spectrum is affected dynamically by the level of the signal - the low end might "blossom" while the high end dulls out; on some preamps, the low end can 'shrink' and the high end will sound like it's being fed into a maul-the-band compressor). It's not clipping - it's complaining about being pushed too hard.

And a lot of people won't notice one preamp being pushed too hard on one track. That's why I always insist on experimenting with groups of tracks (a.k.a. 'songs'), where that little bit of "schmutz" (if that's how it's spelled) can add up to a whole lot of "yeech" when you have 20 tracks running together...

Headroom is a wonderful thing - Preamps love it, car engines love it, the tires love it, speakers and amplifiers love it - and wonderful photos can be captured with the light from a single candle.
 
Glen, either you or I am right on this. There's no wiggle room here.

Again, why is it possible on a pre to adjust the trim control and accommodate a huge range of inputs, from mic levels to well beyond that? It doesn't compromise the performance. Surely you know this from experience. It's basic.

The higher pro level voltages are conducive to the wider dynamic 'canvas' you speak of. Always have been. But once you have the pro level circuitry, it's dead easy from a design point of view to add a boost of analog input sensitivity to accommodate the lower consumer level coming in. It's not a compromise!

Just because a piece of audio hardware also accommodates consumer input levels and sockets doesn't mean it's "a lesser being" compared to one which doesn't. In fact it's more flexible.
I've had a delta series card. It does not change the level when you change the setting from -10 to +4. It still acts like a -10 input. It's very frustrating when everything else in your chain is +4.
 
I've had a delta series card. It does not change the level when you change the setting from -10 to +4. It still acts like a -10 input. It's very frustrating when everything else in your chain is +4.

I'm sure it is frustrating, but I can't comment on that card. If the card's instructions say its input sensitivities can be changed, you would expect that it would actually do that. If it can't, that sounds like false advertising, doesn't it?

But if I'm wrong on this in principle, someone more knowledgeable will surely chime in and correct me.

Of course the other issue is balanced versus unbalanced.
 
Glen, either you or I am right on this. There's no wiggle room here.
I see lots of it. So far there is only one thing (other than this statement ;) ) you said that I disagree with.

What's happening here is a typical forest vs. trees perception issue. It is entirely possible for you to say correct things about the trees but overlook the larger nature of the forest. Neither the tree people nor the forest people are necessarily wrong within the scope of their points and perceptions, yet they don't agree upon the overall description of the whole thing.
Again, why is it possible on a pre to adjust the trim control and accommodate a huge range of inputs, from mic levels to well beyond that? It doesn't compromise the performance.
The preamp is there on the mic side to accommodate a variety of microphones with a variety of differing output voltages and output impedances. Before that pre, the whole line level continuity thing in the signal path does not yet exist; this is where it starts. On the line-in side of the circuit, the "pre" is a simple trim control meant to a) accommodate natural fluctuations in average signal strength from device to device (e.g. the previous in-line EQ setting wound up dropping the signal strength a bit, but it did not have an output gain control to bring things back up to snuff), and b) allow the user to take purposeful advantage, if they wish, of the differing personalities of the circuit "color" (e.g. I want to overdrive this pre because it sounds really k3wl to do so). I call it purposely gaming the gain structure, and it is a key element in quality audio engineering.

You know that already, of course; I'm just setting up how that's different from being forced to negative-pad the signal by 12dB just because you're throwing a device not designed to work at +4 into the signal path (the fact is, the Delta 66 is a consumer-level device at its core, the +4 switch notwithstanding). It's the odd man out.

Now, if you have no choice in the matter, then you have no choice in the matter. But there is no good reason why one's interface should be at -10 unless they are working strictly with consumer or gaming gear. So there is no good reason why one should have to adjust for it. The Delta 66 does not belong in a pro chain.

Does it hurt technically or electronically for it to be in there? No, not much, if at all. On that we do agree, and I have already stipulated to that. But try to explain gain strategy to Ethan - let alone a newb - when they have gear that's swinging voltages all over the place. As this thread shows, it makes it next to impossible to gain a holistic understanding. And *that* itself is, IMHO, damaging enough. I say that based upon my experience with these very folks learning all about this stuff: when they finally do get it, amazingly enough, their tracks and mixes improve significantly in quality.

But it goes beyond that, even for those like you and I who do get this stuff. Why should you or I have to even worry about what gear we have in line and adjust our gain structure so *arbitrarily* to account for it? KISS - Keep It Simple, Stupid. I don't want to have to worry about whether my interface thinks that 0VU is the same thing the rest of my gear thinks it is. 0VU has *meaning*. And it only makes sense (to me anyway) that that meaning should hold all the way down the chain and not change with every piece of gear the signal passes through. This is not so much about +4dBu being the perfect signal voltage as it is about 0VU holding the same meaning.

0VU just happens to play out as a solid reference all the way through the digital mixdown. Not because the digital domain cares about the level; as long as we keep it between the ditches of noise and clipping, we're OK, sure. But when you have a conversion of something like 0VU = -18dBFS (just for example), that happens to be just about the sweet spot (give or take a fudge factor of a few dB to taste, of course) on the digital canvas: about as high as one can get above the digital floor to accommodate the bulk of the signal without digitally adding to the analog noise floor, while still leaving enough room for peaks to stretch out before clipping. And when you consider that the true RMS of most analog signals is actually some handful of dB below 0VU, that means they will convert to an actual digital RMS a few dB below that conversion level. Still room at the bottom, even more room for peak crest factor. These converter calibrations were purposely picked *to make sense*.
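As a rough illustration of that sweet spot (a sketch only; the -18dBFS calibration and the few-dB-below-0VU RMS figure are just the examples from this post):

```python
cal_dbfs = -18       # example calibration: 0 VU converts to -18 dBFS
program_rms_vu = -4  # program material often RMSes a few dB below 0 VU

digital_rms = cal_dbfs + program_rms_vu  # ~ -22 dBFS, well clear of the floor
peak_room = -cal_dbfs                    # 18 dB of crest-factor room above 0 VU

print(digital_rms, peak_room)  # -22 18
```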

Then take those tracks to mixing. When the engineer knows all this stuff and knows enough to keep the general overall levels balanced more or less around this idea, two things happen: there is less work for him because he has less to worry about in terms of signal management (keeping stuff between the ditches), and he can concentrate more on mixing the music unburdened (or at least less burdened) by such worries. It's a lot easier to keep the car on a rough and bumpy road when one picks the center lane to drive upon.

And for most pop/rock/country/blues/etc. mixes (I'm not necessarily including the extremes like metal/core or dance/trance), when they get to the mixdown/summing stage - digital or analog - the mixes that just naturally RMS out to somewhere around or just below the converter conversion level are usually the ones that wind up with the highest sonic quality *and* take to the mastering stage better. When one realizes that, one realizes that that level still holds a similar meaning, even after all the digital manipulation.

So yeah, you guys are right when you make some localized technical points that padding a dozen dB here or gaming the digital input there does not in and of itself necessarily mangle the signal. But my counterpoints are three: a) there is no reason for it to be that way other than inefficient device design or operator error; b) understanding and working the system as an integrated drive train, with 0VU as the calibrating value from beginning to end (that does not mean everything is recorded right at 0VU, only that it's a constant standard against which signal values are judged), makes for a simpler, easier, and more elegant process; and c) it makes for a robust, holistic approach that by its very nature keeps the signal on a line that winds up optimizing the eventual mix quality with a minimum of conscious concern, which is not only better for veterans, but helps the newbs grasp the whole thing and make better mixes by a noticeable amount.

Is it necessary to look at it that way? No. I'd bet there are a large number of veteran engineers who never gave it that much conscious thought, frankly. But I'd also bet that they never had to explain this stuff in a forum either ;), and it's a very effective way of explaining it to those that don't already have pre-conceived ideas. It's like the four dimensions approach to mixing. Many don't look at it that way, but that doesn't change the fact that they are indeed mixing in those four dimensions; the core truth of it remains even if they are not conscious of it.
The higher pro level voltages are conducive to the wider dynamic 'canvas' you speak of.
Unless you and I mean two different things by "canvas" (I'm referring to the 140-some-odd dB that a 32-bit recording affords us), they have nothing to do with each other. It's the converter that decides how any given input voltage maps onto the digital canvas; the converter is the key to the whole thing.
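(For what it's worth, a quick check on that canvas figure; the 24-bit-mantissa detail of 32-bit float audio is my assumption here, not something stated in the thread:)

```python
import math

# 32-bit float audio carries a 24-bit mantissa; within a single scaling
# that gives roughly 20*log10(2**24) of usable range
print(round(20 * math.log10(2 ** 24), 1))  # ~ 144.5 dB, the "140-some-odd"
```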

G.
 
where that little bit of "schmutz" (if that's how it's spelled) can add up to a whole lot of "yeech"
I was trying not to get that technical about it, John. Thanks for crossing that breach for me ;) :D.

j/k :D

G.
 