question about tracking too hot......

  • Thread starter: dastrick
What was difficult about it?

I'm assuming even though you tracked real hot, the tracks were OK quality-wise...so when you started mixing, why did they give you problems?
I mean....once you had them in the DAW, the actual levels would be irrelevant. You could raise or lower them without any degradation.
Not a levels issue. Sure, I'd have to bring things down so as not to clip the master bus, but that wasn't the issue. It was simply more difficult to have tracks fit into their own space. I would EQ things a lot more, for example. Things were more muddy.

Once I started to track at lower levels all of a sudden I had more clarity in the overall mix. Hard to put my finger on it really.
 
What about the "this sounds good, and this sounds like crap" theory? If it works and sounds good what the hell is the difference? It is our ears we are dealing with, we are bot robots!:)
 
IMP better than NEVE

IMP preamps just beat the living crap out of Neve, SSL or any other "pro" preamp... It's all just marketing hype. Get a £30 preamp with a £20 mic and you've basically got a better sound than a Neve and a U87...
Just my two cents..
(wow, you guys know about basic physics, you're all so smart, yes we have heard of Schroedinger's cat!!! Wow, I'm impressed)



Stick to the point; it's an interesting read, this thread, until (like all the other threads) it gets weighed down with pretentious show-offs and over-opinionated bull...

Just my two pence.....:mad:
 
I have to say

IMP preamps just beat the living crap out of Neve, SSL or any other "pro" preamp... It's all just marketing hype. Get a £30 preamp with a £20 mic and you've basically got a better sound than a Neve and a U87...
Just my two cents..
(wow, you guys know about basic physics, you're all so smart, yes we have heard of Schroedinger's cat!!! Wow, I'm impressed)



Stick to the point; it's an interesting read, this thread, until (like all the other threads) it gets weighed down with pretentious show-offs and over-opinionated bull...

Just my two pence.....:mad:
This was a joke if you guys didn't already detect the sarcasm...
I recently bought a Focusrite Octopre and after using lower end stuff for ages I noticed a real difference... Particularly when comparing the DI input with my Alesis io26 DI input... Also the built in compression sounds great compared to a lot of the compression I have used in the past, even my DBX compressor...
 
Glen, a couple of questions regarding your post that I need some clarification on
I'll hit you up on all that in a bit. I haven't had my morning coffee yet and my brain is still too fogged to deal with a long post. I'll update a bit later. But for now I just wanted to add this:
Once I started to track at lower levels all of a sudden I had more clarity in the overall mix. Hard to put my finger on it really.
If I had a fiver for everybody who has told me in person, e-mailed me, PM'd me or just plain announced in a post how much better their recordings and mixes have turned out for them ever since they decided to dial back on the tracking gain, I'd probably be able to take this entire thread out to lunch at a halfway decent restaurant.

OTOH, I have yet to get one single remark via any channel saying they tried the lower gain thing and it just did not work out for them, or that they noticed no difference. And trust me, in this forum, when you give advice that doesn't work, you hear about it. I have never heard about it.

G.
 
I prefer the quantum level. At that level, the possibilities that I have both recorded and sold a million albums and not recorded anything at all exist simultaneously. So long as I don't open my eyes and look at my computer, the probability wave will never resolve into an actuality.

Also, I have found this cat; the name tag says Schroedinger...... any takers?
That's a very good point, but I can't use a matched pair of mics in quantum recording, because it throws the 3:1 rule right out the window. Even if I put the two mics way across the room from each other, everything that hits mic A hits mic B across the room at the exact same time because of the whole action-at-a-distance thing.

G.
 
If I had a fiver for everybody who has told me in person, e-mailed me, PM'd me or just plain announced in a post how much better their recordings and mixes have turned out for them ever since they decided to dial back on the tracking gain, I'd probably be able to take this entire thread out to lunch at a halfway decent restaurant.
Amen.

Freakishly common to receive feedback on the matter (several weekly, minimum). And it's such a common issue... Have two projects in at the moment that are "victims" -- One of them is really a mess... I was just having a quick listen and every track is 'pinched' and unfocused, no 'open space' anywhere - just a semi-distorted mess that won't take EQ well, forget compression... Then I hit one particular song which WAS open, focused - You could 'point' to individual instruments - everything sat well in the mix, the top was clear, the sound was dynamic. I called the client and asked what was different about this mix (figuring he'd say that they recorded it at a different studio or what not). He told me that it was "the first song we did - we didn't even have the levels set properly yet" etc., etc.

And there's no contest - That tune is head and shoulders above the others. It's not even close. Half-dB adjustments in EQ make dramatic differences, compression brings 'punch' without 'pinch', and that 'focus' is still there. Width, depth - You'd never know it was the same recording rig.
 
Glen, a couple of questions regarding your post that I need some clarification on.
OK, George, a coffee and my morning medication later, here's my answer ;) :

Let me start by saying that going back and reading it, I think my previous answer may have been a bit misleading. It cannot in and of itself normally be directly used to determine the converter calibration, as I may have unintentionally implied. Really, all it's good for is to provide a known, constant 0VU signal that one can use to feed the converter to help determine ballpark setting levels on the interface input.

Now to details:

I'm not sure just what kind of outputs the audio outs on your Kurzweil actually are. If they are anything but conventional line level outs (+4dBu), the method does not apply. You could always record the signal to a machine that will output a +4dBu-style level and use that to feed your interface.

And yes, you'd use the multitester to check the actual voltage of the signal going through the cable to your interface. One correction I need to make about that, though: when using a simple sine wave, the actual voltage would/should read an amplitude of about 1.737V; the 1.23V figure is the RMS value, which is mathematically a bit different from the peak amplitude a simple sine will generate.
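Just as a sanity check of those numbers, here's the arithmetic in a quick sketch (assuming the standard 0 dBu = 0.775 V RMS reference; whether a given multimeter displays the RMS or the peak figure depends on the meter):

```python
import math

def dbu_to_vrms(dbu):
    """Convert a dBu level to RMS volts (0 dBu is defined as 0.775 V RMS)."""
    return 0.775 * 10 ** (dbu / 20.0)

v_rms = dbu_to_vrms(4.0)          # +4 dBu nominal line level
v_peak = v_rms * math.sqrt(2.0)   # peak amplitude of a pure sine at that RMS level

print(f"+4 dBu = {v_rms:.3f} V RMS, {v_peak:.3f} V peak")
# prints roughly 1.228 V RMS and 1.737 V peak
```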

If you can calculate the converter calibration via your interface's maximum output specification (as I described earlier in this thread), then the idea would be to send a 1.737V sine into your interface's +4dBu line level input, and send the digital out to your DAW. Have the input gain on your DAW (including any intermediate OS driver stages that may or may not exist) set to unity gain. Then adjust the input volume on your interface until your DAW meters equal the converter calibration level. Then you'll know at what general setting the input gain on your interface needs to be set for unity gain of a 0VU signal into your converter.
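If I'm reading that "calibration from the max level spec" method right, the number you're aiming for on the DAW meters is just a subtraction; a minimal sketch, with the +22 dBu figure used purely as a hypothetical example spec, not any particular interface:

```python
def calibration_dbfs(max_level_dbu, nominal_dbu=4.0):
    """
    Where a 0VU (+4 dBu) line-level signal should land on the digital meters,
    given the interface's published maximum (0 dBFS) level in dBu.
    """
    return nominal_dbu - max_level_dbu

print(calibration_dbfs(22.0))   # -18.0 -> aim the 0VU sine at about -18 dBFS
print(calibration_dbfs(20.0))   # -16.0 -> a slightly hotter-calibrated box
```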

If you have good RMS metering on your DAW, and not just peak metering, you could also forgo the test tone and multitester, and simply send your complex musical signal through the interface, and dial the input gain on the interface until the RMS is reading somewhere around the converter's calibration level.

If you can't determine your converter's calibration by spec (BTW, sometimes if it's not published in the spec sheet, it can be gotten from the manufacturer or from the mfr's customer support boards), and you just want to ballpark things by selecting -18dBFS as the assumed calibration (you could be as much as 4dB off in either direction, but it's usually "close enough for rock n' roll", so to speak), you'd use the same methods, just setting the input to get your metering on the DAW to read -18 on the peak meters for the sine wave, or -18 on the RMS meters for a real-world signal.

Of course in all instances you'll want to turn the interface input gain down from there if the actual recorded signal is so dynamic as to cause peak clipping on the digital side.

HTH,

G.
 
One of them is really a mess... I was just having a quick listen and every track is 'pinched' and unfocused, no 'open space' anywhere - just a semi-distorted mess that won't take EQ well, forget compression...

The fact that I still track to tape first before I dump to DAW, probably goes a long way to keeping me "safe"...but I just want to be clear about a few things WRT your comments, as I may start doing more and more direct-to-DAW stuff.

In the cases you mentioned above...I'm assuming that their front end was already crapping out, and THAT'S where the bulk of their pinched/unfocused/distorted sound was coming from.
IOW, to go back to the question I asked you a few posts back...IF the signal coming out of the front end is clean/good...but HOT...does the converter in any way cause sonic degradation just 'cause the front end is feeding it a hot signal (but below clipping)?

IMO...most of the people with these bad/hot audio situations are the ones using the all-in-one boxes...so it's hard for them to separate out what their built-in pre/DI/comp/EQ is doing VS. the actual A/D in their all-in-one-box.

I would say that if you can monitor your front end signal before it gets to the A/D...if it sounds right, then the actual dBFS level is not all that critical as long as you are not clipping.
That said...since a lot of people may ONLY be monitoring their post-A/D signal...then yeah, to "play it safe", it may be best to just visually stay in that safe zone.

Since I don’t use “all-in-one” boxes…even if I decide to track to DAW instead of going to tape first…my SOP would be to just let the analog front end tell me where to set the levels and not really worry all that much about the DAW’s dBFS level other than to make sure it’s not clipping.
I’m sure if people follow that SOP, their levels would rarely be too hot anyway since the analog front end would/should “take care of it”, and as long as they’re listening to what it’s doing…I don’t see how they could ever get into trouble. :)

Am I way off base here with that thinking?
 
So, everything you said only really pertains to the equipment you chose to work with and the way you choose to work.

Of course.

That's fair, but it does ignore some of the most sought after preamps and the reason why they are so popular. It also completely ignores most of the budget stuff and the starved plate toob stuff that a lot of people on this board are using.

Starved plate? Are you talking about fuzz boxes that pose as preamps? Did you ever hear Fletcher's rant on starved plate designs? :D

As I explained in my AES Audio Myths video, distortion is a cheap commodity and there's no need for magic. Tubes distort, tape distorts, transformers distort, and so do preamps if you push them beyond the levels they're designed for. I occasionally use a tape-sim to get that character, but I certainly wouldn't want it on every track of every song.

I know for a fact that the Mackie 8 buss mixers from the mid 90's could not output a really hot signal. In fact, 0dBVU wasn't even +4 or -10, it was somewhere in the middle. So if you were trying to push the level of something like a synth or heavily distorted guitar up to around 0dBFS, the direct outs would fall apart well before you got there.

Now wait a minute! You're saying that distortion in a preamp is desirable and worth paying handsomely for, but it's a deal killer in a mixer output? Why is that?

Regardless, it would help if you'd give specific details rather than subjective assessments like "pinched" and "falls apart," which don't really say anything. For example, exactly how much distortion did you get from that board at 1 kHz at 0VU? Mackie makes good stuff, and I can't imagine they'd sell anything meant for pro audio that couldn't deliver at least +20 at every output. The most restricted output on my Mackie 1202 claims to be clean up to +22 dBu.
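For anyone following along, the two nominal levels being tossed around here, and the headroom implied by a +22 dBu output spec, work out like this (just the standard reference-level conversions, nothing specific to any Mackie model):

```python
def dbu_to_vrms(dbu):
    return 0.775 * 10 ** (dbu / 20.0)   # 0 dBu = 0.775 V RMS

def dbv_to_vrms(dbv):
    return 1.0 * 10 ** (dbv / 20.0)     # 0 dBV = 1.0 V RMS

print(f"+4 dBu  = {dbu_to_vrms(4.0):.3f} V RMS")    # ~1.228 V (pro nominal)
print(f"-10 dBV = {dbv_to_vrms(-10.0):.3f} V RMS")  # ~0.316 V (consumer nominal)

# Headroom of a +22 dBu maximum output above a +4 dBu operating level:
print(f"headroom = {22 - 4} dB")                    # 18 dB
```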

Out of curiosity I found this page on the Mackie site:

http://www.mackie.com/products/8bus/8Busspecs.html

Is this the console you mean? I notice that the input and output levels are the same as for my little 1202 mixer. If someone told me this mixer has unacceptable distortion at levels near 0 dB the first thing I'd ask is if they're driving some old piece of gear having a 600 Ohm input impedance. :eek:

That's one of the reasons why that mixer got the reputation for being a little thin and screechy sounding.

I follow brand reputations a bit on the 'net, and for the most part I find them to be incorrect, biased, and based on improper tests. I could be wrong about the Mackie 8 buss mixers! I never had one, so all I can go by is their reputation. :D But I almost never accept subjective web forum assessments unless they're based on a properly conducted blind test.

But you probably knew I'd say that! :laughings:

--Ethan
 
In my experience with my equipment, tracking close to 0dBFS definitely caused issues when all parts were recorded and I was starting to mix. While things didn't really sound distorted, it was always a struggle to get things to fit come mixing time. Things have improved considerably since I have started to take it easy when tracking.

$100 says if you set up a proper controlled test, where all that varies is the level you record at, you will find the sound to be identical whether recording at -1 or -21. This assumes your preamps and converters are interfaced properly, of course.
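A sketch of what such a controlled test could look like, assuming you can capture the same playback twice with sample-aligned loopback (the file names are hypothetical, and numpy/soundfile are just one convenient way to do the null):

```python
import numpy as np
import soundfile as sf   # pip install soundfile

# Two captures of the identical source: one peaking near -1 dBFS,
# one near -21 dBFS, with only the record level changed.
hot, sr1 = sf.read("take_minus1.wav")
quiet, sr2 = sf.read("take_minus21.wav")
assert sr1 == sr2 and len(hot) == len(quiet)

# Gain-match the quiet take up by 20 dB, then null it against the hot take.
residual = hot - quiet * 10 ** (20 / 20.0)

def rms_dbfs(x):
    return 20 * np.log10(max(np.sqrt(np.mean(np.square(x))), 1e-12))

print(f"residual after null: {rms_dbfs(residual):.1f} dBFS RMS")
# If preamp and converter are behaving, the residual sits down near the
# noise floor; an audible difference shows up as a much hotter residual.
```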

When recording from the Kurzweil with this setting, I peak around -6dBFS or so (sometimes lower, maybe down to -10dBFS), and that seems to give me the best overall balance.

Something was probably overloading before, which makes sense if you ran your synth output at +27 or a similarly excessive level. Or maybe it's just a fault in your Aardvark card, which is not unreasonable given the poor specs you mentioned.

--Ethan
 
Something was probably overloading before, which makes sense if you ran your synth output at +27 or a similarly excessive level. Or maybe it's just a fault in your Aardvark card, which is not unreasonable given the poor specs you mentioned.

--Ethan
This should answer both your and Miro's questions.

It is very hard to overload the analog outputs of the Kurzweil. If anything, I get digital clipping before the analog outputs get out of shape. For those that are not familiar with the K2600, it's got many stages where one can control the gain digitally within the instrument. Its synth engine and the FX engines are separate, so you have to be careful what you're doing when building patches/programs on this beast. You've got the synth engine with its outputs. Within a VAST program (the synth engine), you can have GAIN blocks where you can set output levels as well. Once you output the signals out of the VAST synth engine, they go to the KDFX (FX) inputs. You can set the gain on the FX inputs. Right after the inputs you have a 2-band EQ section where you can also mess with the gain. After this, the signal can either go directly to the FX outputs (w/o any FX processors in the chain) or you can set up an FX chain using FX blocks (i.e. processors such as reverb, delay, compressors, filters, distortion, etc, etc)... Each FX block has its own set of input and output gain controls. Then you can route the signal to Aux FX or directly to Mix or individual outputs. This is yet another place where you can mess with gain...

So, internally, there are many many stages for gain setting, and one has to be careful not to digitally clip things.
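Just to show the bookkeeping involved, here's a toy gain-staging walk-through. The stage names roughly follow the K2600 signal flow described above, but every gain number is invented for illustration, not a measured value:

```python
# Walk an assumed patch peak level through a chain of gain stages (in dB)
# and flag any point where the running total would clip digitally.
stages = [
    ("VAST gain block",  +6.0),
    ("KDFX input gain",  +3.0),
    ("EQ section",       +2.0),
    ("FX block output",  -1.0),
    ("Mix output level", +4.0),
]

level_dbfs = -18.0   # assumed peak level of the raw patch
for name, gain_db in stages:
    level_dbfs += gain_db
    flag = "  <-- over 0 dBFS!" if level_dbfs > 0.0 else ""
    print(f"{name:16s} {gain_db:+5.1f} dB -> {level_dbfs:+6.1f} dBFS{flag}")
```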

However, it is possible to structure the gain stages so that internally everything is clean, nothing clips, and it still outputs a signal that will register over 0dBFS on my Aardvark when its inputs are set to unity.

So, yeah, in order for me to make it so that I am not pushing the Aardvark, I already have to push things back on the K2600, which by default already puts it well within the safe zone as far as the K2600's analog outputs are concerned. So, if I am not overloading the analog stages on the K2600, then I must be overloading the Aardvark. In my case, this particular interface seems to be happiest when I am peaking around -10dBFS or lower on sustained sounds, and not really going above -6dBFS on percussive material.

You can argue that the interface is broken, poorly designed, etc. That may be. However, to just make a blanket statement such as "just make sure you're not clipping, you should be fine even if you're peaking at -0.1dBFS" is at best misleading.
 
The fact that I still track to tape first before I dump to DAW, probably goes a long way to keeping me "safe"...but I just want to be clear about a few things WRT your comments, as I may start doing more and more direct-to-DAW stuff.

In the cases you mentioned above...I'm assuming that their front end was already crapping out, and THAT'S where the bulk of their pinched/unfocused/distorted sound was coming from.
IOW, to go back to the question I asked you a few posts back...IF the signal coming out of the front end is clean/good...but HOT...does the converter in any way cause sonic degradation just 'cause the front end is feeding it a hot signal (but below clipping)?

IMO...most of the people with these bad/hot audio situations are the ones using the all-in-one boxes...so it's hard for them to separate out what their built-in pre/DI/comp/EQ is doing VS. the actual A/D in their all-in-one-box.

I would say that if you can monitor your front end signal before it gets to the A/D...if it sounds right, then the actual dBFS level is not all that critical as long as you are not clipping.
That said...since a lot of people may ONLY be monitoring their post-A/D signal...then yeah, to "play it safe", it may be best to just visually stay in that safe zone.

Since I don’t use “all-in-one” boxes…even if I decide to track to DAW instead of going to tape first…my SOP would be to just let the analog front end tell me where to set the levels and not really worry all that much about the DAW’s dBFS level other than to make sure it’s not clipping.
I’m sure if people follow that SOP, their levels would rarely be too hot anyway since the analog front end would/should “take care of it”, and as long as they’re listening to what it’s doing…I don’t see how they could ever get into trouble. :)

Am I way off base here with that thinking?
This is a very interesting turn on this.


IF the signal coming out of the front end is clean/good...but HOT...does the converter in any way cause sonic degradation just 'cause the front end is feeding it a hot signal (but below clipping)?
I hope not.
It begs the question, then:
If our D/As (and by association our A/Ds) can't do full scale cleanly, how about when they're playing back all these SOP 'hot' mixes?
 
Starved plate? Are you talking about fuzz boxes that pose as preamps? Did you ever hear Fletcher's rant on starved plate designs? :D
Yes, and personally I hate them and wouldn't use them. But you can't deny they exist just because it's not 'proper' in your estimation. You made a blanket statement about analog being linear, but what you meant to say is: Analog is linear as long as you ignore all the analog gear that isn't linear.

As I explained in my AES Audio Myths video, distortion is a cheap commodity and there's no need for magic. Tubes distort, tape distorts, transformers distort, and so do preamps if you push them beyond the levels they're designed for. I occasionally use a tape-sim to get that character, but I certainly wouldn't want it on every track of every song.
Neither do I, that's why I have several different preamps that give me the response that I want for different situations.



Now wait a minute! You're saying that distortion in a preamp is desirable and worth paying handsomely for, but it's a deal killer in a mixer output? Why is that?
There is good distortion and bad distortion. Ask any guitar player. Distortion, of course, is a broad term that really means nothing on its own. Anything that changes the signal 'distorts' it. EQ, compression, gain, etc...

Regardless, it would help if you'd give specific details rather than subjective assessments like "pinched" and "falls apart," which don't really say anything.
I would love to, had I not gotten rid of the board 14 years ago.

For example, exactly how much distortion did you get from that board at 1 kHz at 0VU? Mackie makes good stuff, and I can't imagine they'd sell anything meant for pro audio that couldn't deliver at least +20 at every output. The most restricted output on my Mackie 1202 claims to be clean up to +22 dBu.
I'm sure the specs are correct and that the board performs admirably when they send a 1 kHz test tone through it. But I generally don't record 1 kHz test tones, so it's irrelevant how the board reacts to it. I would bet that it doesn't handle a broadband signal as favorably at +22 dBu.

Out of curiosity I found this page on the Mackie site:

http://www.mackie.com/products/8bus/8Busspecs.html

Is this the console you mean? I notice that the input and output levels are the same as for my little 1202 mixer. If someone told me this mixer has unacceptable distortion at levels near 0 dB the first thing I'd ask is if they're driving some old piece of gear having a 600 Ohm input impedance. :eek:
That's probably it. The last time I used one, I was using it with a Radar.


I follow brand reputations a bit on the 'net, and for the most part I find them to be incorrect, biased, and based on improper tests. I could be wrong about the Mackie 8 buss mixers! I never had one, so all I can go by is their reputation. :D But I almost never accept subjective web forum assessments unless they're based on a properly conducted blind test.

But you probably knew I'd say that! :laughings:
--Ethan
I get that. But what I don't get is you arguing against running at line level because in your (self admitted) limited experience and limited view of what a 'proper' preamp is, it isn't that necessary.
 
I would say that if you can monitor your front end signal before it gets to the A/D...if it sounds right, then the actual dBFS level is not all that critical as long as you are not clipping.
Correct me if I'm wrong about this, miro, but I'm assuming that when you talk about "all-in-one" boxes, that you're referring to an integrated preamp/converter interface and not a full digital workstation in a box. This post will work on that assumption (though it's probably not all that difficult for an actual DAW).

In such a case it's usually not possible to monitor what's happening between the preamp and the converter, and therefore theoretically impossible to be sure whether what one hears is from the pre or the A/D. You can pretty much tell, though, that different types of noise or distortion are probably from one or the other. Raising of the noise floor, as well as most types of harmonic distortion, is likely from the analog side, for example. Also, if a particular artifact is present when running mic in and not instrument direct or line in (or any combination thereof), it's probably not the converter itself, but rather an artifact of one or more particular circuit paths on the analog side.

As far as actual A/D converter distortion itself, there is some pretty good evidence that when pushed up towards the 0dBFS limit, many converters can and do introduce some distortion in the last 3dB or so. Do some searching of pipelineaudio's posts or check out his website for some pretty good equipment tests he ran a while back documenting this effect (which we have occasionally jokingly referred to here as "the Pipeline Effect" after those tests).
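This isn't pipelineaudio's actual test rig, but if you want to check the "last 3dB" claim on your own converter, one rough way is to loop back a 1 kHz sine captured at two different levels and compare the harmonic content; a hedged sketch:

```python
import numpy as np

def thd_db(signal, sr, fundamental_hz=1000.0, n_harmonics=5):
    """Rough THD estimate: harmonic energy relative to the fundamental, in dB."""
    windowed = signal * np.hanning(len(signal))
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(signal), 1.0 / sr)

    def bin_mag(f):
        idx = int(np.argmin(np.abs(freqs - f)))
        return spectrum[max(idx - 2, 0):idx + 3].max()   # tolerate a little smearing

    fund = bin_mag(fundamental_hz)
    harm = np.sqrt(sum(bin_mag(fundamental_hz * k) ** 2
                       for k in range(2, n_harmonics + 1)
                       if fundamental_hz * k < sr / 2))
    return 20 * np.log10(harm / fund)

# Usage idea: capture the loopback sine once peaking near -0.5 dBFS and once
# near -6 dBFS, then compare thd_db() for the two captures. If the converter
# misbehaves in its top few dB, the hot capture shows noticeably higher THD.
```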

With regards to the actual digital recording levels, you're right (and so is Ethan) that the actual act of digital recording will not make a difference in sound quality as long as you keep the signal somewhere between the ditches.
But there's a bit more of a workflow importance to digital levels than the technical one, which can - if one is not the most experienced of mix engineers - have an effect on the final mix:

The engineers that decide and design all the different "standards" (as loose as that term may indeed be) for the gear we use do not just pull numbers out of a hat. None of it is perfect, of course, but there is at least some reason and pattern involved, not just haphazardness. And the combination of analog line levels, A/D converter calibrations, and the digital recording canvas is designed pretty much to fit together like a transmission, drive shaft and differential; i.e. they are built and intended to fit together properly and work together smoothly as a single drive train without a whole lot of - if any - massaging or coercing needed from the driver.

The sometimes magical-seeming result of this - which is not magic at all, but rather the intended result of this design - is that if one pays attention to and follows the designed "sweet spot" (without us geeks arguing as to just how wide or narrow that "spot" may be) through the whole gain structure from beginning to end, from digital to analog, the mixes just have a tendency to work themselves out both level- and cleanliness-wise by the time the digital mixdown is ready to be delivered for mastering, without a whole lot of extraneous fader jockeying and other massaging or processing needed on the part of the mix engineer.

The issue is not so much whether or not that extra jockeying causes too much extra digital math to mangle the signal quality (though it's always good to keep anything to a minimum when it's not needed), but rather that the engineer can concentrate on *mixing the music* and not managing the signal. For a seasoned mix engineer, it simply means less work - though that alone is enough of a reason IMHO - but for the average home recordist it also means fewer loose variables for them to mis-juggle and mess up, and an easier line of sight to concentrating on the music and the musical mix.

This is what IMHO makes the A/D conversion factor the key link in the chain. By using the 0VU --> ___dBFS conversion factor as the link, with the added digital governor of keeping away from clipping, it keeps that intended continuity of level all the way through the process. A typical pop mix that winds up RMSing somewhere near that conversion level typically has a pretty optimal S/N, plenty of dynamic range to work with, plenty of headroom, and winds up also being the most conducive to receiving a real quality mastering job, and not just polish on a turd. And no, throttling the master buss to bring it to that level is NOT the same as playing the gain structure game all the way through to let that level develop of its own volition.
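For anyone who wants to check where their own mixdown lands relative to that conversion factor, a quick sketch (the -18 figure is the commonly assumed calibration discussed earlier in the thread, not a universal spec, and "mixdown.wav" is just a placeholder file name):

```python
import numpy as np
import soundfile as sf   # pip install soundfile

CALIBRATION_DBFS = -18.0            # assumed 0VU --> dBFS conversion factor

mix, sr = sf.read("mixdown.wav")    # placeholder path to a mixdown file
mono = mix.mean(axis=1) if mix.ndim == 2 else mix

rms_dbfs = 20 * np.log10(np.sqrt(np.mean(np.square(mono))) + 1e-12)
peak_dbfs = 20 * np.log10(np.max(np.abs(mono)) + 1e-12)

print(f"RMS  {rms_dbfs:6.1f} dBFS (somewhere near {CALIBRATION_DBFS} is the idea)")
print(f"peak {peak_dbfs:6.1f} dBFS (just keep it under 0)")
```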

G.
 
Correct me if I'm wrong about this, miro, but I'm assuming that when you talk about "all-in-one" boxes, that you're referring to an integrated preamp/converter interface and not a full digital workstation in a box.

Si mi amigo. :)
 
Yes, and personally I hate them and wouldn't use them. But you can't deny they exist just because it's not 'proper' in your estimation.

I pretend crappy cheap toob gear doesn't exist because that stuff is bought mostly by gullible newbies who read in a magazine that toobs are desirable.

what you meant to say is: Analog is linear as long as you ignore all the analog gear that isn't linear.

Fair enough.

There is good distortion and bad distortion. Ask any guitar player.

Again, I'm not talking about fuzz boxes or guitar amps. I'm talking about pro and prosumer quality audio gear.

I would bet that it doesn't handle a broadband signal as favorably at +22 dBu.

I would bet that it does.

what I don't get is you arguing against running at line level because in your (self admitted) limited experience and limited view of what a 'proper' preamp is, it isn't that necessary.

I'm not sure what you mean by my arguing against "running at line level." What I argue against is the blanket statement that recording at -20 in a 24-bit system is somehow better than recording at -1. That makes no sense. If anything, the opposite should be true (in theory if not in practice). If a sound card / converter can't handle signals right up to the onset of Digital Zero, then it's a lame design. Even my $25 SoundBlaster card can handle levels just below clipping with no loss of quality.
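The back-of-the-envelope numbers behind that claim, for what they're worth (theoretical figures only; in practice the interface's analog noise floor, typically 100-120 dB below full scale, dominates long before quantization does):

```python
# Theoretical dynamic range of an N-bit converter is roughly 6.02*N + 1.76 dB.
bits = 24
dynamic_range_db = 6.02 * bits + 1.76      # ~146 dB on paper

# Even recording 20 dB below full scale leaves an enormous amount of signal
# above the theoretical quantization floor:
print(dynamic_range_db - 1)    # ~145 dB of theoretical S/N when peaking at -1 dBFS
print(dynamic_range_db - 20)   # ~126 dB when peaking at -20 dBFS
```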

What does vary with level is our ears as per Fletcher Munson.

--Ethan
 
You can argue that the interface is broken, poorly designed, etc. That may be.

How hot an input can it accept as Digital Zero if you take into account the input attenuator? Can it take +5? +10?

Also, some sound cards have a software input level control, which is not useful because you can run into exactly the problem you describe, where the signal distorts below digital zero.

However, to just make a blanket statement such as "just make sure you're not clipping, you should be fine even if you're peaking at -0.1dBFS" is at best misleading.

It's not misleading, at least not in the context of what levels we should record at. If your synth can distort at intermediate stages, that is totally unrelated to how hot people should record in a DAW.

--Ethan
 
..With regards to the actual digital recording levels, you're right (and so is Ethan) that the actual act of digital recording will not make a difference in sound quality as long as you keep the signal somewhere between the ditches.
But there's a bit more of a workflow importance to digital levels than the technical one, which can - if one is not the most experienced of mix engineers - have an effect on the final mix:

The engineers that decide and design all the different "standards" (as loose as that term may indeed be) for the gear we use do not just pull numbers out of a hat. None of it is perfect, of course, but there is at least some reason and pattern involved, not just haphazardness. And the combination of analog line levels, A/D converter calibrations, and the digital recording canvas is designed pretty much to fit together like a transmission, drive shaft and differential; i.e. they are built and intended to fit together properly and work together smoothly as a single drive train without a whole lot of - if any - massaging or coercing needed from the driver.

The sometimes magical-seeming result of this - which is not magic at all, but rather the intended result of this design - is that if one pays attention to and follows the designed "sweet spot" (without us geeks arguing as to just how wide or narrow that "spot" may be) through the whole gain structure from beginning to end, from digital to analog, the mixes just have a tendency to work themselves out both level- and cleanliness-wise by the time the digital mixdown is ready to be delivered for mastering, without a whole lot of extraneous fader jockeying and other massaging or processing needed on the part of the mix engineer.

The issue is not so much whether or not that extra jockeying causes too much extra digital math to mangle the signal quality (though it's always good to keep anything to a minimum when it's not needed), but rather that the engineer can concentrate on *mixing the music* and not managing the signal. For a seasoned mix engineer, it simply means less work - though that alone is enough of a reason IMHO - but for the average home recordist it also means fewer loose variables for them to mis-juggle and mess up, and an easier line of sight to concentrating on the music and the musical mix.

This is what IMHO makes the A/D conversion factor the key link in the chain. By using the 0VU --> ___dBFS conversion factor as the link, with the added digital governor of keeping away from clipping, it keeps that intended continuity of level all the way through the process. A typical pop mix that winds up RMSing somewhere near that conversion level typically has a pretty optimal S/N, plenty of dynamic range to work with, plenty of headroom, and winds up also being the most conducive to receiving a real quality mastering job, and not just polish on a turd. And no, throttling the master buss to bring it to that level is NOT the same as playing the gain structure game all the way through to let that level develop of its own volition.

G.
Yeah! It's... sticky time... :cool::)
And to add.. again, but first a question...

How many of you have recording app software that does not have 'Average + Peak' metering available?

I'm tossing this out again (repeating myself, sorry, but it works so well): I track with 'Average + Peak' metering, range/scale -24dBFS. When the body of the signal pops into range - done. Nominal level, 20dB of headroom, mix, plugs, all in alignment
- move on.
Upstream analog likely happy, but optional!
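For anyone whose app doesn't show it, 'Average + Peak' metering is basically the two readings below computed per block; a minimal sketch (the random buffer just stands in for whatever your interface is delivering):

```python
import numpy as np

def meter(block, eps=1e-12):
    """Return (average/RMS, peak) levels of an audio block in dBFS."""
    avg = 20 * np.log10(np.sqrt(np.mean(np.square(block))) + eps)
    peak = 20 * np.log10(np.max(np.abs(block)) + eps)
    return avg, peak

# Tracking idea from the post above: keep the average (the "body" of the
# signal) hovering around the -24 to -18 dBFS range while the peaks stay
# comfortably below 0 dBFS.
block = np.random.randn(4800) * 0.05   # stand-in for one capture buffer
avg, peak = meter(block)
print(f"avg {avg:.1f} dBFS, peak {peak:.1f} dBFS")
```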
 
I pretend crappy cheap toob gear doesn't exist because that stuff is bought mostly by gullible newbies who read in a magazine that toobs are desirable.
And by a lot of people on this board asking questions about how to properly use their equipment.



Again, I'm not talking about fuzz boxes or guitar amps. I'm talking about pro and prosumer quality audio gear.
The analogy still works. Your Mackie will sound different when it distorts than a Neve will, because transformers sound different from op-amps when they are overloaded.

I'm not sure what you mean by my arguing against "running at line level." What I argue against is the blanket statement that recording at -20 in a 24-bit system is somehow better than recording at -1. That makes no sense. If anything, the opposite should be true (in theory if not in practice). If a sound card / converter can't handle signals right up to the onset of Digital Zero, then it's a lame design. Even my $25 SoundBlaster card can handle levels just below clipping with no loss of quality.
No one is saying (that I've noticed) that converters can't do that. But there is a lot of equipment feeding the converters that cannot. That is the point that's trying to be made.

What does vary with level is our ears as per Fletcher Munson.
But recording level and monitoring level are not necessarily related. What you hear at a given volume will change via Fletcher Munson, not at a given recording level.
 