24bit vs 16bit and Hz

Thread starter: adam79
I just did some 24-bit recordings (I used to do 16-bit) and the difference is amazing! I feel like an idiot for not going 24-bit in the past. Anyway, I'm wondering if it's worth recording at a sample rate higher than 44.1 kHz? I've read that sample rates don't convert down as cleanly as bit depths when converting back to CD format.

Thanks,
-Adam
 
Punch your question into Google and you'll find a wealth of information...

Most here either use 44.1 or 48 (for a specific purpose) as far as I've noticed.
 
I just did some 24bit recordings (used to do 16bit) and the difference is amazing! I feel like an idiot for not going 24bit in the past.

Recording at 24 bit gets you a lower noise floor and nothing else. It means you can leave yourself more headroom. Unless you changed the hardware there should be little audible difference. It's a matter of convenience, not a tonal improvement.
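To put a rough number on that noise-floor difference, here is a back-of-the-envelope sketch of the textbook formula for an ideal quantizer's dynamic range (a theoretical figure, not a measurement of any real converter):

```python
def dynamic_range_db(bits: int) -> float:
    """Theoretical SNR of an ideal N-bit quantizer: 6.02 * N + 1.76 dB."""
    return 6.02 * bits + 1.76

for bits in (16, 24):
    print(f"{bits}-bit: ~{dynamic_range_db(bits):.1f} dB of dynamic range")
```

The roughly 48 dB gap between 16-bit and 24-bit is all at the bottom, in the noise floor, which is why the practical benefit is headroom rather than a change in tone.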
 
For years I have been recording in 24-bit because that was the default setting in my DAW. I had been plagued for years with problems converting to 16-bit for burning to CD. A few months ago I started setting up my projects in 16-bit and have been unable to notice a difference between the two, other than the fact that I no longer have problems in conversion.
 
I had been plagued for years with problems converting to 16-bit for burning to CD.

What kind of problems could you be having converting to 16-bit?


On the actual topic, I think there *could* be a noticeable difference if the noise were additive from mixing down a lot of 16-bit tracks to one stereo track. Maybe a hundred tracks of near-silence or something. For the typical rock/pop/metal/rap songs we have here, you probably can't hear much difference, if any.

Still, I record in 24-bit for the extra headroom and also in the hope that the extra data helps the plug-ins make more precise calculations. But I really don't know if it helps.
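The additive-noise idea above is easy to ballpark numerically. The following is a toy model (not a DAW measurement): each 16-bit track contributes its own independent rounding error, and summing N tracks grows the noise RMS by roughly sqrt(N), i.e. about +20 dB for 100 tracks:

```python
import math
import random

random.seed(0)
LSB = 1 / (2 ** 15)  # one 16-bit step, assuming full scale of +/-1.0

def quantize16(x: float) -> float:
    """Round a sample to the nearest 16-bit step."""
    return round(x / LSB) * LSB

def summed_noise_db(num_tracks: int, num_samples: int = 5000) -> float:
    """RMS level (dBFS) of the summed per-track quantization error."""
    acc = 0.0
    for _ in range(num_samples):
        err = 0.0
        for _ in range(num_tracks):
            x = random.uniform(-0.1, 0.1)  # low-level program material
            err += quantize16(x) - x       # this track's rounding error
        acc += err * err
    return 20 * math.log10(math.sqrt(acc / num_samples))

print(f"1 track:    {summed_noise_db(1):.1f} dBFS")
print(f"100 tracks: {summed_noise_db(100):.1f} dBFS")
```

Even at 100 tracks the summed quantization noise sits somewhere around -80 dBFS in this model, which supports the point that for typical mixes the difference is marginal at best.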
 
For years I have been recording in 24-bit because that was the default setting in my DAW. I had been plagued for years with problems converting to 16-bit for burning to CD. A few months ago I started setting up my projects in 16-bit and have been unable to notice a difference between the two, other than the fact that I no longer have problems in conversion.

Dithering and truncating from 24 to 16 bit is a simple and routine operation. What "problems" have you had?
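For reference, the routine operation in question can be sketched in a few lines. This is a toy illustration of TPDF dither plus rounding to 16-bit, not any particular DAW's implementation:

```python
import random

random.seed(1)

def dither_to_16bit(sample: float) -> int:
    """Convert a float sample in [-1.0, 1.0) to a 16-bit integer,
    adding TPDF dither (sum of two uniform randoms, +/-1 LSB peak) first."""
    scaled = sample * 32768.0
    tpdf = random.uniform(-0.5, 0.5) + random.uniform(-0.5, 0.5)
    out = int(round(scaled + tpdf))
    return max(-32768, min(32767, out))  # clamp to the 16-bit range

print(dither_to_16bit(0.5))  # lands within one LSB of 16384
```

The dither decorrelates the rounding error from the signal, turning what would be low-level distortion into a benign, constant noise floor; that is the whole trick, and it is why the conversion is normally trouble-free.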
 
I just did some 24bit recordings (used to do 16bit) and the difference is amazing! I feel like an idiot for not going 24bit in the past.

Boulder already gave you the right answer. Did you actually do a proper comparison, recording an identical source onto identical systems with only the bit depth different? Or did you just record something at 24 bits and decide it sounds better than other stuff you recorded in the past?

--Ethan
 
I've been recording at 24-bit, 96 kHz exclusively. I may be wrong, but it seems to me that for adding per-track effects and mixing, more data is better; I would think that, for the final mix, the "rounding errors" would be less severe.

I burn CDs directly from the mastered mixes. I have found problems if I render the master and then burn in something like Nero; I'll get occasional glitches. I don't know if that's a problem specific to Nero or an example of the kind of problems that can result from 24-bit to 16-bit and 96 kHz to 44.1 kHz conversion. However, for the past year or two I've been burning CDs within Audition 3.0 and haven't had any problems at all.

Also, as I mentioned in a post somewhere, for some reason CDs burned from inside Audition 3.0 sound better than when burned with Nero or even Audition CS6; they seem to have better dynamic range and a wider frequency response, and don't sound as "muddy." Maybe it's all my imagination, but I swear I can hear a fairly dramatic difference.
 
I would think that, for the final mix, the "rounding errors" would be less severe.

Not really. It's easy to measure this yourself if you have an audio editor such as Sound Forge that has an FFT display. Most modern DAW software processes all track data at 32 bits, regardless of the source track's bit depth. Rounding errors occur during DAW (and plug-in) math, so doing this at 32 bits keeps those errors extremely low. As in way more than 100 dB below the music.
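You can get a feel for the size of those rounding errors without an FFT display. Here is a sketch that uses Python's struct module to emulate 32-bit float storage (actual DAW internals vary, but the precision is comparable):

```python
import math
import struct

def to_float32(x: float) -> float:
    """Round-trip a 64-bit Python float through 32-bit float storage."""
    return struct.unpack('f', struct.pack('f', x))[0]

sample = 0.123456789
err = abs(to_float32(sample) - sample)
err_db = 20 * math.log10(err / abs(sample))
print(f"float32 rounding error: {err_db:.1f} dB below the sample")
```

A single float32 rounding step is bounded by about 2**-24 relative error, roughly 144 dB or more below the signal, which is why accumulated DAW math at 32 bits stays far below anything audible.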

Maybe it's all my imagination, but I swear I can hear a fairly dramatic difference.

It's either your imagination or you're not testing properly. Comparing different bit depths and sample rates is not as simple as recording some stuff using one format, then recording some other stuff in the other format, and listening for what you think sounds better.

--Ethan
 
Look, I believe in science and all, but when you record and mix records day in and day out, the science has little to do with how your opinions about the gear you use get formed. For instance, if you've been making records for years at 44.1 with the same converters and then switch to 96, the difference may well be staggering to you. If it makes you produce better recordings and influences your mix in a positive way, it IS better. If you know your monitoring chain intimately, along with the rest of your gear, it might be easy to perceive the benefits/shortcomings of the switch just based on how it influences your decisions. You may find, for instance, that reverb tails sound clearer and transients have more detail, and that makes you work differently and not have to work quite as hard to get where you want to go.

I dunno, we're a cynical bunch and some of us, like Ethan, rely solely on scientific proof to make up our minds about audio but when you're actually out there making records for a living and get more and more acquainted with your gear day in and day out, you will probably have a different perspective.

I say "cynical" because some of us don't believe anyone is qualified to make a decision based on listening alone, because audio engineering is now such a convoluted, devalued art where EVERYONE is an engineer, it seems. Some of us think that when someone hears a difference, it's all in the mind. Audio is somehow now perceived as a working science - which it is to a certain extent, mostly to those who THEORIZE about it more than actually PRACTICE it - but it has never been that way for the working engineers who made all the wonderful records that we enjoy. They simply knew how to masterfully execute the fundamentals. Technology is a tool and quite incidental to the program material. Sure, there's lots to know in order to make a good recording, but to me, trial and error and the process of deduction are a worthwhile way to form opinions.

If the guy says he hears a difference, great. Now use that positive influence to make better recordings.

Cheers :)
 
Having used a bunch of different converters at different sample rates, it seems that if a converter sounds better at one sample rate than the other, it's really the converter, not the sample rate making the difference.

Some converters sound better at one sample rate than the other. They shouldn't, but some do. Since some sound better at lower sample rates and some sound better at higher ones, it really must be something with the specific design of the converters in question and not really the sample rate that sounds better.

That would also explain why some people swear by higher sample rates and others swear at them. Also, the difference is subtle enough, even when there is a difference, that it really could just be in your head.
 