24-bit vs 16-bit

This is a difference in resolution or bit depth. A possible example of the audible difference would be the difference between an MP3 and a WAV file.

I tracked a demo in 16-bit. After all the processing in the mixing stage, I really lost a lot of the atmosphere of the original recording and ended up with a lot of "grain".

So in some circumstances, with a fantastic monitoring system, you could tell a difference between a 16-bit recording session and a 24-bit recording session.

I think it's important in digital recording to track and mix at the highest resolution possible to compensate for software quantization and rounding error. Then you mix down with dither to 16-bit for CD.
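For anyone wondering what that last step actually looks like, here is a minimal sketch of dithered 16-bit reduction (the flat TPDF dither and the quiet test tone are illustrative assumptions; real mixdowns often use noise-shaped dither instead):

```python
import numpy as np

def dither_to_16bit(x, rng=np.random.default_rng(0)):
    """Reduce a float mix (-1.0..1.0) to 16-bit ints with TPDF dither.

    TPDF dither (two uniform noises summed, +/-1 LSB peak) decorrelates
    the quantization error from the signal, so low-level detail fades
    into smooth noise instead of turning into "grain".
    """
    lsb = 1.0 / 32768.0                              # one 16-bit step
    noise = (rng.uniform(-0.5, 0.5, x.shape) +
             rng.uniform(-0.5, 0.5, x.shape)) * lsb
    q = np.round((x + noise) * 32767.0)
    return np.clip(q, -32768, 32767).astype(np.int16)

# Example: a 1 kHz tone 30 dB below full scale, 1 second at 44.1 kHz
sr = 44100
t = np.arange(sr) / sr
mix = 10 ** (-30 / 20) * np.sin(2 * np.pi * 1000.0 * t)
cd_master = dither_to_16bit(mix)
```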
 
zazz said:
come on guys...just wondering if the yam aw16g being only 16bit is a big deal to worry about??

Rather than wait around, you could have easily found this out yourself.
Google is your friend...

Impatience on this forum will only bring you trouble.
 
JKestle said:
This is a difference in resolution or bit depth. A possible example of the audible difference would be the difference between an MP3 and a WAV file.

I tracked a demo in 16-bit. After all the processing in the mixing stage, I really lost a lot of the atmosphere of the original recording and ended up with a lot of "grain".

So in some circumstances, with a fantastic monitoring system, you could tell a difference between a 16-bit recording session and a 24-bit recording session.

The first comparison--.wav to .mp3--is a large exaggeration of the difference.

As for your demo, sure it could have been grainy, but what was the processing? Simple mixing, or a lot of plugs? How much of the difference was the quality of the effects vs. the mixing? Was the software really working at 16 or 24 bit, or was it working at 32-bit internally and only converting back at the end? How much dynamic range did your recording use? Did you retrack at 24, dither to 16, and notice a difference?

My view is that if you can't do quality work at 16-bit, 24 won't help you. However, if your 16-bit mixes are high quality, you might realize some additional improvement at 24.

Now it may very well be that if you upgrade to 24, you get better converters as part of the deal, but that's not strictly the difference between 16 and 24. It could be the quality of the anti-aliasing filters. It could be a better clock.
 
JKestle said:
I tracked a demo in 16-bit. After all the processing in the mixing stage, I really lost a lot of the atmosphere of the original recording and ended up with a lot of "grain".

To echo that, sort of: I find that with audio, video, or even photo processing, an extremely high-resolution sample is initially functionally overkill. No one will notice a difference in the original unless they're a bat. But the more you process it from there, the more you will begin to notice it.

If you care, always give yourself the best raw building materials. You can always resample and cut it back.
 
Thanks guys, this is really helpful... and to the Google crack: I always try to help if I can, even if it can be found with a search, because it's really difficult sorting out the wheat from the chaff!
 
mshilarious said:
The first comparison--.wav to .mp3--is a large exaggeration of the difference.

An intentional exaggeration to make a point. With a clean, airy, quality acoustic recording it is possible to hear an audible difference. I'm not saying I could, but someone out there could.

mshilarious said:
My view is that if you can't do quality work at 16-bit, 24 won't help you. However, if your 16-bit mixes are high quality, you might realize some additional improvement at 24.

My demo was a really bad example, and you are right... I did not retrack at 24-bit and compare. Even if I had, I'm sure the grain would still have been present. The project was a disorganized mess with no chance of recreating it.

I also agree that if you can't do quality work at 16 bit, 24 won't help you. And I'm obviously still working on this point myself.


I completely agree with Birdyfoot:

Why not use the best building blocks possible, even if you can't tell the difference just from looking at them?
 
Zazz, from my experience with my 16G (two years now)... if someone can't produce pristine recordings on the G, then it isn't the G's fault. As others have written here, whenever I've lost some quality in a recording, I've traced it to something other than bit depth. There's just so much more that can negatively affect the sound (even household electrical-noise issues, in my case), especially in a home-studio situation. When I take the time to improve such factors, my 16-bit sound can be simply superb. Of course, as someone who also records in full pro studios, I'm talking here about semi-pro home-studio results with the G, with no claims beyond that.

Yes, as someone suggested, do a search, and not only on Google. Use this site's search feature. There's a very lengthy and interesting thread on this exact issue from the past year or so. We're just repeating it here.

Best,
J.
 
It doesn't take a high-priced monitoring system to hear differences... try an MP3 at 128k and at 256k. There's a difference there, and I could tell it on a Cyber Acoustics multimedia speaker system. Now if you play the same ones on some monitors, you can tell a bigger difference.
 
The biggest difference between 16 & 24 that I could immediately discern was the difference in the noise floor. That makes it worth it to me whether or not there is any difference in the sound quality.
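For reference, the theory behind that: each extra bit buys roughly 6 dB of noise floor, per the standard formula for an ideal converter's quantization SNR (a back-of-the-envelope sketch; real converters fall well short of the 24-bit figure):

```python
# Ideal quantization SNR for a full-scale sine: 6.02 * bits + 1.76 dB
for bits in (16, 24):
    print(f"{bits}-bit theoretical noise floor: "
          f"~{6.02 * bits + 1.76:.0f} dB below full scale")
# 16-bit: ~98 dB   24-bit: ~146 dB
```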
 
I approach 16 vs. 24 bit this way: I always record in Cubase at 24/44. To me, 24-bit has the advantage of being just more ones and zeros; it gives some more headroom for any processing that may take place. I agree with the noise floor statement too. You'll actually find a bit of discussion around here that already took place on this topic. Some knowledgeable people have chimed in too regarding all this.
 
The Alesis AI3 I use now has a spec'd dynamic range of 98 dB; the old 20-bit converters I used were 94 dB. The sources I record max out at around 100 dB, and my mics have self-noise of 14-18 dB, which leaves a real-world dynamic range of only about 82-86 dB, well within what 16-bit can capture. I just don't see any benefit here.

I don't want to sound like an apologist for 16-bit, because I don't track at 16-bit anymore either, but honestly, if I use the same converters and truncate to either 16 or 24 bit, I don't think it makes any useful difference.

I might have to set up a test; it would be interesting.
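Something like this would be a quick desktop version of that test (a rough numpy sketch; the 440 Hz tone at -60 dBFS is an arbitrary choice to make the truncation error easy to measure):

```python
import numpy as np

def truncate(x, bits):
    """Truncate a float signal (-1..1) to the given bit depth, no dither."""
    steps = 2 ** (bits - 1)
    return np.floor(x * steps) / steps

sr = 44100
t = np.arange(sr) / sr
tone = 10 ** (-60 / 20) * np.sin(2 * np.pi * 440.0 * t)   # very quiet source

for bits in (16, 24):
    err = truncate(tone, bits) - tone
    rms_db = 20 * np.log10(np.sqrt(np.mean(err ** 2)))
    print(f"{bits}-bit truncation error: {rms_db:.1f} dBFS RMS")
```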
 
Fine, then let's do this.

Granted, the weak link in the recording chain (a microphone, for instance) will limit the possible recordable fidelity, most likely to well below the 16-bit threshold, but I find that to be beside the point. Nor is there any way that, given the same sample, the human ear can hear the difference between 16 and 24 bit. It simply goes beyond, and well beyond, the auditory acuity of any human being.

The advantage, "in my opinion," has only to do with extensive processing, i.e. adding effects and filters, mixing, resampling, etc., and specifically when working entirely in the digital realm (where effects are produced by manipulating digital data). More data, in this case, improves the ability of a computer to cleanly edit and filter without deteriorating the sound quality. Edits will fade into the background more smoothly and be more effective at producing the desired results if the software has more points to work with.

One could expect better results from adding filters at 24-bit even if one simply converted the sample up from 16-bit and then moved it back down for the final cut. This would be less desirable, of course, than having recorded at 24-bit to begin with.
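As a concrete illustration (a minimal sketch; the one-pole low-pass stands in for whatever chain of effects you would really apply, and a real mixdown would add dither rather than just rounding on the way back down):

```python
import numpy as np

def process_at_high_res(samples_16bit):
    """Promote 16-bit ints to float64, apply an effect, return to 16-bit.

    All intermediate math happens at far more than 24-bit precision, so
    repeated edits do not pile rounding error directly onto the audio.
    """
    x = samples_16bit.astype(np.float64) / 32768.0      # up to high resolution
    # Stand-in "effect": a one-pole low-pass filter, run entirely in float
    y = np.empty_like(x)
    acc = 0.0
    for i, s in enumerate(x):
        acc += 0.2 * (s - acc)
        y[i] = acc
    # Back down for the final cut (a real mixdown would dither here)
    return np.clip(np.round(y * 32767.0), -32768, 32767).astype(np.int16)
```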

I guess what I'm trying to say is that even if your equipment kind of sucks, if you plan to screw around with your samples a lot, 24-bit is not a complete waste of time.
 
If you can use 24-bit, use 24-bit. At the same time, there are literally thousands of recordings made at 16-bit (remember, before computers/Pro Tools, those blackface and XT ADAT recorders ruled the world, lol). I would put money down that no one could sit down and pick out what was recorded at 16 and what at 24 just by ear on a finished product. Well, maybe someone could, but it's still a long shot.
 
Precisely. However, "before computers" there was no "computer editing", and there was only so much you could do after the fact, hence less need for higher resolution.

Nowadays, you could make someone farting sound incredible and set it to anything.
 
Another thing you have to remember is that the standard listening format is still the 16-bit CD. So no matter where you start, you're going to end back up at 16-bit, if not something worse. Hell, a lot of people these days can't tell the difference between a 128k MP3 and a 24-bit WAV file (now that is sad). My outlook is that there are plenty of other areas that are going to mess with your sound more than what bit depth you're recording at.
 