Why 20/24 bit A/D to 16 bit recorded data?

gleason

I see there are a few recorders (MRS1266, VF160 and I'd guess others) that use 20 or 24 bit A/D converters, then record the track in a 16 bit format. At one point in the signal/data path there is this high-definition 20 or 24 bit information. Then the recorder throws part of it away by writing only 16 bits to the hard disk. 16 bits isn't bad; that's what CDs are today.

This doesn't make sense to me. Am I missing something?

Is there anything gained over a 16 bit A/D to a 16 bit recording format?

Just curious. Thanks.

Dave
 
gleason said:
... Is there anything gained over a 16 bit A/D to a 16 bit recording format?

Yes:

- If there's any mixing or processing going on, the extra bits retain accuracy that would otherwise be lost to rounding across multiple calculations (see the sketch after this list).

- Dithering.
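Here's a minimal sketch of the rounding point (plain NumPy, nothing to do with any particular recorder's firmware; the gain values are arbitrary). It quantizes to 16 bits after every step of a gain chain, versus keeping 24 bit precision through the chain and quantizing once at the end:

```python
import numpy as np

rng = np.random.default_rng(0)
signal = rng.uniform(-0.5, 0.5, 48000)  # one second of test audio at 48 kHz

def quantize(x, bits):
    """Round to the nearest representable level at the given bit depth."""
    scale = 2 ** (bits - 1)
    return np.round(x * scale) / scale

gains = [0.5, 1.9, 0.7, 1.4, 0.6, 2.0]  # an arbitrary processing chain

# Path A: quantize to 16 bits after every operation
a = signal.copy()
for g in gains:
    a = quantize(a * g, 16)

# Path B: keep 24 bit precision through the chain, quantize once at the end
b = signal.copy()
for g in gains:
    b = quantize(b * g, 24)
b = quantize(b, 16)

# Reference: exact floating-point result
ref = signal * np.prod(gains)

print("error vs exact, 16 bit at every step:", np.abs(a - ref).mean())
print("error vs exact, 24 bit then 16 bit  :", np.abs(b - ref).mean())
```

Path A accumulates a fresh rounding error at every step; path B's total error is dominated by the single final 16 bit quantization, so it comes out noticeably smaller.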
 
The way I understand it is that higher bit analog-to-digital converters are more expensive and deliver a result with more data in it, essentially a higher resolution. This is a pretty competitive market and manufacturers have to make compromises. Change a 16 bit unit to a 24 bit design, and you will have to change the price along with it.
 
It could be a way to make use of better or newer converters while keeping (or keeping the option of) recording at 16 bit.
I fed my 16 bit ADATs for a while with 24 bit RME converters and it definitely lowered the noise level. But you would really have to jack the playback level way up to begin to hear this. (Don't mess up if you try this. :eek: )
Later.
:)
 
It's mostly a question of headroom. With more bits you get a bigger dynamic range (roughly 6 dB per bit), which means you can amplify a weak signal without losing quality. Compression is the prime example of this.
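As a back-of-the-envelope check on that dynamic range figure (the 6 dB per bit rule for an ideal quantizer, purely illustrative):

```python
import math

# Ideal quantizer dynamic range: each bit adds about 6.02 dB (20 * log10(2))
for bits in (16, 20, 24):
    print(f"{bits} bit: ~{20 * math.log10(2 ** bits):.1f} dB")
# 16 bit: ~96.3 dB
# 20 bit: ~120.4 dB
# 24 bit: ~144.5 dB
```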
 
Any converter worth a damn would dither from 24 bit to 16 bit, not just "throw them away". If that were the case, then yeah, it would be pointless to A/D at 24 and then record at 16. Throwing the 8 bits away just introduces noise.
 
deadleafecho said:
Any converter worth a damn would dither from 24 bit to 16 bit, not just "throw them away". If that were the case, then yeah, it would be pointless to A/D at 24 and then record at 16. Throwing the 8 bits away just introduces noise.

This is incorrect.

Actually, dithering is what introduces noise: plain truncation produces distortion that's correlated with the signal, and dither replaces it with benign, uncorrelated noise. You trade noise for resolution. :)
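To see the trade in numbers, here's a rough sketch (plain NumPy; TPDF dither is used here just as a common choice, not as anyone's actual product behavior). A very quiet tone, only a few 16 bit steps tall, is reduced to 16 bits with and without dither:

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(48000) / 48000
x = 0.0001 * np.sin(2 * np.pi * 1000 * t)  # very quiet 1 kHz tone

lsb = 1 / 2 ** 15  # one 16 bit quantization step

def truncate16(x):
    # drop the low bits outright
    return np.floor(x / lsb) * lsb

def dither16(x):
    # TPDF dither: sum of two uniform randoms spanning +-1 LSB,
    # added before rounding
    tpdf = (rng.uniform(-0.5, 0.5, x.size) + rng.uniform(-0.5, 0.5, x.size)) * lsb
    return np.round((x + tpdf) / lsb) * lsb

print("truncation error std:", (truncate16(x) - x).std())
print("dithered error std  :", (dither16(x) - x).std())
```

The dithered version measures noisier, but its error is no longer correlated with the signal, so the low-level tone survives instead of turning into a distorted staircase. That's the noise-for-resolution trade.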
 
Thanks for all the replies. Just trying to understand some of this stuff. Will read up on "dithering".

Dave
 