bblackwood said:
Heck, few 24 bit ADCs actually achieve 20 bit performance - the extra four bits are typically referred to as 'marketing bits'...
Nothing personal, Brad - I mean that - but that answer just further begs the question. Maybe I'm not doing a good enough job of asking the question or something, but I feel like I'm going in circles here.
Let me cut right to the chase: Why don't the marketing bozos who are hawking "20-bit CDs" (which they admit in their own documentation refers to the word length of their converters, not the actual word length on the CD or the "simulated quality" of the CD) just go ahead and use the same 24-bit converters the rest of us do and call their CDs "24-bit CDs"? Why make a big deal out of only 20 bits, which in 2005 is technology that has been obsolete for several years?
And if there's "nothing wrong with" them using the terminology they use, then why can't the rest of us make the same claim when we dither our 24 bits down to a 16-bit CD-R (regardless of whether those last 8 bits are all zeros or all ones or anything in between)? I can give the answer to that last question: because we know that it is wrong on both a technical and an aesthetic level, not to mention the subjective ethical level.
On a more analytical point: To say that a 16-bit CD has a "20-bit quality" to it because that is the width of the converter before dithering, and/or because of the quality of the dithering algorithm itself, is incorrect at best and meaningless at worst. The reduction of word length combined with the addition of dither is a two-stage compounding of noise onto the signal (even if the last four bits are all zeros). The noise added by dithering reduces the "unpleasantness" or "unnaturalness" caused by the mathematical artifacts of truncating the word length, but it is nothing more than a "trick" that takes advantage of how the human ear and mind process sound. In strict mathematical terms it is still adding a level of noise that takes the bits one step further away from their original 20-bit values. The dithering may make it sound "better", but it does so in a way totally unrelated to the way a wider word length would. Dithering does not add resolution or dynamic range to the signal; it just "smears" the resolution in the last bit(s) to take the edge off the truncation.
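Just to make that concrete, here's a rough sketch in Python (my own toy example, nothing from anyone's 20-bit marketing literature; the test signal, the 16-bit scaling, and the use of plain TPDF dither are all just assumptions for illustration) of what reducing a high-resolution sample to 16 bits actually does:

import numpy as np

# Toy illustration: reduce high-resolution float samples to 16 bits,
# once by plain truncation and once with TPDF dither added first.
rng = np.random.default_rng(0)

# A quiet 1 kHz sine at 48 kHz, well below full scale, as float in [-1, 1)
fs = 48000
t = np.arange(fs) / fs
signal = 0.001 * np.sin(2 * np.pi * 1000 * t)

def truncate_to_16bit(x):
    # Plain requantization: chop down to the nearest 16-bit step
    return np.floor(x * 32768.0) / 32768.0

def dither_to_16bit(x):
    # TPDF dither: add noise spanning about +/- 1 LSB at the 16-bit level,
    # then requantize. The noise decorrelates the quantization error from
    # the signal; it does not restore any of the discarded bits.
    lsb = 1.0 / 32768.0
    tpdf = (rng.random(x.shape) - rng.random(x.shape)) * lsb
    return np.floor((x + tpdf) * 32768.0) / 32768.0

trunc_err = truncate_to_16bit(signal) - signal
dith_err = dither_to_16bit(signal) - signal

print("truncation error RMS:", np.sqrt(np.mean(trunc_err**2)))
print("dithered error RMS:  ", np.sqrt(np.mean(dith_err**2)))

Run it and the dithered version actually shows a bit *more* total error energy than plain truncation; the only "improvement" is that the error is noise-like instead of being correlated with (and harmonically related to) the signal, which is exactly the perceptual trick I'm describing above, not added resolution.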
That actually brings up another question I've had for some time now: why the powers that be who designed these systems decided to use the extra bits to increase dynamic range instead of increasing the resolution within a given dynamic range. But that's a question for another thread...
G.