I'm sorry to have to ask this, but I couldn't find anything in the archives that provided an answer. I guess my knowledge is at too basic a level; that's why I keep asking these damn questions about A/D conversion, because I'm trying to understand. Maybe one of these days I'll get it. Anyway, here goes:
Say I'm using an external 24-bit A/D converter, like a Lucid; I've got an analog signal plugged into its input, and its output (I'm assuming) is connected to my computer via SPDIF. From what I've read, SPDIF always transmits 24-bit words (though I've also read that some of those bits are 0, or "unused", if you're sending a lower-resolution signal).
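Just so it's clear how I'm picturing that, here's a little Python sketch of the padding as I understand it. The function name is just mine, not from any real SPDIF library, and I'm assuming the sample gets left-justified in the 24-bit field with the leftover low bits sent as zeros:

```python
def pack_into_24bit_field(sample: int, bit_depth: int) -> int:
    """Left-justify a `bit_depth`-bit sample in a 24-bit SPDIF audio field.

    The sample occupies the most significant bits; the remaining
    low-order bits go out as zeros (the "unused" bits).
    """
    assert 1 <= bit_depth <= 24
    return (sample << (24 - bit_depth)) & 0xFFFFFF

# A full-scale 16-bit sample: the top 16 bits carry data,
# the bottom 8 bits are zero on the wire.
print(f"{pack_into_24bit_field(0xFFFF, 16):06X}")  # -> FFFF00
```

Is that roughly right?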
OK, so my question is, can I/should I plug an external, stand-alone 24-bit A/D converter into a 20-bit card? And will my resulting signal be 20-bit (a limitation imposed by my card, even though I'm not using its converters), or will it be 24-bit (limited only by SPDIF, the Lucid-type converter, and/or the software I'm recording into)?
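Here's the scenario I'm worried about, sketched the same way, assuming (and this is purely my guess) that a 20-bit card would simply throw away the four low bits of each incoming 24-bit word:

```python
def truncate_to_card_depth(word24: int, card_bits: int) -> int:
    """Zero the bits below the card's resolution.

    If the receiving card only keeps `card_bits` of the 24-bit word,
    the extra low-order bits from the external converter are lost.
    """
    drop = 24 - card_bits
    return (word24 >> drop) << drop

# A 24-bit word with nonzero LSBs loses its bottom 4 bits on a 20-bit card.
print(f"{truncate_to_card_depth(0xABCDEF, 20):06X}")  # -> ABCDE0
```

Is that what actually happens, or does the card pass the full 24 bits through untouched?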