mshilarious
Banned
OK, I can think of four semi-legitimate complaints against USB mics in general:
- They are mono. OK, they don't have to be--most if not all USB codecs (I assume you know what a codec is) are stereo, so they *can* support stereo operation. That's a design decision, not a design limitation. It's possible for them to be multichannel as well, that's down to driver implementation--see next point.
- They are limited to 16/48. That's only true of USB mics intended for USB Audio Class 1.0 compliance. That is actually a feature rather than a limit, because it means the mic does not need custom drivers: the AC 1.0 standard is natively supported by *many* devices, especially portable ones.
And is 16 bit really the limit you think it is? Yeah, 16 bit was more work, but somehow we managed to slog through. Read on! First, if a mono mic is what is called for in the application, then the stereo codec can be cleverly used as a differential input--happily, some codec manufacturers seem to use dither that is at least partially common-mode. That means you can realize >100dB mono dynamic range out of a stereo 16 bit converter. Clever.
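A rough numerical sketch of that differential trick (an illustration with made-up noise levels, not a model of any particular codec): feed the mic signal to the two codec channels in opposite polarity, subtract on the digital side, and any noise or dither that is common to both channels cancels while the signal adds.

```python
import numpy as np

rng = np.random.default_rng(0)
fs, n = 48_000, 100_000
t = np.arange(n) / fs
signal = 0.5 * np.sin(2 * np.pi * 1_000 * t)  # 1 kHz test tone

# Noise that is common-mode (identical on both channels, e.g. shared dither)
# plus noise that is independent per channel. Levels are arbitrary.
common = rng.normal(0, 1e-3, n)
indep_l = rng.normal(0, 1e-3, n)
indep_r = rng.normal(0, 1e-3, n)

# Differential drive: the mic hits the two ADC inputs in opposite polarity.
left = signal + common + indep_l
right = -signal + common + indep_r

single = left                # single-ended use of one channel
mono = (left - right) / 2    # differential recombination: common term cancels

def snr_db(clean, noisy):
    noise = noisy - clean
    return 10 * np.log10(np.mean(clean**2) / np.mean(noise**2))

snr_single = snr_db(signal, single)
snr_diff = snr_db(signal, mono)
print(f"single-ended: {snr_single:.1f} dB, differential: {snr_diff:.1f} dB")
```

In this toy example the differential mono path gains about 6 dB: the common-mode noise cancels exactly, and the two independent noise terms average down by another 3 dB.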
But even if you're still stereo, is 90dB dynamic range not enough? That was more than anybody had in analog days. Oh wait, you say, we had Dobly back then (Spinal Tap joke, sorry), that expanded dynamic range beyond what the tape was actually capable of. You know what? You can use the same technique in digiland, and there's even an existing AES standard for CD preemphasis--that together with the differential thingy can push A-weighted dynamic range for a mono mic close to 110dB--out of 16 bit! And stereo to 100dB.
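For anyone who wants to check the arithmetic behind those figures, here it is spelled out. The per-technique gains are the ballpark numbers claimed above, not measured values:

```python
def ideal_sine_snr_db(bits: int) -> float:
    """Theoretical SNR of an ideal N-bit quantizer with a full-scale sine."""
    return 6.02 * bits + 1.76

theoretical_16 = ideal_sine_snr_db(16)   # ~98 dB on paper for 16 bit

practical_stereo = 90.0      # realistic unweighted figure for a stereo codec
emphasis_gain = 10.0         # assumed A-weighted gain from pre-emphasis companding
differential_gain = 10.0     # assumed gain from the differential mono trick

stereo_with_emphasis = practical_stereo + emphasis_gain            # ~100 dB stereo
mono_diff_with_emphasis = stereo_with_emphasis + differential_gain # ~110 dB mono

print(theoretical_16, stereo_with_emphasis, mono_diff_with_emphasis)
```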
But hey! If a manufacturer chooses, they can bypass AC 1.0 and write drivers, and do any sample rate/bit depth they want. How would that be any different from a USB interface? Do you honestly think the electronics in an "interface" are magically different than in a "mic"? Think again.
But wait once more! For a few years now there's been AC 2.0, which is 24/192 and up to six channels (that's off the top of my head, gotta check). It hasn't been widely implemented, but once it is, then we'll have 24/192 surround USB mics (ports for extra capsules if you like) with driver-free, universal plug-and-play.
- They are ADC only. Well, not all are, some have headphone ports. As I said, they mostly use codecs, so they are capable of doing ADC and DAC. This is an implementation issue. A corollary to that is you can't select different input and output audio devices with ASIO. That is an ASIO limitation, not a limitation of the laws of physics in the universe.
- The codecs are cheap. Actually they are rather expensive! But that shouldn't matter, because some of them allow SPDIF input. Now it's a funny thing and maybe not implemented often or ever, but you could use the sexiest converter chip you like into a SPDIF transmitter into a USB codec, and the codec will have nothing to do with the quality of conversion. Or you can use a USB transceiver chip with your converter IC via I2S and skip the codec and SPDIF transmitter (several USB mics do that) and write your own drivers. Again, this is implementation, not a physical limit. I find that if you work with the codecs for a while and squeeze what you can out of their circuits, the quality is just fine--as good or better than the prosumer converters before 2001 or so. But I guess nobody ever made a hit record on an ADAT, right?
All of the other objections--and I invite you to try them--are based upon your misunderstanding of digital and analog electronics. So let's hear them.
Bottom line: if a recordist wants a simple, no-fuss, no-muss method of recording *mono or stereo tracks only*, you are doing them a serious disservice by universally panning a class of microphones you largely haven't even tried. You are costing them money, and money=time, which means you are unnecessarily taking away a piece of their life. Please stop it.