Why digital is superior to analog

  • Thread starter: jordanstreet
Status
Not open for further replies.
Answer: digital is NOT superior to analog. Also, analog in many ways is NOT superior to digital. Thread closed. This argument cannot be won by either side, because they both have advantages and disadvantages.

You are free to leave the thread if you like, but I don't see how your post is relevant to the continuing discussion. To the OP, this was a low-quality term paper. To most of the rest of us this is an interchange, not an argument.

For example, most people probably didn't know that most converter ICs actually hit 0dBFS at 1-2VRMS. It's not complicated, really: the digital world standardized on supply rails a long time ago, and most converter ICs obey those standards because they have to communicate with the rest of the digital world. So their supply rails are typically +5VDC for analog and +5V or +3.3VDC for digital. If you have a +5VDC rail then you can't much exceed that as a peak-to-peak input. That's 1.7VRMS or so. Some chips hold their 0dBFS to within 2.8V peak to peak, though; that's a nice clean 1VRMS. I don't think anybody goes beyond +6dBV as a max input.
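
For anyone who wants to check that arithmetic, here's a minimal sketch (Python; it assumes a full-scale sine wave, and the example voltages are just the ones mentioned above):

import math

def vpp_to_vrms(vpp):
    """Peak-to-peak volts to RMS volts for a sine: Vrms = Vpp / (2 * sqrt(2))."""
    return vpp / (2 * math.sqrt(2))

def vrms_to_dbv(vrms):
    """RMS volts to dBV (0 dBV = 1 VRMS)."""
    return 20 * math.log10(vrms)

for vpp in (5.0, 2.8):  # a +5V rail-limited swing, and a 2.8V peak-to-peak full scale
    vrms = vpp_to_vrms(vpp)
    print(f"{vpp} Vpp -> {vrms:.2f} VRMS -> {vrms_to_dbv(vrms):+.1f} dBV")
# 5.0 Vpp -> 1.77 VRMS -> +4.9 dBV
# 2.8 Vpp -> 0.99 VRMS -> -0.1 dBV (the "nice clean 1VRMS")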

So the whole discussion about digital headroom and operating levels is trying to fit new wine into old wineskins.

Let's look at mics. The maximum output a mic can produce is around +10dBV. That's a -30dBV/Pa mic at 134dBSPL; above that most mics would need an internal pad on their capsule to avoid distortion. That signal is going to need to be padded before any converter IC; let's say -10dB. That yields a max level of 0dBV, and adding back +6dB of gain puts us right at our converter's clip point. We need analog headroom above that, let's say +10dBV. That requires a +/-5V supply rail. Hey, we already have a +5V rail for our converter! Just need that -5V now. Works out pretty well.

That yields an operating level right at -10dBV . . . a normal condenser mic at normal source levels would require a mere +20dB of gain; dynamics +40dB. How simple life could be . . .
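
A quick back-of-the-envelope of that gain structure (a Python sketch; the sensitivity, SPL, pad, and clip figures are the illustrative numbers from the two paragraphs above, not specs of any particular product):

import math

mic_sensitivity_dbv_per_pa = -30.0  # mic output at 94 dBSPL (1 Pa)
max_spl_db = 134.0                  # loudest source we care about
pad_db = -10.0                      # pad ahead of the converter
converter_clip_dbv = 6.0            # +6dBV max input (~2 VRMS)

mic_max_out_dbv = mic_sensitivity_dbv_per_pa + (max_spl_db - 94.0)  # +10 dBV
padded_max_dbv = mic_max_out_dbv + pad_db                           # 0 dBV
gain_to_clip_db = converter_clip_dbv - padded_max_dbv               # +6 dB

# +10dBV of analog headroom above that 0dBV worst case:
headroom_vpeak = (10 ** (10.0 / 20.0)) * math.sqrt(2)  # ~4.5 V peak -> fits +/-5V rails

print(mic_max_out_dbv, padded_max_dbv, gain_to_clip_db, round(headroom_vpeak, 2))
# 10.0 0.0 6.0 4.47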
 
For example, most people probably didn't know that most converter ICs actually hit 0dBFS at 1-2VRMS. It's not complicated, really. . .

:roflmao: :roflmao:
 
You are free to leave the thread if you like, but I don't see how your post is relevant to the continuing discussion. . .

WOW! My previous post was meant as a joke to ez_willis. Get it right! And thanks for the offer to leave the thread. You da MAN!!
 
You are free to leave the thread if you like, but I don't see how your post is relevant to the continuing discussion. [. . .] That yields an operating level right at -10dBV . . . a normal condenser mic at normal source levels would require a mere +20dB of gain; dynamics +40dB. How simple life could be . . .

Nope, wrong.
 
I also wanted to touch on the notion of not merely having "headroom" from operating level to clip point, but another 6 or 8 dB of "cushion" to make sure you not only don't clip, but stay in the best sounding operating range of the gear. Taking this seriously and consistently across the studio may cause you to decide that +4 dBu is too high of a level to do properly.
Also agreed, and also a good point. I don't know if it's sad or humorous, or both, but it's something how the more one actually understands about the nuts and bolts of this stuff, and the more one gets into gaming gain structure, the further (in concept, and more often than not in reality) one gets from the mythological "record as hot as you can" and "use all the bits" mentality that's still being fed to beginners like hormones to caged chickens.
Gents, it seems to me that the only way you can effectively compare analog and digital clipping / distortion levels across various analog and digital systems is by using volts.
Which is exactly what I was trying to do, by using the +4dBu (aka 0VU in the signal path, aka a signal level of ~1.23v) level as a common base reference.
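
For reference, here's the arithmetic behind that ~1.23V figure (a minimal Python sketch; 0dBu is defined as about 0.775VRMS):

import math

DBU_REF_VRMS = 0.7746  # 0 dBu = sqrt(0.6) V, about 0.775 VRMS

def dbu_to_vrms(dbu):
    return DBU_REF_VRMS * 10 ** (dbu / 20.0)

def vrms_to_dbu(vrms):
    return 20.0 * math.log10(vrms / DBU_REF_VRMS)

print(f"+4 dBu = {dbu_to_vrms(4.0):.3f} VRMS")        # ~1.228 V, the "~1.23v" above
print(f"1.0 VRMS (0 dBV) = {vrms_to_dbu(1.0):+.1f} dBu")  # about +2.2 dBu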

The beauty of this in practical audio engineering terms is that this is not so arbitrary as it may seem in some ways. Otto is right, in that when gaming the gain structure, you have to game the structure and not just arbitrarily pick 0VU, because some hardware or wetware may have more preferred sonic characteristics at other levels. But like planets around the sun, they all really have their dance around 0VU as the overall attractor.

This thought was not lost on the engineers when deciding how to calibrate converters. It's not a coincidence that as converter quality increases along with digital word length, the dBu-to-dBFS calibration tends generally downward, starting back in the 90s with the -14dBFS EBU standard and the -15dBFS DAT calibrations, down through the -18s to the increasingly common -20s and even occasional -24s of today. The lower noise floors and increased digital dynamic range have allowed for the downturn in the calibration level, providing for more digital "headroom" above that reference without losing the lower volume dynamics to the digital noise floor.
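
To put some numbers on that calibration trend, here's a small sketch (Python; the alignment values are just the ones mentioned above, treated as examples rather than a standards citation):

DBU_REF_VRMS = 0.7746  # 0 dBu, about 0.775 VRMS

# If +4 dBu (0 VU) is aligned to a given dBFS value, then 0 dBFS sits that many
# dB above reference -- the digital "headroom" over the analog operating level.
for cal_dbfs in (-14, -15, -18, -20, -24):
    headroom_db = -cal_dbfs
    clip_dbu = 4.0 + headroom_db  # analog level that just reaches 0 dBFS
    clip_vrms = DBU_REF_VRMS * 10 ** (clip_dbu / 20.0)
    print(f"+4 dBu at {cal_dbfs} dBFS -> 0 dBFS = +{clip_dbu:.0f} dBu "
          f"(~{clip_vrms:.1f} VRMS), {headroom_db} dB over reference")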

Additionally, it sometimes seems almost magical that when one actually follows the dance around and near that line level attractor from analog into digital, and one mixes digitally in an "analog way", that the numbers just seem to automatically work themselves out at the other end, with pre-mastering mixes resulting in levels and relative quality most similar to those of quality pre-mastering mixes of the pure 100% analog days of yore. This, of course, is not magic nor coincidence, but rather exactly why the A/D conversion and digital design world has evolved the way it has. There is purpose there in the newfangled digital gizmos, if one is used to the idea of old school gain structure because, at least in that way, not much has really changed.

(the original ;) :D) G.
 
. . . The lower noise floors and increased digital dynamic range have allowed for the downturn in the calibration level, providing for more digital "headroom" above that reference without losing the lower volume dynamics to the digital noise floor.
Agreed, and this is a big area of weakness in analog recording. No matter how good it is, it can't match the dynamic range of a good digital recorder.
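
As a rough sanity check on that, a sketch using the usual ideal-quantization formula (real converters and real tape machines both land below theory):

def ideal_quantization_snr_db(bits):
    """Theoretical SNR of an ideal N-bit quantizer driven by a full-scale sine:
    roughly 6.02*N + 1.76 dB."""
    return 6.02 * bits + 1.76

for bits in (16, 24):
    print(f"{bits}-bit: ~{ideal_quantization_snr_db(bits):.0f} dB")
# 16-bit: ~98 dB, 24-bit: ~146 dB -- versus the 60-70 dB or so commonly quoted
# for analog tape without companding noise reduction.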

The noise in analog recorders is...analog. It's inherent in trying to use analog means to record and playback. It's not the same as an analog signal running through a cable, a point apparently lost on some people.

To reduce analog recording noise to about the same as the rest of a good live signal chain, you need serious compander noise reduction, which is complex circuitry itself. Digital recording doesn't need NR.

Countless excellent recordings have been made using analog recorders, and maybe that will continue, but it won't be because they are analog but in spite of being analog.

In certain circumstances, analog tape distortion/compression is a pleasing production effect. But it's still just an effect and a distortion.

Cheers Tim
 
It's pretty much unreadable.
I wouldn't say it's unreadable. I'd say it's got a lot of posts that are simply common knowledge that's been restated to try to sound intellectual.

But your post beats mine. 4 words to 26, not counting contractions. Brevity rules.
 
In certain circumstances, analog tape distortion/compression is a pleasing production effect. But it's still just an effect and a distortion.


True...but THAT may be the essence of why so many folks seem to prefer the sound of tape/analog VS. digital (without just looking/comparing their specs).

For instance...I HATE digital TV...you either have a signal or you don't, so the incremental loss of signal (or you can call it distortion gain) you get with analog TV is much more acceptable to the eye than NO signal. Plus...did you ever look at some of the colors on digital TV...the fades & gradients...etc...some serious artifacts! It just doesn't have the "smoothness" of analog...that gentle degradation. And I'm no bigger fan of HD TV either. I DON'T LIKE that hard/harsh contrast so I can see every blackhead on someone's nose! :D
It’s kinda OK for sports, maybe…but I do not like it for anything else. The "softness" of film will trump HD TV every time IMO....and is much more pleasing to the eye than HD TV for any serious viewing. Watching HD TV is like being on amphetamines (so I'm told ;) )...it just has that semi-psychotic quality to it..
And I think there's some similarity with digital audio. I know this may sound bizarre...but IMO, it can be TOO perfect. I think just like the eye, the ears seem to prefer the imperfections of tape over the perfections of digital (for many people) because of that incremental/gentle degradation of signal you get with tape.
I know a lot of this is subjective and a matter of personal preference...but I do believe that the pleasing distortion/compression that tape provides in many cases outweighs the extra headroom of digital as far as sound quality is concerned.
And besides...many people don't use the digital headroom anyway, they just push everything up to 0dBFS because they can. So in some ways, digital is its own worst enemy. The debates and discussions trying to get people to work with the concept of headroom the way it's done with analog just don't click for many folks who never worked with tape/analog.
 
Agreed, and this is a big area of weakness in analog recording . No matter how good it is, it cant match the dynamic range of a good digital recorder.
And besides...many people don't use the digital headroom anyway, they just push everything up to 0dBFS because they can. So in some ways, digital is its own worst enemy.
This is the Big Irony IMHO and the crying shame of the trends of the digital age. One of digital's big selling points from the outset was the depth of clean dynamic range it offered. Even at 16-bit it was/is usually significantly more than the total dynamic range of the analog signal being pumped into it.

Yet as that range has only increased over the years, there has been an ever-greater tendency to smash the shit out of the signal so that only the top 20-25% (give or take) of it is actually used. The result is that 0dBFS, instead of being a very high limit that shouldn't even be a concern the way the "system" was designed, has, if you look at the reality of it all, now become a focal point of concern for most of the folks who come to places like this. And I'm not sure you can blame digital (the technology) for that; it's not the technology's fault that people misuse it.
True...but THAT may be the essence of why so many folks seem to prefer the sound of tape/analog VS. digital (without just looking/comparing their specs).
How much of that, though, is largely because that's what folks have become accustomed to, and not because it actually has an inherent pleasantness? Imagine a hypothetical planet where they, for whatever historical reason, tripped across digital recording before they did analog. They were used to digital for many years. Yeah, it had gotten better as technology increased, but then someone came along with the idea of analog recording. Now, if analog recording truly intrinsically sounded better, there'd be no nostalgia for the "old school" digital sound, and digital would be dropped like a lead balloon.

Do you think that would happen that easily, or would people think that analog sounded too distorted and artifacty compared to the "clean" digital they and their parents grew up on?

I suspect that it wouldn't be much different than it is here and now, and that on Planet Digital First there'd be as many debates in as many forums like this one over the whole A vs. D subject. The only real difference would be a role reversal between the two sides.

G.
 
Well folks, the previous post pretty much wraps up this thread. Though, as a critique, it would have been more effective if the three superfluous words had been removed... value of brevity and all...


No doubt a following post will make this thread WAY longer. But, none the less, at this point it's wrapped up.
 
Well folks, the previous post pretty much wraps up this thread. . . at this point it's wrapped up.

So, why is digital superior to analog? Oh, because Gerg said so.....:p
 
After scanning parts of this endless wankthread I've decided that analog is superior to digital. For the simple reason that it's one letter shorter.
 
After scanning parts of this endless wankthread I've decided that analog is superior to digital. For the simple reason that it's one letter shorter.

But Gerg didn't say so.....:mad:
 