Why digital is superior to analog

Status
Not open for further replies.
Digital systems can be frightfully linear right up to their top value ("full scale"), above which they clip instantly, because there is nowhere higher to go in a world of zeroes and ones once every bit is flipped on. To create headroom in a digital system, you pick a level below full scale as your reference level (for an "average" signal level), and the distance between that reference and full scale is the headroom you leave for peaks to avoid clipping - which you do want to avoid, because it doesn't sound nice.
This is an excellent point. In fact, if you choose the exact same reference levels for digital as you do for analog, you actually have MORE headroom with digital.

The erroneous part of what hardwire was told was that it equated 0dB analog with 0dB digital, when in fact they are two entirely different levels. 0dBVU analog does not equal 0dBFS digital any more than 0°F equals 0°C. Depending upon the exact calibration of the converter, 0dBVU actually equals somewhere between -22dBFS and -14dBFS digital. Usually, to save time in forums like this, -18dBFS is quoted as an average "standard" conversion value (though there's really nothing "standard" about it).

This means that even if the analog tape deck can be pushed to +8dB without unpleasant amounts of distortion, that still only comes out to a maximum of -10dBFS on a system with converters calibrated to -18dBFS. This basically would mean that digital has 10 more decibels of headroom than analog on that system.
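To put numbers on that, here's a minimal sketch of the dBVU-to-dBFS translation. The -18dBFS calibration point is just the commonly quoted value from above, not a universal standard:

```python
# Sketch of the dBVU-to-dBFS relationship described above.
# -18 dBFS for 0 dBVU is the commonly quoted value, not a universal standard;
# real converters fall anywhere from about -22 to -14 dBFS.

CALIBRATION_DBFS = -18.0  # dBFS reading that corresponds to 0 dBVU

def vu_to_dbfs(level_dbvu, calibration=CALIBRATION_DBFS):
    """Translate an analog VU-scale reading into digital dBFS."""
    return level_dbvu + calibration

def headroom_db(level_dbvu, calibration=CALIBRATION_DBFS):
    """dB remaining before digital full scale (0 dBFS) clips."""
    return -vu_to_dbfs(level_dbvu, calibration)

# A tape deck pushed to +8 (dBVU) still only reaches -10 dBFS,
# leaving 10 dB of headroom on this converter:
print(vu_to_dbfs(8.0))   # -10.0
print(headroom_db(8.0))  # 10.0
```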

G.
 

Yes, it's key to note that 0VU on a tape machine represents the time-averaged level of a signal of "average" intensity. More realistically, it's the time-averaged level of a forte passage. 0dBFS in a digital system is the absolute maximum level.

Glen, I think you are also confusing the terms slightly in that last bit, so I will give a bit more specificity.

The "+6 dB" reference on tape is actually with regard to the old Ampex operating level of reference fluxivity (0VU) of 180 nWb/m at 1 kHz, corresponding to the Ampex level of 185 nWb/m at 700 Hz. +6 is twice as much, actually 355 nWb/m at 1 kHz, so I assume there was some rounding in the 180 number.

Really high output tapes like 3M 996 might saturate at 2700 nWb/m, which is nearly +24 dB (above the Ampex level). If you were using a reference level of +9 for that tape, that would leave you 15 dB above the reference level to reach saturation. If you calibrate at +6, that would leave 18 dB.

Of course that maximum level is pretty distorted. More realistically, you might typically have 10-15 dB of usable room above your reference level in most situations. Of course, you also end up with a signal with a crest factor no greater than that amount.
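Those fluxivity-to-dB figures are easy to check with a short script. This uses the 180 nWb/m Ampex level as the 0 dB reference, as above; fluxivity is an amplitude quantity, so 20·log10 applies:

```python
import math

AMPEX_REF_NWB = 180.0  # old Ampex operating level, nWb/m at 1 kHz

def flux_to_db(flux_nwb, reference=AMPEX_REF_NWB):
    """dB of a fluxivity relative to the Ampex level (amplitude ratio)."""
    return 20.0 * math.log10(flux_nwb / reference)

print(round(flux_to_db(355.0), 1))   # 5.9  -> the "+6" level, rounding as noted
print(round(flux_to_db(2700.0), 1))  # 23.5 -> 3M 996 saturation, "nearly +24"
# Room left above a +9 reference level before saturation:
print(round(flux_to_db(2700.0) - 9.0, 1))  # 14.5 -> roughly the 15 dB quoted
```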

Cheers,

Otto
 
Ahhh, all good stuff. Thank you gentlemen.:)
 

Indeed.

Ofajen, so do I understand correctly that by simply going from the Ampex tape to the 3M 996 you can push the levels a lot higher on the same machine (assuming the electronics on the machine itself will allow)?
 
The "+6 dB" reference on tape is actually with regard to the old Ampex operating level of reference fluxivity (0VU) of 180 nWb/m at 1K
On the tape side of the equation, yes. But - and correct me if I'm wrong - there's also the input signal level side of it which also equates the 0VU reading to +4dBu of input signal ("line level"). In other words, in effect, a properly calibrated tape deck calibrates the 0VU reference level on both sides of the heads to each other.

With this in mind, assuming 0dBVU to represent +4dBu - using "line level" as the reference average - not only would a +6 on the tape also translate (assuming a unity gain setting on the deck output, of course) to a +6dBVU/+10dBu signal chain level, but it would also directly translate, via the conversion level calibrated into the converter, to the corresponding reading in dBFS on the digital side.
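A hypothetical worked version of that chain - assuming 0dBVU = +4dBu line level, unity gain on the deck output, and a converter calibrated so that 0dBVU lands at -18dBFS (all assumptions, per the earlier caveats):

```python
LINE_REF_DBU = 4.0  # assumed: 0 dBVU taken as +4 dBu "line level"
CAL_DBFS = -18.0    # assumed converter calibration: 0 dBVU -> -18 dBFS

def chain_levels(tape_dbvu):
    """Map a tape-side VU reading to line level (dBu) and digital level (dBFS)."""
    dbu = tape_dbvu + LINE_REF_DBU   # unity gain from deck output to line
    dbfs = tape_dbvu + CAL_DBFS      # calibrated conversion to digital
    return dbu, dbfs

# A "+6" hit on tape shows up as +10 dBu on the wire and -12 dBFS in the box:
print(chain_levels(6.0))  # (10.0, -12.0)
```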

G.
 
Indeed.

Ofajen, so do I understand correctly that by simply going from the Ampex tape to the 3M 996 you can push the levels a lot higher on the same machine (assuming the electronics on the machine itself will allow)?

In theory, yes, but the difference would never be so striking, nor is it necessarily so easy! Remember that tape machines and tape evolved together over a period of decades. Scotch 111 tape could only handle the Ampex level. That was the tape of the 50s. Eventually there were +3 tapes like Scotch 203, and then +6 tapes like Ampex 456 and then +9 tapes like 996 and Ampex 499 and BASF SM 900. If you have a consumer or semi-pro machine it may want to stay at whatever it was designed for: +3 or +6. To move up to a +9 tape means using a thicker tape and that may be a problem for your transport.

Assuming the transport is OK, you still have to bias the tape and then record.

No matter what level you record at, you need enough bias field to bias for maximum sensitivity at 1 kHz. So first you make sure the machine can bias the tape.

Once you've done that, maximum recording level can be limited by either the recording amplifier or the tape. When the tape reaches its limit, the distortion will be symmetrical (the same on both sides of the waveform). I think most recording amplifiers are single-ended, so when they overload, one side of the wave will probably look different from the other side. This should help you tell whether the tape or the amp is limiting the recording level.

The bottom line is that moving up to those thicker, higher-output tapes requires a robust machine: a transport that can handle the thicker tape, electronics that can bias it, and enough headroom in the record electronics to stay in undistorted operation - and it may involve mods to the circuitry. Of course, there are some who like to record that hot with a machine not designed for it, because they like the distortion in the sound.

Cheers,

Otto
 
On the tape side of the equation, yes. But - and correct me if I'm wrong - there's also the input signal level side of it which also equates the 0VU reading to +4dBu of input signal ("line level"). In other words, in effect, a properly calibrated tape deck calibrates the 0VU reference level on both sides of the heads to each other.

With this in mind, assuming 0dBVU to represent +4dBu - using "line level" as the reference average - not only would a +6 on the tape also translate (assuming a unity gain setting on the deck output, of course) to a +6dBVU/+10dBu signal chain level, but it would also directly translate, via the conversion level calibrated into the converter, to the corresponding reading in dBFS on the digital side.

G.

There is the option, but generally not a requirement, that signal levels be the same going in and coming out. You can calibrate each one. Every tape machine I know of has a record gain control for each channel. It's a nice feature. You can set a range of line signal levels as your reference level on input (my 3M M-23 specifies the range as -20 dBm to +8 dBm on a 600 ohm bus). I typically take the gain up on the mix deck, since I have a little Mackie mixer that sounds much better if you run bus levels down a bit (say -4 dBu, remembering that 0 on the Mackie is actually 0 dBu). Back when I tracked with the Mackie, I'd crank the gain on my multi-track, too, so the preamps wouldn't have to put out a full +4 signal. It makes a significant improvement in the sound.

On the output side, many, but not all machines can either use a calibrated level (+4 or maybe switchable from +4 to +8 or maybe 0) or an adjustable output level which again lets the reference fluxivity represent a different signal level. Obviously, you want to calibrate all these levels carefully and set them up to work to your gear's maximum advantage.

Cheers,

Otto
 
The erroneous part of what hardwire was told was that it equates 0dB analog with 0dB digital, when in fact they are two entirely different levels.

That's what I was more or less talking about. In turn I may have relayed that information wrong, because what I remember from that class is spread pretty thin these days - especially seeing that the only thing I ever have to worry about is not going past 0dB.
 
With this in mind, assuming 0dBVU to represent +4dBu - using "line level" as the reference average - not only would a +6 on the tape also translate (assuming a unity gain setting on the deck output, of course) to a +6dBVU/+10dBu signal chain level, but it would also directly translate, via the conversion level calibrated into the converter, to the corresponding reading in dBFS on the digital side.

OK, Glen, you're still not quite seeing this, which is quite understandable, and I'm going to try to help. The +6 designation is a level of reference fluxivity on the tape. The "+6" is just a convenient way to relate it to the old Ampex standard level that few people still use, other than perhaps the Library of Congress putting stuff on old non-back coated tapes that last forever. It means you're running the recorder 6 dB hotter, putting 6 dB stronger signals on your tape than they did in the 50s. OTOH, you can do this all with real numbers of fluxivity in nWb/m and that's probably easier to follow.

Whatever you pick as a reference level, you then set as 0VU on the meter. So, if you say you choose to calibrate to +6, using RMGI SM 911, for example, since it's a current tape, what you are really doing is setting 0VU to be 355 nWb/m, using an appropriate calibration tape (which could have tones at 355 nWb/m, or just as likely at some other level and corresponding adjustments can be made). That means you will have that "+6" level of 355 nWb/m as your 0VU, commonly with that requiring +4 dBu input and providing +4dBu output, but not necessarily, as I've noted.

I believe SM 911 has a saturation flux about equal to Scotch 226 which was 2000 nWb/m, which is 15 dB above the reference level of 355 nWb/m.

If I were running all the audio at +4, the +15 dB peaks would be at +19 dBu. Running the unbalanced mike pre sends of the Mackie to feed a recorder at that level would have it begging for mercy and already sounding less than pleasant. It hits the wall between +21 and +22, but it's getting ugly before then. Better to keep peaks at or below +16. Hence the recorder gain adjustment to keep 0VU down to somewhere from -4 dBu to 0dBu.
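The arithmetic behind those numbers, as a quick check - 355 nWb/m set as 0VU at +4dBu, with the tape's saturation taken as roughly 2000 nWb/m, per the assumptions above:

```python
import math

REF_FLUX = 355.0    # nWb/m, the "+6" reference fluxivity set to 0VU
SAT_FLUX = 2000.0   # approximate saturation fluxivity of the tape
LINE_REF_DBU = 4.0  # 0VU run at +4 dBu

headroom_db = 20.0 * math.log10(SAT_FLUX / REF_FLUX)  # tape room above 0VU
peak_dbu = LINE_REF_DBU + headroom_db                 # line level at tape saturation

print(round(headroom_db, 1))  # 15.0 -> the "+15 dB peaks"
print(round(peak_dbu, 1))     # 19.0 -> the "+19 dBu" that upsets the Mackie
```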

Does that make more sense?

Cheers,

Otto
 

To be more explicit for those not familiar with the process, standard calibration of a tape machine is done with a choice of reference level and is done on a particular machine for a specific type of tape.

It's done in two parts: first you use a calibration tape to set the reproduction ("repro" or playback) side, so that playing back the reference fluxivity level indicates 0VU and the playback frequency response is as good as possible and corresponds to some standard eq curve - so that if you play someone else's tapes, they sound right. Then you calibrate the record section using the tape you want to use, so that your tapes will sound right both on your machine and on someone else's.

You start with a 1K tone for level, adjust head azimuth, then play the tones or tone sweep and adjust the playback eq for flattest response.

You have to do that first, because you need a properly calibrated playback side to do the record calibration for the specific tape you want to use.

Basically, once the playback side is right, you remove the calibration tape and put on the reel of tape you want to use and now you are ready to set up the record electronics.

First, you set the bias (there are various ways; a simple one is to adjust for maximum output at 1K and then overbias until the output drops by 1 dB).

Next, you adjust record head azimuth.

Next, choose your audio signal input operating level by the level of the signal you input (i.e. if you want +4 dBu, input a +4 dBu signal at 1K; if you want -2 dBu, input a -2 dBu tone).

Now adjust the record gain so that when you record it, the playback indicates you are getting the reference fluxivity (0VU) on the tape. Adjust the record monitor so that that same signal also shows up as 0VU on the input.

At this point, putting in a 1K tone at your operating level should show 0VU when you monitor the input and if you push record and monitor playback, it should also show 0VU. But you want it to work that way for all frequencies.

So you adjust the record eq to get the high frequencies to your liking (you typically have to choose between seeking the flattest response and taking more droop at the top, or the broadest response, with a broad peak somewhere between 6K and 10K). Some machines have a problem doing the record eq at full operating level (full operating level at 20K is a LOT of 20K), so you may have to back off to -10VU, adjust for flat response, and then pull the gain back up at the end to get the level right.

Then you adjust low end on playback, because you are looking for overall flat response and other than Studers, there is usually no low end adjustment on the record side, so you tweak the low end playback, again to balance minimizing the bass bump versus maintaining response at and below 30 Hz.

That's typically the whole process. It's not exactly the same for every machine, but that's pretty much it.

Cheers,

Otto
 
I think we're talking past each other, Otto (as always happens on these boards. ;)) I understand and agree with all of that about tape calibration and gain staging the I/O of the tape deck. (The more gain stages I have, the happier I am :).)

What I'm talking about is calibrating the machine so that 0VU represents both the accepted tape calibration level *and* the nominal average line level. IOW, if you are using the old Ampex standard, an input preamp level of +4dBu (0VU from the line level perspective) at 1k would wind up laying 180 nWb/m onto the tape (0VU from the tape calibration perspective). The 0VU reading in that case isn't simply telling you what's going on to or off of the tape; it's also indicating the signal level on the electronic side of the heads.

For those decks not calibrated that way, where there is not a line level-to-tape-calibration equivalence, there will obviously be a discontinuity there. Some decks addressed that by offering a meter display toggle that switched between displaying actual tape level and input or output signal level - which was particularly helpful on the output, so you *could* visually adjust the variable output level to send a good gain signal downstream to the next device.

You're right that if the metering and the calibration on the deck do not equate tape level and line level, then the exact numbers I gave earlier will not play out exactly that way. But if pumping 0VU into the deck means that you'll be laying down 0VU by tape calibration standards, and that you'll be playing 0VU back out of the deck (whether by default or by manual I/O gain control), then the numbers will jibe. And sooner or later along the chain, you'll probably want to bring that gain structure into that kind of line regardless of what you do to the tape anyway.

G.
 

Agreed all around. I just wanted to illustrate the options on setting levels in the complete context of what those levels mean. The ideal situation is to have gear that all has the same capacity and expectations for signal level.

I also wanted to touch on the notion of not merely having "headroom" from operating level to clip point, but another 6 or 8 dB of "cushion" to make sure you not only don't clip, but stay in the best-sounding operating range of the gear. Taking this seriously and consistently across the studio may lead you to decide that +4 dBu is too high a level to do properly.

Cheers,

Otto
 
Gents, it seems to me that the only way you can effectively compare analog and digital clipping / distortion levels across various analog and digital systems is by using volts.

Here's the conversion chart again for the various metering formats back to volts once again.

http://hux.com.au/Soapbox%20Items/World%20Audio%20level%20Reference.pdf

If I'm reading it right, it pans out like this:

My 2-track recorders are set to read 0 on their VU meters with a voltage of 1.228V at 320 nWb/m flux. If you look across to the VU column (standard) on the chart, you can see that is "0". So 0VU on my tape recorders' meters is 1.228 volts.

Glen mentions digital systems calibrated to -18dBFS or -24dBFS. They are two different calibration standards depending on what part of the world your digital system is calibrated for. Voltage-wise, they are measuring the same thing. -18dBFS in the EBU R68 standard for the UK & parts of Europe, and -24dBFS using SMPTE RP155 for most of the rest of the world, equate to a voltage of 0.775 volts. On a digital system's meters, depending on what standard your digital system is using, 0.775 volts should show as 0 for a UK/Europe standard system (because it's calibrated to read 0 at 0.775 volts) but -4 for the US one (because it's calibrated to read 0 at 1.228 volts - like my +4dBu tape systems - and that is -20dBFS using RP155).

So my analog 0VU would show as +4 on an EBU R68 calibrated digital system but the same - 0 - on a US SMPTE RP155 calibrated one.

The clipping lights on my recorders are set to +6dB above their 0VU setting. Going back to the voltage chart and adding +6dB to the +4dBu I'm already at equates to 2.499 volts. 2.499 volts on a US calibrated digital system would also show up as +6 on its meters, or -14dBFS on that scale.

I use SM911 tape and it has a saturation level of +10.5dB @ 10kHz over the 320 nWb/m reference level (and +14dB @ 1kHz) - say 14.5dBu (being the 10.5dB @ 10kHz spec plus the +4dBu the tape deck is already set at to meter 0VU), about 4.1 volts - and this would also be +10.5 (or -10.5dBFS) on an RP155 calibrated set of digital meters.

At +16.5dBu the tape is harmonically distorting at 3% and climbing exponentially from there. +16.5dBu equates to say 5.1 volts or so, and would show up as +12.5 (or -8.5dBFS) on a set of RP155 calibrated digital system meters. My tape recorders wouldn't show it because their meters only go up to +3VU - but it'll be red all the way on the clip lights.

As folk have been saying, tape doesn't clip like a digital system; it saturates and distorts in a musically pleasing way in some situations - up to a point. You can hear it when it becomes 'unmusical'. For rock recording, with no noise reduction, going live to two-track, we would regularly hit the tape with the clip lights mostly in the red and peak voltages above 4.0 volts. I have no idea what they peak at. Pushing the same voltages on an RP155 calibrated digital system would show up with levels peaking over +10 (or -10.5dBFS RP155) on its meters.

So, as the comparison can only really be made at the voltage level, the question becomes: at what voltage does the digital system noticeably distort? Then you can get your digital-to-analog 'unmusical' distortion comparison going.

I might try an experiment on my own digital system with the tone generator, 'scope and voltmeter and see when it noticeably clips. I guess it's going to be different for different digital systems? I've got a feeling mine's not going to be too happy much above -10dBu (0.245 volts) input and output. It sure isn't a +4dBu calibrated system or a +4dBu analog tape being hit at +16dBu. And I suspect it conforms to no standard but its own proprietary one.
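The chart's dBu-to-volts conversions can be reproduced with the standard formula (0 dBu = 0.775 V RMS), which is a handy sanity check on the figures above:

```python
import math

DBU_REF_V = 0.775  # volts RMS at 0 dBu

def dbu_to_volts(dbu):
    """Convert a dBu level to RMS volts."""
    return DBU_REF_V * 10.0 ** (dbu / 20.0)

def volts_to_dbu(volts):
    """Convert RMS volts back to dBu."""
    return 20.0 * math.log10(volts / DBU_REF_V)

print(round(dbu_to_volts(4.0), 3))    # 1.228 -> 0VU on a +4 dBu machine
print(round(dbu_to_volts(16.5), 2))   # ~5.18 V, near the 3% distortion point
print(round(volts_to_dbu(1.228), 1))  # 4.0
```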

G
 
Nearly all calibration standards of voltage to digital level are arbitrary, because they aren't based on the physical characteristics of the converter. A typical converter IC actually clips at around 1VRMS - that's about 0dBV. So by setting a standard higher than that, all you are really telling a converter is how much it needs to pad its input (and amplify its output).

I saw one comment on a less reputable board--think I mentioned this already--that real "pro" converters could handle levels up to +24dBu, but weak little prosumer converters could only manage +15dBu. Yeah, as if different resistor values are more expensive or something :rolleyes:
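To illustrate the padding point: if a converter IC clips around 1 V RMS (0 dBV, roughly +2.2 dBu), then a "+24 dBu" spec versus a "+15 dBu" spec is just a different input pad - a resistor-value difference, as noted. A sketch, taking the 0 dBV clip point as the assumption from above:

```python
import math

CLIP_DBV = 0.0  # assumed converter IC clip point: ~1 V RMS = 0 dBV (per the post)

def dbu_to_dbv(dbu):
    # 0 dBu = 0.775 V and 0 dBV = 1.0 V, so dBV = dBu - ~2.21
    return dbu - 20.0 * math.log10(1.0 / 0.775)

def required_pad_db(max_input_dbu):
    """Attenuation needed so the rated maximum input just reaches the clip point."""
    return max(0.0, dbu_to_dbv(max_input_dbu) - CLIP_DBV)

print(round(required_pad_db(24.0), 1))  # 21.8 -> the "pro" +24 dBu spec
print(round(required_pad_db(15.0), 1))  # 12.8 -> the "prosumer" +15 dBu spec
```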

If an integrated preamplifier/converter system was designed to acknowledge that fact, we'd be talking about -20dBV as 0VU. The whole box could run off of +/-5V rails, which would reduce power consumption by two-thirds and make the box smaller, cheaper, and lighter.

But no, let's keep on keepin' on feeding our converters +15dBu, just because that's the level tape needs or whatever . . .
 
Answer... digital is NOT superior to analog. Also, analog in many ways is NOT superior to digital. Thread closed. This argument cannot be won by either side, because they both have advantages and disadvantages.
 

oh right. the next thing you'll tell us is that macs aren't actually superior to pcs and that Pepsi is better than Coke. :rolleyes:
 

Oh please! Everyone in the world already knows Coke is better than Pepsi. That's why I didn't write that in my last post to start with. HEHEHEHEHEHEHE!!!!!
 