Is close enough good enough on calibration? -10dBV = 0VU = .316VAC, almost

Blue Jinn

Rider of the ARPocalypse
OK.

Just throwing this out there, don't pounce too hard.

-10dBV/0VU is supposed to be 0.316 volts. I have a good AC voltmeter* that will read that, and I have the MRL tape. But adjusting a tiny trimmer to get there ain't easy, especially on the MSR-16. As long as the reproduce level is close, the bias voltage is within a few percent of target, the output from the board is close, and 0VU matches 0VU on record and playback at 1kHz, are a few millivolts either side of 0.316 really going to matter that much?

EDIT: *The meter is an HP 400E, 10Hz-10MHz, ±1-3% in the audio spectrum. However, it hasn't been expertly calibrated itself in who knows how long.
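Just for scale, here's a quick back-of-envelope check (plain 20·log10 voltage-ratio math, nothing specific to the MSR-16) of how far a few millivolts actually puts you from the -10 dBV target:

```python
import math

def dbv_to_volts(dbv: float) -> float:
    """Convert a dBV level (referenced to 1 V RMS) to volts RMS."""
    return 10 ** (dbv / 20)

def level_error_db(measured_v: float, target_v: float) -> float:
    """Error in dB between a measured voltage and the target voltage."""
    return 20 * math.log10(measured_v / target_v)

target = dbv_to_volts(-10)  # about 0.3162 V RMS
for v in (0.29, 0.30, 0.316, 0.32):
    print(f"{v:.3f} V -> {level_error_db(v, target):+.2f} dB from -10 dBV")
```

Even 0.29 V works out to only about -0.75 dB, and 0.30 V is about -0.46 dB, so "a few millivolts either side" is well under half a dB.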
 
One of my science teachers said never try to measure to a degree that you cannot verify. 0.3V or 0.32 or 0.29 - that's the kind of accuracy that isn't repeatable, and I doubt the specs of the meter get remotely that accurate.

The other snag is when you say AC voltmeter: a meter designed to measure 50 or 60Hz is probably going to be totally inadequate for measuring voltage at, say, 6kHz. A scope has calibrated displays, so its ability to 'see' audio waveforms is known and accurate. I'd bet most common AC meters are only accurate at mains frequencies, not across the audio band. Don't forget that the AC voltage will also be at a specific impedance.

The best you can do is comparative measurement: measure something you trust, then use that as your comparison. Your assumption that a bit either side of the 'ideal' is OK is how I would work too.
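To put that meter-accuracy point in numbers (a rough sketch, assuming the quoted 1-3% figure is a straight voltage tolerance on the reading):

```python
import math

def tolerance_to_db(fraction: float) -> float:
    """Equivalent dB error for a given fractional voltage tolerance."""
    return 20 * math.log10(1 + fraction)

for pct in (0.01, 0.03):
    print(f"±{pct:.0%} of reading ≈ ±{tolerance_to_db(pct):.2f} dB")
```

At the worst case that's already roughly a quarter of a dB of uncertainty from the meter alone, which is why comparative measurements against something you trust are the safer bet.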
 
Cool. FYI, the meter is an HP 400E, 10Hz-10MHz, ±1-3% in the audio spectrum. However, it hasn't been expertly calibrated itself in who knows how long. I also have a Fostex A-8 and don't have a cal tape for it; 0VU in and out is close enough, and I live with it.
 