Upgrading converters

  • Thread starter: kenoflife
mshilarious said:
Yes, at the individual sample level. So the difference is more than just latency.

It looks like you have a gain loss in there. Try using a reference tone at -3dB and calibrate your input until the reference tone is also at -3dB. You shouldn't have anywhere -near- that much signal left after a null test unless the level is way off.

Edit: scratch that. It looks like your high frequencies are nulling out a good bit, but you're getting a massive low-frequency roll-off. Use a low-frequency reference tone (500 Hz, maybe) and adjust that level to be the same. Then see what happens.
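
For anyone who wants to try this, here's a minimal sketch of the null test being described, assuming two already time-aligned mono WAV captures; the filenames are placeholders, and it needs the numpy and soundfile packages:

```python
# Minimal null-test sketch, assuming two already time-aligned WAV
# captures; "source.wav" and "loopback.wav" are placeholder names.
# Requires the numpy and soundfile packages.
import numpy as np
import soundfile as sf

dry, sr_dry = sf.read("source.wav")    # the file that was played out
wet, sr_wet = sf.read("loopback.wav")  # the D/A -> A/D recording
assert sr_dry == sr_wet, "sample rates must match"

# Trim to a common length, then gain-match by RMS (the calibration step).
n = min(len(dry), len(wet))
dry, wet = dry[:n], wet[:n]
wet = wet * np.sqrt(np.mean(dry ** 2) / np.mean(wet ** 2))

# Subtract and report the residual level relative to the source.
residual = dry - wet
residual_db = 10 * np.log10(np.mean(residual ** 2) / np.mean(dry ** 2))
print(f"null residual: {residual_db:.1f} dB relative to the source")
```

Sample-accurate alignment matters as much as gain: even a one-sample offset between the files leaves large high-frequency residue, which is why the quote above stresses the individual-sample level.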
 
dgatwood said:
It looks like you have a gain loss in there. Try using a reference tone at -3dB and calibrate your input until the reference tone is also at -3dB. You shouldn't have anywhere -near- that much signal left after a null test unless the level is way off.

I spent a lot of time calibrating levels, and repeated the test today. After long thought, I think I've come to the conclusion that it is perhaps not a valid test, because levels cannot be matched closely enough. I can only normalize to .001dB resolution, which equates to something like a -77dB difference file.
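
That floor is easy to check: a pure gain error of d dB leaves a null residual of 20*log10(1 - 10^(-d/20)) relative to the source, so a .001dB resolution really does bottom out near -78dB:

```python
import math

# Residual left by a pure gain mismatch of d dB in an otherwise
# perfect null: 20 * log10(1 - 10**(-d/20)) dB relative to the source.
d = 0.001  # the .001dB normalization resolution mentioned above
floor_db = 20 * math.log10(1 - 10 ** (-d / 20))
print(f"{floor_db:.1f} dB")  # about -78.8, in the ballpark of the -77dB figure
```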

But even using the most careful procedure I have, I could only get an average -31dB difference on program material (most of the improvement over yesterday was in the high frequencies), and -41dB on a 200Hz sine wave.

Without more converters to test, I can't draw any conclusions. Is there that much distortion (0.008V, or .006% THD if my math is right) in a D/A/D roundtrip with a midgrade converter? Maybe, I don't know. Food for thought for the OTB crowd ;)
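
For the unit conversion (no claim about which convention the figures above used): -41dB is about 0.9% as an amplitude ratio, or about 0.008% as a power ratio, and the difference file includes noise and the level-matching floor, not only distortion:

```python
# Rough conversions of the -41dB sine-wave residual; which of these maps
# onto a THD figure depends on convention, and the difference file also
# contains noise and the level-matching floor, not only distortion.
amp = 10 ** (-41 / 20)  # amplitude ratio: ~0.0089, about 0.9%
pwr = 10 ** (-41 / 10)  # power ratio:     ~0.00008, about 0.008%
print(f"amplitude ratio: {amp:.2%}   power ratio: {pwr:.4%}")
```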


The filter behavior and the sine wave distortion test are still quite interesting, as neither is dependent upon matching levels closely.
 