Time accuracy of digital sources?

cfg

Do various CD players play music at slightly different rates, like say uncalibrated turntables might, or are all modern CD players dialed in to perfect time?
 
Good question.

One would think so, but I have to use a single clock source for two identical interfaces and a third, different one.

I suppose a single digital clock is needed to keep the input to the computer aligned as it comes in. So I guess the latency of the devices, or how the computer takes the stream in, matters?

I've never bothered trying anything other than BNC cables to clock them all together.
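
To put a rough number on why you'd slave interfaces to one word clock, here's a quick Python sketch. The 50 ppm crystal mismatch and one-hour session are assumed figures for illustration only, not anything stated in this thread:

```python
# Rough sketch of why multiple audio interfaces get slaved to one word clock.
# The 50 ppm crystal mismatch and one-hour session are assumptions for
# illustration, not figures from the thread.

SAMPLE_RATE = 44_100            # nominal sample rate in Hz
CLOCK_MISMATCH_PPM = 50         # assumed mismatch between two free-running interfaces
SESSION_SECONDS = 60 * 60       # assumed one-hour recording session

mismatch = CLOCK_MISMATCH_PPM * 1e-6
one_sample_period = 1 / SAMPLE_RATE

# Time until the two capture streams have slipped relative to each other by one sample.
seconds_per_sample_slip = one_sample_period / mismatch

# Total relative slip accumulated over the session, in samples and milliseconds.
slip_samples = SESSION_SECONDS * mismatch * SAMPLE_RATE
slip_ms = SESSION_SECONDS * mismatch * 1000

print(f"One sample of slip every {seconds_per_sample_slip:.2f} s")
print(f"After {SESSION_SECONDS} s: ~{slip_samples:.0f} samples (~{slip_ms:.0f} ms) of relative drift")
# Slaving both interfaces to the same word clock (e.g. over BNC) removes this
# relative drift entirely; only jitter between them remains.
```

At those assumed numbers the two streams slip by a sample roughly every half second and end up tens of milliseconds apart after an hour, which is why a shared clock matters for simultaneous capture even though each interface on its own sounds fine.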
 
A central clock source was considered essential in the '80s, and it seems to have become less and less vital since then. Early phase-locked-loop systems were crystal based; PLL technology is now capable of much better accuracy. But to answer your question: do they all play at slightly different rates? Yes, I suppose they do. Is 44.099 kHz rather than 44.1 kHz important? I've noticed that when syncing video tracks, each new camera seems more stable. I can now run two cameras for two and a half hours of direct-to-card recording and not have to re-sync them in the edit. Tape-based digital always drifted a little, yet the sample rate was, on paper, exactly the same.
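
Taking that 44.099 kHz figure literally, a short Python sketch of what it works out to as a ppm error, and as drift over the 2.5-hour two-camera run mentioned above (the run length is the only figure taken from the post; the rest is arithmetic):

```python
# The post's hypothetical: a device running at 44.099 kHz instead of 44.1 kHz.
# Express that as a ppm error and as accumulated drift over a 2.5-hour run.

NOMINAL_HZ = 44_100.0
ACTUAL_HZ = 44_099.0
RUN_SECONDS = 2.5 * 60 * 60     # the 2.5-hour shoot mentioned in the post

error_ppm = (NOMINAL_HZ - ACTUAL_HZ) / NOMINAL_HZ * 1e6
drift_seconds = RUN_SECONDS * (NOMINAL_HZ - ACTUAL_HZ) / NOMINAL_HZ

print(f"44.099 kHz vs 44.1 kHz = {error_ppm:.1f} ppm slow")        # ~22.7 ppm
print(f"Drift over a 2.5 h run: {drift_seconds * 1000:.0f} ms")    # ~200 ms
# ~200 ms is several video frames, which would be obvious against picture,
# so cameras that stay in sync for 2.5 hours must be running far tighter
# than a 23 ppm error.
```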

They do differ, but how much does it have to matter before whatever you're doing stops working?
 
A sync clock would be vital if you were trying to run multiple digital sources together and needed them to stay in absolute sync, and even then mainly to minimize jitter between them. But when you just play a CD in one player and then another, there's never going to be enough difference for it to be audible as slower or faster playback. Sample-rate clocks are pretty solid individually.
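
As for audibility as a speed change, here's the same hypothetical 44.099 kHz error expressed as a pitch shift in cents; the ~5-cent just-noticeable-difference figure used for comparison is an assumed ballpark, not something from the thread:

```python
# How big would a 44.099 kHz vs 44.1 kHz error sound as a speed/pitch change?
import math

NOMINAL_HZ = 44_100.0
ACTUAL_HZ = 44_099.0

speed_change_percent = (ACTUAL_HZ / NOMINAL_HZ - 1) * 100
pitch_shift_cents = 1200 * math.log2(ACTUAL_HZ / NOMINAL_HZ)

print(f"Speed change: {speed_change_percent:.4f}%")       # ~ -0.0023%
print(f"Pitch shift:  {pitch_shift_cents:.4f} cents")     # ~ -0.04 cents
# Well under a hundredth of the ~5 cents commonly quoted as the threshold for
# hearing a pitch difference (an assumed ballpark), so the rate error itself
# is nowhere near audible as faster or slower playback.
```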
 