Syncing tracks


ecc83

Well-known member
Say I have 2 tracks recorded (44.1 kHz, 24-bit) from my MOTU M4 in a DAW as .wav files. Then two more recorded on a Tascam handheld, same .wav, same sample rate/word length. Can I just drop them into the DAW and have them work?

I know enough to know the two interfaces will not run at EXACTLY the same sample rate so is there a way (in any DAW) to lock all tracks to a "system" clock?

"Asking for a friend". Ah! Should have said! ALL tracks would be made of the same band at the same time. Effectively trying to use two interfaces to "multitrack".

Dave.
 
If they're recorded on separate devices, there's a good chance they'll drift. The question is how long before it's noticeable. I generally record video of live shows, so sets of around an hour. My Zoom H5 seems to match my cameras pretty well over that time. If I record on my Soundcraft UI24R, the drift is a second or two, so I have to squeeze it down just a bit. That might not be enough to matter over the length of one song, but it's fixable in any case.
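The "squeeze it down" fix described above is just a uniform playback-rate change. A minimal sketch of the arithmetic, using hypothetical numbers (an hour-long set that comes out 2 seconds long on the drifting device):

```python
# Hypothetical figures: an hour-long reference recording and a device
# file that runs 2 seconds long over that hour.
reference_len = 60 * 60.0          # seconds, length of the reference recording
device_len = reference_len + 2.0   # same performance on the drifting device

# Playback-rate factor to apply to the drifting file so it matches:
rate = device_len / reference_len
print(round(rate, 6))  # 1.000556

# The same clock error expressed in parts per million:
print(round((rate - 1.0) * 1e6))  # 556
```

That rate factor is what the stretch in the DAW effectively applies.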
 
If the devices could be connected via SMPTE or ADAT, it might be possible to clock one off the other.
 
Is there some sort of reference point between the two recordings?

Assuming there is, it seems like you should be able to edit one of the two, even in pieces, to line up with the other and compensate for drift.
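Given a shared reference point (a clap, a count-in), finding the offset can even be automated with cross-correlation. A toy NumPy sketch on synthetic signals; the sample rate, the "clap" shape, and the positions are all made up for illustration:

```python
import numpy as np

# Synthetic stand-in for a clap: a short decaying sinusoid.
sr = 1000                      # toy sample rate (Hz)
t = np.arange(sr) / sr
click = np.sin(2 * np.pi * 50 * t) * np.exp(-t * 30)

a = np.zeros(3 * sr)
a[100:100 + sr] += click       # device A: clap starts at sample 100
b = np.zeros(3 * sr)
b[350:350 + sr] += click       # device B: same clap at sample 350

# Cross-correlate and locate the peak; its position gives the lag.
corr = np.correlate(b, a, mode="full")
lag = corr.argmax() - (len(a) - 1)
print(lag)  # 250 -> b's clap occurs 250 samples after a's
```

Shifting one file by that lag lines up the start; drift over the rest of the take still needs the stretch described elsewhere in the thread.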
 
I often have to match audio from different devices, most usually audio on a video track and audio recorded separately on a laptop. I load both into Reaper, and most of the time they line up just fine. However, if there is a bit of drift, it is easy to stretch one or the other to get them to line up.

Having said that, the audio from the video is usually ditched, so there are no artifacts from mixing to very slightly different tracks.

But closer to what you are doing, I do a fair bit of remote collaboration: recording stuff where I am and having others record material where they are and send it to me to incorporate into what I have. The external material is recorded on different systems and different DAWs, but I have yet to experience any drift.
 
You can forget drift due to sample rates running at different speeds; I've not had this in years. I jam-sync video and audio all the time, recording video for editing multiple times a week. For convenience, one video camera connected to a Blackmagic switcher feeds a Blackmagic recorder, and the embedded audio and video go into Premiere for editing. Frequently I am also running additional audio to a Zoom recorder and Cubase on a separate computer.

Sync in Premiere is done with a clap, or perhaps something like the start of a chord. Anything really, and Premiere allows me an accuracy of maybe a 50th of a second. Sometimes the recording can go 45 minutes to an hour, as I tend to record a continuous sequence and edit out the rubbish. If it's in sync at the start, it always stays in sync.

The only exception to this was an older Mac: on long recordings the buffers would eventually overflow and recording would either stop or have gaps. Shorter recordings are fine, but for longer ones I use a better-specified computer, which does not do this. There is, as far as I can tell, no drift at all after an hour's recording. To be fair to the older Mac, I can often fix the odd missing section with a cut and resync.

You really should not find drift audible.

It is there, of course, and if you record two identical tracks on different recorders, they gradually start to phase together; on my gear, after ten minutes or so it starts to sound odd and hollow. But this is only on identical tracks. Even a stereo pair manages quite well, although you can notice the apparent positions of point sources change. Even this is salvageable by editing a few transients to bring them back. My conclusion nowadays is that it's hardly an issue. On a typical multitrack with low spill, it's a non-event.
 
I found that my R-24's clock is slightly off from my video camera's, even running both at 48 kHz. It's a known issue with the R24. It's not enough to mess up a video sync of a single song, but the two audio tracks end up with a slight difference.

I can stretch the audio in Reaper if I need to; it's usually fine for five minutes or so. Anything longer and I can nudge it.
 
I've found differing degrees of drift from various devices, all compared to my cameras. I've synced up audio from my H5 and UI24R, Mackie DL16S, Behringer X32 etc.
 
From what I have read, it depends entirely on the clock used to lock the sample rate. The particular oscillator used in the Zoom R24 is off by about 0.25 seconds per hour. My Yamaha AW1600 and Tascam 16x08 were essentially identical; I saw no issue syncing audio between them.

Interestingly, when the R24 is used as an interface, it will track based on the DAW/computer clock, not the internal clock. That would make sense with an interface as the samples all go through a buffer where minuscule timing issues would be compensated for.
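As a rough sanity check on that 0.25 seconds/hour figure (a sketch of the arithmetic, not a spec):

```python
# 0.25 s of drift per hour, the figure quoted above for the R24's clock.
drift_per_hour = 0.25

# As a clock error in parts per million:
clock_error_ppm = drift_per_hour / 3600 * 1e6
print(round(clock_error_ppm, 1))  # 69.4

# Over a 5-minute song (one twelfth of an hour), in milliseconds:
song_drift_ms = drift_per_hour / 12 * 1000
print(round(song_drift_ms, 1))  # 20.8
```

About 21 ms over a song, which fits the observation that a single song syncs to video without trouble while longer takes need correcting.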
 
I think I had better put you all in the picture!
The band my son plays bass for have just bought a Xenyx 2442 mixer: 10 mic channels and 8 direct outs.
I can see their next move will be wanting to record those direct outs, but my son only has a MOTU M4 (4 inputs) and a Tascam handheld (another 2 inputs).
Thus they could record 6 tracks; how exactly they would share those out I have no idea (they are in France), but some will come from the Tascam.

From what I have read so far, "drift" may not be a problem?

Dave.
 
At one time, I thought about using my Zoom H4n to record 4 drum mics plus my R24, to give a total of 12 tracks, but in the end I decided that the added complexity of trying to get all of that set up at the same time wasn't worth the hassle for the jams I was recording.

If I ever get into a situation where I think that would be an advantage, I'll just find me something like the Tascam Model 24. 16 mic inputs should be plenty, plus I still have the Yamaha mixer, so I could submix vocals to one or two tracks.
 
Important to keep in mind that the “drift” is an actual difference between the speed at which one or both tracks were recorded and the speed at which they are being played back. That means there is a pitch change involved, and while it may be super small in terms of tones and cents, it is real. When we go to stretch one to match the other, we want to make sure the pitch changes too: none of that resampling pitch preservation, just straight analog-style varispeed. In Reaper there’s an Item Property for that.
 
Yep, it's basic resampling, no fancy pitch processing. Let the pitch follow the playback speed adjustment.
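For a sense of scale, the pitch shift a varispeed correction implies can be put in cents. A quick sketch, using the hypothetical ratio of a file that runs 2 seconds long over an hour:

```python
import math

# Hypothetical varispeed ratio: 2 s long over a 1-hour recording.
rate = 3602.0 / 3600.0

# Pitch shift in cents for a straight speed change (no pitch preservation).
cents = 1200 * math.log2(rate)
print(round(cents, 2))  # 0.96
```

Under one cent, which is why the straight resample is inaudible as a pitch change while still keeping the timing honest.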
 
All that said, try not to split things that want to be phase coherent across devices. All the drum mics should be on the same clock when you record. If you're doing, say, bass via DI and amp, make sure both are recorded using the same clock. You'll have a much better chance of aligning the drums on one machine to the bass on the other close enough for rock and roll than of perfectly aligning two elements of the same instrument well enough to avoid phasey weirdness.
 