Latency

TelePaul

J to the R O C
I'm continuing in my quest to ask many of the questions I should have asked here on day 1. I know latency is undesirable, but:

What is the difference between input and output latency?

Is latency (or a lack thereof) based on processing power?
 
TelePaul said:
I'm continuing in my quest to ask many of the questions I should have asked here on day 1. I know latency is undesirable, but:

What is the difference between input and output latency?

Input latency is the delay on the way in (between the signal hitting the interface's input and the software receiving it); output latency is the delay on the way out (between the software sending audio and it leaving the interface's output).


TelePaul said:
Is latency (or a lack thereof) based on processing power?

Depends on the OS and how the software was written. Generally, no.

Have a look at the attachment for a good visual picture of latency. Between the device and the computer, you will have one or more buffers, or queues. The way these work is that one device or piece of software writes to a buffer while another device or piece of software reads from the same buffer. It is somewhat like you have two open reel tape decks sharing the same loop of tape. When the end of the loop is reached, it starts from the beginning.

In the same way, with computer buffers (circular queues) either side has a "pointer". That's computer jargon for a location in memory. In effect, you can think of this as merely a position within the buffer. When that pointer reaches the end of the buffer, it jumps to the beginning of the buffer again. Thus, in the image below, when it reaches the right end, it jumps back to the left.
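If it helps, here is a bare-bones sketch of such a circular queue in C. The names and the 1024-sample size are made up for illustration; real drivers do the same thing, just with more bookkeeping (and the thread/interrupt safety that is omitted here):

```c
#include <stdint.h>

#define BUF_SIZE 1024              /* capacity in samples -- illustrative only */

typedef struct {
    int16_t  data[BUF_SIZE];       /* the shared "loop of tape" */
    unsigned write_pos;            /* producer's position in the buffer */
    unsigned read_pos;             /* consumer's position in the buffer */
} ring_buffer;

/* Producer side: store one sample, then wrap back to the start at the end. */
void ring_write(ring_buffer *rb, int16_t sample)
{
    rb->data[rb->write_pos] = sample;
    rb->write_pos = (rb->write_pos + 1) % BUF_SIZE;   /* jump back to the left end */
}

/* Consumer side: fetch one sample from its own, lagging position. */
int16_t ring_read(ring_buffer *rb)
{
    int16_t sample = rb->data[rb->read_pos];
    rb->read_pos = (rb->read_pos + 1) % BUF_SIZE;
    return sample;
}
```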

Because one side is recording data into the buffer while the other side is playing back from a different position, you get a delay. Note that this delay is not caused by the size of the buffer itself; the latency is caused by the lag between the read and write pointers within that buffer. However, that distance is often set at a certain percentage of the buffer size, and thus the latency is generally proportional to (but is not determined by) the buffer size.
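To make the "lag between the pointers" idea concrete: the lag converts to time by dividing by the sample rate (purely illustrative numbers here, not anything from the diagram):

```c
/* Latency in milliseconds for a given read/write lag, e.g.
   lag_to_ms(256, 44100.0) is about 5.8 ms and lag_to_ms(256, 96000.0) about 2.7 ms. */
double lag_to_ms(unsigned lag_in_samples, double sample_rate_hz)
{
    return 1000.0 * (double)lag_in_samples / sample_rate_hz;
}
```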

Now computers do not really process data continuously. They process one thing for a while, then process something else for a while, and so on. Thus, when you look at the computer side of the buffer, it appears to start and stop in bursts. When one side of this buffer stalls for a moment, the other side may not (and if the other side is hardware, it will never stall at all unless it copies data into its own buffer a chunk at a time). When this happens, the relationship between those two pointers changes.

Where this becomes a problem is when either pointer overtakes the other one. If the write pointer overtakes the read pointer (on recording, this is usually because the computer is too slow; on playback, this generally points to a sample rate mismatch), you end up overwriting data that has not yet been played. When this happens, the effect sounds like you just skipped ahead by a fraction of a second of playback.

If the read pointer overtakes the write pointer (on recording, generally because of a sample rate mismatch; on playback, generally because the computer can't keep up), the playback side ends up playing material that it has already played once. This results in a section of audio repeating itself.
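Building on the ring_buffer sketch above (still illustrative, not any real driver's code), the two failure cases boil down to simple pointer arithmetic:

```c
/* Samples written but not yet read (how far the read pointer lags the write pointer). */
unsigned ring_fill(const ring_buffer *rb)
{
    return (rb->write_pos + BUF_SIZE - rb->read_pos) % BUF_SIZE;
}

/* Producer check before writing n samples: would the write pointer overtake the
   read pointer and overwrite data that has not been played yet? */
int would_overrun(const ring_buffer *rb, unsigned n)
{
    return n > (BUF_SIZE - 1) - ring_fill(rb);    /* one slot is kept free */
}

/* Consumer check before reading n samples: would the read pointer overtake the
   write pointer and replay material it has already played once? */
int would_underrun(const ring_buffer *rb, unsigned n)
{
    return n > ring_fill(rb);
}
```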

Depending on the software, these conditions may be detected and cause the application to stop, or, if they occur in some noncritical path (say, between two plug-ins on playback), they may simply be ignored and allowed to cause a momentary glitch. In either case, the problem can be improved by increasing the buffer size, which proportionally increases how far the read pointer can lag behind the write pointer and makes it less likely that either one will catch up with the other.

To the extent that it is necessary to increase the buffer size for this reason, latency does depend on computer speed. However, it would be more accurate to say that the minimum latency depends on the computer speed. You can always use a higher buffer size and get more latency even on a fast computer if you have some reason to do so.
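To put rough numbers on that (illustrative figures, not from the diagram): at 44.1 kHz, raising the buffer from 256 samples to 1024 samples raises the worst-case lag from roughly 5.8 ms to roughly 23 ms. A fast machine that rarely stalls can get away with the small buffer, while a slower or heavily loaded one buys safety by accepting the extra delay.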

NOTE: The example in the image shows audio playback. You could probably tell this because the side writing data into the buffer (red) is coming from the computer, but I thought it was still worth pointing out.

Each of the boxes represents a single sample (or a single group of samples written/read at once, but probably a single sample).
 

Attachments

  • audio_queue.webp
Best thing to do is get software and an interface that support direct monitoring... you won't be able to add effects to the newly recorded tracks as you record, but it's zero latency... when mixing, latency tends not to be a problem, except you can get some lag in response to fader moves and that sort of thing... :cool:
 
VST Instruments are where I personally think low latency is most important, as dynamics processing can be done on dry tracks after the take.
 
Processors DO play a big role in output monitoring latency (if any FX are applied), or in latency in general. Any decent computer can handle most stuff nowadays without noticeable latency, though. Even mine (AMD 1.2 GHz).

Also the hardware drivers play a huge part. ASIO drivers usually provide the least amount of latency.
 