dgatwood said:
You're missing something very subtle here.
Nope, I am not.
dgatwood said:
A true random access pattern refers to reading/writing a block at a time or a small stride at a time and then skipping to a different track, waiting for the right part of the track to get under the heads, then reading or writing another block or small stride, then repeating.
Yup. Agree with this definition.
dgatwood said:
An audio workload doesn't do that unless your DAW and your OS are both horribly broken.
This is where your logic is starting to run into trouble... read on.
dgatwood said:
A DAW preallocates a very large stride (usually measured in minutes of audio) of continuous blocks for a track when recording. As such, each individual audio track tends to be very nearly contiguous. Based on your comment, I'm assuming you already understand that.
Agreed. Emphasis on "a track." However, this is only true if one assumes that the track in question is addressing only one actual audio file, which is almost never the case. Each separate take, even if it's on the same track, will be a separate audio file which your DAW must then address.
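To make that concrete, here is a minimal sketch in Python (Take, Track, and the file names are purely illustrative, not any actual DAW's internals) of how one logical track ends up pointing at several physical files:

    # Illustrative only: a "track" in the project is really a playlist of takes,
    # and every take is backed by its own audio file on disk.
    from dataclasses import dataclass

    @dataclass
    class Take:
        path: str           # separate .wav/.aiff written for this take
        start_sample: int   # where the take sits on the track's timeline
        length: int

    @dataclass
    class Track:
        name: str
        takes: list

    vocal = Track("Lead Vox", takes=[
        Take("Lead Vox_01.wav", start_sample=0,       length=480_000),
        Take("Lead Vox_02.wav", start_sample=480_000, length=480_000),
        Take("Lead Vox_03.wav", start_sample=960_000, length=480_000),
    ])

    # One logical track, three physical files the DAW has to open and read.
    print(len({t.path for t in vocal.takes}), "separate files behind one track")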
dgatwood said:
The flaw in your reasoning is how you envision playback occurring. The DAW doesn't read one disk block from one track, skip to the next track and read a block, etc. It reads usually a couple of seconds worth of each track. It might easily read or write... say a megabyte at a time.
And the flaw in your reasoning is assuming that a computer is able to read or write a full megabyte at a time. It may appear to an end user that a computer does many things all at once, but this is far from what really goes on inside the machine. In reality, things inside a computer never happen "at once" or, as you put it, "at a time." The computer processes things in a more or less round-robin fashion. A computer will only process things "at once" on multi-threaded AND multiprocessor systems, and I must emphasize that even then it is done in very small chunks; we're talking 32- to 64-bit chunks, depending on the internal data bus architecture of the processor.
Now, depending on how much RAM your computer has, a DAW will preload a good chunk of the audio data resident on the tracks into RAM. However, this is where things get a bit more complicated...
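To put a picture on that "round-robin" servicing, here is a rough sketch in Python (the chunk and buffer sizes are made-up numbers, and TrackStream is my own illustrative name, not any real DAW's engine): the tracks get topped up one after another, not literally all at once.

    import io

    CHUNK = 256 * 1024             # bytes pulled from disk per track per pass (assumed)
    BUFFER_TARGET = 4 * CHUNK      # keep roughly 1 MB queued per track (assumed)

    class TrackStream:
        def __init__(self, source):
            self.source = source   # file-like object for the take currently playing
            self.buffered = b""    # the RAM buffer the playback engine drains

        def refill(self):
            if len(self.buffered) < BUFFER_TARGET:
                self.buffered += self.source.read(CHUNK)   # one read, then move on

    def refill_pass(tracks):
        for track in tracks:       # round-robin: tracks are serviced one after another
            track.refill()

    # Stand-in "takes": in-memory buffers playing the role of files on disk.
    tracks = [TrackStream(io.BytesIO(b"\0" * BUFFER_TARGET)) for _ in range(24)]
    refill_pass(tracks)
    print(sum(len(t.buffered) for t in tracks) // 1024, "KB buffered after one pass")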
Each audio "take" or recording that is resident on a track will be a separate .wav or .aiff file, as mentioned above. Every time you record something new to a given track, you are creating a new file with its own name (different DAWs have different ways of identifying these, perhaps taking the name of the track and appending a number in sequential order). So it is not unusual for a single track to contain more than one audio file in a single project, all of which the project is accessing. You, as the end user, have absolutely no control over where exactly these files will be physically located on the disk. With luck they may be in adjacent blocks, but more than likely they will not be. Now compound this with multiple tracks, each of which is accessing multiple actual files from the disk, and things get more interesting. However, this is still not a problem, because, as you say, the computer will have at least some of this preloaded in RAM.
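A quick back-of-the-envelope (the average take count is invented, just to show the scale):

    # Invented numbers: if every take is its own file, a modest project
    # touches a lot of separate files.
    tracks = 24
    takes_per_track = 5          # assumed average: comps, punch-ins, alternates
    print(tracks * takes_per_track, "separate audio files behind", tracks, "tracks")
    # Where each of those files physically lands on the platter is the
    # filesystem's decision, not the DAW's and not yours.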
However, the trouble is that most DAWs will assume that you're going to read a given audio file in its entirety, in a contiguous manner, which admittedly should be the case in most instances. Now I am going to ask you to take off your pop/rock musician hat and put on your freaked-out electronic nutcase hat, à la Aphex Twin, and imagine that you're insane enough not to use samplers, but to do your sequencing directly with audio tracks in a DAW.
And this is where you can easily push the limits of your HD even with 24-30 tracks. Imagine that you have, say, 4-5 tracks that access the same drum loop file. The reasons you might have this file spread across multiple tracks can be many: for example, you might want to isolate the hits containing the kicks on one track, the snares on another, etc. You might also have some chunks from this loop on other tracks so you can apply different effects to them. Once you start doing this, you're forcing your DAW to skip around this audio file and play a bit from here, a bit from there, oftentimes asking it to play a bunch of bits from discontiguous locations within the same file "all at once" (although, as we now know, since things cannot happen simultaneously inside a computer, it will have to read a little chunk here, then a little chunk there, and hopefully have enough time to shove all this mess into RAM to feed the CPU). Now imagine doing this across about 24 tracks that are accessing separate files totalling in excess of 100-120, at various points within each track (perhaps even performing crossfades on them), and you can easily see how you can get your DAW into trouble even if there is nothing fundamentally wrong with your computing environment and you have diligently defragmented your drives.
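Here's a little sketch in Python of that playback pattern (the file names, offsets and sizes are invented, and fragmentation of the files themselves can only make it worse). It just counts how many reads in one playback window are NOT a continuation of the previous read, i.e. how many times the drive has to reposition:

    # Each entry: (file, byte offset into that file, bytes to read) for one track.
    playback_window = [
        ("drumloop.wav",      0, 32_768),   # track 1: isolated kick hits
        ("drumloop.wav", 96_000, 32_768),   # track 2: isolated snare hits
        ("drumloop.wav", 48_000, 32_768),   # track 3: hats, different FX chain
        ("drumloop.wav", 96_000, 16_384),   # track 4: another snare chunk, more FX
        ("vocal_03.wav",      0, 65_536),   # track 5: an ordinary contiguous take
    ]

    seeks = 0
    last_end = None
    for fname, offset, length in playback_window:
        if (fname, offset) != last_end:     # not picking up where the last read ended
            seeks += 1
        last_end = (fname, offset + length)

    print(seeks, "repositions for", len(playback_window), "reads in one window")

Every single read in that window needs a reposition, whereas streaming the same loop once from start to finish would have been one long sequential read.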
dgatwood said:
BTW, if you're coming anywhere close to the limit of random read/write performance on a modern drive (<5 years old), either you are using an insane number of tracks (50+ @24/96) or there's something else causing a huge number of unnecessary seeks for tiny reads or writes (e.g. your OS isn't caching file metadata like it should). Either way I would tend to blame the DAW for not reading far enough ahead to account for latency. That really shouldn't happen.

Even with pretty intricate edits, my drives are idle... I'd estimate about 80-90% of the time during 32-track playback.
Addressed in the previous paragraphs.