
miroslav
Cosmic Cowboy
I wanted to bring this discussion to a new thread, rather than where it was started, so as not to mess up the topic in the other thread.
Dive in...let's kick it around here...this was the original comment.
It's the same reason why, when I listen to Robert Johnson recordings, they can still send chills down my spine today, even though they were recorded nearly 85 years ago. When I listen to those songs, I'm not thinking at all about the recording quality, because I'm immediately sucked into the performances.
Of course, I understand that they were using the best recording technology they had at the time. And I'm not saying that people shouldn't do that. I'm just saying that I think it's a bit ridiculous when people argue about whether or not you can make "pro" recordings on a budget interface, etc., because the "converters are sh*t" or whatever. Or when they talk about "lossy mp3" formats vs. lossless. LOL
I would gladly pay someone $50 if they could tell a 128 kbps mp3 from a WAV (or other lossless) file 10 times out of 10 in a blind test.
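Just to put a number on why 10-for-10 is a fair bar (this is my own back-of-the-envelope math, not anything from the original post): someone purely guessing in a two-choice blind test almost never clears it. A quick sketch:

```python
# Hypothetical illustration: probability of going 10-for-10 by pure
# guessing in a two-choice (mp3 vs. WAV) blind test.
trials = 10
p_chance = 0.5 ** trials  # each trial is a coin flip for a guesser

print(f"Chance of {trials}/{trials} by pure guessing: {p_chance:.4%}")
# → Chance of 10/10 by pure guessing: 0.0977%
```

So roughly 1 in 1,024. Anyone who actually passes that test is almost certainly hearing a real difference, not getting lucky.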
The most important thing has been and always will be what's happening in front of the microphone --- not behind it ... especially when you're talking about recording "natural-sounding" performances like a singer-songwriter, a choir, etc. I know that sometimes studio manipulation is part of what defines the "sound" of a record --- e.g., the Beatles, Brian Eno, etc. --- and that's a different bag, obviously.
I guess my overall point is that I think it's funny how people rag on and/or dismiss "consumer-grade" or "prosumer-grade" analog recorders because of the recording quality.
Think about this test: Let's say you have a song as a digital WAV file and you record it to a (properly calibrated, up-to-spec) consumer-grade reel-to-reel or even a Tascam cassette Portastudio (again, in great shape). Then you play that recording back from the tape machine. Do you think most people in a blind test would immediately be able to tell the difference between the original file and the tape? Because if the recording quality is as crappy as most people make it sound, the difference should be utterly obvious, shouldn't it?
I would be willing to bet that many people --- the majority even --- would not get it right 100% of the time. And I'd also be willing to bet that many of the folks who would immediately say "Yes, of course I'd be able to tell!" have probably never even used a cassette 4-track or consumer-grade R2R and are just parroting what they've heard other people say.
I did this test myself when I first got my Sony TC-530 (from 1967). And I hadn't even overhauled the machine (recapped it, checked bias, alignment, etc.). All I did was clean the tape path --- and maybe demag the heads (I can't remember). In other words, this was just a quick test at the outset to check the machine's functionality and see if I could keep it or needed to return it.
And I couldn't tell the difference between the digital file and the tape ... literally. Now, maybe lots of others have much better hearing than I do, but I doubt it's that much better.
And so my point is, if that's the case, then the recorders are more than doing their job. And any "crappy" sound that comes from these machines in other scenarios is not the fault of the recorder but rather the recordist.