Conversions, Quality, and Old Technology

bewildered

New member
I've gotten advice before that I should record at the same sample rate my end product is going to be at, because recording at a higher rate and then converting down to the final rate will cause poorer quality than just recording at the final rate in the first place. So what is the general consensus on this?
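For reference, the downconversion in question is just a resampling pass. Here's a minimal sketch of that step, assuming SciPy is available; the 96kHz source rate and the sine-wave input are only placeholders for illustration, not anything from the thread.

import numpy as np
from scipy.signal import resample_poly

def downsample_to_44k1(samples: np.ndarray, source_rate: int = 96000) -> np.ndarray:
    """Resample audio to 44.1kHz with a polyphase filter."""
    target_rate = 44100
    # Reduce 44100/source_rate to the smallest integer up/down factors
    # (147 up, 320 down when coming from 96kHz).
    gcd = np.gcd(target_rate, source_rate)
    up, down = target_rate // gcd, source_rate // gcd
    return resample_poly(samples, up, down)

# Example: one second of a 1kHz sine at 96kHz comes out 44,100 samples long.
t = np.arange(96000) / 96000
sine = np.sin(2 * np.pi * 1000 * t)
print(len(downsample_to_44k1(sine)))  # 44100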

I don't want to start up the endless debate about which sample rate gives the best bang for the buck, but is there any functional difference when working with higher sample rates? Is there anything you can do with 192kHz audio that you can't with 48kHz?

It's always assumed that the final product will end up on the lousy 16-bit/44.1kHz CD format, but why would someone butcher their sound quality to fit the specifications of a soon-to-be-extinct technology? I suppose that's more of a rhetorical question, but imagine how much better music would sound if it weren't for the crippling parameters of CDs, and how we will never get to hear current music in its originally recorded quality because of them.
 
There's nothing "crippling" about 44.1kHz with decent converters... its 22.05kHz Nyquist limit already sits above the top of human hearing. The best designers in the industry will tell you that if you can hear the difference (on their own units) between 44.1kHz and 96kHz, then there's something wrong with the unit.

Recording in 24-bit is another story - 65,536 points of resolution in 16-bit vs. roughly 16.7 million points in 24-bit... And arguably, certain plugs sound better at higher rates than others. But several of these already resample on the fly as it is (a few of the UAD plugs come to mind).
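A quick sanity check on those numbers, as a rough Python sketch; the ~6.02 dB-per-bit figure is the standard rule of thumb for theoretical dynamic range, not something specific to any converter mentioned here.

# Rough arithmetic behind the 16-bit vs. 24-bit comparison above.
# Levels: 2**bits discrete amplitude steps; dynamic range: ~6.02 dB per bit.
for bits in (16, 24):
    levels = 2 ** bits
    dynamic_range_db = 6.02 * bits
    print(f"{bits}-bit: {levels:,} levels, ~{dynamic_range_db:.0f} dB dynamic range")

# 16-bit: 65,536 levels, ~96 dB dynamic range
# 24-bit: 16,777,216 levels, ~144 dB dynamic range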

But the simple matter is that if you can't get 44.1kHz at 24-bit to sound absolutely astounding, bringing the sample rate up isn't going to help.
 