notCardio
I walk the line
How big of a difference do you think recording at 96K makes over recording at 48K (or 44.1, for that matter)? I'm not talking about 16-bit vs. 24-bit (or 20-bit), but specifically 24/96 as opposed to 24/48. And yes, I understand the whole Nyquist frequency thing (even though I can't spell it), but in the realm of HOME recording, how much of a difference would it make to someone on budget HOME equipment (i.e., no Avalons, Neves, Drawmers, Genelecs, etc.)? Things like sub-$250 mics, a sub-$200 pre or a board pre, run-of-the-mill amps (Alesis, Hafler), and budget monitors (20/20s, Reveals, etc.).

And here's why I ask. I always assumed there would be a huge difference, and this was reinforced by the guys at the music stores (who want to sell you something). I therefore figured I should automatically eliminate from consideration anything that wouldn't do 96K. Then I started noticing that there were relatively expensive (to me) Apogee converters that did 24/48, not 24/96, and I began to wonder: would some of the new USB interfaces that do 24/48 be good enough?

So I guess what I'm asking is, would 96K make a significantly discernible difference to a home hacker whose music will more than likely never be played for anyone other than himself and possibly a few friends?
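Just to put numbers on the Nyquist thing I mentioned, here's a quick sketch of the arithmetic (plain Python, nothing gear-specific, just an illustration):

    # The highest frequency a given sample rate can capture is half
    # that rate (the Nyquist limit). Human hearing tops out around 20 kHz.
    for rate_khz in (44.1, 48.0, 96.0):
        nyquist_khz = rate_khz / 2  # Nyquist frequency = sample rate / 2
        print(f"{rate_khz} kHz sampling -> Nyquist limit of {nyquist_khz} kHz")

So even 44.1K already covers the textbook range of human hearing (22.05 kHz vs. ~20 kHz); what 96K buys is headroom well above it, which I gather is part of why people argue about whether it's actually audible.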