are new interfaces really any better?

CoolCat
Well-known member
I return to this question at times. Are new interfaces really any better?
Is 24/44.1 still OK even though new ones are 64-bit and can go to 192 kHz or more?
Do Windows or Apple care? If the drivers still work, what are the improvements?
What is "reference conversion", and does your interface's conversion meet it?

I'll add that, using my ears, new vs. old is not noticeable. Is it that my ears have aged, or is it just that tech specs keep improving while humans can't hear 120 kHz highs?

Sanity check, please! :eatpopcorn:
 
I still record at 24/48k. When my interfaces and computers got better I experimented with the higher clock rates. I simply couldn't hear it, and I couldn't see any useful info up there either. Plus, transfer times and file sizes were getting silly. For some of my stuff, even an MP3 is too big; attaching them to emails or uploading them takes far too long.

I also have video kit that can record in Ultra HD, but the file sizes are astronomical, and copying from card to drive or from drive to drive is just enough time to go and make a coffee. Worse, I do not even have a monitor capable of really showing me what is in that extra data. If you have a GoPro or similar small-lensed camera, shooting UHD is pointless: the format may well be all those pixels bigger, but are the lens and sensor really capable of capturing the extra quality? My opinion on these extended audio and video formats is that they're just a bit pointless.
 
The ONLY reason I record 88.2/24 is that a) my computer can handle it and b) it cuts my latency. I can't hear a difference in terms of audio quality anymore. Maybe 40 years ago, but not today.
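For anyone who wants to see where the latency saving comes from: per-buffer latency is just buffer size divided by sample rate, so doubling the rate halves the time each buffer takes. A rough sketch (the 128-sample buffer is only an illustrative figure, not any particular interface's setting):

```python
# Rough per-buffer latency: samples in the buffer divided by sample rate.
# The 128-sample buffer is an illustrative figure, not a specific interface's setting.
buffer_samples = 128
for rate_hz in (44_100, 48_000, 88_200, 96_000):
    latency_ms = 1000 * buffer_samples / rate_hz
    print(f"{rate_hz} Hz: ~{latency_ms:.2f} ms per buffer")
# 44.1 kHz ≈ 2.90 ms, 88.2 kHz ≈ 1.45 ms — same buffer, half the wait.
```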

32-bit float would be worthwhile, since it means you don't need to worry about overloading, and the format's inherent noise floor is essentially zero. Plus, audio processing is moving towards 32-bit anyway.
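To make the "don't worry about overloading" point concrete, here's a toy sketch, assuming a simple sine test tone rather than any particular interface's behaviour: a signal 6 dB over full scale gets clipped flat in fixed-point capture, but in 32-bit float you can just pull it back down afterwards.

```python
import numpy as np

# Toy example: a 440 Hz sine recorded 6 dB too hot (peaks at 2.0, full scale = 1.0).
t = np.linspace(0, 1, 48_000, endpoint=False)
too_hot = 2.0 * np.sin(2 * np.pi * 440 * t)

fixed_point = np.clip(too_hot, -1.0, 1.0)     # integer-style capture: flat-topped, info gone
float_capture = too_hot.astype(np.float32)    # float capture keeps the overs
recovered = float_capture * 0.5               # pull it back under full scale later, intact

print("clipped peak:", fixed_point.max())                    # 1.0, and distorted
print("recovered peak:", round(float(recovered.max()), 3))   # ~1.0, still a clean sine
```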

I was reading yesterday that LG is the latest manufacturer to drop 8K TVs. There is minimal program material available, the cost is much higher than 4K, and tests have shown that the vast majority of people can't see the improvement. Going to 192, 384, or higher sample rates might technically be better, but ultimately you will most likely be downsampling things anyway. I suspect that 98+% of listeners cannot tell the difference.
 
I do video, and went to Sony a year or two back when I got an invite to Pinewood to see Sony's latest camera, which had a crazy ability to see detail in the shadows. Like when we went from 16-bit to 24 and then 32-bit, the detail kept going up. The trouble was they only had one monitor that could reproduce it, and it was not on general sale because the failure rate on the production line made it the price of a small house! None of their professional or domestic Ultra HD monitors or TVs could reveal what the camera captured.
 
I return to this question at times. Are new interfaces really any better?
Is 24/44.1 still OK even though new ones are 64-bit and can go to 192 kHz or more?
Depends on what you are recording. I rarely need 192 kHz; for compatibility I use 48 kHz and 24-bit.


Do Windows or Apple care? If the drivers still work, what are the improvements?
I have no idea about Windows, but Apple reacts differently. It's not the drivers, it's the processing.

What is "reference conversion", and does your interface's conversion meet it?

Yes. I use an Apogee.
I'll add that, using my ears, new vs. old is not noticeable. Is it that my ears have aged, or is it just that tech specs keep improving while humans can't hear 120 kHz highs?
For me it's less noticeable these days than it was 8 years ago. Inexpensive interfaces tend (for me) to have a brighter sound, whereas my Apogee has a balanced sound.
 
I'll add that, using my ears, new vs. old is not noticeable. Is it that my ears have aged, or is it just that tech specs keep improving while humans can't hear 120 kHz highs?

Sanity check, please! :eatpopcorn:
I concur with this on the dodgy subject.
 
I have been sticking with 16/44.1. I don't make that great a recording, and it sounds good enough on my CDs. I would in some ways like to experiment with 5.1, mainly for the immersive effect. But I am personally having better results than I ever expected. OK, they are not up there with the multi-million-dollar studios, but better than a 4-channel cassette deck.

Now, 4K is really great. I have been watching a lot of old movies that are being digitized into widescreen 4K, and on an OLED or mini-LED TV I can see a huge difference. But remember, I watched all of those old movies in 480 letterbox format. 8K? Maybe for scientific shows or something where the detail is important, but I can't see it being that noticeably better. I think the next-gen TV improvements will be about color correction as intended by the source, HDR10 for example.

I've noticed that even 5.1 has more or less been settled on as the sound standard, and Dolby Atmos isn't catching fire.
 
16 bits is more than good enough for home reproduction. Do the numbers: you will be lucky if your room is quieter than 30 dBC (you have to stop breathing to better that!). CD has a dynamic range of a bit over 90 dB, so if you are to hear the veeeery quietest parts of the music, the very loudest parts will be at over 120 dB. How many of us have monitors that can do that, or indeed rooms and neighbours that allow it? Of course, "they" don't put such a dynamic range on commercial recordings; few classical pieces require it and certainly not pop!

But we should record at 24 bits because that allows a 144 dB dynamic range. Very high-end interfaces can now get to 125 dB or so, and almost all now are in the 100 dB range, the limiting factor being electronic noise, and I doubt that can be improved by more than a decibel or two. Academic for most of us anyway. Even if your room is subjectively "silent", record with any half-decent capacitor mic and the ROOM will be noisier than the mic or AI (audio interface)!
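If anyone wants to check the arithmetic behind those figures, the theoretical dynamic range of an ideal converter works out to roughly 6 dB per bit; a quick sketch:

```python
import math

# Theoretical dynamic range of an ideal converter: 20*log10(2^bits), about 6.02 dB per bit.
for bits in (16, 24):
    dr_db = 20 * math.log10(2 ** bits)
    print(f"{bits}-bit: ~{dr_db:.1f} dB")
# 16-bit: ~96.3 dB, 24-bit: ~144.5 dB — real interfaces land well below the 24-bit figure.
```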

The debate as to whether gear should reproduce frequencies well beyond human hearing is far older than digital recording and has never, to my knowledge, been resolved. Personally I think capturing anything (musically) beyond 20 kHz is bollocks.

Have AIs got better though? Yes, in my experience: they have quieter mic pres with a bit more gain than ten years ago. Some, however, have achieved this at the expense of headroom.

Dave.
 
If sample rate and bit depth settings are equal, can't it be assumed that preamp quality is the primary difference between older and newer interfaces (and, I guess, between lower-priced, lower-quality interfaces and more expensive, higher-quality ones)?
 
If sample rate and bit depth settings are equal, can't it be assumed that preamp quality is the primary difference between older and newer interfaces (and, I guess, between lower-priced, lower-quality interfaces and more expensive, higher-quality ones)?
Yes. I had a Behringer UMC204HD for a while and was surprised at how quiet the mic pre was. Good enough for speech recording with an SM57/58 at a few inches. Perhaps not quite good enough for an SM7B? I don't have one. In fact, my MOTU M4 pres are probably no better for noise and gain, but they certainly have much more headroom.
Yonks ago I had an M-Audio Fast Track Pro. Not a bad little interface, but the mic pres were next to useless with dynamics. The purchase of a pair of AKG P150s sorted that! This was WAY before the coming of the Cloudlifter and its derivatives.

Dave
 