Can a DAW alter sound just by importing / exporting?

Thread starter: nlduke (New member)
Hi. New here so hopefully I've got the right subthread :)
When I import a track as a .WAV file into GarageBand and then simply export it at the same lossless quality (WAV, 16-bit) with no added effects or adjustments (none in the master track either), the exported track somehow has a noticeably different tonal quality, perhaps even degradation, compared to the original. Does anyone know why this might happen? Or if I’m doing something wrong?

P.S. I know it might sound like a pointless exercise, but I just want to trim an already-produced track and I don’t have the original file with all the stems – just the final mastered track.

Any info would be much appreciated!
 
Different dither 'shape'? I don't know but will watch this space.

Dave.
 
You have to be careful with GarageBand - it tends to normalize everything, and there's FX on the master/stereo out that may not be turned off. For file editing, Audacity is the better (free) option I think.
 
I agree - when something sounds different, there is some process happening; otherwise it's a data duplicate.
 
That makes sense, thanks so much for the suggestions. I will definitely try Audacity and compare.
I did turn off the default FX on the master; however, one thing I'm now noticing in GarageBand that might be a factor ... When I import the WAV file and hit play, it’s already clipping the track. I guess that's maybe because it's a file that’s already been brought up to "commercial" volume? Not sure it would matter, but I do wonder if the track clipping is causing it to export with some distortion or sound loss or something.

Thanks again for the advice!
 
I get backing-track files (mostly MP3s) from clients who intend to sing over them, and they are very often slammed, so that's not unusual. I get caught off guard by that now and again. If a track was already slammed and GarageBand is adding a little gain or whatever on export, then yes, that's a problem. I'm not sure why GarageBand would, say, automatically normalize tracks, but you need to be careful of that. A guy I often record with just got an iPad with GarageBand on it, so I'll ask him how it's going and whether he's noticed anything coming out too hot or distorted.
 
There's an easy way to see if the process is neutral. Simply load the two files into the DAW, invert one, and sum them. If the sound has been changed, you will easily hear something. If they are the same, you should hear little or nothing (depending on dither, level matching, etc.).

I took one of my recordings, inverted the phase and rendered it at 0dB level. I then added that track back to the original and played them. You can see that both tracks are reading healthy signal, but the master, which is the summed signal, reads no signal. The input equals the output.
[Attachment: April Invert.webp]
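For anyone who'd rather script the null test than set it up on a timeline, here's a minimal sketch in Python with NumPy. It synthesizes a test tone instead of loading real files so it runs self-contained; with real exports you'd load two WAVs into arrays first.

```python
import numpy as np

def null_test(a: np.ndarray, b: np.ndarray) -> float:
    """Sum track `a` with a polarity-inverted copy of `b` and return
    the peak of the residual in dBFS (-inf means a perfect null)."""
    residual = a - b  # inverting b and summing is the same as subtracting
    peak = np.max(np.abs(residual))
    return float("-inf") if peak == 0 else 20 * np.log10(peak)

# Stand-in for an original track: a -6 dBFS 440 Hz tone at 44.1k.
t = np.linspace(0, 1, 44100, endpoint=False)
original = 0.5 * np.sin(2 * np.pi * 440 * t)

print(null_test(original, original.copy()))  # -inf: identical data nulls completely
# Even a 0.1 dB gain change leaves a residual around -45 dBFS:
print(null_test(original, original * 10 ** (0.1 / 20)))
```

Note that real exports at the same bit depth should null to (or very near) silence; any dither or gain stage shows up immediately as leftover signal on the meter.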
 
I never thought of doing that! Such a simple test - never too old to learn new tricks!
 
Excellent idea. I ran the test with several different samples and got some useful results. Both the GarageBand export and the Audacity export (both with no added FX) completely zeroed out the master meter when I flipped the phase and played against the original. (I wonder if that means my ears, or mind, were deceiving me when I was first sour on the GarageBand export.) When I did the test using exports with FX added, or exports of different file types like MP3 etc, there was plenty of noise left over when the inverted file was played against the original.

Very cool tip that I’m sure I’ll be using a lot. Thanks!
 
Adding in different effects can cause the null to be incomplete because most plugins have some type of randomness added to them, be it compression, reverb, saturation or other such devices. They are, in many cases, trying to model hardware devices, which can vary slightly.
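That incomplete null is easy to reproduce: even a tiny time-varying gain wobble, the sort of randomness an analog-modeling plugin might inject, leaves a clearly measurable residual. A quick sketch (the ±0.05 dB wobble figure is just an illustrative assumption, not taken from any particular plugin):

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 44100, endpoint=False)
dry = 0.5 * np.sin(2 * np.pi * 440 * t)

# Emulate a modeled "analog" stage: unity gain plus a small random
# per-sample wobble of about +/-0.05 dB - inaudible on its own,
# but enough to spoil a null test.
wobble_db = 0.05 * rng.standard_normal(dry.size)
wet = dry * 10 ** (wobble_db / 20)

residual_peak = 20 * np.log10(np.max(np.abs(wet - dry)))
print(f"residual peak: {residual_peak:.1f} dBFS")  # well above a 24-bit noise floor
```

So a non-null after a plugin doesn't necessarily mean anything audible changed; it just means the process wasn't bit-identical.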

Your comment about your ears deceiving you is perfectly valid; it's one reason I always take comments on gear with a grain of salt. Perception bias is very real. I always say that if I know the answer, I will get it right a lot more often! I get a chuckle when I come across the comparisons put up by SoundPure. They do most of their comparisons blind; you have to email them for the actual answers. People go ballistic in the comment section about how they waste everyone's time by not telling you what you're listening to. The key is which sounds better to you... NOT which one is closest to Mic A or Preamp B.

A microphone that is 70 years old might be legendary, but that doesn't mean that every remaining example is perfect. Setting a device up as some type of absolute standard is pointless. You wouldn't consider a 1958 Corvette the pinnacle of auto excellence. It might be a really cool car (I would love to have one). The same thing happens with '59 Les Pauls and pre-CBS Strats. The fact that Sinatra used a U47, or that Clapton played a '59 LP, doesn't prove anything except that they were very good items at that time.

But I digress..... I'm glad you're a bit more comfortable with your results.
 
Side question about rendering: in Reaper, does the WAV bit depth setting change the sound? 32-bit PCM vs 32-bit FP? 4-bit IMA ADPCM?
 
I have not tried the 32-bit/24-bit thing, but I did open a 48k project in Reaper not realising the project setting was 44.1k, and Reaper converted all the incoming 48k files to 44.1k.

Cheers
 
I use 24bit 48k for video, and 44.1k for general recording. However, I tend to not pay much attention to what the project settings are, and record in one or the other without realising. Reaper doesn't seem to care. So I can have a mixture of 48 and 44.1 in the same project, and it all renders in a very civilised way.
 
Side question about rendering: in Reaper, does the WAV bit depth setting change the sound? 32-bit PCM vs 32-bit FP? 4-bit IMA ADPCM?
It really shouldn't change the sound. Adding bit depth just pushes the digital noise floor down. Since the theoretical noise floor of 24-bit is around -144 dB, far lower than the noise floor of any analog signal you are recording, it shouldn't change anything.

What adding bit depth does do is push any processing errors farther down the scale, where they are less likely to be heard.

Of course, floating point keeps you from distorting when you go over 0 dBFS during processing, but since there is no such thing as a floating-point converter, what you hear will always be fixed point, which will distort if you try to go over 0 dBFS.
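To put rough numbers on that, here's a hedged sketch (Python/NumPy, no dither, so these are theoretical best-case figures): quantize a tone to 16 and 24 bits and measure the error, then show why float survives going over full scale while fixed point doesn't.

```python
import numpy as np

def quantize(x: np.ndarray, bits: int) -> np.ndarray:
    """Round to a fixed-point grid of `bits` bits (no dither)."""
    scale = 2 ** (bits - 1)
    return np.round(x * scale) / scale

t = np.linspace(0, 1, 48000, endpoint=False)
x = 0.5 * np.sin(2 * np.pi * 1000 * t)  # -6 dBFS 1 kHz tone

for bits in (16, 24):
    err = quantize(x, bits) - x
    rms_db = 20 * np.log10(np.sqrt(np.mean(err ** 2)))
    print(f"{bits}-bit quantization error: {rms_db:.0f} dBFS")

# Floating point keeps headroom above 0 dBFS; fixed point clips:
hot = x * 4.0                                # +12 dB over full scale
print(np.max(np.abs(hot)))                   # ~2.0 in float: recoverable with a gain cut
print(np.max(np.abs(np.clip(hot, -1, 1))))   # 1.0: clipped, information lost
```

With no dither, the error RMS lands near the textbook figure of 20·log10(2^-(b-1)/√12): roughly -101 dBFS for 16-bit and -149 dBFS for 24-bit, both far below any analog noise floor.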
 