What is bit rate?

jerzeysk8board

New member
I'm exporting audio from a recording and can choose between 128 and 320 kbps. What does this do to the audio, and which setting gives the best quality?
 
It's a triple post, actually :D

Be patient, jerzey. No need to ask this on every forum. Answers will appear.
 
The higher the bit rate... obviously the better.
I usually use 224 kbps when I need to export something to MP3 (for my iPod).
 
Can all devices play 320 kbps files, like iPods and other players, or does it only affect the size of the file?
 
jerzeysk8board said:
Can all devices play 320 kbps files, like iPods and other players, or does it only affect the size of the file?


Yes.
It only affects the size of the file. If a device can play an MP3 file, it can play it at any bit rate, high or low.
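If you want to see how much the size changes: an MP3's size is basically bit rate × length. Here's a quick sketch of the arithmetic (the 4-minute track is just an example; real files add a little for headers and tags):

```python
def mp3_size_mb(bitrate_kbps, seconds):
    """Approximate MP3 size in MB: kilobits / 8 -> kilobytes, / 1000 -> MB."""
    return bitrate_kbps * seconds / 8 / 1000

# Example: a 4-minute (240 s) track at common export settings
for kbps in (128, 192, 320):
    print(f"{kbps} kbps, 4-minute track: ~{mp3_size_mb(kbps, 240):.1f} MB")

# 128 kbps: ~3.8 MB
# 192 kbps: ~5.8 MB
# 320 kbps: ~9.6 MB
```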
 
Just one quick thing to point out about the term "bit rate", as it has two meanings in the recording world...

The one you're talking about (usually expressed in kbps, kilobits per second) refers to the compression of an MP3, and hence its quality. A CD-quality wave file (i.e. uncompressed) has a bit rate of 705.6 kbps per channel (16 bits × 44.1 kHz), or 1411.2 kbps for stereo. While the MP3 (MPEG layer 3) compression algorithm is good, the more you compress, the "worse" it is going to sound. I'm not sure what the "standard" is, but I think 192 kbps is the "usual" for MP3s (although there are variable bit rates etc.). However, since they all use the MP3 algorithm, any MP3 player can play an MP3 of any bit rate. There is a universe of websites dedicated to MPEG compression and its effects; just use Google.
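To sanity-check that 705.6 kbps figure: the uncompressed bit rate is just bit depth × sample rate × channels. A quick Python sketch of the arithmetic (the 192 kbps comparison at the end is just an example):

```python
# Uncompressed PCM bit rate = bit depth * sample rate * channels
def pcm_bitrate_kbps(bit_depth, sample_rate_hz, channels):
    return bit_depth * sample_rate_hz * channels / 1000

mono = pcm_bitrate_kbps(16, 44_100, 1)    # 705.6 kbps, the per-channel figure
stereo = pcm_bitrate_kbps(16, 44_100, 2)  # 1411.2 kbps for a stereo CD

print(f"CD mono:   {mono} kbps")   # 705.6
print(f"CD stereo: {stereo} kbps") # 1411.2
# A 192 kbps MP3 squeezes that stereo stream by roughly 1411.2 / 192, about 7:1
```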

The second use of the term "bit rate" (the one you're more likely to hear on these forums) refers to the "depth" of a digital recording. It is also commonly called "bit depth". When you convert an analogue signal into digital, you take a "sample" of the current voltage, which is represented as a binary "word". The word length (aka bit rate) determines how precisely the sample can be captured. CD audio uses a bit depth of 16 bits, most recording equipment uses a bit depth of 24 bits, and telephones use 8 bits. In simple terms, the more bits, the better the quality.
Also related to this is the "sample rate", which is the number of samples taken per second. For CDs, this is 44,100 times a second (44.1 kHz). The "standard" pro recording rate is 48 kHz.
Once again, Google is your friend for more info about bit/sample rates and how they relate to audio.
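As a rough illustration of "more bits, better quality": each extra bit doubles the number of amplitude steps and buys you about 6 dB of dynamic range. A small sketch (the 8/16/24 examples mirror the phone/CD/recording-gear figures above):

```python
import math

# Each bit doubles the quantization levels; dynamic range grows by
# roughly 6.02 dB per bit (20 * log10(2) per bit).
for bits in (8, 16, 24):  # telephone, CD, typical recording gear
    levels = 2 ** bits
    dyn_range_db = 20 * math.log10(levels)
    print(f"{bits:2d} bits: {levels:>10,} levels, ~{dyn_range_db:.0f} dB dynamic range")

#  8 bits:        256 levels, ~48 dB
# 16 bits:     65,536 levels, ~96 dB
# 24 bits: 16,777,216 levels, ~144 dB
```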
 
cpl_crud said:
The second use of the term "bit rate" (the one you're more likely to hear on these forums) refers to the "depth" of a digital recording. It is also commonly called "bit depth".

It is commonly called bit depth because it is bit depth, not bit rate. The word "rate" tells you that something is being measured against time, and bit depth doesn't have anything to do with time. It is and always will be bit depth.

cpl_crud said:
The "standard" pro recording rate is 48 kHz.

Only in video is this the "standard" rate. (Notice that this is a rate, because it has to do with time.)
 
I think you'll find the term "bit rate" does apply, as it refers to bits per sample.
A "rate" of anything is basically anything per anything else, not necessarily anything per time. It's just that most of the "rates" we see in day-to-day life are time rates.

Yes, it's a question of semantics, but the terms "bit depth" and "bit rate" are used interchangeably; reading a variety of manuals and texts will show you this. (Although I will concede that, to be completely accurate, the "bit rate" of a recording must include both the bit depth and the sample rate, e.g. 24 bits × 48 kHz × 2 channels = 2304 kbps.)
I included this point simply because this forum is designed to teach people. If you go blindly through life thinking that the term "bit rate" always means the overall bit rate, you're going to get confused when a spec sheet, forum post or review calls the "bit depth" the "bit rate". Also, when manuals are written in languages other than English and then translated, you can also see the bit depth referred to as the bit rate.


As for the "standard", this is once again a question of semantics. The "standard" for CDs is 16/44.1, although you can buy 24/96 discs (DVD-Audio rather than CD). What is known as the "professional sample rate" is 48 kHz. Sure, not many people use it as their recording rate, but that's what the standard is, just like the standard kilogramme: not everything has a mass of 1 kg, but the standard will always be 1 kg.
Also, you'll find the usual higher rates are multiples of 48 kHz anyway, 96 and 192 kHz being the examples. I wouldn't be surprised at all to find that the clocks in these devices run at 48 kHz and are then multiplied.
 
No, 24-bit is a bit depth. A bit rate would be 24-bit/44.1k, even using your definition. Without the qualifier, it isn't a rate.
48k was the "professional" sample rate only when compared to the "consumer" sample rate of 44.1k. The pro vs. consumer thing went out the window when consumer DAT machines didn't catch on 16 years ago.

Yes, it is semantics, but people are confused enough around here without us using these words interchangeably.
 