Questions about inputs, impedances, and signal degradation

  • Thread starter: ahrenba (New member)
Hey guys,

Some of you may have gotten to know me as the guy who is trying to get an electrical engineering degree on this forum. :):) Just kidding, but I find this stuff pretty fun and it is something that I have been trying to learn in my spare time.

Gear up, this is a long post. I have tried to format and number the questions for your viewing experience :), so please try to number responses. Thanks!!

First off, let's take a look at some example circuits that I created and calculated. Please note that I just picked an arbitrary source voltage.

Typical line output Z to hi-Z input resistance

1v ----------> 600 ohm ----------> 1,000,000 ohm ----------> }}}
Current: .0000009994 amps
Vdrop 1: .0005996 volts
Vdrop 2: .9994 volts (nearly max voltage transfer)

Typical guitar output Z to line-input resistance

1v ----------> 100,000 ohm ----------> 3,000 ohm ----------> }}}
Current: .000009709 amps
Vdrop 1: .9709 volts
Vdrop 2: .02913 volts (extreme loss of voltage)

Typical guitar output Z to hi-Z input

1v ----------> 100,000 ohm ----------> 1,000,000 ohm ----------> }}}
Current: .0000009091 amps
Vdrop 1: .0909 volts
Vdrop 2: .909 volts (nearly max voltage transfer)
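The three examples above are all just series voltage dividers, so the numbers can be checked with a short script. This is a sketch using the same example values from the post (1 V source, the listed source and load resistances):

```python
# A source impedance driving a load impedance forms a voltage divider:
# I = V_src / (Z_src + Z_load), and each drop is I times that resistance.
def divider(v_src, z_src, z_load):
    i = v_src / (z_src + z_load)       # series current (amps)
    return i, i * z_src, i * z_load    # current, drop across source Z, drop across load

# The three example circuits: line-out -> hi-Z, guitar -> line-in, guitar -> hi-Z
for z_src, z_load in [(600, 1_000_000), (100_000, 3_000), (100_000, 1_000_000)]:
    i, v1, v2 = divider(1.0, z_src, z_load)
    print(f"{z_src} ohm -> {z_load} ohm: I={i:.4e} A, Vdrop1={v1:.4f} V, Vdrop2={v2:.4f} V")
```

The load's share of the voltage is Z_load / (Z_src + Z_load), which is why a load roughly 10x the source impedance keeps over 90% of the voltage.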

As you can see, plugging a lower impedance into a higher-impedance input allows for maximum voltage transfer. Usually you want the load impedance to be about 10x the source impedance. I know that signal (or level) is based on the voltage, which is why you want maximum voltage transfer in an audio circuit.

Ok, I realize that when manufacturers make inputs (mic, line, instrument), they design them to work with the expected levels (voltage, impedance) that their respective sources will provide.
------------------------------
Here is my first question:
1. Why don't manufacturers design inputs all with high impedances so that everything plugged into them will have maximum voltage transfer? What's the downside? What happens to the signal that makes them not do this?

I realize that if you did this, and the source voltage was higher than the input is expecting, your signal will be too hot. Which raises another question:

2. Couldn't you just put an attenuator or pad button on the input so that you could reduce the signal when needed?

So, why don't they do this? What happens?
------------------------------
I have heard of signal degradation and resistive loss. From what I have heard, it occurs when something gets loaded down (like a guitar's pickups). From what I know, loading down occurs when you plug a higher source impedance into a lower load impedance. This leads me to my next questions:

3. Why does signal degradation and resistive loss occur? What causes it?
4. Is signal degradation also known as frequency rolloff?
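One common form of the degradation asked about here is treble rolloff: a high source impedance (like a guitar pickup) together with cable and input capacitance forms a first-order RC low-pass filter, so loading down the source pulls the -3 dB corner into the audible range. The values below are illustrative assumptions (roughly 100 kΩ source, roughly 500 pF of cable capacitance), not measurements of any particular gear:

```python
import math

# -3 dB corner frequency of a first-order RC low-pass filter.
# The source impedance is the R; cable + input capacitance is the C.
def corner_hz(r_source_ohms, c_farads):
    return 1.0 / (2.0 * math.pi * r_source_ohms * c_farads)

# ~100 kohm pickup into ~500 pF of cable: corner lands around 3.2 kHz,
# well inside the audible band, so highs get rolled off.
print(corner_hz(100_000, 500e-12))
```

With a much lower source impedance (say 600 ohms), the same capacitance puts the corner far above the audio band, which is one reason low output impedances survive long cables better.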
-----------------------------
Ok, I know that voltage is what provides the level of the signal. Current is how the voltage "travels"....I think.

5. However, what exactly does current do for the audio signal? Theoretically, if you have a lot of current, then "a lot" of voltage would be traveling, and vice versa. But what does it actually mean (and what happens) when you have a lot, or too little, current? For example, you could have a really high voltage and low current....what would happen?
-----------------------------
6. When you see a "dedicated instrument input" on a mixer, interface, etc., I realize that this is geared to the expected voltages and impedances of a guitar/instrument. This would provide maximum voltage transfer. HOWEVER, isn't this level still below line level? Would this mean that there has to be a preamp somewhere in the circuitry that would boost the signal?

If a dedicated instrument input didn't have a preamp somewhere in its circuit, what would the benefit of it be versus a normal input....other than minimal frequency rolloff and max voltage transfer? This is why I would assume that these instrument inputs are preamped in order to get the signal up to line level. Would it be safe to assume so?

I guess what I am confused about is audible and recording levels. I realize that preamps just get the signal to where the input needs it to be so that it can work with it from there. The thing that confuses me is what level is actually loud enough to record? Because if you have a line input and an instrument input, they contain different voltages, and thus have different levels running through the mixer etc....

7. Also, I do not need an external preamp to plug a guitar into an instrument input, do I? Since it is expecting the voltage and impedance of an instrument, it is therefore not line level, and will be ok with accepting a guitar direct, correct?
----------------------------

Thanks, and please don't get too frustrated with me asking these questions. I know I should probably take an engineering course, LOL, but I figured that I could at least try on this board. :):)
 
To go this deep requires a little more background than anyone will be willing to type. If you aren't familiar with some basics of electricity and circuit design, the answers to your questions won't make any sense.

I suggest you take a course at the local college and start from the beginning.

I've already answered questions #6 and #7 for you (several times, actually).
 

Oh, whoops. That was my mistake. I made this post before you answered those questions, and was unable to edit it due to this forum's time restriction.
 