Uh...
Read that first link in context, folks. That is not a method for setting gain; it's an observation about whether the gain or the fader adds more noise to a signal, all other things being equal.
Set your gain after you've run your faders up to 75%, or whatever "gain setting" mark your mixer has. Otherwise, you're reading a fucked-up dB/VU level (which in most cases is measured at the end of the signal chain).
Think about it. You could max your gain through the roof, getting distortion left and right. But if your fader is set low to begin with, your end-of-chain meter can only see that signal after it's been cut by the fader. So you're left with no way (outside of hearing the signal itself) of knowing whether you've maxed the signal out. The meter would still show headroom when in fact the pre-amp is already clipping.
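To make that concrete, here's a toy sketch of the problem. All the numbers and the hard-clip model are made up for illustration, and the fader is treated as a simple linear attenuator, which real consoles aren't:

```python
# Toy mixer channel: pre-amp -> fader -> end-of-chain meter.
# Hypothetical values only, not real console specs.

CLIP_LEVEL = 1.0  # normalized full scale at the pre-amp output

def channel_meter(source, preamp_gain, fader):
    """Return what the end-of-chain meter sees, plus whether the pre-amp clipped."""
    pre = source * preamp_gain
    clipped = abs(pre) > CLIP_LEVEL
    pre = max(min(pre, CLIP_LEVEL), -CLIP_LEVEL)  # hard clip at the pre-amp
    return pre * fader, clipped

# Gain cranked way past clipping, fader pulled down low:
level, clipped = channel_meter(source=0.5, preamp_gain=4.0, fader=0.25)
print(level, clipped)  # 0.25 True -- meter shows plenty of "headroom", pre-amp is clipping
```

The meter reads a comfortable 0.25 of full scale, and the only clue that the pre-amp is slammed is the `clipped` flag, which on a real desk you'd have to hear.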
The post we've been talking about concerns whether to use the fader or the pre-amp (in this case, the built-in one) to amplify a given signal. The gain (controlling the pre-amp) is the strongest (least noisy) link in the chain, so get as much as you can out of that first. Then, AFTER you've recorded, go back and screw with the faders.
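As a rough illustration of that noise argument, here's a toy model where each stage injects its own noise floor, which everything downstream then amplifies. Both noise figures are invented (the only assumption carried over from the post is that the fader stage is noisier than the pre-amp), and the same total gain of 8x is split two ways:

```python
# Toy two-stage noise model: pre-amp then fader.
# Each stage injects a fixed noise floor at its input; downstream gain amplifies it.
# Noise figures are hypothetical, chosen only so the fader is the noisier stage.

PREAMP_NOISE = 0.001  # noise injected at the pre-amp (quiet stage)
FADER_NOISE  = 0.010  # noise injected at the fader stage (noisy stage)

def snr(source, preamp_gain, fader_gain):
    """Signal-to-noise ratio at the end of the chain."""
    signal = source * preamp_gain * fader_gain
    # Pre-amp noise passes through both gains; fader noise only through the fader.
    noise = PREAMP_NOISE * preamp_gain * fader_gain + FADER_NOISE * fader_gain
    return signal / noise

# Same 8x total amplification, split differently:
print(round(snr(0.1, 8.0, 1.0), 1))  # 44.4 -- gain from the pre-amp
print(round(snr(0.1, 1.0, 8.0), 1))  # 9.1  -- gain from the fader
```

Same output level either way, but taking the gain at the quiet pre-amp leaves you with a much better signal-to-noise ratio than pushing the noisy fader, which is the whole point of the first link.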
The 75%-fader thing is there for a couple of reasons:
1) You can boost that track's level relative to the others without having to pull every other track down. It's like having 25% of the fader's travel (actually a 33% boost, since 100/75 ≈ 1.33) in reserve after the pre-amp/source recording.
2) It provides a standardized starting point for setting gain, so as to maximize the pre-amp's share of the amplification (which is better than pushing your faders up; faders add more noise than pre-amps, as covered above).
3) The faders have been calibrated to this mark to ensure the meters read correctly relative to the pre-amp gain. You've usually only got one meter per channel, so setting the fader this way lets the dB/VU meters give you an accurate level reading.
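For what it's worth, the 25%-vs-33% arithmetic from point 1, treating the fader as a linear attenuator for simplicity (real fader tapers are logarithmic, so the dB figure is only ballpark):

```python
# Fader at 75% leaves 25 points of travel, but as a ratio the boost to 100%
# multiplies the signal by 100/75, i.e. about 33%, or roughly 2.5 dB of leeway.
import math

boost_ratio = 100 / 75
boost_db = 20 * math.log10(boost_ratio)
print(round(boost_ratio, 2), round(boost_db, 1))  # 1.33 2.5
```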
Hell, this was probably covered by that second link. Didn't go there; I was eager to help kill this thread (pissy mood today).