I really think this whole issue is generally overstated. Any input on any interface is going to hit a differential amplifier - most likely op-amp based - as the first thing in the chain. The difference between one meant for a line input and one made for a microphone input is the amount of gain* available. The mic "preamp" is pretty much the same circuit as the line input's diff amp. In most interfaces, both of these are about as clean and flat as possible, and with today's technology that is pretty damn clean and flat even on the cheapest units, because the chips are cheap, the circuit designs are tried and proven, and it pretty much just works. If you're using an external pre, it must be that you don't want clean and flat, but running it through the clean and flat "pre" in the interface isn't going to somehow undo whatever your fancy fucking thing is doing. Plug it in and go.
*Actually, maybe more like available attenuation. I'm not deeply familiar with the circuitry inside these things, but I do know that many of these converter chips max out around 5V peak-to-peak at their inputs. Most line inputs are specified to take signals up to 22 or 24dBu, which works out to roughly 12V RMS - somewhere north of 30V peak-to-peak. That signal must be attenuated quite a bit before it hits the actual ADC. A mic input wouldn't need that much attenuation, so technically it might really be the "purer" path.
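The arithmetic behind that footnote is easy to sanity-check. A quick sketch (assuming a sine wave and the standard 0 dBu reference of 0.775 V RMS into 600 ohms; the 5 V peak-to-peak ADC limit is the figure quoted above, not a spec for any particular chip):

```python
import math

DBU_REF_VRMS = math.sqrt(0.6)  # 0 dBu = 0.775 V RMS (1 mW into 600 ohms)

def dbu_to_vpp(dbu):
    """Convert a level in dBu to peak-to-peak volts, assuming a sine wave."""
    vrms = DBU_REF_VRMS * 10 ** (dbu / 20)
    return vrms * 2 * math.sqrt(2)  # RMS -> peak -> peak-to-peak

def attenuation_db(v_in_pp, adc_max_pp):
    """dB of attenuation needed to fit v_in_pp into the ADC's input range."""
    return 20 * math.log10(v_in_pp / adc_max_pp)

line_max = dbu_to_vpp(24)
print(round(line_max, 1))                        # ~34.7 V p-p at +24 dBu
print(round(attenuation_db(line_max, 5.0), 1))   # ~16.8 dB of padding needed
```

So a +24 dBu line input has to be knocked down by roughly 17 dB before the converter ever sees it, while a mic-level signal needs little or none - which is the whole "purer path" argument in a nutshell.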