This topic has become very confused. The initial question was about pops and clicks, so what we should have done is ask to hear an example - that steers the diagnosis towards electronics and driver issues, and away from preamp, mic and input problems.
Then we came to all the stuff about sample rates and bit depths. Clearly we have a number of members interested in old analogue formats, so it's a bit laughable to insist on the highest settings for every project. Much of my work is with an archive of recordings from around '94, with a few that were originally analogue in the 80s. If I open an old project it's going to be 16-bit/44.1kHz, and frankly it really doesn't matter. Many of my projects feature old sound sources too, and there's no need to mess around with sample rate conversion when changing a sax part or adding some vocals - especially when the end result is going to be on a CD. What's the point of converting up and then back down again? Two trips through a sample rate converter can hardly be good for purity.
Any new projects will be 32-bit/96kHz, which gives me the best balance between processing power, file sizes and system capacity. I've got a collection of old VST instruments too, and I have yet to hear the difference between 48kHz and 96kHz, let alone 192kHz. What exactly is the point of recording information that is, in most cases, absent? Nyquist was a clever bloke - a 48kHz sample rate already captures everything up to 24kHz, which is beyond human hearing - and loads of my synths and other source sounds are frankly a bit harsh at HF anyway. EQ usually tames that nicely, so why would I wish to record it?
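For anyone who wants to see the Nyquist limit in action rather than take it on faith, here's a minimal sketch (the tone frequency and sample rate are just illustrative values I've picked): a tone above half the sample rate can't be represented at all, and instead folds back down as an alias.

```python
# Sketch: a 30 kHz tone "recorded" at 48 kHz sits above the Nyquist
# frequency (fs/2 = 24 kHz), so it cannot be captured. Instead it
# folds back to fs - f_tone = 18 kHz - aliasing, not extra detail.
import numpy as np

fs = 48_000          # sample rate, Hz (illustrative)
f_tone = 30_000      # tone above Nyquist (24 kHz)
n = 4096             # number of samples to analyse

t = np.arange(n) / fs
x = np.sin(2 * np.pi * f_tone * t)

# Find the dominant frequency actually present in the sampled signal
spectrum = np.abs(np.fft.rfft(x * np.hanning(n)))
peak_hz = np.argmax(spectrum) * fs / n
print(f"apparent tone: {peak_hz:.0f} Hz")  # ~18000 Hz, not 30000 Hz
```

The point being: raising the sample rate only helps if there is genuine, wanted content above the old Nyquist frequency - and for most sources there isn't.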
I'm perfectly happy for individuals to record in whatever format they like - but with more and more distribution actually happening via MP3, the point eludes me. Quarter-century-old DAT recordings at 48kHz still sound good in my collection. Adding an extra two octaves of emptiness doesn't convince me that advice to record at 192kHz makes any sense whatsoever. Sorry. Personally, I think it's PT Barnum all over again.