Okay, a few thoughts here...
First, there's no "best" input level setting. It's a control designed to get the optimum level from each source you use, which means adjusting it for each recording.
Second, the waveform is merely a graphical representation of the levels you're getting and can be somewhat deceiving, particularly with some DAWs that let you adjust the range of the display. It's much more accurate and important to use the meters and let the waveform go where it wants.
Third, on your meters, be aware that there are several different standards for what they're telling you. The two most common for the purposes of home recording are dB(VU) and dB(FS). Your DAW will almost certainly be reading dB(FS), which stands for "full scale". On this scale, 0dB is equivalent to the very loudest signal your system can handle in the digital domain. If you're recording at 24 bit, that means a reading of 111111111111111111111111. There's no way to go louder, so anything more will cause distortion. (There was no good reason to put those 24 number 1s there other than my own amusement.)
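If you like seeing the arithmetic, here's a quick sketch of what a dB(FS) reading means. This is just an illustration in Python, not anything from a DAW: samples are treated as normalised values where 1.0 is digital full scale, and the `dbfs` function name is my own invention.

```python
import math

def dbfs(sample, full_scale=1.0):
    """Level of a sample relative to digital full scale, in dB(FS).

    0 dB(FS) is the loudest value the system can represent;
    everything real sits at 0 or below (negative numbers)."""
    return 20 * math.log10(abs(sample) / full_scale)

print(dbfs(1.0))   # 0.0 -> the clipping point, can't go louder
print(dbfs(0.5))   # about -6.0 dB(FS), half of full scale
print(dbfs(0.1))   # -20.0 dB(FS)
```

The key takeaway: dB(FS) readings are always zero or negative, and 0 is a hard ceiling, not a target.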
Anyway, the other scale, dB(VU), is used mainly in analogue, and 0dB(VU) is a relatively arbitrary voltage well below the point at which the signal will clip. It's pretty common to record at +8dB or even more in analogue and still have headroom before clipping.
This brings us to the important part. Typically, 0dB(FS) equates to +18dB(VU). What this means is that your ideal level is probably going to have peaks somewhere in the -10dB(FS) area. I often push it a bit harder but try never to go above -6dB(FS) when tracking. This means relatively small looking waveforms on most DAWs.
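To make that conversion concrete, here's a small sketch assuming the common 0dB(VU) = -18dB(FS) calibration I mentioned (some systems use other offsets like -14 or -20, so check your own gear). The function names are mine, just for illustration:

```python
# Assumed calibration: 0 dB(VU) lines up with -18 dB(FS).
VU_REF_DBFS = -18.0

def vu_to_dbfs(db_vu):
    """Convert an analogue-style dB(VU) level to digital dB(FS)."""
    return db_vu + VU_REF_DBFS

def dbfs_to_vu(db_fs):
    """Convert a digital dB(FS) reading back to dB(VU)."""
    return db_fs - VU_REF_DBFS

print(vu_to_dbfs(0))    # -18.0: a "0 VU" signal sits well below clipping
print(vu_to_dbfs(8))    # -10.0: a hot analogue level, still 10 dB of digital headroom
print(dbfs_to_vu(-6))   # 12.0: a -6 dB(FS) tracking ceiling in VU terms
```

So a healthy analogue level of +8dB(VU) lands right around that -10dB(FS) peak area, which is why the waveforms look small but the levels are fine.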