I'm going to do tests at 100 Hz, 1 kHz, and 10 kHz with both 35-90 and LPR35 to see what happens. If everything looks relatively okay, I think it's best if I leave things as they are for the moment. There are some other small issues I've been having that I could attend to first before I buy a test tape, audio voltmeter, etc.
However, I'm now back to thinking: why does Tascam say the 388 is set up for 357 (355 nWb/m?) if the electronics and meters are set up for 250 nWb/m?
Maybe I'm just not getting it here; I'm sorry if that's the case. I do understand now how two tapes' fluxivities can differ even though they're classed as belonging to the same reference standard (LPR35 and 457, for example). Wouldn't using 35-90 (200 nWb/m) be better, though, if my 388 was (and presumably still is) set up for 250 nWb/m? I don't mind the earlier saturation point and would rather not deal with crosstalk if that's the biggest factor.
Again, sorry for confusing everyone. I think I'm getting there, though?
Part of what is confusing you (I think) is that a particular tape's output level class (i.e. +3, +6, etc.) is based on the flux level that produces 3% harmonic distortion. For a +6 tape that is 355 nWb/m (that number may differ depending on whether you are using the DIN standard or the NAB standard, for instance...let's just say 355 nWb/m). Well, are you really going to set up your machine so 0VU means you are already pushing your tape into saturation? The typical practice was to reference 0VU to something that was getting a good, strong signal to tape; if you wanted to saturate the tape, you would push your average levels during tracking to something higher. Remember, you can set the meters to reference whatever you want, and that may or may not reflect the strength of the signal that's actually printing to tape. Calibrating the machine is about two things:
1. lining up all your tracks so they are the same levels
2. setting your input, record and reproduce levels so you are maximizing signal to noise while staying within your line amplifier's headroom, keeping things controlled for noise reduction processing (if applicable), and hitting the tape with whatever signal strength you want in order to get what you want out of the tape (maybe that's clean, maybe that's saturation, maybe something in between) ---AND--- setting your meters so 0VU means something relevant to your setup.
That #2 is a mouthful, and that's why, when you are starting out, it's valuable to set the machine up the way the factory specified, and then branch out from there depending on what you identify you want to achieve after getting the basic setup under your skin. There are a ton of possibilities and ways to use different tape and machine setups, which is what makes analog tape a valuable tool for your canvas, but there needs to be some fundamental understanding and experience with it first.
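If it helps to put numbers on those level classes, the "+N" labels are conventionally quoted in dB relative to an older operating level; the commonly cited reference is 185 nWb/m (the old Ampex/NAB operating level — take that figure as my assumption here, not something stated in this thread). Flux is an amplitude-like quantity, so the ratio goes through 20·log10:

```python
import math

def flux_db(flux_nwb, ref_nwb=185.0):
    """dB of a tape flux level relative to a reference flux level.

    ref_nwb defaults to 185 nWb/m, the assumed old Ampex/NAB
    operating level that the '+3', '+6', etc. labels are quoted against.
    """
    return 20 * math.log10(flux_nwb / ref_nwb)

print(round(flux_db(250)))  # ~ +3: why 250 nWb/m is a "+3" operating level
print(round(flux_db(355)))  # ~ +6: why a tape distorting at 355 nWb/m is a "+6" tape
```

The exact values come out to about +2.6 dB and +5.7 dB, which get rounded to the familiar +3 and +6 labels.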
SO...back to your question...457...it's a +6 tape, which means 3% harmonic distortion occurs when a 1 kHz sine tone is printed to tape at a 355 nWb/m flux level. But the "standard setup" for a +6 tape is to reference 0VU to a 250 nWb/m flux level applied to tape, so your average peaks at 0VU are still clean. Again, the VU meter is a visual reference for you during tracking. The "standard setup" assumes that if you want to achieve some saturation, you are pushing your average peaks to +3VU.
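To make that headroom concrete: the gap between the 250 nWb/m 0VU reference and the 355 nWb/m 3%-distortion point works out to about 3 dB, which is exactly why pushing average peaks to +3VU lands you in the saturation region. A quick check (plain arithmetic, nothing machine-specific):

```python
import math

# Headroom between the 0VU reference flux (250 nWb/m) and the
# 3% harmonic distortion flux (355 nWb/m) for a +6 tape.
headroom_db = 20 * math.log10(355 / 250)
print(round(headroom_db, 1))  # -> 3.0, so 0VU plus ~3VU of push reaches ~355 nWb/m
```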
Does that help?