Tascam 388 Calibration and Bias

migrant worker

I have a Tascam 388 and have just recently bought an MRL 21T204 calibration tape. I'm hoping to check the bias and set the operating level (both record and reproduce levels). I need help from some of you who've been through this before. It would be great if you could walk me through the process. I also have a Heathkit model IG-18 signal generator at my disposal. It will generate frequencies up to around 100,000 Hz. I have a manual as well. I need help determining which inputs will allow me to calibrate which level, and what I will calibrate using the MRL tape (I'm assuming the playback level?). Thanks in advance for any info you can impart.

Matt
 
Matt, welcome!

The MRL tape is used to calibrate the playback level and playback frequency response.

You use your own fresh tape to calibrate the record level and record frequency response and to set the bias.

The manual is really good on the step-by-step process (i.e., it's laid out in the order you should go through the steps).

A suggestion: rather than trying to lay out or recreate detailed instructions here (Teac already did it), crack open that manual and post your specific questions in this thread if you hit stumbling blocks as you go through the steps.

Also, look at the sticky thread on calibration at the top of the thread list.
 
Hey, thanks for the quick reply! Here's one other question. I'm planning to use Quantegy 457 (a +3 dB OL tape as I understand it). Currently, all I have on hand is Quantegy 456. Would it be okay to go ahead and calibrate with the 456, since it has the same fluxivity, etc., or should I wait until I order the 457? Would there be any significant difference? Thanks for any of your opinions!

Matt
 
My opinion, and hopefully others will chime in, but the only real difference between 456 and 457 is the thickness. 456 is a 1.5 mil class tape and 457 is 1 mil. Both are classed as a +6 tape. 406/407 are the +3 1.5 mil/1 mil equivalent tapes.

You can certainly calibrate with the 456. Any differences in level calibration should be minimal. There may be very minor differences because the thicker tape can affect tape tension, which in turn can affect levels to and from the tape at the heads, but there's no harm in setting it up with the 456 and getting a feel for the process. You'll at least get each track/channel consistent with the others, which is one of the major goals of calibration.

I do suggest you re-bias when you get the 457 (make sure it is Quantegy, not Ampex), or take a look at RMGI LPR35, a good quality 457 replacement. 456, 457, and LPR35 are all considered "bias-compatible", but there will be minor differences for sure.
 
Cool! Thanks again...

Thus far, my understanding has been that 456/457 are indeed +3 dB standard operating level (SOL) tapes, but have a maximum operating level (MOL) rating of +6 dB.

Comments made by Dave (A Reel Person) in the thread below from the Tascam forums led me to this conclusion...

http://tascamforums.com/index.php?showtopic=10829&mode=threaded&pid=62409


I think he's right that there is substantial confusion over SOL and MOL. Personally, my impression so far is that SOL is, on average, where you want to keep your meters to prevent tape distortion (which, of course, is not always an effect you want to entirely eliminate, but I'll leave that concept for discussion in another thread). And the MOL value (in dB) corresponds to the maximum fluxivity (in nWb/m) inherent to a given type of tape, the point past which that tape distortion will begin to occur.

Is this correct?

If so, I think what I need to know is... which of these two measures of operating level is pertinent to the calibration procedure? Will I be using SOL or MOL to calibrate?

One reason for my confusion is that at US Recording (where I bought my MRL tape) they refer to the MRL 21T204 tape as having a reference fluxivity of 250 nWb/m, but they state that this corresponds to an "operating level" of +3 (without differentiating between SOL and MOL).

Wait... I think I just found the answer (let me know if this is wrong)...

In the 388 manual, under the "RECORDER/REPRODUCER SECTION ADJUSTMENT" heading, it says: "if you use 250 nWb/m tape, adjust the trim for -10 dB reading."

On the spec sheet (below) for the MRL tape, it states that the frequency response section was indeed recorded at -10 dB.

http://home.comcast.net/~mrltapes/pub101.pdf

So, it looks like they're just referring to them by reference fluxivity and not really using the OL (in decibels) at all. Is this correct?

Finally, thanks again for all your help. I have downloaded TrueRTA for use in frequency analysis. Once I get this working, I hope to contribute to the forum by putting together a pretty comprehensive tutorial on calibrating the 388 using the MRL tape and this software (complete with pics, etc.).

Sorry for such a rambling post, and...as always...thanks again!

Matt
 
Ah.

Okay.

Yes. Before, you just referred to "OL", and I assumed you were talking about the tape classes +3, +6, +9, etc.

Yes indeed there is a difference between SOL and MOL. ;)

You can set a machine up for ANY SOL with any level of test tape...185 nWb/m, 250, 355. If you play a 1kHz tone back on a 250 tape and set the repro level for "0" on the VU, and then put the 185 tape on and reproduce the same tone, the meter will show about -3 VU. Put the 355 tape on and the meter will show about +3 VU. So the idea is that you set the deck up so that the meters show "0" at an average signal level that takes advantage of your recording tape's headroom without pushing the tape into noticeable distortion (unless you are going for that).

250 nWb/m is indeed +3, which is the SOL of the +6 classed 456 tape...+6 over the Ampex standard (which would be 0). So setting the repro level to 0 VU at 1kHz using a 250 nWb/m tape will set the deck up as you typically would for 456/SM911, etc.
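If you want to see where those "about -3" and "about +3" figures come from, the offset is just 20 x log10 of the fluxivity ratio. A back-of-the-napkin sketch (plain Python, my own scribbling, nothing from any manual):

    import math

    def flux_offset_db(tape_nwbm, cal_ref_nwbm=250.0):
        # dB offset a tape's reference tone shows on a deck whose
        # repro level was set to 0 VU using a cal_ref_nwbm tape
        return 20.0 * math.log10(tape_nwbm / cal_ref_nwbm)

    for flux in (185.0, 250.0, 355.0):
        print(f"{flux:5.0f} nWb/m -> {flux_offset_db(flux):+5.1f} VU")
    # 185 nWb/m -> -2.6 VU (the "about -3" above)
    # 250 nWb/m -> +0.0 VU
    # 355 nWb/m -> +3.1 VU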

Switching gears for a sec...if you tend to like to run levels hot, maybe you set the record levels so that the meters show -2 VU when you are recording a -10 dBV 1kHz tone. "-10" because the 388 mixer is a -10 mixer. That's 0.316 V RMS (see the conversion sketch after the list below). That way, when you are setting your record levels while recording program material, your input trims will be turned up higher to get your average levels to 0 VU. So you see, calibration accomplishes several things:

  1. Gets your meters calibrated to a known standard
  2. Gets your tape electronics consistent with each other...makes it so what you record on 1 track would sound the same in level and character (i.e. record and reproduce frequency response) if recorded on a different track
  3. Sets your meters to show 0VU for the average peaks in the program material you record for the sound you are wanting to get out of the tape you are using
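And the -10 dBV = 0.316 V figure above is the same kind of arithmetic...dBV is referenced to 1 V RMS. A minimal sketch:

    import math

    def dbv_to_volts(dbv):
        # dBV is referenced to 1 V RMS, so V = 10^(dBV/20)
        return 10.0 ** (dbv / 20.0)

    def volts_to_dbv(volts):
        return 20.0 * math.log10(volts)

    print(dbv_to_volts(-10.0))  # ~0.316 V RMS, the 388's nominal level
    print(volts_to_dbv(0.316))  # ~-10.0 dBV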

I hope that makes sense and I hope I'm accurate in what I'm saying. I'm looking to others to chime in and correct me or if I'm wrong lest I lead you astray. :eek:
 
Alright,
Well... I've got the head aligned (used a software X-Y scope for this) and I've set the record level of the meters with a test tone. Now I need some clarification about how to adjust the reproduce frequency response. Dave mentions this in another one of his posts, I think:

https://homerecording.com/bbs/showthread.php?t=169453&highlight=388

...but he's not really specific about how it's done. I have read section 1-5-2 in the manual, and I'm not sure that I understand what to do or what the associated figure is trying to convey. If you've done this step, would you mind explaining what it is I'm trying to achieve, which meters I'll be looking at while performing the calibration, etc.?

Thanks!

Matt


PS: Read your post on restoring your 388 today... nice job (very thorough)! Would love to see some pics of the finished product.
 
I'll cut to the chase...

I found I got the flattest repro frequency response on the 388 by setting the proper level using the 12.5kHz test tone. The adjustment is limited to one pot, which you must adjust to get the flattest average frequency response over the whole bandwidth. It's a balancing act. Try that and tell me if I'm wrong.
 
Thanks Dave!

I think I was able to get it done. If I did it right, it just involved adjusting each channel's level, while reproducing the 12.5kHz tone, using the appropriate trimpot (either 124 or 224) for the corresponding channel and watching that channel's VU meter. All eight of them seem to read the same now, but as you alluded to in the other thread, there are a few points along the frequency curve at which they read inaccurately (i.e., the various frequency tones on the MRL tape are recorded at -10 dB, but the machine reads less or more for a couple of the frequencies upon playback). If you think 12.5 works best to achieve the closest reading across the curve, then I'll stick with that.
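For anyone following along: if you'd rather quantify "closest reading across the curve" than eyeball it, here's an improvised scoring idea (my own, not from the manual). Log the playback reading at each MRL tone for a given trimpot setting, and score the setting by its RMS deviation from the -10 dB target. The readings below are made up, just to show the shape of it:

    import math

    TARGET_DB = -10.0  # the MRL frequency-response tones are at -10 dB

    def flatness_score(readings_db):
        # RMS deviation from the target across the test tones; lower = flatter
        errs = [(db - TARGET_DB) ** 2 for db in readings_db.values()]
        return math.sqrt(sum(errs) / len(errs))

    # made-up readings (Hz -> dB) for two hypothetical trimpot settings:
    setting_a = {100: -10.5, 1000: -10.0, 10000: -9.0, 12500: -8.5}
    setting_b = {100: -10.8, 1000: -10.2, 10000: -10.1, 12500: -10.0}
    print(flatness_score(setting_a))  # ~0.94, less flat
    print(flatness_score(setting_b))  # ~0.42, flatter: this setting wins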

As always... if I'm doin' it wrong, please let me know!!

Matt


PS: If any of this post just seems like reiteration of what's already been said, it's just so that others can follow should they want to use it as a reference.
 
Now I've just got to wait a couple of days for some 457 tape to come in before I can set the record level. (I could use this 456, but I figure as long as I'm goin' to all this trouble I'll wait so as to get the right calibration with the exact tape). Although, as I've set the machine up (at +3 over 0), I really wish I could get my hands on some 407 (as I think it might give more of the results I'm looking for on drums and electric guitar).
 
That's just what I remember worked best for me.

They will not all be even, is the point. I got the best results with 12.5kHz, but you're welcome to try the other bands to see if you can better your results.:eek:;)
 
Yeah...start with what Dave is saying...with experience you may find you like to bump it or drop it from flat at 12.5kHz because, remember, a change at 12.5kHz affects a fairly wide swath of the spectrum. See what you like. And it may change when using 407 vs. 457, but the point here is that you're getting a good grip on the process.

You're doing good.

Also, +1 to what Dave said about them not all being exactly the same...you'll typically notice poorer response on the edge tracks, especially on the narrower 1/4" 8-track format, and some more wavering of the meters on those edge tracks too. Don't sweat it though...we don't (or at least most of us don't) listen to sine tones for enjoyment, and what may appear to be an unsteady response or a diminished HF response with a sine wave will most likely be completely imperceptible with program material.

Have fun!
 
Damn... this is taking forever. For that reason, I'd like to make sure I'm doin' it right. I'm setting the reproduce level of each track by recording a test tone from a software signal generator program on my computer. The signal coming out is 400Hz at 0.316 VAC (which I think corresponds to -10 dBV). I'm then reproducing and monitoring the signal I have recorded on the tape, and adjusting the trimpots (143 and 243) such that the signal measures exactly (or as close as I can get) 316 mV (0.316 V) using an AC voltmeter. This is the best equipment I've got, so I had to go with this improvised method. Let me know if anything sounds wrong. Thanks!
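For anyone following along, here's the arithmetic for how close is close enough...the dB error of a given meter reading relative to the 0.316 V target (this assumes the meter reading is an accurate RMS value):

    import math

    TARGET_V = 0.316  # -10 dBV, the 388's nominal operating level

    def error_db(measured_v, target_v=TARGET_V):
        # dB deviation of a measured RMS voltage from the target
        return 20.0 * math.log10(measured_v / target_v)

    print(error_db(0.318))  # ~ +0.05 dB: a couple of mV barely registers
    print(error_db(0.300))  # ~ -0.45 dB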

MG
 
Not sure if it's true RMS or not, but it is an auto-ranging digital model and reads down to one decimal place on the mV scale. Under the section on setting the recording level, the manual specifies using a signal at 400Hz and -10 dBV (which it states is 0.3V). I was able to get all channels calibrated to within just a couple of millivolts of one another.

What do you think?

MG
 
Brand and model of multimeter, pleez.

I haven't cal'ed my 388 before so you're well ahead of me as far as how to do it.

If the manual said 400Hz then obviously that's what you use.

I ask about the true RMS meter thing because regular non-RMS meters measure voltage differently...they measure the peak voltage, whereas RMS meters measure an average level that is more accurately representative of the audio energy. So a true RMS meter is going to read lower than a non-RMS meter...0.316 VAC peak is not the same as 0.316 VAC RMS. To complicate matters, most meters are not accurate over much of the audio spectrum...probably still good at 400Hz, and most are accurate at 1kHz, but for other tones typically used when calibrating (like 100Hz, and 10kHz and higher) they can become very inaccurate.

So what am I saying??

You've covered a really important part, and that's getting all the channels consistent with each other and the repro level consistent with the input level, but if you are using a non-RMS meter, you've dialed things in at less than -10 dBV. I can't recall the conversion, but my guess is that if you've set your meters up at 0.316 VAC (not RMS) and reproduced a 1kHz test tone from a 250 nWb/m tape, your meters would read around +2 VU. If somebody knows the conversion of 0.316 VAC at 400Hz to 0.316 VAC RMS at 400Hz, then you could at least compensate at that basic tone...your meter is likely accurate enough at that frequency, but when you start checking record frequency response and such (unless you have a true RMS meter), you'll be chasing phantoms because it won't be accurate.
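For what it's worth, for a pure sine those relationships are fixed ratios, so the conversions are easy to sketch (peak = RMS x the square root of 2; some non-RMS meters actually respond to the average-rectified value instead, which is yet another number):

    import math

    def sine_levels(v_rms):
        # for a pure sine: peak = RMS * sqrt(2), avg-rectified = peak * 2/pi
        v_peak = v_rms * math.sqrt(2.0)
        v_avg = v_peak * 2.0 / math.pi
        return v_peak, v_avg

    peak, avg = sine_levels(0.316)
    print(f"0.316 V RMS sine: peak {peak:.3f} V, avg-rectified {avg:.3f} V")
    # peak ~0.447 V, avg-rectified ~0.285 V

Note that those ratios don't depend on frequency for a pure sine...the frequency problem is the meter's own accuracy falling off toward the extremes, which no fixed conversion factor will fix.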

This is where having a hardware oscillator is nice because it outputs the tone you select at the level you dial in on the unit...no multimeter needed, but I do it the multimeter way...I have a Fluke 87 true RMS meter that is pretty accurate from 20Hz to 20kHz.
 
(Wow... as I read my last thread I'm struck by what an idiot I was back then... Life'll do that to ya I guess.)

:laughings:

Here it is five years later, and although I'm much more prepared now I'm back at it trying to dial in the bias (finally) on this 388 (as well as that of a friend).

So far we've run through all the calibration procedures preceding bias in the manual.

I'm sending in a 12.5 kHz tone from ProTools. I have a Heathkit IM-5258 Harmonic distortion analyzer and a Leader LBO-2060 analog scope.

I've done a good deal of reading about tape biasing. I think I've got it, and I know I should be seeing a drop in the dB level the machine is reproducing as the bias pot is turned up, with each iteration being: record the 12.5 kHz tone, play the tone back, turn the bias pot slightly clockwise... repeat.

The problem is that I have been plotting the values I'm obtaining as a basic X-Y graph in Excel (mV at the test points vs. dB output from tape), and I'm obtaining a LINEAR relationship! I've adjusted the bias pot pretty much through its entire range (from about 22 mV to 120 mV), and as I turn it clockwise (increasing the mV reading at the test points), the dB level upon tape playback decreases.
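For reference, here's the peaked shape I was expecting, written up as a little helper standing in for the Excel plot (the numbers are made up, and the 3 dB overbias figure is just a placeholder...everything I've read says the right drop depends on tape and speed):

    def bias_peak_and_overbias(points, drop_db=3.0):
        # points: (bias mV at test point, playback level dB), sorted by mV.
        # Find the output peak, then the first point on the high-bias side
        # that has fallen at least drop_db below it.
        peak = max(points, key=lambda p: p[1])
        for p in points:
            if p[0] > peak[0] and peak[1] - p[1] >= drop_db:
                return peak, p
        return peak, None

    # made-up data showing the classic rise-then-fall bias curve:
    measured = [(22, -12.0), (40, -10.5), (60, -9.8), (80, -10.4),
                (100, -11.9), (120, -13.1)]
    print(bias_peak_and_overbias(measured))
    # -> peak at (60, -9.8), 3 dB overbias point at (120, -13.1)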

:eek:

What the heck is going on there?

Additionally... I'm noticing that the manual (which I'm obviously trying to best with this method) calls for 120 mV at each test point (and that's the extent of its biasing procedure).

However... The range of the bias pots tops out right around 120 mV, so what's that all about?

I'm just wondering if I've made some weird oversight or what... what do your test points read normally? (For anyone willing to measure).

Thanks in advance for any knowledge you can serve up!

Matt
 
Sweetbeats may be the best resource on this one.

I've biased a 3 head deck, but generally have avoided it on 2-head decks, rightly or wrongly.

However, the "record, rewind, evaluate, adjust, repeat method, looking for that (X)dB drop on the meters" may not be a practical method on 2 head decks, and therefore the "put in rec-pause and measure TP(X) for (X)mV" method may be the most effective method overall, and was devised in product design to alleviate the obvious downfall of no 'off-tape monitoring' limitation of 2 head decks.

Maybe relevant or not, but what I've observed on 3-head decks is that the tiniest little nudge of the trimmer will move the "needle" a fair amount, so trying to dial this in basically 'blindly', as on a 2-head deck, would seem impractical and prohibitive.

Nowhere in the procedure does it reference any input tones, the tape, or the meters; it simply says that when record is invoked, the "bias section" test point should read 120 mV. I would follow that procedure and move on. I know that's not as satisfying as "seeing the needle peak and drop".

If Sweetbeats happens upon this thread, perhaps he can enlighten us. Failing that, go dig through his many-paged essay titled "388 Story", where he gives a great amount of detail on this and many other refurbishment procedures he went through while hot-rodding his former 388.

I admire you for attempting to tackle this methodically and with solid logic, based on what we've all read and know about biasing a 3 head deck, but at a certain point it's apples and oranges. The written procedure may be as close as it gets to "precision", in this case.

My only other random thoughts would be to use 10kHz, as 12.5kHz might be at the upper edge of the device's usable frequency response at 7.5ips, and to ensure the test signal is fed to the 388 at whatever verified input voltage is recommended, either 0.316V or whatever else (i.e., -20dBVU). Pardon me for pulling those random references out of a hat... but follow whatever trusted literature you have.

Meaning: verify the output level in volts from your PT rig, as well as the frequency of the test signal. If the reference voltage of the test signal is off or unknown, all bets for accuracy are off.

I'm sorry I could not be of more help than these comments.

:spank::eek:;)

PS: If you have any breakthrough or revelation, please drop in and give us an update.
 