Best bass rig ideas

Have tried the DI thing. I'm all too happy with my Carvin head and 15" cab, with either my Thunderbird or my Schecter 5-string, and my SansAmp in the middle for some snarl. Recorded with a 57. Love my tone.
 
Wow. Another SansAmp DI recommendation. I might have missed something in the past and will need to look at the SansAmp.

Carvin makes nice gear.

As a dedicated ADA user, I believe in their sound. I trust it. To get a NEW modern sound with 'nothing' is an exciting promise. I also need to spend some time with those VSTs in the Apollo. They have an Ampeg SVT demo.
 
ADA? Are you talking about the solid-state preamps? That brings me back to the '80s....

Again, these days I mostly use PSP Vintage Warmer in the box. It gives a cool drive to bass guitar. I have a SansAmp and an Eden WTDI. Both are kind of noisy, but they're kinda cool when added to a clean direct signal.

Sometimes a mic on a bass rig is also cool to mix in. It just depends on the band and the player. Always keep the direct signal.
 
ADA had full page centerfold ads in every guitar magazine. Back covers too.

So you are aware their preamps have tubes.

Only 12AX7 preamp tubes back in the day. My guitar players then used tube power amps after the pre. Still, an old-school sound that did seem to work.

More all-tube amps for modern rock music. It was that way back in the day as well. Whatever works. Not judging. :)
 
The ADA T100S is the tube power section for the MP-1/MP-1 Classic/MP-2 tube preamps. The MicroTube 200 was one of the first hybrid amplifiers, beating the Marshall Valvestate and Fender Cyber to market. The MicroFET 100 was the high-fidelity solid-state power amp, with an extended frequency response out to 20-32 kHz.

They are extremely dependable and have NO knobs to break off or pots to get scratchy. Gig after gig they worked.

Digital control of an all-analog signal path.
 
That being said, many try to work on individual tones separately without knowing how the mix works together. Anyone can tell you how to get a tone for an instrument, but it won't always, or even usually, work in a given mix. It takes all of the variables combining together; it's not just one instrument that makes a composition work. Once you have a good mix, then you can go into the individual sounds and find what works best within the mix.

Nobody ever knows the best bass/drum/guitar/vocal sound without all of them together. Shit does not mix itself on its own tone alone. It is a combination of all.

^
^
^
This.
 
There might not be any difference between a hit and a miss song. Every one's a winner and every one's a loser. The tools in the DAW are seemingly just as powerful as the studio's.

That one song I completed, 111b, I added the cymbals back. The file became much bigger. Now remember, I put this together in a very short amount of time with one-take Charlies, so I did not correct voices, EQ, or add effects. Just the instruments as played, with a P420 and an SM58. I am using different render settings; the CBR (constant bitrate) was changed to maximum. If you find my 'render problem thread', there are pictures of the screen showing what I did.

I agree that every remix is different, but you can repeat proven techniques and make the methods work in your project.

With cymbals, and normalized to -14 LUFS as the YouTube standard. Is this better? I included a NOT normalized -16 LUFS version of the same WAV. I was going for an "All She Wants to Do Is Dance" Don Henley type of synth bass line.
[attached screenshot: Annotation 2020-01-28 083254.jpg]
 

Attachments

  • 111bc-006.mp3 (7.1 MB)
  • 111b15LUFS.mp3 (7 MB)
Am I supposed to normalize?

Nope.

Basically, the need for 'normalize' is, in my experience, only there for something like transferring old un-mastered cassette tapes to digital, making sure the peaks don't go over a certain level. It does not limit or compress the file; it just finds the peak and scales everything by one gain amount so that peak lands at the level you set.
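
For what it's worth, that one-gain-change behavior is simple enough to sketch. A minimal Python example of the idea (numpy + soundfile; the filenames are made up):

```python
import numpy as np
import soundfile as sf

def peak_normalize(data: np.ndarray, target_db: float = -1.0) -> np.ndarray:
    """One static gain change: scale the file so its loudest sample
    hits target_db (dBFS). No limiting or compression involved."""
    peak = np.max(np.abs(data))
    if peak == 0:
        return data  # silent file, nothing to scale
    return data * (10 ** (target_db / 20) / peak)

# e.g. a cassette transfer (hypothetical filename)
data, rate = sf.read("cassette_transfer.wav")
sf.write("cassette_transfer_norm.wav", peak_normalize(data), rate)
```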

I have never used 'normalize' in a music recording situation. There's no reason for you to be using it. The fader works just fine and does the same thing.
 
OK, but if I normalize every track, I know what value they are at. When I render the WAV tracks to MP3, faders at 0, it looks like I get about a 10 dB loss. Is this what happens for everyone? Could you check your files with the analyzer? The more I compensate for the render loss with the main slider, the worse the ambient compression-echo phenomenon gets.
 
As of 2020 I went and added a Radial JDI for ease of plugging in and laying down a good-sounding bass track.
I still have the real-time amp sims and all that POD Farm offers, but a lot of the time I don't want to be bothered with plugins, and the JDI is great for that creative moment without suffering gear-setup buzzkill.

I'm not a bass player as in a real bass player, but I do like having a Jazz and a P-Bass. I planned to have one or the other, but I really enjoy the different necks etc., or even using both on a demo tune.

As for normalizing, just in my perception, isn't that getting rid of "feel" somewhat? Like playing drums, you might hit the snare hard and ride the cymbal with a light touch, but if you normalize all the tracks, wouldn't the cymbal all of a sudden become louder and not have the dynamics of the player? (Rough numbers on that in the sketch below.)
The tape-hiss crap is gone, so there's no reason to fill up the space, is there? ...I don't think so.
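
A rough back-of-the-envelope demo of that worry, with made-up peak values:

```python
import numpy as np

# Made-up peaks: the player hit the snare hard, rode the cymbal lightly.
snare_peak, cymbal_peak = 0.90, 0.30
target = 1.0  # peak-normalize each track to full scale

snare_gain_db = 20 * np.log10(target / snare_peak)    # ~0.9 dB
cymbal_gain_db = 20 * np.log10(target / cymbal_peak)  # ~10.5 dB

# The cymbal comes up roughly 9.5 dB relative to the snare:
# the light touch the player performed is gone.
print(f"snare +{snare_gain_db:.1f} dB, cymbal +{cymbal_gain_db:.1f} dB")
```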
 
but if you normalize all the tracks, wouldn't the cymbal all of a sudden become louder and not have the dynamics of the player?

OK, so that is going to be a problem in digital land? I think I saved the project with the tracks already normalized... shit. Post #11 shows the normalized tracks, and it was saved that way.


If I do not use normalize to get it to the -14 standard, I will be blindly moving the main slider up and down till I get an MP3 near -14 LUFS. It would be impossible to make it exactly -14 LUFS, because the levels drop 8-10 dB on the render.

To be clear, I am not using a limiter. No VSTs yet, besides the drums VST and EZkeys or whatever. I am NOT applying anything to the mains except the loudness analyzer from the action tab.
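
For landing exactly on -14 without the blind slider-guessing, measuring and correcting in a script is one option. A hedged sketch using the pyloudnorm library (the filenames here are hypothetical):

```python
import soundfile as sf
import pyloudnorm as pyln

data, rate = sf.read("111b_mix.wav")        # hypothetical rendered WAV

meter = pyln.Meter(rate)                     # ITU-R BS.1770 loudness meter
loudness = meter.integrated_loudness(data)   # measured integrated LUFS
print(f"measured: {loudness:.1f} LUFS")

# One static gain change to land exactly on the -14 LUFS target:
normalized = pyln.normalize.loudness(data, loudness, -14.0)
sf.write("111b_mix_-14LUFS.wav", normalized, rate)
```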
 

Well then you need to work harder to learn how to use your DAW. It isn't an easy thing. That is why people get paid to record and produce music. There is no single button that makes everything perfect.
 
Have you experienced a difference in latency recording bass versus guitar? If I start with a MIDI-quantized keys track, it takes more nudging to align the bass than the guitar.

Red is Bass
Orange is Guitar

[attached screenshot: Annotation 2020-03-11 194032.jpg]

Ooops. Ruined it. Chopped it up and lost my place.
 

Attachments

  • ifyouleave-002.mp3 (1.6 MB)
I can't see any reason that bass would be any different from guitar. Maybe you're slower at hitting the bass notes vs. the guitar. In the scheme of things, there is absolutely no difference between a 50 Hz note and a 500 Hz note for the computer and interface. It really doesn't even know that it's a note. To the computer, it's just 1s and 0s.
 
My thinking was some sort of non-linear latency compensation, but that looks to be disabled in the options menu. The instruments never layer properly, so I nudge and try to fix the delay. IfyouleaveX is out of time pretty badly.

The top says 10.4/3.8 for latency in Reaper. So should I fill that area out in the options tab?

[attached screenshot: Annotation 2020-03-12 033445.jpg]
 
I have no idea what a "non-linear latency compensation" would mean.

I know that when I first got Reaper, I tried to adjust those parameters and ended up making things worse. I shifted it back to using the system-reported latency.

The compensation works by actually doing a "nudge" by the reported amount. It understands that there's a delay between the time the computer outputs a note and the time you theoretically hear it and can respond, so it calculates the delay and moves the track back by that amount.
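
In other words, something like this sketch (numpy; the function name is made up, and the 10.4 ms is just the value reported above):

```python
import numpy as np

def compensate(recorded: np.ndarray, latency_ms: float, rate: int) -> np.ndarray:
    """Nudge a freshly recorded take earlier by the reported round-trip
    latency so it lines up with what was playing back."""
    shift = int(round(latency_ms / 1000 * rate))
    return recorded[shift:]  # drop the latency-sized gap at the front

# 10.4 ms at 44.1 kHz works out to a nudge of about 459 samples
take = np.zeros(44100)  # stand-in for a recorded track
aligned = compensate(take, 10.4, 44100)
```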

If you want to do the full LoopBack test and set it manually, follow these instructions.
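
The gist of the manual loopback test, sketched in numpy (all names and values here are assumptions, not Reaper's actual code): play a click out of an output, cable it straight back into an input, record it, and cross-correlate to find the true round-trip delay.

```python
import numpy as np

def measure_latency_ms(sent: np.ndarray, recorded: np.ndarray, rate: int) -> float:
    """Cross-correlate the recorded signal against the one that was played;
    the correlation peak gives the round-trip delay in samples."""
    corr = np.correlate(recorded, sent, mode="full")
    lag = int(np.argmax(corr)) - (len(sent) - 1)
    return 1000.0 * lag / rate

# Quick self-test: a click delayed by 459 samples at 44.1 kHz
rate = 44100
sent = np.zeros(rate); sent[0] = 1.0
recorded = np.concatenate([np.zeros(459), sent])[:rate]
print(measure_latency_ms(sent, recorded, rate))  # ~10.4 ms
```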

 
Yeah, my tracks don't line up like that at all.

[attached screenshot: 20200313_033039.jpg]

I was told that 10 ms is not noticeable, but it REALLY is.

My ping shows 0, so either it does not work or it's that awesome. I don't know. I don't have computer skills. So if it don't work, it don't work.
[attached screenshot: pingzero.jpg]
 