Midi ins and outs

  • Thread starter: Paul881
I am trying to understand more about MIDI on my HS2002.

In Options > MIDI Devices, I have two input options:

SB Live! In [B000]
TTS Virtual Piano in

I have five output options:

A:SB Live!Synth [B000]
B: Live! Synth [B000]
Midi Mapper
Creative S/W synth
SB Live! Midi Out [B000]

Currently, I have the TTS VP and A:SB Live! Midi out selected.

I can understand why I have selected my TTS as the input. It's because I use the Virtual Piano to input my SoundFonts onto my clips/tracks.

But I don't understand the A and B channels of the outputs. Why are there two channels? Why shouldn't I select both, and what are the implications? Plus, what are the MIDI Mapper and the Creative S/W Synth?

Why can't all the inputs and outputs be selected if they are all valid to be used within Sonar?

I just need to understand what is going on here so I can understand what choices I am making, so I would be grateful for any support/help. Thanks.
 
You could select them all, but why should you if you don't need them?
 
Paul881,

The A and B synths are two separate devices, each capable of responding to 16 MIDI channels. You should probably select them both. Only one, though, can be used to play Sound Fonts, in which case its name in the list of devices in the Track Properties changes to Sound Font Device...

I think the new Sound Blaster Audigy might allow you to load both A and B with Sound Fonts but I'm not sure.

The Creative SW Synth is pretty useless: another 16 channels that address a software synth that comes along with the LiveWare software. Its latency makes it unusable for serious purposes. If I remember correctly, the Cakewalk release notes for specific sound cards suggest not using it.

The MIDI Mapper is a Windows device to allow MIDI messages to be intercepted and remapped to another destination. Don't bother using it unless you have some specific reason to. I'm not even sure what utility it has.

Finally, the MIDI Out is the MIDI Out port on the MIDI interface. If you send MIDI data to this, the messages go out to an external device.
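To make "MIDI messages" a little more concrete: each message is just a couple of bytes on the wire. A minimal sketch in Python of building a Note On by hand (the channel, note, and velocity values are only illustrative):

```python
def note_on(channel, note, velocity):
    """Build a raw 3-byte MIDI Note On: status (0x90 + channel), note, velocity."""
    if not (0 <= channel <= 15 and 0 <= note <= 127 and 0 <= velocity <= 127):
        raise ValueError("value out of MIDI range")
    # The low nibble of the status byte carries the channel (0-15,
    # shown as 1-16 in most software).
    return bytes([0x90 | channel, note, velocity])

msg = note_on(0, 60, 100)  # middle C on channel 1, medium-loud
```

Send those three bytes to the MIDI Out port and the external synth on that channel plays the note.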
 
Many thanks for the information, AlChuck. At least now I can understand some of what I'm doing with midi!
BTW, is this a late night or an early morning? Do you ever get to sleep?
 
It was before midnight for me... and now it's after 7:30 am. I've been mostly sleeping in between...
 
AlChuck,

> The A and B synths are two separate devices ... Only one, though, can be used to play Sound Fonts, in which case its name in the list of devices in the Track Properties changes to Sound Font Device... <

Not so! Both the A and B synths can play whatever SoundFonts are loaded into memory. The Audigy lets you load SoundFonts for the A and B synths separately (a useless feature IMO), but an SB Live absolutely can play all instruments from both synths. I don't know why Sonar shows the name "SoundFont device" for A only by default, but you can change that in Instruments so it knows that both synths are SoundFont devices. That's what I did.

--Ethan
 
Well today I have finally sussed what is loaded where and why in terms of midi....nearly.

All I have to do now is understand why there are some instruments called "generic MIDI 0-127" and "generic MIDI 1 to 128" and "General MIDI" and... and... and... What is all that about? And why does my MIDI still play when I choose "none" for the channel in Track Properties?
 
Ethan,

Thanks for enlightening me. I assumed it couldn't because of the fact that A got labeled that way alone.

So far I had not encountered any limitation because my recordings tend to be track-sparse -- drums, bass, keys, sometimes some melody parts on this or that, and an occasional pad... and I usually replace the bass with real electric bass, and record audio guitar parts. So I never felt constrained by my assumption of A only being Sound Font-able...

Now when I take that orchestration class I'll be all set!

Paul881,

Re the "generic Midi 0-127" and "generic midi 1 to 128" and "general Midi"...

The first two are really the same -- there are 128 possible settings for MIDI parameters. Some manufacturers count from 0-127; others count from 1-128. If you used an instrument definition of one type for an instrument of the other type, your messages would be off by exactly one.
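The off-by-one trap can be sketched in a few lines of Python. The byte actually sent in a MIDI Program Change is always 0-127; a 1-128 convention is just documentation, so its numbers must be shifted down by one before hitting the wire (the patch numbers below are illustrative):

```python
def program_byte(patch, one_based=False):
    """Return the 0-127 Program Change data byte for a documented patch number."""
    byte = patch - 1 if one_based else patch
    if not 0 <= byte <= 127:
        raise ValueError("patch number out of range")
    return byte

# A synth documented 1-128 calls "patch 12" the same sound that a
# 0-127 synth calls "patch 11" -- both send the data byte 11:
same = program_byte(12, one_based=True) == program_byte(11)
```

Pick the wrong instrument definition and every patch you choose comes out as its neighbor.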

General MIDI is an agreed set of standards that further defines what a MIDI device responds to. The "regular" MIDI standard defines what the messages are, what their structure is, and what the message types are, but it does not at all specify things like whether or not the device has reverb, or how many voices it has, or which instrument sounds are labeled by which patch number. Hence if you had synth 1 and your friend had synth 2, you could send him your MIDI sequence and dollars-to-donuts he would have to get a list from you telling him that patch #12 was a vibraphone, patch #7 a fretless bass, and so on, or he would have no way of knowing which part was supposed to be played by which instrument, and he would have to remap the patches to what his synth had...

General MIDI added an extra layer of compatibility to compliant devices, one of which is a standard instrument set of 128 voices, with the same instrument type identified by the same patch number. For example, #1 (or #0) is an acoustic piano patch, regardless of who makes it, if it's GM-compliant. This makes exchanging sequences much more straightforward and eliminates or reduces mapping issues.
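The GM idea can be sketched as a shared lookup table plus a Program Change message. The few names below are from the General MIDI instrument map (0-based program numbers as they appear on the wire), but this is an illustrative subset, not the full 128:

```python
# A handful of General MIDI program numbers (0-based) and their names.
GM_PATCHES = {
    0: "Acoustic Grand Piano",
    11: "Vibraphone",
    24: "Acoustic Guitar (nylon)",
    33: "Electric Bass (finger)",
    40: "Violin",
    56: "Trumpet",
}

def program_change(channel, program):
    """Build the 2-byte MIDI Program Change: status (0xC0 + channel), program."""
    if not (0 <= channel <= 15 and 0 <= program <= 127):
        raise ValueError("value out of MIDI range")
    return bytes([0xC0 | channel, program])

# On any GM-compliant synth, this selects a vibraphone on channel 1,
# no matter who made the synth:
msg = program_change(0, 11)
```

Because every GM device shares the same table, the sequence plays with the intended instruments without any remapping.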

There's more to it than that, but that'll get you started. Read a good MIDI overview book or article.
 
AlChuck, thanks for that. You would have thought that all the manufacturers would have agreed on a common standard.

A1MixMan has just posted a basic MIDI tutorial on this forum, so I'll read that.

Thanks again for the help.
 