Can someone help me understand midi?


copperandstars

New member
Hello there,

I am lost on this midi stuff. I also posted this thread in the Cubase section because it applies to both. Here is my situation:

I use Cubase SX for recording audio. I have never goofed around with anything else, until today. I clicked some things and brought up a virtual synth. I was quite amused...there was a little mini keyboard and I could click the tiny keys with my mouse and get cool sounds. I could even select different sounds...hundreds of them.

I have known about midi for a while but I do not quite understand it. Here is what I would like to accomplish: I would like to be able to play the software VST instrument in Cubase SX with a real keyboard. I have been looking around at these, I guess they are called midi controllers? Then I would like to be able to record what I play on the keyboard through the synth onto a track in SX, and mix them with my audio tracks. I have no clue how to do this...I have looked through both the HTML and actual manuals with no luck.

Can someone please explain to me this process? Any help is extremely appreciated.

Ryan
 
If you read the manual and help stuff and don't get it, I don't know how I'm gonna be very helpful...

The basic idea is that MIDI is a collection of messages that are sent between MIDI-compliant devices that tell the receiving MIDI instrument what to play and when. They can be recorded as a so-called MIDI sequence in an application like Cubase and therefore played back at will, and also edited and otherwise manipulated.

MIDI information has a source -- the keyboard -- and a destination -- the MIDI instrument.
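To make the "collection of messages" idea concrete, here is a minimal sketch of what two of those messages look like on the wire, following the standard MIDI 1.0 byte layout (a Note On is a status byte carrying the channel, followed by a note number and a velocity):

```python
# Sketch of raw MIDI 1.0 channel messages: one status byte plus data bytes.
# Channels are numbered 0-15 on the wire (shown as 1-16 in most software).

def note_on(channel, note, velocity):
    """Build a 3-byte Note On message (status 0x90 + channel)."""
    return bytes([0x90 | channel, note, velocity])

def note_off(channel, note):
    """Note Off (status 0x80); release velocity is conventionally 64."""
    return bytes([0x80 | channel, note, 64])

# Middle C (note 60) struck fairly hard on channel 1 (wire channel 0):
msg = note_on(0, 60, 100)
print(msg.hex())  # -> "903c64"
```

The keyboard sends bytes like these; the sequencer just records them with timestamps, which is why the data is so compact and so easy to edit afterwards.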

In order to physically route MIDI messages from a keyboard into the computer, you need a MIDI interface. This allows Cubase to see and accept these messages when you set up a track to record them.

The track also routes the message out to its destination -- in your case, you want to play a VSTi. I don't have Cubase, but it can't be too different from other applications. The track properties are set up to receive the MIDI data from the interface and to send it to the MIDI device for playback. You choose the MIDI interface and the channel it's transmitting its info on for the source, and the VSTi and the bank and patch that you want for the destination. If your instrument was an external synth, you'd send it out the MIDI interface's Out port and connect that to the external box's MIDI In.

To mix the sounds played by the VSTi with any audio you recorded, you have to record it as audio. There is usually a simple way to get the VSTi tracks "rendered" as WAV files like the other files. Basically this just means the sound from the synth is routed into the WAV device and recorded just like anything else routed to the WAV device can be. Once this is accomplished, they can be mixed with the other audio tracks to a stereo master file for use in burning to CD or encoding into MP3s or other forms.
 
Remember, midi is not sound data; it is event data over time.

Specifically, midi captures keyboard hits, key pressure, bender information, and modulation information, all of it from the keyboard.

Once the data is stored in a track, playback sends the events to a sound source: an external synth or sound module, or an internal source such as a Soundblaster card, a soft synth, or a Soundblaster emulator that stores soundfonts.

These sound sources take the midi data events and play back the sound, using the key pressure, key note, modulation, etc. to recreate your original performance.
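The "events over time" idea can be sketched as a list of timestamped messages that a sequencer simply steps through in order (a toy illustration, not any real sequencer's format):

```python
# A toy MIDI-style event list: (time_in_beats, event, note, velocity).
events = [
    (0.0, "note_on",  60, 100),
    (1.0, "note_off", 60, 0),
    (1.0, "note_on",  64, 90),
    (2.0, "note_off", 64, 0),
]

# "Playback" just walks the list in time order and hands each event to a
# sound source -- no audio is stored, only the performance gestures.
for t, kind, note, vel in sorted(events):
    print(f"{t:>4} {kind:8} note={note} vel={vel}")
```

Because only gestures are stored, you can later change the tempo, transpose the notes, or point the same events at a completely different sound without re-recording anything.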

Not to confuse you, but all of the above applies to using a keyboard to store events to trigger a sound source. You can also create midi data that triggers non-sound-source devices. This kind of midi data is simply switching data, used to tell external devices to change from one setting to another. I use this on my external reverb unit, a TC Electronic M1, to make it change from one reverb setting to another on the fly while my midi track is playing. I can put a long, deep reverb on a vocal for the chorus and verse and a short delay on the voice for the bridge, all triggered from midi commands stored in my software.

That's the 1000-foot view of midi. It can get much more complicated when you get down into controller messages and system exclusive commands.
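The reverb-switching trick described above is just a Program Change message, and the controller messages mentioned here are Control Change messages. Both have fixed byte layouts in the MIDI 1.0 spec, sketched below:

```python
# Program Change (status 0xC0) selects a patch on the receiving device.
# Control Change (status 0xB0) sets a controller, e.g. mod wheel (CC 1)
# or channel volume (CC 7). Low nibble of the status byte is the channel.

def program_change(channel, program):
    """2-byte message: switch the device on `channel` to patch `program` (0-127)."""
    return bytes([0xC0 | channel, program])

def control_change(channel, controller, value):
    """3-byte message: set `controller` to `value` (both 0-127)."""
    return bytes([0xB0 | channel, controller, value])

# Tell whatever is listening on channel 2 (wire channel 1) to switch to
# patch 5 -- e.g. an external effects unit changing reverb programs:
print(program_change(1, 5).hex())  # -> "c105"
```

A sequencer stores a message like this at a given time position, so the patch change fires automatically at that spot during playback.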
 
More questions about midi

Being a career IT person and a lifelong musician you'd think I'd know more about midi. But I don't.

I'd like to be able to create my own drum tracks in Cakewalk. I've done some neat things with a Roland TR707 and I'm sure I can get something usable in Cakewalk.

But I want some better drum sounds. What confuses me are all the options available with sources, ports and instrument patches. I imported a number of .ins files from a Cakewalk directory but they all sound the same. Lots of different note names, but only one set of sounds. What am I missing?

I also don't understand why the reverb effect in the Cakewalk's track strip does not work. Someone told me the fx only apply to audio, not midi. Tell me it ain't so! There's no way to apply digital reverb to digital information! I'm missing a vital part of the picture. Can you help?
 
Midi, once again, is only event data.

Cakewalk ins files are merely text files for different sound sources.

A sound source is one of the following:

- The sounds in a chip located on a sound card, such as a Soundblaster.
- A soundfont loaded into memory on a Soundblaster.
- A soundfont loaded into a Soundblaster emulator, i.e. it resides in actual computer memory but contains sounds. No Soundblaster is needed in this case.
- An external sound module.

Thus you could load ins files all day long and it's still referencing the same sounds, in this case probably your soundcard.

The ins files are for the variety of soft synths and hardware synths on the market. Each one puts its violins, guitars, etc. in a different bank location, and thus you need the ins file to tell you what is where. The ins files themselves have no sounds attached to them.
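The idea can be sketched as a lookup table. The synth names and patch assignments below are made up for illustration (real .ins files use Cakewalk's own text format), but they show why the same program number needs a per-device name map:

```python
# Hypothetical patch maps for two different synths. The same MIDI program
# number points at a different sound on each device -- which is exactly
# why Cakewalk needs an instrument definition file per synth.
PATCH_MAPS = {
    "SynthA": {0: "Grand Piano", 1: "Bright Piano", 40: "Violin"},
    "SynthB": {0: "Saw Lead", 1: "Square Lead", 40: "Choir Pad"},
}

def patch_name(device, program):
    """Look up the human-readable name for a program number on a device."""
    return PATCH_MAPS[device].get(program, f"Program {program}")

print(patch_name("SynthA", 40))  # -> Violin
print(patch_name("SynthB", 40))  # -> Choir Pad
```

Loading a different ins file swaps the name table, not the sounds; the sounds always live in whatever device actually receives the program number.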

Reverb is either part of your sound source or it is not; if it is not turned on, you won't hear it. Most sound cards do allow for reverb and chorus, though, so it may just be a matter of turning it on so Cakewalk can use it.

Regarding midi effects, there are some but they are limited. Basically these use midi data to try to coax a sound effect. I have never found one midi effect useful in a mix. Delay is about the best one you can find; the rest are interesting but not applicable, in my opinion.

What a majority of users do in Cakewalk is record the midi track into an audio track and then apply audio effects. This also gives you the ability to EQ the midi audio track, something that is very necessary to make things fit in a mix.
 
Re: More questions about midi

EddieRay said:
What confuses me are all the options available with sources, ports and instrument patches.

The sources can also be various sources of stored MIDI code, as in external sequencers, software sequencers or those embedded into some keyboard controller.

Ports refer to a collection of 16 MIDI channels which can transmit simultaneously along a MIDI cable, sharing the available bandwidth of the medium as it is a serial protocol.

If you need more than 16 channels you need more than one port.

Multiport MIDI interfaces are how this is accomplished.
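The 16-channel limit falls straight out of the message format: a channel message's status byte carries the channel in its low four bits, so one cable can only address 16 channels, and anything beyond that has to go to another port. A rough sketch of mapping a flat channel index onto (port, channel) pairs:

```python
# One MIDI port = 16 channels, because the channel lives in the low
# nibble (4 bits) of each status byte. More channels => more ports.
CHANNELS_PER_PORT = 16

def route(flat_channel):
    """Map a 0-based global channel index to (port, channel-within-port)."""
    return divmod(flat_channel, CHANNELS_PER_PORT)

print(route(0))   # -> (0, 0): first channel on the first port
print(route(17))  # -> (1, 1): second channel on the second port
```

A multiport interface just exposes several of these independent 16-channel streams to the sequencer at once.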

Patches usually come in GM-compliant sets but wacky collections of sounds also get grouped into one patch set which names 128 sounds. GM-compliant means that the same number stands for the same instrument. e.g. 32 being stand-up bass.

Drums work a little differently in that the "pitch" data on a drum channel (typically channel 10) defines which drum sound (out of a drum set) is called out.
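A few entries from the standard General MIDI percussion key map make this concrete (these note-to-drum assignments are part of the GM spec):

```python
# Excerpt from the General MIDI percussion key map, used on the drum
# channel (channel 10 in 1-based numbering):
GM_DRUMS = {
    35: "Acoustic Bass Drum",
    36: "Bass Drum 1",
    38: "Acoustic Snare",
    42: "Closed Hi-Hat",
    46: "Open Hi-Hat",
    49: "Crash Cymbal 1",
}

# So a Note On with note number 38 on the drum channel means "hit the
# snare", not "play that pitch":
print(GM_DRUMS[38])  # -> Acoustic Snare
```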
 
Being a career IT person and a lifelong musician you'd think I'd know more about midi. But I don't.

Why? Just because it's something you can use with a computer? I've never seen an IT department anywhere that has anything whatsoever to do with MIDI in their day-to-day work. Granted, it is a computer messaging protocol, so in some ways it's like TCP/IP, but the analogy breaks down fast. Perhaps the closest everyday analog to it is whatever scheme is used to interpret keystrokes from a computer keyboard. The computer listens to the keyboard's port. When you hit the letter "S," for example, the keycode for this letter is sent by the keyboard's onboard chip. The computer will react in different ways depending on the context. If you are just displaying the desktop, for example, nothing happens. If you're in a Word document or email composition window, the letter S appears at the next position. If you're in a dialog box it might or might not do something else.


Cakewalk ins files are merely text files for different sound sources.

To amplify this a bit, Cakewalk's ins files, or instrument definitions, are text files that describe how particular MIDI instruments respond to MIDI messages. They are needed because the MIDI specification has some gray areas -- it specifies the form and types of messages, but it does not specify that, for example, a program change 12 means switch to a piano sound. Some devices also "count" from 0-127 while others "count" from 1-128. The INS files spell out these details so that Cakewalk can communicate properly with the device with a minimum of fuss on your part. So, if you have a synth that has a pre-existing ins file, you use that file, and Cakewalk shows you the correct patch names for that instrument, and the correct commands corresponding to them, so you don't have to get the synth manual and read the manufacturer's MIDI implementation spec for yourself.

There is an extension of the MIDI Spec called General MIDI that narrows down some of these gray areas a bit. The main thing about the General MIDI specification is that the instrument sounds and their patch numbers are agreed upon. So any GM-compliant synth will respond with a piano sound if you choose program #1, or a fretless bass if you choose program #36, or an oboe with program # whatever. This makes it a little easier to make MIDI sequence files portable from one MIDI setup to another -- without this, if you gave your sequence to a colleague who had different MIDI instruments, he might have to change a lot of patch-change messages to get an approximation of what your sequence sounded like -- there would be no guarantee that track one on your system selecting a French horn would be played by a French horn patch or anything even similar if he opened the file in his sequencer.
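The off-by-one counting issue mentioned above is easy to demonstrate: GM patch lists are printed 1-128 (patch 1 = Acoustic Grand Piano), while the data byte in the actual Program Change message runs 0-127, so a device that "counts from 1" expects the byte to be one less than the printed number:

```python
# GM patch lists are 1-based on paper, but the Program Change data byte
# on the wire is 0-based -- a classic source of "wrong patch" confusion.
GM_NAMES = {1: "Acoustic Grand Piano", 33: "Acoustic Bass", 57: "Trumpet"}

def program_byte(printed_patch_number):
    """Convert a 1-based printed GM patch number to the 0-based wire byte."""
    return printed_patch_number - 1

print(program_byte(1))  # -> 0
print(GM_NAMES[33])     # -> Acoustic Bass
```

An instrument definition file records which convention a given synth uses, so the sequencer can do this conversion for you silently.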

There is also the way that drum patches are handled in General MIDI, as drstawl mentioned, that makes this kind of file sharing far more straightforward than it would be otherwise.
 
Maybe this will help.

First of all, I'm not as literate as most of you wiz kids, but I'll try anyway.
Just go through the manual and check things out step by step, just to get through the frustration period first. Have a friend try to help.
In the back of the manuals usually are all the different instruments. Learn about editing and that should give you a hint about set-ups. You will need to know set-ups and editing before you can even think about linking different midi devices together.

Checking out the makers' web sites, you might find helpful hints and solutions.
Good luck.
Kevin
 
Although I don't use Cubase, in Logic the way you use such soft synths is through a feature called audio instruments. It is a type of recording channel found in the arrange window (or any other window, for that matter), but you select the synth as an insert. So it would seem the only MIDI is the triggering; the rest is audio based, which causes the problem with such devices draining processing power. So maybe try to find a similar thing in Cubase, or just get Logic!
 