MIDI for Dummies

  • Thread starter: Chater-La
I need to make at least some of my own sounds, and organize those sounds my own way. I have no idea how to get any sort of control over GMIs, soundfonts and soundbanks. So I end up, for instance, with a song in my head but having to dig through hundreds of preset sounds, or building something around whatever soundfonts or samples I can scrounge from some chaotic collection of folders.

Then I tried tweaking some WAV files and copy-pasting them into Reaper / changing the pitch.

I am extremely limited at this time.
 
1. Introduction
For some reason this post was made a sticky. I'm going to try and make this thread more useful. There are three basic ways to communicate MIDI to and from your computer:

1. FireWire (aka IEEE 1394)
2. USB
3. Internal sound card with a MIDI (DB-15) connector

I'm going to focus on the internal sound card because that is where the interface began. Admittedly, this is legacy territory and the other two paths are the current state of the art, but in my opinion the PC card was never brought to its full potential. Maybe we can change that in this thread. All MIDI requires rendering, which roughly means making sounds from the MIDI event descriptions (note, duration, velocity and so on). Most sound cards have chips which render MIDI, and some have the additional capability of direct memory access (DMA). DMA allows very low latency, which is arguably more important than bandwidth for MIDI because the data is so compact. Let's have a look at interface bandwidth:

ISA (16MB/s)
PCI (133MB/s theoretical ~60MB/s real)
USB (1.5MB/s)
USB 2.0 (60MB/s theoretical ~45MB/s real)
USB 3.x (approaching/surpassing GB/s theoretical)

What you will note is that even lowly first-generation USB has enough bandwidth for single-channel audio I/O (a quick back-of-the-envelope calculation follows the connection list). The connections of the interfaces are described below:

PC card (PCI or ISA) uses a DB-15 male connector with a breakout cable to 5-pin DIN MIDI plugs, ~$50
USB uses a standard 4-conductor cable with various ends, ~$10
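To put those bandwidth numbers in perspective, here is a rough back-of-the-envelope comparison. It is only a sketch in Python, using the standard figures for DIN MIDI and CD-quality audio; the point is the orders of magnitude, not exact throughput:

    # Rough bandwidth arithmetic: a DIN MIDI stream vs. one channel of audio vs. USB 1.x.
    # MIDI 1.0 over DIN runs at 31,250 baud with 10 bits per byte on the wire
    # (start + 8 data + stop); a note-on message is 3 bytes.
    midi_bytes_per_sec = 31_250 / 10          # ~3,125 bytes/s, the ceiling for DIN MIDI

    # One channel of CD-quality audio: 44,100 samples/s at 16 bits per sample.
    audio_bytes_per_sec = 44_100 * 2          # 88,200 bytes/s

    usb1_bytes_per_sec = 1_500_000            # ~1.5 MB/s, from the table above

    print(f"DIN MIDI stream:    {midi_bytes_per_sec / 1e3:7.1f} kB/s")
    print(f"Mono 16/44.1 audio: {audio_bytes_per_sec / 1e3:7.1f} kB/s")
    print(f"USB 1.x budget:     {usb1_bytes_per_sec / 1e3:7.1f} kB/s")
    # Even USB 1.x has an order of magnitude more headroom than one audio channel
    # needs, which is why latency, not bandwidth, is usually the bottleneck.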

In the next installment, I will attempt to explain MIDI rendering and where it takes place.

2. Rendering
In this segment we will briefly go over the rendering, or fleshing out, of MIDI data. The Audiofanzine link in the first post gives some good analogies of MIDI to actual sound, so if you don't understand the basic concept of MIDI you should read that first. Rendering is done by a processor, usually a dedicated one but not necessarily. As chip integration keeps marching forward, we find these synthesizing functions built into multi-function chips. The only way to know the capability of your particular chip is to become familiar with its specs, and this is not always as simple as it should be. Once you know your chips and the synth functions they support, the next thing is to know where they are, because where they sit will affect how useful they are for a particular task.

In the part above this one, I highlighted the common MIDI recording interfaces and their bandwidth. What I did not include was the processing latency. This parameter is critical in certain situations and depends on both your particular hardware and software configuration, along with the limitations of each. For example, an ISA card that cannot do DMA at the hardware level may have the necessary bandwidth for 8-in/8-out channels but will be severely limited by its abysmal latency. Conversely, an ISA card may have excellent DMA hardware but poor driver software preventing it from performing optimally. Drivers are a concern for all interfaces, both internal and external.

Okay, so where is your synth rendering done? If you have an internal sound card, it will be on that card. It can also be part of the supporting chipset on your motherboard, but that is not typical; even when the chipset has the hardware, there is usually no driver support to make it functional. Nowadays the rendering is done in an external box connected to the PC with a cable. First-generation USB did not have DMA, and it was only partially implemented in the second generation; it really depends on how it was done. FireWire is DMA capable, but again, it depends on how it was implemented in your hardware and driver. Rendering with DMA can reduce your latency by a factor of ten, which may or may not matter in your application; if you are live monitoring your recording, it is critical. The complete MIDI command set describes everything about the sound the way a dictionary contains every word in a book. You make art by choosing the right words in the right sequence.
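To put some illustrative numbers on "latency", here is a minimal sketch (plain Python arithmetic; the buffer sizes are examples, not figures for any specific card):

    # Buffer latency = samples buffered / sample rate.
    sample_rate = 44_100  # Hz

    for buffer_samples in (64, 256, 1024):
        latency_ms = buffer_samples / sample_rate * 1000
        print(f"{buffer_samples:5d}-sample buffer -> {latency_ms:5.1f} ms per buffer")

    # A card that can DMA into a 64-sample buffer sits around 1.5 ms per buffer;
    # one forced up to a 1024-sample buffer is over 23 ms, which becomes audible
    # when you are monitoring a live performance through the computer.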

MIDI can be decoded (rendered) and encoded (transcribed). Let's say you have a synthesizing instrument like a MIDI keyboard with an on-board processor similar to the one in your computer. That keyboard is issuing MIDI commands which are rendered by the processor into analog waveforms. These can be amplified, stored directly, or re-encoded back to MIDI by another processor. Conversely, the MIDI input you create as you play can be passed directly to, or combined with, other MIDI files at another location. All that is required is an identical dictionary and compatible storage/transfer. When the unaltered MIDI is stored as MIDI and rendered again, it plays back exactly as it was originally produced. Now, let's say you are recording an analog instrument like a clarinet into a microphone. Here the waveforms are encoded into MIDI data. How much of the performance was captured, and how well the encoder can describe that capture, will dictate how closely the result matches the original when it is rendered again. Things become interesting when you start to edit these MIDI files. You can transform the original instruments to mimic others, but for this to work with expected results you need good virtual instrument descriptions, and we'll talk about those next.
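As a concrete taste of that "dictionary", here is a minimal Python sketch (no libraries, illustrative values) that builds and then decodes a single note-on message, which is all of three bytes on the wire:

    # A MIDI note-on is 3 bytes: status (0x90 + channel), note number, velocity.
    channel, note, velocity = 0, 60, 100          # channel 1, middle C, medium-loud

    message = bytes([0x90 | channel, note, velocity])
    print(message.hex(" "))                       # -> "90 3c 64"

    # Decoding is just the reverse lookup in the same dictionary:
    status, note_in, vel_in = message
    kind = "note-on" if status & 0xF0 == 0x90 else "other"
    print(kind, "channel", (status & 0x0F) + 1, "note", note_in, "velocity", vel_in)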

3. MIDI Instruments
We need to make a distinction here between MIDI physical and virtual instruments. Neither produces analog sound signals directly. You should be familiar with MIDI keyboards. These are virtual instruments because at their heart is a MIDI computer and clock that processes the tactile input and creates the MIDI. You may have seen other digital instruments that work on the same principle. What distinguishes these from the purely software virtual instruments is the tactility; you can achieve the same sound with either if the MIDI logic is identical. The purely software instruments reside in processor memory. That memory can be on a sound card chip or in a file on your hard drive. These files come in many formats, and some are exclusive to specific manufacturers, so be aware of this when you commit to one for your use. You may be in love with a particular soundfont but discover that the format is inconvenient for the rest of your workflow. MIDI instruments contain the waveform signatures that flesh out the other aspects of the sound besides pitch, duration, inflection and amplitude. The great thing about using these is the ability to change the instrument in an arrangement. Is your string arrangement too soppy? Try some clarinets. Everything about the recording stays the same except for the sound. This is amazingly powerful and probably the best reason to compose in MIDI. I'm not going to go into more detail here because the field is still developing; your best bet is to stay active on boards like this one and exchange experiences with others to form opinions of what will work for you.
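To make the "swap strings for clarinets" idea concrete, here is a small sketch using the third-party mido library (just one possible tool, and the file name is hypothetical). It rewrites the program-change events in a MIDI file so the same notes render with a different General MIDI instrument:

    # Swap the instrument in a MIDI file without touching the notes.
    # Requires the third-party "mido" package (pip install mido).
    import mido

    GM_STRINGS = 48       # General MIDI program 49 "String Ensemble 1" (0-based here)
    GM_CLARINET = 71      # General MIDI program 72 "Clarinet" (0-based here)

    mid = mido.MidiFile("arrangement.mid")        # hypothetical input file
    for track in mid.tracks:
        for msg in track:
            if msg.type == "program_change" and msg.program == GM_STRINGS:
                msg.program = GM_CLARINET         # same notes, different instrument
    mid.save("arrangement_clarinet.mid")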
 
Good afternoon everyone

Among other stuff, I have a Yamaha PSR-510 keyboard; it has its own sounds, rhythms and such. I want to modernize it with new sounds, at or near this keyboard, using plugins in my DAW.
I spotted this and wanted to know if it is what I am looking for:

MIDI Captain MINI 6 Controller with HID Multi-state Cycling

What I am looking for is a remote device I can connect to Reaper, assign some sounds to, and then use to select those sounds to play one at a time through my Yamaha.
What I saw was that this device has 5-pin MIDI I/O rather than the seemingly more common USB plug.
 
USB was not part of the deal when this keyboard was designed, so your DAW will need a MIDI interface with MIDI in and MIDI out. Most audio interfaces have this.
You have a bit of a learning curve to get under your belt. I don't know how much you know.
These keyboards, as you know, let you play a right-hand melody, left-hand chords or one-finger accompaniment, and have built-in drums and such. This is proprietary and not a standard, so while your keyboard can drive another of the same vintage, what happens, for instance, when you press a left-hand 'C' is up to Yamaha. Reaper or other DAWs will record a C note, but NOT what that C note actually does - the auto accompaniment and so on is generated in the Yamaha. It means editing this stuff has to remain in the keyboard. MIDI just tells your computer what note you pressed, NOT what you are hearing. That said, you can record these into the computer, and from memory I think Yamaha used some odd MIDI modes, so melody was on channel 1, chords on 2, and the others on the first 4 or 5 channels. It means you can edit mistakes and then press play and the keyboard will follow. Its sounds are fixed and NOT upgradable; they are what they are. However, your right-hand melody could let Reaper play new sounds if you have them installed in the computer. It means the keyboard speakers do the keyboard sounds and your computer's sound system does the Reaper sounds. Your keyboard IS, I believe, velocity sensitive, so playing quietly or loudly is sent along with the note info, which Reaper will record.
Frankly, by modern standards, it's a bit of a dinosaur - but if the idea is to use Reaper properly to record stuff, you can use the keyboard just to send notes to the computer and ALL the modern sounds come from the computer. Forget the installed sounds - they were not exactly brilliant in the 90s, and they are lacking badly by today's standards. You will lose all the auto stuff - so no auto accompaniment, drums, or starts and stops.

You are either good with Reaper already or just starting, and that is where progress comes from. The oom-pah, oom-pah accompaniment and one-finger chords are not upgradable. Using it as a master keyboard works fine, if the key feel suits you. Reaper with some VSTi instruments can be amazing once you suss it out, and the Yamaha sounds just get retired. If there is a sound or two you really like, it is possible to play it from Reaper and then record the audio back into Reaper, but that's a big new set of skills to grow.
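If you want to see for yourself which channels the Yamaha actually transmits on before you start routing things in Reaper, here is a rough sketch using the third-party mido package (an assumption on my part; the default input port will differ on your system). It just prints whatever arrives, grouped by channel:

    # Print whatever the keyboard sends, grouped by MIDI channel, so you can see
    # which channel carries the melody and which carry the accompaniment parts.
    # Requires the third-party "mido" package plus a backend (pip install mido python-rtmidi).
    import mido

    with mido.open_input() as port:               # default input; pass a name to pick one
        for msg in port:
            if hasattr(msg, "channel"):           # clock/system messages have no channel
                print(f"channel {msg.channel + 1:2d}: {msg}")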
 
@Eclipse4449 I would add that you don't need a controller if your interface has a MIDI DIN plug. My controller, a PC-200, is from 1989 (it was designed for computers) and it still works well. I am currently looking for an external synthesizer to add to the fun.

You can use the keyboard as Rob described, but you can also record your notes and events (you will have to map the events so the computer knows what to send). The sound output you will have to route back to the interface, as stated. It can be done, but it is a deep dive; I think it would be fun. If your keyboard has touch-sensitive keys, it will send out velocity information.

Not sure how much your keyboard will support - here is some information that should show the event mappings: https://usa.yamaha.com/support/manuals/index.html?l=en&c=&k=psr-510

As Rob stated, you can use a plugin, let your keyboard be the controller and Reaper can record the notes. But if you are adventurous, you could get your keyboard sound recorded as well.
 
Very fast and informative, thank you.
Yes, I am just getting started; I'm in the hooking-everything-up stage, so it'll be a while before I start working with Reaper 7. I probably need more things and probably have things I don't need, because I've been collecting all this stuff for years - some I just unboxed!
I just bought, installed and connected an Arturia KeyLab 61 MK3, installed Ableton Live Lite, and downloaded the VSTs and software they also gave me, and oh my God it sounds great! I also have an M-Audio Oxygen 25 I still haven't messed with yet, and then there is the Yamaha I mentioned, and there is more, but that's for somewhere else on this wonderful forum.

So here is the reasoning for the question above. I'd bet most creative people create everything they want, keyboard-wise, right on the keyboard in front of them. Mine is set up Yamaha on the left, Arturia on the right, Oxygen 25 in the center. I have been playing keyboards since I was a boy and I can read and write music, but connecting everything to a DAW and a Windows PC is all new. I want to compose music, but for me it would be like playing "live" with thousands of restarts: I've got the sound for this first part, I reach to the Yamaha for the next bar, nope, not the right sound (stop, change the sound, restart), OK that's better, until I want another sound. Now before you get confused, I know about tracks that will be layered on top of and simultaneously with what I'm describing. So I am creating music across multiple keyboards rather than from a single MIDI controller. The three keyboards/MIDI controllers would have their own MIDI tracks. It's just my ebb and flow. Later I can piece it all together on the Arturia.
 
Well, er, it's a novel approach. I don't quite get your process, though. Almost certainly the sounds in the computer are steps above the Yamaha, so I'm guessing the Yamaha is clearly doing something you want that you cannot get from the computer system?
Can I make a suggestion? YouTube has a pile of videos from composer Guy Michelmore. He's an established composer, and maybe watching how he does things would show you how other people work. I'm not sure I understand your workflow enough to advise - but it sounds like torture to me. Have a look at Guy's way of composing - he often builds things up in little short bits, so maybe you can steal some ideas. Keep in mind you are not piecing it together on the Arturia - that is just a controller, in the same way a mouse, or indeed your Yamaha, is - it's all being done in the computer. All the Arturia does is give you easier access to stop, rewind, record and so on, and the pads work better for drum sounds than keys do. The Arturia is just a control surface; all the sounds are made elsewhere.
 
"well, er, it's a novel approach. I don't quite get your process though? Almost certainly the sounds in the computer are steps above the yamaha, so I'm guessing the yamaha is clearly doing something you want that you cannot get from the computer system?"

No, I just want to use the whole keyboard and assign sounds from my PC.

I'll certainly check out Guy. Yes, just starting out (I have tons of musical ideas); as long as I can change sounds on demand, however it is done, I'll figure it out. But I'll bet you anything Guy will be a valuable asset.
Thanks again!!
Ray
 

@rob aylestone



USB was not part of the deal when this keyboard was designed, so your DAW will need a MIDI interface with MIDI in and MIDI out. Most audio interfaces have this.

You have a bit of a learning curve to get under your belt. I don't know how much you know.
These keyboards as you know let you play a right hand melody, left hand chords, or one finger accompaniment and have built in drums and stuff. This is proprietary and not a standard, so while your keyboard can drive another of the same vintage, what happens, for instance, when you press a left hand 'C' is up to Yamaha. Reaper or other DAWs will record a C note, but NOT what that C note actually does - so the auto accompaniment and stuff is generated in the Yamaha. It means editing this stuff has to remain in the keyboard.

This last part I did not know. My goal with this keyboard (hopefully via an option/menu somewhere on the keyboard) is to disable the chords and accompaniment and just have a 61-key keyboard; if not, I will box it up and sell it cheap. I have an eye on a Native Instruments Kontrol A61 or S61.


MIDI just tells your computer what note you pressed, NOT, what you are hearing. That said, you can record these into the computer and I think from memory, Yamaha used some odd MIDI modes so melody was on channel 1, chords on 2 and others on the first 4 or 5 channels.

Ugh, I'm using the mio XL (iConnectivity) to connect all of my controllers. The cabling is all run, but I haven't taken the needed time to set up the I/Os in the Auracle X software it came with. I'm using the network cable connection option to my Windows PC; I had to buy a PCIe card to accommodate this.

Detail: the MIDI is connected to a well-above-average motherboard network connection (i9 processor); the internet runs through the new network card.


It means you can edit mistakes and then press play and the keyboard will follow. Its sounds are fixed and NOT upgradable. They are what they are. However, your right hand melody could allow Reaper to play new sounds if you have them installed in the computer. It means the keyboard speakers do the keyboard sounds and your computer's sound system does the Reaper sounds. Your keyboard IS, I believe velocity sensitive so playing quietly or loudly is sent along with note info, which Reaper will record.

I knew the first part about editing the notes in Reaper 7, but again I didn't know about the left-hand stuff. And if that keyboard is going to play its own sounds through its speakers while my monitors play a VST, that isn't going to work.

I would hope the volume knob does not affect the MIDI notes being sent to the DAW?


Frankly, by modern standards, it's a bit of a dinosaur - but if the idea is to use Reaper properly to record stuff, you can use just the keyboard to send notes to the computer and ALL the modern sounds come from the computer. Forget the installed sounds - they were not exactly brilliant in the 90s, and lacking badly by today's standards. You will lose all the auto stuff - so no auto accompaniment, or drums, or starts and stops.

Yes, I agree - probably fair back in the day, but not what I am looking for right now. I have some VSTs and plan on downloading some free stuff and purchasing some others.

You either are good with Reaper already or just starting, and that is where progress comes from. The oom-pah, oompah accompaniment and one finger chords are not upgradable. Using it as a master keyboard works fine, if the key feel suits you. Reaper with some VSTi instruments can be amazing once you suss it out, and the Yamaha sounds just get retired. If there is a sound or two you really like, it is possible to play it from Reaper, then record the audio back into Reaper but that's a big new set of skills to grow.

I have Reaper 7 and intend to use it as my main DAW. I do have Ableton Live Lite and my Arturia KeyLab 61 MK3 connected to it via the PC's USB (and MIDI I/O to the mio, but not seen in my setup yet). I also have a Scarlett 18i20 as the audio interface.

What do you use to connect your various MIDI devices to?
 
I still have some keyboard synths and things like Roland 1080s, but now I'm using the Roland soft synth packages plus the amazing ones in things like Kontakt. I will need to dig up some MIDI cables for a forthcoming project, and I'll just use the MIDI out on the audio interface. I still have two Midiman 8x8 devices, so if I ever needed to I could talk to 16 separate MIDI devices. If you like the feel of the Yamaha, the only benefit to your system of a swap to the Kontrol S61 is quicker access to all your sounds. I have an S61 MK II but usually select sounds on the computer screen. Look at Crow Hill and Spitfire; both have excellent freebies and crazily good paid-for things.
 
Need help with understanding this "key point" in Reaper 7
"128 channels per track: Each individual track can handle up to 128 MIDI channels."
Reaper claims to offer unlimited MIDI tracks, but what do channels mean here?
 
Well, those are CC (continuous controller) numbers, and the chart below shows the standard MIDI CC mappings.

[Attached image: standard MIDI CC assignment chart]


The instrument channels are 1-16.
 
The way manufacturers mangled not just the MIDI spec but the terminology has been a pain for years. "128" always annoyed me because it implies 1 through 128, but that is wrong, as the continuous controllers include a 0; so it is really 0 to 127. Sometimes you had to edit files people sent you because CC 7 was always volume, 11 expression, etc., but typing 11 into one tool might show up as 12 in another - out of 128!
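That off-by-one trap is easy to see at the byte level. A tiny Python sketch (no libraries, illustrative values): the wire format counts controllers from 0, so a tool that numbers them 1-128 is showing you the real CC number plus one:

    # MIDI control-change layout: status (0xB0 + channel), controller number, value.
    # The controller byte counts from 0, so CC 7 (channel volume) really is 7 on
    # the wire; a program that numbers controllers 1-128 would display it as "8".
    cc_volume = 7

    cc_message = bytes([0xB0 | 0, cc_volume, 100])   # channel 1, CC 7 = 100
    print(cc_message.hex(" "))                       # -> "b0 07 64"
    print("A 1-indexed display would call this CC", cc_volume + 1)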
 
Funny you wrote that, @rob aylestone - I thought I should come back and explain that. Since it still has its origins in computer talk, 0 counts. And while MIDI is a standard, that is a loose definition. I am going through that right now with that UBx I just purchased: I am having to print out the parameters so I am not guessing at the 128 (and those above 112 are reserved for ???). But most software does send the written number (from the UI) to the correct controller, since it is binary.
 
OK, so I am just using these keyboards to create music; this global mapping is just for very complex stuff, not for simple sound recording, right?
 
I would say don't overthink it at first. Revisit it later if you need to do automation on the synth or plugin - but do revisit it as you progress. Baby steps.
 
The continuous controllers were how you could get individual sounds to come back the same each time you loaded the file, so the CCs for volume, expression, modulation, pitch bend, sustain pedal, reverb and balance were the things that, together with the program change message, reliably worked. General MIDI is still with us, but kind of disguised: notes on channel 10 are still the standard for drums, and program 34 always gives you electric bass. Other CC numbers are much less common because they often get ignored by the gizmo producing the sound. Chorus, phasing and stuff like that might be implemented, but not always. After GM came XG and GS, and those extra available codes could produce amazing stuff - there was a great Jimi Hendrix wah-wah track that was really clever programming.

Nowadays, I just play the keyboard and add that stuff directly in the VSTi, not from the keyboard. I use pitch bend, mod and expression the most. Most VSTi instruments can be mapped to whatever controller you have. Don't forget that many external controllers, like my Steinberg CC121, use controllers to make their controls work.
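For anyone who wants to poke at those General MIDI conventions directly, here is a rough sketch using the third-party mido package again (port selection and timing are assumptions on my part; the program and note numbers are standard GM, written 0-based as the protocol counts them):

    # GM conventions in practice: program 34 "Electric Bass (finger)" (0-based 33)
    # on a melodic channel, and drums on channel 10 (0-based 9), where note 38 is
    # the acoustic snare.  Requires "mido" plus a backend and an available output.
    import time
    import mido

    with mido.open_output() as port:              # default output; pass a name to pick one
        port.send(mido.Message("program_change", channel=0, program=33))
        port.send(mido.Message("note_on",  channel=0, note=36, velocity=90))   # a low C
        time.sleep(0.5)
        port.send(mido.Message("note_off", channel=0, note=36))

        port.send(mido.Message("note_on",  channel=9, note=38, velocity=100))  # snare hit
        time.sleep(0.2)
        port.send(mido.Message("note_off", channel=9, note=38))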
 