Below is an attempt to explain some basics in analogue/digital/midi. Hope this isn't too long. I'd appreciate any contributions, and hopefully we might be able to build a useful thread for those like myself who are very green.
All sounds begin as pressure waves, and are then converted into either analogue signals (electro-mechanical/electro-magnetic, such as a vinyl record or cassette tape) or digital signals (such as a compact disc) by the process of recording. Sounds are always audio, either analogue audio or digital audio, but never midi. Midi isn't sound, but information about a sound, rather like sheet music. It allows us to write or rewrite notes but isn't sound itself. It is an interface, a Musical Instrument Digital Interface - MIDI.
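To make that difference concrete, here is a rough Python sketch (assuming the standard 44.1 kHz sample rate; the note numbers are only illustrative): one second of digital audio is tens of thousands of numbers measuring the pressure wave, while the same note as midi is only a few bytes of information.

import math

# Digital audio: sample (measure) the pressure wave many times per second.
# 44,100 samples per second is the standard CD rate, assumed here.
SAMPLE_RATE = 44100
FREQ = 440.0                      # A above middle C, in Hz

# One second of a 440 Hz tone = 44,100 numbers.
audio_samples = [
    math.sin(2 * math.pi * FREQ * n / SAMPLE_RATE)
    for n in range(SAMPLE_RATE)
]
print(len(audio_samples), "numbers for one second of digital audio")

# Midi: the same note as information about the event, not the sound itself.
# A note-on message is three bytes: status, note number, velocity (how hard).
note_on = bytes([0x90, 69, 100])  # 0x90 = note on (channel 1), 69 = A above middle C
print(len(note_on), "bytes of midi for the same note")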
When a virtual sound is recorded onto, say, a Yamaha unit, this is done by mic-ing, say, a violin, recording the analogue signal, converting it to digital, and storing it on a soundcard inside the Yamaha. Then, when a note is played on the Yamaha, the soundcard converts the stored signal from digital back to analogue (because our ears only hear analogue pressure waves) and we hear the sound.
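That "convert to digital, store, convert back" idea can be sketched in a few lines of Python using the standard library's wave module. A generated sine wave stands in for the mic'd violin, so this is only an illustration, not how a Yamaha actually stores its samples.

import math
import struct
import wave

SAMPLE_RATE = 44100              # assumed sample rate
FREQ = 440.0                     # stand-in for the mic'd violin

# Analogue in, digital out: sample the wave and round each sample to a
# 16-bit integer - the analogue-to-digital step in miniature.
samples = [
    int(32767 * math.sin(2 * math.pi * FREQ * n / SAMPLE_RATE))
    for n in range(SAMPLE_RATE)  # one second of sound
]

# Store the numbers, as the keyboard's memory (or any recorder) would.
with wave.open("violin_standin.wav", "wb") as wav_out:
    wav_out.setnchannels(1)      # mono
    wav_out.setsampwidth(2)      # 16-bit samples
    wav_out.setframerate(SAMPLE_RATE)
    wav_out.writeframes(struct.pack("<" + "h" * len(samples), *samples))

# Playback is the reverse trip: the stored numbers go through a
# digital-to-analogue converter (the "soundcard") and become a pressure
# wave again at the speaker.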
If I now record a sound in "audio mode" to my ProTools sequencer (recording software) via my MBox (recording hardware), the analogue signal travels down the cable from the Yamaha's audio output into the MBox, where it is preamped (amplified) and converted from analogue to digital, and then on into the computer, where ProTools stores it as a .wav file on the sequencer. Then when I play it back, the process is reversed: the digital signal travels from the sequencer to the MBox, is converted back to its analogue form by the soundcard in the MBox, and is then sent to a speaker (monitor), or back into the Yamaha's audio input if I want to hear it through the Yamaha.
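The .wav file sitting on the sequencer is just those stored numbers plus a small header saying how to read them. A short sketch (the filename is made up) shows what is actually on the disk:

import wave

# "my_take.wav" is a made-up name standing in for whatever was recorded.
with wave.open("my_take.wav", "rb") as wav_in:
    print("channels:   ", wav_in.getnchannels())
    print("sample rate:", wav_in.getframerate(), "samples per second")
    print("bit depth:  ", wav_in.getsampwidth() * 8, "bits per sample")
    print("length:     ", wav_in.getnframes() / wav_in.getframerate(), "seconds")

# Nothing in the file is sound yet - it is only numbers until a
# digital-to-analogue converter (in the MBox or the Yamaha) turns them
# back into a signal we can hear.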
If I now repeat the process, but this time in "midi mode", a midi signal containing information about the key I have just pressed (which note, how hard I hit it, how long I held it, etc.) travels to the sequencer. This information is stored as a midi file, and the note can now be adjusted where necessary (moved, made louder or softer, lengthened, etc.). Then when I play it back, the adjusted information for this note is sent back to the Yamaha, where it is processed through the soundcard, converted to analogue and heard. In other words, midi technology enabled me to do electronically what I failed to do mechanically with my finger. No sound signal travelled between the Yamaha and the sequencer, only the midi signal. The sound only kicked in, via the soundcard in the Yamaha, after the Yamaha had received the midi information about the note from the sequencer. Midi served only as an interface between the Yamaha and the sequencer.
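The midi information itself is tiny, which is why editing it is so easy - you just change a number before it is sent back to the keyboard. A rough sketch using the standard note-on/note-off message bytes (note 60 is middle C; the velocity and tick values are made up):

# A recorded middle C, roughly as a sequencer might hold it:
NOTE_ON, NOTE_OFF = 0x90, 0x80   # standard midi status bytes (channel 1)
note = 60                        # middle C
velocity = 45                    # how hard the key was actually hit (0-127)
length_ticks = 480               # how long it was held, in sequencer ticks

# "Editing the note": make it louder and hold it longer, electronically,
# instead of replaying it with the finger.
velocity = 100
length_ticks = 960

# What eventually travels back down the midi cable to the Yamaha:
note_on_msg = bytes([NOTE_ON, note, velocity])
note_off_msg = bytes([NOTE_OFF, note, 0])
print(note_on_msg.hex(), "...", length_ticks, "ticks later ...", note_off_msg.hex())

# Only these few bytes travel; the sound itself is produced by the
# Yamaha's own soundcard after it receives them.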
Please feel free to correct or add to anything I have said.
V