*November 23, 2023*

# A MIDI Daydream

![[Midi_ports_and_cable.jpg]]

> The MIDI (Musical Instrument Digital Interface) connector; one of many lasting designs from the 80s
>
> *By :en:Pretzelpaws with a Canon EOS-10D camera. Cropped 2/9/05 using the GIMP. - en:Image:Midi_ports_and_cable.jpg, CC BY-SA 3.0, https://commons.wikimedia.org/w/index.php?curid=142551*

The original synthesizers of the 60's and 70's, like the [Moog](https://en.wikipedia.org/wiki/Moog_synthesizer), were analog devices. They worked *subtractively* by trimming down a handful of waveforms. You would generate a sound by choosing which basic waveforms to include (sine, square, sawtooth ...), routing the signals through filters to change their characteristics, and combining all of the signals into an output. You would program the synthesizer by plugging in actual cables to route the signal from each oscillator along the desired path.

The Moog marked the start of a seismic vibe shift in music, introducing iconic sounds like the ones heard in Donna Summer's [*I Feel Love*](https://www.youtube.com/watch?v=yEbaeLv-aOo) (1977), produced by Giorgio Moroder ^[Whose name might trigger a [random access memory](https://www.youtube.com/watch?v=zhl-Cs1-sG4) in Daft Punk fans]. The need for distinct physical oscillators, and the nest of patch cables used to orchestrate them, meant a limited (though still expansive) repertoire of sounds.

Digital synthesizers opened up a wider creative space with *additive* synthesis. They could manage a huge number of oscillators, which allowed them to build sounds from the ground up out of individual harmonics. With one you could construct precise sounds from fundamental components and, with a good ear and patience, learn to imitate sounds you hear in real life.

In this short interview clip, Wendy Carlos explains additive and subtractive synthesis brilliantly, demonstrates the Moog, and emulates a xylophone by hand with a digital synthesizer:

![Wendy Carlos](https://www.youtube.com/watch?v=Z3cab5IcCy8)

At the start of the 80's it was plain that digital instruments would become an integral part of music production, and that an open standard was needed to interface instruments with synthesizers. Thus midi was born.

The midi protocol is lean, mean, and does its job so effectively that it has barely changed in the ~40 years of its existence. Each midi message describes a very simple musical event and consists of just a few bytes. For example, slamming middle C on a midi keyboard will prompt it to send

```
0x90 0x3C 0x7F
```

which means "start middle C, at full volume". The first two nibbles, `0x9` and `0x0`, describe the message type (note on) and channel (0) respectively. The bytes that come after depend on the kind of message. In this case the byte `0x3C` (60) is the assigned code for middle C, and `0x7F` (127, the largest value a data byte can hold) is the volume, or *velocity* in midi speak. There is no timestamp; a message takes effect the moment it arrives. There are other message types to specify things like pitch bend, sustain, and timing data to synchronize the instrument with the synthesizer.

In a digital synthesizer, messages like these are streamed over a single pin of the midi connector from the instrument to the synthesizer. The synthesizer uses something like a [UART](https://en.wikipedia.org/wiki/Universal_asynchronous_receiver-transmitter) to collect the stream of bytes into complete messages, then decodes and executes each message as it completes, much like the control unit of a CPU.
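To make that decoding step concrete, here is a minimal sketch in Python of how a raw byte stream might be grouped into messages. The function and its names are my own invention, and it handles only note-on/note-off; a real decoder would also deal with running status and the many other message types.

```python
NOTE_OFF, NOTE_ON = 0x8, 0x9

def parse_midi(stream: bytes):
    """Group a raw midi byte stream into (event, channel, note, velocity) tuples."""
    events = []
    i = 0
    while i < len(stream):
        status = stream[i]
        kind, channel = status >> 4, status & 0x0F  # split the status byte's two nibbles
        if kind in (NOTE_OFF, NOTE_ON):
            note, velocity = stream[i + 1], stream[i + 2]
            if kind == NOTE_ON and velocity == 0:
                kind = NOTE_OFF  # by convention, a zero-velocity note-on means note-off
            events.append(("on" if kind == NOTE_ON else "off", channel, note, velocity))
            i += 3
        else:
            i += 1  # skip message types this sketch doesn't understand
    return events

# "Start middle C on channel 0 at full volume", then release it.
print(parse_midi(bytes([0x90, 0x3C, 0x7F, 0x80, 0x3C, 0x00])))
# [('on', 0, 60, 127), ('off', 0, 60, 0)]
```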
The synthesizer is constructed so that executing the message triggers oscillators, filters, and other physical components, producing the appropriate changes in pitch, volume, and tone at the sound output. The midi messages have nothing to say about how the sounds are implemented; the character of each sound is programmed into the synthesizer itself, the way Wendy does it in the second part of the video.

Something about this symmetry strikes me. A CPU's instruction set is Turing-complete; it's a conduit into any program that can exist. Midi does something kind of similar. Just as an instruction set is a heap of bricks we assemble into programs that transform data, midi messages form a kind of instruction set to transform emotions. Given the instructions, you just need hardware to animate the program, or the music. Anyway.

Designed as a control protocol for synthesizers, midi now also represents a kind of universal music language. Today, ordinary computers handle midi at the software level; the operating system can decode and emit midi messages with no sweat. The only difference with computers is that they don't need physical oscillators to generate sounds; they can play them from memory. This amounts to essentially unlimited sound complexity at our fingertips. We can use "samples" of highly specific and complex sounds, ones that would be too complicated to synthesize from the ground up, as building blocks, opening the door to new and exotic experiences.
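To close the loop, here's a toy software synthesizer in the same spirit: it answers a midi note number with a sound built from individual sine harmonics, Wendy-style, and writes the result to a WAV file. The harmonic amplitudes, the envelope, and the filename are invented for illustration; everything else is standard-library Python.

```python
import math
import wave
import array

RATE = 44100  # samples per second

def midi_to_hz(note: int) -> float:
    """Equal-temperament tuning: midi note 69 (the A above middle C) is 440 Hz."""
    return 440.0 * 2 ** ((note - 69) / 12)

def additive_note(note: int, seconds: float, harmonics: list[float]) -> array.array:
    """Build a note by summing sine waves at integer multiples of the fundamental."""
    freq = midi_to_hz(note)
    n_samples = int(RATE * seconds)
    samples = array.array("h")  # 16-bit signed integers
    for t in range(n_samples):
        # The k-th harmonic vibrates at k times the fundamental frequency.
        s = sum(a * math.sin(2 * math.pi * k * freq * t / RATE)
                for k, a in enumerate(harmonics, start=1))
        fade = 1.0 - t / n_samples  # crude linear decay envelope
        samples.append(int(32767 * 0.5 * fade * s / sum(harmonics)))
    return samples

# Middle C (note 60) from a few odd harmonics, vaguely clarinet-like.
data = additive_note(60, 2.0, harmonics=[1.0, 0.0, 0.5, 0.0, 0.25])
with wave.open("middle_c.wav", "wb") as f:
    f.setnchannels(1)
    f.setsampwidth(2)     # 2 bytes = 16-bit samples
    f.setframerate(RATE)
    f.writeframes(data.tobytes())
```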