*October 30, 2023*

## Outer Products are Like Convolutions

In signal processing, a *convolution* is an operation used to extract information from, or modify, a time-based or sequential signal. It is done by *convolving* a "filter" over the signal. A mechanical example of convolution is what a [guitar pedal](https://www.youtube.com/watch?v=gagO8sm4RiI&t=27) does to the raw signal from a guitar to produce a new sound; you might think of it as augmenting certain features of the sound and muting others. Another example is how information is extracted from radar pulses, allowing us to infer the shape or size of something by reflecting radio waves off of it.

In vector convolution, one vector is the "filter" and the other is the "signal". You slide the filter vector over the signal vector, take the dot product at each position, and record the result into a third vector, which we call the convolution.

![[convolution.svg]]

> A black dot represents a positive number, a white dot represents a negative number.

The first index contains the similarity between the "leading edge" of the filter and the first piece of the signal. Each subsequent index contains the similarity at that point in the signal, all the way until the "trailing edge" of the filter lines up with the last piece of the signal. In this notation, the $i$th entry of the convolution is written as $\mathbf{x} \circledast \mathbf{w}[i]$.

> In the way I'm doing this, we would say the first index is $i=-3$ for a filter width of $4$, and the last one is $i=7$. $i=0$ corresponds to the trailing edge of the filter lining up with the start of the sequence.

What's the relation to an outer product? I think of the outer product between two vectors as their multiplication table. In vector notation it's a column vector followed by a row vector: $\mathbf{x}\mathbf{w}^\intercal$. The $ij$th entry of the resulting matrix is just $x_iw_j$.
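The sliding-dot-product picture above can be sketched in a few lines of NumPy (a minimal sketch; the signal and filter values are made up for illustration, and the output is indexed from $0$ rather than the negative-index convention in the note above). One caveat: sliding the filter *without* flipping it is what NumPy and most textbooks call cross-correlation, which matches how this post uses the word "convolution":

```python
import numpy as np

def sliding_dot(signal, filt):
    """Slide `filt` across `signal`, taking a dot product at every overlap
    position, from the filter's leading edge entering the signal to its
    trailing edge leaving it."""
    m = len(filt)
    # Zero-pad so the partial overlaps at both ends are included.
    padded = np.concatenate([np.zeros(m - 1), signal, np.zeros(m - 1)])
    return np.array([padded[k:k + m] @ filt
                     for k in range(len(signal) + m - 1)])

x = np.array([1.0, -1.0, 1.0, -1.0, 1.0])  # example signal (made-up values)
w = np.array([1.0, -1.0, 1.0])             # example filter

# No filter flip, so this matches NumPy's "full" cross-correlation;
# np.convolve would reverse the filter first.
assert np.allclose(sliding_dot(x, w), np.correlate(x, w, mode="full"))
```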
Here is how I imagine this:

![[outer_product.svg]]

Now if we look closely, we can see the convolution lurking along the diagonals of the outer product! It just hasn't been summed up. The rightmost column represents the leading edge of the filter, the leftmost column the trailing edge, and the diagonal connecting them has the intermediate parts of the filter. The result is that a diagonal terminating at the $i$th row of the last column describes the overall shape of the similarity when the leading edge of the filter is at the $i$th position. If we were to sum along these diagonals, we would get the convolution.

![[outer_conv_connection.svg]]

> Notice how the diagonal corresponding to a perfect match (maximum convolution value) is all black, and the diagonal corresponding to a perfect anti-match (minimum convolution value) is all white.

Every entry on the $i$th diagonal of the outer product contributes to $\mathbf{x} \circledast \mathbf{w}[i]$. The entire diagonal describes the "shape of similarity" between the signal and the filter, or what you might call *their relative phase*.

- If a diagonal is overall positive, the filter and signal are *in phase* and interfere constructively
- If it is overall negative, they are *out of phase* and interfere destructively
- If it is mixed, they are *orthogonal* and tend not to interfere

So, you could say that an outer product is like a convolution over a signal *where you don't know what order the signal or the filter comes in*. The groups of entries that you sum, and the order that you sum them in, are what commit you to a specific order. For example, if we were to sum along the anti-diagonals instead, we would be describing a convolution using the mirror image of the filter $\mathbf{w}$. If we reversed the order of the rows before the summation, we would be describing a convolution over the signal played backwards. The same idea holds for any way that you permute the matrix $\mathbf{x}\mathbf{w}^\intercal$.
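Both claims — that diagonal sums of the outer product give the convolution, and that anti-diagonal sums give the convolution with the mirrored filter — can be checked numerically. A minimal sketch with made-up example values, again with the caveat that the un-flipped sliding dot product is NumPy's `correlate`, while `np.convolve` builds the filter flip in:

```python
import numpy as np

x = np.array([1.0, -1.0, 1.0, -1.0, 1.0])  # example signal (made-up values)
w = np.array([1.0, -1.0, 1.0])             # example filter

M = np.outer(x, w)  # the multiplication table: M[i, j] = x[i] * w[j]
n, m = M.shape

# Summing each diagonal (entries with constant i - j) recovers one entry
# of the sliding dot product, i.e. the "full" cross-correlation.
diag_sums = np.array([M.diagonal(offset=(m - 1) - k).sum()
                      for k in range(n + m - 1)])
assert np.allclose(diag_sums, np.correlate(x, w, mode="full"))

# Summing the anti-diagonals instead mirrors the filter, which is the
# textbook convolution (np.convolve reverses the filter internally).
anti_sums = np.array([np.fliplr(M).diagonal(offset=(m - 1) - k).sum()
                      for k in range(n + m - 1)])
assert np.allclose(anti_sums, np.convolve(x, w, mode="full"))
```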