Autocorrelation

I tried to explain this process on my podcast, Small Findings, but I found that it was just way too hard to do without showing numbers. So, instead, I went through an audio analogy for the process on the podcast and moved the math explanation here.

I looked into autocorrelation a few weeks ago. My friend Hugh introduced it to me as a way to detect repeating patterns in words and sounds.

There are several autocorrelation algorithms, but, as the book Why You Hear What You Hear points out, most people have an innate sense of autocorrelation. They can predict the recurrence of choruses in pop songs. In a city with a grid system, they can tell how often cross streets come up when walking a long way.

Given an explicit mathematical process, computers can do this, too. They can also go further and pick out patterns in domains that are beyond human perception, like space radiation or subsecond changes in air pressure.

Now that we know why we want to autocorrelate, we can get back to what it is. The “auto” part means “self” rather than “automatic”. So it’s not automatic correlation, it’s self-correlation.

It’s finding out where a list of numbers lines up with itself.


To get a sense of how it’s supposed to work before we get into any numbers, try listening to this audio clip:

You probably notice a pattern that repeats every three beats.


Now, let’s overlay that clip with a copy of itself that is delayed by one beat.

You can probably hear that those aren’t very well-coordinated.


Here’s the original overlaid with a copy delayed by two beats.

Again, not that correlated.


Here’s the original overlaid with a copy delayed by three beats.

Now, I imagine that sounds fairly correlated to you! Almost like the original clip itself*.

So, we’d say that the signal is very correlated with itself at a lag of 3, indicating that it’s likely that there is a pattern that repeats every three beats.


Now, let’s do the same thing, except with numbers, like these:

1 2 3 1 2 3 1 2 3

We can overlay that with a copy of itself at a lag of 1 like so:

1 2 3 1 2 3 1 2 3
  1 2 3 1 2 3 1 2 3

You can see intuitively that (given the scale) 2 doesn’t correlate with the 1 under it, 3 doesn’t correlate with 2, etc. A computer can’t see that, though. Instead, it can:

1) Calculate the difference between each number in the signal and the average of the signal. **

The average is:

(1 + 2 + 3 + 1 + 2 + 3 + 1 + 2 + 3)/9 = 2

When we subtract 2 from each number in the signal, we get:

-1 +0 +1 -1 +0 +1 -1 +0 +1
   -1 +0 +1 -1 +0 +1 -1 +0 +1
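This centering step can be sketched in JavaScript (the function name `center` is just an illustrative choice, not from any particular library):

```javascript
// Subtract the signal's average from every sample, so values above
// the average become positive and values below it become negative.
function center(signal) {
  const average = signal.reduce((sum, x) => sum + x, 0) / signal.length;
  return signal.map((x) => x - average);
}

center([1, 2, 3, 1, 2, 3, 1, 2, 3]);
// → [-1, 0, 1, -1, 0, 1, -1, 0, 1]
```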

2) Multiply the vertical pairs to see how well a number in the signal correlates with the corresponding number in the offset signal.

(We’ll ignore numbers that don’t have a correspondent in the other signal for this example.)

   +0 +1 -1 +0 +1 -1 +0 +1
    x  x  x  x  x  x  x  x
   -1 +0 +1 -1 +0 +1 -1 +0
   _______________________
   +0 +0 -1 +0 +0 -1 +0 +0

If the two numbers in a pair are both on the same side of the average (both greater than 2, or both less than 2), we’ll get a positive value, which indicates a positive correlation. If they’re on opposite sides of the average, we’ll get a negative value, because those two numbers are negatively correlated.

3) Average those products together to get a correlation score for the lag.

Here is our score for a lag of 1:

(0 + 0 + -1 + 0 + 0 + -1 + 0 + 0)/8 = -0.25

To save precious web page space, I’m not going to go through getting the scores for a lag of 2. I’ll just tell it to you: -0.4286.
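The three numbered steps can be rolled into one small function. Here’s a sketch in JavaScript (the name `correlationAtLag` is my own):

```javascript
// Score how well a signal correlates with a copy of itself delayed
// by `lag` samples: center the signal, multiply each overlapping
// pair, and average the products.
function correlationAtLag(signal, lag) {
  const average = signal.reduce((sum, x) => sum + x, 0) / signal.length;
  const centered = signal.map((x) => x - average);
  const pairCount = centered.length - lag;
  let sum = 0;
  for (let i = 0; i < pairCount; i += 1) {
    sum += centered[i] * centered[i + lag];
  }
  return sum / pairCount;
}

const signal = [1, 2, 3, 1, 2, 3, 1, 2, 3];
correlationAtLag(signal, 1); // → -0.25
correlationAtLag(signal, 2); // → -0.42857…, the -0.4286 above
```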

We’ll go through the process for lag = 3, though. Here’s the original lined up with a copy that lags by 3:

1 2 3 1 2 3 1 2 3
      1 2 3 1 2 3 1 2 3

Differences with the average:

-1 +0 +1 -1 +0 +1 -1 +0 +1
         -1 +0 +1 -1 +0 +1 -1 +0 +1

Correlation values:

__ __ __ +1 +0 +1 +1 +0 +1 __ __ __

The score for this is:

(1 + 0 + 1 + 1 + 0 + 1)/6 ≈ 0.67

Lag scores table:

Lag  Score
 1   -0.25
 2   -0.43
 3   +0.67

The score for a lag of 3 (≈ 0.67) is higher than the score for a lag of 1 (-0.25) and the score for a lag of 2 (-0.4286), and it matches or beats every other lag except lag = 0 (lag 6, another multiple of three, ties it; you can calculate the rest to check). So, the computer can say the signal most correlates with itself at lag = 3 and guess that there is a pattern every three numbers in that signal.
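The whole procedure, scoring every lag and picking the winner, can be sketched like this (the name `bestLag` is mine; lag 0 is skipped because any signal correlates perfectly with itself there):

```javascript
// Score every lag from 1 to maxLag and return the lag with the
// highest correlation score: the likely period of the signal.
function bestLag(signal, maxLag) {
  const average = signal.reduce((sum, x) => sum + x, 0) / signal.length;
  const centered = signal.map((x) => x - average);
  let best = { lag: 0, score: -Infinity };
  for (let lag = 1; lag <= maxLag; lag += 1) {
    const pairCount = centered.length - lag;
    let sum = 0;
    for (let i = 0; i < pairCount; i += 1) {
      sum += centered[i] * centered[i + lag];
    }
    const score = sum / pairCount;
    if (score > best.score) {
      best = { lag, score };
    }
  }
  return best;
}

bestLag([1, 2, 3, 1, 2, 3, 1, 2, 3], 4); // → lag 3, score ≈ 0.67
```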


That is one way to calculate autocorrelation, but it is too slow for many of the signal processing situations you might want to use it in. There are faster ways; one of them, which uses the Fourier transform, is explained by Thibauld Nion using Python.
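I won’t reproduce his code here, but the core idea, the Wiener–Khinchin theorem, can be sketched: transform the signal, take the power spectrum, and transform that back, which yields the (circular) autocorrelation at every lag at once. This sketch is entirely my own: it uses a plain O(n²) DFT instead of a fast FFT so the transform stays readable, and it skips the zero-padding that real implementations use to avoid circular wrap-around. The speed advantage only appears when the transforms are actual FFTs.

```javascript
// Discrete Fourier transform of a real-valued signal.
function dft(signal) {
  const n = signal.length;
  const re = new Array(n).fill(0);
  const im = new Array(n).fill(0);
  for (let k = 0; k < n; k += 1) {
    for (let t = 0; t < n; t += 1) {
      const angle = (-2 * Math.PI * k * t) / n;
      re[k] += signal[t] * Math.cos(angle);
      im[k] += signal[t] * Math.sin(angle);
    }
  }
  return { re, im };
}

// Circular autocorrelation via the Wiener–Khinchin theorem:
// transform, take the power spectrum |X[k]|², inverse-transform.
// The power spectrum is real and symmetric, so the inverse
// transform only needs its cosine terms.
function circularAutocorrelation(signal) {
  const { re, im } = dft(signal);
  const n = signal.length;
  const power = re.map((r, k) => r * r + im[k] * im[k]);
  const result = new Array(n).fill(0);
  for (let t = 0; t < n; t += 1) {
    for (let k = 0; k < n; k += 1) {
      result[t] += power[k] * Math.cos((2 * Math.PI * k * t) / n);
    }
    result[t] /= n;
  }
  return result;
}

circularAutocorrelation([-1, 0, 1, -1, 0, 1, -1, 0, 1]);
// peaks at lags 0, 3, and 6: the repeats the overlap method found
```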

I followed along with his post in a Jupyter notebook and implemented that way of calculating autocorrelation in JavaScript. An earlier Node package had already done the same, but I implemented it myself anyway in order to understand it better and to feel more confident about it.


* I added a tiny bit more lag when overlaying the copy of the clip here, for dramatic effect. If I had overlaid it exactly at three beats, it would sound just like the original clip, except louder, and you’d be like, huh?

** This step in which you subtract the average is called “centering the signal” and isn’t strictly necessary, I think.