Rational Girl

Attempting to be rational while dreaming of 3.141592653589793...

large correlation

I recently needed to run cross-correlations on some largish data sets, and the standard numpy.correlate was a bit too slow.

In [30]: import numpy as np
In [31]: sample = np.random.random(1000)
In [32]: sample_same = np.correlate(sample, sample, mode='same')
In [33]: timeit np.correlate(sample, sample, mode='same')
1000 loops, best of 3: 801 us per loop

So I moved to fftconvolve in scipy.signal, which does the work via FFTs (roughly O(N log N) rather than the O(N²) of the direct method).

In [34]: from scipy import signal
In [35]: # need to pad the sample with zeros (to twice its size)
In [36]: padded_sample = np.zeros(1000 * 2)
In [37]: start = 1000 // 2   # integer division, so this also works on Python 3
In [38]: stop = start + 1000
In [39]: padded_sample[start:stop] = sample
In [40]: # note: flip the second input to get correlation instead of convolution
In [41]: timeit signal.fftconvolve(padded_sample, sample[::-1], mode='valid')
1000 loops, best of 3: 548 us per loop
In [42]: sample_fft = signal.fftconvolve(padded_sample, sample[::-1], mode='valid')
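
If you do this more than once, the padding and flipping are easy to wrap up. Here is a minimal sketch of such a helper; the name xcorr_fft is mine, and it assumes equal-length, 1-D, real-valued inputs like the example above:

# minimal sketch: FFT-based equivalent of np.correlate(a, b, mode='same')
# for equal-length 1-D real arrays; xcorr_fft is a made-up name
import numpy as np
from scipy import signal

def xcorr_fft(a, b):
    n = len(a)
    padded = np.zeros(2 * n)
    padded[n // 2 : n // 2 + n] = a          # centre a in a zero-padded buffer
    # reversing b turns the convolution into a correlation
    return signal.fftconvolve(padded, b[::-1], mode='valid')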

Results should be the same, though the fftconvolve output is one sample longer because of the padding:

In [53]: sample_same.shape
Out[53]: (1000,)

In [54]: sample_fft.shape
Out[54]: (1001,)

In [55]: np.allclose(sample_same, sample_fft[:-1])
Out[55]: True
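
The payoff grows with the input size: the direct method scales roughly as O(N²) while the FFT route is O(N log N), so on larger arrays the gap is much bigger than the ~30% above. A quick way to check on your own machine (using the hypothetical xcorr_fft helper sketched earlier; the direct version can take a while):

In [56]: big = np.random.random(10000)
In [57]: timeit np.correlate(big, big, mode='same')
In [58]: timeit xcorr_fft(big, big)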