
Autocorrelation is a mathematical tool used frequently in signal processing for analysing functions or series of values, such as time-domain signals. It is the cross-correlation of a signal with itself. Autocorrelation is useful for finding repeating patterns in a signal, such as determining the presence of a periodic signal buried under noise, or identifying the fundamental frequency of a signal that does not actually contain that frequency component but implies it through its many harmonics.



Different definitions of autocorrelation are in use depending on the field of study which is being considered and not all of them are equivalent. In some fields, the term is used interchangeably with autocovariance.


Statistics

In statistics, the autocorrelation of a discrete time series or process Xt is simply the correlation of the process against a time-shifted version of itself. If Xt is second-order stationary with mean μ, this definition is

<math>R(k) = \frac{E[(X_i - \mu)(X_{i+k} - \mu)]}{\sigma^2}</math>

where E is the expected value and k is the time shift being considered (usually referred to as the lag). This function has the attractive property of lying in the range [−1, 1], with 1 indicating perfect correlation (the signals exactly overlap when time-shifted by k) and −1 indicating perfect anti-correlation. It is common practice in many disciplines to drop the normalisation by σ² and use the term autocorrelation interchangeably with autocovariance.
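As a minimal sketch (not part of the original article; the sample series and lags below are purely illustrative), the normalized estimate above can be computed directly from data:

```python
import math

def autocorrelation(x, k):
    """Sample estimate of R(k) = E[(X_i - mu)(X_{i+k} - mu)] / sigma^2."""
    n = len(x)
    mu = sum(x) / n
    var = sum((v - mu) ** 2 for v in x) / n          # estimate of sigma^2
    cov = sum((x[i] - mu) * (x[i + k] - mu) for i in range(n - k)) / n
    return cov / var

# A sine wave: perfectly correlated at lag 0, anti-correlated half a
# period later, and correlated again after one full period.
x = [math.sin(2 * math.pi * i / 20) for i in range(200)]
print(autocorrelation(x, 0))    # 1.0 by construction
print(autocorrelation(x, 10))   # ≈ -0.95: half a period apart
print(autocorrelation(x, 20))   # ≈ 0.9: one full period (finite-sample edge loss)
```

The full-period value falls slightly short of 1 only because the overlapping region shrinks as the lag grows.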

Signal processing

In signal processing, given a signal f(t), the continuous autocorrelation Rf(τ) is the continuous cross-correlation of f(t) with itself, at lag τ, and is defined as:

<math>R_f(\tau) = f^*(-\tau) \circ f(\tau) = \int_{-\infty}^{\infty} f(t+\tau)f^*(t)\, dt = \int_{-\infty}^{\infty} f(t)f^*(t-\tau)\, dt</math>

where f* represents the complex conjugate and the circle represents convolution. For a real function, f* = f.

Formally, the discrete autocorrelation R at lag j for signal xn is

<math>R(j) = \sum_n (x_n-m)(x_{n-j}-m )\,</math>

where m is the average value (expected value) of xn. Quite frequently, autocorrelations are calculated for zero-centered signals, that is, for signals with zero mean. The autocorrelation definition then becomes

<math>R(j) = \sum_n x_n x_{n-j}.\,</math>
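A tiny sketch of this zero-mean discrete definition (illustrative, not from the article; the sum runs over the indices where both samples exist):

```python
def discrete_autocorrelation(x, j):
    """R(j) = sum_n x[n] * x[n - j] for a zero-mean signal x."""
    return sum(x[n] * x[n - j] for n in range(j, len(x)))

# A period-4 pattern shows up as peaks at lags 0, 4, 8, ...
x = [1.0, 0.0, -1.0, 0.0] * 8   # zero-mean, period 4
print([round(discrete_autocorrelation(x, j), 1) for j in range(6)])
# [16.0, 0.0, -15.0, 0.0, 14.0, 0.0]
```

The repeated peak at lag 4 (slightly reduced by the shrinking overlap) is exactly how autocorrelation reveals a hidden periodicity.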

Multi-dimensional autocorrelation is defined similarly. For example, in three dimensions the autocorrelation would be defined as

<math>R(j,k,\ell) = \sum_{n,q,r} (x_{n,q,r}-m)(x_{n-j,q-k,r-\ell}-m).</math>
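For concreteness, a small sketch of the same sum in two dimensions (illustrative; the checkerboard signal is mine, and it is zero-mean so m drops out):

```python
def autocorr_2d(x, j, k):
    """Two-dimensional autocorrelation R(j, k) of a zero-mean 2-D signal."""
    rows, cols = len(x), len(x[0])
    return sum(x[n][q] * x[n - j][q - k]
               for n in range(j, rows)
               for q in range(k, cols))

# A checkerboard is perfectly anti-correlated under a one-cell shift:
board = [[(-1) ** (r + c) for c in range(8)] for r in range(8)]
print(autocorr_2d(board, 0, 0))   # 64: every cell squared
print(autocorr_2d(board, 0, 1))   # -56: a one-column shift flips every sign
print(autocorr_2d(board, 1, 1))   # 49: a diagonal shift realigns the pattern
```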


In the following, we will describe properties of one-dimensional autocorrelations only, since most properties are easily transferred from the one-dimensional case to the multi-dimensional cases.

  • A fundamental property of the autocorrelation is symmetry, R(i) = R(−i), which is easy to prove from the definition. In the continuous case, the autocorrelation is an even function
<math>R_f(-\tau) = R_f(\tau)\,</math>
when f is a real function, and a Hermitian function
<math>R_f(-\tau) = R_f^*(\tau)\,</math>
when f is a complex function.
  • The continuous autocorrelation function reaches its peak at the origin, where it takes a real value, i.e. for any delay τ, <math>|R_f(\tau)| \leq R_f(0)</math>. This is a consequence of the Cauchy–Schwarz inequality. The same result holds in the discrete case.
  • Since autocorrelation is a specific type of cross-correlation, it maintains all the properties of cross-correlation.
  • The autocorrelation of a white noise signal will have a strong peak at τ = 0 and will be close to 0 for all other τ. This shows that white noise has no periodicity.
  • The autocorrelation function R(τ) and the power spectral density S(f) of a wide-sense-stationary process form a Fourier transform pair (the Wiener–Khinchin theorem):
<math>R(\tau) = \int_{-\infty}^\infty S(f) e^{j 2 \pi f \tau} \, df</math>
<math>S(f) = \int_{-\infty}^\infty R(\tau) e^{- j 2 \pi f \tau} \, d\tau.</math>


  • In optics, normalized autocorrelations and cross-correlations give the degree of coherence of an electromagnetic field.
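The transform-pair relation between R(τ) and S(f) above can be sketched in its discrete, circular form (an illustrative example assuming NumPy is available; the white-noise test signal is mine): the autocorrelation equals the inverse DFT of the power spectrum.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(1024)   # zero-mean white-noise signal

# Direct circular autocorrelation: R[j] = sum_n x[n] * x[(n - j) mod N]
N = len(x)
R_direct = np.array([np.dot(x, np.roll(x, j)) for j in range(N)])

# Via the power spectrum S = |X|^2: R = IDFT(S)
S = np.abs(np.fft.fft(x)) ** 2
R_fft = np.real(np.fft.ifft(S))

print(np.allclose(R_direct, R_fft))   # True: the two routes agree
# White noise: the lag-0 peak dominates every other lag by a wide margin
print(R_fft[0] / np.max(np.abs(R_fft[1:])))
```

The second printed ratio also illustrates the white-noise property listed above: a sharp peak at τ = 0 and values near zero elsewhere.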
