In this lesson you will learn the principles at the core of the decimation-in-time fast Fourier transform algorithm. The (re)discovery of the fast Fourier transform algorithm by Cooley and Tukey in 1965 was perhaps the most significant event in the history of signal processing. There is evidence that Gauss first developed a fast Fourier transform-type algorithm in 1805.
Fourier analysis and the discrete Fourier transform (DFT) are central players in signal processing. The fast Fourier transform (FFT) algorithm reduces the computation of a length-N DFT from order N² to order N log₂ N operations when N is a power of 2. The FFT achieves a very large reduction in the cost of computation as N becomes large. There are FFT algorithms for many different factorizations of N.
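To get a feel for the size of this saving, the two operation counts can be compared directly for a few powers of 2 (a rough sketch; constant factors and the exact mix of multiplications and additions are ignored):

```python
import math

# Compare the order-N^2 direct DFT cost with the order-N*log2(N) FFT cost.
for N in [64, 1024, 65536]:
    direct = N * N               # direct DFT: roughly N^2 operations
    fast = N * int(math.log2(N)) # radix-2 FFT: roughly N*log2(N) operations
    print(f"N={N:>6}: direct ~{direct:>11}, FFT ~{fast:>8}, speedup ~{direct // fast}x")
```

For N = 1024 the direct DFT needs on the order of a million operations while the FFT needs about ten thousand, a roughly hundredfold reduction, and the ratio keeps growing with N.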
Understanding the principles behind the fast Fourier transform algorithm will position you to effectively apply the FFT and appreciate its significance in signal processing. Furthermore, the general approach for reducing computation in the FFT – breaking a large problem up into a combination of smaller problems – also applies in many other algorithms.
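The divide-and-conquer idea behind decimation in time can be sketched in a few lines of Python: split the input into its even- and odd-indexed samples, transform each half recursively, and combine the two half-length DFTs with twiddle factors. This is a minimal illustrative sketch, not an optimized implementation, and it assumes the input length is a power of 2:

```python
import cmath

def fft(x):
    """Recursive radix-2 decimation-in-time FFT.

    A minimal sketch for illustration; assumes len(x) is a power of 2.
    """
    N = len(x)
    if N == 1:
        return list(x)
    # Decimation in time: split into even- and odd-indexed samples
    # and transform each half-length sequence recursively.
    E = fft(x[0::2])
    O = fft(x[1::2])
    # Combine the two half-length DFTs using the twiddle factors
    # W_N^k = exp(-2*pi*j*k/N) and the symmetry W_N^(k+N/2) = -W_N^k.
    out = [0] * N
    for k in range(N // 2):
        t = cmath.exp(-2j * cmath.pi * k / N) * O[k]
        out[k] = E[k] + t
        out[k + N // 2] = E[k] - t
    return out
```

For example, `fft([1, 2, 3, 4])` returns the same values as the direct length-4 DFT: 10, -2+2j, -2, -2-2j.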
References
- A.V. Oppenheim and R.W. Schafer, Discrete-Time Signal Processing. 3rd Edition, Pearson Higher Education, 2010. Chapter 9.
- M.T. Heideman, D.H. Johnson, C.S. Burrus, “Gauss and the history of the fast Fourier transform,” IEEE ASSP Magazine, pp. 14-21, October 1984.
- J.W. Cooley and J.W. Tukey, “An algorithm for the machine calculation of complex Fourier series,” Math. Comput., vol. 19, no. 90, pp. 297-301, April 1965.
- J.W. Cooley, “How the FFT gained acceptance,” IEEE Signal Processing Magazine, vol. 9, pp. 10-13, January 1992.
- E.O. Brigham, The Fast Fourier Transform and Its Applications. Prentice Hall, 1988.