# Fourier series

**Topics:** Fourier series, Fourier analysis, Fourier transform

**Pages:** 7 (1690 words)

**Published:** February 10, 2014

From Wikipedia, the free encyclopedia

*The first four partial sums of the Fourier series for a square wave.*

In mathematics, a Fourier series (English pronunciation: /ˈfɔərieɪ/) decomposes periodic functions or periodic signals into the sum of a (possibly infinite) set of simple oscillating functions, namely sines and cosines (or, equivalently, complex exponentials). The study of Fourier series is a branch of Fourier analysis.

## Definition

In this section, s(x) denotes a function of the real variable x, and s is integrable on an interval [x0, x0 + P], for real numbers x0 and P. We will attempt to represent s in that interval as an infinite sum, or series, of harmonically related sinusoidal functions. Outside the interval, the series is periodic with period P. It follows that if s also has that property, the approximation is valid on the entire real line. The case P = 2π is prominently featured in the literature, presumably because it affords a minor simplification, but at the expense of generality. For integers N > 0, the following summation is a periodic function with period P:

$$s_N(x) = \frac{A_0}{2} + \sum_{n=1}^{N} A_n\,\sin\!\left(\frac{2\pi nx}{P} + \phi_n\right).$$

Using the identities:

$$\sin\!\left(\frac{2\pi nx}{P} + \phi_n\right) = \sin(\phi_n)\cos\!\left(\frac{2\pi nx}{P}\right) + \cos(\phi_n)\sin\!\left(\frac{2\pi nx}{P}\right),$$

$$e^{i\theta} = \cos(\theta) + i\sin(\theta),$$

*Function s(x) (in red) is a sum of six sine functions of different amplitudes and harmonically related frequencies. Their summation is called a Fourier series. The Fourier transform, S(f) (in blue), which depicts amplitude vs. frequency, reveals the six frequencies and their amplitudes.*

we can also write the function in these equivalent forms:

$$s_N(x) = \frac{a_0}{2} + \sum_{n=1}^{N}\left(a_n\cos\!\left(\frac{2\pi nx}{P}\right) + b_n\sin\!\left(\frac{2\pi nx}{P}\right)\right) = \sum_{n=-N}^{N} c_n\, e^{i\,2\pi nx/P},$$

where:

$$a_n = A_n\sin(\phi_n), \qquad b_n = A_n\cos(\phi_n),$$

and, for the complex exponential form,

$$c_0 = \frac{a_0}{2}, \qquad c_n = \frac{a_n - i\,b_n}{2}, \qquad c_{-n} = \frac{a_n + i\,b_n}{2} \quad (n > 0).$$

When the coefficients (known as Fourier coefficients) are computed as follows:[7]

$$a_n = \frac{2}{P}\int_{x_0}^{x_0+P} s(x)\cos\!\left(\frac{2\pi nx}{P}\right)dx, \qquad b_n = \frac{2}{P}\int_{x_0}^{x_0+P} s(x)\sin\!\left(\frac{2\pi nx}{P}\right)dx,$$

$$c_n = \frac{1}{P}\int_{x_0}^{x_0+P} s(x)\, e^{-i\,2\pi nx/P}\,dx,$$
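As an illustrative sketch (not part of the original article), the coefficient integrals above can be approximated numerically. The function name `fourier_coefficients`, the test signal, and the sample count are all assumptions chosen for the demo; an equispaced Riemann sum over one full period is used because it is highly accurate for periodic integrands.

```python
import numpy as np

def fourier_coefficients(s, P, n_max, x0=0.0, samples=4096):
    """Approximate the Fourier coefficients a_n, b_n of s on [x0, x0 + P]
    using an equispaced Riemann sum over one full period.
    (Hypothetical helper for illustration, not from the article.)"""
    x = x0 + P * np.arange(samples) / samples          # grid covering one period
    y = s(x)
    n = np.arange(n_max + 1)[:, None]                  # column of harmonic indices
    phase = 2 * np.pi * n * x / P                      # broadcast to (n_max+1, samples)
    # a_n = (2/P) * integral ≈ 2 * mean over the period; likewise for b_n
    a = 2 * np.mean(y * np.cos(phase), axis=1)
    b = 2 * np.mean(y * np.sin(phase), axis=1)
    return a, b

# Sanity check: for s(x) = 3*cos(2*pi*x) + 0.5*sin(4*pi*x) with P = 1,
# only a_1 = 3 and b_2 = 0.5 should survive (up to floating-point noise).
a, b = fourier_coefficients(
    lambda x: 3 * np.cos(2 * np.pi * x) + 0.5 * np.sin(4 * np.pi * x),
    P=1.0, n_max=3)
print(a, b)
```

By discrete orthogonality of sinusoids on an equispaced grid, the recovered coefficients for pure harmonics are exact up to rounding error.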

$s_N(x)$ approximates $s(x)$ on $[x_0,\, x_0+P]$, and the approximation improves as N → ∞. The infinite sum $s_\infty(x)$ is called the Fourier series representation of $s$.

The Fourier series does not always converge, and even when it does converge at a specific value x1 of x, the sum of the series at x1 may differ from the value s(x1) of the function. It is one of the main questions in harmonic analysis to decide when Fourier series converge, and when the sum is equal to the original function. If a function is square-integrable on the interval [x0, x0+P], then the Fourier series converges to the function at almost every point. In engineering applications, the Fourier series is generally presumed to converge everywhere except at discontinuities, since the functions encountered in engineering are better behaved than the counter-examples that mathematicians can supply against this presumption. In particular, the Fourier series converges absolutely and uniformly to s(x) whenever the derivative of s(x) (which may not exist everywhere) is square-integrable.[8] See Convergence of Fourier series.

It is possible to define Fourier coefficients for more general functions or distributions; in such cases, convergence in norm or weak convergence is usually of interest.

## Example 1: a simple Fourier series

*Plot of a periodic identity function, a sawtooth wave.*

*Animated plot of the first five successive partial Fourier series.*

We now use the formula above to give a Fourier series expansion of a very simple function. Consider a sawtooth wave:

$$s(x) = \frac{x}{\pi}, \quad \text{for } -\pi < x < \pi,$$

$$s(x + 2\pi k) = s(x), \quad \text{for every integer } k.$$

In this case, the Fourier coefficients are given by

$$a_n = \frac{1}{\pi}\int_{-\pi}^{\pi} s(x)\cos(nx)\,dx = 0, \quad n \ge 0,$$

$$b_n = \frac{1}{\pi}\int_{-\pi}^{\pi} s(x)\sin(nx)\,dx = -\frac{2}{\pi n}\cos(n\pi) + \frac{2}{\pi^2 n^2}\sin(n\pi) = \frac{2\,(-1)^{n+1}}{\pi n}, \quad n \ge 1.$$

It can be proven that the Fourier series converges to s(x) at every point x where s is differentiable, and therefore:

$$s(x) = \frac{a_0}{2} + \sum_{n=1}^{\infty}\left(a_n\cos(nx) + b_n\sin(nx)\right) = \frac{2}{\pi}\sum_{n=1}^{\infty}\frac{(-1)^{n+1}}{n}\sin(nx), \quad x - \pi \notin 2\pi\mathbb{Z}. \tag{Eq.1}$$

When x = π, the Fourier series converges to 0, which is the half-sum of the left and right limits of s at x = π. This is a particular instance of the Dirichlet theorem for Fourier series.
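As a quick numerical check (a sketch, not part of the original article), the partial sums of Eq.1 can be evaluated directly; the sample points and the values of N below are arbitrary demo choices.

```python
import numpy as np

def sawtooth_partial_sum(x, N):
    """Partial Fourier sum of the sawtooth s(x) = x/pi on (-pi, pi):
    s_N(x) = (2/pi) * sum_{n=1}^{N} (-1)^(n+1) * sin(n*x) / n."""
    n = np.arange(1, N + 1)[:, None]                       # harmonic indices as a column
    terms = (-1.0) ** (n + 1) * np.sin(n * np.asarray(x, float)) / n
    return (2 / np.pi) * np.sum(terms, axis=0)

# Away from the discontinuity, the error shrinks as N grows (roughly like 1/N).
x = np.array([0.5, 1.0, 2.0])
for N in (10, 100, 1000):
    err = np.max(np.abs(sawtooth_partial_sum(x, N) - x / np.pi))
    print(N, err)

# At the jump x = pi, every term sin(n*pi) vanishes, so the series sums to 0,
# the half-sum of the one-sided limits 1 and -1, as the Dirichlet theorem predicts.
print(sawtooth_partial_sum(np.array([np.pi]), 1000))
```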

*Heat distribution in a metal plate, using Fourier's method.*

## Example 2: Fourier's motivation

The Fourier series expansion of our function in Example 1 looks much less simple than the formula s(x) = x/π, and so it is not immediately apparent why one would need this Fourier...
