Taylor series

In mathematics, a Taylor series is a representation of a function as an infinite sum of terms that are calculated from the values of the function's derivatives at a single point.
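Concretely, the Taylor series of a function f that is infinitely differentiable at a point a can be written as:

```latex
f(x) = \sum_{n=0}^{\infty} \frac{f^{(n)}(a)}{n!}\,(x-a)^{n}
     = f(a) + f'(a)(x-a) + \frac{f''(a)}{2!}(x-a)^{2} + \cdots
```

Here f^(n)(a) denotes the n-th derivative of f evaluated at a, and n! is the factorial of n.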

In the West, the subject was formulated by the Scottish mathematician James Gregory and formally introduced by the English mathematician Brook Taylor in 1715. If the Taylor series is centered at zero, then that series is also called a Maclaurin series, after the Scottish mathematician Colin Maclaurin, who made extensive use of this special case of Taylor series in the 18th century.

A function can be approximated by using a finite number of terms of its Taylor series. Taylor's theorem gives quantitative estimates on the error introduced by the use of such an approximation. The polynomial formed by taking some initial terms of the Taylor series is called a Taylor polynomial. The Taylor series of a function is the limit of that function's Taylor polynomials as the degree increases, provided that the limit exists. A function may not be equal to its Taylor series, even if its Taylor series converges at every point. A standard example is f(x) = e^(−1/x²) (with f(0) = 0): its Maclaurin series is identically zero, so it converges everywhere, yet it equals the function only at x = 0. A function that is equal to its Taylor series in an open interval (or a disc in the complex plane) is known as an analytic function in that interval.
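The approximation idea above can be sketched in a few lines of Python. This minimal example evaluates the Maclaurin (Taylor at 0) polynomial of the exponential function, whose n-th derivative at 0 is always 1, and shows that the error shrinks as the degree grows; the function name `taylor_poly_exp` is illustrative, not from any particular library.

```python
import math

def taylor_poly_exp(x, degree):
    # Degree-n Maclaurin polynomial of exp:
    # sum of x**n / n! for n = 0 .. degree.
    return sum(x**n / math.factorial(n) for n in range(degree + 1))

# The partial sums approach e = exp(1) as the degree increases,
# illustrating how Taylor polynomials approximate the function.
for d in (2, 5, 10):
    approx = taylor_poly_exp(1.0, d)
    print(d, approx, abs(math.e - approx))
```

By Taylor's theorem (Lagrange remainder), the error of the degree-n polynomial at x = 1 is at most e/(n+1)!, which matches the rapid decrease seen in the printed errors.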
