#1
Jul 2014
3·149 Posts
As I understand it (from my Open University course material), even if infinitely many terms are taken in a Taylor approximation polynomial of a function, the resulting Taylor series is generally not a perfectly accurate representation of the function.
Can anyone give a few examples of functions from this majority? Is there a simple characterisation of the class of functions whose Taylor series represents them exactly?
#2
∂2ω=0
Sep 2002
República de California
103·113 Posts
Um ... polynomials? OK, so more generally, from Wikipedia: "A function that is equal to its Taylor series in an open interval (or a disc in the complex plane) is known as an analytic function in that interval ... If f(x) is equal to its Taylor series for all x in the complex plane, it is called entire. The polynomials, exponential function e^x, and the trigonometric functions sine and cosine, are examples of entire functions. Examples of functions that are not entire include the square root, the logarithm, the trigonometric function tangent, and its inverse, arctan."

Polynomials are a special case in being exactly representable by a finite-term Taylor series expansion.

Last fiddled with by ewmayer on 2017-07-19 at 22:37
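The distinction is easy to see numerically. A minimal sketch (an illustration, not part of the post above, assuming only the standard series for e^x and log(1+x)): partial Taylor sums of the entire function e^x converge for every x, while the series for log(1+x) only converges on -1 < x <= 1.

```python
import math

def exp_partial(x, n):
    """Partial sum of the Taylor series of e^x about 0: sum_{k<n} x^k/k!."""
    return sum(x**k / math.factorial(k) for k in range(n))

def log1p_partial(x, n):
    """Partial sum of the Taylor series of log(1+x) about 0:
    sum_{k=1}^{n} (-1)^(k+1) x^k / k, convergent only for -1 < x <= 1."""
    return sum((-1)**(k + 1) * x**k / k for k in range(1, n + 1))

# e^x is entire: the series converges even far from the expansion point.
print(abs(exp_partial(5.0, 40) - math.exp(5.0)))   # tiny

# log(1+x) is not entire: outside |x| <= 1 the partial sums blow up.
print(abs(log1p_partial(2.0, 40)))                 # large, and growing with n
```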
#3
Jul 2014
3·149 Posts
So if a function is entire, its Taylor series is exact for all real numbers then?
#4
∂2ω=0
Sep 2002
República de California
103·113 Posts
A good example of this is f(x) = 1/(1+x^2): the denominator is nonvanishing along the entirety of the real line, but the convergence domain of the TS about any point x0 (or more generally, complex point z0) is determined by the location of the nearest of the complex poles of the same function considered over the complex numbers, i.e. f(z) = 1/(1+z^2), at z = ±i.

A good exercise for you: compute the TS about the origin and show that it only converges for |x| < 1. (More generally, the complex TS about the origin only converges within the open disc |z| < 1.)

Last fiddled with by ewmayer on 2017-07-20 at 00:48
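The exercise can be checked numerically. A short sketch (an illustration, assuming the standard geometric-series expansion 1/(1+x^2) = Σ (-1)^k x^{2k}): the partial sums converge rapidly at x = 0.5, but diverge at x = 2, even though f itself is perfectly well behaved there on the real line.

```python
def ts_partial(x, n):
    """Partial sum of the Taylor series of 1/(1+x^2) about 0:
    sum_{k=0}^{n-1} (-1)^k x^(2k), a geometric series in -x^2."""
    return sum((-1)**k * x**(2 * k) for k in range(n))

def f(x):
    return 1.0 / (1.0 + x * x)

# Inside the radius of convergence |x| < 1 the error shrinks with n ...
print(abs(ts_partial(0.5, 30) - f(0.5)))   # small

# ... but at x = 2 the partial sums diverge, because of the poles at
# z = ±i, a distance 1 from the origin in the complex plane.
print(abs(ts_partial(2.0, 30) - f(2.0)))   # huge
```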
#5
Dec 2012
The Netherlands
6A6₁₆ Posts
\[ f(x)=\left\{ \begin{array}{ll} e^{-\frac{1}{x^2}} & \mbox{if }x\neq 0 \\ 0 & \mbox{otherwise} \end{array} \right. \] This function is so flat at 0 that all its derivatives there are 0! Its Taylor series about 0 is therefore identically zero, even though the function itself is nonzero everywhere else.
For a function f defined on an open ball with centre a in the complex plane, f is equal to its Taylor series at a if and only if f is differentiable (as a complex function). This is one example of how analysis with complex numbers is more elegant than with real numbers!
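How flat this function really is can be seen with a quick numerical sketch (an illustration, using crude finite-difference estimates of the derivatives at 0): the derivative estimates all vanish, yet the function is strictly positive away from the origin.

```python
import math

def f(x):
    """The classic smooth-but-not-analytic function: e^(-1/x^2) for x != 0, else 0."""
    return math.exp(-1.0 / (x * x)) if x != 0 else 0.0

# Central-difference estimates of the first two derivatives at 0 ...
h = 0.05
d1 = (f(h) - f(-h)) / (2 * h)
d2 = (f(h) - 2 * f(0) + f(-h)) / h**2
print(d1, d2)      # both essentially 0 (every derivative at 0 vanishes)

# ... yet f itself is strictly positive off the origin, so it cannot
# equal its (identically zero) Taylor series about 0 on any interval.
print(f(0.5))      # e^(-4), visibly far from 0
```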
#6
Bamboozled!
"𒉺𒌌𒇷𒆷𒀭"
May 2003
Down not across
29×3×7 Posts
Another cute function is f(x)=x if x is rational, 0 otherwise.

Substitute "transcendental", "algebraic", "computable" as desired.
#7
Feb 2017
Nowhere
4,643 Posts
f(x) = 0 if x is irrational, and 1/b if x is rational, where b is the least positive integer for which x = a/b with a, b integers. This function has the amusing property of being continuous at irrational x, but discontinuous at rational x.

As far as power series are concerned, I join the chorus that analytic functions (functions of one complex variable) are the way to go.

The complex function f(z) = exp(1/z^2) for z \ne 0 of course has a Laurent development around z = 0 (just substitute 1/z^2 for z in the usual series for exp(z)). But the Laurent series does not terminate, so instead of a "pole," f(z) has what is called an "essential singularity" at z = 0. The behavior of a function which is analytic in the neighborhood of a pole is not terribly complicated; it just "blows up" like 1/z^k near z = 0, where k is a positive integer. But in any neighborhood of an essential singularity, an analytic function assumes all values (with at most one exception) infinitely many times (Great Picard Theorem).

Another fun type of representation is Fourier series and integrals. They can be used for functions (especially periodic functions) which are piecewise continuous, with at most "jump discontinuities." But the partial sums of the series, or the incomplete integrals, exhibit weird oscillations in the vicinity of the jumps, called the "Gibbs phenomenon." These oscillations do not go away as you add more terms, or increase the upper limit of the incomplete Fourier integral. Rather, they get squashed closer and closer to the jumps.

Last fiddled with by Dr Sardonicus on 2017-07-21 at 13:06
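The Gibbs phenomenon is easy to observe numerically. A minimal sketch (an illustration, using the standard odd-harmonic Fourier series of a square wave taking values ±1): the peak overshoot near the jump settles at roughly 9% of the jump height (a maximum of about 1.179 for an amplitude-1 wave) and does not shrink as more terms are added; it only moves closer to the jump.

```python
import math

def square_partial(x, n):
    """Partial Fourier sum of the square wave that is +1 on (0, pi) and
    -1 on (-pi, 0): S_n(x) = (4/pi) * sum over odd k <= 2n-1 of sin(k x)/k."""
    return (4.0 / math.pi) * sum(math.sin((2 * k - 1) * x) / (2 * k - 1)
                                 for k in range(1, n + 1))

# Scan a fine grid just to the right of the jump at x = 0: the maximum
# stays near 1.179 (about 8.95% overshoot of the jump of height 2) for
# both small and large n, rather than decaying toward 1.
for n in (50, 500):
    peak = max(square_partial(i * math.pi / 10000, n) for i in range(1, 2000))
    print(n, peak)
```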