2017-07-19, 21:49  #1 
Jul 2014
2·3^{2}·5^{2} Posts 
functions which don't have an exact Taylor Series.
As I understand it (from my Open University course material), even when infinitely many terms are taken in a Taylor approximation polynomial of a function, the resulting Taylor series is generally not a perfectly accurate version of the function.
Can anyone name a few functions from this majority? Is there a simple characterisation of the class of functions for which the Taylor series is perfectly accurate? 
2017-07-19, 22:30  #2  
∂^{2}ω=0
Sep 2002
República de California
11,743 Posts 
Quote:
Um ... polynomials? OK, so more generally, from Wikipedia: "A function that is equal to its Taylor series in an open interval (or a disc in the complex plane) is known as an analytic function in that interval ... If f(x) is equal to its Taylor series for all x in the complex plane, it is called entire. The polynomials, exponential function e^x, and the trigonometric functions sine and cosine, are examples of entire functions. Examples of functions that are not entire include the square root, the logarithm, the trigonometric function tangent, and its inverse, arctan." Polynomials are a special case in being exactly representable via a finite-term Taylor series expansion. Last fiddled with by ewmayer on 2017-07-19 at 22:37 
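A quick numerical sketch of the "entire" claim (mine, not from the post; helper names are made up): the Maclaurin series of e^x, sum of x^n/n!, converges to exp(x) for every real x, even far from the expansion point.

```python
import math

# Partial sums of the Maclaurin series of e^x converge everywhere,
# because e^x is entire. Each term is built incrementally as
# x^(n+1)/(n+1)! = previous_term * x/(n+1).

def exp_partial_sum(x, n_terms):
    total, term = 0.0, 1.0
    for n in range(n_terms):
        total += term
        term *= x / (n + 1)  # next term x^(n+1)/(n+1)!
    return total

for x in (1.0, 10.0, -10.0):
    print(x, abs(exp_partial_sum(x, 100) - math.exp(x)))  # all tiny
```

With 100 terms the partial sum matches exp(x) to near machine precision even at x = ±10; contrast this with the non-entire examples below, whose series fail outside a finite disc.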

2017-07-19, 23:05  #3 
Jul 2014
2·3^{2}·5^{2} Posts 
So if a function is entire, then its Taylor series is accurate for all real numbers?

2017-07-20, 00:40  #4  
∂^{2}ω=0
Sep 2002
República de California
11,743 Posts 
Quote:
A good example of this is f(x) = 1/(1+x^2) ... the denominator is nonvanishing along the entirety of the real line, but the convergence domain of the TS about any point x0 (or more generally, complex point z0) is determined by the location of the nearest of the complex poles (considering the same function over the complex numbers, i.e. f(z) = 1/(1+z^2)) at z = ±i. A good exercise for you: compute the TS about the origin and show that it will only converge for |x| < 1. (More generally, the complex TS about the origin only converges within the open disc |z| < 1.) Last fiddled with by ewmayer on 2017-07-20 at 00:48 
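The exercise above can be checked numerically (a sketch of mine, not from the post): the Maclaurin series of 1/(1+x^2) is the geometric series sum of (-1)^n x^(2n), whose partial sums settle down inside |x| < 1 and blow up outside.

```python
# Partial sums of 1/(1+x^2) = sum_{n>=0} (-1)^n x^(2n), valid only
# for |x| < 1 because of the complex poles at z = +-i.

def f(x):
    return 1.0 / (1.0 + x * x)

def taylor_partial_sum(x, n_terms):
    """First n_terms terms of sum (-1)^n x^(2n)."""
    return sum((-1) ** n * x ** (2 * n) for n in range(n_terms))

inside = taylor_partial_sum(0.5, 50)   # |x| < 1: converges to f(0.5)
outside = taylor_partial_sum(2.0, 50)  # |x| > 1: terms grow like 4^n

print(abs(inside - f(0.5)))   # tiny
print(abs(outside - f(2.0)))  # astronomically large
```

Even though f itself is perfectly smooth at x = 2, the series about the origin is useless there: the radius of convergence is set by the distance to the nearest complex pole, not by anything visible on the real line.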

2017-07-20, 06:57  #5  
Dec 2012
The Netherlands
5·353 Posts 
Quote:
\[ f(x)=\left\{ \begin{array}{ll} e^{-\frac{1}{x^2}} & \mbox{if }x\neq 0 \\ 0 & \mbox{otherwise} \end{array} \right. \] This function is so flat at 0 that all its derivatives at 0 are 0! Quote:
For a function f defined on an open ball with centre a in the complex plane, f is equal to its Taylor series at a if and only if f is differentiable (as a complex function). This is one example of how analysis with complex numbers is more elegant than with real numbers! 
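A numerical illustration of the flat function above (my sketch, not from the post; note the conventional minus sign, f(x) = e^{-1/x^2}): f vanishes faster than any power of x at 0, so every Maclaurin coefficient is 0, and the Taylor series about 0 is identically zero even though f is nonzero everywhere else.

```python
import math

# f(x) = exp(-1/x^2) for x != 0, f(0) = 0: smooth everywhere, but
# every derivative at 0 vanishes, so its Maclaurin series is 0.

def f(x):
    return math.exp(-1.0 / (x * x)) if x != 0 else 0.0

# f(x)/x^k -> 0 as x -> 0 for every fixed k, which forces all
# Maclaurin coefficients to be zero:
for k in (1, 5, 20):
    print(k, f(0.1) / 0.1 ** k)  # all negligibly small

# Yet f itself is nonzero away from the origin:
print(f(0.5))  # exp(-4), clearly not 0
```

So here the Taylor series converges everywhere (to 0), it just converges to the wrong thing: a smooth real function need not equal its Taylor series, exactly as the complex-differentiability criterion in the post predicts.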

2017-07-20, 19:37  #6 
Bamboozled!
May 2003
Down not across
10110011101100_{2} Posts 
Another cute function is f(x)=x if x is rational, 0 otherwise.
Substitute "transcendental", "algebraic", "computable" as desired. 
2017-07-21, 13:03  #7  
Feb 2017
Nowhere
2^{5}·11·17 Posts 
Quote:
f(x) = 0 if x is irrational, and 1/b if x is rational, where b is the least positive integer for which x = a/b with a, b integers. This function has the amusing property of being continuous at irrational x, but discontinuous at rational x.

As far as power series are concerned, I join the chorus that analytic functions (functions of one complex variable) are the way to go. The complex function f(z) = exp(-1/z^2) for z \ne 0 of course has a Laurent development around z = 0 (just substitute -1/z^2 for z in the usual series for exp(z)). But the Laurent series does not terminate, so instead of a "pole," f(z) has what is called an "essential singularity" at z = 0. The behavior of a function which is analytic in the neighborhood of a pole is not terribly complicated; it just "blows up" like 1/z^k near a pole at z = 0, where k is a positive integer. But in any neighborhood of an essential singularity, an analytic function assumes all values (with at most one exception) infinitely many times (Great Picard Theorem).

Another fun type of representation is Fourier series and integrals. They can be used for functions (especially periodic functions) which are piecewise continuous, with at most "jump discontinuities." But the partial sums of the series, or the incomplete integrals, exhibit weird oscillations in the vicinity of the jumps, called the "Gibbs phenomenon." These oscillations do not go away as you add more terms, or increase the upper limit of the incomplete Fourier integral. Rather, they get squashed closer and closer to the jumps. Last fiddled with by Dr Sardonicus on 2017-07-21 at 13:06 
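The Gibbs phenomenon is easy to see numerically (my sketch, not from the post): for the square wave sign(sin x), whose Fourier series is (4/pi) times the sum of sin(kx)/k over odd k, the partial sums overshoot just past the jump at x = 0 by a fixed amount, no matter how many terms are taken.

```python
import math

# Gibbs phenomenon for the square wave sign(sin x). The jump at x = 0
# has height 2; the partial sums peak near (2/pi)*Si(pi) ~ 1.179
# instead of 1, an overshoot of about 9% of the jump.

def square_wave_partial_sum(x, n_terms):
    """Sum of the first n_terms odd harmonics: (4/pi) sum sin((2j+1)x)/(2j+1)."""
    return (4.0 / math.pi) * sum(
        math.sin((2 * j + 1) * x) / (2 * j + 1) for j in range(n_terms)
    )

for n in (10, 100, 1000):
    x_peak = math.pi / (2 * n)  # first overshoot peak, just past the jump
    print(n, square_wave_partial_sum(x_peak, n))
    # the peak value stays near 1.179 as n grows; it never shrinks to 1,
    # it only moves closer to the jump
```

Away from the jump (say at x = pi/2) the partial sums do converge to 1; only the location of the overshoot changes with n, not its height, which is exactly the "squashed closer and closer to the jumps" behavior described above.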
