#1
Feb 2004
France
2²·229 Posts
Hello,
I know this has already been said in this forum (Mr Silverman I think), but I cannot find it again. What is the overall complexity of the LLT as implemented by George, with the IBDWT method (the book of Crandall and Pomerance does not say a world about this). I'm lost with the 2 following information: Wikipedia : The most efficient known multiplication method, the Schönhage-Strassen algorithm based on the Fast Fourier transform, requires O(p log p log log p) time to square a p-bit number, reducing the complexity to O(p2 log p log log p) or Õ(p2). Mersenne Wiki : When used with the Fast Fourier transform, the test has the complexity of O(n2 log n), where n is the length of the number. Thanks, Tony |
#2
Jun 2005
2×191 Posts |
Quote:
Also, I don't believe the DWT represents an improvement in complexity over a typical FFT; it only reduces the proportionality constant.

Drew

Last fiddled with by drew on 2007-05-28 at 01:29
#3
Feb 2006
Denmark
2·5·23 Posts
FFT multiplication of n-bit numbers is O(n · log n · log log n). The log log n term grows very slowly, and realistic register and operand sizes make it disappear in practice in most implementations, so the time is often, if inaccurately, simplified to O(n · log n). A PRP or fast proof test gets another factor of n because it performs O(n) multiplications. Optimizations for special forms, such as the IBDWT, give a constant-factor speedup in implementations, but the asymptotic time is unchanged.
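Just how slowly the log log n factor moves can be seen by tabulating it for operand sizes from a thousand to a billion bits (a quick numerical check, not tied to any particular FFT implementation):

```python
import math

# log2(log2 n) creeps from ~3.3 to ~4.9 while n grows by six
# orders of magnitude -- effectively a constant in practice.
for n in (10**3, 10**6, 10**9):
    print(f"n = {n:>10}: log2 n = {math.log2(n):6.2f}, "
          f"log2 log2 n = {math.log2(math.log2(n)):5.2f}")
```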
#4
Feb 2004
France
2²×229 Posts
Quote:
Let me add parentheses to clarify. So, for a Mersenne number 2^n − 1, executing the full LLT (n − 2 steps) is O(n · n · log(n) · log(log(n))) in theory, which simplifies to O(n² · log(n)). By contrast, the simple multiplication we learn at school would lead to O(n³) for a whole LLT. Correct?

So it seems that Drew was correct, and that our Mersenne Wiki is too vague in saying "the length of the number".

Thanks,
Tony
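As a back-of-the-envelope check of that comparison, one can plug in an illustrative GIMPS-sized exponent (p = 32582657 here is just an example) and compare the heuristic operation counts, ignoring all constant factors and the log log term:

```python
import math

p = 32_582_657                 # illustrative Mersenne exponent
schoolbook = p * p**2          # ~p steps, O(n^2) per squaring
fft = p * p * math.log2(p)     # ~p steps, O(n log n) per squaring
print(f"schoolbook: ~10^{math.log10(schoolbook):.1f} word operations")
print(f"FFT-based:  ~10^{math.log10(fft):.1f} word operations")
print(f"ratio:      ~10^{math.log10(schoolbook / fft):.1f}")
```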
#5
Jun 2005
576₈ Posts
Quote:
Edit: I stand corrected. I just graphed the function, and it looks like the rate of growth of log(log(n)) decreases substantially as n grows.

Last fiddled with by drew on 2007-05-28 at 18:00
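That flattening is easy to quantify: the derivative of log(log n) is 1/(n · ln n), which collapses even faster than 1/n. A small numerical check of the slope:

```python
import math

# d/dn log(log n) = 1 / (n * ln n): the slope shrinks faster
# than 1/n, so the curve is nearly flat for large n.
prev = float("inf")
for n in (10.0, 1e3, 1e6, 1e9):
    slope = 1.0 / (n * math.log(n))
    assert slope < prev  # strictly decreasing
    prev = slope
    print(f"n = {n:>12.0f}: slope = {slope:.3e}")
```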
#6
Feb 2006
Denmark
346₈ Posts
Quote:
#7
Feb 2004
France
1110010100₂ Posts
Quote:
Thanks!
Tony
#8
∂²ω=0
Sep 2002
República de California
103·113 Posts
Quote:
FFT gets you an asymptotic n → log n reduction, assuming you can implement it in a manner that actually scales that way on modern cache-based microprocessors.

DWT (a.k.a. "implicit mod") halves your runlength, i.e. gets you an asymptotic factor of ~2x. (At finite runlengths of practical interest, the speedup may be appreciably greater, approaching a factor of 3, due to the aforementioned cache effects.)

Balanced-digit arithmetic gives a significant improvement in the slowness of the growth of roundoff error. I leave it as a homework exercise as to the (heuristic) asymptotic behavior here. Pure-integer-based transforms are not subject to roundoff error, but still have input wordsize restricted by the magnitude of convolution outputs. In fact, for a careful implementation of balanced-digit arithmetic, this convolution-output-size effect tends to dwarf the roundoff-error-growth effect. In other words, as n gets larger, floating-point (counterintuitively) becomes relatively more attractive, not less. (This assumes that one starts with a floating-point precision sufficiently good to be competitive with pure-integer at modest n; that typically means double precision.)

Last fiddled with by ewmayer on 2007-05-29 at 20:58
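For concreteness, here is a minimal NumPy sketch of the "implicit mod": an IBDWT squaring mod 2^p − 1 done as a length-n weighted cyclic convolution, with toy parameters, unbalanced digits, and no attention to rounding safety. It is a sketch of the idea only; production code such as George's or Ernst's differs substantially.

```python
import math
import numpy as np

def ibdwt_square(x, p, n):
    """Square x mod 2^p - 1 via an irrational-base DWT of length n.

    The weighted cyclic convolution performs the mod-(2^p - 1)
    reduction implicitly; digit j holds ceil(p(j+1)/n) - ceil(pj/n)
    bits (a variable base), following Crandall & Fata."""
    bits = [math.ceil(p * (j + 1) / n) - math.ceil(p * j / n)
            for j in range(n)]
    w = np.array([2.0 ** (math.ceil(p * j / n) - p * j / n)
                  for j in range(n)])

    # split x into variable-base digits
    d = np.zeros(n)
    for j in range(n):
        d[j] = x & ((1 << bits[j]) - 1)
        x >>= bits[j]

    # weighted squaring = cyclic autoconvolution via FFT
    f = np.fft.fft(d * w)
    c = np.rint(np.real(np.fft.ifft(f * f)) / w)

    # carry propagation in the variable base
    result, shift, carry = 0, 0, 0
    for j in range(n):
        carry, digit = divmod(int(c[j]) + carry, 1 << bits[j])
        result += digit << shift
        shift += bits[j]
    # wrap-around carry: 2^p == 1 (mod 2^p - 1)
    return (result + carry) % ((1 << p) - 1)

p, n = 31, 4
for x in (123_456_789, (1 << p) - 2):
    assert ibdwt_square(x, p, n) == x * x % ((1 << p) - 1)
```

Note the runlength: n = 4 words carry a 31-bit modulus, whereas an explicit squaring followed by reduction would need a length-2n acyclic convolution; that is the factor-of-~2 Ernst mentions.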
#9
Feb 2004
France
916₁₀ Posts
Since I plan to talk about this in a paper, what should I say about the complexity of the LLT with basic multiplication compared to the LLT with FFT/IBDWT?

Is "Thanks to the FFT, we have about O(n² · log n) instead of O(n³) with the schoolbook method" correct? And what should I add (very briefly) about the impact of the IBDWT? Since I do not master that point, I need help getting the sentence right.

Thanks for helping!
Tony
#10
∂2ω=0
Sep 2002
República de California
103·113 Posts
Hi, Tony:
Quote:
(What you might call the "École normale inférieure" method, to make a pun on the French school naming system. ;) I further suggest replacing "about" with the more mathematically precise "asymptotically."

Quote:
Cheers,
-Ernst