Yes. Generally speaking, arithmetic complexity is one further step removed from reality (vs. bit complexity), and its use must be noted. Bit complexity is usually assumed. But to really tell properly, you must read the paper! Phrases like "arithmetic complexity" or "unit cost" are the giveaways. But there's much more to it than that, since you also need to consider the model. Are they using a Turing machine, a multitape Turing machine with fast arithmetic subroutines, or a PRAM?
[quote=CRGreathouse;156895]Yes. Generally speaking, arithmetic complexity is one further step removed from reality (vs. bit complexity), and its use must be noted. Bit complexity is usually assumed. But to really tell properly, you must read the paper! Phrases like "arithmetic complexity" or "unit cost" are the giveaways. But there's much more to it than that, since you also need to consider the model. Are they using a Turing machine, a multitape Turing machine with fast arithmetic subroutines, or a PRAM?[/quote]
I read the paper, and what gave it away was the "binary splitting" referenced in the abstract. However, it talked about n digits and not n [B]bits[/B], so I wasn't precisely sure (and wanted to be sure). I see, so "unit cost" gives away that they are talking about arithmetic complexity iff the phrase "arithmetic complexity" is not used.
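For what it's worth, the binary-splitting trick itself is easy to sketch. Here's a toy Python version (my own illustration, not the paper's algorithm): it computes n! as a balanced product tree, so the large multiplications happen between operands of comparable size, which is where a fast multiplication routine such as Schonhage-Strassen pays off.

```python
from math import factorial  # only used to check the result

def prod_range(lo, hi):
    """Product of the integers in (lo, hi], split recursively.

    Halving the range keeps the two factors of each multiplication
    roughly the same size, so fast multiplication (Schonhage-Strassen
    in GMP, for instance) can be exploited at the expensive top
    levels of the product tree.
    """
    if hi - lo == 1:
        return hi
    mid = (lo + hi) // 2
    return prod_range(lo, mid) * prod_range(mid, hi)

def fact_binary_split(n):
    return prod_range(0, n) if n > 0 else 1

assert fact_binary_split(100) == factorial(100)
```

Note that whether this achieves the bit complexities quoted below depends on the underlying multiplication; plain CPython big-int multiplication is not Schonhage-Strassen, so this only shows the splitting structure.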
Now, would the bit complexity of [tex]\frac{n!}{n}[/tex] given an input of n bits be the complexity of calculating n! given n bits + the complexity of calculating [tex]\frac{n!}{n}[/tex] given n bits? If we use the Schonhage-Strassen algorithm for multiplication, then according to Borwein, the complexity of calculating n! given n bits = [tex]O(n (\log n \log \log n)^2)[/tex]. However, what is the complexity of dividing the computed n! by n? My guess is to instead compute [tex]n! \cdot \frac{1}{n}[/tex] using S-S for multiplication instead of dividing n! by n using common methods (thus, the total bit complexity is [tex]O(n (\log n \log \log n)^2)[/tex] + complexity of finding the reciprocal of n (which I believe is [tex]O(n)[/tex]) + complexity of multiplying n! by 1/n using Schonhage-Strassen (which is [tex]O(n \log n \log \log n)[/tex]) = [tex]O(n (\log n \log \log n)^2 + n + n \log n \log \log n)[/tex]).
[QUOTE=flouran;156852]I have a rather simple question which my own answer to it needs justification. I'm still learning about algorithmic complexity and am reading the article CRGreathouse so graciously gave me. Say I have an n-bit number and it is stored as the input in a simple computer algorithm. Next, this algorithm adds 1 to the original input to produce the output n+1. Wouldn't this operation only take 1 bit operation (as we are only manipulating 1 bit in the original input to produce the output), and hence, it would have a run-time bit-complexity of O(1)? If n=1000000, then using the algorithm proposed, the output, 1000001, would take O(1) time, right?[/QUOTE]It would in the absence of carries...
Try adding 1 to 999999. Paul
[quote=xilman;156909]Try adding 1 to 999999.[/quote]
99999A. :wink:
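Joking aside, xilman's point is easy to make concrete. A small Python sketch of my own (not anyone's claimed algorithm): increment a binary number stored little-endian and count how many bit positions get touched. When the low bits are all 1s, the carry ripples arbitrarily far, so the worst case touches every bit even though most increments touch only a few.

```python
def increment_bit_ops(bits):
    """Increment a little-endian bit list in place; return the
    number of bit positions touched (a proxy for bit operations)."""
    ops = 0
    for i in range(len(bits)):
        ops += 1
        if bits[i] == 0:
            bits[i] = 1          # carry stops at the lowest 0 bit
            return ops
        bits[i] = 0              # 1 -> 0, carry continues upward
    bits.append(1)               # carry out: the number grows a bit
    return ops + 1

b = [1, 1, 1]                    # 7: the carry ripples through all bits
assert increment_bit_ops(b) == 4 and b == [0, 0, 0, 1]   # 8

b = [0, 1, 1]                    # 6: lowest bit is 0, one touch suffices
assert increment_bit_ops(b) == 1 and b == [1, 1, 1]      # 7
```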
[quote=flouran;156902]Now, would the bit complexity of [tex]\frac{n!}{n}[/tex] given an input of n bits be the complexity of calculating n! given n bits + the complexity of calculating [tex]\frac{n!}{n}[/tex] given n bits? If we use the Schonhage-Strassen algorithm for multiplication, then according to Borwein, the complexity of calculating n! given n bits = [tex]O(n (\log n \log \log n)^2)[/tex]. However, what is the complexity of dividing the computed n! by n? My guess is to instead compute [tex]n! \cdot \frac{1}{n}[/tex] using S-S for multiplication instead of dividing n! by n using common methods (thus, the total bit complexity is [tex]O(n (\log n \log \log n)^2)[/tex] + complexity of finding the reciprocal of n (which I believe is [tex]O(n)[/tex]) + complexity of multiplying n! by 1/n using Schonhage-Strassen (which is [tex]O(n \log n \log \log n)[/tex]) = [tex]O(n (\log n \log \log n)^2 + n + n \log n \log \log n)[/tex]).[/quote]
Is my reasoning correct?
[QUOTE=flouran;156896]I read the paper, and what gave it away was the "binary splitting" referenced in the abstract. However, it talked about n digits and not n [B]bits[/B], so I wasn't precisely sure (and wanted to be sure). I see, so "unit cost" gives away that they are talking about arithmetic complexity iff the phrase "arithmetic complexity" is not used.[/QUOTE]
"Binary splitting" tells you nothing about whether the paper uses bit complexity or arithmetic complexity. Digits of any constant base scale the same as bits, so it doesn't matter if digits or bits are referenced. I don't understand your iff statement above; I explained above that not all papers using arithmetic complexity use the phrase "arithmetic complexity".
[QUOTE=flouran;156902]Now, would the bit complexity of [tex]\frac{n!}{n}[/tex] given an input of n bits be the complexity of calculating n! given n bits + the complexity of calculating [tex]\frac{n!}{n}[/tex] given n bits? If we use the Schonhage-Strassen algorithm for multiplication, then according to Borwein, the complexity of calculating n! given n bits = [tex]O(n (\log n \log \log n)^2)[/tex]. However, what is the complexity of dividing the computed n! by n? My guess is to instead compute [tex]n! \cdot \frac{1}{n}[/tex] using S-S for multiplication instead of dividing n! by n using common methods (thus, the total bit complexity is [tex]O(n (\log n \log \log n)^2)[/tex] + complexity of finding the reciprocal of n (which I believe is [tex]O(n)[/tex]) + complexity of multiplying n! by 1/n using Schonhage-Strassen (which is [tex]O(n \log n \log \log n)[/tex]) = [tex]O(n (\log n \log \log n)^2 + n + n \log n \log \log n)[/tex]).[/QUOTE]
1. What does "given n bits" mean in "complexity of calculating [tex]\frac{n!}{n}[/tex] given n bits"? n bits of what?

2. The complexity of calculating f(x)/y directly is the complexity of calculating f(x) plus the complexity of dividing by y. In particular, if you calculate n! in [tex]O(n (\log n \log \log n)^2)[/tex] bit operations and divide n! by n with the asymmetric schoolbook method in [tex]O(\log n\log n!)=O(n(\log n)^2)[/tex] bit operations, the overall complexity is [tex]O(n (\log n \log \log n)^2+n(\log n)^2)=O(n (\log n \log \log n)^2).[/tex] Basically, the last division is so small it doesn't matter.

3. But why would you do that anyway? n!/n = (n-1)!, so you can calculate it in [tex]O((n-1) (\log (n-1) \log \log (n-1))^2)=O(n (\log n \log \log n)^2)[/tex] operations.
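Both of the last two points can be sanity-checked directly in Python (a quick sketch of my own, using exact big-integer division): the identity n!/n = (n-1)! holds, and the size gap between the dividend and the divisor is what makes the trailing schoolbook division negligible.

```python
from math import factorial

n = 500
f = factorial(n)

# n!/n = (n-1)!: the final division is exact
assert f // n == factorial(n - 1)

# The dividend n! has on the order of n*log2(n) bits, while the
# divisor n has only about log2(n) bits, so schoolbook division of
# n! by n costs O(log n * log n!) bit operations -- a lower-order
# term next to the cost of computing n! itself.
assert f.bit_length() > 100 * n.bit_length()
```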
[QUOTE=flouran;156978]Is my reasoning correct?[/QUOTE]
No, because you give the complexity of multiplying n! by 1/n with S-S as [tex]O(n\log n\log\log n)[/tex], which is the complexity of multiplying two n-bit numbers. n! has about [tex]n \log n[/tex] bits.
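The "n! has about n log n bits" claim is easy to verify numerically (my own quick sketch, using the standard-library lgamma, since lgamma(n+1) = ln(n!)):

```python
from math import factorial, lgamma, log

def bits_in_factorial(n):
    """Approximate log2(n!) via lgamma(n+1) = ln(n!)."""
    return lgamma(n + 1) / log(2)

for n in (100, 1000, 5000):
    exact = factorial(n).bit_length()    # floor(log2(n!)) + 1
    approx = bits_in_factorial(n)
    assert abs(exact - approx) < 2       # agree to within a bit
```

So the final multiplication in the proposed scheme is an (n log n)-bit by n-bit product, not an n-bit by n-bit one, and the stated S-S cost has to be scaled up accordingly.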
[QUOTE=philmoore;156864]This raises an interesting point, because potentially, you are manipulating n bits, but on average, you are only manipulating 2 bits. Is there a concept of bit complexity that takes "on average" into account?[/QUOTE]
Yes. Strangely enough it is called "average case complexity" :smile:
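And the "on average you are only manipulating 2 bits" intuition can be checked by brute force (a small sketch of my own): the carry in x -> x+1 stops at the lowest 0 bit, and averaging over all n-bit inputs gives exactly 2 - 2^(-n) touched bits.

```python
def avg_bits_touched(n):
    """Average number of bit positions changed by x -> x+1,
    over all n-bit inputs x in [0, 2^n)."""
    total = 0
    for x in range(1 << n):
        # x XOR (x+1) is a mask of exactly the changed bits:
        # the block of trailing 1s plus the bit just above it
        total += (x ^ (x + 1)).bit_length()
    return total / (1 << n)

assert avg_bits_touched(3) == 2 - 2**-3     # 1.875
assert avg_bits_touched(10) == 2 - 2**-10
```

So the worst case is O(n), but the average case of incrementing is O(1) bit operations.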
[quote=CRGreathouse;156989]1. What does "given n bits" mean in "complexity of calculating [tex]\frac{n!}{n}[/tex] given n bits"? n bits of what? 2. The complexity of calculating f(x)/y directly is the complexity of calculating f(x) plus the complexity of dividing by y. In particular, if you calculate n! in [tex]O(n (\log n \log \log n)^2)[/tex] bit operations and divide n! by n with the asymmetric schoolbook method in [tex]O(\log n\log n!)=O(n(\log n)^2)[/tex] bit operations, the overall complexity is [tex]O(n (\log n \log \log n)^2+n(\log n)^2)=O(n (\log n \log \log n)^2).[/tex] Basically, the last division is so small it doesn't matter. 3. But why would you do that anyway? n!/n = (n-1)!, so you can calculate it in [tex]O((n-1) (\log (n-1) \log \log (n-1))^2)=O(n (\log n \log \log n)^2)[/tex] operations.[/quote]
By "given n bits" I meant that I was talking in terms of bits, in other words bit complexity, not arithmetic complexity. And wow, it was careless of me not to notice that n!/n = (n-1)!. Essentially, I was just trying to get at the concept behind bit complexity with the above example rather than deal with the arithmetic. That was sloppy on my part and I apologize for wasting your time.

Here is a better example: calculate the bit complexity of [tex]\frac{n!}{n+1}[/tex]. In this example, wouldn't it be better to use Schonhage-Strassen for the multiplication [tex]{n!} \cdot {\frac{1}{n+1}}[/tex] rather than the asymmetric division [tex]\frac{n!}{n+1}[/tex]? The total complexity of the division route is: complexity of computing n! + complexity of computing n+1 + complexity of dividing n! by n+1 (using asymmetric "schoolbook" division) = [tex]O(n (\log n \log \log n)^2) + O(n) + O(n (\log n)^2) = O(n (\log n \log \log n)^2 + n + n (\log n)^2)[/tex]. However, if I used Schonhage-Strassen instead of standard division, then the total complexity would be: complexity of computing n! + complexity of computing n+1 + complexity of computing 1/(n+1) + complexity of using Schonhage-Strassen to multiply n! by 1/(n+1) = [tex]O(n (\log n \log \log n)^2) + O(n)[/tex] + (complexity of dividing 1 by n+1, which I believe is [tex]O(n+1) = O(n)[/tex]) + [tex]O(n \log n \cdot \log (n \log n) \cdot \log \log (n \log n))[/tex]. Sorry, I didn't simplify, because I care more about the concepts than the arithmetic. Could someone please help me correct any errors in this post?
Powered by vBulletin® Version 3.8.11
Copyright ©2000 - 2021, Jelsoft Enterprises Ltd.