#12
Nov 2003
164448 Posts
"By my troth, I tried to edit my message, but notwithstanding I'm a registered poster, I could not do this and even could not find how to do this... Why do you totally disagree with my estimation of the probability of error?"

Because it is simply wrong! Why do you insist on being dogmatic when you admit that you have no math background?

"The probability for N to be a Carmichael number is 1/N^(2/3), is it not?"

No, it is NOT. Read the paper by Damgård, Landrock, & Pomerance. I am curious as to where you got your estimate.

It is meaningless to talk about the probability of N being a Carmichael number. Why? Because for any GIVEN N, it either is or it isn't a Carmichael number. That is to say, the probability is either 1 or 0. What is really meant is that N has been declared prime by a decision procedure that *sometimes makes mistakes*. The frequency with which it makes mistakes depends on the size of the number. But that probability is not a simple expression in terms of N. An *upper bound* can be computed, but the algorithm to do so is complicated. See the Damgård et al. paper.

It is also meaningless to talk about (or try to compute) an *exact* probability. The only way to do so would be to actually count ALL of the Carmichael numbers of a given size. This was done up to about 10^15 by Richard Pinch, but trying to count them for crypto-sized numbers is computationally hopeless.

"And an extra factor of 2^(-10) gives five rounds of the Rabin test."

No. Once again, go read the Damgård et al. paper. After you do, if you still have problems, I will be happy to help out. Five rounds of Miller-Rabin do NOT give an "extra" factor of 2^-10. Why? Think about Bayes' theorem. You must combine the per-round probability of failing a Miller-Rabin test with the a priori probability that a Fermat probable prime is actually a pseudoprime. And the "1/4" per-round probability often (mis)quoted for Miller-Rabin is a worst-case *upper bound*. This probability decreases as the numbers get larger. Furthermore, the "1/4" is not a constant. It changes with each successive iteration. Why? Because once you do one iteration you will have eliminated a certain percentage of the numbers that fail the test. A second iteration eliminates yet more, etc. See the Damgård et al. paper.

Let me also add that if you do not understand O() notation then you should refrain from discussing this topic entirely. One absolutely needs a basic understanding of how complexity is measured before discussing the subject.

Finally, your use of the term "Generalized Mersenne Prime" is a misnomer. Mathematicians have used the term for years, but it means something entirely different. Note that a Mersenne prime is a repunit in base 2. A Generalized Mersenne Prime is a prime that is a repunit in a different base. This includes, e.g., (3^n-1)/2, (5^n-1)/4, (10^n-1)/9, etc. What you mean is "Low Hamming Weight Primes".
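[Editor's note: the repeated-rounds structure discussed above can be sketched as follows. This is a minimal illustrative Miller-Rabin with random bases, not the Damgård et al. error analysis; the 1/4 in the comment is the worst-case per-round bound mentioned in the post, and the function names are mine.]

```python
import random

def miller_rabin_round(n, a):
    """One Miller-Rabin round: returns False if base 'a' witnesses that odd n is composite."""
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    x = pow(a, d, n)
    if x == 1 or x == n - 1:
        return True
    for _ in range(s - 1):
        x = pow(x, 2, n)
        if x == n - 1:
            return True
    return False

def is_probable_prime(n, rounds=5):
    """Run 'rounds' Miller-Rabin tests with independent random bases.
    A composite survives any single round with probability < 1/4
    (a worst-case bound; typically far smaller for large random n)."""
    if n < 4:
        return n in (2, 3)
    if n % 2 == 0:
        return False
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        if not miller_rabin_round(n, a):
            return False
    return True
```

Note that even the Carmichael number 561, which fools the plain Fermat test for every coprime base, is almost always caught by a single Miller-Rabin round.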
#13
Mar 2003
34 Posts
Poor Bob! You have understood me literally!!!

When I said "exactly" I meant a more exact estimation than merely "very, very small but still NOT zero". The integral distribution for Carmichael numbers CM(N) is:

N^(2/7) < CM(N) ≈ N^(1/3) < N^(1 - (1+o(1))·lnlnln(N)/lnln(N))

So the probability density will be d[N^(1/3)]/dN ≈ N^(-2/3). My estimation is PRETTY GOOD indeed.
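[Editor's note: rather than arguing from asymptotics, one can simply enumerate small Carmichael numbers via Korselt's criterion (n is Carmichael iff it is composite, squarefree, and p−1 divides n−1 for every prime p dividing n) and see how sparse they actually are. A sketch, with a hypothetical function name:]

```python
def carmichael_numbers(limit):
    """Enumerate Carmichael numbers below 'limit' via Korselt's criterion:
    n is Carmichael iff n is composite, squarefree, and (p - 1) | (n - 1)
    for every prime p dividing n."""
    # smallest-prime-factor sieve
    spf = list(range(limit))
    for i in range(2, int(limit ** 0.5) + 1):
        if spf[i] == i:
            for j in range(i * i, limit, i):
                if spf[j] == j:
                    spf[j] = i
    result = []
    for n in range(3, limit, 2):        # all Carmichael numbers are odd
        if spf[n] == n:                  # n is prime, skip
            continue
        m, primes, squarefree = n, [], True
        while m > 1:
            p = spf[m]
            m //= p
            if m % p == 0:               # p divides n twice: not squarefree
                squarefree = False
                break
            primes.append(p)
        if squarefree and all((n - 1) % (p - 1) == 0 for p in primes):
            result.append(n)
    return result
```

Below 10^4 there are only seven (561, 1105, 1729, 2465, 2821, 6601, 8911), and only sixteen below 10^5 — far fewer than N^(1/3) would suggest at these sizes.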
#14
Nov 2003
2^2×5×373 Posts
"Poor Bob! You have understood me literally!!! When I said "exactly" I meant a more exact estimation than merely "very, very small but still NOT zero". The integral distribution for Carmichael numbers CM(N) is: N^(2/7) < CM(N) ≈ N^(1/3) < N^(1 - (1+o(1))·lnlnln(N)/lnln(N)). So the probability density will be d[N^(1/3)]/dN ≈ N^(-2/3). My estimation is PRETTY GOOD indeed."

(1) This is mathematics. When one says "exactly", the audience expects that this is precisely what you mean. Say what you mean.

(2) O(N^(2/7)) is the best proven *lower* bound. Note that I wrote O(N^(2/7)) and not just N^(2/7). It is conjectured, and there is evidence to support, that CM(N) = O(N^(1-epsilon)) for any epsilon > 0. But this is a conjecture. Your estimate is LOUSY. There is a large range between 2/7 and 1-epsilon.

Once again, stop being so dogmatic about a subject which (by your own admission) you know nothing about. Furthermore, you ignored my suggestion that you learn O() notation. CM(N) > O(N^(2/7)) means CM(N) > k·N^(2/7) for some unspecified positive constant k. The estimate also only applies as N → ∞, and k can be very large indeed. Your comparison N^(2/7) ~ N^(1/3) is wrong. And you ignore the upper bound and the implied constant in the O() estimate in deriving your "PRETTY GOOD" estimate. Asymptotic O() estimates are useless for the purpose of obtaining an actual estimate for a particular size of N.

To obtain an actual estimate I once again suggest that you read the paper of Damgård et al. Go learn the math behind all of this and stop being so dogmatic. And learn the meaning of O() estimates. And do not omit writing them when you discuss this subject.

For a random 500-bit number passing 5 iterations of Miller-Rabin, the Damgård et al. paper yields an upper bound of 2^-85 on the probability that the number is NOT prime. Note that this is an upper bound. 10 iterations yields an upper bound of 2^-119.
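[Editor's note: a toy illustration of the Bayes' theorem point made earlier in the thread — this is NOT the Damgård et al. bound, just a sketch showing why k rounds do not simply contribute a flat factor of 4^-k. The prior probability of compositeness and the constant per-round survival probability are both assumptions of the toy model.]

```python
def posterior_composite(prior_composite, rounds, per_round=0.25):
    """Toy Bayes model: P(composite | passed all rounds), assuming each
    round independently passes a composite with probability <= per_round
    and always passes a prime. Real per-round probabilities shrink with
    the size of the number and with each successive round."""
    q = prior_composite
    pass_if_composite = per_round ** rounds
    return (q * pass_if_composite) / (q * pass_if_composite + (1 - q))
```

For example, with a 50% prior of compositeness, five rounds give a posterior of 1/1025, not 4^-5 = 1/1024 times the prior; the prior and the per-round bound must be combined, which is exactly the point being made.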
#15
Mar 2003
34 Posts
"Your comparison N^(2/7) ~ N^(1/3) is wrong."

You are not right at all. It's not mine. This is a numerical empirical fact. (I am too lazy to give you a reference, but it is true.)
#16
Mar 2003
5116 Posts
"For a random 500-bit number passing 5 iterations of Miller-Rabin, the Damgård et al. paper yields an upper bound of 2^-85 on the probability that the number is NOT prime. Note that this is an upper bound. 10 iterations yields an upper bound of 2^-119."

OK, let's take the upper bound of the Carmichael distribution:

500*lnlnln(500)/lnln(500)*2(-20) = 140

This is in good agreement with what you wrote.
#17
Mar 2003
34 Posts
Sorry!

500*lnln(500)/ln(500)*2(-20) = 120

Even better!!!
#18
Mar 2003
34 Posts
Sorry again!!!

500*lnln(500)/ln(500) - 20 = 120
#19
Mar 2003
34 Posts
Oh, shit!!! I have confused the signs.

500*lnln(500)/ln(500) + 20 = 160

So one can quickly estimate the probability of error as < 2^(-N*lnln(N)/ln(N)) for an N-bit number.
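[Editor's note: a quick evaluation of this rule of thumb, assuming natural logarithms throughout (the post does not say which base is meant); the function name is mine.]

```python
import math

def error_exponent(bits):
    """Exponent in the rough heuristic 2^-(N * lnln(N) / ln(N))
    for an N-bit candidate, natural logarithms assumed."""
    return bits * math.log(math.log(bits)) / math.log(bits)

# For a 500-bit number the bare heuristic gives an exponent of about 147,
# i.e. an error probability on the order of 2^-147, in the same ballpark
# as the Damgård et al. upper bounds of 2^-85 and 2^-119 quoted above.
```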
#20
Nov 2003
2^2×5×373 Posts
I wrote:

"Your comparison N^(2/7) ~ N^(1/3) is wrong."

The reply was:

"You are not right at all. It's not mine. This is a numerical empirical fact. (I am too lazy to give you a reference, but it is true.)"

I will try one last time.

(1) "This is a numerical empirical fact" is simply wrong.

(2) Your "reference by appeal to unknown authority" has a hollow ring.

If a function of N, say W(N), is O(N^(2/7)), then indeed W(N) is also O(N^(2/3)). But to jump from this to N^(2/7) ~ N^(1/3), and then to claim that a probability computed from N^(1/3) is accurate, shows a complete lack of comprehension of big-O notation. I suggested some time ago that you learn it. You have chosen to ignore the suggestion. I also asked that you not omit O() estimates during this discussion. You ignored that request as well.

You also ignored what I told you about O(N^(2/7)) being a LOWER bound on the number of Carmichael numbers less than N. For the purposes of trying to compute the probability that a randomly chosen integer is Carmichael you need to look at the UPPER BOUND. And even if you use the upper-bound information you still cannot get any kind of numerical answer, because the implied constants given by the O() estimates are UNKNOWN.

*If* you are interested in learning about what is going on, I am willing to help. But STOP making dogmatic assertions about a subject which you clearly have not studied. If you are not interested in learning, then please say so and I will stop wasting my time.
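[Editor's note: to make the gap between the two exponents concrete, one can compute (in bits) what N^(2/7) and N^(1 - lnlnln N / lnln N) come to for a 500-bit N — deliberately ignoring the unknown implied constants and the o(1) term, which is precisely what the post warns makes such numbers unusable as real estimates. Function name is mine.]

```python
import math

def carmichael_bound_exponents(bits):
    """For N = 2^bits, return (log2 of N^(2/7),
    log2 of N^(1 - lnlnln(N)/lnln(N))).
    Implied O() constants and the o(1) term are ignored:
    this is an illustration of scale, not an estimate."""
    ln_n = bits * math.log(2)                     # ln N
    lower = (2.0 / 7.0) * bits                    # exponent of N^(2/7), in bits
    factor = 1.0 - math.log(math.log(ln_n)) / math.log(ln_n)
    upper = factor * bits                         # exponent of the upper form, in bits
    return lower, upper
```

For bits = 500 this gives roughly 143 bits versus roughly 349 bits: the two bounds differ by about 200 binary orders of magnitude, so "somewhere between them" pins down essentially nothing.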
#21
Mar 2003
1218 Posts
"Finally, your use of the term "Generalized Mersenne Prime" is a misnomer. Mathematicians have used the term for years, but it means something entirely different. Note that a Mersenne prime is a repunit in base 2. A Generalized Mersenne Prime is a prime that is a repunit in a different base. This includes, e.g., (3^n-1)/2, (5^n-1)/4, (10^n-1)/9, etc. What you mean is "Low Hamming Weight Primes"."

I totally agree with your remark. But this is not my delusion. Very often such low-weight numbers are called Generalized Mersenne. See the following:

J. Solinas. Generalized Mersenne numbers. Technical Report CORR 99-39, Dept. of C&O, University of Waterloo, 1999. http://www.win.tue.nl/~henkvt/GenMersenne.pdf

And indeed the word "Generalized" should mean "a base other than 2". Generalized Mersenne Primes have the form (b^n-1)/(b-1), where n must be prime. It is interesting that the Prime Pages site http://primes.utm.edu contains no information about GMPs!!! At least, I couldn't find any...

And what about the decimal record, i.e. b = 10? I easily found that n = 2, 19, 23, 317, 1031...

Please, Robert, give me a reference. How can one prove the primality of a GMP?
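[Editor's note: the small exponents quoted (2, 19, 23) are easy to reproduce by scanning decimal repunits R_n = (10^n - 1)/9. The sketch below uses Miller-Rabin with a fixed set of bases that is known to be deterministic for n < 3.3·10^24, which covers every R_n with n ≤ 24; the larger repunit primes (n = 317, 1031, ...) need genuine primality proofs, not this scan.]

```python
def is_prime(n):
    """Miller-Rabin with the first twelve primes as bases, which is
    deterministic for n < 3,317,044,064,679,887,385,961,981."""
    if n < 2:
        return False
    bases = (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37)
    for p in bases:
        if n % p == 0:
            return n == p
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for a in bases:
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False
    return True

# Scan decimal repunits R_n = (10^n - 1)/9 for 2 <= n <= 24.
repunit_prime_exponents = [n for n in range(2, 25)
                           if is_prime((10**n - 1) // 9)]
# -> [2, 19, 23]
```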
#22
Dec 2003
Hopefully Near M48
2·3·293 Posts
(10^p)-1 is never prime, since it would just produce a string of 9's, which is divisible by a string of 1's (of equal length).
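[Editor's note: this observation is a one-liner to verify — a string of n nines is exactly 9 times the repunit of n ones.]

```python
# 10^n - 1 is a string of n nines, i.e. 9 * R_n where R_n = (10^n - 1)/9
# is a string of n ones, so 10^n - 1 is composite for every n >= 2
# (and for n = 1 it is simply 9).
for n in range(1, 20):
    nines = 10**n - 1
    repunit = nines // 9
    assert nines == 9 * repunit
    assert str(nines) == "9" * n
    assert str(repunit) == "1" * n
```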