[QUOTE=CRGreathouse;174866]I've read it (and the one by Damgård, Landrock & Pomerance). I was addressing only the correctness of flouran's result.[/QUOTE]
Mea culpa. I responded to the wrong post. I meant for flouran to read it.
[QUOTE=R.D. Silverman;174849]May I suggest that reading the Pomerance & Kim paper would be a
good idea?[/QUOTE] You're referring to [U]The Probability that a Random Probable Prime is Composite[/U] by Kim and Pomerance? I've read part of it. In your paper, Dr. Silverman, entitled [U]Fast Generation of Random, Strong RSA Primes[/U], you refer to equation 1.6 of the Kim and Pomerance paper: "Use of the Miller-Rabin algorithm combined with a Lucas or Frobenius strong probable prime test [5]. According to a very recent paper of Grantham, if one uses a Frobenius probable prime test, the probability that a candidate is composite when the tests say `prime' is less than 1/1770, as opposed to the 1/4 one gets with Miller-Rabin alone. If one performs a single Miller-Rabin test, followed by T Frobenius tests, the probability of error for 512-bit primes is then less than 1.5*10^-17(1/1770)^T [Pomerance-Kim, equation 1.6]. To achieve a probability of 2^-100, T = 4 suffices. One should be able to apply the analytical techniques of [3] to the Grantham algorithm to arrive at even stronger probability estimates. That is to say, the number 1.5*10^-17 can probably be reduced, but the method is as yet too new for anyone to have done this analysis. However, the correctness of the 1.5*10^-17 bound is not in question." May I ask your reasons?
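[Editor's note: as a quick numerical sanity check of the quoted bound 1.5*10^-17 * (1/1770)^T, one can tabulate it in base 2. This is a sketch using the two constants exactly as quoted; the paper's own rounding of the 1.5*10^-17 figure may shift exactly where the bound crosses 2^-100.]

```python
import math

# Tabulate the quoted error bound 1.5e-17 * (1/1770)^T for small T.
# Constants are taken verbatim from the quote above; the precise T at
# which the bound crosses 2^-100 is sensitive to their rounding.
for T in range(1, 7):
    bound = 1.5e-17 * (1.0 / 1770.0) ** T
    print(f"T={T}: bound = {bound:.3e}  (about 2^{math.log2(bound):.1f})")
```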
[QUOTE=flouran;174896]You're referring to [U]The Probability that a Random Probable Prime is Composite[/U] by Kim and Pomerance? I've read part of it. In your paper, Dr. Silverman, entitled [U]Fast Generation of Random, Strong RSA Primes[/U], you refer to equation 1.6 of the Kim and Pomerance paper:
"Use of the Miller-Rabin algorithm combined with a Lucas or Frobenius strong probable prime test [5]. According to a very recent paper of Grantham, if one uses a Frobenius probable prime test, the probability that a candidate is composite when the tests say `prime' is less than 1/1770, as opposed to the 1/4 one gets with Miller-Rabin alone. If one performs a single Miller-Rabin test, followed by T Frobenius tests, the probability of error for 512-bit primes is then less than 1.5*10^-17(1/1770)^T [Pomerance-Kim, equation 1.6]. To achieve a probability of 2^-100, T = 4 suffices. One should be able to apply the analytical techniques of [3] to the Grantham algorithm to arrive at even stronger probability estimates. That is to say, the number 1.5*10^-17 can probably be reduced, but the method is as yet too new for anyone to have done this analysis. However, the correctness of the 1.5*10^-17 bound is not in question." May I ask your reasons?[/QUOTE] Thus, it seems to me that you just multiplied the two probabilities together, rather than using Bayes' Theorem. Am I right, or simply deluded?
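[Editor's note: the distinction behind flouran's question can be sketched numerically. This is a toy illustration with hypothetical numbers, not figures from the Kim-Pomerance paper: the worst-case 1/4 bounds P(test passes | n composite), whereas the quantity one actually wants is P(n composite | test passes), which requires a prior on primality via Bayes' theorem.]

```python
# Toy illustration only: hypothetical numbers, not from Kim-Pomerance.
# Prior: by the prime number theorem, primes near 2^512 have density about
# 1/ln(2^512) ~ 1/355; restricting to odd candidates roughly doubles it.
p_composite = 1 - 2 / 355

# Worst-case pass probability for a composite in one Miller-Rabin round.
eps = 0.25

# Naive product of the two probabilities, without conditioning.
naive = eps * p_composite

# Bayes' theorem: primes always pass; composites pass with prob <= eps.
posterior = (eps * p_composite) / (eps * p_composite + (1 - p_composite))

print(f"naive product:              {naive:.4f}")
print(f"P(composite | test passes): {posterior:.4f}")
```

With these toy numbers the Bayesian posterior is far larger than the naive product, which is exactly why the worst-case 1/4 is nearly useless on its own and why Kim-Pomerance-style average-case analysis is needed to get bounds like 1.5*10^-17.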