#12 | Nov 2003 | 2²×5×373 Posts
Note further that if what you want to ask for is "reduce the time without reducing the probability of success", this is IMPOSSIBLE.

Last fiddled with by ewmayer on 2015-12-10 at 04:56. Reason: No need to requote your previous post in its entirety, Bob - once was more than enough.
#13 | Dec 2014 | 11111111₂ Posts
Quote:

The file is 53 MB (compressed 22 MB) for p up to 42,549,973. Let me know how I can send it to you.
#14 | Einyen | Dec 2003 | Denmark | 6127₈ Posts
I "scraped" primenet for all factors back on Sep 3rd, 2014: http://www.mersenneforum.org/showthread.php?t=19662 (the link in that thread is no longer valid).

I got 35,504,331 factors from 28,300,589 different exponents between p=11 and p=999,999,893: gimpsfactors.zip (291 MB). This is actually a 7-zip file, but the site does not permit .7z files, so rename it back to .7z and use 7-zip to extract the 902 MB .txt file. You should use something like TextPad or Notepad++ to open a file that large.
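For anyone who wants to process a dump that size programmatically rather than in a text editor, a single streaming pass keeps memory flat. A minimal sketch, assuming a hypothetical one-factor-per-line layout of `exponent,factor` (the actual layout of gimpsfactors.txt may differ):

```python
# Streaming pass over a large factors dump. The "exponent,factor"
# line format is an assumption, not the confirmed file layout.

def iter_factors(path):
    """Yield (exponent, factor) integer pairs, one line at a time."""
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line:
                continue
            exp_str, fac_str = line.split(",", 1)
            yield int(exp_str), int(fac_str)

def count_distinct_exponents(path):
    """Return (total factors, distinct exponents) without loading the file."""
    seen = set()
    total = 0
    for exponent, _factor in iter_factors(path):
        seen.add(exponent)
        total += 1
    return total, len(seen)
```

Only the set of distinct exponents is held in RAM; the 902 MB text file itself is never loaded whole.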
#15 | ∂²ω=0 | Sep 2002 | República de California | 103·113 Posts
Quote:

To the OP and anyone late to this thread: fivemack's brief note in post #4 tells you pretty much all you need to know about optimizing the aspect of p-1 the OP is asking about. Further posts which contribute no meaningful S to S/N will be disappeared with extreme prejudice.

Last fiddled with by ewmayer on 2015-12-10 at 04:53
#16 | Nov 2003 | 2²·5·373 Posts
Quote:

Nothing, repeat nothing, that I said could even remotely be considered a flame. The entire discussion was either about the technical issues, or requests for clarification and questions about the meaning of what the OP said.

Quote:

objective other than just "reduce the run time". One must specify how much cost (i.e. reduction in probability) one is willing to tolerate in return for a given reduction in run time. Nowhere in this thread do I see such a specification. One can also turn it into a constrained optimization: reduce run time by as much as possible with a lower bound given on the probability. The title of the thread did ask about an OPTIMIZATION.

Are you willing to accept a 90% drop in probability for (say) a 2-fold improvement in run time? No? What about a 75% drop? etc. The relationship is very non-linear.

One can also express this problem in terms of an economic UTILITY function, but I did not want to go down that path. It is just a different formulation of the same problem.

Quote:

Do you mean improvement in run time? What fivemack gave CAN be improved in terms of run time.

Quote:
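To make the non-linear tradeoff concrete, here is a back-of-the-envelope sketch (my own illustration, not anything posted by the thread's participants): model stage 1 run time as linear in B1, and the chance that an unknown factor q has B1-smooth q-1 via the crude Dickman-rho approximation ρ(u) ≈ u^(−u) with u = log q / log B1. The model and the 64-bit factor size are both illustrative assumptions:

```python
import math

def stage1_success_prob(factor_bits, B1):
    """Crude chance that a random factor of the given bit size has
    B1-smooth q-1, via the Dickman-rho approximation rho(u) ~ u**-u.
    Ignores stage 2 and any special form of the factors."""
    u = factor_bits * math.log(2) / math.log(B1)
    return 1.0 if u <= 1 else u ** -u

# Stage 1 run time is roughly linear in B1, so each doubling of B1
# doubles the cost -- but the success probability rises by well
# under 2x each time.
for B1 in (10**5, 2 * 10**5, 4 * 10**5, 8 * 10**5):
    print(B1, stage1_success_prob(64, B1))
```

The gain per doubling shrinks as B1 grows, which is exactly the point above: any cut in run time trades away probability along a non-linear curve, so "better" is undefined until a metric is specified.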
#17 | If I May | "Chris Halsall" | Sep 2002 | Barbados | 2·5·7·139 Posts
#18 | Nov 2003 | 7460₁₀ Posts
It shows one method/choice of parameters for implementing the method, but performs no optimization. If you believe otherwise, please state what optimization is being performed. What is the objective function being minimized or maximized? What are the constraints?
#19 | If I May | "Chris Halsall" | Sep 2002 | Barbados | 2·5·7·139 Posts
Quote:

What would you advise works better?
#20 | Nov 2003 | 2²×5×373 Posts
Quote:

One can choose parameters to give a higher probability of success, but it will run slower... Is this better? The probability will not increase linearly with the run time. This is the heart of the whole discussion. Specify an objective function (or other metric) that lets us decide what "works better" means.
#21 | If I May | "Chris Halsall" | Sep 2002 | Barbados | 2×5×7×139 Posts
Quote:

In my view, results are better than insults. But sometimes insults are useful as motivation. Believe it or not, I appreciate your methodology. Many others are scared away...
#22 | ∂²ω=0 | Sep 2002 | República de California | 26567₈ Posts
Quote:

For p-1 stage 1 the compute effort is linear in the bitlength of the stage 1 small-primes product, thus we want our product of primes <= B1 to maximize the odds of capturing the primes <= B1 appearing in one or more of the factors of the modulus in question. As you point out in post #2 (for N of no special form), "The probability that an integer N is divisible by p^k, is very nearly 1/p^k", thus the kind of small-prime-powers scheme Tom Womack laid out in #4 is optimal, or as close as we are likely to get given the granularity of the powers involved. If you have a nontrivially-better scheme for constructing a stage 1 primes product, by all means shout it from the rooftops! Do you?

For moduli with factors of special form, like our beloved M(p), we may need to make slight modifications to the above schema - factors of M(p) have the form 2·k·p + 1, so for any factor q, q-1 is guaranteed to be divisible by the prime exponent p; thus we make sure to 'salt' the stage 1 primes product with p if our overall stage 1 bound B1 is less than p.

But you know all this already, thus I assert you are simply being a trollish, argumentative twit because such is your nature, as demonstrated in-depth in literally hundreds of threads on this forum. You appear to be the only one who thinks your behavior in this (and similar) threads is perfectly reasonable. Luckily you are not the arbiter of what constitutes acceptable standards of discourse here.
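The stage 1 product described above (highest prime powers not exceeding B1, salted with the Mersenne exponent when needed) can be sketched as follows. This is an illustration of the construction only, not Prime95/Mlucas code; real implementations process the primes on the fly rather than forming the huge integer E:

```python
import math

def is_prime(n):
    """Trial division; adequate for the small primes up to B1 used here."""
    if n < 2:
        return False
    for d in range(2, math.isqrt(n) + 1):
        if n % d == 0:
            return False
    return True

def stage1_exponent(B1, p=None):
    """Product of q**e over primes q <= B1, with q**e the largest power
    of q not exceeding B1 (i.e. lcm(1..B1)).  When factoring M(p), the
    exponent p is 'salted' in if p > B1, since factors of M(p) have the
    form 2*k*p + 1 and so q-1 is always divisible by p."""
    E = 1
    for q in range(2, B1 + 1):
        if is_prime(q):
            qk = q
            while qk * q <= B1:   # raise q to its highest power <= B1
                qk *= q
            E *= qk
    if p is not None and p > B1:
        E *= p
    return E
```

Stage 1 then computes b^E mod N for some base b and takes gcd(b^E − 1, N); a factor q is found whenever q−1 divides E, which is why capturing every prime power up to B1 is the right construction.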