#441
Jan 2011
Cincinnati, OH
2²×5² Posts
Thanks for your feedback. It does help to understand some of the nuances of this project. The math may be above me, but I am learning, so thank you.
#442
Jun 2010
Pennsylvania
947 Posts
Quote:

In terms of both output and cost-effectiveness, it sounds so much superior to "ordinary" CPU crunching. I can't think of a better argument for putting one's GPU to work. So, why aren't more people doing this -- what's the catch?

Rodrigo
#443
"James Heinrich"
May 2004
ex-Northern Ontario
1000010110101₂ Posts
#444
Jun 2010
Pennsylvania
1110110011₂ Posts
Quote:

I suppose one could buy the proper kind of card to put in a dedicated PC (not used for anything else). That would take care of the last three points. But in that case the second point would be a problem, as you'd have to keep going back to the "unattended" computer to get work and report results.

Much appreciated.

Rodrigo
#445
Banned
"Luigi"
Aug 2002
Team Italia
5×7×139 Posts
Quote:

Luigi
#446
Jun 2010
Kiev, Ukraine
3·19 Posts
I've failed the 4th point, so I'm somewhat "a bit disappointed" :(
#447
Bemusing Prompter
"Danny"
Dec 2002
California
2³×313 Posts
I have a few questions about P-1 factoring:
1. I know that allocating more memory increases the chance of finding a factor (although the law of diminishing returns quickly kicks in after 1 GB or so). Does this just affect the chance of finding a factor, or does it also increase the size of a potential factor?

2. It's not unusual for large factors found by P-1 to be composite. In fact, most P-1 factors with 40 or more digits will split into two further factors. That having been said, has anyone ever found one that split into three factors?

3. Speaking of #2, are there any good ways of finding large prime P-1 factors?
#448
"James Heinrich"
May 2004
ex-Northern Ontario
7·13·47 Posts
Quote:

You can play around with the balance of bounds, probability and RAM requirements on my P-1 probability calculator. The basic rule of thumb is that 10-30x the exponent, in bytes, is the suggested amount of RAM for a good P-1 (less than that will do a less-thorough P-1, more than that doesn't add much benefit). So if you're working on M50xxxxxx, then 500MB-1500MB is good.

Now I step back and let other, wiser people answer the rest of your questions.
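To make the arithmetic concrete, here is a minimal sketch of that rule of thumb, assuming the 10-30x figure is counted in bytes (which is what the M50xxxxxx example above implies). The helper name is made up for illustration, not anything from Prime95:

[code]
# Minimal sketch of the "10-30x the exponent" RAM rule of thumb.
# Assumption: the multiplier applies to the exponent counted in bytes,
# matching the 500MB-1500MB figure quoted for M50xxxxxx.

def suggested_p1_ram_mb(exponent):
    """Return (low, high) suggested RAM in MB for P-1 on M<exponent>."""
    low_bytes = 10 * exponent   # below this: a less-thorough P-1
    high_bytes = 30 * exponent  # above this: little added benefit
    return low_bytes / 1e6, high_bytes / 1e6

low, high = suggested_p1_ram_mb(50_000_000)
print(f"suggested RAM: {low:.0f} MB - {high:.0f} MB")  # 500 MB - 1500 MB
[/code]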
#449
Sep 2010
Scandinavia
3·5·41 Posts
#2. For any given count, numbers exist whose P-1 factor splits into that many prime factors. I would bet a large sum of money that a three-way split has happened at some point in Mersenne history.

#3. I don't think there is a way to do that. P-1 finds factors p for which p-1 is smooth to the specified bounds. The higher the bounds, the more likely it is that more than one factor is found. So strictly speaking... the lower the bounds, the more likely it is that the factor you find is prime rather than composite. But since lower bounds also mean a lower chance of finding any factor at all, that's just not a good idea. Finding factors with smooth p-1 is what P-1 does. ECM can find factors that are practically impossible to find with P-1. And of course there are several other algorithms, all with their own strengths and weaknesses.

Disclaimer: By posting this I'm not implying that I'm any wiser than James.
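To show mechanically what "smooth to the specified bounds" means, here is a toy stage-1 P-1 sketch in Python. It is a textbook illustration only, nothing like Prime95's optimized implementation, and the composite and bound below are invented for the demo:

[code]
# Toy stage-1 Pollard P-1: compute a^E mod n where E is the product of
# all prime powers q^k <= B1, then take gcd(a^E - 1, n). Any factor p
# of n with p-1 power-smooth to B1 divides that gcd.

from math import gcd

def small_primes(limit):
    # Trial division is fine for a toy bound.
    return [n for n in range(2, limit + 1)
            if all(n % d for d in range(2, int(n**0.5) + 1))]

def p_minus_1_stage1(n, b1, base=3):
    a = base
    for q in small_primes(b1):
        qk = q
        while qk * q <= b1:   # largest prime power q^k <= B1
            qk *= q
        a = pow(a, qk, n)     # a <- a^(q^k) mod n
    g = gcd(a - 1, n)
    return g if 1 < g < n else None

# 9001 - 1 = 2^3 * 3^2 * 5^3 is power-smooth to 125, but 263 - 1 = 2 * 131
# is not, so B1 = 130 picks out only the smooth factor of 9001 * 263:
print(p_minus_1_stage1(9001 * 263, 130))  # -> 9001
[/code]

The same mechanism explains the composite results in question #2: if two prime factors of the same number both have p-1 smooth to the bounds, the gcd returns their product in one shot.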
#450
"Brian"
Jul 2007
The Netherlands
2·11·149 Posts
Quote:
#451
Dec 2007
Cleves, Germany
212₁₆ Posts
It would be extremely nice if double-checked exponents >10M could actually be reserved for TF/P-1. I'm obviously not the only one currently processing such exponents "blindly", at the risk of duplicating someone else's work.