#1574

May 2009
Russia, Moscow
13·199 Posts
I've cracked ~400 C100s out of 2100 by ECM to t25, and I can confirm that almost all composites beginning with 1 or 2 are ECM-resistant.
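For anyone unfamiliar with the "t25" notation: a tXX pretest means enough ECM curves have been run at a suitable B1 bound that a factor of up to XX digits would very likely have been found. A minimal sketch of the commonly quoted B1/curve figures (these vary between GMP-ECM versions and B2 choices, so treat the numbers as approximate):

```python
# Commonly quoted ECM pretest levels: running roughly this many curves
# at the given B1 makes it unlikely that a factor of up to t digits was
# missed. Curve counts are approximate and differ between ECM versions.
T_LEVELS = {
    "t20": {"B1": 11_000, "curves": 86},
    "t25": {"B1": 50_000, "curves": 214},
    "t30": {"B1": 250_000, "curves": 430},
    "t35": {"B1": 1_000_000, "curves": 904},
}

def pretest_plan(t_level: str) -> str:
    """Summarize the approximate work needed to reach a t-level."""
    p = T_LEVELS[t_level]
    return f"{t_level}: run ~{p['curves']} curves at B1={p['B1']:,}"

print(pretest_plan("t25"))  # the level mentioned in the post above
```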

#1575

"Nuri, the dragon :P"
Jul 2016
Good old Germany
809 Posts
We just passed 1.2 billion numbers in the database.
Last fiddled with by MisterBitcoin on 2018-07-11 at 16:57

#1576

"Daniel Jackson"
May 2011
14285714285714285714
12278 Posts
I know I've probably asked for this already, but I think it's time to increase the global DB limits, and sooner than you think. Given that the largest known prime (M77232917) is far beyond the 62-megabit limit, is there any chance the DB size limit could be increased to 332 megabits, or even higher? It's extremely annoying to get the message "Error: Limit of about 10.000.000 digits exceeded" when I try to add 2^77232917-1 (the largest known prime) to the DB. That error is also inaccurate: both 2^43112609-1 and 2^57885161-1 are greater than 10,000,000 digits, and I could add them to the DB without any problem. It should say "Error: Limit of 18.663.860 digits (62.000.000 bits) exceeded". The limits on factorials (450000!, 2,348,517 digits) and primorials (5000000#, 2,170,852 digits) are also way too low; they should be as high as the limit for all other numbers. Here's what the limits should be:

Factorials: n = 14842907.
Overall limit: 332192810 bits.
Primorials: whatever the largest primorial below 100,000,000 digits is.
N+1 (or N-1) tests for PRPs: 1,000,000 digits (put them in a worker queue, so as to reduce DB load).

Eventually any limit will be surpassed as larger and larger primes are discovered, so the DB should NOT have a maximum limit anyway. When we do find the first billion-digit prime (within the next few decades), we SHOULD be allowed to add it to the DB without any errors at all.

@Syd: I miss the "Magnifying Glass" feature. It allowed me to run P-1 and ECM on composite numbers (up to a certain limit), and SIQS on numbers below 80 digits (I think it was 80; it's been so long since the feature was removed that I forget the exact cutoff), without using any of my own PC's CPU time (which I use for other things, such as Prime95 stress testing). Why was it removed in the first place? Could you PLEASE bring it back? That's been bothering me for quite a few years now.
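For context, the digit counts quoted above follow from simple log arithmetic: a number of b bits has at most floor(b·log10(2)) + 1 decimal digits. A quick sketch checking the figures in the post (the exponents are taken from the post itself):

```python
import math

LOG10_2 = math.log10(2)

def mersenne_digits(p: int) -> int:
    """Decimal digit count of 2^p - 1 (equal to that of 2^p for p > 1,
    since a power of two is never a power of ten)."""
    return math.floor(p * LOG10_2) + 1

def bits_to_digits(bits: int) -> int:
    """Maximum decimal digit count of a number that fits in `bits` bits."""
    return math.floor(bits * LOG10_2) + 1

print(mersenne_digits(77232917))    # M77232917 -> 23,249,425 digits
print(bits_to_digits(62_000_000))   # 62-megabit limit -> 18,663,860 digits
print(bits_to_digits(332_192_810))  # proposed limit -> ~100,000,000 digits
```

This confirms the post's arithmetic: the current 62,000,000-bit limit works out to exactly the 18,663,860 digits claimed, and 332,192,810 bits is the smallest bit count covering 100 million decimal digits.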

#1577

"Nuri, the dragon :P"
Jul 2016
Good old Germany
809 Posts
Computing power might be a problem. Maybe a RasPi system with 16 (or 32) RasPis would be enough; each task is slow, but more tasks can be started in parallel (e.g. 5 for certificates, 4 for factoring composites below 85 digits, 4 for checking "U" entries below 100,000 digits, and so on).
The FDB also needs a better SSD, let's say a ~1 TB SSD plus a 4 TB HDD for backups. I can help on that front; I could throw in a few hundred bucks (if needed).

#1578

Mar 2018
3·43 Posts
Why would you want M77232917 on FactorDB in the first place? What purpose would that serve?
I'd say the limits are actually higher than they need to be. And there is already a very visible lack of performance on large numbers with a large number of factors.

#1579

"Nuri, the dragon :P"
Jul 2016
Good old Germany
809 Posts
I'd like to see a new option: an "ECM pretest depth reached" field for composite numbers (as reported by the Perl script for automatic YAFU processing).
Is there any hope that something like that will become available at some point?

#1580

"Nuri, the dragon :P"
Jul 2016
Good old Germany
809 Posts
Quite a lot in the range from 95 up to 99 digits; I'll start one worker on that range.

#1581

"Ed Hall"
Dec 2009
Adirondack Mtns
11×347 Posts
I've been focused elsewhere the last few days. I'm sure that's allowed the buildup to be worse. I'm not sure when I may return to composites. Once we knocked the c100 group down a bit, I was less interested. Composite work is now more of a fallback for idle machines.

#1582

"Nuri, the dragon :P"
Jul 2016
Good old Germany
809 Posts
Quote:
I can extend my range from 90 up to 99 if needed; I can also run that range for a few weeks. My main focus will be searching for 2X- and 3X-digit factors of composite numbers with 121 digits.

#1583

"Ed Hall"
Dec 2009
Adirondack Mtns
11×347 Posts
Let's see how it goes over the next few days. I'm working on an Aliquot sequence and playing with ecmpi and cado-nfs. The LA portion leaves all but one machine free; I hope to have them automatically run the DB composites during that free time.

#1584

"Nuri, the dragon :P"
Jul 2016
Good old Germany
809 Posts
So many small composites from Aliquot sequences.
Trying to catch up. Still working 90-99 digits, and also starting 81-89 now.

Similar Threads
| Thread | Thread Starter | Forum | Replies | Last Post |
| Database for k-b-b's | 3.14159 | Miscellaneous Math | 325 | 2016-04-09 17:45 |
| Factoring database issues | Mini-Geek | Factoring | 5 | 2009-07-01 11:51 |
| database.zip | HiddenWarrior | Data | 1 | 2004-03-29 03:53 |
| Database layout | Prime95 | PrimeNet | 1 | 2003-01-18 00:49 |
| Is there a performance database? | Joe O | Lounge | 35 | 2002-09-06 20:19 |