
mersenneforum.org (https://www.mersenneforum.org/index.php)
-   Factoring (https://www.mersenneforum.org/forumdisplay.php?f=19)
-   -   Pascal's OPN roadblock files (https://www.mersenneforum.org/showthread.php?t=19066)

RichD 2018-02-22 01:30

[QUOTE=lavalamp;480600]Ah, very well. Do you know about the status of [URL="http://www.factordb.com/index.php?query=%28777223234256745831445688101300524172301%5E7-1%29%2F777223234256745831445688101300524172300"]777223234256745831445688101300524172301^7-1[/URL]?[/QUOTE]

That is beyond the window I am looking at. My guess (gut feeling) is it still needs a bit of ECM before it is ready for NFS.

lavalamp 2018-02-22 09:04

I'm quite happy to do some ECM on it first, I'm not sure what the optimal depth would be though. Perhaps 45 or 50 digits?

hyramgraff 2018-02-22 16:08

Here are two more full factorizations from the t2100 file:

C814 = P24 * P791 [url]http://factordb.com/index.php?id=1100000000685526525[/url]

C709 = P28 * PRP681 [url]http://factordb.com/index.php?id=1100000000596692109[/url]

chris2be8 2018-02-22 16:31

34171^47-1 is done: [code]
p51 factor: 750065630746495891347078520927344182439087276088993
p158 factor: 47115858803957824371787060315158584426065865217944598936136624332513555797975014494412679255487210425958484446163395776371254319486232871740400297614168150669
[/code]
Chris

hyramgraff 2018-02-26 18:20

Here are four more full factorizations from the t2100 file:

C851 = P28 * PRP824 [url]http://factordb.com/index.php?id=1100000000685532095[/url]

C839 = P32 * PRP808 [url]http://factordb.com/index.php?id=1100000000685531216[/url]

C501 = P31 * PRP470 [url]http://factordb.com/index.php?id=1100000000689618127[/url]

C695 = P26 * PRP669 [url]http://factordb.com/index.php?id=1100000000685519287[/url]

lavalamp 2018-02-27 17:54

[QUOTE=hyramgraff;478756]I'm continuing my attempt to run ECM with B1=50e3 (25 digits) on all numbers in the t2100 file. With six cores devoted to this I'm still finding about ten factors per day.[/quote]I'm curious if this is still your current approach, and how far through the file you are. Are you running 1 curve per composite?

[QUOTE=hyramgraff;478756]By the way, is there a good open source program for creating a certificate of primality? I know about Primo but I don't want to install it on my machine because it's not open source.[/QUOTE]Just to signal boost this a bit as it seemed to get lost amid discussion earlier. I don't know of another, but honestly after I found primo I stopped looking. Surely there must be another though.
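The distinction behind the certificate question is that a PRP has only passed probabilistic tests, while a certificate (ECPP, Primo) constitutes a proof. As a hedged illustration of why "PRP" is weaker than "P", here is a minimal Miller-Rabin probable-prime test in Python (a sketch of the standard algorithm, not any particular program from this thread):

```python
import random

def is_probable_prime(n, rounds=20):
    """Miller-Rabin test: 'True' means probably prime, not proven prime."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
        if n % p == 0:
            return n == p
    # Write n - 1 = d * 2^s with d odd.
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False  # found a witness: n is definitely composite
    return True  # no witness found: probably prime, but no certificate
```

A certificate of primality (as produced by Primo or an ECPP implementation) replaces the "probably" with a verifiable proof.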

hyramgraff 2018-02-28 02:45

[QUOTE=lavalamp;481066]I'm curious if this is still your current approach, and how far through the file you are. Are you running 1 curve per composite?[/QUOTE]

Yes, I'm still running ECM with B1=50e3 (25 digits) on all numbers in the t2100 file. I'm running 216 curves per composite, which should be optimal for GMP-ECM 6.4.4.

I've finished running ~52,000 of the ~65,000 composites in the t2100 file. I'm letting make pick which numbers to run, so the remaining ~13,000 composites are randomly distributed. I found a factor for 381 different composites (although a few of those had already been reported to factordb).

Once I've finished ECM testing at 25 digits I plan to continue testing at 30 digits. Also, the scripts that I've written will make it easy for me to detect new entries in the t2100 file and get them up to the same level of ECM coverage. My goal is to get to a state where anyone who wants to do SNFS factoring can be confident that any composite in the t2100 file has been thoroughly tested to 40+ digits and is unlikely to have a small factor.
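The workflow described above can be sketched as a small Python driver. This is an illustrative sketch only, not the actual scripts mentioned in the thread: it assumes a hypothetical file `t2100.txt` with one composite per line, a GMP-ECM binary named `ecm` on the PATH, and GMP-ECM's usual "Factor found in step N:" output line.

```python
import re
import subprocess

COMPOSITES_FILE = "t2100.txt"   # hypothetical: one composite per line
B1, CURVES = "50e3", 216        # the t25 level discussed in the thread

# GMP-ECM reports factors on a line like
#   "********** Factor found in step 2: 7500656307..."
FACTOR_RE = re.compile(r"Factor found in step \d+: (\d+)")

def ecm_one(composite):
    """Run CURVES curves of ECM at B1 on one composite; return a factor or None."""
    proc = subprocess.run(
        ["ecm", "-c", str(CURVES), B1],
        input=composite, capture_output=True, text=True,
    )
    m = FACTOR_RE.search(proc.stdout)
    return m.group(1) if m else None

if __name__ == "__main__":
    with open(COMPOSITES_FILE) as fh:
        for line in fh:
            n = line.strip()
            if not n:
                continue
            factor = ecm_one(n)
            if factor:
                print(f"{n[:20]}...: factor {factor}")
```

In practice a make-driven setup like the one described has the advantage that interrupted runs resume cleanly, since make tracks which composites already have results.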

VBCurtis 2018-02-28 03:51

Your factors-found rate is low. That indicates you're doing an ECM level that has (mostly) already been done. If you're finding more factors of 27-30 digits than 24-26 digits, that also indicates that most of a t25 has already been done.

I'd skip to T30-sized curves immediately, and I'd only run 300 or so on each composite before skipping to B1=1M. I'd guess that B1=250k on the remaining 13,000 composites would be 30-50% more efficient at finding factors per unit time than your current B1=50k.

lavalamp 2018-02-28 20:03

[QUOTE=hyramgraff;481128]My goal is to get to a state where anyone who wants to do SNFS factoring can be confident that any composite in the t2100 file has been thoroughly tested to 40+ digits and is unlikely to have a small factor.[/QUOTE]Getting all 65k composites to 40 digits (or even 35 digits) is a hell of an undertaking, as I'm sure you know. If it isn't too much trouble, for the higher ECM levels can I suggest sorting the composites by either size or SNFS difficulty? That way the candidates that can feasibly be done by SNFS are fully ECM'd sooner, and the monsters that are out of reach of SNFS anyway can be run later.
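The sorting suggested above is simple to implement if decimal digit count is used as a rough proxy (actual SNFS difficulty depends on the algebraic form of each number, so this is only an approximation). A minimal sketch, again assuming a hypothetical one-composite-per-line `t2100.txt`:

```python
def sort_by_size(composites):
    """Order composite strings smallest first: (digit count, then value)."""
    return sorted(composites, key=lambda s: (len(s), s))

if __name__ == "__main__":
    with open("t2100.txt") as fh:            # hypothetical filename
        lines = [ln.strip() for ln in fh if ln.strip()]
    for c in sort_by_size(lines):
        print(c)
```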

Also, without trying it myself, here is an open source ECPP project you may be interested in:
[url]https://sourceforge.net/projects/gmp-ecpp/[/url]

henryzz 2018-03-02 09:26

C348 = P37 * C311 [url]http://factordb.com/index.php?id=1100000000838445673[/url]

hyramgraff 2018-03-02 17:02

Here are three more full factorizations from the t2100 file:

C645 = P30 * PRP616 [url]http://factordb.com/index.php?id=1100000000504439446[/url]

C575 = P29 * PRP546 [url]http://factordb.com/index.php?id=1100000000685483098[/url]

C700 = P26 * PRP674 [url]http://factordb.com/index.php?id=1100000000499311492[/url]

