[QUOTE=chris2be8;470389]It's not that lucky. I'm factoring about half of them by ECM, so 3 in a row is a 1 in 8 chance.
[/QUOTE] That was tempting fate. I only found 1 factor for the last 4 numbers (for (37357^43-1)/37356), and the cofactor is composite, which drags the success rate down a bit. Chris
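For anyone checking the arithmetic in the quoted post: with the roughly 1/2 per-number ECM success rate chris2be8 quotes, three hits in a row has probability (1/2)^3 = 1/8, and k hits out of n attempts follows a binomial distribution. A trivial sketch:

```python
from math import comb

# If roughly half of the candidates yield an ECM factor (the rate
# quoted above), a streak of three hits has probability (1/2)**3.
p = 0.5
print(p**3)                    # 0.125, i.e. 1 in 8

# More generally, the chance of exactly k hits in n independent tries:
def binom_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

print(binom_pmf(1, 4, p))      # exactly 1 factor in 4 tries: 0.25
```

The second print matches the follow-up observation of only 1 factor in the last 4 numbers: at a 1/2 rate that outcome has a 25% chance, so it is unlucky but not remarkable.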
[QUOTE=fivemack;470401] (my guess is that the Murphy E-value computed by msieve will be closely correlated to the sieving yield and the runtime; it might be a more useful figure of merit than 'SNFS digits')
[/QUOTE] It would probably be much better than my script's estimate based on SNFS digits, size of largest coefficient and degree. But I don't know how to get msieve to print the E-value without it doing anything else. And the script is generating polys for all the most wanted numbers, so even a short delay for each number would be annoying. Chris
[QUOTE=fivemack;470405]How much ECM has been done on (732541^47-1)/732540 ?[/QUOTE]
Found this old comment. [QUOTE=Pascal Ochem;364649]I do not keep track of the ECM work done. We can safely assume that the 37755 composites in t1600 have not been ECMed to the 40 digit level. You can work on a bunch of them, e.g. between lines 6200 and 6700, and get some factors.[/QUOTE] If a number is in the MWRB file, I would think it has much more attention.
[QUOTE=RichD;470564]
If a number is in the MWRB file, I would think it has much more attention.[/QUOTE] Haven't chris2be8's ECM finds been from the MWRB file?
[QUOTE=chris2be8;470289]See attachment...
I've excluded numbers that have already been factored. To get them in order of difficulty run: [code]sort -n -k 4 order.txt[/code] And in order of size run: [code]sort -n -k 9 order.txt[/code] The CPU hours are for a rather slow CPU; it will probably take about half as many core hours on a modern CPU. They are also only a rough estimate, so run times could vary from them by a factor of 2 or more. Chris[/QUOTE] I would be interested in the code that produces this (most notably the SNFS difficulty adjustments and the NFS-difficulty-to-CPU-time bits). I assume you track reservations/progress manually?
[QUOTE=Dubslow;470584]Haven't chris2be8's ECM finds been from the MWRB file?[/QUOTE]
Yes; if it is new to the file then it has had less ECM work, as stated [url=http://www.mersenneforum.org/showpost.php?p=468588&postcount=514]here[/url].
[QUOTE=Dubslow;470255]I made a small 50 liner to sort the roadblock file (by snfs difficulty, though it's easily modifiable).
[url]https://gist.github.com/dubslow/057407e71a8edda2bcb7541e73c0bb6e[/url] Converting it to work on the t files should be as easy as modifying lines 16, 36, and 37; sorting by gnfs difficulty=composite size or by weight is as easy as modifying 32. [/QUOTE] As much for my own reduction of skill rust as for future people's use, I've much improved this little [URL="https://gist.github.com/dubslow/057407e71a8edda2bcb7541e73c0bb6e"]gist[/URL]. Among other things, I realized the way I was controlling the sorting was quite stupid; doing so is now an order of magnitude simpler. It also autodetects whether it is passed a "t file" or an "mwrb file" (which differ only in that the latter includes a weight for each number).

The costs, such as they are, include doubling the code length, in no small part due to the very-much-overkill class hierarchy I made to handle the two file types, though I like to think this makes it both more maintainable and more extensible (not that I anticipate it ever needing extending; like I said, this is mostly for my own benefit). Though overkill, the code is hopefully very easy to read, even for non-Pythoneers.

Completely tangentially to the subject at hand, I am mildly curious how readable other people find it, including (or perhaps especially) those who aren't familiar with Python. I would welcome any such comments over PM. Anyways.
Use: [code]bill@Gravemind ~/bin $ ./opnfiles.py ~/Downloads/t600.txt
bill@Gravemind ~/bin $ head ~/Downloads/t600.txt.sort
159.7 159.2 70253804098533303996256039060114483059484940828633741987845788959049023444144179 2
162.1 159.5 1149986550472855579648408239822856912298581122489969299470029337872846236797481461 2
164.1 159.3 11724355815660124915535088311032487705863837675251193935172812233877609917572819351 2
171.8 165.1 75530533451743557525338385814839050445571761854876146017004062930526523488696122961481 2
177.5 172.9 54272081997719936694932737782603739858367553756705713569128530720565556404525135777737041 2
178.0 161.3 101575129733962903176164717219488895595781584956747683545056554733469676083500321620734233 2
180.8 180.3 2559748633561915802707442930459223465566261478367639818017337979434186929768942720172398161 2
183.6 175.1 62818938916943713724293329489477630385792505500459422757805502240535452147851295164091552943 2
192.3 189.8 549568273 22
192.9 162.7 2878319791561117685582532984481924989856693782709674033037185004142735435821967737094116024989323 2[/code]
[code]bill@Gravemind ~/bin $ ./opnfiles.py ~/Downloads/mwrb2100.txt
bill@Gravemind ~/bin $ head ~/Downloads/mwrb2100.txt.sort
191.9 191.9 7664 37061 42
191.9 191.9 7635 37159 42
192.0 192.0 7555 37357 42
192.1 192.1 7445 37579 42
192.3 192.3 7301 37963 42
192.4 192.4 7068 38201 42
192.6 192.6 7296 38609 42
192.6 192.6 6668 38611 42
193.5 193.5 5682 40387 42
193.8 193.8 5441 41051 42[/code]
And although the default sort is by zeroth-order SNFS difficulty (editable in the very first code line), you can also change the sort in an ad hoc manner via the command line (second argument, in quotes), this example sorting by which numbers have been the most factored so far:
[code]bill@Gravemind ~/bin $ ./opnfiles.py ~/Downloads/t600.txt "line.gnfs_difficulty() - line.snfs_difficulty()"
bill@Gravemind ~/bin $ head ~/Downloads/t600.txt.sort
243.2 145.3 127473943 30
268.6 179.3 307 108
252.6 168.2 262209281 30
248.5 164.8 506710914239632419773 12
267.4 185.7 1093 88
250.5 175.8 552781743698966779174737704265497702530829 6
252.0 178.2 1001523179 28
252.3 179.0 1024823381 28
256.7 183.6 86353 52
245.3 173.7 150332843 30[/code]
I hope to incorporate some variant of chris2be8's more sophisticated difficulty and effort estimations, which would render the ad hoc sorting eminently more useful. Perhaps it should also track (or at least not destroy) reservation information in the file (though such would have to be entered manually).
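The gist itself isn't reproduced here, but the core mechanic is small enough to sketch. The following is a hedged reconstruction, not the gist's actual code: the column layouts ("base exponent" for a t-file row, "weight base exponent" for an mwrb row) and the function names are my assumptions, and the real files' exponent convention may differ by one from the printed number.

```python
import math

def snfs_difficulty(base, exp):
    # Zeroth-order SNFS difficulty of base^exp - 1: the decimal
    # length of the full algebraic number, i.e. exp * log10(base).
    return exp * math.log10(base)

def parse_row(row):
    # Autodetect the file type by token count, as the gist does:
    # an mwrb row carries an extra leading weight column.
    tokens = [int(t) for t in row.split()]
    if len(tokens) == 3:               # mwrb row: weight base exp
        weight, base, exp = tokens
    else:                              # t-file row: base exp
        weight, (base, exp) = 0, tokens
    return weight, base, exp

# Sample rows matching the mwrb2100 output above.
rows = [parse_row(r) for r in ["7555 37357 42", "7664 37061 42"]]
rows.sort(key=lambda r: snfs_difficulty(r[1], r[2]))  # the default sort

for weight, base, exp in rows:
    print(f"{snfs_difficulty(base, exp):.1f} {weight} {base} {exp}")
```

Note the printed output lines above also carry a second column (the GNFS difficulty, i.e. the size of the remaining composite), which this sketch omits since the raw input rows don't contain it.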
1 Attachment(s)
[QUOTE=Dubslow;470588]I would be interested in the code that produces this (most notably SNFS difficulty adjustments, and NFS-diff-to-CPU-time bits). I assume you track reservations/progress manually?[/QUOTE]
See attachment. Note it uses phi to help build the .polys, so you need to have that installed. Run in the same dir as mwrb2100.txt to build .polys for all the numbers in mwrb2100.txt. Then get stats with: [code]grep SNFS m[0-9]* > stats
sort -gr -k 16 stats > order
sort -nr -k 14 stats > weights
sort -n -k 4 stats > diffs
sort -n -k 9 stats > gnfs[/code]The SNFS difficulties are adjusted for coefficient size etc. Chris NB. I do track reservations/progress manually. But I'm not planning to do any more numbers myself.
[QUOTE=Dubslow;470387]Evidently my memory fails me. ~3 days, though I perhaps got slightly unlucky with the ECM: Yafu found the P45, but not the P42 or P43, and thus switched to SNFS and probably overall wasted time... but hindsight is in this case 40/20.
[code]P45 = 151391679468422393528867290915149240097250107
P64 = 1486558991225034419006760671754467301035250060013637466479042797
P43 = 7085347601112630074165665314424610195957819
P42 = 491910406011895000965792057486858753525323[/code] I'll take the next three smallest available:
5366319547249^17-1
671717139553^19-1
24671431560073^17-1[/QUOTE] [QUOTE=chris2be8;470393]Those are a lot harder. I've added SNFS difficulty and CPU hours etc. estimated by my script to your post. Compare with: [code]37061^43-1 # SNFS diff: 207.601, degree 6, GNFS diff: 191.895, CPU hours: 579.188, weight: 7664, ratio 13.2323092955755[/code] So if that took 3 days you are doing the equivalent of about 200 CPU hours per day. So the next 3 might take you 100 days, 66 days and 300 days respectively. They are probably more suited to NFS@Home.[/QUOTE] Yes, that was perhaps more than I could chew. Having already spent the better part of a week on the first one doing ECM, yafu estimates ~40 days of sieving to hit minrels, probably closer to 50 once you account for my everyday usage. I hereby unreserve the latter two listed above, namely 671... and 246... . I will continue and complete 536... .
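The day counts chris2be8 quotes are straightforward throughput arithmetic, sketched below. The ~200 CPU-hours/day figure comes from his post; the per-job CPU-hour totals for the three harder numbers are not shown in the thread, so none are assumed here.

```python
# Back out the throughput from the one job with known numbers:
# ~579 CPU hours (per the script's estimate) done in ~3 wall-clock days.
throughput = 579.188 / 3       # about 193, rounded to ~200 in the post

def days_for(cpu_hours, rate=200.0):
    # Wall-clock days to finish a job at `rate` CPU-hours per day.
    return cpu_hours / rate

print(f"{throughput:.0f} CPU-hours/day")
```

At that rate, each of the 100/66/300-day estimates is just the script's CPU-hour figure for that number divided by 200.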
A few more pieces of low-hanging fruit have shown up in the MWRB file.
[CODE]64081603 30 C157
982015669 18 C162
2523203593 18 C170
53003 36 C171
53051 36 C171
53233 36 C171
53279 36 C171
53401 36 C171
53617 36 C171
3056720295076650541 10 C185
45289 40 C187
45953 40 C187
46183 40 C187
41341 42 C194
41953 42 C195
42013 42 C195
33403 46 C209
33589 46 C209
33827 46 C209
33997 46 C209
34039 46 C209
34171 46 C209
34301 46 C209[/CODE]
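As a hedged aside (this is an illustration, not the project's actual tooling): each row above appears to be "base exponent Cnnn", where Cnnn is the remaining composite cofactor of sigma(base^exponent) = (base^(exponent+1) - 1)/(base - 1); the second row, for instance, matches the 982015669^19-1 C162 reservation in the next post. A small parser comparing cofactor size against the zeroth-order SNFS difficulty:

```python
import math

# Hedged illustration: parse 'base exponent Cnnn' rows, where Cnnn is
# the composite cofactor of sigma(base^exp) = (base^(exp+1) - 1)/(base - 1).
rows = ["64081603 30 C157", "982015669 18 C162", "53003 36 C171"]

for row in rows:
    base_s, exp_s, cof = row.split()
    base, exp = int(base_s), int(exp_s)
    gnfs = int(cof.lstrip("C"))           # digits of the composite cofactor
    snfs = (exp + 1) * math.log10(base)   # zeroth-order diff of base^(exp+1)-1
    print(f"{base}^{exp + 1}-1: SNFS ~{snfs:.1f}, remaining C{gnfs}")
```

The "low hanging" part is visible in the numbers: these cofactors (C157 to C209) are small compared to most MWRB entries, so they are the cheapest remaining jobs whichever of GNFS or SNFS is used.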
I'll take 982015669^19-1 C162.