OK, I ran it against the 8x region and it did remove >17.5k, but I'm afraid I flooded Chris's region below 76 digits. Sorry about that!
I'll refrain from doing any more until I see the comments from you guys... |
It only seems to have added a few hundred 70-75 digit numbers. Which is not a problem for me. Adding a few thousand would be though.
When you add a list of small factors for a number, I think factordb creates a new ID after dividing out each factor in turn. So factoring one such number can add 20+ IDs, which is why it's easy to get locked out. Go ahead searching for numbers that factor easily, just pace it so you don't swamp factordb with numbers below 70 digits faster than it can factor them. Chris |
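Chris's pacing advice amounts to a simple throttle between submissions. A minimal sketch of that idea is below; `submit` is a hypothetical callable standing in for however you actually post results to factordb, and the 5-second delay is purely illustrative, not a recommended rate:

```python
import time

def paced_submit(factors, submit, delay=5.0, sleep=time.sleep):
    """Report factoring results one at a time, pausing between
    submissions so the server's own workers can keep up.

    `submit` is a placeholder for whatever actually posts the result
    (an HTTP form post, a script, etc.); `delay` is seconds to wait
    between posts. Returns the number of items submitted.
    """
    sent = 0
    for item in factors:
        submit(item)               # post one result
        sent += 1
        if sent < len(factors):
            sleep(delay)           # pace: don't swamp the db
    return sent
```

The point is only that the waiting happens on the client side, between posts, rather than dumping the whole list at once.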
Why am I getting "This page isn't working"? Here's the full error:
[code]This page isn’t working www.factordb.com didn’t send any data. ERR_EMPTY_RESPONSE[/code] |
This link [url]http://factordb.com/[/url] works for me.
|
Weird. It must be a transient error, because I was reporting factors for sequence [url=http://www.factordb.com/sequences.php?se=1&aq=2892465727171322&action=last20&fr=0&to=100]Shakespere[sub]36[/sub][/url] and I got that error the first time I attempted to post the factors. When I tried again, it worked, but the page didn't load completely, and I got the error when I tried to load the sequence again. It worked fine after that.
|
[QUOTE=chris2be8;469583]It only seems to have added a few hundred 70-75 digit numbers. Which is not a problem for me. Adding a few thousand would be though.
When you add a list of small factors for a number, I think factordb creates a new ID after dividing out each factor in turn. So factoring one such number can add 20+ IDs, which is why it's easy to get locked out. Go ahead searching for numbers that factor easily, just pace it so you don't swamp factordb with numbers below 70 digits faster than it can factor them. Chris[/QUOTE] I did run the 9x region and included the C100s, since there were over 70k (which also means not all are accessible). I ended up clearing just over 19k composites, and the db (and your workers) seemed to handle everything sent down without getting bogged down. Perhaps tomorrow I'll work a little further. If I can get a handle on the PRPs, I might move some more machines back, since I'd have less chance of being locked out. |
I've looked at a few of what look like useless C100s that someone added (i.e. those with IDs starting around 1100000000968200000). I factored a few with nearby IDs in the hope of observing a pattern in the primes. They all yielded P50*P50, but I couldn't see any obvious pattern. It looks like it will be a real pain to clear all this junk.
|
[QUOTE=sean;471654]I've looked at a few of what look like useless C100s that someone added (i.e. those with IDs starting around 1100000000968200000). I factored a few with nearby IDs in the hope of observing a pattern in the primes. They all yielded P50*P50, but I couldn't see any obvious pattern. It looks like it will be a real pain to clear all this junk.[/QUOTE]
Could you PM me some of these IDs? I've been tinkering with CADO, and wish to factor a set of C100s to mess with parameters and try to improve the default CADO parameter suggestions. I've factored a C116 four times, improving the time each go, by editing default CADO params to be more like typical factmsieve params. I'm starting to think that CADO is slower mostly because of wonky default params. I've taken ~15% off the factoring time from the default C115 params, which seems like about the gap between past CADO tests and GGNFS tests. If it's easier, just tell me how to find these numbers; I've only used FactorDB to update aliquot sequences. I might just do a few C105s and C110s also, to see if those CADO param files are decent or deficient. |
[QUOTE=VBCurtis;471664]Could you PM me some of these IDs? I've been tinkering with CADO, and wish to factor a set of C100s to mess with parameters and try to improve the default CADO parameter suggestions. I've factored a C116 four times, improving the time each go, by editing default CADO params to be more like typical factmsieve params. I'm starting to think that CADO is slower mostly because of wonky default params. I've taken ~15% off the factoring time from the default C115 params, which seems like about the gap between past CADO tests and GGNFS tests.
If it's easier, just tell me how to find these numbers; I've only used FactorDB to update aliquot sequences. I might just do a few C105s and C110s also, to see if those CADO param files are decent or deficient.[/QUOTE] If you find improvements, pass them on to the CADO development team. I need to try CADO again; I wonder whether it will compile/run on WSL. |
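For anyone wanting to try the same kind of tuning: CADO-NFS reads its per-size defaults from files like `parameters/factor/params.c100` in the source tree, and "editing default params" means overriding keys of roughly this shape. The values below are illustrative placeholders only, not tuned recommendations:

```
# Excerpt-style overrides for a params.c100 file -- values are
# examples of the knobs involved, not suggested settings.
tasks.polyselect.degree = 4    # polynomial degree for this size
tasks.lim0 = 1500000           # rational factor base bound
tasks.lim1 = 2000000           # algebraic factor base bound
tasks.lpb0 = 25                # large prime bits, side 0
tasks.lpb1 = 26                # large prime bits, side 1
tasks.sieve.mfb0 = 50          # cofactor bound bits, side 0
tasks.sieve.mfb1 = 52          # cofactor bound bits, side 1
tasks.I = 11                   # sieve region size
```

These are the same categories of parameters (factor base limits, large prime bounds, sieve area) that one would adjust in a GGNFS/factmsieve job file, which is why porting values between the two is a natural experiment.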
For some reason I cannot upload a huge certificate, factorDB keeps giving me an error. Or rather, my browser says the page cannot be reached (this is after several minutes of uploading). I tried both Firefox and MS Edge.
At the same time, EdH has uploaded certificates, so factorDB seems to work. Is it the size of the certificate (a single number, more than 100MB uncompressed)? I tried uploading compressed and uncompressed, but the error is always the same. EDIT: MS Edge says the page cannot be reached. Firefox says the page uses an unknown or invalid compression. Again, this happens with the outfile compressed and uncompressed. |
And, I have uploaded .zip files of lots of certificates at once that were over 100MB with no issues, as well as my normal stream of one at a time. I haven't been having any troubles that weren't caused by me, (e.g. forgetting to check the zip file box). I don't believe any of my individual certificates were over 1MB, though.
I also just tried a single certificate, zipped, and it processed without trouble. Of course, that certificate was well under 1MB, so my following query is probably moot. I had wondered what would happen if you zipped your original together with a small second one, just to have more than one certificate in the archive; but my test showed it working fine with a single zipped certificate from here. Is it possibly a timeout issue? I know it takes several minutes to receive a response from the db when I send a zipped file of smaller certificates. I hope you get this resolved soon. I'm interested to see what you have. |
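The two-certificates-in-one-zip experiment suggested above is easy to script. This sketch only builds the archive locally (the upload itself still goes through the FactorDB form with the zip-file box checked); the file names are hypothetical:

```python
import zipfile

def bundle_certs(paths, out_path="certs.zip"):
    """Pack one or more primality-certificate files into a single
    .zip archive, e.g. a large certificate plus a small dummy one,
    so the upload contains more than one entry."""
    with zipfile.ZipFile(out_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for p in paths:
            zf.write(p)
    return out_path
```

If the failure really is a server-side timeout on large bodies, deflate compression at least shrinks what has to cross the wire, since certificates are highly compressible text.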