#56
Mar 2004
Belgium
847₁₀ Posts

4100G => 4500G CedricVonck - Complete - Scored: 2.034.627,6536 for 1841 factors
4500G => 4550G CedricVonck - Reserved - 645 factors expected

Last fiddled with by ValerieVonck on 2007-03-22 at 19:11
#57
"Curtis"
Feb 2005
Riverside, CA
2×2,927 Posts

I believe sr1sieve is faster for this sequence than srsieve. Have any of you tried it yet? It only works on NewPGen-formatted files, so you might need to srfile your current data file into NewPGen format.

Has anyone calculated the sieve depth at which factors are found at the same rate as an LLR/Proth/PRP test (which is it for +1 numbers?) eliminates candidates? At 1M, 10M, or 33.5M exponent size? I'm quite curious how far you'll have to sieve before primality testing.

-curtis
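A rough way to estimate that crossover depth (a sketch only, based on the usual heuristic that sieving primes in [p, p·s] removes about N·(1 − ln p / ln(p·s)) of the N remaining terms): keep sieving while the time to find one more factor is less than the time of one primality test. The sieve throughput below is taken from the timings reported later in this thread; the per-test time is a made-up placeholder, not a measured LLR/Proth figure.

```python
import math

def expected_factors(terms_left, p_lo, p_hi):
    """Heuristic count of factors found when sieving primes in [p_lo, p_hi]."""
    return terms_left * (1 - math.log(p_lo) / math.log(p_hi))

def sec_per_factor(terms_left, p, p_per_sec, step=1.1):
    """Approximate seconds per new factor when sieving near depth p."""
    factors = expected_factors(terms_left, p, p * step)
    return (p * (step - 1) / p_per_sec) / factors

terms_left = 657301      # candidates remaining (from the data file in this thread)
p_per_sec  = 8.3e6       # sieve throughput reported below (p/sec on one core)
test_sec   = 3600.0      # placeholder: assumed time of one primality test

p = 4.0e12
while sec_per_factor(terms_left, p, p_per_sec) < test_sec:
    p *= 2
print(f"sieve to roughly p = {p:.2e} before switching to primality tests")
```

With these figures the estimate gives roughly 20-22 seconds per factor near p = 4T, in line with the "20 sec/factor" shown in the sr1sieve output quoted later in the thread.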
#58
Feb 2007
2⁴·3³ Posts

VBCurtis: I did use sr1sieve, see my command line in the previous (my second) post (17-mar-07). It IS fast: 8330278 p/sec - see also my previous post.

Cedric: Your command line is the same as mine in the cited post. This does not answer any of my questions.

Essentially, I found 50 factors for 4040-4050. But no one seems to be interested in these factors (file ...factor...). On the other hand, the ...data... file did not change: in spite of the factors found, the corresponding exponents are still in the ...data... file. This is what I don't understand. Once again: the data file did not change!

I do want to help you. I did a 10T range in 30 min, so I can do a 500T range or more (1000T in 4 days) during the next weekend. But I want confirmation that what I did was useful, since I don't see what the result was!
#59
Jun 2003
2²×11×37 Posts

I will reply to all the questions tomorrow; a little bit busy today.
Thanks for your patience.
#60
Mar 2003
New Zealand
1157₁₀ Posts

However, it is not really important to the project whether the input file is updated or not: as long as you report the factors recorded in <factors_file>, it can be updated later. Using an input file that hasn't been updated will just mean that some factors you find may be for terms that have already been eliminated, but it doesn't normally slow the sieve down at all.

If you think there is a bug, send me a copy of the input file and the exact command line used (the email address is in the README for sr1sieve) and I'll look into it.
#61
Mar 2004
Belgium
7×11² Posts

4500G => 4550G CedricVonck - Complete
#62
Feb 2007
660₈ Posts

Thanks for your patience. Maybe, after all, I made a mistake with the renamed original vs new files. 1000x sorry.

As punishment, I'll do 4010-4040 right now and send you factors in 2 hrs, then I'll do 4550-5000 for tomorrow, and send you the new data file (unless you prefer factors - I suppose you have a quick grep -v hack to patch the data from factors)?

PS: UPDATE: unable to connect to bigsharefile. :-( Does this occur frequently? So I'll do 4010-4040 with the old data file (as of 17-mar-07).

Last fiddled with by m_f_h on 2007-03-23 at 13:33
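As an aside, the "grep -v hack" mentioned above can be approximated with a few lines of Python. This is just a sketch, not an official project tool: the file names match the ones used later in this thread, factor lines are assumed to look like `4010038565569 | 43046721*2^8393056+1` as in sr1sieve's output, and the data file is assumed to be a NewPGen file with a one-line header whose term lines end with the exponent n (a "k n" pair or a bare n both work) - adjust the parsing if yours differs.

```python
import re

DATA_FILE    = "3_16data.txt"       # NewPGen-format sieve file (names as used in this thread)
FACTORS_FILE = "3_16factors.txt"    # factors found so far
OUT_FILE     = "3_16data.new.txt"   # patched copy with factored terms removed

# Collect the exponents n of all terms 43046721*2^n+1 that already have a factor.
factored = set()
with open(FACTORS_FILE) as f:
    for line in f:
        m = re.search(r"\*2\^(\d+)\+1", line)
        if m:
            factored.add(m.group(1))

with open(DATA_FILE) as src, open(OUT_FILE, "w") as dst:
    dst.write(src.readline())        # copy the NewPGen header line unchanged
    for line in src:
        fields = line.split()
        # Assumed layout: the exponent n is the last field on each term line.
        if not fields or fields[-1] not in factored:
            dst.write(line)
```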
#63
Feb 2007
2⁴·3³ Posts

./sr1sieve -i 3_16data.txt -o 3_16data.txt -f 3_16factors.txt -p4010000000000 -P4040000000000 --verbose

sr1sieve 1.0.15 -- A sieve for one sequence k*b^n+/-1.
L1 data cache 32Kb (default), L2 cache 512Kb (default).
WARNING: --pmin=4010000000000 from command line overrides pmin=4050000000000 from `3_16data.txt'
Read 657301 terms for 43046721*2^n+1 from NewPGen file `3_16data.txt'.
Split 1 base 2 sequence into 15 base 2^240 subsequences.
Recognised Generalised Fermat sequence A^16+1
Using 16 Kb for the baby-steps giant-steps hashtable, maximum density 0.21.
Using 256Kb for the Sieve of Eratosthenes bitmap.
Expecting to find factors for about 168.78 terms.
sr1sieve started: 1000000 <= n <= 49999888, 4010000000000 <= p <= 4040000000000
4010038565569 | 43046721*2^8393056+1
(...)
4039844927233 | 43046721*2^37829968+1
p=4039846680673, 8406338 p/sec, 170 factors, 99.49% done, 20 sec/factor
sr1sieve stopped: at p=4040000000000 because range is complete.
Wrote 657131 terms for 43046721*2^n+1 to NewPGen file `3_16data.txt'.
Found factors for 170 terms (expected about 168.78).

PS:
(1) - I CONFIRM RESERVATION OF 4550 - 5000
(2) - timing info for geoff: this range took me 45 minutes with 1 CPU @ 2Ghz
(3) - still cannot connect to sharebigfile - so I'll use the old data file for 4550-5000. sorry for duplicates

Last fiddled with by m_f_h on 2007-03-23 at 16:07
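The "Expecting to find factors for about 168.78 terms" line is consistent with the usual Mertens-style estimate N·(1 − ln p_min / ln p_max). A quick check (my own back-of-the-envelope, not sr1sieve's actual code):

```python
import math

terms = 657301                      # terms read from 3_16data.txt
p_min, p_max = 4.01e12, 4.04e12     # the sieved range

expected = terms * (1 - math.log(p_min) / math.log(p_max))
print(f"expected factors ~ {expected:.2f}")   # ~168.8, close to sr1sieve's 168.78 (170 were found)
```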
#64
Mar 2003
New Zealand
13·89 Posts
#65
Feb 2007
2⁴×3³ Posts

Fri Mar 23 12:18:14 2007 sr1sieve started: 1000000 <= n <= 49999888, 4550000000000 <= p <= 5000000000000
Sat Mar 24 03:12:16 2007 sr1sieve stopped: at p=5000000000000 because range is complete.
Sat Mar 24 03:12:17 2007 Found factors for 2073 terms (expected about 2119.48).

Factors file attached. (my) data file is now 11671394 bytes, 655059 lines.
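If the 655059-line figure includes the one-line NewPGen header (my assumption), the counts are self-consistent: the data file written after the 4010-4040 run above held 657131 terms, and 657131 − 2073 = 655058 terms, i.e. 655059 lines with the header.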
#66
"Erling B."
Dec 2005
69₁₆ Posts

I have made the sieving file available for K=7 on:
http://rafteikning.is/~prime/
Now we will see what happens.

k*b^n+1: sieved up to 12T. Remaining candidates: 1834
k*b^n-1: sieved up to 10T. Remaining candidates: 798