[quote=Mini-Geek;202575]... Most likely, the text editor was just doing what it was told. If you didn't copy it in a way that included the newline at the end, (and didn't put it there yourself) then it doesn't know there should be a newline at the end, and it won't write a newline at the end. In any case, shouldn't msieve be able to read it even without the final newline? (just to not avoid the question: I used Metapad. I don't know what Joshua2 used, but I'd guess it's a pretty common occurrence.)[/quote]
This is an issue I really have to keep track of here, using both winXP and linux machines. For the linux ones, I need to ensure there isn't anything extra, while the winXP ones need an extra CR/LF. If I mess up in either direction the particular machine scolds me. :smile:

[SIZE=1]-----------------------------------------------------------------------------------------------------------------------------------------------------------------------
For those who may have noticed that my avatar is spinning counter to the proper rotation, it is an optical illusion, due to the incompatible shutter speed with which the galaxy was originally recorded.[/SIZE] |
The RSALS server has now received almost 3400 results out of 3900.
|
[QUOTE=Mini-Geek;202575]Yeah, I noticed. Hence my edit. :smile: I thought about taking out what I posted that was wrong, but never did.[/QUOTE]
Sorry, I didn't see that part earlier:

[QUOTE=Mini-Geek;202510]Edit: This was before I read fivemack's post...he's right, it's just the lack of a new blank line at the end of the msieve.fb file. Oddly, with the 4 extra lines, it doesn't matter if there's a blank line at the end. Weird.[/QUOTE]

By adding the extra lines, you added a newline at the end of the last coeff line as well, which is all that mattered. It didn't matter whether there was a newline after the extra lines, because they were going to be ignored anyway.

FWIW, I sent Jason a patch that allows msieve to work in the absence of a final newline. The patch is copied below:

[code]Index: gnfs/poly/poly.c
===================================================================
--- gnfs/poly/poly.c	(revision 177)
+++ gnfs/poly/poly.c	(working copy)
@@ -93,7 +93,7 @@
 	/* read one coefficient per line; 'R<number>' is
 	   for rational coefficients, 'A<number>' for algebraic */
 
-	while (!feof(fp) && (buf[0] == 'R' || buf[0] == 'A')) {
+	while ((buf[0] == 'R' || buf[0] == 'A')) {
 		signed_mp_t *read_coeff;
 		char *tmp;
 
@@ -114,7 +114,8 @@
 			read_coeff->sign = POSITIVE;
 		}
 		mp_str2mp(tmp, &read_coeff->num, 10);
-		fgets(buf, (int)sizeof(buf), fp);
+		if (fgets(buf, (int)sizeof(buf), fp) == NULL)
+			break;
 	}
 
 	for (i = MAX_POLY_DEGREE; i >= 0; i--) {[/code] |
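The loop shape in that patch is the classic fix for the `while (!feof(...))` anti-pattern: test the return value of `fgets()` itself, so a final line without a trailing newline is still processed once before the loop exits. A minimal standalone sketch of the same pattern (a simplified toy reader, not msieve's actual parser):

```c
#include <stdio.h>

/* Count 'R'/'A' coefficient lines, stopping cleanly at EOF.
   Because we test fgets()'s return value rather than feof(),
   a last line that lacks a final '\n' is still counted. */
int count_coeff_lines(FILE *fp) {
    char buf[256];
    int n = 0;
    if (fgets(buf, (int)sizeof(buf), fp) == NULL)
        return 0;
    while (buf[0] == 'R' || buf[0] == 'A') {
        n++;                                /* process the coefficient here */
        if (fgets(buf, (int)sizeof(buf), fp) == NULL)
            break;                          /* EOF: exit without reusing stale buf */
    }
    return n;
}
```

With the old `!feof()` test, the loop body could run one extra time on the stale buffer; checking `fgets()` directly avoids that entirely.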
[URL=http://www.sendspace.com/file/uf8qmu]111-115 done[/URL]: 3845345 relations
Reserving 115-119M. |
[URL="http://www.sendspace.com/file/535dx5"]4788_2509_40-41M.tar.bz2[/URL] - 1237326 relations
|
My range of 67-70M is due to finish tomorrow morning, I can upload it in ~24 hours from now.
|
110M-111M done: [url]http://www.sendspace.com/file/inqbba[/url]
110.0-110.5: 483196 relations
110.5-111.0: 481140 relations

edit: [B]reserving 119.0-119.4[/B], this should be done by tomorrow evening. |
As of 8:30 p.m. CET, 3567/3900 WUs have been returned.
I had progressively decreased the WU timeout to 150000 seconds. That was arguably too low for some users (e.g. computers turned off over a weekend), so I recently increased it to 3.5 days - but the unwanted side effect is that the last WUs for a given integer are returned disappointingly slowly... |
[QUOTE=Andi47;202729]
edit: [B]reserving 119.0-119.4[/B], this should be done by tomorrow evening.[/QUOTE] Seems that I did my edit just after 10metreh's update in the first posting. I'm now quoting my edit because I think that an edit in an "old" post would be easily missed for the next update... |
[quote=Andi47;202739]Seems that I did my edit just after 10metreh's update in the first posting. I'm now quoting my edit because I think that an edit in an "old" post would be easily missed for the next update...[/quote]
Noted. I noticed it before I noticed your post, but it was probably still good that you posted again to bring attention to it. :smile: |
[code]:\4788\dat\all>wc * -l
 1237326 4788_2509_40-41M
  483196 alq4788.2509_110.111_a.out
  481140 alq4788.2509_110.111_b.out
  478703 gnfs_111000000-111500000-alg.dat
  484066 gnfs_111500000-112000000-alg.dat
  481304 gnfs_112000000-112500000-alg.dat
  482151 gnfs_112500000-113000000-alg.dat
  488965 gnfs_113000000-113500000-alg.dat
  477739 gnfs_113500000-114000000-alg.dat
  477586 gnfs_114000000-114500000-alg.dat
  474831 gnfs_114500000-115000000-alg.dat
 6047007 total[/code]
any idea what the 180mb file msieve.dat.lp is in my msieve folder? result of filtering I did or something? |
[quote=Joshua2;202747]any idea what the 180mb file msieve.dat.lp is in my msieve folder? result of filtering I did or something?[/quote]
Yep. I don't know exactly what it is, but it is created at very early stages of filtering. |
should I keep it or delete it
|
[QUOTE=Joshua2;202749]should I keep it or delete it[/QUOTE]
Doesn't matter. If it's there, msieve will zap it the next time you filter, anyway. I am in the habit of keeping my relations files in a subdirectory, with no other type of file allowed in that directory. It makes it easier to do relation counts and backups, and it reduces the chance of accidental deletion. When I'm ready to filter, I just cat them all into a single .dat file in the main directory, and then filter using that file. |
could someone explain this output to me?
[code]commencing duplicate removal, pass 1
error -15 reading relation 33810676                        //whats this?
found 10351595 hash collisions in 57228325 relations       //what is a relation vs a free relation vs unique relation
added 122614 free relations
commencing duplicate removal, pass 2
found 9668004 duplicates and 47682935 unique relations
reading ideals above 50003968
commencing singleton removal, initial pass
commencing in-memory singleton removal
begin with 47682935 relations and 62501316 unique ideals
reduce to 1596536 relations and 753242 ideals in 26 passes //what is an ideal
max relations containing the same ideal: 8
reading ideals above 30000
commencing singleton removal, initial pass
keeping 5218095 ideals with weight <= 200, target excess is 8625 //what is weight
commencing in-memory singleton removal
begin with 1665848 relations and 5218095 unique ideals
reduce to 34 relations and 0 ideals in 4 passes
max relations containing the same ideal: 0
filtering wants 1000000 more relations
elapsed time 00:23:30[/code] |
This won't be completely enlightening, but it's the best I can do in a couple of minutes.
[QUOTE=Joshua2;202776]error -15 reading relation 33810676 //whats this?[/QUOTE]
One of the relations is naughty: perhaps an error in the format, or the (a,b) pair doesn't factor over the primes named in the relation.

[QUOTE]found 10351595 hash collisions in 57228325 relations //what is a relation vs a free relation vs unique relation[/QUOTE]
Relations are lists of primes (numbers or ideals) which factor the non-linear polynomial value for the value a, and the linear polynomial for the value b. Free relations are the ones which fall out of the factor base without any sieving - there are only a few of them. Unique relations are what's left over after the duplicates have been dumped.

[QUOTE]reduce to 1596536 relations and 753242 ideals in 26 passes //what is an ideal[/QUOTE]
An ideal is a subset of the real numbers lying between Q and R with some nice properties. A semester of abstract algebra plus a basic course in algebraic number theory should tell you all you need to know :razz: For our purposes, an ideal is sort of a prime number, but in a ring (look it up!) which is not the integers but shares some abstract properties with the integers.

[QUOTE]keeping 5218095 ideals with weight <= 200, target excess is 8625 //what is weight[/QUOTE]
Weight is the number of relations in which an ideal makes an appearance. Again, think of relations as lists of primes which factor the values of our non-linear and linear polynomials, where, in the case of the non-linear polynomial, "primes" means "ideals".

This is a simple (!) explanation of what is going on here, and certainly contains some inaccuracies. I have no doubt that someone will step in to correct any errors I have made - this forum is not known for being chock full of painfully-reticent shrinking violets. :smile::nuke: |
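The idea that "a relation is a list of primes which factor the polynomial value" can be made concrete with a toy smoothness test: trial-divide the value over a small factor base, and record a relation only if it factors completely. This is an illustration with made-up numbers and function names, not msieve code:

```c
/* Toy smoothness test (illustration only, not msieve code).
   A rational-side "relation" records the primes dividing the
   value of the linear polynomial at a given (a,b) pair. */

/* Trial-divide v over a tiny factor base fb[0..nfb-1]; returns 1
   if v factors completely ("smooth"), storing its factors in out[]
   and the factor count in *nout. */
int factor_over_base(long v, const long *fb, int nfb,
                     long *out, int *nout) {
    *nout = 0;
    if (v == 0)
        return 0;
    if (v < 0)
        v = -v;
    for (int i = 0; i < nfb; i++) {
        while (v % fb[i] == 0) {   /* peel off each prime power */
            out[(*nout)++] = fb[i];
            v /= fb[i];
        }
    }
    return v == 1;  /* a leftover cofactor means "not smooth" */
}
```

With factor base {2, 3, 5, 7}, the value 126 = 2·3·3·7 yields the "relation" (2, 3, 3, 7), while 22 = 2·11 is rejected because 11 lies outside the base.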
67-70 done, I will upload the relations tonight.
[CODE]67-68.5: 1711677 relations
68.5-70: 1708114 relations
[B]total:[/B] 3419791 relations[/CODE] |
A few small corrections to what FactorEyes wrote: Both the rational and algebraic polynomials are evaluated at both the 'a' and 'b' values for each relation; the value of the rational polynomial at (a,b) is factored into primes, and the value of the algebraic polynomial at (a,b) is factored into 'things' that have a 1:1 correspondence with ideals.
Your job as the postprocessing guy is to choose a collection of relations such that every prime and every 'thing' appears an even number of times. Then the number field that generated the algebraic polynomial has a 'trap door' that moves from number-field-land back to the integers. This gives you two different numbers whose squares are identical modulo the number N being factored, which leads to a factor of N about half the time.

Just to be confusing, the filtering calls both primes and 'things' [i]ideals[/i], as a proxy for things-that-must-be-made-even. The primes and the things are all tossed into a giant bucket and treated identically.

The job of the filtering is to set up a huge linear algebra problem, by throwing away relations whose primes-and-things occur too often to be useful. The ones that are left lead to a linear algebra problem that is smaller and better behaved. Solving the resulting matrix leads to the collection of relations described above.

When the filtering works, it will be because the number of unique relations exceeds the number of unique ideals by at least a certain amount, leading to a matrix that is large enough. The difference between the number of relations and the number of ideals (what the code calls 'the excess') determines the minimum size of matrix that can be made from the current dataset.

Also, the .lp file is basically a compressed representation of each relation, which is useful for speeding up the filtering. It's recreated from scratch whenever the filtering runs. |
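The "trap door" back to the integers ends in the classical congruence of squares - a textbook NFS/QS fact, not anything specific to this job:

```latex
x^{2} \equiv y^{2} \pmod{N}
\quad\Longrightarrow\quad
N \mid (x - y)(x + y)
\quad\Longrightarrow\quad
\gcd(x - y,\, N) \ \text{is a nontrivial factor of } N
\ \text{unless}\ x \equiv \pm y \pmod{N}.
```

The exceptional case $x \equiv \pm y \pmod{N}$ occurs for roughly half of the independent solutions, which is why each dependency from the matrix yields a factor "about half the time".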
[url=http://www.sendspace.com/file/k48btl]115-119 done[/url]: 3811160 relations.
|
[QUOTE=Andi47;202815]67-70 done, I will upload the relations tonight.
[CODE]67-68.5: 1711677 relations
68.5-70: 1708114 relations
[B]total:[/B] 3419791 relations[/CODE][/QUOTE]
119-119.4 done:
119.0-119.2: 186858 relations
119.2-119.4: 190525 relations

edit: link to a tar.bz2 archive which holds the ranges 67-70 and 119-119.4: [url]http://www.sendspace.com/file/hgsp8g[/url] |
Nearly 3800 out of 3900 results for RSALS.
To whom should I send the information that enables downloading the data directly from the RSALS server? |
[quote=debrouxl;202846]Nearly 3800 out of 3900 results for RSALS.
To whom should I send the information that enables downloading the data directly from the RSALS server ?[/quote] Joshua2. Here's the link to PM him: [url]http://www.mersenneforum.org/private.php?do=newpm&u=1286[/url] |
What is the status of postprocessing? We should have enough relations by now?
|
[QUOTE=Andi47;203085]What is the status of postprocessing? We should have enough relations by now?[/QUOTE]
Joshua2 has probably started the linalg but is unable to do anything on the computer because of it. Looking at the "Last Activity" on his profile, that may well be the case. |
[code]memory use: 3906.5 MB
saving the first 48 matrix rows for later
matrix is 9794898 x 9795123 (2873.1 MB) with weight 741367806 (75.69/col)
sparse part has weight 655209426 (66.89/col)
matrix includes 64 packed rows
using block size 65536 for processor cache size 4096 kB
commencing Lanczos iteration (4 threads)
memory use: 3070.6 MB
linear algebra completed 62469 of 9795123 dimensions (0.6%, ETA 336h34m)[/code] |
[QUOTE=Joshua2;203137][code]
linear algebra completed 62469 of 9795123 dimensions (0.6%, ETA 336h34m)[/code][/QUOTE] 2 weeks on the nose seems about right. If you have a crash - most likely due to a power failure, because msieve linear algebra is very reliable - make sure you restart with the -ncr flag, and not -nc or -nc2, so you don't overwrite the checkpoint (*.chk) file. This may have changed since the last time I checked, which was several releases ago, but I wouldn't want to find out the hard way that it hadn't. |
[QUOTE=Joshua2;203137][code]memory use: 3906.5 MB
saving the first 48 matrix rows for later
matrix is 9794898 x 9795123 (2873.1 MB) with weight 741367806 (75.69/col)
sparse part has weight 655209426 (66.89/col)
matrix includes 64 packed rows
using block size 65536 for processor cache size 4096 kB
commencing Lanczos iteration (4 threads)
memory use: 3070.6 MB
linear algebra completed 62469 of 9795123 dimensions (0.6%, ETA 336h34m)[/code][/QUOTE]
So 3.9 GB for building the matrix, and 3 GB for the matrix step itself. Thus, anything above ~c165 will probably need more than 4 GB.

BTW: What was the total relations count (raw and unique)? |
Well, my guess for a week of LA was within a factor of two of being right. :lol:
|
Don't know which numbers you mean, so here is the whole log:
[code]found 23571633 hash collisions in 104490602 relations
added 4 free relations
commencing duplicate removal, pass 2
found 24543879 duplicates and 79946727 unique relations
memory use: 660.8 MB
reading ideals above 83820544
commencing singleton removal, initial pass
memory use: 1506.0 MB
reading all ideals from disk
memory use: 1401.5 MB
commencing in-memory singleton removal
begin with 79946727 relations and 75124962 unique ideals
reduce to 38203445 relations and 27416635 ideals in 18 passes
max relations containing the same ideal: 26
reading ideals above 720000
commencing singleton removal, initial pass
memory use: 753.0 MB
reading all ideals from disk
memory use: 1430.9 MB
keeping 36943195 ideals with weight <= 200, target excess is 205761
commencing in-memory singleton removal
begin with 38203451 relations and 36943195 unique ideals
reduce to 37901371 relations and 36640758 ideals in 14 passes
max relations containing the same ideal: 200
removing 3780522 relations and 3380522 ideals in 400000 cliques
commencing in-memory singleton removal
begin with 34120849 relations and 36640758 unique ideals
reduce to 33866739 relations and 33002482 ideals in 11 passes
max relations containing the same ideal: 194
removing 2760613 relations and 2360613 ideals in 400000 cliques
commencing in-memory singleton removal
begin with 31106126 relations and 33002482 unique ideals
reduce to 30946183 relations and 30479952 ideals in 9 passes
max relations containing the same ideal: 186
removing 1525944 relations and 1298396 ideals in 227548 cliques
commencing in-memory singleton removal
begin with 29420239 relations and 30479952 unique ideals
reduce to 29367177 relations and 29128103 ideals in 7 passes
max relations containing the same ideal: 180
relations with 0 large ideals: 672
relations with 1 large ideals: 754
relations with 2 large ideals: 11635
relations with 3 large ideals: 125059
relations with 4 large ideals: 740444
relations with 5 large ideals: 2630834
relations with 6 large ideals: 5886789
relations with 7+ large ideals: 19970990
commencing 2-way merge
reduce to 18229991 relation sets and 17990917 unique ideals
commencing full merge
memory use: 2084.7 MB
found 9830576 cycles, need 9799117
weight of 9799117 cycles is about 685979439 (70.00/cycle)
distribution of cycle lengths:
1 relations: 1397966
2 relations: 1371671
3 relations: 1304319
4 relations: 1122293
5 relations: 934181
6 relations: 783598
7 relations: 645832
8 relations: 521104
9 relations: 419278
10+ relations: 1298875
heaviest cycle: 22 relations
commencing cycle optimization
start with 50378352 relations
pruned 1034751 relations
memory use: 1722.2 MB
distribution of cycle lengths:
1 relations: 1397966
2 relations: 1401124
3 relations: 1347708
4 relations: 1141268
5 relations: 947827
6 relations: 784477
7 relations: 642297
8 relations: 513240
9 relations: 410654
10+ relations: 1212556
heaviest cycle: 22 relations
RelProcTime: 4943
commencing linear algebra
read 9799117 cycles
cycles contain 29145139 unique relations
read 29145139 relations
using 20 quadratic characters above 1073741568
building initial matrix
memory use: 3906.5 MB
read 9799117 cycles
matrix is 9798940 x 9799117 (2985.7 MB) with weight 924467746 (94.34/col)
sparse part has weight 665081683 (67.87/col)
filtering completed in 2 passes
matrix is 9794946 x 9795123 (2985.4 MB) with weight 924333719 (94.37/col)
sparse part has weight 665055415 (67.90/col)
read 9795123 cycles
matrix is 9794946 x 9795123 (2985.4 MB) with weight 924333719 (94.37/col)
sparse part has weight 665055415 (67.90/col)
saving the first 48 matrix rows for later
matrix is 9794898 x 9795123 (2873.1 MB) with weight 741367806 (75.69/col)
sparse part has weight 655209426 (66.89/col)
matrix includes 64 packed rows
using block size 65536 for processor cache size 4096 kB
commencing Lanczos iteration (4 threads)
memory use: 3070.6 MB
linear algebra at 0.0%, ETA 336h 3m
linear algebra completed 190000 of 9795123 dimensions (1.9%, ETA 310h42m)[/code]
The ETA is dropping fast. |
[quote]begin with 79946727 relations and 75124962 unique ideals
reduce to 38203445 relations and 27416635 ideals in 18 passes[/quote] Out of curiosity, was such a large cut (more than half !) on the number of usable relations expected at that early stage ? |
The large number of singleton removal passes means that you don't have very much excess beyond the minimum needed to construct the matrix; if you had sieved for slightly longer then many more relations would survive. Unfortunately, experience shows that the matrix must be very large before the time saved solving a smaller matrix exceeds the extra sieving effort.
|
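jasonp's point about the many singleton removal passes can be sketched in a few lines of C. This is a toy version with hypothetical data and names, nothing from msieve: a "singleton" is an ideal appearing in exactly one surviving relation; such a relation can never pair up, so it is dropped, and dropping it can orphan an ideal that was previously shared - which is why a low-excess dataset needs many passes before the set stabilizes.

```c
/* Toy sketch of iterative singleton removal (not msieve's code). */

#define NREL 5

/* each relation lists its ideals, terminated by -1 */
static int rels[NREL][4] = {
    {2, 3, -1},
    {3, 5, -1},
    {5, 7, -1},
    {7, 11, -1},   /* 11 is a singleton: this goes, then 7 is orphaned */
    {2, 3, 5, -1},
};

/* Prune until stable; alive[r] is 1 while relation r survives.
   Returns the number of passes (the last pass finds no change). */
int singleton_removal(int alive[NREL]) {
    int passes = 0, changed = 1;
    while (changed) {
        int count[32] = {0};   /* ideal -> occurrences among survivors */
        changed = 0;
        passes++;
        for (int r = 0; r < NREL; r++)
            if (alive[r])
                for (int i = 0; rels[r][i] >= 0; i++)
                    count[rels[r][i]]++;
        for (int r = 0; r < NREL; r++) {
            if (!alive[r]) continue;
            for (int i = 0; rels[r][i] >= 0; i++)
                if (count[rels[r][i]] == 1) {
                    alive[r] = 0;   /* relation contains a singleton */
                    changed = 1;
                    break;
                }
        }
    }
    return passes;
}
```

On this toy dataset, pass 1 drops relation {7, 11} (11 is a singleton), pass 2 then drops {5, 7} (7 has become a singleton), and pass 3 finds nothing more to remove - a miniature version of the "reduce to ... in 18 passes" cascade in the log.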
[code]linear algebra completed 3853266 of 9795123 dimensions (39.3%, ETA 187h 2m)[/code]
A day ahead of schedule?
|
[QUOTE=jasonp;203689]The large number of singleton removal passes means that you don't have very much excess beyond the minimum needed to construct the matrix; if you had sieved for slightly longer then many more relations would survive. Unfortunately, experience shows that the matrix must be very large before the time saved solving a smaller matrix exceeds the extra sieving effort.[/QUOTE]
Does that mean it might fail? |
I don't think anything has gone wrong; it's just that most projects for inputs this large have had a little more sieving. Your matrix should work out fine.
|
OK, 92.2%, ETA 24 hrs. Going home for the weekend; result Sunday/Monday.
|
The matrix size for the previous c161 of iteration 2483 was
6676995 x 6677221 but the matrix dimensions for this c163 is 9794898 x 9795123 Could only just TWO digits cause a significant difference in the size of matrix that is being produced up simply? :alien: PS: My post processing jobs for 6,355- 2,935- have been submitted up into the compute cluster. Their details will only be known after the entire thing has been completed up. |
[QUOTE=Raman;204756]The matrix size for the previous c161 of iteration 2483 was
6676995 x 6677221 but the matrix dimensions for this c163 is 9794898 x 9795123 Could only just TWO digits cause a significant difference in the size of matrix that is being produced up simply? :alien:[/QUOTE] The c161 was slightly oversieved, but this c163 was not. |
C16x matrix sizes
I am factoring up a C162 (size 5.2x10^161). I did not sieve it much up past the minimum to get up a workable matrix. The matrix has up dimensions around 8430000 by 8430000.
I would not sieve it any more up. The linear algebra is taking up a long time to finish up, but longer time to sieve up the matrix does not bring up corresponding savings in the time you need up to Lanczos it up. |
[QUOTE]Last fiddled with by 10metreh on 07 Feb 10 at 12:35 PM Reason: adding a few more ups ;) [/QUOTE]
I have a problem with moderators editing posts. If you dislike the tone of my post, you can confront me, or PM me about it. If you can't refrain from abusing your moderator privileges, then perhaps you should find another forum which welcomes your hundreds of irrelevant "me-too" postings. |
[QUOTE=FactorEyes;204803]If you can't refrain from abusing your moderator privileges, then perhaps you should find another forum which welcomes your hundreds of irrelevant "me-too" postings.[/QUOTE]
Out of my last 25 postings (excluding this one), 21 are definitely of some use, 3 more could possibly be deleted, and the last was a post that gave me the opportunity for a little joke. I admit that out of my first 500 posts probably about 20 were useful, but I have changed a lot since then. OK, the edit was useless. I admit it. |
[url]http://factorization.ath.cx/search.php?se=1&aq=4788&action=last20[/url]
|
Well done!
|
Excellent!
And, the base composite is still getting smaller: 218...[sub]<172>[/sub], 212...[sub]<172>[/sub], 206...[sub]<172>[/sub], 205...[sub]<172>[/sub]. :smile: Shhhhh! Don't scare it... Take Care, Ed |
[quote=EdH;204893]Excellent!
And, the base composite is still getting smaller: 218...[sub]<172>[/sub], 212...[sub]<172>[/sub], 206...[sub]<172>[/sub], 205...[sub]<172>[/sub]. :smile: Shhhhh! Don't scare it... Take Care, Ed[/quote] Line 2510 starts with 2^6 * 59. It will rise in the next index. Probably to about 2097...[sub]<172>[/sub] (from [URL="http://factordb.com/search.php?se=1&aq=2059364571286410677020543484408287892989855438065912821638727573659611264191658972984692429222265590897324972168961605068543456232356190666778897778782385787610683529527104"]this[/URL], which is like the line but with p63*p63 that are almost the c125). |
[quote=Mini-Geek;204901]Line 2510 starts with 2^6 * 59. It will rise in the next index. Probably to about 2097...[sub]<172>[/sub] (from [URL="http://factordb.com/search.php?se=1&aq=2059364571286410677020543484408287892989855438065912821638727573659611264191658972984692429222265590897324972168961605068543456232356190666778897778782385787610683529527104"]this[/URL], which is like the line but with p63*p63 that are almost the c125).[/quote]
See, now you scared it...:smile: Heading back to the subproject... |