[QUOTE=R.D. Silverman;253272]I will drop another ~10M relations in snail mail to you later today.
You should get them in ~2 days. Our firewall prevents a direct transfer of the data.[/QUOTE]Just wondering, can't you send them from home? Even a slow ADSL line would probably take only a couple of hours and would save the trouble of copying the files to CD. Maybe it's time to store relations in a more efficient format, or to write a separate utility that compresses them into a more suitable format than *zip?
Yes.
You can take the CD home, and from home [URL="http://www.chiark.greenend.org.uk/~sgtatham/putty/download.html"]PuTTY[/URL]'s psftp.exe (or, on Linux, sftp) is all you need; I'll send you the instructions by PM. (Especially since the US post has a day off today, I believe.) --Serge
[QUOTE=smh;253299]Maybe it's time to store relations in a more efficient format or a separate utility to compress them in a more suitable format then *zip?[/QUOTE]
One could store just the a,b coordinates for each relation, which would compress down to about 8.7 bytes per relation with bzip2. Or, you could squeeze it down further by recording the lattice basis vectors for each special-q,root pair and then storing the i,j lattice coordinates of each relation instead, which I would guess requires about half the storage. "Decompressing" would be slow in either case. This might make it viable to send through email, if that were the only available method of transmission, though 10M relations would still be quite large.
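To make the first idea concrete, here is a rough sketch in Python (my own toy encoding, not any actual siever or msieve format) of storing only the (a,b) pairs: sort by b, delta-encode, pack the deltas as varints, and bzip2 the result. All helper names and the synthetic data are invented for illustration.

```python
import bz2
import random

def varint(n: int) -> bytes:
    """Encode a non-negative integer as a little-endian base-128 varint."""
    out = bytearray()
    while True:
        byte = n & 0x7F
        n >>= 7
        out.append(byte | (0x80 if n else 0))
        if not n:
            return bytes(out)

def zigzag(n: int) -> int:
    """Map signed to unsigned so small magnitudes stay small (-1 -> 1, 1 -> 2)."""
    return (n << 1) ^ (n >> 63)

def pack_relations(rels) -> bytes:
    """rels: iterable of (a, b) pairs, a signed, b > 0.
    Sort by b (so b-deltas are non-negative), delta-encode both
    coordinates, pack as varints, then bzip2 the byte stream."""
    rels = sorted(rels, key=lambda r: (r[1], r[0]))
    buf = bytearray()
    prev_a = prev_b = 0
    for a, b in rels:
        buf += varint(zigzag(a - prev_a))
        buf += varint(b - prev_b)  # sorted by b, so this delta is >= 0
        prev_a, prev_b = a, b
    return bz2.compress(bytes(buf))

# Rough size check on synthetic pairs (real relations compress differently):
rels = {(random.randint(-2**33, 2**33), random.randint(1, 2**30))
        for _ in range(100000)}
blob = pack_relations(rels)
print(len(blob) / len(rels), "bytes per relation")
```

The a-deltas of random data are large, so this synthetic test is pessimistic; real (a,b) streams sorted by special-q cluster much more tightly.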
[QUOTE=jrk;253303]One could just store the a,b coordinates for each relation, which would compress down to about 8.7 bytes per relation with bzip2. Or, you could squeeze it down further by recording the lattice basis vectors for each specialq,root and then store the i,j lattice coordinates for each relation instead, which I guess would require about half the storage.[/QUOTE]
You don't need to record the basis vectors while you're sieving; you can recover them with a rolling lattice-basis reduction: keep the LLL-reduced basis of the last three x,y pairs, and most of the time you'll be able to write the next pair in terms of that basis; otherwise you output it in the clear.
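A toy version of that rolling-basis idea (my own sketch in Python, not the actual code): Gauss/Lagrange reduction of a 2D integer basis, plus a routine that tries to express a new x,y pair as integer coordinates in the current basis; a pair it can't express would be output in the clear and folded into a fresh basis.

```python
def gauss_reduce(u, v):
    """Lagrange/Gauss reduction of a 2D integer lattice basis.
    Assumes u, v are nonzero and linearly independent."""
    def norm2(w):
        return w[0] * w[0] + w[1] * w[1]
    u, v = (u, v) if norm2(u) <= norm2(v) else (v, u)
    while True:
        dot = u[0] * v[0] + u[1] * v[1]
        m = round(dot / norm2(u))           # nearest-integer projection
        v = (v[0] - m * u[0], v[1] - m * u[1])
        if norm2(v) >= norm2(u):
            return u, v
        u, v = v, u

def coords_in_basis(p, u, v):
    """Return integer (m, n) with p = m*u + n*v, or None if p is not
    in the lattice spanned by u, v (Cramer's rule with divisibility check)."""
    det = u[0] * v[1] - u[1] * v[0]
    if det == 0:
        return None
    m_num = p[0] * v[1] - p[1] * v[0]
    n_num = u[0] * p[1] - u[1] * p[0]
    if m_num % det or n_num % det:
        return None                         # would go out in the clear
    return m_num // det, n_num // det

# Example: reduce a basis from two recent pairs, then express a third pair
# in it; the coordinates (m, n) are much smaller than the pair itself.
u, v = gauss_reduce((7, 2), (9, 3))
print(coords_in_basis((5, -2), u, v))
```

When the coordinates stay small they take far fewer bits to store than the raw x,y pair, which is the whole point of the trick.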
113-114 is done and is uploading.
[QUOTE=R.D. Silverman;252673]I am having shoulder surgery on 2/24 to remove some bone spurs
and repair my rotator cuff.[/QUOTE] We wish you the best of luck with the procedure, Bob, and a fast recovery! The ETA for this project is late Friday night; don't worry about it.
The cofactor is a product of a p75 and a p105 and will appear on page 120 as Silverman+mersenneforum snfs. Thanks, everyone!
1 Attachment(s)
P.S. Here's the log (with a few filtering attempts at different points, and with a finely articulated cusp at 12.0M[SUP]2[/SUP]; the good matrix is 7.6M[SUP]2[/SUP], with about a 4[SUP]x[/SUP] shorter ETA).
I must be getting spoiled, this is the first big run I've seen in a long time that had to perform singleton removal from disk files. Apparently the estimate of RAM needed was just past the switchover point (half of total RAM).
Yeah, it works great (not everyone has 32GB of RAM :rolleyes:)
Here's a fragment from 3,610+'s log ...
[FONT=Arial Narrow]Tue Jan 4 08:12:25 2011 commencing duplicate removal, pass 1
Tue Jan 4 08:42:56 2011 found 10205937 hash collisions in 216381785 relations
Tue Jan 4 08:43:59 2011 commencing duplicate removal, pass 2
Tue Jan 4 08:47:58 2011 found 21 duplicates and 216381765 unique relations
Tue Jan 4 08:47:58 2011 memory use: 756.8 MB
Tue Jan 4 08:47:58 2011 reading ideals above 720000
Tue Jan 4 08:47:58 2011 commencing singleton removal, initial pass
Tue Jan 4 09:31:56 2011 memory use: 5512.0 MB
Tue Jan 4 09:31:56 2011 removing singletons from LP file
Tue Jan 4 09:31:56 2011 start with 216381765 relations and 182327021 ideals
Tue Jan 4 09:35:35 2011 pass 1: found 44887504 singletons
Tue Jan 4 09:37:07 2011 pass 2: found 8615107 singletons
Tue Jan 4 09:38:34 2011 pass 3: found 1627313 singletons
Tue Jan 4 09:40:02 2011 pass 4: found 290576 singletons
Tue Jan 4 09:42:51 2011 pruned dataset has 160961265 relations and 122666568 large ideals
Tue Jan 4 09:42:51 2011 reading all ideals from disk
Tue Jan 4 09:43:57 2011 memory use: 6481.4 MB
Tue Jan 4 09:44:54 2011 keeping 121915663 ideals with weight <= 200, target excess is 867008
Tue Jan 4 09:45:59 2011 commencing in-memory singleton removal
Tue Jan 4 09:46:46 2011 begin with 160961265 relations and 121915663 unique ideals
Tue Jan 4 09:53:03 2011 reduce to 160900260 relations and 121854650 ideals in 7 passes
...[/FONT]
Great stuff compared to filtering with ideals of weight <= 40 only, as in the past (and the results are identical to those from a large-memory machine). Many thanks!