Volunteer needed for sieve merging
I need someone to coordinate the sieve data files.
Right now, the plan is for each siever to upload the new, smaller output file to [url]http://www.transferbigfiles.com/Default.aspx[/url] once the siever is done with the range. Someone will then download the files from that site and use NewPGen's "merge across several machines" feature to merge them, which will produce a new, smaller sieve file. None of my computers have enough RAM to handle that task, so I'm looking for someone who has that memory. Whoever volunteers for this should have a large amount of RAM (at least 1 GB), a fast connection (download speed of 1000 Kbps or better), enough hard disk space (about 10-20 GB), and a relatively fast processor (P4 3 GHz or better). The large size of those files means that plenty of RAM is the most important requirement. If you'd like to volunteer for this, post in this thread with your computer's specifications.
[QUOTE=MooooMoo;95106]I need someone to coordinate the sieve data files.
Right now, the plan is for each siever to upload the new, smaller output file to [url]http://www.transferbigfiles.com/Default.aspx[/url] once the siever is done with the range. Someone will then download the files from that site and use NewPGen's "merge across several machines" feature to merge them, which will produce a new, smaller sieve file. None of my computers have enough RAM to handle that task, so I'm looking for someone who has that memory. Whoever volunteers for this should have a large amount of RAM (at least 1 GB), a fast connection (download speed of 1000 Kbps or better), enough hard disk space (about 10-20 GB), and a relatively fast processor (P4 3 GHz or better). The large size of those files means that plenty of RAM is the most important requirement.

Edit: My connection is about 350-400 KB/sec (bytes rather than bits; I'm too lazy to do the math ;) )

Edit: It would probably be a good idea for people to zip their files.

If you'd like to volunteer for this, post in this thread with your computer's specifications.[/QUOTE]

I have 2 GB of RAM, the necessary hard drive space, and a 2.8 GHz Pentium D (dual-core). I'm thinking I should use a Live CD to perform this task so that I have maximum RAM available. Would the Live CD be able to access the Windows files, or would I have to get creative? Not that I'm demanding to do it. :)
Rather than shifting around massive sieve files in order to use NewPGen's "merge across several machines" feature, just ship all the divisor files. This is what we did with 321. Once all the ".del" files are in a directory, a quick "cat", "sort", "uniq" and the magic of a Perl script will do the merge -- and more, such as verification via Pari/GP! :wink:
If it helps, here are the 321 scripts.

sh_321_get_latest_sieve_file.sh:
[CODE]# Collect the n values of all eliminated candidates from the .del files
cat *.del | cut -f3 -d= | sort -n | uniq > delete_file
# Strip those n values out of the master sieve file
perl my321_remove_deleted.pl > latest_sieve_file
# Turn the reported factors into Pari/GP checks and run them
cat *.del | sort | uniq | perl my321_check_divisors.pl > check_file
gp < check_file[/CODE]

my321_remove_deleted.pl:
[CODE]#!/usr/bin/perl
# Merge pass: both delete_file and the sieve file are sorted by n,
# so a single linear scan removes all eliminated candidates.
open DELETE_FILE, "delete_file" or die "delete_file: $!";
open SIEVE_FILE, "sieved_to_1M.txt" or die "sieved_to_1M.txt: $!";

# Pass the NewPGen header line through unchanged
$sieve_head = <SIEVE_FILE>;
print $sieve_head;

# Skip any blank leading lines in delete_file
$delete_value = <DELETE_FILE>;
while (defined($delete_value) && $delete_value =~ /^\s*$/) {
    $delete_value = <DELETE_FILE>;
}

while ($output = <SIEVE_FILE>) {
    @sieve_line = split ' ', $output;
    # Advance the delete list until it catches up with this candidate's n
    while (defined($delete_value) && $sieve_line[1] > $delete_value) {
        $delete_value = <DELETE_FILE>;
    }
    # Keep the candidate unless its n is on the delete list
    print $output
        unless defined($delete_value) && $sieve_line[1] == $delete_value;
}
close SIEVE_FILE;
close DELETE_FILE;[/CODE]

And from Thomas, my321_check_divisors.pl:
[CODE]#!/usr/bin/perl -w
while (<>) {
    # p=53375470179683 divides n=4259058
    #   $1                        $2
    if (m/p=(\d+) divides n=(\d+)/) {
        # Emit a Pari/GP line that re-checks p | 3*2^n - 1
        print("if(lift(3*Mod(2,$1)^$2-1)!=0,print(\"$1 $2 bad\"))\n");
    } else {
        print STDERR "What's $_";
    }
}[/CODE]
I didn't manage to get NewPGen to write a newpgen.del file, but I have written two C programs which:

1. find the differences between two sieved files, producing a small "removed k" file;
2. merge the "removed k" file back into the other sieved file.

This method does not guard against cheating or errors (though NewPGen has a "verify" option), but it is effectively the same as uploading the whole sieve file, only much cheaper in traffic. Maybe someone could post instructions to force NewPGen to write a .del file?
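gribozavr didn't post his C programs, so as a rough illustration only, here is a minimal Perl sketch of what the "diff" step could look like. The file names old.txt and new.txt are hypothetical stand-ins, and the assumed layout (a one-line NewPGen header followed by one candidate per line) may not match the actual files.

[CODE]#!/usr/bin/perl -w
# Sketch of the "diff" step: print every candidate line that appears in
# old.txt but is missing from new.txt, i.e. the k's removed by sieving.
use strict;

open my $old, '<', 'old.txt' or die "old.txt: $!";
open my $new, '<', 'new.txt' or die "new.txt: $!";
<$old>; <$new>;    # skip the NewPGen header lines

# Remember every candidate that survived into the newer file
my %survives;
while (<$new>) { chomp; $survives{$_} = 1; }

# Anything in the old file that no longer survives was removed
while (<$old>) {
    chomp;
    print "$_\n" unless $survives{$_};
}[/CODE]

The merge step is just the reverse: read the "removed k" file into a hash and copy the other sieve file through, dropping any matching lines. A hash is simple but memory-hungry for huge files; a single linear pass over the two sorted files, as in the 321 script above, avoids that.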
gribozavr:
To get a .del file, just click Options -> Log the numbers removed. You have to do this every time you start NewPGen, and it'll only log numbers with factors > 2^32.
[QUOTE=gribozavr;95126]
Maybe someone could post instructions to force NewPGen to write a .del file?[/QUOTE]

./newpgen -h should help you. The following is an example I ran from a batch file:

[CODE]cd /home/paul/newpgen
./newpgen -osp=32000000000000 -ol -oa -ci=321_3-5M.txt -cp=321_3-5M.txt[/CODE]

It's the "-ol" that does the logging. For the windoze version you select "log removed numbers" under Options -- but you may well have to do this every time you use it.
[QUOTE]DSM_LogDeleted = 1
[/QUOTE] in NewPGen.ini might help :unsure:
Surely the .del files are the way to go. I've been logging removed factors... I just selected NewPGen's "log" option (Windows version). I currently have about 120,000 factors from 15T in a 5 MB file. That beats the heck out of merging output files; mine is still 742 MB.

Why all this emphasis on cheaters... where's the trust in the world?
paulunderwood, thanks a lot. I didn't even think that the new NewPGen has a command-line mode. For me, it is much more comfortable than the menu. And ./newpgen -ol did the trick.

jmblazek: It's not really an emphasis... that's just one more thing to think about. But returning verifiable results is always better: factors are better than just k's. Maybe people could keep the files for now and report just the ranges they sieve? When a twin is found, or LLR reaches k ~ 20G, we can collect the results and merge them all at once.
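Since factors are what make results verifiable, a checker along the lines of the 321 one above would apply here too. The sketch below is an assumption-laden illustration only: it presumes the twin candidates have the form k*2^N +/- 1 with a fixed N (195000 is a guess, not confirmed in this thread) and that the removal log uses lines like "p=... divides k=...", analogous to the 321 format shown earlier.

[CODE]#!/usr/bin/perl -w
# Hypothetical checker for twin-search removal logs.  N and the log line
# format are assumptions; adjust both to the project's real values.
use strict;
use Math::BigInt;

my $N = 195000;    # assumed fixed exponent of the candidates k*2^N +/- 1

while (<>) {
    next unless m/p=(\d+) divides k=(\d+)/;    # assumed log format
    my ($p, $k) = ($1, $2);
    my $P = Math::BigInt->new($p);
    # r = k * 2^N mod p.  The factor is genuine iff r == 1 (p divides
    # k*2^N - 1) or r == p - 1 (p divides k*2^N + 1).
    my $r = Math::BigInt->new(2)->bmodpow($N, $P)->bmul($k)->bmod($P);
    print "$p $k bad\n" unless $r == 1 || $r == $P - 1;
}[/CODE]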