mersenneforum.org Volunteer needed for sieve merging

2007-01-01, 03:26   #1
MooooMoo
Apprentice Crank

Mar 2006

2·227 Posts

Volunteer needed for sieve merging

I need someone to manage the coordination of the sieve data files. Right now, the plan is to have each siever upload the new, smaller output file to http://www.transferbigfiles.com/Default.aspx once the siever is done with the range. Someone will then download the files from that site and use NewPGen's "merge across several machines" feature to merge them, which will produce a new, smaller sieve file.

None of my computers have enough RAM to handle that task, so I'm looking for someone who does. Whoever volunteers for this should have a large amount of RAM (at least 1 GB), a fast connection (download speed of 1000 Kbps or better), enough hard disk space (about 10-20 GB), and a relatively fast processor (P4 3 GHz or better). However, given the size of those files, plenty of RAM is the most important requirement.

If you'd like to volunteer for this, post on this thread with your computer's specifications.
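[Editor's note: for anyone wondering what the merge actually computes -- each machine's output file keeps only the candidates that survived its p-range, so the merged file is the intersection of all the survivor lists. A rough sketch of that idea with standard Unix tools; the filenames, the 'HEADER' line, and the toy "k n" data are made up for illustration, not NewPGen's actual file format.]

```shell
# Toy survivor lists: one header line, then "k n" pairs.
# (Hypothetical files; a real NewPGen header encodes the sieve type and depth.)
printf 'HEADER\n3 100\n5 100\n7 100\n' > machineA.txt
printf 'HEADER\n3 100\n7 100\n9 100\n' > machineB.txt

# Strip the headers; comm(1) requires sorted input.
tail -n +2 machineA.txt | sort > A.body
tail -n +2 machineB.txt | sort > B.body

# Keep one header, then keep only candidates present in BOTH lists.
head -n 1 machineA.txt > merged.txt
comm -12 A.body B.body >> merged.txt
```

With more than two machines you would fold each additional list in with another `comm -12` pass.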
2007-01-01, 05:45   #2
jasong

"Jason Goatcher"
Mar 2005

5×701 Posts

Quote:
Originally Posted by MooooMoo
I need someone to manage the coordination of the sieve data files. Right now, the plan is to have each siever upload the new, smaller output file to http://www.transferbigfiles.com/Default.aspx once the siever is done with the range. Someone will then download the files from that site and use NewPGen's "merge across several machines" feature to merge them, which will produce a new, smaller sieve file.

None of my computers have enough RAM to handle that task, so I'm looking for someone who does. Whoever volunteers for this should have a large amount of RAM (at least 1 GB), a fast connection (download speed of 1000 Kbps or better), enough hard disk space (about 10-20 GB), and a relatively fast processor (P4 3 GHz or better). However, given the size of those files, plenty of RAM is the most important requirement.

Edit: My connection is about 350-400 KB/sec (bytes, rather than bits. I'm too lazy to do the math ;) )

Edit: It would probably be a good idea for people to zip their files.

If you'd like to volunteer for this, post on this thread with your computer's specifications.
I have 2 GB of RAM, the necessary hard drive space, and a 2.8 GHz Pentium D (dual-core). I'm thinking I should use a Live CD to perform this task so that I have maximum RAM available. Would the Live CD be able to access the Windows files, or would I have to get creative?

Not that I'm demanding to do it. :)

Last fiddled with by jasong on 2007-01-01 at 06:00

2007-01-01, 07:11   #3
paulunderwood

Sep 2002
Database er0rr

37×97 Posts

Rather than shifting around massive sieve files in order to use NewPGen's "merge service", just ship all the divisor files. This is what we did with 321. Once all the ".del" files are in a directory, a quick "cat", "sort", "uniq" and the magic of a "perl" script will do the merge -- and more, such as verification via Pari/GP!

Last fiddled with by paulunderwood on 2007-01-01 at 07:12
2007-01-01, 07:46   #4
paulunderwood

Sep 2002
Database er0rr

37×97 Posts

If it helps,

sh_321_get_latest_sieve_file.sh:
Quote:
cat *.del | cut -f3 -d= | sort -n | uniq > delete_file
perl my321_remove_deleted.pl > latest_sieve_file
cat *.del | sort | uniq | perl my321_check_divisors.pl > check_file
gp < check_file
my321_remove_deleted.pl:
Quote:
#!/usr/bin/perl
open DELETE_FILE, "delete_file";
open SIEVE_FILE, "sieved_to_1M.txt";
# pass the NewPGen header line straight through
$sieve_head = <SIEVE_FILE>;
print $sieve_head;
$delete_value = <DELETE_FILE>;
while ($delete_value == "") { $delete_value = <DELETE_FILE>; }
while ($output = <SIEVE_FILE>) {
    @sieve_line = split ' ', $output;
    # advance the delete list until it catches up with this sieve line
    while ($sieve_line[1] > $delete_value && !eof(DELETE_FILE)) {
        $delete_value = <DELETE_FILE>;
    }
    if ($sieve_line[1] != $delete_value) { print $output; }
}
close SIEVE_FILE;
close DELETE_FILE;
And from Thomas, my321_check_divisors.pl:
Quote:
#!/usr/bin/perl -w
while (<>) {
    # p=53375470179683 divides n=4259058
    #   $1                      $2
    if (m/p=(\d+) divides n=(\d+)/) {
        print("if(lift(3*Mod(2,$1)^$2-1)!=0,print(\"$1 $2 bad\"))\n");
    } else {
        print STDERR "What's \$_";
    }
}

2007-01-01, 13:20   #5
gribozavr

Mar 2005
Internet; Ukraine, Kiev

11·37 Posts

I didn't manage to get NewPGen to write a newpgen.del file, but I have written two C programs which:
1. find the differences between two sieved files, producing a small "removed k" file;
2. merge the "removed k" file back into another sieved file.
This method does not prevent cheating or errors (though NewPGen has a "verify" option), but it is effectively the same as uploading the whole sieve file, just cheaper in traffic. Could someone post instructions on how to force NewPGen to write a .del file?
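[Editor's note: gribozavr's two C programs aren't posted, but the diff/merge idea can be sketched with standard Unix tools. The filenames and toy "k n" data below are made up for illustration: step 1 subtracts the after-file from the before-file to get the small "removed k" list, and step 2 subtracts that list from any other copy of the sieve file.]

```shell
# Hypothetical sieve files: one header line plus "k n" pairs (toy data).
printf 'HEADER\n3 100\n5 100\n7 100\n9 100\n' > before.txt
printf 'HEADER\n3 100\n7 100\n' > after.txt

# Strip headers; comm(1) requires sorted input.
tail -n +2 before.txt | sort > before.body
tail -n +2 after.txt  | sort > after.body

# 1. Difference of the two sieved files -> small "removed k" file to upload.
comm -23 before.body after.body > removed_k

# 2. Merge the "removed k" file back into a copy of the sieve file.
head -n 1 before.txt > merged.txt
comm -23 before.body removed_k >> merged.txt   # merged.txt now matches after.txt
```

Here removed_k carries the same information as the full output file at a fraction of the size, which is the traffic saving gribozavr describes.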
2007-01-01, 20:02   #6
MooooMoo
Apprentice Crank

Mar 2006
2×227 Posts

gribozavr: To get a .del file, just click Options -> "Log the numbers removed". You have to do this every time you start NewPGen, and it will only log numbers with factors > 2^32.
2007-01-01, 20:17   #7
paulunderwood

Sep 2002
Database er0rr

E05₁₆ Posts

Quote:
 Originally Posted by gribozavr Maybe someone could post instructions to force newpgen to write .del file?
./newpgen -h should help you. The following is an example I ran from a batch file:

Code:
cd /home/paul/newpgen
./newpgen -osp=32000000000000 -ol -oa -ci=321_3-5M.txt -cp=321_3-5M.txt
It's the "-ol" that does the logging. For the windoze version you select "log removed numbers" under Options -- but you may well have to do this every time you use it.

Last fiddled with by paulunderwood on 2007-01-01 at 20:18

2007-01-01, 20:37   #8
paulunderwood

Sep 2002
Database er0rr

37·97 Posts

Quote:
 DSM_LogDeleted = 1
in NewPGen.ini might help

2007-01-01, 20:48   #9
jmblazek

Nov 2006
Earth

1000000₂ Posts

Surely the .del files are the way to go. I've been logging removed factors... I just selected NewPGen's "log" option (Windows version). I currently have about 120,000 factors from 15T in a 5 MB file. That beats the heck out of merging output files, of which mine is still 742 MB. Why all this emphasis on cheaters... where's the trust in the world?
2007-01-01, 21:13   #10
gribozavr

Mar 2005
Internet; Ukraine, Kiev

197₁₆ Posts

paulunderwood, thanks a lot. I didn't even think that the new NewPGen has a command-line mode. For me, it is much more comfortable than the menus. And ./newpgen -ol did the trick.

jmblazek: It's not really emphasis... that's just one more thing to think about. But returning verifiable results is always better: factors are better than just k's. Maybe people could keep the files for now and report just the ranges they sieve? When a twin is found, or LLR reaches k ~ 20G, we can collect the results and merge them all at once.

