mersenneforum.org > Prime Search Projects > Twin Prime Search

Old 2007-01-01, 03:26   #1
MooooMoo
Apprentice Crank
Mar 2006
2×227 Posts

Volunteer needed for sieve merging

I need someone to manage the coordination of the sieve data files.

Right now, the plan is to have each siever upload the new, smaller output file to http://www.transferbigfiles.com/Default.aspx once the siever is done with the range. Someone will then download the files from that site and use NewPGen's "merge across several machines" feature to merge those files into a new, smaller sieve file. None of my computers have enough RAM to handle that task, so I'm looking for someone who does.

Whoever volunteers for this should have a large amount of RAM (at least 1 GB), a fast connection (download speed of 1000 Kbps or better), enough hard disk space (about 10-20 GB), and a relatively fast processor (P4 3 GHz or better). However, given the size of those files, plenty of RAM is the most important requirement.

If you'd like to volunteer for this, post on this thread with your computer's specifications.
Old 2007-01-01, 05:45   #2
jasong
"Jason Goatcher"
Mar 2005
6663₈ Posts

Quote:
Originally Posted by MooooMoo
I need someone to manage the coordination of the sieve data files.

Right now, the plan is to have each siever upload the new, smaller output file to http://www.transferbigfiles.com/Default.aspx once the siever is done with the range. Someone will then download the files from that site and use NewPGen's "merge across several machines" feature to merge those files into a new, smaller sieve file. None of my computers have enough RAM to handle that task, so I'm looking for someone who does.

Whoever volunteers for this should have a large amount of RAM (at least 1 GB), a fast connection (download speed of 1000 Kbps or better), enough hard disk space (about 10-20 GB), and a relatively fast processor (P4 3 GHz or better). However, given the size of those files, plenty of RAM is the most important requirement.

Edit: My connection is about 350-400 KB/sec (bytes, rather than bits. I'm too lazy to do the math ;) )

Edit: It would probably be a good idea for people to zip their files.

If you'd like to volunteer for this, post on this thread with your computer's specifications.
I have 2 GB of RAM, the necessary hard drive space, and a 2.8 GHz Pentium D (dual-core). I'm thinking I should use a Live CD to perform this task so that I have maximum RAM available. Would the Live CD be able to access the Windows files, or would I have to get creative?

Not that I'm demanding to do it. :)

Last fiddled with by jasong on 2007-01-01 at 06:00
Old 2007-01-01, 07:11   #3
paulunderwood
Sep 2002
Database er0rr
3,677 Posts

Rather than shifting around massive sieve files in order to use NewPGen's "merge" feature, just ship all the divisor files. This is what we did with 321. Once all the ".del" files are in a directory, a quick "cat", "sort", and "uniq", plus the magic of a small Perl script, will do the merge (and more, such as verification via Pari/GP!).

Last fiddled with by paulunderwood on 2007-01-01 at 07:12
Old 2007-01-01, 07:46   #4
paulunderwood
Sep 2002
Database er0rr
111001011101₂ Posts

If it helps,

sh_321_get_latest_sieve_file.sh:
Quote:
# gather the n of every eliminated candidate (third "="-separated field of
# "p=... divides n=..."), numerically sorted with duplicates removed
cat *.del | cut -f3 -d= | sort -n | uniq > delete_file
perl my321_remove_deleted.pl > latest_sieve_file

# turn the raw divisor lines into a Pari/GP verification script
cat *.del | sort | uniq | perl my321_check_divisors.pl > check_file

gp < check_file
my321_remove_deleted.pl:
Quote:
#!/usr/bin/perl
use strict;
use warnings;

open my $delete_fh, '<', 'delete_file' or die "delete_file: $!";
open my $sieve_fh, '<', 'sieved_to_1M.txt' or die "sieved_to_1M.txt: $!";

# pass the NewPGen header line straight through
print scalar <$sieve_fh>;

# read the first deleted n, skipping any leading blank lines
my $delete_value;
while (defined($delete_value = <$delete_fh>)) {
    chomp $delete_value;
    last if $delete_value ne '';
}

while (my $line = <$sieve_fh>) {
    my @sieve_line = split ' ', $line;
    # advance through delete_file until we reach or pass this candidate's n
    while (defined $delete_value && $sieve_line[1] > $delete_value) {
        $delete_value = <$delete_fh>;
        chomp $delete_value if defined $delete_value;
    }
    print $line unless defined $delete_value && $sieve_line[1] == $delete_value;
}
close $sieve_fh;
close $delete_fh;
And from Thomas, my321_check_divisors.pl:
Quote:
#!/usr/bin/perl -w
while(<>)
{
# p=53375470179683 divides n=4259058
# $1 $2
if(m/p=(\d+) divides n=(\d+)/)
{
print("if(lift(3*Mod(2,$1)^$2-1)!=0,print(\"$1 $2 bad\"))\n");
}
else { print STDERR "What's $_"; }
}
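Each gp line the script generates simply re-checks that the reported factor really divides the 321-form candidate. As a toy illustration (n=5 and p=19 are made-up small values so plain shell arithmetic works; the real files have million-bit exponents and need gp's modular arithmetic):

```shell
# Toy version of the check each generated gp line performs: confirm that a
# reported factor p really divides 3*2^n - 1 (the 321 form).
# n=5, p=19 are invented for the example: 3*2^5 - 1 = 95 = 5*19.
n=5
p=19
candidate=$(( 3 * (1 << n) - 1 ))
if [ $(( candidate % p )) -eq 0 ]; then
    echo "p=$p divides 3*2^$n-1"
else
    echo "$p $n bad"
fi
```

Running this prints "p=19 divides 3*2^5-1".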
Old 2007-01-01, 13:20   #5
gribozavr
Mar 2005
Internet; Ukraine, Kiev
11·37 Posts

I didn't manage to get NewPGen to write a newpgen.del file, but I have written two C programs which:
1. find the differences between two sieved files, producing a small "removed k" file;
2. merge the "removed k" file back into another sieved file.
This method does not guard against cheating or errors (though NewPGen has a "verify" option), but it is effectively the same as uploading the whole sieve file, just much cheaper in traffic.

Maybe someone could post instructions to force NewPGen to write a .del file?
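The two-step diff/merge scheme described above can be sketched with standard Unix tools (toy data; the file names, k values, and the NewPGen header-then-"k n" layout are all assumed for the example):

```shell
#!/bin/sh
# Step 1: diff two sieve files for the same range (new.txt sieved deeper)
# to get a small "removed k" file. Toy NewPGen-style data, invented here:
printf 'TOYHEADER\n105 195000\n165 195000\n345 195000\n' > old.txt
printf 'TOYHEADER\n105 195000\n345 195000\n' > new.txt

tail -n +2 old.txt | sort > old.body   # strip header; comm needs sorted input
tail -n +2 new.txt | sort > new.body
comm -23 old.body new.body > removed_k.txt   # lines only in the older file

# Step 2: merge the removals back into another copy of the sieve file:
# keep the header, drop every candidate listed in removed_k.txt.
head -n 1 old.txt > merged.txt
tail -n +2 old.txt | sort | comm -23 - removed_k.txt >> merged.txt
```

Here removed_k.txt ends up holding just "165 195000", and merged.txt is the deeper-sieved file again; only the small removed_k.txt would need to travel over the network.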
Old 2007-01-01, 20:02   #6
MooooMoo
Apprentice Crank
Mar 2006
2·227 Posts

gribozavr:

To get a .del file, just click Options -> "Log the numbers removed". You have to do this every time you start NewPGen, and it will only log numbers with factors > 2^32.
Old 2007-01-01, 20:17   #7
paulunderwood
Sep 2002
Database er0rr
3,677 Posts

Quote:
Originally Posted by gribozavr
Maybe someone could post instructions to force NewPGen to write a .del file?
./newpgen -h should help you. The following is an example I ran from a batch file:

Code:
cd /home/paul/newpgen
./newpgen -osp=32000000000000 -ol -oa -ci=321_3-5M.txt -cp=321_3-5M.txt
It's the "-ol" switch that does the logging. For the Windows version you select "Log removed numbers" under Options, but you may well have to do this every time you use it.

Last fiddled with by paulunderwood on 2007-01-01 at 20:18
Old 2007-01-01, 20:37   #8
paulunderwood
Sep 2002
Database er0rr
3,677 Posts

Quote:
DSM_LogDeleted = 1
in NewPGen.ini might help
Old 2007-01-01, 20:48   #9
jmblazek
Nov 2006
Earth
40₁₆ Posts

Surely the .del files are the way to go. I've been logging removed factors; I just selected NewPGen's "log" option (Windows version). I currently have about 120,000 factors from 15T in a 5 MB file. Beats the heck out of merging output files; mine is still 742 MB.

Why all this emphasis on cheaters…where’s the trust in the world?
Old 2007-01-01, 21:13   #10
gribozavr
Mar 2005
Internet; Ukraine, Kiev
11·37 Posts

paulunderwood, thanks a lot. I didn't even realize the new NewPGen had a command-line mode. For me, it is much more comfortable than the menus. And ./newpgen -ol did the trick.

jmblazek:
It's not really an emphasis... just one more thing to think about. But returning verifiable results is always better: factors are better than just k's.

Maybe people could keep the files for now and report just the ranges they sieve? When a twin is found or LLR reaches k ~ 20G, we can collect the results and merge them all at once.


Powered by vBulletin® Version 3.8.11
Copyright ©2000 - 2021, Jelsoft Enterprises Ltd.

This forum has received and complied with 0 (zero) government requests for information.

Permission is granted to copy, distribute and/or modify this document under the terms of the GNU Free Documentation License, Version 1.2 or any later version published by the Free Software Foundation.
A copy of the license is included in the FAQ.