mersenneforum.org > Factoring Projects > Msieve
2013-11-08, 02:10   #12
wombatman (I moo ablest echo power!)

Try the prebuilt dll from here: ftp://sourceware.org/pub/pthreads-wi...lease/dll/x64/

If that doesn't work, I have another version compiled in MinGW-64 that I'll attach to this thread tomorrow, along with the dll that works on my machine.
2013-11-08, 03:20   #13
swellman

That dll seemed to do the trick! msieve now runs, though it will be a couple of days before I can really shake it out.

Thanks.
2013-11-08, 04:10   #14
wombatman (I moo ablest echo power!)

You bet. And if you want a copy of msieve that can read relations straight from gzipped files (i.e., with ZLIB compiled in), let me know. That's the one I put together in MinGW-64.

Seems like my issues were with that one particular relations file. How very strange.
2013-12-01, 16:56   #15
WraithX

I'd like one that has ZLIB compiled in. I'm about to do a test run of post-processing a LOT of relations and would rather not uncompress them all. Could you post the exe and any needed dlls?

edit: Also, which SVN revision will this be? Can you build one from the latest msieve SVN (at least 945)?
2013-12-01, 17:41   #16
wombatman (I moo ablest echo power!)

I'll post it when I get home tonight (I'm flying back from visiting my family for Thanksgiving). I think the SVN revision is 945 or 946, but I'll confirm then.
2013-12-01, 21:01   #17
swellman

FWIW the current executable has SVN 946 in its title.

I've been using it for post-processing 31-bit jobs for NFS@Home with great success. But I've recently started running the relations through remdups first to avoid overwhelming msieve with errors; oddities in the relations data file seem to be an unfortunate side effect of sieving via BOINC.
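(For anyone wondering what that de-duplication pass does conceptually, here is a rough Python sketch. It is not the actual remdups/remdups4 code, the file names are made up, and an in-memory set like this will not scale to hundreds of millions of relations; it just shows the idea: siever output lines look like "a,b:primes:primes", and two relations with the same a,b pair are duplicates.)

Code:
import gzip

def unique_relations(infiles, outfile):
    """Very simplified duplicate removal: keep the first relation seen
    for each a,b pair.  Conceptual only -- remdups4 uses hashing and
    buckets so it does not need to hold every key in memory."""
    seen = set()
    kept = dropped = bad = 0
    with open(outfile, "w") as out:
        for name in infiles:
            with gzip.open(name, "rt", errors="replace") as f:
                for line in f:
                    line = line.strip()
                    if not line or line.startswith("#"):
                        continue                 # skip blanks and comments
                    ab = line.split(":", 1)[0]   # the "a,b" key
                    if "," not in ab:
                        bad += 1                 # malformed line, drop it
                        continue
                    if ab in seen:
                        dropped += 1             # duplicate relation
                        continue
                    seen.add(ab)
                    out.write(line + "\n")
                    kept += 1
    print(f"kept {kept}, dropped {dropped} duplicates, skipped {bad} bad lines")

# unique_relations(["rel_10M.gz", "rel_11M.gz"], "msieve.dat")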

Just curious - how many relations will you be processing?

2013-12-02, 01:30   #18
WraithX

Quote:
Originally Posted by swellman
FWIW the current executable has SVN 946 in its title.

I've been using it for post-processing 31-bit jobs for NFS@Home with great success. But I've recently started running the relations through remdups first to avoid overwhelming msieve with errors; oddities in the relations data file seem to be an unfortunate side effect of sieving via BOINC.

Just curious - how many relations will you be processing?
Good to know about the MSVC SVN version.

I wish I could run these rels through remdups (or remdups4) first, but I'm not sure if there is a version that works on Windows. Do you know of one?

Also, I'm not sieving via BOINC; I've set up my own PHP web page that hands out assignments to my sieving machines. Those machines run a custom Python script (largely based on Brian's factmsieve.py) to get work from that page.
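(Roughly, the client side of that looks like the sketch below. This is a minimal illustration only, not my actual script; the URL, the JSON fields, and the siever command line are all placeholders.)

Code:
import json
import subprocess
import urllib.request

SERVER = "http://example.org/assign.php"     # placeholder assignment page

def get_assignment():
    """Ask the server for a special-q range to sieve."""
    with urllib.request.urlopen(SERVER + "?action=get") as resp:
        return json.loads(resp.read())       # e.g. {"q0": 163000000, "q1": 163100000}

def run_siever(q0, q1, jobfile="job.poly"):
    """Sieve the range [q0, q1) and return the relations file name.
    The siever command line here is illustrative, not a real flag set."""
    outfile = f"rels.{q0}-{q1}.gz"
    subprocess.run(["./siever", jobfile, str(q0), str(q1), outfile], check=True)
    return outfile

def report_done(q0, q1, outfile):
    """Tell the server the range is finished so it is not handed out again."""
    query = f"?action=done&q0={q0}&q1={q1}&file={outfile}"
    urllib.request.urlopen(SERVER + query).read()

if __name__ == "__main__":
    work = get_assignment()
    rels = run_siever(work["q0"], work["q1"])
    report_done(work["q0"], work["q1"], rels)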

I'm glad you asked how many relations I'll be processing. To find out, I ran "zcat rel_xyz.gz | wc -l" on each gz file, and this actually turned up gzip errors in some of my archives: "invalid compressed data--format violated". I'm not sure why I got the errors, but I may have to go back and re-sieve those ranges. So far I've sieved from 10M-400M; the problem ranges were in 163M, 342M, 343M, and 344M. Excluding those, I currently have a total of 352.5M relations (around 22.1GB). (I also have some from 500M-563M, but those are on another machine far away, so I can't include them in this count.) However, I'm not sure this is enough for the C210 I'm working on, so I'm going to run a test pass of the post-processing to see whether I have enough or should finish the 400M-500M range too.
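(That check is easy to script; here is a rough Python equivalent of running zcat | wc -l over every archive, flagging any that fail to decompress. The file-name pattern is a placeholder.)

Code:
import glob
import gzip
import zlib

total = 0
for name in sorted(glob.glob("rel_*.gz")):
    try:
        with gzip.open(name, "rb") as f:
            lines = sum(1 for _ in f)        # one relation per line
    except (OSError, EOFError, zlib.error) as err:
        # corrupt archive -- the equivalent of gzip's
        # "invalid compressed data--format violated"
        print(f"{name}: ERROR ({err}) -- range may need re-sieving")
        continue
    print(f"{name}: {lines} relations")
    total += lines

print(f"total relations in good archives: {total}")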
2013-12-02, 02:21   #19
swellman

Quote:
Originally Posted by WraithX
Good to know about the MSVC SVN version.

I wish I could run these rels through remdups (or remdups4) first, but I'm not sure if there is a version that works on Windows. Do you know of one?
Try this.

Quote:
I'm glad you asked how many relations I'll be processing. To find out, I ran "zcat rel_xyz.gz | wc -l" on each gz file, and this actually turned up gzip errors in some of my archives: "invalid compressed data--format violated". I'm not sure why I got the errors, but I may have to go back and re-sieve those ranges. So far I've sieved from 10M-400M; the problem ranges were in 163M, 342M, 343M, and 344M. Excluding those, I currently have a total of 352.5M relations (around 22.1GB). (I also have some from 500M-563M, but those are on another machine far away, so I can't include them in this count.) However, I'm not sure this is enough for the C210 I'm working on, so I'm going to run a test pass of the post-processing to see whether I have enough or should finish the 400M-500M range too.
How many bits is your job? 31-bit jobs require ~235M raw (i.e. pre-filtered) relations, at least in the NFS@Home factorizations I've post-processed recently. YMMV of course.
2013-12-02, 03:08   #20
WraithX

Quote:
Originally Posted by swellman
Try this.

How many bits is your job? 31-bit jobs require ~235M raw (i.e. pre-filtered) relations, at least in the NFS@Home factorizations I've post-processed recently. YMMV of course.
The post says that is a 32-bit binary. Will that have any problems with a > 22GB file?

I guess you could say this is a 32/33-bit job. Here are the sieving parameters I'm using with the 16e siever:
rlim: 500000000
alim: 500000000
lpbr: 32
lpba: 33
mfbr: 64
mfba: 96
rlambda: 2.7
alambda: 3.7
2013-12-03, 00:36   #21
swellman

Quote:
Originally Posted by WraithX
The post says that is a 32-bit binary. Will that have any problems with a > 22GB file?
I've run files >25GB through the 32-bit binary with no problems on my 64-bit Win 7 system. It only takes ~20 minutes for remdups to chew through files of that size.

Quote:
I guess you could say this is a 32/33-bit job. Here are the sieving parameters I'm using with the 16e siever:
rlim: 500000000
alim: 500000000
lpbr: 32
lpba: 33
mfbr: 64
mfba: 96
rlambda: 2.7
alambda: 3.7
It's a good-sized job, well outside my experience. I've read about the different-factor-base technique but I can't remember how many relations you may need. North of 400M? Maybe some of the folks who work on big iron can tell you more.

ETA: there is this thread with a similar sized job.
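(One coarse rule of thumb -- an assumption, not something established in this thread -- is that the unique relations needed are on the order of the number of primes below the two large-prime bounds, i.e. pi(2^lpbr) + pi(2^lpba). A quick prime-number-theorem estimate gives roughly 200M unique for a 31/31-bit job, broadly in line with the ~235M raw figure above, and considerably more for 32/33. Treat it only as a ballpark; it ignores duplicates, free relations and the mfb settings.)

Code:
from math import log

def approx_pi(bits):
    """Prime-counting estimate: pi(2^bits) ~ 2^bits / ln(2^bits)."""
    x = 2.0 ** bits
    return x / log(x)

def rough_unique_relations(lpbr, lpba):
    """Rule-of-thumb target: about pi(2^lpbr) + pi(2^lpba) unique relations."""
    return approx_pi(lpbr) + approx_pi(lpba)

print(f"31/31-bit: ~{rough_unique_relations(31, 31)/1e6:.0f}M unique relations")
print(f"32/33-bit: ~{rough_unique_relations(32, 33)/1e6:.0f}M unique relations")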

2013-12-03, 04:14   #22
WraithX

Quote:
Originally Posted by swellman
I've run files >25GB through the 32-bit binary with no problems on my 64-bit Win 7 system. It only takes ~20 minutes for remdups to chew through files of that size.

It's a good-sized job, well outside my experience. I've read about the different-factor-base technique but I can't remember how many relations you may need. North of 400M? Maybe some of the folks who work on big iron can tell you more.

ETA: there is this thread with a similar sized job.
Well, I had compiled a version of remdups4 a while back and tried it out today, with:
zcat msieve.dat.gz | remdups4 1500 -v > msieve.dat
This reduced my 356M relations down to 272M unique. So, this works for me on these large files.

I downloaded the MSVC msieve SVN 946 build that wombatman provided earlier. When I ran it on the msieve.dat file, it finally came back and told me it needed 1e6 more relations. :( So close! Just kidding, I know I'll need a lot more than 1e6, but I can dream, right?

And thanks for the link to that other factorization job. At worst, I may be only half-way through this factorization. I'm hoping I can get away with collecting only 400M, or 450M, or 500M unique rels. We'll see how it goes as time goes by.
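(For reference, the filtering-only test pass can be scripted in the same spirit as factmsieve.py; a sketch is below. The flag set (-v, -t, -s, -l, -nf, -nc1) matches common msieve usage, but check your build's help output; the composite and thread count are placeholders.)

Code:
import subprocess

N = "123...321"                  # placeholder composite, not the actual C210
cmd = [
    "./msieve",
    "-v",                        # verbose progress
    "-t", "4",                   # worker threads
    "-s", "msieve.dat",          # relations/save file produced by remdups4
    "-nf", "msieve.fb",          # polynomial / factor base file
    "-l", "msieve.log",          # log file
    "-nc1",                      # run the filtering stage only
    N,
]
result = subprocess.run(cmd)
print("msieve exited with code", result.returncode)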