CADO NFS
The CADO-NFS suite is now available from [url]http://cado.gforge.inria.fr[/url].
|
[QUOTE=Shaopu Lin;174811]The CADO-NFS suite is now available from [url]http://cado.gforge.inria.fr[/url].[/QUOTE]
Has anyone had success building this? I keep getting errors during the build process. It seems quite complex, with both pthreads and MPI versions available. |
Heh, I can't even set up my networking for it properly, let alone build it.
|
How fast is it actually meant to be?
|
It built fine for me by just typing "make" (it downloaded CMake and built it first), and after I got the ssh-agent working, it ran fine on localhost, factoring the c59 example provided. I have been unable to figure out how to make use of remote hosts; the syntax of the mach_desc file is not explained anywhere.
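For the record, roughly the sequence that worked for me (the location of run_example.sh may differ in your checkout):
[code]# the top-level make fetches CMake and builds it first, then builds cado
make

# the scripts dispatch jobs over ssh, even on localhost,
# so an agent with your key loaded has to be running
eval `ssh-agent`
ssh-add

# factor the bundled c59 example
./run_example.sh[/code]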
|
It was simply a matter of putting the executables on the remote host and editing the machine description part of the run_example.sh script to include the lines
[code][remote]
tmpdir=$t/tmp
cadodir=/path/to/build/directory/
remote_host_name cores=1[/code] It then used 1 remote and 1 local core for polyselect, 2 local and 1 remote core for the sieving, and 1 local core for the rest.
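I presume a description with several remote machines just grows by one line per host, something like this (host names and core counts hypothetical, untested):
[code][remote]
tmpdir=$t/tmp
cadodir=/path/to/build/directory/
node1 cores=4
node2 cores=2[/code] |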
CADO NFS
Moving the discussion of CADO NFS out of the Links thread...
I've downloaded the source, compiled it using pthreads, and successfully run a GNFS factorization using the included perl script. I have also noticed that the poly file and relation formats match those of GGNFS. (Thanks for that!) I have not yet figured out (1) how to do a complete SNFS run given a polynomial file, and (2) how to do all of the post-processing steps given a polynomial file and a set of relations that may include duplicates and bad relations. Any guidance? Once I know how to do (2), I will determine how well bwc runs on our workstation with up to 32 threads, and on our Beowulf cluster of 10x4 cores. |
[QUOTE=10metreh;174902]How fast is it actually meant to be?[/QUOTE]
The siever can't compete with Franke/Kleinjung's siever yet. It's slower and uses much more memory. The core sieving routines need a complete overhaul. Embarrassingly, it sieves special-q only on the algebraic side so far.

[QUOTE=frmky;174928]I have not yet figured out (1) how to do a complete SNFS run given a polynomial file, and (2) how to do all of the post-processing steps given a polynomial file and a set of relations that may include duplicates and bad relations. Any guidance?[/QUOTE]
The perl script keeps track of which tasks are already done via <prefix>.<task>_done files, so you can write your own poly file and "touch <prefix>.polysel_done" (e.g., "touch 797161_29.polysel_done"). The perl script should then generate the factor base and start sieving.

If you already have relations, you should be able to copy your own files (matching the naming scheme of the perl script, e.g., "797161_29.rels.9000000-9100000") and simply run the perl script again. It should check the relation files, count how many relations there are, start sievers, and, if there are enough relations, try a filtering run. Warning: a file that contains bad relations is deleted. In fact, the script is sometimes a bit over-eager about "cleaning up", so [B]keep backups[/B]!
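Roughly, as an untested sketch (the "797161_29" prefix and block numbers come from the examples above, and the .poly name is my guess at the script's naming scheme; adjust to your own run):
[code]# write your own polynomial file under the job's prefix
cp your.poly 797161_29.poly
# mark polynomial selection as already done
touch 797161_29.polysel_done

# optionally, import existing relations under the script's naming scheme
cp your.rels 797161_29.rels.9000000-9100000
# the script may delete relation files it considers bad, so back them up first
cp 797161_29.rels.* /some/backup/dir/

# then run the perl script again as usual; it should resume from sieving,
# or go straight to a filtering run if enough relations are present[/code]
More tomorrow,
Alex |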
[QUOTE=akruppa;174930]The siever can't compete with Franke/Kleinjung's siever yet. It's slower and uses much more memory. The core sieving routines need a complete overhaul. Embarrassingly, it sieves special-q only on the algebraic side so far.

The perl script keeps track of which tasks are already done via <prefix>.<task>_done files, so you can write your own poly file and "touch <prefix>.polysel_done" (e.g., "touch 797161_29.polysel_done"). The perl script should then generate the factor base and start sieving. If you already have relations, you should be able to copy your own files (matching the naming scheme of the perl script, e.g., "797161_29.rels.9000000-9100000") and simply run the perl script again. It should check the relation files, count how many relations there are, start sievers, and, if there are enough relations, try a filtering run. Warning: a file that contains bad relations is deleted. In fact, the script is sometimes a bit over-eager about "cleaning up", so [B]keep backups[/B]! More tomorrow, Alex[/QUOTE] Is this a POSIX archive? The version of tar that I have will not read --posix archives. Indeed, after I gunzipped the file, neither tar -x nor tar -t works on it; tar just sits there and does nothing. |
I think it's a GNU tar archive... what version of tar are you using? Is a GNU version of tar installed somewhere, maybe named "gtar"?
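Something quick to check (the archive name here is just a placeholder):
[code]# GNU tar identifies itself in the first line of its version output
tar --version

# if plain tar is a vendor tar, try an installed GNU tar directly
gtar -tvf cado-nfs.tar[/code]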
Alex |
[QUOTE=akruppa;174970]I think it's a GNU tar archive... what version of tar are you using? Is a GNU version of tar installed somewhere, maybe named "gtar"?
Alex[/QUOTE] It is GNU tar 1.12. |