Status 10_229P
What's the progress of 10_229P?
Thanks in advance, Carlos
I have some 85M relations for 10,229+ and am presently running a test filtering to see where we stand on the excess. I had planned to switch to the next project this weekend, but will do so a few days earlier if the numbers look good.
Let me rephrase my question. How much longer until sieving is finished?
On December 18th the sieve was about 20% done. How many machines are currently running NFSNET? Carlos
[QUOTE=em99010pepe;96990]Let me rephrase my question. How much longer until sieving is finished?
On December 18th the sieve was about 20% done. Carlos[/QUOTE] Somewhere between zero and five days. We are presently evaluating whether additional over-sieving would be an effective use of resources by allowing us to produce a somewhat smaller matrix. There are approximately 100 CPUs active. (That's a rough guess.)
We continued the sieving until I had 90M relations, because the number that survived the initial filtering was near 20M. For the 4.25M-row matrix that Paul was processing, only about 14.7M had survived at that point.
As for sensitivity, each additional 300k relations was reducing the survivor pool by about 100k. From the 90M, I still had 18.65M survivors, and this finally produced a matrix. The matrix has 5669968 rows and 5686763 columns. Ignored weight is 74659120. Remaining matrix weight is 366417399, density 0.0011%. Average prime weight is 64.62, average relation-set weight 64.43. The Block Lanczos run is using nearly 2 GB of RAM and will take about 2.5 months. It looks as if I really need to find some time to write some good assembler code for the G5 and speed this up.
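As a sanity check, the reported density and average weights follow directly from the row, column, and weight figures above. The sketch below assumes "average prime weight" means nonzeros per row and "average relation-set weight" means nonzeros per column, which matches the numbers but is my reading, not a statement from the post:

```python
# Illustrative check of the reported matrix statistics (not NFSNET code).
rows, cols = 5669968, 5686763
remaining_weight = 366417399  # nonzero entries left after the ignored dense part

# Density: fraction of all matrix cells that are nonzero.
density = remaining_weight / (rows * cols)
print(f"density = {density:.4%}")                       # ~0.0011%

# Average nonzeros per row and per column.
print(f"per row    = {remaining_weight / rows:.2f}")    # ~64.62
print(f"per column = {remaining_weight / cols:.2f}")    # ~64.43
```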
[QUOTE=Wacky;98842]
Matrix has 5669968 rows and 5686763 columns. Ignored weight is 74659120. Remaining matrix weight is 366417399, density 0.0011%. Average prime weight 64.62, average relation-set weight 64.43. The Block Lanczos is using nearly 2GB of ram and will run for 2.5 months.[/QUOTE]After posting this, Wacky made the matrix data available for download. The data is 810 MB after compression, so the download took several hours over two ADSL connections (his and mine) and finished a couple of minutes ago. I estimate my machine will take about 16 days to run the matrix, though the data is still decompressing and the matrix run hasn't even started, so that's a very crude estimate based on the run time for the 5,313+ matrix. A much better estimate will be available tomorrow --- one accurate to within a few minutes, if past experience is anything to go by. Paul
[QUOTE=xilman;98895]A much better estimate will be available tomorrow --- one accurate to within a few minutes if past experience is anything to go by.[/QUOTE]
It is taking 1767 megabytes of memory and, after running for 163.51 seconds, is predicted to take 16.8 days. That is 1/8887 of the full computation, so a rather lengthy extrapolation! Even so, I doubt the prediction will be more than a day from reality. Paul
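The prediction Paul describes is a simple linear scale-up of the measured slice. A sketch of the arithmetic (not his actual code):

```python
# Linear extrapolation of the total Block Lanczos run time (illustrative).
elapsed_s = 163.51     # wall time for the measured slice
fraction = 1 / 8887    # portion of the computation that slice represents

total_days = elapsed_s / fraction / 86400  # 86400 seconds per day
print(f"estimated total: {total_days:.1f} days")  # ~16.8 days
```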