mersenneforum.org Oversieving + estimate of relations needed

2009-01-01, 10:37   #1
RedGolpe

Aug 2006
Monza, Italy

22·3·5 Posts

Oversieving + estimate of relations needed

I just installed a new quad-core machine alongside my old dual-core, so I tried to run 6 instances of msieve on a test C104 during the night. All went pretty well, but when I merged the .dat files I found that the number of combined relations was way too high (around 10x) and the linear algebra failed, giving me the odd "only trivial relations found" error already discussed in a previous thread. So my questions are:

1) Has this behaviour been investigated further, and are there any plans to "fix" it? Or maybe it already has been and I just did something wrong?
2) Is there a way to estimate the number of relations needed on a distributed job, so that it is possible to stop it with -r (which I assume stops at the relevant number of full+combined relations)?

Thank you for your time.
2009-01-01, 16:47   #2
jasonp
Tribal Bullet

Oct 2004

1101110010012 Posts

This is a known problem: both QS and NFS have a surprising amount of trouble dealing with a dataset that contains enormously too many relations. With msieve, the NFS code is much more robust when that happens, but the QS code is almost defenseless against it. Factorizations of about the same size will need about the same number of partial relations for the postprocessing to work, but nobody has catalogued what that target number should be, and nobody has an automatic means to run multiple sieving instances and stop when the total number of relations found hits that target.
2009-01-01, 17:21   #3
nuggetprime

Mar 2007
Austria

2·151 Posts

Why not simply delete/ignore the excess relations?

nugget
2009-01-01, 19:23   #4
RedGolpe

Aug 2006
Monza, Italy

22×3×5 Posts

Jason: thank you. I thought we might reach a rough estimate based on the theory easily enough, but apparently it's not that easy.

Nugget: if I knew how many of them were in excess, I would certainly do it. But in (say) this case we have 6 files combined, and we don't know what threshold triggers the "bug". So how many files should I use? I could try with 2 and they might not be enough, while with 3 the bug might appear. So I might use only half of the third file, and so on... not very practical, as you can see: it would just be trial and error.

Last fiddled with by RedGolpe on 2009-01-01 at 19:27
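The trial-and-error search described above can at least be automated as a binary search over the relation count. This is a sketch of the idea only: `try_postprocess` is a hypothetical callback (not part of msieve) that would rebuild a truncated savefile with the first n relations, run the postprocessing, and report success. It also assumes success is monotone in n within the searched range (more relations than the working threshold fail, fewer succeed), which only approximates the behaviour reported in this thread, since too few relations fail as well:

```python
def max_working_relations(lo, hi, try_postprocess):
    """Binary-search [lo, hi] for the largest relation count that
    try_postprocess(n) accepts. Assumes a single threshold: counts
    at or below it succeed, counts above it fail. Returns None if
    no count in the range works."""
    best = None
    while lo <= hi:
        mid = (lo + hi) // 2
        if try_postprocess(mid):
            best = mid
            lo = mid + 1   # worked: see if even more relations still work
        else:
            hi = mid - 1   # failed: assume too many, try fewer
    return best
```

Each probe costs one full postprocessing run, so for a C104 this is still hours of work per step; it merely replaces guessing with a systematic halving.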
2009-01-01, 22:04   #5
jasonp
Tribal Bullet

Oct 2004

DC916 Posts

Trial and error is what people seem to use now. At C104 size you should probably try somewhere around 3-4 million relations; the minimum number actually depends more on the size of the large primes than on the size of the input number. Maybe I should modify the code to take the number of relations desired as an input parameter, to save everyone from having to reconstruct the savefile every time with fewer and fewer relations. The code used to take that parameter, but it was removed for some reason.

Last fiddled with by jasonp on 2009-01-01 at 22:06
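Reconstructing the savefile with fewer relations, as jasonp describes, amounts to keeping only a prefix of the merged .dat file. A minimal sketch, assuming the text savefile stores one relation per line (my reading of msieve's QS savefile format; verify against your msieve version before relying on it):

```python
def truncate_savefile(src, dst, max_relations):
    """Copy at most max_relations lines from src to dst.
    Assumes one relation per line, so truncating lines
    truncates relations. Returns the number of lines kept."""
    kept = 0
    with open(src) as fin, open(dst, "w") as fout:
        for line in fin:
            if kept >= max_relations:
                break
            fout.write(line)
            kept += 1
    return kept
```

For example, truncating a 10x-oversieved C104 dataset to jasonp's suggested 3-4 million relations would be `truncate_savefile("merged.dat", "msieve.dat", 3_500_000)` before re-running the postprocessing.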
2009-01-02, 02:31   #6
Jeff Gilchrist

Jun 2003

7·167 Posts

Quote:
 Originally Posted by jasonp Maybe I should modify the code to take the number of relations desired as an input parameter, to save everyone from having to reconstruct the savefile every time to have fewer and fewer relations. The code used to take that parameter but it was removed for some reason.
I think that would be very useful and save a lot of headache for people who run into this problem.

Jeff.

2009-06-15, 10:32   #7
RedGolpe

Aug 2006
Monza, Italy

22·3·5 Posts

I am running some experiments with different-sized composites and will post the results here for everyone's benefit.

Input = approximate size of the composite to crack.
Need = number of relations needed.
Full = number of full relations found when the result is output.
Part = number of partial relations found when the result is output.

Code:
Input   Need   Full   Part
---------------------------
10^60   3.0k   1.4k    15k
10^70    10k   4.6k    58k
10^80    50k    25k   274k

Last fiddled with by RedGolpe on 2009-06-15 at 10:35
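The three data points above grow roughly geometrically with the number of decimal digits, so a least-squares fit of log10(Need) against digits gives a crude extrapolation. This fit is my own illustration, not something from the thread, and extrapolating three points well beyond their range is rough at best (for a C104 it lands below jasonp's 3-4 million estimate, so treat it as a lower-bound ballpark):

```python
import math

# RedGolpe's measurements: (decimal digits, relations needed)
data = [(60, 3_000), (70, 10_000), (80, 50_000)]

def fit_loglinear(points):
    """Least-squares fit of log10(need) = a*digits + b."""
    n = len(points)
    xs = [d for d, _ in points]
    ys = [math.log10(r) for _, r in points]
    mx = sum(xs) / n
    my = sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    b = my - a * mx
    return a, b

def predict_need(digits, a, b):
    """Extrapolated relation count for a composite of the given size."""
    return 10 ** (a * digits + b)

a, b = fit_loglinear(data)
```

The fitted slope is about 0.06, i.e. the needed relation count grows by roughly a factor of 10 every 16-17 digits over this (small) range.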
2009-06-15, 12:08   #8
RedGolpe

Aug 2006
Monza, Italy

22·3·5 Posts

For some reason I can't edit my own posts more than 60 minutes after posting... funny. I moved the table here so as to have a complete overview.
2009-06-15, 12:28   #9
R.D. Silverman

Nov 2003

22×5×373 Posts

Quote:
 Originally Posted by RedGolpe I am running some experiments with different sized composites and I will post the results here for everyone's benefit. Input = Approximate size of composite to crack. Need = Number of relations needed. Full = Number of full relations found when result is output. Part = Number of partial relations found when result is output.
Code:
Input   Need   Full   Part
---------------------------
10^60   3.0k   1.4k    15k
10^70    10k   4.6k    58k
10^80    50k    25k   274k
Doesn't anyone read my papers?????

Given the size of the numbers, I assume you are discussing QS.
I discussed all of this, and more, over 20 years ago in my 1987 Math. Comp. paper.

2009-06-15, 12:57   #10
RedGolpe

Aug 2006
Monza, Italy

22·3·5 Posts

Quote:
 Originally Posted by R.D. Silverman I discussed all of this and more over 20 years ago in my 1987 Math. Comp. paper
Wonderful. Can you also provide a free access link?

2009-06-15, 15:32   #11
10metreh

Nov 2008

2×33×43 Posts

Quote:
 Originally Posted by RedGolpe For some reason I can't edit my own posts more than 60 minutes after posting... funny. I moved the table here so to have a complete overview.
60 minutes is the forum limit unless you are a moderator.
