mersenneforum.org (https://www.mersenneforum.org/index.php)
-   Cunningham Tables (https://www.mersenneforum.org/forumdisplay.php?f=51)
-   -   Parameter Underestimation (https://www.mersenneforum.org/showthread.php?t=13983)

 R.D. Silverman 2010-09-29 14:57

[QUOTE=R.D. Silverman;231901]I fixed the latred bug. I need to rebuild my code, and re-install it.[/QUOTE]

The code with the modified latred is producing twice as many relations
(on average) per special q as the old version.

 chris2be8 2010-09-29 17:24

Could you resieve just the special-Qs that didn't generate any relations? That should get enough relations without producing many duplicates.

Chris K

 R.D. Silverman 2010-09-29 17:30

[QUOTE=chris2be8;231934]Could you resieve just the special-Qs that didn't generate any relations? That should get enough relations without producing many duplicates.

Chris K[/QUOTE]

Nice suggestion.

It would mean changing the code to do the following:

Compute the reduced lattice the old way. If valid, go on to the next q.
If invalid, recompute the reduced lattice with the new code and then sieve.

This is surely worth doing. The problem is finding the time to do the
recoding. It isn't a lot of code, but I have a lot of other urgent stuff
to do. I may be able to get to it this weekend. I still have to rebuild
the current code with the new latred code.
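
The fallback strategy described above can be sketched as follows. This is a hedged illustration only: the function names (old_latred, new_latred, sieve_region) and the validity test are hypothetical stand-ins, not the actual siever's API. A reduced basis for the special-q lattice {(a, b) : a ≡ r·b (mod q)} is taken to be valid when its determinant has absolute value q.

```python
def basis_is_valid(basis, q):
    """A reduced basis ((a0,b0),(a1,b1)) for the special-q lattice must
    span a sublattice of index q: |a0*b1 - a1*b0| == q."""
    (a0, b0), (a1, b1) = basis
    return abs(a0 * b1 - a1 * b0) == q

def process_special_q(q, r, old_latred, new_latred, sieve_region):
    """Hybrid resieve strategy (sketch): run the old lattice reduction
    first.  If it produced a valid basis, the earlier run already sieved
    this q, so skip it to avoid duplicate relations.  Only when the old
    reduction fails do we reduce with the fixed code and sieve."""
    basis = old_latred(q, r)          # buggy reduction: may fail
    if basis is not None and basis_is_valid(basis, q):
        return []                     # already sieved in the earlier run
    basis = new_latred(q, r)          # fixed reduction
    assert basis_is_valid(basis, q)
    return sieve_region(q, basis)
```

With this shape, the recoding is confined to the top of the per-special-q loop; the sieving code itself is untouched.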

 jrk 2010-09-29 19:56

[QUOTE=R.D. Silverman;231889]It won't work. Almost all relations found with composite q will be duplicates.
If norm(a + b*alpha)/(pq) is smooth, it will have been found with either
special_q = p, or special_q = q (or both!)[/QUOTE]
You wouldn't use all composite special_q's... only those whose prime factors are < special_q_min, to avoid those duplicates you mention. For a moment, I forgot that you prefer very small special_q, so this might not be practical for you.
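
The filter being proposed can be sketched like this. A hedged illustration, not any siever's real code: a composite q is admitted only if every prime factor lies below special_q_min, so none of its relations could already have been found at a prime special_q in the sieved range. The trial-division factorization here is a toy stand-in.

```python
def trial_factor(n):
    """Return the prime factors of n by trial division (toy helper)."""
    factors, d = [], 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

def usable_composite_q(q, special_q_min):
    """True iff q is composite and all its prime factors are below
    special_q_min, i.e. no factor of q was itself a special-q, so
    relations found at q cannot duplicate earlier ones."""
    factors = trial_factor(q)
    return len(factors) > 1 and all(p < special_q_min for p in factors)
```

For example, with special_q_min = 10, q = 35 = 5*7 would be admitted, but q = 55 = 5*11 would not, since relations at 55 would duplicate those already found at special_q = 11.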

In any case it's moot since you fixed the original problem. Congrats.

All times are UTC.