2010-08-08, 20:23   #5

Originally Posted by lavalamp
If it takes you 18 hours to do 68 -> 69, then it will take you 36 hours to do 69 -> 70, for a total of 54 hours.

Each bit increase represents a DOUBLING of the work done in the previous level.

To put it another way, it would take the same amount of CPU time to trial factor from scratch to 69, as it would to trial factor from 69 to 70.
The break-even points I have here are from GIMPS (we also tried to use them initially for Wagstaff), and they are based on old hardware.

Today's hardware is far faster in floating point and relatively slower in trial factoring, so those break-even points are not so accurate anymore.

"slower" i mean: the intels are ugly slow in trial factoring versus the AMD's very fast, yet in floating point both intel as well as todays AMD's are real fast for the LL. Something like moving from effectively 1-2 Gflops per cycle to 4+ today, versus trial factoring still same speed.


Last fiddled with by diep on 2010-08-08 at 20:25