#1
(loop (#_fork))
Feb 2006
Cambridge, England
For C197_149_70, there's enough over-sieving that I end up with a 45.5M matrix of density 68. So I should probably cut down the input file somewhat.
I'm starting with the trivial steps of removing duplicates and removing lines containing a prime that occurs only once on that side. Obviously msieve does those stages itself - I'm just trying to get the 150GB file down to a size where I can keep a reasonable number of copies of it on my regrettably finite (500G) fast SSD. Have people already done the experiments on whether it's better to remove the largest primes that appear only twice, or just the largest primes, or to pick seven hundred million lines at random after duplicate removal?

Last fiddled with by fivemack on 2017-12-08 at 00:49
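One round of the singleton-removal step described above can be sketched as follows. This is a minimal illustration, not msieve's actual filtering code, and it assumes a GGNFS/msieve-style relation line of the form `a,b:rational-hex-primes:algebraic-hex-primes`; the function names are made up for the example. (A real 150GB file would need streaming and multiple passes rather than in-memory lists.)

```python
from collections import Counter

def parse_relation(line):
    """Split an (assumed) 'a,b:r1,r2,...:a1,a2,...' relation line
    into its rational and algebraic prime lists (hex-encoded)."""
    ab, rat, alg = line.rstrip("\n").split(":")
    side_r = [int(p, 16) for p in rat.split(",") if p]
    side_a = [int(p, 16) for p in alg.split(",") if p]
    return side_r, side_a

def remove_singletons(lines):
    """One pass of singleton removal: drop any relation containing a
    prime that appears in exactly one relation on its side."""
    parsed = [parse_relation(l) for l in lines]
    counts_r, counts_a = Counter(), Counter()
    for r, a in parsed:
        counts_r.update(set(r))   # count each prime once per relation
        counts_a.update(set(a))
    return [l for l, (r, a) in zip(lines, parsed)
            if all(counts_r[p] > 1 for p in r)
            and all(counts_a[p] > 1 for p in a)]
```

In practice removing singletons creates new singletons, so this would be iterated to a fixed point; msieve's own clique-removal stage then goes further by pruning heavy relations among those sharing rare primes.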
#2
(loop (#_fork))
Feb 2006
Cambridge, England
I don't know sensible ways to go next, but:
Code:
Filter        Matrix size  Density                 Relations / ideals
Full set      45994203     68.58  (with td=200)
p<1E0000000   42380410     88.13  (with td=200)    796421275r / 529369930i
p<180000000   42370147     90.17  (with td=134)    646740456r / 440359514i

Last fiddled with by fivemack on 2017-12-12 at 17:23
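The prime-bound cuts in the table (keeping only relations whose primes all lie below some bound) can be sketched like this. Again this is an illustrative assumption of the `a,b:rational-hex:algebraic-hex` relation format, with hypothetical function names, not the tooling actually used.

```python
def max_prime(line):
    """Largest prime (either side) in an assumed
    'a,b:rat_hex,...:alg_hex,...' relation line."""
    _, rat, alg = line.rstrip("\n").split(":")
    primes = [int(p, 16) for p in (rat + "," + alg).split(",") if p]
    return max(primes) if primes else 0

def filter_by_bound(lines, bound):
    """Keep only relations with every prime strictly below `bound`,
    mimicking a 'p<bound' cut on the relation set."""
    return [l for l in lines if max_prime(l) < bound]
```

The trade-off visible in the table is that cutting at a bound shrinks the relation set substantially while the matrix barely shrinks, and the density rises; whether that wins depends on how the linear-algebra time scales with density versus dimension.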