[QUOTE=chalsall;602942]The former just started (by the Colab TF'ers) and the latter is just about complete.
BTW... Does anyone have a block of, say, 0.01M that I could give to the Colab P-1'ers? We've been surprisingly successful in the last 24 hours! :tu:[/QUOTE] Want to take 18.84M? I'm still a few days away from that subrange.
[QUOTE=masser;602960]Want to take 18.84M? I'm still a few days away from that subrange.[/QUOTE]
Yes, please! 204 candidates with yummy low bounds...

BTW... Wayne and I have been exchanging ideas about the [URL="https://www.gpu72.com/low/"]Low Composite Cleanup[/URL] thing. Amongst many other things, we need a catchy name... :wink:

So everyone knows, for some reason I hadn't noticed my dataset was corrupted for the first 1M range. Everything was listed as Factored; sadly, not (yet) correct. A spider is running to fix this.

Also, I need to finally stop hacking my comma rendering routine. 0.0M has once again shown my algorithm does not scale.

Lastly... TheJudger is BACK (at least for another batch)!!! :tu:
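(The actual comma rendering routine isn't shown in the thread; as a hypothetical illustration only, digit grouping doesn't need a hand-rolled loop and scales to arbitrarily large integers. A minimal Python sketch, not the real gpu72 code:)

[CODE]# Hypothetical illustration only -- not the actual gpu72 rendering code.
# Python's format spec handles digit grouping for arbitrarily large integers,
# so it works for small per-range counts and 16-digit factor values alike.
def with_commas(n: int) -> str:
    """Render an integer with comma thousands separators."""
    return f"{n:,}"

print(with_commas(4209961589561203))  # -> 4,209,961,589,561,203
[/CODE]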
@Chalsall: You could also take 7.8M to help @Kruoli.
If I understand correctly, he is focusing on sub-sub-ranges over 200 ... and he's still in 7.87, so I think you should be OK to take 7.81, 7.88, and 7.89. I'm about 10 days from finishing all my ranges.
[QUOTE=chalsall;602971]Yes, please! 204 candidates with yummy low bounds...
BTW... Wayne and I have been exchanging ideas about the [URL="https://www.gpu72.com/low/"]Low Composite Cleanup[/URL] thing. Amongst many other things, we need a catchy name... :wink: So everyone knows, for some reason I hadn't noticed my dataset was corrupted for the first 1M range. Everything was listed as Factored; sadly, not (yet) correct. A spider is running to fix this. Also, I need to finally stop hacking my comma rendering routine. 0.0M has once again shown my algorithm does not scale. Lastly... TheJudger is BACK (at least for another batch)!!! :tu:[/QUOTE] There is a missing comma in the first line: 4,209961,589,561,203

In the last line (49.9M), there are 1214 numbers with low P-1. This means that 1980 - 1214 = 766 numbers have P-1 done with B2 > 100M. It is then not clear why the average B2 is only 8,833,727. One of the two numbers, 1214 or 8,833,727, must be incorrect.
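(To make the inconsistency explicit, using my own arithmetic rather than anything from the GPU72 page: even if the 1214 "low" exponents contributed nothing at all, 766 exponents at B2 > 100M would already push the 49.9M average above roughly 38M, so an average of 8,833,727 cannot coexist with the 1214 count. A quick check:)

[CODE]# Consistency check on the reported 49.9M figures (my arithmetic, not GPU72's).
total = 1980                 # unfactored exponents in the 49.9M line
low_p1 = 1214                # reported as having "low" P-1 bounds
high_b2 = total - low_p1     # 766 exponents with B2 > 100M

# Even crediting the 1214 "low" exponents with B2 = 0, the mean B2 is at least:
lower_bound = high_b2 * 100_000_000 / total
print(f"mean B2 >= {lower_bound:,.0f}")  # ~38,686,869 -- far above 8,833,727
[/CODE]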
[QUOTE=petrw1;602972]@Chalsall: You could also take 7.8M to help @Kruoli.
If I understand correctly, he is focusing on sub-sub-ranges over 200 ... and he's still in 7.87, so I think you should be OK to take 7.81, 7.88, and 7.89. I'm about 10 days from finishing all my ranges.[/QUOTE] Yes, this is correct. I am only still on 7.87M because I already have the stage 1 files for all of them… I am going to skip forward to 7.85M after completing the current exponent and will finish up 7.87M's stage 1 files later.
[QUOTE=chalsall;602971]
Lastly... TheJudger is BACK (at least for another batch)!!! :tu:[/QUOTE] And 1 more!
Devilish
My 3 highest ranges need 6 and 6 and 6 more factors...
and there are 13 ranges left. Anyone superstitious?
[QUOTE=petrw1;603081]My 3 highest ranges need 6 and 6 and 6 more factors...
and there are 13 ranges left. Anyone superstitious?[/QUOTE] And build 13 just came out. Sounds like a good omen. :devil:
[QUOTE=petrw1;603037]And 1 more![/QUOTE]
Yeah! A ***big*** batch! A bit of a shock, actually...

Just to be able to have things to assign for the Colab TF'ers, I've brought in [URL="https://www.gpu72.com/low/#44"]some composite candidates in 44M[/URL] to bring from 72 to 73 bits (working down). Other suggestions are welcome. But as far as I can see, this project has been fully assigned, and the work now just needs to complete.

Well done Wayne (and, of course, et al)! Fun! :tu:
[QUOTE=chalsall;603134]Yeah! A ***big*** batch! A bit of a shock, actually...
Just to be able to have things to assign for the Colab TF'ers, I've brought in [URL="https://www.gpu72.com/low/#44"]some composite candidates in 44M[/URL] to bring from 72 to 73 bits (working down). Other suggestions are welcome. But as far as I can see, this project has been fully assigned, and the work now just needs to complete. Well done Wayne (and, of course, et al)! Fun! :tu:[/QUOTE] 44M seems like a good start. I prefer the idea of working 1 bit level at a time, starting at the highest range. Another option is to work on whichever range has the least GhzDays/assignment. You know: 44M at 72 bits = 22M at 71 bits = 11M at 70 bits, etc. As I mentioned earlier, once I am done with my 4 ranges ... in about a week ... I will offer to help others.
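(The equivalence quoted here follows from trial-factoring effort scaling roughly as 2^bits / exponent: each extra bit level doubles the candidate factors to test, while a larger exponent thins them out proportionally. A rough back-of-the-envelope Python sketch under that assumption; the proportionality constant is arbitrary, so only the ratios matter:)

[CODE]# Rough TF effort model: one bit level on exponent p costs ~ 2**bits / p
# (proportionality only; actual GHz-days credit includes constant factors).
def relative_tf_effort(exponent: int, bits: int) -> float:
    return 2**bits / exponent

for exponent, bits in [(44_000_000, 72), (22_000_000, 71), (11_000_000, 70)]:
    print(f"{exponent // 1_000_000}M to {bits} bits -> "
          f"{relative_tf_effort(exponent, bits):.3e}")
# All three lines print the same value, i.e. 44M at 72 bits = 22M at 71 bits, etc.
[/CODE]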
Less than 243 to go!
Once upon a time, we completed a single range with 243 candidates. It only took 20 months! It's not going to take quite so long to find the last 239.