[QUOTE=Christenson;280003] Is the TF effort for the 100M digit range worth keeping in the GPU to 72 tool?[/QUOTE]
GPU to 72 is geared towards LL wavefront, not to ranges that high. |
[QUOTE=Christenson;280003]Question for the peanut gallery: Is the TF effort for the 100M digit range worth keeping in the GPU to 72 tool? I like getting the exponents that have the least TF on them for my GPU.[/QUOTE][QUOTE=lycorn;280034]GPU to 72 is geared towards LL wavefront, not to ranges that high.[/QUOTE]I concur with lycorn. Grabbing exponents in the 100M digit range is easy. For the exponents in the GPU to 72 range, it makes more sense to have the bot do the reservations before they get handed out to someone who won't take them as far.
Your GPU is welcome in this range. I would prefer that you take exponents from 74 to 79 bits. But if you like lower bit levels, you could do some 'brush clearance': there are several thousand exponents below 71 bits in the range from 332,400,000 to 332,599,937. Or you could take the exponents below 332,400,000 that are at 71 up to 72. |
Here is the monthly status report for the range from 332192831 to 332399999:
[code]
Date of data                             12/20/2011
Average bit depth for first 100 expos    78.36
Average bit depth for first 1000 expos   77.12
100th active expo (no factor found)      332198357
1000th active expo (no factor found)     332246111
Unitless total effort number             169,589,760
Number of first 100 expos to 2^76        97
Number of first 1000 expos to 2^75       926
Number left in range                     4229
Estimated expos in range to be removed   292
  (by taking all expos to 2^79)
[/code][code]
Bit   # at bit level
71    1365
72     873
73     364
74     221
75     147
76     524
77     274
78      30
79     404
80      24
81       3
P-1    156   (≈13,700,000 effort)
[/code]There is only 1 current straggler below 76 bits in the first 100 exponents. All of the first 1000 are now at or above 75. The lone factor that we found this month was by P-1, and the exponent had already been tested to a fair bit level, so we saw a temporary decrease in our effort number. |
Here is the monthly status report for the range from 332192831 to 332399999:
[code]
Date of data                             1/20/2012
Average bit depth for first 100 expos    78.39
Average bit depth for first 1000 expos   77.14
100th active expo (no factor found)      332198357
1000th active expo (no factor found)     332246111
Unitless total effort number             171,429,888
Number of first 100 expos to 2^76        99
Number of first 1000 expos to 2^75       928
Number left in range                     4225
Estimated expos in range to be removed   289
  (by taking all expos to 2^79)
[/code][code]
Bit   # at bit level
71    1302
72     913
73     355
74     230
75     146
76     531
77     284
78      31
79     405
80      25
81       3
P-1    161
[/code] |
Yes, I know it has been 2 months. Wanted to see how many were actually paying attention to these.
Here is the 'monthly' status report for the range from 332192831 to 332399999:
[code]
Date of data                             3/20/2012
Average bit depth for first 100 expos    78.40
Average bit depth for first 1000 expos   77.16
100th active expo (no factor found)      332198357
1000th active expo (no factor found)     332246111
Unitless total effort number             173,400,064
Number of first 100 expos to 2^77        88
Number of first 1000 expos to 2^76       882
Number left in range                     4220
Estimated expos in range to be removed   285
  (by taking all expos to 2^79)
[/code][code]
Bit   # at bit level
71    1092
72    1084
73     341
74     273
75     136
76     543
77     286
78      30
79     406
80      25
81       4
P-1    178
[/code] |
What's the work effort? And is 79 before or after the extra three bits of GPU TF that has become the de facto standard? |
[QUOTE=Dubslow;294212]What's the work effort? And is 79 before or after the extra three bits of GPU TF that has become de facto standard?[/QUOTE]The unitless effort number is a sort of grand totalizer that I came up with.
173,974,528 effort units = 244,550 GHz-days.

I took 1 exponent taken to 61 bits to equal 1 effort unit. (This was because in Feb 2009 all exponents in the range were at 61 bits or higher.) The formula in Excel is =2^('cell with current bit level'-61). If a factor is found, the number can go down.

I have in the past looked at the amount of effort applied to find factors. (Just pulled the data, and Excel is calculating it as I type.) 2,896,837. And with some help from James H. I have been able to take the P-1 data, convert it to GHz-days, and back to the effort number: 179 P-1's (to various bounds) = 15,515,906. (All efforts: found factors + P-1 + TF = 192,387,271, or 270,430 GHz-days.)

Prime95 currently stops TF at 75 bits, does P-1, then resumes TF through 77 bits, then starts the LL on these. So 79 is the 'standard' + 2. |
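As a sanity check, the unitless effort number can be recomputed from a bit-level table: an exponent trial-factored to bit level b contributes 2^(b-61) units, and the total is the count-weighted sum. Here is a minimal Python sketch of that spreadsheet formula (the counts are transcribed from the 3/20/2012 report; the TF-only total comes out to exactly the report's figure):

```python
# One exponent trial-factored to 61 bits = 1 effort unit; each extra bit of
# TF doubles the work, so an exponent at bit level b counts 2**(b - 61).
# Bit-level counts transcribed from the 3/20/2012 status report.
bit_level_counts = {
    71: 1092, 72: 1084, 73: 341, 74: 273, 75: 136,
    76: 543, 77: 286, 78: 30, 79: 406, 80: 25, 81: 4,
}

# Count-weighted sum over all bit levels (P-1 work is tracked separately).
effort = sum(count * 2 ** (level - 61)
             for level, count in bit_level_counts.items())

print(f"{effort:,}")  # 173,400,064 -- matches the report's total
```

Note this reproduces only the TF portion; the GHz-days figures quoted in the thread also fold in the separately computed P-1 credit.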
[QUOTE=Uncwilly;294217]
So 79 is the 'standard' + 2.[/QUOTE] But 80 is such a nice round number... (kinda like 72, they're both incredibly smooth numbers) :smile: |
[QUOTE=Dubslow;294219]But 80 is such a nice round number... (kinda like 72, they're both incredibly smooth numbers) :smile:[/QUOTE]I don't have any GPU's to throw at the project. Currently I have 2 real cores on this machine taking expos from 73->74, 2 HT 'cores' on a machine 72->73, 1 HT 'core' 73->74, and 1 HT 'core' 71->72.
I gladly take any and all helpers; just use PrimeNet to reserve your work. |
When we've raced ahead of the LL wave, sure :smile:
OTOH, chalsall's algos currently predict there's ~200 days of TF work to be done up through 61M, so it might be a while :P |
[QUOTE=Uncwilly;294223]I don't have any GPU's to throw at the project. Currently I have 2 real cores on this machine taking expos from 73->74, 2 HT 'cores' on a machine 72->73, 1 HT 'core' 73->74, and 1 HT 'core' 71->72.
I gladly take any and all helpers, just use primenet to reserve your work.[/QUOTE] I will take 2 exponents that need to be taken 8 bits higher (like 70 to 78, or 71 to 79, or 72 to 80) when I get home in a few hours, and I will try some timings on my gtx580@782MHz. Depending on how long they take, I may finish them too... :D |