#1
Dec 2002
7·11² Posts
We had this discussion many years ago regarding intermediate files being stored on the server. Now that fiber is more common and a 3.5 MB attachment to an email is no longer frowned upon, I would like to bring the suggestion back to the table to have P-1 files sent to the server, so that if anyone wants to extend the work later, the previous work can be retrieved.
#2
Just call me Henry
"David"
Sep 2007
Liverpool (GMT/BST)
3²×5×7×19 Posts
On the same note, how big are the current LL save files?
#3
Jun 2005
USA, IL
193₁₀ Posts
LL save files are around 7 MB for current 10-million-digit assignments, or 40 MB for 100-million-digit assignments.
Last fiddled with by potonono on 2014-03-01 at 15:37 |
#4
Just call me Henry
"David"
Sep 2007
Liverpool (GMT/BST)
3²×5×7×19 Posts
How many LL tests are completed a day?
#5
1976 Toyota Corona years forever!
"Wayne"
Nov 2006
Saskatchewan, Canada
3·11·157 Posts
This has come up before and I still agree. So many LL tests are abandoned, some of them late in the run. I might suggest: for unproven workers, upload a save file at 25% and at each subsequent 10%. Then, if a test is abandoned, it can be completed by someone else. Credit is a side topic. A side benefit could be checking intermediate results for DC. If storage is an issue, delete the files once the work is reassigned and finished; if not, keep them for DC.
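Under one plausible reading of that schedule (first upload at 25% done, then one every further 10%), the upload points would be:

```python
# Sketch of the suggested checkpoint schedule: first save at 25% complete,
# then one every subsequent 10%. The exact schedule is an assumption here;
# the post only names the 25% start and the 10% interval.
def checkpoint_percentages(first: int = 25, step: int = 10) -> list[int]:
    """Return the completion percentages at which a save file is uploaded."""
    return list(range(first, 100, step))

print(checkpoint_percentages())  # [25, 35, 45, 55, 65, 75, 85, 95]
```

That is eight uploads per first-time test, which matters for the bandwidth estimates later in the thread.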
#6
May 2013
East. Always East.
11·157 Posts
Well, network and storage capabilities have improved since the project started, but it might be quite a leap to go from a 64-bit residue after a whole test to sending the whole damned checkpoint file every 10%, not to mention storing it.
On the other hand, I do honestly believe that this kind of thing will become the future of double-checking. If a save file is sent (even only every 25%), that allows the double-check to be broken into four pieces and decreases the amount of triple-check work: assuming an error is found somewhere in the third piece, the TC can start from the third piece instead of the first.
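The saving from piecewise double-checking can be put in numbers. A minimal sketch, assuming checkpoints every quarter and that the first mismatching quarter is known from comparing intermediate residues:

```python
# Sketch: fraction of a full test a triple-check (TC) must redo when save
# files exist every 1/pieces of the run. If the DC first disagrees with the
# LL result in piece k (1-based), the TC restarts from the last checkpoint
# that still matched, i.e. the start of piece k, rather than from iteration 0.
def tc_work_fraction(mismatch_piece: int, pieces: int = 4) -> float:
    """Fraction of the test the TC must redo, given the first bad piece."""
    last_good_checkpoint = (mismatch_piece - 1) / pieces
    return 1.0 - last_good_checkpoint

# Error first appears in the third of four pieces: TC redoes only the
# final 50% of the iterations instead of all of them.
print(tc_work_fraction(3))  # 0.5
```

With no checkpoints at all, every triple-check costs a full test (`tc_work_fraction(1)` is 1.0), so even coarse 25% uploads halve the expected TC cost on average.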
#7
Just call me Henry
"David"
Sep 2007
Liverpool (GMT/BST)
3²·5·7·19 Posts
Forgetting DC, 300 per day (I think DC is 150 per day, from memory).
That would add up to ~3 GB/day for LL and ~1 GB/day for DC if it was done once per test. That looks doable to me, although the server would likely need an upgrade, as it already runs out of disk space every so often. I don't know what sort of bandwidth PrimeNet has or is using, but ~4 GB/day isn't that much traffic; a slow ADSL line would handle it easily. P-1 files should be comparable in size, I imagine. Are either of these files compressed?
Last fiddled with by henryzz on 2014-03-01 at 21:42
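The back-of-envelope arithmetic behind those figures, using the counts and file sizes quoted earlier in the thread (the ~10 MB per LL file and ~7 MB per DC file are rough assumptions consistent with the 7 MB figure for 10-million-digit exponents):

```python
# Daily upload volume if every completed test sent one save file.
LL_PER_DAY = 300   # first-time LL tests completed per day (from the post)
DC_PER_DAY = 150   # double-checks per day ("from memory", per the post)
LL_FILE_MB = 10    # assumed save-file size for a wavefront LL exponent
DC_FILE_MB = 7     # DC exponents trail the wavefront, so files are smaller

ll_gb = LL_PER_DAY * LL_FILE_MB / 1024
dc_gb = DC_PER_DAY * DC_FILE_MB / 1024
print(f"LL: {ll_gb:.1f} GB/day, DC: {dc_gb:.1f} GB/day, "
      f"total: {ll_gb + dc_gb:.1f} GB/day")
```

That lands right around the ~3 GB + ~1 GB ≈ 4 GB/day quoted above; uploading at every 10% checkpoint rather than once per test would multiply it accordingly.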
#8
Jun 2005
USA, IL
193 Posts
The files I see are already in a compressed format.
#9
May 2013
East. Always East.
11×157 Posts
4 GB per day isn't *too* much assuming it's evenly distributed.
#10
"Kieren"
Jul 2011
In My Own Galaxy!
23656₈ Posts
At some point, this discussion should turn to fund-raising to support expansions of bandwidth and of storage capacity. Would these additional tasks also require more server CPU capacity?
#11
"Nathan"
Jul 2008
Maryland, USA
1115₁₀ Posts
Compression is only effective on files containing mostly non-random information, e.g. documents or uncompressed images, because compression exploits the predictable bit patterns in such files (large runs of 0s, repeated sequences, etc.). Put another way, those files contain a great deal of redundancy, which is easily identified and eliminated in the compression process.
GIMPS residue files, on the other hand, are effectively random by nature and do not feature predictable, easily compressed bit patterns. (If they did, perhaps GIMPS wouldn't be so interesting any more!) Since these files are essentially noise, you'll find that compressing them does little to reduce their size (try it!). My experience has been that zipping a GIMPS residue file only reduces its size by 1-2%. So while compression is an excellent idea in principle, it isn't going to be all that useful in practice.
That said, I do think it is worthwhile to consider accepting and crediting partial completions, particularly given the more stringent assignment expiration and recycling rules being implemented and the increasing size of a typical assignment. Ten percent of a 69M LL assignment is a nontrivial amount of work; ten percent of a 332M LL assignment even more so.
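The "try it!" invitation is easy to take up without an actual residue file. A minimal sketch, using random bytes as a stand-in for a residue file and repetitive text as a stand-in for a typical document:

```python
import os
import zlib

# High-entropy data standing in for a GIMPS residue file: effectively
# incompressible, so the deflated size is essentially unchanged.
random_data = os.urandom(1_000_000)

# Low-entropy, highly redundant data standing in for a typical document:
# compresses to a tiny fraction of its original size.
text_data = b"the quick brown fox jumps over the lazy dog " * 25_000

random_ratio = len(zlib.compress(random_data, 9)) / len(random_data)
text_ratio = len(zlib.compress(text_data, 9)) / len(text_data)
print(f"random data: {random_ratio:.3f}  repetitive text: {text_ratio:.3f}")
```

The random stand-in typically compresses to 100% or slightly more of its original size (deflate adds a little framing overhead), consistent with the 1-2% reduction reported above for real residue files.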
Thread | Thread Starter | Forum | Replies | Last Post |
Feature request (Again) | JuanTutors | Software | 7 | 2008-10-30 00:19 |
Feature Request | Uncwilly | Software | 0 | 2008-03-06 21:07 |
Feature request - contacting server again in 67 minutes | patrik | Software | 3 | 2006-09-25 06:37 |
Feature request | JuanTutors | Software | 2 | 2005-07-04 22:02 |
Feature Request | S3SJK | Software | 10 | 2005-02-13 00:40 |