mersenneforum.org 2020 Prime95 observations, issues, and suggestions

2020-01-23, 04:34   #12
kriesel

"TF79LL86GIMPS96gpu17"
Mar 2017
US midwest

3×1,877 Posts

Quote:
 Originally Posted by Uncwilly I was just thinking: with the work that Raphael Robinson did, he published his data. Those before him only published whether or not a number was prime. A number of the later searchers published residues and known factors. All of those were eyeball-friendly. This was good when George started GIMPS: we could DC previous work, because the previous data was available. My suggestion is: work with Archive.org to set up a repository of Mersenne data from the various projects. ... This is just me thinking about preserving the most important data long term.
James H and Madpoo could weigh in here about the size of the databases and the likelihood that they exceed what archive.org will tackle. As I recall, the PrimeNet server is now gathering 2048-bit residues, plus periodic residues along the way, for each prime exponent from the first test and from subsequent test(s), as well as TF and P-1 data. That's not small.
The forum itself is not replicated on archive.org, except for the top level, a listing of threads. I believe it has a robots.txt entry telling web crawlers to stay out. I learned that the hard way, after accidentally deleting an entire blog thread. There's now a separate backup mechanism for my blog, which I can run manually about weekly; it takes a fraction of an hour. But that's only useful while I'm available.
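For scale, a rough back-of-the-envelope on the residue data alone; the exponent and per-test residue counts here are illustrative guesses on my part, not server figures:

```python
# Rough storage estimate for full 2048-bit residues. The counts below are
# hypothetical, chosen only to show the order of magnitude involved.
RESIDUE_BITS = 2048
BYTES_PER_RESIDUE = RESIDUE_BITS // 8          # 256 bytes per residue

exponents_tested = 10_000_000                  # hypothetical number of tested exponents
residues_per_exponent = 5                      # final residue plus a few interim ones

total_bytes = exponents_tested * residues_per_exponent * BYTES_PER_RESIDUE
print(total_bytes / 1e9)                       # 12.8 (GB), before TF and P-1 data
```

Even with made-up counts, the point stands: full 2048-bit residues put the data set in the tens-of-gigabytes range, well beyond an eyeball-friendly table.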

2020-01-23, 13:22   #13
Xyzzy

"Mike"
Aug 2002

2×4,139 Posts

We thought we had it set up to specifically allow the IA and Google to crawl the forum.
Code:
User-agent: ia_archiver
Disallow:

User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /
Crawl-delay: 60
2020-01-23, 16:22   #14
kriesel

"TF79LL86GIMPS96gpu17"
Mar 2017
US midwest

3·1,877 Posts

Quote:
 Originally Posted by Xyzzy We thought we had it set up to specifically allow the IA and Google to crawl the forum.
Code:
User-agent: ia_archiver
Disallow:

User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /
Crawl-delay: 60
robots.txt is only advisory in any event; it's a suggestion or request, not a well-enforced law.
If you don't specifically Allow anything to any crawler (Allow does not appear there), specifically disallow nothing to a couple of specific user agents, and then specifically disallow the entire site for all user agents, what happens?
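One way to answer that: feed the file to Python's standard robots.txt parser and ask it. This is a sketch using urllib.robotparser on the exact file quoted above; the bot names in the checks are just examples.

```python
from urllib import robotparser

# The forum's robots.txt as quoted above.
ROBOTS = """\
User-agent: ia_archiver
Disallow:

User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /
Crawl-delay: 60
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS.splitlines())

# An empty "Disallow:" means "nothing is disallowed", so the two named
# crawlers are allowed everywhere...
print(rp.can_fetch("ia_archiver", "http://mersenneforum.org/showthread.php?t=1"))  # True
print(rp.can_fetch("Googlebot", "http://mersenneforum.org/"))                      # True
# ...while every other agent falls through to the blanket "Disallow: /".
print(rp.can_fetch("SomeOtherBot", "http://mersenneforum.org/"))                   # False
```

So, per the standard at least, the file does what Mike described: IA and Google get everything, everyone else gets nothing.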

Specific to the internet archive,
Code:
The volunteering group Archive Team explicitly ignores robots.txt for the most part, viewing it as an obsolete standard that hinders web archival efforts. According to project leader Jason Scott, "unchecked, and left alone, the robots.txt file ensures no mirroring or reference for items that may have general use and meaning beyond the website's context."[19]
For some years, the Internet Archive did not crawl sites with robots.txt, but in April 2017, it announced[20] that it would no longer honour directives in the robots.txt files. "Over time we have observed that the robots.txt files that are geared toward search engine crawlers do not necessarily serve our archival purposes".[21] This was in response to entire domains being tagged with robots.txt when the content became obsolete.
https://en.wikipedia.org/wiki/Robots_exclusion_standard

But for whatever reason, it seems to be only getting the top page. Check it out:
http://web.archive.org/web/202001170...enneforum.org/
Links from that page to posts or threads are not archived. That may be an issue with the internet archive, not the robots.txt. http://web.archive.org/web/*/http://...nneforum.org/*
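One can check what the Internet Archive actually has via its "availability" endpoint (http://archive.org/wayback/available?url=...), which returns JSON describing the closest snapshot. Below is a minimal parser for that response; the sample payload is hand-written for illustration, not a real capture.

```python
import json

def closest_snapshot(payload: dict):
    """Return the URL of the closest archived snapshot, or None if absent."""
    snap = payload.get("archived_snapshots", {}).get("closest")
    if snap and snap.get("available"):
        return snap["url"]
    return None

# Illustrative response shaped like the availability API's JSON.
sample = json.loads("""
{
  "url": "mersenneforum.org",
  "archived_snapshots": {
    "closest": {
      "available": true,
      "url": "http://web.archive.org/web/20200117000000/http://mersenneforum.org/",
      "timestamp": "20200117000000",
      "status": "200"
    }
  }
}
""")
print(closest_snapshot(sample))
# An unarchived page comes back with an empty "archived_snapshots" object:
print(closest_snapshot({"archived_snapshots": {}}))  # None
```

Querying it for individual thread URLs (rather than the top page) would confirm whether the gap is on the archive's side.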

Last fiddled with by kriesel on 2020-01-23 at 16:27

2020-01-23, 21:40   #15
Uncwilly
6809 > 6502

"""""""""""""""""""
Aug 2003
101×103 Posts

2×4,957 Posts

Quote:
 Originally Posted by kriesel James H and Madpoo could weigh in here about the size of the databases and the likelihood that they exceed what archive.org will tackle. As I recall, the PrimeNet server is now gathering 2048-bit residues, plus periodic residues along the way, for each prime exponent from the first test and from subsequent test(s), as well as TF and P-1 data. That's not small.
A 128-bit residue would be plenty. There would be no need to store interim residues. TF data can be a summary of bits done and, if a factor was found, whether the last bit level was completed. P-1 can be trimmed up: if someone ran exponent Z 3 times with increasing bounds, doesn't the largest set cover the lesser? We are only concerned with what would be needed for someone to resume the project if all the current infrastructure goes away.
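The minimal per-exponent record being described might look something like the sketch below. Field names and sizes are my own guesses for illustration, not any actual GIMPS format:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ExponentRecord:
    """One archival record per exponent: truncated residue, TF summary,
    single covering P-1 bounds set. Hypothetical layout, not a GIMPS format."""
    exponent: int
    res128: Optional[str]    # 128-bit residue as 32 hex chars, if a test completed
    tf_bits: int             # trial factoring completed to 2^tf_bits
    factor: Optional[int]    # known factor, if any
    pm1_b1: int = 0          # largest single P-1 bounds set
    pm1_b2: int = 0

# Example record with made-up bounds for a well-known exponent:
rec = ExponentRecord(exponent=77232917, res128=None, tf_bits=77,
                     factor=None, pm1_b1=3_000_000, pm1_b2=90_000_000)
print(rec.exponent)
```

A flat file of records like this would stay small and eyeball-friendly, which is the whole point of the proposal.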

2020-01-24, 00:50   #16
Xyzzy

"Mike"
Aug 2002

2×4,139 Posts

As an experiment, we have disabled the robots.txt file.
2020-01-24, 01:49   #17
kriesel

"TF79LL86GIMPS96gpu17"
Mar 2017
US midwest

5,631 Posts

Quote:
 Originally Posted by Uncwilly A 128-bit residue would be plenty. There would be no need to store interim residues. TF data can be a summary of bits done and, if a factor was found, whether the last bit level was completed. P-1 can be trimmed up: if someone ran exponent Z 3 times with increasing bounds, doesn't the largest set cover the lesser? We are only concerned with what would be needed for someone to resume the project if all the current infrastructure goes away.
If I recall correctly, the 2048-bit residues are necessary for PRP cofactor work and relate to one of Gerbicz's ideas. To continue exponents in progress forward to a double/triple-check conclusion, it would be useful if their interim residues remained available. Not a requirement, but useful. Completed P-1 could be represented by a single bounds set IF the largest B1 and largest B2 were in the SAME run to completion. A completed DC/TC primality test of an exponent would need the residue type and res64.
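The P-1 caveat is worth making concrete: one run covers another only if it reached at least the other's B1 AND B2 in the same run. A tiny sketch, with made-up bounds:

```python
def pm1_covers(b1_a: int, b2_a: int, b1_b: int, b2_b: int) -> bool:
    """Run A makes run B redundant only if A reached both of B's bounds."""
    return b1_a >= b1_b and b2_a >= b2_b

# Three hypothetical runs on one exponent, with increasing B1 but shrinking B2:
runs = [(1_000_000, 30_000_000), (2_000_000, 20_000_000), (3_000_000, 10_000_000)]

# Which single run covers all the others? Here, none does, so no single
# bounds set can faithfully summarize the work already completed.
covering = [a for a in runs if all(pm1_covers(*a, *b) for b in runs)]
print(covering)  # []
```

So "keep only the largest bounds set" works only when the per-run maxima coincide, exactly as stated above.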

2020-01-24, 06:09   #18
Bulldozer

Jun 2019

21 Posts

Quote:
 Originally Posted by rainchill Hey Folks - just a little list of things I have been thinking about to start the new year with. 13. Primenet stop automatically assigning first time tests to clients running older builds. 14. Have separate shell and execution core. When new code is ready it can automatically download the new execution code similar to how folding@home works. And/or possibly join boinc. 17. Add a timebomb to prevent Prime95 from accepting work assignments from the server if the client build is more than 3 years from the date it was compiled. This will prevent old builds from running forever, encourage folks to upgrade, and help prevent old potentially buggy or unoptimized code from staying out there too long. Older builds would only work for testing and benchmarks.
These could be solved by a UWP port, which I am currently working on. The Microsoft Store has automatic updates, and WinRT (UWP) apps have separate UI and logic code.
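The "timebomb" from item 17 of the quoted list is simple to state as code. A minimal sketch; the three-year window and the date handling are assumptions on my part, not anything from Prime95:

```python
from datetime import date

def build_expired(build_date: date, today: date, max_years: int = 3) -> bool:
    """Refuse server assignments once the build is more than max_years old."""
    try:
        cutoff = build_date.replace(year=build_date.year + max_years)
    except ValueError:            # build compiled on Feb 29, target year not leap
        cutoff = build_date.replace(year=build_date.year + max_years, day=28)
    return today > cutoff

# A build from early 2017 would be locked out by mid-2020; a 2019 build is fine.
print(build_expired(date(2017, 1, 15), date(2020, 6, 1)))   # True
print(build_expired(date(2019, 1, 15), date(2020, 6, 1)))   # False
```

Expired builds could still run self-tests and benchmarks locally, as the suggestion allows; only the assignment request would be refused.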

2020-04-24, 22:52   #19
ixfd64
Bemusing Prompter

"Danny"
Dec 2002
California

2×3×401 Posts

Another thing I recently realized is that the project as a whole needs to be modernized. In particular, here are two ideas that can be implemented without too much trouble:

1. All available source code should be uploaded to public repositories. This would encourage people to contribute code as well as make it easier to keep track of issues. So far, only mfakto, gpuOwl and CUDAPm1 use publicly accessible version control systems.

2. It would be nice to have official GIMPS social media accounts. Although not very important in the grand scheme of things, this could potentially bring in more users. GIMPS remains one of the few distributed computing projects without a social media presence.

Last fiddled with by ixfd64 on 2020-04-24 at 22:53
2020-04-27, 22:33   #20
Dylan14

"Dylan"
Mar 2017

32·5·13 Posts

Quote:
 Originally Posted by ixfd64 Another thing I recently realized is that the project as a whole needs to be modernized. In particular, here are two ideas that can be implemented without too much trouble: 1. All available source code should be uploaded to public repositories. This would encourage people to contribute code as well as make it easier to keep track of issues. So far, only mfakto, gpuOwl and CUDAPm1 use publicly accessible version control systems.
I definitely agree with this, even though the source code is publicly available on the download page.

Quote:
 Originally Posted by ixfd64 2. It would be nice to have official GIMPS social media accounts. Although not very important in the grand scheme of things, this could potentially bring in more users. GIMPS remains one of the few distributed computing projects without a social media presence.
Certainly this would help in getting the word out, especially when a prime is found. We may want to be a bit careful with the handle, however, since when I search for GIMPS on social media (i.e. Facebook, Twitter or Instagram) I get either
1) the software GIMP, the GNU Image Manipulation Program, or
2) people calling others gimps, which is apparently a derogatory term. But I suppose that's neither here nor there.

2020-05-02, 23:00   #21
Viliam Furik

"Viliam Furík"
Jul 2018
Martin, Slovakia

653 Posts
GIMPS logo and social media

Quote:
 Originally Posted by Dylan14 Certainly this would help in getting the word out, especially when a prime is found.
I have made a Facebook profile (here) for GIMPS using a logo I created. The logo was approved by George as "quite good".

If somebody is interested in co-administrating the profile, please send me a message.

2020-05-03, 07:06   #22
Nick

Dec 2012
The Netherlands

1,721 Posts

If you include a geometric figure then I think it would be better if it was mathematically accurate!

