2019-04-07, 20:19  #155 
Jul 2003
2·307 Posts 
hi,
i think that prime95 v29.7b1 is buggy, and there is also a problem with llr v3.8.22, so i will wait for better code ... and do more tf with mfaktc v0.21 
2019-04-07, 22:59  #156 
Sep 2003
5×11×47 Posts 
There are bugs for one specific type of work, but for Wagstaff PRP testing there is no problem. The Gerbicz error checking gives extra confidence.
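To make the error-checking idea concrete, here is a small sketch of a Gerbicz-checked squaring chain, the technique referred to above. The function name and the tiny parameters are illustrative only; production PRP software does the same bookkeeping on huge FFT-multiplied numbers, and the exact exponent conventions for a Wagstaff test differ in detail.

```python
def gerbicz_prp_pow(N, n, L=16, base=3):
    """Compute base^(2^n) mod N by n modular squarings, verifying the
    chain with a Gerbicz checksum every L squarings (sketch only).

    Checksum: d_j = x_0 * x_L * ... * x_{jL}, where x_i = base^(2^i).
    Since x_{(j+1)L} = x_{jL}^(2^L), each block must satisfy
        d_{j+1} == d_j^(2^L) * x_0  (mod N),
    so a corrupted squaring is detected at the next block boundary.
    """
    assert n % L == 0              # keep the sketch simple
    x0 = base % N
    x = x0                         # main chain: x = base^(2^i)
    d = x0                         # checksum, d_0 = x_0
    for _ in range(n // L):
        d_prev = d
        for _ in range(L):         # L squarings of the main chain
            x = x * x % N
        d = d_prev * x % N         # d_{j+1} = d_j * x_{(j+1)L}
        if pow(d_prev, 1 << L, N) * x0 % N != d:
            raise RuntimeError("Gerbicz check failed: computation error")
    return x
```

The cost is one extra multiply per L squarings plus L squarings of the checksum per block, which is negligible for large L, yet any single-bit error in the main chain invalidates the congruence with overwhelming probability.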

2019-04-08, 07:51  #157 
"Carlos Pinho"
Oct 2011
Milton Keynes, UK
11575_{8} Posts 
Grand Prix 2,
How much sieving was done on this? At my pace, my range will be completed within 450 days! 
2019-04-08, 08:41  #158  
Sep 2003
5×11×47 Posts 
Quote:
It's certainly not as deep as for Mersenne, where large numbers of people have contributed to factoring. For Mersenne, the levels in the 10M range are typically TF=69 and P−1 to about B1=p/40, B2=p/2. However, even if we did have TF and P−1 up to Mersenne levels, it would only eliminate a few percent of the remaining exponents, surely less than 10%. Finding factors gets exponentially harder at larger sizes, and most factors will simply remain out of reach. So one way or another, there's no way to avoid doing most of those PRP tests. Progress on Mersenne is faster only because the work is split up among a much larger number of contributors.

However, with 2048-bit residues, if you do a PRP test and then a factor is found later, you can do a very quick Gerbicz cofactor-compositeness test on the new cofactor. So the PRP test is not wasted, because at least there is a small chance of discovering a new very large PRP. I find that even with a simple implementation using GMP, a Gerbicz cofactor-compositeness test is about 50 times faster than a PRP cofactor test using the latest mprime AVX-512 implementation.

However, note that the Gerbicz test only removes the need to keep redoing PRP tests of new cofactors every time a new factor is discovered; you still have to do one initial PRP test and record the 2048-bit residue, because the Gerbicz test needs that 2048-bit residue as input.

Last fiddled with by GP2 on 2019-04-08 at 09:04 
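For concreteness, here is a sketch of the arithmetic behind a quick cofactor-compositeness shortcut of the kind described above, assuming the full saved Fermat residue R = 3^(N−1) mod N is available (the function name is hypothetical, and real Wagstaff residue conventions differ in detail):

```python
def cofactor_is_composite(R, N, f, base=3):
    """Given a saved Fermat residue R = base^(N-1) mod N and a newly
    found factor f of N, test the cofactor c = N // f for
    compositeness without redoing the full PRP test.

    If c were a base-`base` Fermat PRP, then since
        N - 1 = f*c - 1 = f*(c-1) + (f-1),
    Fermat's little theorem would force
        base^(N-1) == base^(f-1)  (mod c).
    So a mismatch proves c composite; a match makes c a Fermat PRP.
    Cost: one modular reduction of R plus one small powmod.
    """
    assert N % f == 0
    c = N // f
    return R % c != pow(base, f - 1, c)
```

This is why the saved full residue matters: the cheap check above consumes it, whereas without it every newly discovered factor would require a fresh multi-million-digit PRP test of the cofactor.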

2019-04-08, 14:58  #159  
Sep 2003
5·11·47 Posts 
Quote:
If we look at the Mersenne work distribution map, as of today the line for the 10M range shows:

Code:
10000000  61938  40593  21345

That is, 61938 exponents in the 10M range, of which 40593 are factored and 21345 remain unfactored.

For Wagstaff, there are currently 22,248 unfactored exponents in the 10M range, and the 10.2M subset contains 2206 of them, very close to 10%. So based on that, if we did factor Wagstaff exponents as thoroughly as Mersenne, we'd only find factors for about 4% of the currently unfactored Wagstaff exponents in the 10M range.

As you know, factoring gets exponentially harder as you increase bit length (for TF) or non-smoothness (for P−1). For any exponential curve, there is only a very narrow transition zone where you go from "incredibly tiny" to "impossibly large". The overwhelming majority of exponents are either trivial to factor or impossible to factor. All the years of effort by Primenet and all the GHz-days thrown at TF and P−1 actually only made a difference for a very small subset of exponents. But of course, it's impossible to know in advance which exponents those are.

Last fiddled with by GP2 on 2019-04-08 at 15:06 
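The "narrow transition zone" point can be made quantitative with the standard trial-factoring heuristic (commonly cited for Mersenne candidates, and roughly applicable here too): the chance of a factor in [2^b, 2^(b+1)) is about 1/b. A quick sketch of what six more bit levels of TF would buy:

```python
from fractions import Fraction

# Heuristic: probability of a factor in [2^b, 2^(b+1)) is ~1/b.
# Expected yield of pushing TF six more bit levels, 2^69 -> 2^75:
extra = sum(Fraction(1, b) for b in range(69, 75))
print(float(extra))  # ~0.084: only ~8% of candidates gain a factor
```

Each additional bit level doubles the TF work but adds only ~1/b to the expected yield, which is exactly why most remaining factors stay out of reach.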

2019-04-08, 16:28  #160 
"Carlos Pinho"
Oct 2011
Milton Keynes, UK
1001101111101_{2} Posts 
Apologies, but I'm releasing my range. No way I'll commit my laptop for more than a year on this.

2019-04-08, 16:43  #161 
"Curtis"
Feb 2005
Riverside, CA
3·19·89 Posts 
I, too, bit off a little more than I expected; in my case, it'll take me a month to free up a few cores, and then ~3 months to do the work. I'll get mprime going on one core in a few days, and then 3-5 more in May (sadly, not all on one machine). Carlos, why don't we share one 100k range for 3 months or so, e.g. you do 10k and I do 90k?
Last fiddled with by VBCurtis on 2019-04-08 at 16:44 
2019-04-08, 17:53  #162  
Sep 2003
101000011001_{2} Posts 
Quote:
Quote:
Currently I don't have any setup for automated assignment of individual exponents. Maybe there's some way to adapt it as a BOINC project, but I have no idea how to go about doing that. At some point, maybe a few months from now, I will resume my own testing using cloud resources. 

2019-04-08, 17:54  #163  
"Carlos Pinho"
Oct 2011
Milton Keynes, UK
3·1,663 Posts 
Quote:
Would you like to try https://boinc.tacc.utexas.edu/ ? Attached my tested numbers.

Last fiddled with by pinhodecarlos on 2019-04-08 at 17:56 

2019-04-09, 07:24  #164  
Mar 2018
10000001_{2} Posts 
Quote:
Then you need to write a validator that checks the results. Decide whether you want double-checking, and have the validator compare the residues from the two tests. Oh, and write work-generation scripts or software. A lot of fun if you're a programmer! It's very preferable to be familiar with PHP and MySQL, because you'll likely have to deal with them for various tasks. 
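As a minimal sketch of what the residue-comparing part of such a validator might do (the result format here is invented purely for illustration), two independently computed results for the same workunit are declared valid only if their residues match exactly:

```python
def validate_pair(result_a: str, result_b: str) -> bool:
    """Sketch of a double-check validator for a PRP project.

    Hypothetical result format: '<exponent>,<residue_hex>'.
    Two results for the same workunit validate iff they report the
    same exponent and bit-identical residues (case-insensitive hex).
    """
    exp_a, res_a = result_a.strip().split(",")
    exp_b, res_b = result_b.strip().split(",")
    return exp_a == exp_b and res_a.lower() == res_b.lower()
```

In a real BOINC project this comparison sits inside the server's validator framework, which handles fetching the two result files and granting credit; only the residue comparison is project-specific.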

2019-04-09, 13:07  #165 
Jul 2003
2·307 Posts 
hi,
a boinc project would be very nice! this is not an easy thing. you could look at
http://srbase.myfirewall.org/sr5/
http://srbase.myfirewall.org/sr5/do...baseguide.pdf
they use llr with a wrapper that comes with the boinc server software (not sure about this), but if so, you do not have to develop your own wrapper or a native boinc-integrated program 