20180312, 04:03  #1 
Bemusing Prompter
"Danny"
Dec 2002
California
2^{2}·3·197 Posts 
Coordination thread for redoing P1 factoring
Some of us have been redoing P1 on exponents with only stage 1 done. But because it's not (always) possible to register such assignments, this creates the risk of stepping on toes. Therefore, I decided to create this thread to coordinate such factoring efforts. Feel free to share which ranges you're working on and any interesting factors you find.
I'll start: I have three machines that are redoing P1 factoring:
All three computers are alternating between normal P1 factoring and rerunning P1 on exponents without stage 2 done.
Last fiddled with by ixfd64 on 20180312 at 04:04 
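For anyone following along, the stage 1 being redone here can be sketched in a few lines. This is a toy illustration of the P-1 method, not GIMPS/Prime95 code; the small N and bound below are made-up example numbers:

```python
from math import gcd, log

def primes_up_to(n):
    """Simple sieve of Eratosthenes."""
    sieve = bytearray([1]) * (n + 1)
    sieve[0:2] = b"\x00\x00"
    for i in range(2, int(n ** 0.5) + 1):
        if sieve[i]:
            sieve[i * i :: i] = bytearray(len(sieve[i * i :: i]))
    return [i for i in range(2, n + 1) if sieve[i]]

def pminus1_stage1(N, B1):
    """P-1 stage 1: can return a prime factor p of N when p-1 is B1-smooth."""
    a = 3
    for q in primes_up_to(B1):
        # raise a to q's highest power not exceeding B1
        e = int(log(B1) / log(q))
        a = pow(a, q ** e, N)
    d = gcd(a - 1, N)
    return d if 1 < d < N else None

# Toy example: 853367 = 421 * 2027, and 421 - 1 = 2^2 * 3 * 5 * 7 is 10-smooth
print(pminus1_stage1(853367, 10))  # -> 421
```

Stage 2 then extends the reach to factors p where p-1 has one additional prime up to B2, which is exactly what these reruns are adding.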
20180312, 17:31  #2 
1976 Toyota Corona years forever!
"Wayne"
Nov 2006
Saskatchewan, Canada
1000111000111_{2} Posts 
Repost from the prior thread
For me:
- 50-59M range
- For any .1M range that has more than 1999 unfactored
- For exponents where the current P1 has B1=B2 (and not excessively large)

I am running PMinus1 with B1=1000000, B2=20000000 (1M, 20M). I expect to be at this all of 2018, but if anything changes... I'll post 
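As a quick aid for interpreting bounds like these: a P-1 run with bounds (B1, B2) can find a prime factor p when p-1 is B1-smooth apart from at most one prime factor up to B2. A hedged sketch of that criterion (the example numbers below are made up for illustration):

```python
def within_p1_bounds(p_minus_1_factors, B1, B2):
    """p_minus_1_factors: prime factors of p-1, with multiplicity.
    True if a P-1 run with bounds (B1, B2) could find the factor p:
    everything below B1, except at most one prime in (B1, B2]."""
    large = [q for q in p_minus_1_factors if q > B1]
    return len(large) == 0 or (len(large) == 1 and large[0] <= B2)

# Illustrative p-1 = 2^3 * 3 * 5 * q, with one large cofactor q near 10^7
factors = [2, 2, 2, 3, 5, 10000019]
print(within_p1_bounds(factors, 10**6, 2 * 10**7))  # -> True  (B2=20M catches q)
print(within_p1_bounds(factors, 10**6, 5 * 10**6))  # -> False (B2=5M is too small)
```

This is why rerunning exponents whose current result has B1=B2 (i.e. no stage 2) with a much larger B2 picks up factors the original run could not.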
20180312, 17:53  #3 
If I May
"Chris Halsall"
Sep 2002
Barbados
3·7·11·41 Posts 
I currently have four of my machines working in the 40M to 49M ranges (inclusive). They focus on a particular 0.1M range at a time. Currently they're working 45.7M, and will then focus on 44.2M (in about a week). I try to reserve them from Primenet so people see this activity (not always possible, since some of them have already had a DC).
For anyone who is interested, I'm letting mprime decide the bounds, based on four LL assignments being saved (doesn't make sense, I know, but it's my kit). For 45.7M I've so far run 198 tests, and found 7 factors. 
20180312, 18:04  #4 
"Victor de Hollander"
Aug 2011
the Netherlands
2^{3}×3×7^{2} Posts 
Most of my unreserved P1 effort was in the range 1.5M - 1.7M (B1=10e6, B2=200e6), which I'm currently running ECM on (B1=50,000).
Also I know Jocelyn Larouche is doing P1 in the region below 4M. 
20180312, 23:23  #5 
Jul 2004
Milan, Ita
3×61 Posts 
I have two older machines slooowly doing P1 on exponents having B1<150k & B2<1M
just for fun... 
20180313, 11:03  #6 
Banned
"Luigi"
Aug 2002
Team Italia
12C0_{16} Posts 
I am doing P1 testing from time to time, taking exponents that have had poor or no stage 2, preferring smaller ones.

20180322, 21:01  #7 
Bemusing Prompter
"Danny"
Dec 2002
California
2^{2}·3·197 Posts 
Update: I'm done with the 43.6M and 77.2M ranges for the time being. The MacBook Pro is now redoing exponents in the 44.1M range.
Chris: I see that you've reserved a few exponents in the 44.1M range as well. Are you planning to do more? 
20180323, 02:27  #8 
1976 Toyota Corona years forever!
"Wayne"
Nov 2006
Saskatchewan, Canada
3·37·41 Posts 
Opinions or observations please...
As you may have noticed I am on a P1 binge recently.
I have full-time P1 running on:
- 2 x 2-core PCs
- 6 x 4-core PCs

Some have as much as 16GB of RAM; a couple only 4GB. I have noticed over the past years of doing P1 that more RAM makes a big difference in stage 2. Simple observations have shown that running 480 relative primes in batches of 10 takes noticeably longer than the same run in batches of 120, for example.

(I wouldn't be surprised if the following has been noted before and I missed it...) That got me thinking that, especially for the PCs with 4GB or 8GB of RAM, I should complete more total tests per week by running 2 workers of 2 cores each rather than 4 workers with 1 core each. Stage 1 may be slightly slower, but stage 2 should be enough faster to more than make up for it: with only 2 workers fighting for RAM, each gets a lot more and can process more relative primes per pass.

Opinions? 
20180404, 09:39  #9 
Dec 2002
1447_{8} Posts 
I work in the 10M to 25M range. Currently in the 11M range and the 22M range.

20180405, 13:12  #10 
Jul 2004
Milan, Ita
267_{8} Posts 

20180405, 14:30  #11 
If I May
"Chris Halsall"
Sep 2002
Barbados
9471_{10} Posts 
Keep in mind that when you're redoing P1 work, you should expect a lower probability of finding a factor than what Prime95/mprime reports. As a rough guide, I subtract the previous run's probability from the reported expectation.
When I'm redoing poorly P1'ed work (read: no Stage 2) I tell mprime that the test will save four LL tests, and give it between 10GB and 12GB of RAM to use. Doesn't make sense, I know, but it's my kit and electrons. On average in the 4xM range I get about a 3% success rate. 
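The adjustment described above amounts to treating the reported figure as the cumulative probability and netting out what the earlier run already covered. A trivial sketch (the percentages below are made-up examples, not mprime output):

```python
def incremental_p1_chance(reported, previous_run):
    """Estimated chance that a rerun with larger bounds finds a factor the
    earlier (e.g. stage-1-only) run could not have found."""
    return max(reported - previous_run, 0.0)

# e.g. mprime reports 5%, but the old run was already worth ~2%:
print(round(incremental_p1_chance(0.05, 0.02), 4))  # -> 0.03
```

This is a rough guide rather than an exact model, since the reported and previous probabilities aren't strictly additive, but it matches the subtraction rule of thumb quoted above.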