mersenneforum.org

mersenneforum.org (https://www.mersenneforum.org/index.php)
-   Lone Mersenne Hunters (https://www.mersenneforum.org/forumdisplay.php?f=12)
-   -   Coordination thread for redoing P-1 factoring (https://www.mersenneforum.org/showthread.php?t=23152)

ixfd64 2018-03-12 04:03

Coordination thread for redoing P-1 factoring
 
Some of us have been redoing P-1 on exponents with only stage 1 done. But because [url=http://mersenneforum.org/showthread.php?t=23110]it's not (always) possible[/url] to register such assignments, this creates the risk of stepping on toes. Therefore, I decided to create this thread to coordinate such factoring efforts. Feel free to share which ranges you're working on and any interesting factors you find. :smile:

I'll start: I have three machines that are redoing P-1 factoring:[LIST][*]A dual-core laptop working on exponents around 43.9m and 47.9m[*]A quad-core MacBook Pro working on exponents around 43.6m[*]A quad-core desktop working on exponents around 47.6m and 77.2m[/LIST]
All three computers are alternating between normal P-1 factoring and rerunning P-1 on exponents without stage 2 done.
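For anyone joining in who hasn't looked under the hood: stage 1 of P-1 computes a^E mod N, where E is the product of all prime powers up to B1, and a single gcd then reveals any factor f of N whose f-1 is B1-smooth. A toy sketch of just that arithmetic (nothing like Prime95's optimized code, but the same idea), using the well-known factor 13367 of M41 since 13366 = 2 * 41 * 163:

```python
from math import gcd

def primes_up_to(n):
    """Simple sieve of Eratosthenes."""
    sieve = bytearray([1]) * (n + 1)
    sieve[0:2] = b"\x00\x00"
    for i in range(2, int(n ** 0.5) + 1):
        if sieve[i]:
            sieve[i * i :: i] = bytearray(len(range(i * i, n + 1, i)))
    return [i for i, is_p in enumerate(sieve) if is_p]

def p1_stage1(N, B1, a=3):
    """P-1 stage 1: x = a^E mod N with E the product of all prime
    powers <= B1; gcd(x - 1, N) pulls out any factor f of N whose
    f-1 is B1-smooth. (Stage 2 then extends this to factors with
    one extra prime between B1 and B2.)"""
    x = a
    for p in primes_up_to(B1):
        pe = p
        while pe * p <= B1:  # raise p to the largest power <= B1
            pe *= p
        x = pow(x, pe, N)
    return gcd(x - 1, N)

# 13366 = 2 * 41 * 163 is 163-smooth, so B1 = 163 already suffices:
g = p1_stage1(2 ** 41 - 1, 163)
print(g)
```

The real clients do the stage-1 product far more cleverly, but the gcd at the end is the same trick.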

petrw1 2018-03-12 17:31

Repost from the prior thread
 
For me:
- 50-59M range
- For any 0.1M range that has more than 1999 unfactored exponents
- For exponents whose current P-1 run has B1=B2 (and B1 not excessively large)
I am running Pminus1 with B1=1000000, B2=20000000 (1M, 20M)
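(An aside on why B1=B2 entries are worth redoing: B1=B2 means stage 2 was never run. Whether a given pair of bounds would catch a particular prime factor f comes down to the smoothness of f-1. A rough self-contained check, with a made-up small prime 6079 = 2 * 3 * 1013 + 1 standing in for a real candidate factor:)

```python
def p1_can_find(f, B1, B2):
    """True if a P-1 run with bounds B1 <= B2 would find the prime
    factor f: f-1 must be B1-smooth except for at most one leftover
    prime in (B1, B2]. Trial division is fine at toy sizes.
    Assumes B2 < B1*B1 so a leftover m <= B2 is necessarily prime."""
    m = f - 1
    for d in range(2, B1 + 1):  # strip every prime factor <= B1
        while m % d == 0:
            m //= d
        if m == 1:
            break
    # after stripping, m is either 1 (stage 1 finds f) or has no
    # factor <= B1; stage 2 catches it if it lands in (B1, B2]
    return m == 1 or m <= B2

# 6078 = 2 * 3 * 1013: stage 2 is what catches the 1013
print(p1_can_find(6079, 100, 2000))  # True  (1013 is in (100, 2000])
print(p1_can_find(6079, 100, 500))   # False (1013 misses both bounds)
```

So a run with a decent B2 picks up factors a B1=B2 run had no chance at, which is the whole point of redoing these.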

I expect to be at this all of 2018, but if anything changes... I'll post.

chalsall 2018-03-12 17:53

I currently have four of my machines working in the 40M to 49M ranges (inclusive). They focus on a particular 0.1M range at a time. Currently they're working 45.7M, and will then focus on 44.2M (in about a week). I try to reserve them from Primenet so people see this activity (not always possible, since some of them have already had a DC).

For anyone who is interested, I'm letting mprime decide the bounds, based on four LL assignments being saved (doesn't make sense, I know, but it's my kit).

For 45.7M I've so far run 198 tests, and found 7 factors.

VictordeHolland 2018-03-12 18:04

Most of my unreserved P-1 effort was in the range 1.5M - 1.7M (B1=10e6 B2=200e6), which I'm currently running ECM on (B1=50,000).

Also I know Jocelyn Larouche is doing P-1 in the region below 4M.

ric 2018-03-12 23:23

I've two older machines slooowly doing P-1 on expos having B1<150k & B2<1M[LIST][*]in the range 12.2M to 12.4M[*]in the range 15M to 15.3M[/LIST]
just for fun...

ET_ 2018-03-13 11:03

I am doing P-1 testing from time to time, taking exponents that have had poor or no stage 2, preferring smaller ones.

ixfd64 2018-03-22 21:01

Update: I'm done with the 43.6m and 77.2m ranges for the time being. The MacBook Pro is now redoing exponents in the 44.1m range.

Chris: I see that you've reserved a few exponents in the 44.1m range as well. Are you planning to do more?

petrw1 2018-03-23 02:27

Opinions or observations please...
 
As you may have noticed I am on a P-1 binge recently.
I have full time P-1 running on:
2 x 2-core PC's
6 x 4-core PC's

Some have as much as 16GB of RAM; a couple only 4GB.
I have noticed over the past years of doing P-1 that more RAM makes a big difference in Phase 2.

Simple observations have shown that running 480 relative primes in batches of 10 takes noticeably longer than the same run in batches of 120, for example.

(I wouldn't be surprised if the following has been noted before and I missed it)...

So that got me thinking: especially for the PCs with 4GB or 8GB of RAM, I should complete more total tests per week if I ran 2 workers of 2 cores each rather than 4 workers of 1 core each.
Phase 1 may be slightly slower, but phase 2 should be enough faster to more than make up for it: with only 2 workers fighting for RAM, each gets a lot more and can process more relative primes per pass.
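(The batch-size arithmetic behind this is simple enough to sketch. Each pass over the stage-2 prime range handles one batch of relative primes, so the pass count is just a ceiling division; the 480/10 vs. 480/120 numbers above work out like this:)

```python
from math import ceil

def stage2_passes(batch_size, total_rel_primes=480):
    """Number of passes over the stage-2 prime range when the
    available RAM only fits batch_size relative primes at once:
    fewer, larger batches mean fewer passes."""
    return ceil(total_rel_primes / batch_size)

print(stage2_passes(10))   # RAM-starved worker: many passes
print(stage2_passes(120))  # well-fed worker: few passes
```

So halving the worker count while roughly doubling each worker's batch size cuts the per-test pass count a lot more than the slightly slower phase 1 costs, which is the trade-off being asked about.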

Opinions?

tha 2018-04-04 09:39

I work in the 10M to 25M range. Currently in the 11M range and the 22M range.

ric 2018-04-05 13:12

[QUOTE=ric;482194][LIST][*]in the range 12.2M to 12.4M[/LIST][/QUOTE]

324 cands, expected prob ~2.5%, 3 new factors (0.9%).
Meh!

Right now working on 12.0M to 12.2M, then will extend from 12.4M to 13M.

chalsall 2018-04-05 14:30

[QUOTE=ric;484376]324 cands, expected prob ~2.5%, 3 new factors (0.9%). Meh![/QUOTE]

Keep in mind that when you're redoing P-1 work you should expect a lower probability than what Prime95/mprime reports. As a rough guide, I subtract the previous run's probability from what is reported as expected.
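(In other words, any factor the old bounds could have found is already gone, so only the incremental probability is left. As a back-of-the-envelope helper, with made-up example numbers:)

```python
def adjusted_p1_probability(reported, previous):
    """Rough adjustment when redoing P-1: subtract the probability
    the previous run's bounds already covered from what the client
    reports for the new bounds; clamp at zero."""
    return max(reported - previous, 0.0)

# e.g. client reports 5% at the new bounds, old stage-1-only
# bounds were worth about 2.1% -> expect roughly 2.9% in practice
print(adjusted_p1_probability(0.05, 0.021))
```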

When I'm redoing poorly P-1'ed work (read: no Stage 2) I tell mprime that the test will save four LL tests, and give it between 10GB and 12GB of RAM to use. Doesn't make sense, I know, but it's my kit and electrons. On average in the 4xM range I get about a 3% success rate.

