mersenneforum.org Questions about PrimeNet graphs and overall trends

 2019-08-05, 16:21 #1 hansl     Apr 2019 5×41 Posts Questions about PrimeNet graphs and overall trends

Hi, I'm just trying to get an idea of how the overall project is progressing, and there are a few discrepancies that make me think I may be interpreting the data incorrectly.

One question I had is whether the DC gap is narrowing or widening. The Assignments chart on this page: https://www.mersenne.org/primenet/graphs.php shows ~4.3x more DC assignments than LL, based on the trend line's most recent data point. This would lead me to believe that 4.3x more DC work is being done than LL, but then looking at this table of data which summarizes the overall progress: http://hoegge.dk/mersenne/GIMPSstats.html and comparing the total change in LL-D vs LL, it instead shows only ~1.6x LL-D/LL over the past 30 days, and ~2x LL-D/LL over the past year. Can anyone explain what this could mean? Is it that DC assignments are simply not completed, or abandoned at a much higher rate than LL?

Also, going back to the graphs page, the TFLOPS graph has a small downward trend. Is overall interest and participation in the project dwindling lately? Are there longer-term graphs anywhere to see when it peaked? I expected to see more of an upward trend as CPU power increases over time, even if the number of users was relatively constant. What is the unit for X in the graph's trend-line equations? I don't mean to be overly negative, but my morbid curiosity wants to know when the trend line would predict the death of GIMPS at 0 TFLOPS.

The "New Accounts per day" graph presents an unrealistically optimistic view that hundreds of new people are joining the project every day, but as I understand it this is mainly due to overclockers and PC builders using Prime95 for stress testing without really contributing results.

A perhaps more informative or interesting graph would be the number of "active" accounts that have reported any results in the past X (30 or so?) days.

Another question, related to the overclocker effect: would it be feasible for stress tests to actually do some kind of meaningful work and submit tiny partial results during these relatively short runs? I wonder how many GHz-days/day of stress tests are performed overall?

An overall downward trend in results makes sense to me at some level, since the work per exponent is always increasing. Could the apparent decrease in participation also be partly psycho-social, with the increasing perceived effort per reward discouraging further effort?
 2019-08-05, 17:19 #2 kriesel     "TF79LL86GIMPS96gpu17" Mar 2017 US midwest 11·347 Posts

It's been several months since the last new Mersenne prime was found, and with its holiday announcement it did not generate as many new signups as usual. Participation in number of active users is down from its peak. I think participation is lighter during summer, in general.

The abandonment rate of DC vs first-test LL or PRP is unclear. First-test assignments that are abandoned get reclassified as DC after a first test is reported. Some exponents show 8 or more abandoned primality-test assignments. New people who join without realizing how long one assignment takes may quit early, particularly the least patient, during their first assignment, which is typically a DC assignment from the start.

Twice as many DCs being completed at ~48M as first tests at ~86M represents about 60% as much DC computing effort as first-test effort, since primality-test effort per exponent scales as ~p^2.1.

It's likely that the trend function graphed as a straight line would, across a broad range of lower participation, actually look more like a decaying exponential with an asymptote; it would not have a zero crossing.

I'm guessing the X in the trend-line equation is something like the Excel date-and-time serial number. Today() on August 5 2019 is 43682; tomorrow is 43683.

Political publicity about climate change is probably having some effect; I've heard from some participants that they would run more if not for that consideration.

The number that have signed up, the number that have ever contributed a result, the number that contributed a result in the past year, and the number that have contributed a result in the past 30 days are very different.

Last fiddled with by kriesel on 2019-08-05 at 17:21
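A quick sanity check of two of the figures discussed above, as a hedged Python sketch. The 2.1 exponent and the ~48M/~86M wavefronts are taken from the post; the Excel serial-date convention (day 1 = 1900-01-01, with the well-known off-by-one for the fictitious 1900-02-29) is an assumption about what the trend line's X means.

```python
from datetime import date

# Primality-test effort scales roughly as p^2.1 (per the post above).
# Two DCs at ~48M vs one first test at ~86M:
effort_ratio = 2 * (48e6 / 86e6) ** 2.1
print(f"DC effort relative to first-test effort: {effort_ratio:.2f}")  # ~0.59

# Excel serial dates count from 1900-01-01 = 1, but because Excel treats
# 1900 as a leap year, dates after Feb 1900 are days since 1899-12-30.
excel_epoch = date(1899, 12, 30)
serial = (date(2019, 8, 5) - excel_epoch).days
print(f"Excel serial for 2019-08-05: {serial}")  # 43682
```

The second computation suggests the serial for 2019-08-05 is 43682 (2020-01-01 is 43831, a commonly cited reference point).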
2019-08-05, 18:20   #3
chalsall
If I May

"Chris Halsall"
Sep 2002

8990₁₀ Posts

Quote:
 Originally Posted by kriesel The abandon rate of DC vs first test LL or PRP is unclear.
The abandonment rate of DCs is *extreme*.

Just over five years ago new assignment rules were implemented, which introduced the concept of categories. New users are by default first given DC Category 4 ("Cat 4") assignments until they prove they're going to stick around, and that their kit is good.

The abandonment rate there is so high it's nicknamed the "Churners' Zone". This is why you see tens of thousands of DC assignments in 52M and above, but only a couple of thousand below.

2019-08-05, 23:08   #4
kriesel

"TF79LL86GIMPS96gpu17"
Mar 2017
US midwest

11×347 Posts

Quote:
 Originally Posted by chalsall The abandonment rate of DCs is *extreme*. Just over five years ago new assignment rules were implemented, which introduced the concept of categories. New users are by default first given DC Category 4 ("Cat 4") assignments until they prove they're going to stick around, and that their kit is good. The abandonment rate there is so high it's nicknamed the "Churners' Zone". This is why you see tens of thousands of DC assignments in 52M and above, but only a couple of thousand below.
Hmm.
https://www.mersenne.org/report_expo...2001000&full=1
abandoned per exponent, 0 to 10

https://www.mersenne.org/report_expo...3001000&full=1
abandoned per exponent, 0 to 10.

These samples are similar to some I've seen around 47M. Granted, many of these 5xM exponents are still unverified, so their abandonment counts may climb noticeably higher. I'd rather people abandon DCs than squat on first-time tests and delay milestones.

Last fiddled with by kriesel on 2019-08-05 at 23:10

2019-08-07, 00:35   #5
Madpoo
Serpentine Vermin Jar

Jul 2014

3·1,087 Posts

Quote:
 Originally Posted by chalsall The abandonment rate of DCs is *extreme*. Just over five years ago new assignment rules were implemented, which introduced the concept of categories. New users are by default first given DC Category 4 ("Cat 4") assignments until they prove they're going to stick around, and that their kit is good. The abandonment rate there is so high it's nicknamed the "Churners' Zone". This is why you see tens of thousands of DC assignments in 52M and above, but only a couple of thousand below.
Hmm... it might be an interesting set of data to get the average # of abandoned assignments for various ranges of exponent size. See if it tends to stay the same or not.

Sounds like a complicated SQL query and best saved for when I'm super bored or whatever, but it could be interesting.

This is NOT an "average abandonments per exponent" table, but it is a *total* # of abandonments for some recent exponent ranges. The key caveat is that I didn't figure out how many total exponents in each range needed testing (which would mean excluding any with factors, although some may have been abandoned before a factor was found later):
Code:
Range	Count
36	49614
37	92362
38	125856
39	142227
40	122450
41	97814
42	94828
43	94211
44	91936
45	98777
I'm not sure what to make of it, and my query was a quick'n'dirty one just counting expired assignments grouped by floor(exponent/1e6) - the higher #'s in the 38M-41M range are interesting, as is the low # at 36M. Weird.

In fact, if I go back further to ranges below 36M, the counts really drop off dramatically. Note that this all depends on the data available: when the PrimeNet server was updated from v4 to v5, around 2008 I think, is when we started keeping records of abandoned assignments, so we can't look back too far.

Anyway, based on the most recent ranges, it does seem to fluctuate within a small band, but someone would have to dig further to see if there's any trend one way or another, or if it's just random.
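The grouping in that quick'n'dirty query can be sketched in a few lines of Python. This is a hedged illustration of the binning logic only, run on made-up exponents, not real PrimeNet records:

```python
from collections import Counter

def bin_abandoned(abandoned_exponents):
    """Count abandoned/expired assignments per 1M exponent range,
    mirroring a GROUP BY floor(exponent/1e6) in SQL."""
    return Counter(p // 1_000_000 for p in abandoned_exponents)

# Hypothetical sample data for illustration only.
sample = [36_012_345, 36_999_983, 38_500_101, 38_500_103, 41_777_001]
print(dict(bin_abandoned(sample)))  # {36: 2, 38: 2, 41: 1}
```

To get the "average abandonments per exponent" figure discussed above, one would divide each bin's count by the number of unfactored exponents in that range, which is the part the post says was skipped.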

2019-08-07, 13:51   #6
GP2

Sep 2003

A12₁₆ Posts

Quote:
 Originally Posted by Madpoo the higher #'s in the 38M-41M range are interesting, as is the low # at 36M. Weird. In fact, if I go back further to ranges below 36M they really drop off dramatically.
When "at least one DC per year" was introduced, that probably meant that new users all started off doing a DC, which most of them abandon. Previously they would be assigned a first-time check or a 100M-digit exponent, and abandon that instead.

When categories were introduced, that meant that new users were assigned DCs well away from the wavefront. Previously they were probably doing wavefront DCs.

So if "one DC per year" was introduced first, and then categories were introduced later (I don't remember, did it happen that way?), then that might explain a temporary high peak in wavefront DC abandonments in the 38M to 41M range.

2019-08-07, 14:17   #7
chalsall
If I May

"Chris Halsall"
Sep 2002

2·5·29·31 Posts

Quote:
 Originally Posted by GP2 So if "one DC per year" was introduced first, and then categories were introduced later (I don't remember, did it happen that way?), then that might explain a temporary high peak in wavefront DC abandonments in the 38M to 41M range.
Nope. IIRC, both were introduced at the same time. It was part of the "Opt-in" function on the assignment rules report for an account to get the lowest available assignments (with the associated tightened expiry rules).

2019-08-07, 20:47   #8
chalsall
If I May

"Chris Halsall"
Sep 2002

2×5×29×31 Posts

Quote:
 Originally Posted by Madpoo I'm not sure what to make of it, and my query was a quick'n'dirty one just counting expired assignments grouped by floor(exponent/1e6) - the higher #'s in the 38M-41M range are interesting, as is the low # at 36M. Weird.
Thinking about this in the back of my mind while I did "real work", this is possibly a function of the fact that existing assignments were "grandfathered" when the new assignment rules were implemented.

This possibly also explains why there are more DCs successfully completed in 51M than 50M.

Anyway, I suspect if you expand your QaD query up to (say) 57M (possibly with a temporal constraint) it will make more sense.

2019-08-07, 21:48   #9
ATH
Einyen

Dec 2003
Denmark

2×13×109 Posts

Quote:
 Originally Posted by hansl http://hoegge.dk/mersenne/GIMPSstats.html Comparing Total change in LL-D vs LL, instead it shows ~1.6x LL-D/LL over the past 30 days, and ~2x LL-D/LL over the past year.
If LL-D were catching up, then the LL column would show negative progress. These numbers are the total number of exponents with 1 and 2 completed LL tests, respectively.

Right now LL-D: +4042 (30 days) and LL: +2568 (30 days)

which means that 4042 + 2568 = 6610 first-time LL tests were done in the last 30 days, and 4042 first-time LL tests were successfully double-checked.
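The bookkeeping in those deltas can be sketched as follows. Assumption stated plainly: LL-D counts exponents with two completed LL tests and LL counts those with exactly one, so every new double-check moves an exponent out of the LL pool into LL-D.

```python
# 30-day deltas quoted above:
ll_d_delta = 4042   # exponents newly promoted to double-checked
ll_delta = 2568     # net growth of the once-tested pool

# Each double-check removes one exponent from the LL pool, so the
# count of first-time tests completed is the sum of both deltas.
first_time_tests = ll_d_delta + ll_delta
double_checks = ll_d_delta
print(first_time_tests, double_checks)  # 6610 4042
```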

Last fiddled with by ATH on 2019-08-07 at 21:57

2019-08-15, 21:31   #10
hansl

Apr 2019

5×41 Posts

Quote:
 Originally Posted by ATH If LL-D was catching up then the LL column would have a negative progress. These numbers are the total number of exponents with 1 and 2 completed LL tests. Right now LL-D: +4042 (30 days) and LL: +2568 (30 days) which means that 4042+2568 = 6610 first time LL tests was done in the last 30 days and then 4042 first time LL tests was successfully double checked.
Ah, right! Yeah I had a feeling I was looking at it wrong.

So it would be more like (using today's numbers) 4021/(4021+2679) = 60% LL-D vs LL over the past month,
and for the past year: 60787/(60787+27682) = 68.7%.
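Recomputing those shares from the quoted deltas, as a small sketch (the four input numbers are the ones given just above):

```python
# Deltas quoted above: double-checks completed vs net first-test growth.
dc_30d, ll_30d = 4021, 2679
dc_1y, ll_1y = 60787, 27682

# Share of completed first-time tests that have since been double-checked.
share_30d = dc_30d / (dc_30d + ll_30d)
share_1y = dc_1y / (dc_1y + ll_1y)
print(f"{share_30d:.1%} over the past month")  # 60.0%
print(f"{share_1y:.1%} over the past year")    # 68.7%
```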

That's even more disparity when comparing the # of assignments from the graphs:
https://www.mersenne.org/primenet/graphs.php

A graph of DC and LL *completed* results per day, instead of just assignments, would be interesting to see.

 2020-01-14, 22:27 #11 Uncwilly 6809 > 6502     """"""""""""""""""" Aug 2003 17540₈ Posts This is for James or Aaron. Until that HUGE spike of TFLOPS from Nov '19 passes off the graph, could the TFLOPS graph be set to a log scale? It would make any trends easier to see.

