mersenneforum.org  


Old 2003-12-30, 21:29   #1
lpmurray
 
Sep 2002

89 Posts
Default GIMPS BREAKS THE 60000 MACHINE MARK

The virtual machine's sustained throughput* is currently 10498 billion floating point operations per second (gigaflops), or 872.1 CPU years (Pentium 90Mhz) computing time per day. For the testing of Mersenne numbers, this is equivalent to 374 Cray T916 supercomputers, or 187 of Cray's most powerful T932 supercomputers, at peak power. As such, PrimeNet ranks among the most powerful computers in the world. (*Measured in calibrated P5 90Mhz, 32.98 MFLOP units: 25658999 FPO / 0.778s using 256k FFT.)
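The P90 accounting works out as stated: dividing the aggregate throughput by the calibrated 32.98 MFLOP/s rating of a single Pentium 90 gives the number of P90-equivalents, each of which contributes one CPU day per day. A quick check in Python (a sketch; the 365-day year is an assumption, but it reproduces the quoted figure):

```python
P90_MFLOPS = 32.98      # calibrated P5/90 MHz rating quoted above, MFLOP/s
DAYS_PER_YEAR = 365     # assumed; reproduces PrimeNet's quoted figure

def p90_cpu_years_per_day(gflops):
    """Convert aggregate throughput (GFLOP/s) into P90 CPU years per day."""
    p90_equivalents = gflops * 1000.0 / P90_MFLOPS  # P90s running flat out
    return p90_equivalents / DAYS_PER_YEAR          # CPU years accumulated per day

print(round(p90_cpu_years_per_day(10498), 1))   # 872.1, matching the quote
```

The same conversion reproduces the per-row figures in the table below (for example, 10229.525 GFLOP/s of LL testing comes out to about 849.8 CPU yr/day).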

For more information, please see the GIMPS home page, the PrimeNet Statistics or the PrimeNet Project Credits.

Current PrimeNet Atomic Clock UTC Time is Tuesday 30 December 2003, 21:28:26


------- Aggregate CPU Statistics, P90 Units* -------

              Last 7 Days Average     Cumulative Today
              from 24 Dec 2003 06h    from 30 Dec 2003 06h
              ----------------------  ----------------------------------
Test Type     CPU yr/day     GFLOP/s   CPU years  CPU yr/day     GFLOP/s
------------  ----------  ----------  ----------  ----------  ----------
Lucas-Lehmer     849.791   10229.525     582.390     913.420   10995.480
Factoring         24.435     294.135      17.449      27.366     329.428
              ----------  ----------  ----------  ----------  ----------
TOTALS           874.225   10523.661     599.839     940.787   11324.907



------- Internet CPU and Server Resources -------

Machines Applied on 41944 Accounts (Server Synchronization 23 Sep 2003 14:22)

Intel Pentium 4    :  18811
AMD Athlon         :  24541
Intel Pentium III  :   8410
Intel Pentium II   :   1753
Intel Celeron      :   4610
Intel Pentium Pro  :    135
Intel Pentium      :    604
AMD K6             :    564
Intel 486          :     35
Cyrix              :    339
Unspecified type   :    226
-----------------     ------
TOTAL              :  60028

Last fiddled with by lpmurray on 2003-12-30 at 21:33
Old 2003-12-30, 22:06   #2
PrimeCruncher
 
Sep 2003
Borg HQ, Delta Quadrant

2·3^3·13 Posts
Default

Impressive. I wonder if we'll be able to maintain that though...
Old 2003-12-30, 22:33   #3
GP2
 
Sep 2003

5025_8 Posts
Default

The usual graphs:

Total machines & Total accounts

Total accounts since October (a 50% increase in less than one month... let's hope we retain most of them).

Total machines, broken down by CPU type

Overall number-crunching speed of the project (Lucas-Lehmer testing)
Old 2003-12-30, 22:53   #4
edorajh
 
Oct 2003
Croatia

710_8 Posts
Thumbs up

Really impressive!
Old 2003-12-31, 14:02   #5
Jorgen
 
Jan 2003

11 Posts
Default

Are there any stats on how the number of "expected new primes" is decreasing as a function of time?

When I check out the status page, http://www.mersenne.org/status.htm, from time to time, it looks like we're "finding" 0.01 primes every week, meaning the expected-new-primes number drops by approximately 0.01 every week.

But does anyone have any statistics on that? Of course, to keep finding 0.01 primes every week we'll have to keep getting ever more computer power, since the primes thin out and the Mersenne numbers get harder to check as they get larger.
Old 2003-12-31, 15:00   #6
smh
 
"Sander"
Oct 2002
52.345322,5.52471

4A5_16 Posts
Default

Quote:
Of course, to keep up with finding 0.01 primes every week we'll have to keep getting ever more computer power, since the primes thin out and the Mersenne numbers get harder to check as they get larger.
And most small factors have already been found. LMH is making good progress toward 2^60.
Old 2003-12-31, 17:28   #7
GP2
 
Sep 2003

29·89 Posts
Default

Quote:
Originally posted by Jorgen
Are there any stats on how the number of "Expected new primes" is decreasing as a function of time?
Well, the number of Mersenne primes is a linear function of log P, where P is the exponent. This is based on heuristic considerations and the known data points so far:

http://www.utm.edu/research/primes/n...tMersenne.html
http://opteron.mersenneforum.org/png/log2_log2_Mn.png

And empirically, for the GIMPS project, the increase in the leading edge of LL testing for exponent P has been a linear function of time, so far.

http://opteron.mersenneforum.org/png/leading_edge.png


So although we are progressing through the exponents at a constant speed, the density of expected new primes keeps decreasing. That ought to mean new prime discoveries get fewer and farther between. However, past experience suggests that's not the case: after all, Mersenne primes have been discovered at a fairly steady rate over the past 50 years, ever since computers were first used to search for them.
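The density argument can be made concrete with the Lenstra-Pomerance-Wagstaff heuristic described on the first page linked above. The sketch below is illustrative (the function name and test range are mine): it sums the conjectured primality probability over prime exponents, using Pr[2^p - 1 is prime] ~ e^gamma * ln(a*p) / (p * ln 2) with a = 6 for p ≡ 1 (mod 4) and a = 2 otherwise.

```python
import math

def expected_mersenne_primes(lo, hi):
    """Heuristic expected count of Mersenne primes with exponent in [lo, hi),
    per the Lenstra-Pomerance-Wagstaff conjecture."""
    # Sieve of Eratosthenes: candidate exponents must themselves be prime.
    sieve = bytearray([1]) * hi
    sieve[0] = sieve[1] = 0
    for i in range(2, int(hi ** 0.5) + 1):
        if sieve[i]:
            sieve[i * i::i] = bytearray(len(range(i * i, hi, i)))
    e_gamma = math.exp(0.5772156649015329)  # e^gamma, about 1.781
    return sum(e_gamma * math.log((6 if p % 4 == 1 else 2) * p) / (p * math.log(2))
               for p in range(lo, hi) if sieve[p])

# Each doubling of the exponent range is expected to contain roughly
# e^gamma ~ 1.78 new Mersenne primes, no matter where the doubling starts:
print(round(expected_mersenne_primes(1000000, 2000000), 2))
```

This is exactly why constant progress through the exponents means a roughly constant number of expected primes per doubling, but a shrinking number per exponent tested.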


In any case, it's purely an empirical observation that the leading edge of GIMPS LL testing is progressing at constant speed. Will this hold over the long term?

On the one hand we have Moore's law which says that computers get exponentially faster. And a logscale plot of LL testing CPU yrs/day shows that GIMPS's crunching rate is indeed increasing more or less exponentially (http://opteron.mersenneforum.org/png...d_logscale.png). Of course the LL testing rate reflects not only Moore's law but also increases in the number of participants and algorithmic improvements as well.

We also know that the number of operations needed to do a Lucas-Lehmer test for exponent N is on the order of O(N^2 log N), according to aaronl, because each FFT squaring is O(N log N) and the test must run for N iterations.
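For concreteness, the Lucas-Lehmer recurrence itself is tiny; what costs O(N^2 log N) is doing the N - 2 squarings with FFT multiplication at GIMPS sizes. A toy version using Python's built-in bignums (fine for small exponents, hopeless for 10-million-digit candidates):

```python
def lucas_lehmer(p):
    """True if M_p = 2^p - 1 is prime, for an odd prime exponent p.
    Toy version: GIMPS replaces the schoolbook squaring with FFTs."""
    m = (1 << p) - 1
    s = 4
    for _ in range(p - 2):        # p - 2 squarings: the factor of N
        s = (s * s - 2) % m       # each squaring is the O(N log N) part
    return s == 0

# Mersenne prime exponents in this range: 3, 5, 7, 13, 17, 19, 31
print([p for p in (3, 5, 7, 11, 13, 17, 19, 23, 31) if lucas_lehmer(p)])
```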

However, the need to do most Lucas-Lehmer tests is removed by finding small factors. Since factors of 2^P-1 must be of the form 2kP+1, finding factors gets harder for larger exponents. Will trial-factoring eventually lose its effectiveness, resulting in more LL testing needing to be done as we get to higher and higher exponents?
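A rough sketch of what that trial factoring looks like (the function name and k bound are illustrative; real GIMPS factoring is far more refined, but it rests on the same two facts: candidate factors have the form q = 2kP + 1, and since 2 must be a quadratic residue mod q, also q ≡ ±1 mod 8):

```python
def trial_factor(p, max_k=1000000):
    """Search for a small factor of 2^p - 1, p prime (illustrative sketch)."""
    m = (1 << p) - 1
    for k in range(1, max_k + 1):
        q = 2 * k * p + 1
        if q * q > m:                  # only need factors up to sqrt(2^p - 1)
            break
        if q % 8 in (1, 7) and pow(2, p, q) == 1:
            return q                   # q divides 2^p - 1: no LL test needed
    return None                        # nothing small found; LL test required

print(trial_factor(11), trial_factor(23))   # 23 divides M11, 47 divides M23
```

Note the candidate factors are spaced 2P apart, so for a fixed trial-factoring depth there are fewer candidates to try at larger P, but each surviving exponent then costs a full LL test.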


So it's hard to predict, over the really long term, how things will go. Will Moore's exponential law trump the mere quadratic increase in LL testing operations needed for each exponent? The problem is, Moore's law itself is merely an empirical observation -- eventually the laws of physics impose a limit on how small we can make electronic circuits, unless we go to different technologies entirely.
Old 2003-12-31, 17:40   #8
GP2
 
Sep 2003

29·89 Posts
Default

Here's a graph of the number of digits in the largest known prime by year:

http://www.utm.edu/research/primes/n...ar.html#graph1

More or less linear over the last 50 years.
Old 2004-01-01, 02:20   #9
aaronl
 
Aug 2003

2^4×3 Posts
Default

Quote:
Originally posted by GP2
The problem is, Moore's law itself is merely an empirical observation -- eventually the laws of physics impose a limit on how small we can make electronic circuits, unless we go to different technologies entirely.
Those technologies would have to transcend matter and space eventually. From Applied Cryptography by Bruce Schneier, Second Edition, p. 157:
Quote:
One of the consequences of the second law of thermodynamics is that a certain amount of energy is necessary to represent information. To record a single bit by changing the state of a system requires an amount of energy no less than k*T, where T is the absolute temperature of the system and k is the Boltzmann constant....

Given that k = 1.38*10^-16 erg/deg. Kelvin, and that the ambient temperature of the universe is 3.2 deg. Kelvin, an ideal computer running at 3.2 deg. Kelvin would consume 4.4*10^-16 ergs every time it set or cleared a bit. To run a computer any colder than the cosmic background radiation would require extra energy to run a heat pump.
He goes on to show that with the sun's annual energy output of 1.21*10^41 ergs you would only have the power for 2.7*10^56 bit changes... which is enough to test a very, very large prime.
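The quoted figures are easy to reproduce; a few lines of Python, using the constants exactly as Schneier gives them:

```python
k = 1.38e-16            # Boltzmann constant, erg per degree Kelvin (as quoted)
T = 3.2                 # ambient temperature of the universe, degrees Kelvin

energy_per_bit = k * T                  # minimum energy per bit flip, ergs
sun_annual_output = 1.21e41             # the sun's yearly output, ergs

bit_changes = sun_annual_output / energy_per_bit
# about 4.4e-16 erg per bit and 2.7e56 bit changes, matching the quote
print(f"{energy_per_bit:.2e} erg/bit, {bit_changes:.2e} bit changes")
```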

I remember someone saying that generalized Fermat primes will soon get more attention, because the candidates are distributed in a way that lets you test many of them without continuously moving to larger candidates (which require more computing time). I think that if GIMPS exhausts the first one or two FFT sizes above 10 million digits, people will start trying to claim the prize by looking for large generalized Fermat primes (I'm not sure where this becomes practical, but I've heard that asymptotically they're just as easy to find).
Old 2004-01-01, 04:36   #10
Jorgen
 
Jan 2003

B_16 Posts
Default

I'm aware of most of this, Moore's law etc., but my question was really whether we are actually keeping up with the ever harder task of finding new primes in the same amount of time, by having faster computers AND more computers in GIMPS. And I'm wondering if we have any GIMPS statistics definitely showing this.

I joined GIMPS myself in early 1997, but I don't remember how many "expected new primes" we covered each week back then. My guess would be that we did better than 0.01 primes/week, even though we had much slower computers (to give a picture: I had 5 Pentium Pro 200s doing GIMPS, which was a lot then and put me in the top 100 producers in less than a year or so), because the numbers were so small (exponents in the 2-3 million range) that I think GIMPS had a better overall probability of finding a new prime on any given day back then than we have now.

But maybe we haven't. I guess what I'm asking is whether anyone has stats on George's weekly snapshot of "expected new primes" over the last few years, and whether, reading that stat, we should expect to find fewer primes in the years to come (less than 1 every 2 years) because we can't keep up with the growing numbers, even with the extraordinary recruiting we've had and Moore's law helping us with ever faster computers.

Or to put it in concrete terms: are we more or less likely to find a new prime on January 1st, 2004 than we were on January 1st, 1998?

Last fiddled with by Jorgen on 2004-01-01 at 04:42
Old 2004-01-01, 04:55   #11
Jorgen
 
Jan 2003

B_16 Posts
Default

Just to make it clear: I'm not asking whether I, as a GIMPS member, am more likely to find a new prime now than back then. I'm clearly not. (I did first-time checks in about 2-3 days then; I can't match that now, and the probability of a test turning out prime was much higher then, since the numbers were smaller.) But we have more GIMPS members now, and maybe that compensates.
