 mersenneforum.org 100M digits prefactor project.
2005-03-13, 13:57 #67

ValerieVonck
Mar 2004
Belgium

292 Posts Range: 333600000 333699999 from 1 => 50 bit depth complete.

2005-03-13, 21:23 #68

VBCurtis
"Curtis"
Feb 2005
Riverside, CA

23·54 Posts Reserving 332193431 and 332193457 for "deep" searching (likely 67 bits). -Curtis

2005-03-14, 07:25 #69

ValerieVonck
Mar 2004
Belgium

292 Posts Code:
333400000 333499999
333500000 333599999
I will take these ranges from 1 to 55 bit depth.

2005-03-14, 07:42 #70

ValerieVonck
Mar 2004
Belgium

292 Posts Correction: I will take these ranges from 1 to 50 bit depth.

2005-03-14, 07:45 #71

Uncwilly
6809 > 6502
Aug 2003

101×103 Posts Hold on, I have the results of those to that depth. I am working on a site update at this very moment. Check back tomorrow for a full update.

2005-03-15, 01:41 #72

Peter Nelson
Oct 2004

232 Posts Uncwilly, I think you have at least mostly understood my request. Thanks for trying to help me.

Cheesehead: thanks for repeating the general approximation formula from elsewhere. However, this is precisely what I'm NOT satisfied with. As we only test primes as potential factors of Mersenne candidates, we ignore all non-prime numbers, i.e. we ignore even ones obviously, and also those we have already established are composite. This means that in a given range of numbers we do not even begin trial factoring a number, because it's not going to be what we are looking for.

So, in part, the probability of finding a factor when searching all exponents in a range A to B depends on the distribution of Mersennes within that range. The distribution will determine how many there are to test, and this in itself skews the probability. Then my major problem was that we might say that in a range I found 20 factors, but we have not counted up those numbers where no factor was found. Also, expending more effort to TF to a deeper bit level makes it more likely to find a factor.
As I understand it, the current Prime95 client chooses what bit depth will be worthwhile based on the relative effort of its TF vs LL. Now, what if a modified algorithm or project had a better or worse TF/LL ratio? The optimal cutoff would then be somewhere else (more or fewer bits of factoring effort).

I've read discussion elsewhere about actual calculations using the formula cheesehead put forward, using combinatorial probability rules for each bit depth; i.e. the probability of finding a factor from 2^63 to 2^68 has to take into account that someone already did the testing from 2^3 up to 2^62 and found nothing. As a result I was trying to build a computer model or algorithm which would do this properly and very accurately.

The TF/LL ratio varies between client architectures (e.g. SSE2) and even at different bit depths: e.g. a P4 above 2^64 uses SSE2 so it is fast, but below that it is slow. Therefore I wanted to benchmark factoring on a given machine, then use those speeds combined with the PROPER probability results. For example, Prime95 can do factoring, but so can Factor3_2. I have benchmarked these on various size numbers, e.g. the 10 million digit, 100 million digit, and billion digit ranges (up to a given bit depth anyway).

I am hoping to obtain some of the data on the probability of discovering a factor in a test from empirical research, e.g. in your project. As I pointed out, I need to know a test was tried even if it was unsuccessful in discovering a factor. I hope the above gives some insight into my motivation. I will look on the website for any useful info you post there. Thank you.   2005-03-15, 06:38   #73

"Richard B. Woods"
Aug 2002
Wisconsin USA

22×3×641 Posts Quote:
 Originally Posted by Peter Nelson Cheesehead: thanks for repeating the general approximation formula from elsewhere. However this is precisely what I'm NOT satisfied with.
Please let us know if you find a better approximation. I've been surprised not to find any.

Quote:
 As we only test primes as potential factors of mersenne candidates, we ignore all non-prime numbers. ie ignore Even ones obviously and also those we already established are composite.
Yes, this optimization is built into the Prime95 TF code, but it doesn't affect the 1/n approximation.
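For readers following along, the optimization under discussion can be sketched in Python. This is an illustrative sketch, not the Prime95 code: any factor q of the Mersenne number 2^p - 1 (p an odd prime) must have the form q = 2kp + 1 and satisfy q ≡ ±1 (mod 8). Prime95 additionally sieves out composite candidates before testing; the sketch omits that step, which costs time but not correctness, because a composite q's prime factors are smaller and would be found at a smaller k first.

```python
def smallest_mersenne_factor(p, max_bits):
    """Trial factor the Mersenne number 2^p - 1 (p an odd prime).

    Candidates are restricted to q = 2*k*p + 1 with q % 8 in (1, 7);
    a candidate divides 2^p - 1 exactly when 2^p mod q == 1.
    Composite candidates are tested rather than sieved out (see above).
    """
    k = 1
    while True:
        q = 2 * k * p + 1
        if q.bit_length() > max_bits:
            return None  # no factor up to 2^max_bits
        if q % 8 in (1, 7) and pow(2, p, q) == 1:
            return q
        k += 1

# 2^11 - 1 = 2047 = 23 * 89, and 23 = 2*1*11 + 1
print(smallest_mersenne_factor(11, 20))  # -> 23
```

A real trial-factoring run would of course stop well below 2^max_bits for small p; the bound here just mirrors the "bit depth" language used in the thread.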

Quote:
 This means that in a given range of numbers we do not even begin trial factoring a number because it's not going to be what we look for.
Hmmm? Can you restate that?

Quote:
 So in part, the probability of finding a factor when searching all exponents in a range A to B depends on the distribution of mersennes within that range. The distribution will determine how many there are to test.
 Can you restate that? When you write "mersennes", do you mean Mersenne primes or Mersenne numbers (not necessarily prime)? When you write "searching all exponents in a range A to B", do you mean searching Mersenne numbers (2^exponent-1) for factors between A and B, or for factors between 2^A and 2^B, or searching for factors of all Mersenne numbers whose exponents are between A and B, or what? It would help if you'd present specific numeric examples of what you mean.

Quote:
 Then my major problem was that we might say in a range I found 20 factors, but have not counted up those numbers where no factor was found.
Do you mean that someone tested all numbers in a range, found 20 factors, but does not know the total count of numbers tested?

Quote:
 As I understand it the current prime95 client makes a choice of what bit depth it will be worthwhile based on relative effort of its TF vs LL.
Actually, those bit depth limits were separately computed, then hard-coded into Prime95. Once, a few years ago, George recomputed the limits after changing to a more efficient L-L algorithm, then substituted the new values for the hard-coded limits in Prime95. It turns out that the optimum limits are not much affected by minor changes in algorithm speed, so there's been no revision lately.
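The break-even rule being described can be sketched as follows. This is illustrative logic with made-up cost numbers, not Prime95's actual hard-coded table: keep extending trial factoring one more bit while the expected LL time saved (a chance of roughly 1/n of finding a factor at bit level n, times roughly two LL tests avoided, counting the double-check) exceeds the cost of searching that extra bit.

```python
def optimal_tf_depth(ll_cost, tf_bit_cost, start_bits=50, max_bits=80):
    # Illustrative break-even rule (not Prime95's actual table):
    # extend TF from bit level n to n+1 while the expected saving --
    # a ~1/n chance of a factor, times roughly two LL tests avoided
    # (first test plus double-check) -- exceeds the cost of that bit.
    bits = start_bits
    while bits < max_bits:
        expected_saving = (1.0 / bits) * 2 * ll_cost
        if tf_bit_cost(bits) >= expected_saving:
            break
        bits += 1
    return bits

# Hypothetical costs: each extra bit of TF doubles in cost.
depth = optimal_tf_depth(ll_cost=1000.0,
                         tf_bit_cost=lambda b: 0.01 * 2 ** (b - 50))
print(depth)  # -> 62 with these made-up costs
```

Because the TF cost doubles per bit while the expected saving shrinks only slowly (~1/n), the crossover is sharp, which is consistent with the observation above that the optimum limits are not much affected by minor changes in algorithm speed.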

Quote:
 Now, what if a modified algorithm or project had a better or worse TF/LL ratio. The optimal cutoff would therefore be somewhere else (more or less bits of factoring effort).
And if the result were sufficiently different, the hard-coded values might be changed again, or even an algorithm installed to calculate variable limits for sufficiently differing circumstances.

Quote:
 ie the probability of finding a factor from 2^63 - 2^68 has to take into account that someone already did testing from 2^3 up to 2^62 finding nothing.
Yes, but that's already taken into account in the 1/n approximation. The 1/n formula applies to the sizes of smallest known factors, AFAIK.
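As a concrete sketch of how the 1/n approximation already encodes this conditioning: under that heuristic, the probability that the smallest factor lies below 2^n, given that trial factoring to 2^m (m < n) found nothing, behaves like m/n, so the chance of finding a factor when extending from m bits to n bits is 1 - m/n.

```python
def p_factor_in_range(done_bits, target_bits):
    # Under the 1/n heuristic, the chance that the smallest factor of a
    # Mersenne number lies between 2^done_bits and 2^target_bits, GIVEN
    # that trial factoring to 2^done_bits found nothing, is
    # 1 - done_bits / target_bits.
    return 1.0 - done_bits / target_bits

# Extending TF from 2^63 to 2^68, having already found nothing below 2^63:
p = p_factor_in_range(63, 68)   # 5/68, about 0.074
```

Note that for a single bit level this reduces to the familiar form: 1 - n/(n+1) = 1/(n+1), roughly 1/n.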

Quote:
 I am hoping to obtain some of the data on probability of discovering a factor in a test from empirical research eg in your project.
That's what the 1/n approximation is: an empirical result derived from the results of GIMPS trial factoring. What do you want that's different?   2005-03-15, 06:53   #74

"Richard B. Woods"
Aug 2002
Wisconsin USA

170148 Posts Quote:
 Originally Posted by cheesehead AFAIK we have only the empirical statement on http://mersenne.org/math.htm (last paragraph in "Trial Factoring" section) that the chance of finding a factor between 2^x and 2^(x+1) is about 1/x. But note that (a) this was derived from TF on rather smaller exponents and ranges, and (b) it's an approximation that breaks down when the size of the Mersenne's exponent is too close to the size of the factors being tried (i.e., k in 2kp+1 is small).
I should have added two more notes: (c) the approximation applies to factors of Mersenne numbers, not necessarily factors of integers in general, and (d) the approximation was derived from, and therefore applies to, the lowest factor of a given Mersenne number. The approximation is not dependent on use of the trial factoring method per se, but it is dependent on there being no factor below 2^x (and the trial factoring method provides an exhaustive search for such, unlike P-1 or ECM).   2005-03-15, 17:47   #75
wblipp

"William"
May 2003
New Haven

26×37 Posts Quote:
 Originally Posted by cheesehead That's what the 1/n approximation is: an empirical result derived from the results of GIMPS trial factoring.
 If you aren't too near the square root of a number, the probability that a "random number" with no factor below c^a also has no factor between c^a and c^b is a/b. Two derivations of this heuristic are shown in the ElevenSmooth Math FAQ. (That example uses c=10, but it works for any c > 1.)
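This heuristic is easy to spot-check numerically at small sizes, where trial division is cheap. The sketch below (illustrative only, with arbitrary sample size and seed) takes random integers, keeps those with no factor up to 10^1, and asks what fraction also have no factor up to 10^2; per the a/b rule the answer should be near 1/2 (slightly above, since the rule is asymptotic).

```python
import random

def has_factor_below(n, bound):
    # Naive trial division: does n have a divisor d with 2 <= d <= bound?
    d = 2
    while d <= bound:
        if n % d == 0:
            return True
        d += 1 if d == 2 else 2  # after 2, test only odd d
    return False

random.seed(1)
survivors = still_clean = 0
for _ in range(20000):
    n = random.randrange(10**7, 10**8)
    if not has_factor_below(n, 10):          # no factor up to 10^1
        survivors += 1
        if not has_factor_below(n, 100):     # also none up to 10^2
            still_clean += 1

ratio = still_clean / survivors
print(ratio)  # near a/b = 1/2
```

The same experiment with other (c, a, b) choices gives the corresponding a/b, which is what connects this heuristic to the 1/n rule discussed above.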

Thus the 1/n is not merely empirical observation - it is the expected result for random factors, and for sufficiently large factors Mersenne numbers are empirically observed to behave randomly.

2005-03-17, 22:25 #76

ValerieVonck
Mar 2004
Belgium

34916 Posts Results: Code:
333400000 333499999 5021 5021 . . in progress  Done from 1 to 50 bit depth
333500000 333599999 5073 5073 . . in progress  Done from 1 to 50 bit depth
333600000 333699999 5127 5127 . .              Done from 50 to 52 bit depth

2005-03-18, 05:27   #77

"Richard B. Woods"
Aug 2002
Wisconsin USA

22×3×641 Posts Quote:
 Originally Posted by wblipp If you aren't too near the square root of a number, the probability that a "random number" with no factor below c^a also has no factor between c^a and c^b is a/b. Two derivations of this heuristic are shown in the ElevenSmooth Math FAQ. (That example uses c=10, but it works for any c > 1.) Thus the 1/n is not merely empirical observation - it is the expected result for random factors, and for sufficiently large factors Mersenne numbers are empirically observed to behave randomly.
Great! THANKS!!

(Memo to self: read more math FAQs.)
