100M digits prefactor project.
With GIMPS/PrimeNet officially working on the 10M digit issue and Luigi's Operation Billion Digit working toward 1,000M, there seemed to be a gap. Nobody has organized the work for the 100M digit prime.
So here is what I suggest. We start working on the 100M digit range. Using Luigi's Factor3_1 and Factor3_1b we can help clear the way for the P-1 and LL work that will follow. I have done some preliminary work in factoring the range starting with M332192831 (the first prime expo. that will yield a 100M digit number). Starting here and working up to higher expos seems wise. If we can get a good projection of where the next Mersenne prime might be, based upon the 1.47p rule, we can apply effort there, too. I would like to suggest doing batch work: we can work in ranges, report results back, etc. More thoughts later. Anyone who wants to help, speak up. 
M332192831 no factor from 2^54 to 2^62.
M332192897 no factor from 2^52 to 2^54.
M332192953 no factor from 2^52 to 2^54.
M332192957 no factor from 2^52 to 2^54.
M332192969 no factor from 2^52 to 2^54.
M332193019 no factor from 2^52 to 2^54.
M332193047 no factor from 2^52 to 2^54.
M332193053 no factor from 2^52 to 2^54.
There are a lot more, but if anyone wants to run some of these up, feel free; just post what you are taking. I see the 100M digits prefactor project (100Mdpp) as having 2 aspects, the deep and the wide. Deep is taking individual expos to high bit depths (how high will be determined later); nose-to-the-grindstone types work here. Wide is taking a large range (e.g. 33200000 - 33300000) to a moderate bit depth, removing a huge number of expos; short-attention-span folks that need to see lots happening work here. 
[QUOTE=Uncwilly]
More thoughts later. Anyone who wants to help speak up.[/QUOTE] Don't forget to send all the factors found to Will Edgington! :smile: Uncwilly, I don't remember if I ever posted an updated version of Factor3_1b on the forum... I will set it for download on the Billion Project page if you don't mind, so that it will reflect every further modification I will put on the source. Luigi 
Cool, the link on your Billion digit page for Factor3_1 pointed to the attachment in the forum.....
As to Will E., I plan to send him a large batch that I have.... 
I started doing 332200007 - 332209991 to 50 bits. I would have done a larger range, but it is a pain putting ,1,50 on every line of the input file. I have a program that can give me the primes, which I can cut and paste into a text file. Is there any way to automate the starting and ending bit depth? The run should be done soon; almost half have factors.
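For what it's worth, a few lines of script can bolt the bit limits on automatically. This is only a sketch: the helper name, output file name, and the `exponent,startbit,endbit` line format are assumptions based on the worktodo.ini lines described in this thread.

```python
# Sketch: turn a bare list of prime exponents into worktodo-style lines of
# the form "exponent,startbit,endbit", so ",1,50" need not be typed by hand.
# make_worktodo is an illustrative name, not a tool from this thread.

def make_worktodo(exponents, start_bits=1, end_bits=50):
    return [f"{p},{start_bits},{end_bits}" for p in exponents]

# Example exponents taken from the range mentioned above:
lines = make_worktodo([332200007, 332209991])
with open("worktodo.ini", "w") as f:
    f.write("\n".join(lines) + "\n")
```

Changing the bit limits for a later pass is then a one-argument change, e.g. `make_worktodo(exponents, 54, 62)`.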

[QUOTE=lpmurray]I started doing 332200007 - 332209991 to 50 bits. I would have done a larger range, but it is a pain putting ,1,50 on every line of the input file. I have a program that can give me the primes, which I can cut and paste into a text file. Is there any way to automate the starting and ending bit depth? The run should be done soon; almost half have factors.[/QUOTE]
Try [URL=http://www.mersenneforum.org/attachment.php?attachmentid=145]this[/URL]. I wrote it some weeks ago for this purpose; I don't know if it is the latest version. Should you have problems, just drop me a PM and I'll set it up. Luigi 
I have a program (in old-fashioned BASIC) that will strip out the expos with factors (to a file that we can send to Will E.) and produce a new worktodo.ini with a set bit range.
I want to clear the expo that P95 is working on on this machine before I steal too much more time from it to prefactor. 
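In case anyone wants the same filter without digging up a BASIC interpreter, here is a rough Python equivalent of what is described above. The results-line formats are inferred from the reports posted in this thread; the function name is made up for illustration.

```python
# Split factor3-style results into (a) factor lines to forward to Will E.
# and (b) fresh worktodo lines for the surviving exponents, queued for the
# next bit range. Line formats are inferred from posts in this thread:
#   "M332193047 has a factor: 2926561826311184081"
#   "M332193019 no factor from 2^55 to 2^62."
import re

def split_results(result_lines, next_start, next_end):
    factors, worktodo = [], []
    for line in result_lines:
        if re.match(r"M\d+ has a factor: \d+", line):
            factors.append(line)                  # drop from further TF
        else:
            m = re.match(r"M(\d+) no factor", line)
            if m:                                 # survivor: queue next pass
                worktodo.append(f"{m.group(1)},{next_start},{next_end}")
    return factors, worktodo
```

Feeding it a results file and writing the second list out as worktodo.ini gives the next pass's input in one step.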
Is factor3_2b available?

[QUOTE=Uncwilly]Is factor3_2b available?[/QUOTE]
Sure! Check [URL=http://www.gimps.it/billion/billion.htm]this[/URL] link! There are both factor3_2 and factor3_2b on the same Windows Zip archive. Luigi 
I noticed that after I posted. I was asleep on my feet.

332209991 - 332249999 out to 54 or 55 in three or four passes. 2014 expos

I've done 332200000 - 332300000 to 50 bits and need that program to strip out the factors, or I could just send someone the results file; it's 222k. Also, I'm working on 1 number in that range to a large bit depth. I'm running it on a 2.8 GHz P4 and it took 100 hrs to do just 67 to 68 bits. Is that normal for factor3_2? At that rate, 73 to 74 would take a year... since the time doubles with every bit of depth.

[QUOTE=lpmurray]I've done 332200000 - 332300000 to 50 bits and need that program to strip out the factors, or I could just send someone the results file; it's 222k. Also, I'm working on 1 number in that range to a large bit depth. I'm running it on a 2.8 GHz P4 and it took 100 hrs to do just 67 to 68 bits. Is that normal for factor3_2? At that rate, 73 to 74 would take a year... since the time doubles with every bit of depth.[/QUOTE]
That does seem a bit high. But then again, factoring to a given bit depth takes longer for [B]smaller[/B] exponents (because there are more values of k to test when p is smaller). 
[QUOTE=lpmurray]i've done 332200000  332300000 to 50bits and need that program to strip out the factors or i could just send someone the results file its 222k.[/QUOTE]
I'll PM you an email address; zip it and send it to me. What bit interval do you want to do next on it? 50 to ??. When I strip it I will save the factors to send to Will E. and return you a worktodo.ini for factor3_2b. At this expo level, and with Factor3_2, it is not worth factoring them very far (70). 
[QUOTE=Uncwilly]I'll PM you an email address; zip it and send it to me. What bit interval do you want to do next on it? 50 to ??. When I strip it I will save the factors to send to Will E. and return you a worktodo.ini for factor3_2b. At this expo level, and with Factor3_2, it is not worth factoring them very far (70).[/QUOTE]
Working so fast you'll soon complete the gap from 100MP to 1BP! :smile: As jinydu said, the speed of factor3_2 depends on both the bit depth and the exponent size. I'm working on 3,321,928,171 from 71 to 72 bits, and this will take less than a week on an Athlon XP 2100+. Luigi 
[QUOTE=ET_]
I'm working on 3,321,928,171 from 71 to 72 bits and this will take less than a week on an Athlon XP 2100+ Luigi[/QUOTE] Are you running Windows or Linux? I'm running Windows XP with a 2.8 GHz P4. I am running a number in the 300 million range, which is 10 times smaller, and it took 100 hrs just to do 67 to 68 bits. Doubling the time for each bit depth, it should take me 67 days to do 71 to 72 bits. I can't figure out why you are running so much faster. 
Factors of Mersenne numbers must be of the form 2[I]k[/I][B]p[/B]+1.
[B]p[/B] is the expo. As it increases, the max that [I]k[/I] can be (and still keep the whole factor below a certain bit level) shrinks. You reduce the max that [I]k[/I] can be by the same factor that [B]p[/B] goes up: p' = p*10 => k' = k/10 
That is, if the exponent is ten times smaller, you have ten times as many values of k to test to reach a given bit depth. This is why it has been possible to factor MM127 to over 170 bits.
However, it was mentioned in another thread ([url]http://www.mersenneforum.org/showthread.php?t=2335&page=2[/url], biwema's posts) that the amount of CPU power needed to test a particular value of k for M(p) is proportional to the amount of CPU power needed to LL test M(log2(p)). Thus, larger exponents do take longer to test for each value of k. However, this cost grows very slowly: each value of k in a billion digit exponent takes only 12.5% longer than each value of k in a 100 million digit exponent. Still, it puts a limit on the largest exponents that can be factored: M(M(p)), where M(p) is the largest exponent that can be LL tested. Please correct me if I'm wrong. 
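The k-count argument above can be made concrete: a factor q = 2kp+1 stays below 2^b only while k < 2^b/(2p), so the number of candidates to sieve at a given bit depth is inversely proportional to p. A quick sketch (both exponents are taken from this thread):

```python
# Candidate count at bit depth b: factors have the form q = 2kp + 1,
# so q < 2^b means k <= (2^b - 1) // (2p). Ten times the exponent
# means roughly a tenth as many k values to test.

def k_max(p, bits):
    return ((1 << bits) - 1) // (2 * p)

p_100m = 332192831    # first 100M-digit exponent (from this thread)
p_1b = 3321928171     # Luigi's billion-digit exponent (from this thread)

ratio = k_max(p_100m, 62) / k_max(p_1b, 62)
print(ratio)          # close to 10: the smaller exponent has ~10x the work
```

This is why a billion-digit exponent can reach a deeper bit level in the same wall-clock time, the 12.5% per-k overhead notwithstanding.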
[QUOTE=Uncwilly]332209991 - 332249999 out to 54 or 55 in three or four passes. 2014 expos[/QUOTE]
Duh, I feel dumb. I stepped on toes here. I will now be doing 332300001 - 332399999, likely only to 50 bits. I have improved my tracking system. I am thinking about setting up a page for this, with worktodo.ini's all zipped and ready for open ranges, etc. Next week, though, not now. 
[QUOTE=Uncwilly]Duh, I feel dumb. I stepped on toes here.[/QUOTE]
No, you just did some double-checking :rolleyes: Waiting for your page; I will link it from mine. Luigi 
[QUOTE=lpmurray]Are you running Windows or Linux? I'm running Windows XP with a 2.8 GHz P4. I am running a number in the 300 million range, which is 10 times smaller, and it took 100 hrs just to do 67 to 68 bits. Doubling the time for each bit depth, it should take me 67 days to do 71 to 72 bits. I can't figure out why you are running so much faster.[/QUOTE]
I am running on Windows XP. Mind that the program should be factor3_2, not factor3_1. Note also that the software has been optimized for [B]Athlons[/B] in both the Cygwin and GMP environments. I still have to (buy and) set up a [B]Pentium 4[/B] and recompile an optimized version. :innocent: Luigi 
[QUOTE=lpmurray]Also i'm working on 1 number in that range to large bit.[/QUOTE]
Let me know what it is, I will track individual expos that are being run 'deep'. 
The "100Million digit prefactor project" page is now up :banana:
The URL: [URL=http://www.geocities.com/onehundredmdpp/index.htm]http://www.geocities.com/onehundredmdpp/index.htm[/URL] , too bad Geocities does not allow starting their pages with numbers. 
We have several folks now working on the project and could use some more warm CPUs.
Follow the link to see where we are. [URL=http://www.geocities.com/onehundredmdpp/index.htm]http://www.geocities.com/onehundredmdpp/index.htm[/URL] Once we get a bit further I will be posting a graph. 
[QUOTE=Uncwilly]We have several folks now working on the project and could use some more warm CPUs.
Follow the link to see where we are. [URL=http://www.geocities.com/onehundredmdpp/index.htm]http://www.geocities.com/onehundredmdpp/index.htm[/URL] Once we get a bit further I will be posting a graph.[/QUOTE] Nice job! :wink: BTW, Will Edgington has released a new "facpriks" file dated June 25, 2004. It seems that now it has many more factors above 79,300,000 :rolleyes: Luigi 
Hi,
I am one of the "new folks" :grin: I am currently running some ranges from the "wide" phase (up to 55 bits), plus individual exponents from the "deep" phase (up to 62 bits). For my first individual exponent (332192953) the program found a factor quite fast :banana: (beginner's luck is not a myth :razz: ). No factor found between 2^55 and 2^62 for 332192957 and 332192969, though :no: (Now working on 332193019.) 
M332193019 no factor from 2^55 to 2^62.
Reserving M332193047. 
M332193047 has a factor: 2926561826311184081 :banana:
(no other factor found between 2^55 and 2^62) Taking up M332193053. 
[QUOTE=segmtfault]M332193047 has a factor: 2926561826311184081 :banana:
(no other factor found between 2^55 and 2^62) Taking up M332193053.[/QUOTE] Lucky you... :w00t: :grin: Luigi 
Today's results:
M332193109 no factor from 2^58 to 2^62.
M332193149 no factor from 2^58 to 2^60.
M332193233 no factor from 2^58 to 2^59.
M332193431 no factor from 2^58 to 2^59.
M332193457 no factor from 2^58 to 2^59.
M332193461 no factor from 2^58 to 2^59.
M332193469 no factor from 2^58 to 2^59.
M332193529 no factor from 2^58 to 2^59.
M332193539 no factor from 2^58 to 2^59.
M332193629 no factor from 2^58 to 2^59.
M332193721 no factor from 2^58 to 2^59.
M332193751 no factor from 2^58 to 2^59.
M332193817 no factor from 2^58 to 2^59.
M332193833 no factor from 2^58 to 2^59.
M332193859 no factor from 2^58 to 2^59.
Currently taking up M332193149 to 62 bits. 
M332192897 no factor from 2^61 to 2^63.

M332193149 no factor from 2^60 to 2^62.
M332193233 no factor from 2^59 to 2^62.
M332193431 no factor from 2^59 to 2^62.
M332193457 no factor from 2^59 to 2^62.
M332193461 no factor from 2^59 to 2^61. 
Site updated with the latest deep results.
[URL=http://www.geocities.com/onehundredmdpp/index.htm]http://www.geocities.com/onehundredmdpp/index.htm[/URL] :banana: 
332192831 no factor to 2^65

M332192957 no factor from 2^62 to 2^66. :no:
Taking up M332192969 to 65 bits 
M332192969 no factor from 2^62 to 2^65.
Taking up M332193019 
M332193019 no factor from 2^60 to 2^65.
Taking up M332193109 
M332193109 no factor from 2^62 to 2^67.
Taking up M332193149 to 65 
M332193149 no factor from 2^62 to 2^65.
Taking up M332193233 to 65 
M332193233 no factor from 2^62 to 2^65.

M332193469 no factor from 2^59 to 2^60.
This is the first post since July 20th. What's going on? :question: 
I've gotten distracted. I should be able to play catch-up either tomorrow or the day after.

332193529 no factor to 2^60.
I'll do the next 7 to 2^60. 
I believe that we are back on track now.
[url]http://www.geocities.com/onehundredmdpp/index.htm[/url] should be up to date. 
[QUOTE=Uncwilly]I believe that we are back on track now.
[url]http://www.geocities.com/onehundredmdpp/index.htm[/url] should be up to date.[/QUOTE] Uncwilly, your page has some dead links: where you wrote "www.gimps.it" you should use "www.moregimps.it" :) Luigi 
Not any 'more'. :grin:

[QUOTE=Uncwilly]Not any 'more'. :grin:[/QUOTE]
Well, more stands for MORELLI, my family name... Luigi 
[QUOTE=clowns789]I'll do the next 7 to 2^60.[/QUOTE]
None had factors. :no: 
M332192897 no factor from 2^63 to 2^64.

M332193859 no factor from 2^60 to 2^62.

Congratulations, Gonçalo
[QUOTE=Uncwilly]With GIMPS/PrimeNet officially working on the 10M digit issue and Luigi's Operation Billion Digit working toward 1,000M, there seemed to be a gap. Nobody has organized the work for the 100M digit prime.
So here is what I suggest. We start working on the 100M digit range. Using Luigi's Factor3_1 and Factor3_1b we can help clear the way for the P-1 and LL work that will follow. I have done some preliminary work in factoring the range starting with M332192831 (the first prime expo. that will yield a 100M digit number). Starting here and working up to higher expos seems wise. If we can get a good projection of where the next Mersenne prime might be, based upon the 1.47p rule, we can apply effort there, too. I would like to suggest doing batch work: we can work in ranges, report results back, etc. More thoughts later. Anyone who wants to help, speak up.[/QUOTE]
What are you talking about?

Just checking: is this project alive and well?
Any news would be helpful. Thanks :confused: 
Reserving next range on MDPP site
I'm working on the range 333100000 - 333199999, up to 50 bits. I grabbed the worktodo file off the MDPP site. Is this search still active?
I'm running a P3-500 on it, so the range should be done fairly quickly. If this search is dead, I'll waste my time on the Billion search instead. Curtis 
I need to hop back into the driver's seat. I have been away from this project a while. It got to the point that there were just 1.5 interested parties.

I don't care which I run for fun, this or the billion-digit search. I picked up a couple weeks' worth of factoring on the Billion site last night, so if you get things updated within 2-3 weeks, I'll be happy to contribute.
I finished the range mentioned yesterday, but I'm not sure where to email it... Yahoo rejected the mail sent to the addy on the site. If your efforts are better spent helping admin the BD search, I'm happy to help there, too, and let this one slide into darkness. Curtis 
I'll have everything from 333100000 to 333600000 finished to 52 bits by the weekend. Where do I email the results files? Can I get your filtering program, so I can extend the ranges to 54 or 55 bits for exponents without a factor? I have 3 slow (500 or slower) machines working on factoring.
Curtis 
The email address listed on the site is active again. The program is touchy (it has a quirk or two), and I don't want somebody to lose data when it gets run. I will try to be more on top of things...

I'll finish the first range (333100-333200K) to 55 bits over the weekend. I built the worktodo file by hand from the 50-bit results.... UncWilly, did you get the other results files? This search seems useful in the nearer future, so I want to run half my classroom's machines on it (they're nearly all on Billion right now); there's joy in finding factors so quickly, too. Maybe these posts will stir a little interest in the project!
I plan to build another worktodo to 56 or 57 bits next week for the same range; it's easy to change the bit limits and delete the exponents with factors found during the previous pass. Any guidance? Curtis 
M333700000 to M333799999 from 1 to 50 bits

Factoring probabilities
Hi, I'm interested in how likely we are to find a factor by trial factoring.
Primenet records a found factor, but regrettably (with hindsight) not the number of tests required to find it. With 100Mdpp you have the opportunity to record/collate this important stat. Therefore, could the person who has these results please count them up for me, and ideally update your project website to include this info. If you could break it down by bit depth, that would be great! Or raw data would be acceptable, and I will collate my own stats. Something like, e.g., 332,192,831 through nnn,nnn,nnn tested exhaustively to 50 bits and found fff factors (followed by the next range by someone else). If you have no idea what I'm talking about, say so and I will try to clarify. 
I think that I may know what you mean. Later today I will get off my rear and work on the site when I get to the machine that has the data on it.
BTW, do you mean "xxx,xxx,xxx to yyy,yyy,yyy have been tested to zz bits and found aaa factors" is acceptable for any zz? Because once I ran groups through a set bit level, I culled all that had factors, then reprocessed the remainders to a higher level. 
M333700000 to M333799999 from 1 to 50 bits => Complete

Peter
It seems easy enough to report batches (likely in the 100K increments UncWilly has set up on his page) by the number of factors found from xx bits to yy bits. UncWilly has set up the search to run to 50 bits first, then run the non-factored numbers to 52 or 53, etc. This seems like the type of data you wish for, in the sense of "if we run one more bit on this batch, how much time per factor does it cost?" My current batch is running 50 to 55 bits, and I'll post the number of factors found (and time spent/machine speed) Tuesday when the run finishes. However, though I have no specific thread in mind, these stats have been run extensively on smaller numbers for GIMPS, to determine the most efficient split of time between factoring and LL testing; George has the bit depth set so that when factoring takes more time per expected factor found than an LL test would, Primenet moves the exponent from TF status to LL test status. For exponents around 28 million (the last size I TF'ed for Primenet), that cutoff was 69 bits. For project billion, someone calculated the level as roughly 85 bits (ouch!), but the calculation is heavily dependent on the speed of the code doing the factoring and the code doing the LL testing, so none of us would really work to 85 bits on a number (not to mention the years it would take!). Curtis 
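Curtis's cutoff argument is easy to sketch numerically. Using the ~1/n chance of a factor between 2^n and 2^(n+1) and a cost that roughly doubles with each extra bit (both discussed elsewhere in this thread), the expected time per factor blows up quickly with depth. The unit cost below is arbitrary; this is a toy model, not a benchmark of any actual program.

```python
# Toy model of the TF cutoff: sieving bit level n costs about twice as
# much as level n-1, while the chance of a factor in (2^n, 2^(n+1)) is
# about 1/n. Expected time per factor = cost / probability.
# unit_cost is an arbitrary time unit, not a measured benchmark.

def time_per_factor(n, base_bits=50, unit_cost=1.0):
    cost = unit_cost * 2 ** (n - base_bits)   # doubling per bit
    prob = 1.0 / n                            # ~1/n chance at level n
    return cost / prob

for n in (55, 60, 65, 70):
    print(n, time_per_factor(n))
```

Whatever the real constants are, the shape is the same: the optimal stopping bit is wherever this curve crosses the cost of an LL test on the same machine.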
Range: 333600000 - 333699999
Number of exponents: 5127, from 1 to 50. Reserved 
[QUOTE=Peter Nelson]Hi, I'm interested in how likely we are to find a factor by trial factoring.[/QUOTE]
AFAIK we have only the [i]empirical[/i] statement on [url="http://mersenne.org/math.htm"]http://mersenne.org/math.htm[/url] (last paragraph in the "Trial Factoring" section) that the chance of finding a factor between 2^x and 2^(x+1) is about 1/x. But note that (a) this was derived from TF on rather smaller exponents and ranges, and (b) it's an [i]approximation[/i] that breaks down when the size of the Mersenne's exponent is too close to the size of the factors being tried (i.e., k in 2kp+1 is small). See the thread "Probability of finding a factor" at [url="http://www.mersenneforum.org/showthread.php?t=2689"]http://www.mersenneforum.org/showthread.php?t=2689[/url] 
Range: 333600000 - 333699999
From 1 => 50 bit depth: complete. 
Reserving 332193431 and 332193457 for "deep" searching (likely 67 bits).
Curtis 
[code]
333400000 - 333499999
333500000 - 333599999
[/code] I will take these ranges from 1 to [b]55[/b] bit depth. :showoff: 
Correction:
I will take these ranges from 1 to 50 bit depth 
Hold on, I have the results of those to that depth. I am working on a site update at this very moment. Check back tomorrow for a full update.

Uncwilly, I think you have at least mostly understood my request. Thanks for trying to help me.
Cheesehead: thanks for repeating the general approximation formula from elsewhere. However, this is precisely what I'm NOT satisfied with. As we only test primes as potential factors of Mersenne candidates, we ignore all nonprime numbers, i.e. ignore even ones, obviously, and also those we have already established are composite. This means that in a given range of numbers we do not even begin trial factoring a number because it's not going to be what we look for. So, in part, the probability of finding a factor when searching all exponents in a range A to B depends on the distribution of Mersennes within that range. The distribution will determine how many there are to test. This in itself skews the probability.

Then my major problem was that we might say in a range I found 20 factors, but have not counted up those numbers where no factor was found. Also, expending more effort to TF to a deeper bit level makes it more likely to find a factor. As I understand it, the current prime95 client makes a choice of what bit depth will be worthwhile based on the relative effort of its TF vs. LL. Now, what if a modified algorithm or project had a better or worse TF/LL ratio? The optimal cutoff would therefore be somewhere else (more or fewer bits of factoring effort).

I've read discussion elsewhere about actual calculations using the formula cheesehead put forward, using combinatorial probability rules for each bit depth; i.e. the probability of finding a factor from 2^63 to 2^68 has to take into account that someone already did testing from 2^3 up to 2^62, finding nothing. As a result, I was trying to build a computer model or algorithm which would do this properly and very accurately. Since the TF/LL ratio varies between client architectures (e.g. SSE2) and even at different bit depths (e.g. a P4 above 2^64 uses SSE2, so is fast, but below is slow), I wanted to benchmark factoring on a given machine, then use these speeds combined with the PROPER probability results. 
For example, prime95 can do factoring, but so can Factor3_2. I have benchmarked these on various size numbers, e.g. the 10 million digit, 100 million digit, and billion digit ranges (up to a given bit depth, anyway). I am hoping to obtain some of the data on the probability of discovering a factor in a test from empirical research, e.g. in your project. As I pointed out, I need to know a test was tried even if it was unsuccessful in discovering a factor. I hope the above gives some insight into my motivation. I will look on the website for any useful info you post there. Thank you. 
[QUOTE=Peter Nelson]Cheesehead: thanks for repeating the general approximation formula from elsewhere. However this is precisely what I'm NOT satisfied with.[/quote]
Please let us know if you find a better approximation. I've been surprised not to find any. [quote]As we only test primes as potential factors of mersenne candidates, we ignore all nonprime numbers. ie ignore Even ones obviously and also those we already established are composite.[/quote] Yes, this optimization is built into the Prime95 TF code, but it doesn't affect the 1/n approximation. [quote]This means that in a given range of numbers we do not even begin trial factoring a number because it's not going to be what we look for.[/quote] Hmmm? Can you restate that? [quote]So in part, the probability of finding a factor when searching all exponents in a range A to B depends on the distribution of mersennes within that range. The distribution will determine how many there are to test.[/quote] Can you restate that? When you write "mersennes", do you mean Mersenne primes or Mersenne numbers (not necessarily prime)? When you write "searching all exponents in a range A to B" do you mean searching Mersenne numbers (2^exponent - 1) for factors between A and B, or for factors between 2^A and 2^B, or searching for factors of all Mersenne numbers whose exponents are between A and B, or what? It would help if you'd present specific numeric examples of what you mean. [quote]Then my major problem was that we might say in a range I found 20 factors, but have not counted up those numbers where no factor was found.[/quote] Do you mean that someone tested all numbers in a range, found 20 factors, but does not know the total count of numbers tested? [quote]As I understand it the current prime95 client makes a choice of what bit depth it will be worthwhile based on relative effort of its TF vs LL.[/quote] Actually, those bit depth limits were separately computed, then hardcoded into Prime95. Once, a few years ago, George recomputed the limits after changing to a more efficient LL algorithm, then substituted new values for the hardcoded limits in Prime95. 
It turns out that the optimum limits are not much affected by minor changes in algorithm speed, so there's been no revision lately. [quote]Now, what if a modified algorithm or project had a better or worse TF/LL ratio. The optimal cutoff would therefore be somewhere else (more or less bits of factoring effort).[/quote] And if the result were sufficiently different, the hardcoded values might be changed again, or even an algorithm installed to calculate variable limits for sufficiently differing circumstances. [quote]ie the probability of finding a factor from 2^63 - 2^68 has to take into account that someone already did testing from 2^3 up to 2^62 finding nothing.[/quote] Yes, but that's already taken into account in the 1/n approximation. The 1/n formula applies to the sizes of smallest known factors, AFAIK. [quote]I am hoping to obtain some of the data on probability of discovering a factor in a test from empirical research eg in your project.[/quote] That's what the 1/n approximation is: an empirical result derived from the results of GIMPS trial factoring. What do you want that's different? 
[QUOTE=cheesehead]AFAIK we have only the [i]empirical[/i] statement on [url="http://mersenne.org/math.htm"]http://mersenne.org/math.htm[/url] (last paragraph in "Trial Factoring" section) that the chance of finding a factor between 2^x and 2^(x+1) is about 1/x. But note that (a) this was derived from TF on rather smaller exponents and ranges, and (b) it's an [i]approximation[/i] that breaks down when the size of the Mersenne's exponent is too close to the size of the factors being tried (i.e, k in 2kp+1 is small).[/QUOTE]
I should have added two more notes: c) the approximation applies to factors of Mersenne numbers, not necessarily factors of integers in general, and d) the approximation was derived from, and therefore applies to, the lowest factor of a given Mersenne number. The approximation is not dependent on use of the trial factoring method per se, but it is dependent on there being no factor below 2^x (and the trial factoring method provides an exhaustive search for such, unlike P-1 or ECM). 
[QUOTE=cheesehead]
That's what the 1/n approximation is: an empirical result derived from the results of GIMPS trial factoring.[/QUOTE] If you aren't too near the square root of a number, the probability that a "random number" has no factor between c^a and c^b is a/b. Two derivations of this heuristic are shown in the [URL=http://elevensmooth.com/MathFAQ.html#NoFactors]ElevenSmooth Math FAQ[/URL] (that example uses c=10, but it works for any c > 1). Thus the 1/n is not merely empirical observation; it is the expected result for random factors, and for sufficiently large factors Mersenne numbers are empirically observed to behave randomly. 
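The a/b heuristic is easy to sanity-check empirically at small scale. The sketch below counts integers up to N with no prime factor in (10, 100], where a/b predicts roughly 1/2 (the observed value runs a bit above 0.5 for numbers this small, as the heuristic is asymptotic):

```python
# Empirical check of the a/b heuristic: the fraction of integers <= N with
# no prime factor in (c^a, c^b] should be near a/b. Here c=10, a=1, b=2,
# so the prediction is about 0.5.

def no_factor_fraction(N, lo, hi):
    survives = [True] * (N + 1)
    for q in range(lo + 1, hi + 1):
        # trial-division primality test, fine for q this small
        if q > 1 and all(q % d for d in range(2, int(q ** 0.5) + 1)):
            for multiple in range(q, N + 1, q):   # sieve out q's multiples
                survives[multiple] = False
    return sum(survives[1:]) / N

print(no_factor_fraction(100_000, 10, 100))   # in the neighborhood of 0.5
```

The same sieve with wider intervals (e.g. 10 to 1000, predicting 1/3) tracks the a/b prediction increasingly well.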
Results:
[code]
333400000 - 333499999   5021   in progress   Done from 1 to 50 bit depth
333500000 - 333599999   5073   in progress   Done from 1 to 50 bit depth
333600000 - 333699999   5127   Done from 50 to 52 bit depth
[/code] 
[QUOTE=wblipp]If you aren't too near the square root of a number, the probability that a "random number" has no factor between c^a and c^b is a/b. Two derivations of this heuristic are shown in the [url="http://elevensmooth.com/MathFAQ.html#NoFactors"]ElevenSmooth Math FAQ[/url] (that example uses c=10, but it works for any c > 1).
Thus the 1/n is not merely empirical observation; it is the expected result for random factors, and for sufficiently large factors Mersenne numbers are empirically observed to behave randomly.[/QUOTE] Great! :banana: THANKS!! (Memo to self: read more math FAQs.) 
Range
Hello,
I have finished, and have sent in, the range 333300000 - 333399999. I am now working on the range 333600000 - 333699999. Best regards, Domenico 
[QUOTE=VBCurtis]I'll have everything from 333100000 to 333600000 finished to 52 bits by the weekend. Where do I email the results files? Can I get your filtering program, so I can extend the ranges to 54 or 55 bits for exponents without a factor? I have 3 slow (500 or slower) machines working on factoring.
Curtis[/QUOTE] Cedric, I finished the 333400000 and 333500000 ranges to 52 bits a month ago; that quote is from Feb 14. I'm currently taking those ranges to the bit depth you mentioned... in fact, I already have one of them done to 54. Please focus on a different range. Since nobody else was working on the ranges, I wasn't as clear as I could have been about precisely reserving each of those 5 ranges; for that I am sorry. I thought my blanket statement was sufficient. However, 333100000, 333200000, 333400000, and 333500000 are still active on separate machines in my classroom. These are the four ranges marked "in progress" on UncWilly's website. 333300000 never got its own machine, so I'm thankful UncWilly didn't mark it "in progress", even though my post said I'd work on it. Curtis 
OK, but for now I have uploaded my results.txt file.
I did not know that you had already taken these ranges. For the moment I am working on my 15k. Regards, Cedric Vonck 
Doing 333700000 to 333799999 ;)

[QUOTE=Keller]Doing 333700000 to 333799999 ;)[/QUOTE]
Excuse me for asking... Are you THAT Wilfrid Keller? :w00t: Luigi 
No, sorry, I am just a 16-year-old pupil with the same surname ;)
And I don't think I am related to him... a lot of Kellers in Germany ;) 
[QUOTE=Keller]No, sorry, I am just a 16-year-old pupil with the same surname ;)
And I don't think I am related to him... a lot of Kellers in Germany ;)[/QUOTE] So is it true that you're posting from your basement? (Sorry, couldn't resist. ;)) 
[QUOTE=CedricVonck]M333700000 to M333799999 from 1 to 50 bits => Complete[/QUOTE]
Keller, that range has already been worked on, but it hasn't been updated on the site yet. Note that the first three ranges with Mersenne icons on them are available; UncWilly ran them to 50 or 52 bits, but you can do the next step. These ranges are 332800000, 332900000, and 333000000. All the rest of the ranges are in progress, I believe; check older posts in this forum to make sure. Glad you're joining us! Curtis 
Working on 333800000 and 333900000 to 52 bits. These are the next two ranges after those listed on the MDPP site.

I decided to have a go at deep factoring and took 332193469 from 60 to 61 bits, but no factor was found. So I'm now taking it from 61 to 65, and 332192831 from 65 to 70. I'm also taking the following from 60 to 63 bits:
332193469 332193529 332193539 332193629 332193721 332193751 332193817 332193833
Regards, Foots 
Has anyone tried using v24.12 of Prime95? It can handle these exponents.

[QUOTE=garo]Has anyone tried using v24.12 of Prime95? It can handle these exponents.[/QUOTE]
It's not yet sure that it works correctly, as the last runs failed to check some factors. IMHO it would be better to wait for some more thorough checks. See: [url]http://www.mersenneforum.org/showpost.php?p=56363&postcount=17[/url] Luigi 
M332193469 no factor from 2^60 to 2^61.
M332193529 no factor from 2^60 to 2^63.
M332193539 no factor from 2^60 to 2^63.
M332193629 no factor from 2^60 to 2^63.
M332193721 no factor from 2^60 to 2^63.
M332193751 no factor from 2^60 to 2^63.
M332193817 no factor from 2^60 to 2^63.
M332193833 no factor from 2^60 to 2^63.
M332193469 no factor from 2^61 to 2^64.
The one remaining is 332192831, 65 to 70. This is still being worked on. Regards, Foots 
You are correct, Luigi. It is probably worth waiting a while longer, though the thread indicates that the problem is only for exponents above 580 million. Have you tried to see what the speedup would be once it starts working correctly?

Reserving 332900000 - 332999999.
Regards, footmaster 
[QUOTE=garo]You are correct Luigi. It is probably worth waiting for a while longer though the thread indicates that the problem is only for exponents above 580 million. Have you tried to see what the speedup would be once it starts working correctly?[/QUOTE]
Based on my experience with my last Factor4, the speedup with Prime95 would be about one order of magnitude :showoff: :flex: Luigi 
You should test Prime95 with these big numbers. For example, check
332192831 -> 332199999 and verify that Prime95 finds all known factors. In Prime95 the code is different for factors less than 32 bits, so bits 1-31 are slow but bits 32-55 are very fast. Like: AdvancedFactor=332192831,332199999,32,56 Yours, Nuutti 
I am looking to hand off this project and the associated trappings to a volunteer.

I agree with nuutti. It will also be a nice QA test for George's new code. I can do it if someone can send me the list of numbers and known factors. Send me a PM.
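Any such QA pass reduces to one modular check per known factor: q divides M(p) = 2^p - 1 exactly when 2^p ≡ 1 (mod q), and a genuine factor must also have the form 2kp+1 and be ±1 mod 8. A sketch, using the factor of M332193047 reported earlier in this thread:

```python
# Check a claimed factor q of M(p) = 2^p - 1. A true factor must satisfy
# q = 2kp + 1 for some k, q = +/-1 (mod 8), and 2^p = 1 (mod q).

def is_mersenne_factor(p, q):
    return ((q - 1) % (2 * p) == 0
            and q % 8 in (1, 7)
            and pow(2, p, q) == 1)

# Factor reported for M332193047 earlier in this thread:
print(is_mersenne_factor(332193047, 2926561826311184081))
```

Running any candidate factoring program over a range and piping its reported factors through this check is a cheap way to spot false positives; missed known factors still have to be caught by comparing against the known-factor list.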

N00b
Hi, I'm new and was just testing, so I didn't reserve anything, but I can at least report that:
M332192897 no factor from 2^64 to 2^65. Weeee! My first result! :banana: 