mersenneforum.org > Factoring Projects > Lone Mersenne Hunters
2005-07-18, 06:58   #1
JHagerson (May 2005, Naperville, IL, USA)

Current Factor Depth

Here is an update to the factor depth table.
Attached Files
File Type: zip 2005-07-17 Factor Depth.zip (2.0 KB, 196 views)
2005-07-18, 10:16   #2
garo (Aug 2002, Termonfeckin, IE)

Hi,
This is very useful. However, there seems to be some problem with your table for 33M. Assuming this signifies the range from 33M to 34M, your table shows only around 6,000 numbers at 68 bits, but my quick check shows over 16,000 at this level.
2005-07-18, 13:36   #3
JHagerson (May 2005, Naperville, IL, USA)

OK, Garo. I did this late last night (very early this morning) in my time zone so I might have goofed somewhere along the line. I started by downloading the 17-JUL-2005 nofactor.zip file. I extracted all of the data from 25M to 79.3M, loaded it into a database, and wrote a query to generate the tableaux of data. Is the discrepancy that this first effort looks only at nofactor.zip when a true picture would integrate the data from factors.zip as well? Please let me know. Thank you.
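The pipeline described above can be sketched in a few lines, without a database at all. This is only an illustration, not the actual query: the line format ("exponent,bits" pairs) and the function name are my assumptions, and the real nofactor.zip layout may differ.

```python
from collections import Counter

def factor_depth_table(lines, lo=25_000_000, hi=79_300_000):
    """Tally exponents by 1M bucket and by trial-factoring bit depth.

    `lines` is an iterable of "exponent,bits" strings -- an assumed
    format; the real nofactor file layout may differ slightly.
    """
    counts = Counter()
    for line in lines:
        exponent, bits = (int(x) for x in line.split(","))
        if lo <= exponent < hi:
            # Truncate to the 1M bucket; rounding here misplaces exponents.
            bucket = exponent // 1_000_000 * 1_000_000
            counts[bucket, bits] += 1
    return counts

# Example usage on a few made-up lines:
sample = ["33219281,68", "33999999,68", "33500000,61", "24999999,65"]
table = factor_depth_table(sample)
print(table[33_000_000, 68])  # 2
```

Each `(bucket, bits)` count then becomes one cell of the depth table.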
2005-07-18, 15:13   #4
garo (Aug 2002, Termonfeckin, IE)

No, your methodology is correct. You do not need factors.zip to generate your table. I cannot say what has caused the problem unless you tell me what query you used. Maybe you should just run it one more time?

I just checked the other rows and the numbers there seem wrong too. Can you be more specific about how you made these tables?

Last fiddled with by garo on 2005-07-18 at 15:19
2005-07-18, 18:03   #5
marc (Jun 2004, UK)

Just FWIW.

With the 17th July file I get:

Code:
             60     61     62     63     64     65     66     67     68     69     70     71     72     73     74    Total
33000000      0   4291   1258      0     39      1      5      2  16414      0      0      0      0      0      0    22010
2005-07-18, 22:41   #6
JHagerson (May 2005, Naperville, IL, USA)

I think this one is better. Please let me know if you find any issues.
Attached Files
File Type: zip 050717.zip (1.9 KB, 189 views)
2005-07-19, 10:18   #7
tom11784 (Aug 2003, Upstate NY, USA)

Those numbers look much better. (I noticed problems in the earlier version on the 69M line as well.)
I used to make these charts at work and host them all on my Geocities site, but that site (along with all other Geocities pages) has since been blocked there, so I can only update it from home, where I tend to forget to do things for a while.
If I get some time I'll upload my latest files onto that site this evening or tomorrow, or perhaps to another place so that I can maintain it from work, where I have less to do.
2005-07-19, 10:46   #8
garo (Aug 2002, Termonfeckin, IE)

That looks great! Thanks JHagerson.
2005-07-19, 15:30   #9
JHagerson (May 2005, Naperville, IL, USA)

Tom11784, my effort was greatly inspired by your Geocities work. I didn't mean to "step on your toes" by taking the initiative.

I think the problem with my first effort was that the calculation to assign an exponent to a "bucket" for reporting purposes rounded rather than truncated. I thought something was fishy when the 79M line showed so many values. However, that alarm did not sound loudly enough to rouse my sleep-craving brain.
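The rounding-versus-truncation difference is easy to demonstrate (a toy sketch; the actual query from the first attempt is not shown in the thread, and the function names are mine):

```python
def bucket_truncated(exponent):
    # Correct: 33,999,999 stays in the 33M bucket.
    return exponent // 1_000_000 * 1_000_000

def bucket_rounded(exponent):
    # Buggy: anything at or above 33,500,000 gets pushed into the 34M bucket.
    return round(exponent / 1_000_000) * 1_000_000

e = 33_700_000
print(bucket_truncated(e))  # 33000000
print(bucket_rounded(e))    # 34000000
```

With rounding, roughly half of each million-wide range spills into the next bucket, which inflates some rows and deflates others, exactly the symptom garo and marc spotted.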

I have thought about extending the chart with one of two weighting schemes. Would you think either of these (or both) is worth pursuing?

1. Multiply the factor depth by the exponent, sum those, and divide by the sum of the exponents to give the weighted average factoring depth. (Higher numbers take more effort to factor and would therefore be accorded higher weight.)

2. Compare the factor depth of each exponent with the appropriate current Prime95 trial factoring threshold. Then we can compute an indication of "percent complete."

Thank you for your feedback.
2005-07-19, 16:57   #10
garo (Aug 2002, Termonfeckin, IE)

1) does not make sense as it actually takes less time to factor a larger number at the same limit. In any case, the time difference between say 40M and 41M is small enough to be ignored for our purposes.

2), on the other hand, is a very good idea!
2005-07-19, 18:56   #11
cheesehead ("Richard B. Woods", Aug 2002, Wisconsin USA)

Quote:
Originally Posted by JHagerson
2. Compare the factor depth of each exponent with the appropriate current Prime95 trial factoring threshold. Then we can compute an indication of "percent complete."
Okay, but remember that each power-of-2 level contains twice as many candidate factors (for a given Mersenne number ["Mnumber"]) as the next lower level. So, "percent complete" = 100% / (2^[(threshold bits) - (so-far bits)])

For example, if an Mnumber's been TFed to 2^65 and the appropriate current Prime95 trial factoring threshold is 66 (2^66), then the TF "percent complete" is (100% / 2^[66-65]) = 50%, not 100%*(65/66) = ~98.5%.

Similarly, if an Mnumber has been TFed so far to 2^63 and the current Prime95 trial factoring threshold is 67 (2^67), the TF "percent complete" would be (100% / 2^[67-63]) = 6.25%, not 100%*(63/67) = ~94%.

Also, some Mnumbers have been TFed beyond the current threshold (for various reasons). If an Mnumber's been TFed to 2^68 whereas the current threshold for that Mnumber's exponent is 67, then that Mnumber's TF "percent complete" should be (100% / 2^[67-68]) = (100% / 0.5 ) = 200%.
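The rule in the three examples above can be written as one small function (a sketch; the variable names are mine):

```python
def tf_percent_complete(sofar_bits, threshold_bits):
    """TF "percent complete" under the power-of-2 rule.

    Each extra bit level doubles the number of candidate factors, so
    progress halves for every bit still to go, and exceeds 100% for an
    exponent already factored past its current threshold.
    """
    return 100.0 / 2 ** (threshold_bits - sofar_bits)

print(tf_percent_complete(65, 66))  # 50.0
print(tf_percent_complete(63, 67))  # 6.25
print(tf_percent_complete(68, 67))  # 200.0
```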

Last fiddled with by cheesehead on 2005-07-19 at 19:04