mersenneforum.org Current Factor Depth

2005-07-18, 06:58   #1
JHagerson

May 2005
Naperville, IL, USA

304₈ Posts
Current Factor Depth

Here is an update to the factor depth table.
Attached Files
 2005-07-17 Factor Depth.zip (2.0 KB, 188 views)

2005-07-18, 10:16   #2
garo

Aug 2002
Termonfeckin, IE

11×251 Posts

Hi, this is very useful. However, there seems to be some problem with your table for 33M. Assuming this signifies the range from 33-34M, there are more than 6000 numbers at 68 bits. My quick check shows over 16,000 at this level.
2005-07-18, 13:36   #3
JHagerson

May 2005
Naperville, IL, USA

2²·7² Posts

OK, Garo. I did this late last night (very early this morning in my time zone), so I might have goofed somewhere along the line. I started by downloading the 17-JUL-2005 nofactor.zip file. I extracted all of the data from 25M to 79.3M, loaded it into a database, and wrote a query to generate the tableaux of data. Is the discrepancy because this first effort looks only at nofactor.zip, when a true picture would integrate the data from factors.zip as well? Please let me know. Thank you.
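The tally JHagerson describes (extract exponent/depth pairs, group by 1M range and bit level) can be sketched in a few lines. This is a minimal illustration, not his actual database query; the "exponent,bits" input format and range bounds are assumptions based on his description.

```python
# Hypothetical sketch of the factor-depth tally: count exponents per
# 1M bucket and trial-factoring bit depth. Input format is assumed.
from collections import Counter

def tally(lines):
    counts = Counter()
    for line in lines:
        exponent, bits = map(int, line.split(","))
        if 25_000_000 <= exponent < 79_300_000:  # range used in the post
            bucket = exponent // 1_000_000 * 1_000_000  # truncate, don't round
            counts[(bucket, bits)] += 1
    return counts

sample = ["33123457,68", "33999983,68", "34000037,68"]
print(tally(sample)[(33_000_000, 68)])  # 2
```

Each (bucket, bits) count then becomes one cell of the depth table.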
2005-07-18, 15:13   #4
garo

Aug 2002
Termonfeckin, IE

11·251 Posts

No, your methodology is correct. You do not need factors.zip to generate your table. I cannot say what has caused the problem unless you tell me what query you used. Maybe you should just run it one more time? I just checked the other rows and the numbers there seem wrong too. Can you be more specific about how you made these tables?

Last fiddled with by garo on 2005-07-18 at 15:19
2005-07-18, 18:03   #5
marc

Jun 2004
UK

139 Posts

Just FWIW, with the 17th July file I get:
Code:
              60    61    62   63   64   65   66   67     68   69   70   71   72   73   74  (total)
33000000       0  4291  1258    0   39    1    5    2  16414    0    0    0    0    0    0    22010
2005-07-18, 22:41   #6
JHagerson

May 2005
Naperville, IL, USA

22·72 Posts

I think this one is better. Please let me know if you find any issues.
Attached Files
 050717.zip (1.9 KB, 179 views)

2005-07-19, 10:18   #7
tom11784

Aug 2003
Upstate NY, USA

506₈ Posts

Those numbers look much better. (I noticed problems with the 69M line in the earlier one as well.) I used to make these charts at work and host them all on my Geocities site, but that (along with all other Geocities pages) has since been blocked, so I can only do it from home, where I tend to forget to do things for a bit. If I get some time I'll upload my latest files either this evening or tomorrow onto that site, or perhaps another place so that I can maintain it from work, where I have less to do.
2005-07-19, 10:46   #8
garo

Aug 2002
Termonfeckin, IE

2761₁₀ Posts

That looks great! Thanks, JHagerson.
2005-07-19, 15:30   #9
JHagerson

May 2005
Naperville, IL, USA

2²·7² Posts

Tom11784, my effort was greatly inspired by your Geocities work. I didn't mean to "step on your toes" by taking the initiative. I think the problem with my first effort was that the calculation to assign an exponent to a "bucket" for reporting purposes rounded rather than truncated. I thought something was fishy when the 79M line showed so many values. However, that alarm did not sound loudly enough to rouse my sleep-craving brain.

I have thought about extending the chart with one of two weighting schemes. Would you think either of these (or both) is worth pursuing?

1. Multiply the factor depth by the exponent, sum those, and divide by the sum of the exponents to give the weighted average factoring depth. (Higher numbers take more effort to factor and would therefore be accorded higher weight.)
2. Compare the factor depth of each exponent with the appropriate current Prime95 trial factoring threshold. Then we can compute an indication of "percent complete."

Thank you for your feedback.
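The rounding-vs-truncation bug described above is easy to see in miniature. This sketch is illustrative only (the function names are invented), but it shows why an exponent in the upper half of a 1M range gets misfiled when the bucket calculation rounds:

```python
# Sketch of the bucket-assignment bug: an exponent of 33.6M belongs in
# the 33M row, but rounding pushes it into the 34M row.

def bucket_truncate(exponent):
    # Integer division discards the fractional part: 33,600,000 -> 33M.
    return exponent // 1_000_000 * 1_000_000

def bucket_round(exponent):
    # Rounding misplaces the upper half of every range: 33,600,000 -> 34M.
    return round(exponent / 1_000_000) * 1_000_000

print(bucket_truncate(33_600_000))  # 33000000 (correct row)
print(bucket_round(33_600_000))     # 34000000 (wrong row)
```

Rounding shifts roughly half of each range into the next row, which matches the inflated 79M line mentioned above.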
2005-07-19, 16:57   #10
garo

Aug 2002
Termonfeckin, IE

11×251 Posts

1) does not make sense, as it actually takes less time to factor a larger number at the same limit. In any case, the time difference between, say, 40M and 41M is small enough to be ignored for our purposes. 2), on the other hand, is a very good idea!
2005-07-19, 18:56   #11

cheesehead
"Richard B. Woods"
Aug 2002
Wisconsin USA

22·3·641 Posts

Quote:
 Originally Posted by JHagerson
 2. Compare the factor depth of each exponent with the appropriate current Prime95 trial factoring threshold. Then we can compute an indication of "percent complete."
Okay, but remember that each power-of-2 level contains twice as many candidate factors (for a given Mersenne number ["Mnumber"]) as the next lower level. So, "percent complete" = 100% / (2^[(threshold bits) - (so-far bits)])

For example, if an Mnumber's been TFed to 2^65 and the appropriate current Prime95 trial factoring threshold is 66 (2^66), then the TF "percent complete" is (100% / 2^[66-65]) = 50%, not 100%*(65/66) = ~98.5%.

Similarly, if an Mnumber has been TFed so far to 2^63 and the current Prime95 trial factoring threshold is 67 (2^67), the TF "percent complete" would be (100% / 2^[67-63]) = 6.25%, not 100%*(63/67) = ~94%.

Also, some Mnumbers have been TFed beyond the current threshold (for various reasons). If an Mnumber's been TFed to 2^68 whereas the current threshold for that Mnumber's exponent is 67, then that Mnumber's TF "percent complete" should be (100% / 2^[67-68]) = (100% / 0.5) = 200%.

Last fiddled with by cheesehead on 2005-07-19 at 19:04
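Cheesehead's formula and all three worked examples above can be checked with a one-line function (the function name here is invented for illustration):

```python
# "Percent complete" for trial factoring: each extra bit level doubles
# the number of candidate factors, so progress is exponential in the
# remaining bits, not a linear ratio of bit counts.

def tf_percent_complete(sofar_bits, threshold_bits):
    return 100.0 / 2 ** (threshold_bits - sofar_bits)

print(tf_percent_complete(65, 66))  # 50.0
print(tf_percent_complete(63, 67))  # 6.25
print(tf_percent_complete(68, 67))  # 200.0 (TFed beyond the threshold)
```

Note that a negative exponent difference (TFed past the threshold) naturally yields a result above 100%, matching the last example in the post.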
