mersenneforum.org Improving Sieving by 18%.

 2009-06-12, 05:13 #1 cipher   Feb 2007   211 Posts

Improving Sieving by 18%.

I decided to help with PSP sieving: http://www.mersenneforum.org/showthread.php?t=2666

When I downloaded the sieve file ("Sievecomb", i.e. SoB.dat), I realized that the n values had not been truncated, so I truncated it from n=1 to n=6M, since all the k's have been PRP'd to at least n=6M. The resulting file was 14% smaller and 18% faster with sr2sieve. So here it is, the new (unofficial) sieve file covering n=6M to 50M for all the k's. (The file is RAR'd; use WinRAR or WinZip to unpack it.) http://www.sendspace.com/file/iwjqzl

When you start sr2sieve, you have to make one small change on your command prompt.

Code:
Before (example): sr2sieve -s -p 70115500e9 -P 70123500e9
Now:              sr2sieve -i sr_2.abcd -p 70115500e9 -P 70123500e9

Everything else stays the same and you get the same results.

Thanks,
Cipher
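The truncation step cipher describes can be sketched in a few lines. This is illustrative only: it assumes a hypothetical plain-text layout with one "k n" pair per line, whereas the real SoB.dat and sr_2.abcd formats are more compact, so treat this as a sketch of the idea rather than a working converter.

```python
def truncate_sieve(lines, n_min):
    """Keep only candidates whose exponent n is at least n_min.

    Assumes a simplified one-"k n"-pair-per-line text format; the real
    SoB.dat / ABCD layouts differ, so this is only a sketch.
    """
    kept = []
    for line in lines:
        k, n = map(int, line.split())
        if n >= n_min:
            kept.append(line)
    return kept

# Example with made-up candidates: drop everything below n = 6,000,000.
candidates = ["21181 5999999", "22699 6000001", "24737 7500000"]
print(truncate_sieve(candidates, 6_000_000))  # keeps only the two n >= 6M entries
```

As the replies below make clear, the correct floor for PSP at this time was 1.5M, not 6M, so any real truncation would use that bound.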
 2009-06-12, 05:30 #2 ltd   Apr 2003   2²×193 Posts

I have not tried your file so far, but there is at least one problem with it: the lower bound for sieving PSP is at 1.5M! As our second-pass testing for all k has only reached the 1.5M level, all factors above that are still very important.
 2009-06-15, 14:00 #3 VJS   Dec 2004   12B₁₆ Posts

Yes, we have contemplated this before... Reducing to 6M is actually too much; at most the reduction would be to 1.5M, as Lars said. Please remove your file.

Second, the calculated increase should be about [(41e6)/(50e6)]^0.5 ≈ 0.91, i.e. roughly 9%, not 18%, which is a little weird. I think you might have removed a little too much. Also, if the efficiency were really that different, we should crop the top end.

Bring it back to, say, 1.5M.
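VJS's estimate can be checked in a couple of lines. The sketch below assumes, as the formula does, that sr2sieve's throughput scales roughly with the square root of the size of the remaining n-range (a consequence of its baby-step/giant-step approach), so shrinking the file does not shrink the runtime proportionally. The 41e6 and 50e6 figures are the range sizes from the post above.

```python
import math

def expected_gain(range_kept, range_total):
    # Under the sqrt scaling assumption, shrinking the candidate range
    # from range_total to range_kept reduces sieving time by this fraction.
    return 1.0 - math.sqrt(range_kept / range_total)

# Keeping roughly 41M of the original 50M range of n:
print(f"{expected_gain(41e6, 50e6):.1%}")  # prints 9.4%, i.e. VJS's ~9%
```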
 2009-06-16, 09:03 #4 ltd   Apr 2003   2²×193 Posts

Some information about the next steps: I have at home the most recent dat files (all known factors removed), in one form starting at n=991 and another starting at n=1.5M. I will start a test run under identical conditions to see the real speed changes.
2009-06-16, 10:34   #5
opyrt

Apr 2008
Oslo, Norway

D9₁₆ Posts

Quote:
 Originally Posted by ltd
 Some information about the next steps: I have at home the most recent dat files (all known factors removed), in one form starting at n=991 and another starting at n=1.5M. I will start a test run under identical conditions to see the real speed changes.
I'm looking forward to hearing the results.

Also, I'm planning on finding a prime this summer... Will that help?

2009-06-16, 15:51   #6
Joe O

Aug 2002

3×5²×7 Posts

Quote:
 Originally Posted by opyrt I'm looking forward to hearing the results. Also, I'm planning on finding a prime this summer... Will that help?
Finding a prime is the best thing that you can do!

Finding two primes would be even better!

2009-06-16, 20:05   #7
Sloth

Mar 2006

2·47 Posts

Quote:
 Originally Posted by Joe O Finding a prime is the best thing that you can do! Finding two primes would be even better!
Stop slacking and get to it.

2009-06-16, 21:21   #8
opyrt

Apr 2008
Oslo, Norway

7·31 Posts

Quote:
 Originally Posted by Sloth Stop slacking and get to it.
Oh, sorry! I'll get to it straight away! I'll just copy what you did, and we should have them any time soon!

 2009-06-17, 03:12 #9 VJS   Dec 2004   13·23 Posts

Joe and I spent God only knows how many weeks and CPU hours perfecting that dat starting at 991.
 2009-06-18, 14:52 #10 ltd   Apr 2003   1404₈ Posts

The test run has finished. The gain was even smaller than expected.

The file with n > 1.5M needed 151354 seconds.
The file with n = 991 needed 152360 seconds.

This means there is a gain of 0.66% from changing to the shorter file. I expect a possible error of around ±0.2%, due to the fact that the machine was in normal use for 3 hours during the test. So the expected gain would be between ~0.45% and ~0.85%.

Last fiddled with by ltd on 2009-06-18 at 14:52
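ltd's quoted gain follows directly from the two timings; nothing below is assumed beyond the numbers in the post.

```python
t_truncated = 151354  # seconds, file starting at n > 1.5M
t_full = 152360       # seconds, file starting at n = 991

# Fractional speedup from switching to the shorter file.
gain = (t_full - t_truncated) / t_full
print(f"{gain:.2%}")  # prints 0.66%
```

With the ±0.2% measurement uncertainty ltd notes, this is far below the 9% the square-root estimate predicted, let alone the original 18% claim.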
 2009-07-01, 13:34 #11 opyrt   Apr 2008 Oslo, Norway   217₁₀ Posts

Please forgive me if what I write here is totally moronic, but aren't all factors interesting with regard to having more complete data on which candidates are not prime? LLR tests only say "this is not prime", while sieving says "this candidate is divisible by that factor and is not prime", right? In my mind, this is a good reason not to cut down the sieve span.

