
TTn 2003-07-24 21:02

Sounds like a pretty good idea!
You should contact Thomas R. and ask him about this.
We are just starting to sieve multiple k using Phil Carmody's sieve, so it is faster than sieving a single k, but I think fixed n is pretty fast too.

Thanks Harsh!

Citrix 2003-07-26 17:03

LLR and ksieve
Phil Carmody's k sieve will work equally fast. I am not sure how you people are planning on using it with LLR, but if it will help, I am willing to write a program that takes ksieve's output file and converts it into files compatible with LLR/NewPGen for each k in the input file.

Let me know!

Harsh Aggarwal

8) 8) 8)

Thomas11 2003-07-29 07:49


I'm currently sieving with ksieve2m - the multiple-k version of ksieve - and found it much faster than NewPGen when doing about 15 k values in parallel. And if we go to k>2^31, where NewPGen needs k to be entered in factorized form and gets a lot slower than on smaller k, then ksieve is even more clearly the better choice.

There is already an option (-l) in Phil's script to generate input files for LLR, though it is undocumented ...

Phil sent me a modified version, which can create a single ABC or LLR file from multiple del-files, with the candidates sorted in increasing order. We are currently testing it, and he will include that script in the next version of ksieve.

If you want to incorporate your coding skills then this could be a project for you:
We need a fast way to compute the weights (Nash and/or Brennen) of a large number of k values. At the beginning of the 15k project I used a simple VBS script, which submitted the value of k to PSieve, extracted the necessary information from the output, and stored it in a text file. The whole process is very slow, and PSieve can handle k<2^31 only.
The next step was some modifications of Jack Brennen's Java applet: a NashWeight applet and a stand-alone program, which reads the k values from a file and writes both kinds of weights into another file. It can do k>2^31, but it is still not very fast.
I thought about rewriting the whole thing in C or C++ using the GMP library, but I don't actually have the time to do that. So that could be your task, if you like :)
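To make the task concrete, here is a small sketch of the computation, assuming the usual description of the Nash weight for Riesel candidates: the number of n in [100001, 110000] for which k*2^n-1 has no prime factor below 257. This is plain Python rather than the C/GMP version asked for (these sizes don't need big-number arithmetic for the sieving itself), so take it as an illustration of the algorithm, not the requested implementation.

```python
# Hedged sketch of a Nash-weight computation for k*2^n - 1.
# Assumption: Nash weight = count of n in [100001, 110000] such that
# k*2^n - 1 survives trial division by all primes below 257.

def small_primes(limit):
    """Sieve of Eratosthenes: all primes below `limit`."""
    is_p = [True] * limit
    is_p[0] = is_p[1] = False
    for i in range(2, int(limit ** 0.5) + 1):
        if is_p[i]:
            for j in range(i * i, limit, i):
                is_p[j] = False
    return [i for i in range(limit) if is_p[i]]

def nash_weight(k, n_lo=100001, n_hi=110000, depth=257):
    """Count n in [n_lo, n_hi] with no factor of k*2^n - 1 below depth."""
    survives = [True] * (n_hi - n_lo + 1)
    for p in small_primes(depth):
        if k % p == 0:
            continue  # then k*2^n - 1 = -1 (mod p), never divisible by p
        r = pow(2, n_lo, p)  # 2^n mod p, updated incrementally below
        for i in range(len(survives)):
            if (k * r - 1) % p == 0:
                survives[i] = False
            r = (r * 2) % p
    return sum(survives)
```

The incremental update of 2^n mod p avoids a modular exponentiation per candidate, so the whole range costs only about 54 primes times 10000 cheap steps; a C/GMP port would mainly buy speed for batch runs over many thousands of k.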

Thomas R.

Citrix 2003-08-17 22:27

deciding the best candidate
I had a suggestion:

Instead of only looking at candidates that give 50 primes for n under 5000, we should also look at the number of Proth tests that were performed to get that result.
The candidates that give the most primes while requiring the fewest Proth tests should be the best candidates, because for large n we will have to perform the fewest Proth tests to get the most primes. 8)


8) 8) 8)

Citrix 2003-09-23 16:28

calculating weight
In reply to your previous post: I have figured out a way to generate the top n candidates by weight, within a given range and for a given number of divisors, without testing every candidate individually. It would be better to generate the top 1000 candidates and then do PRP tests on them to find the best candidate. Let me know if you want me to find the top 1000 candidates.

:cool: :cool: :cool:
