n=480K-485K, p=27T-28T complete, 52073 factors: [URL]http://uwin.mine.nu/TPS/480000-485000/factorupload/tpfactors_n480K-485K_p27T-28T.txt.bz2[/URL]
Reserving 28T-29T, same range. |
mdettweiler: I've downloaded and verified your file.
|
4150G-4750G complete, over 240000 factors found:
Part 1 (n=485000-486700): [url]http://www.sendspace.com/file/qzxd9y[/url]
Part 2 (n=486700-488400): [url]http://www.sendspace.com/file/94t55i[/url]
Part 3 (n=488400-490000): [url]http://www.sendspace.com/file/68djfr[/url]
Reserving 4750G-5400G |
I reserve 5400-5600G, range 2.
|
Oddball: I've downloaded and verified your file.
45T-65T is finished. Taking 66T-90T. |
Done with 28T-29T (range 1), 49418 factors: [URL]http://uwin.mine.nu/TPS/480000-485000/factorupload/tpfactors_n480K-485K_p28T-29T.txt.bz2[/URL]
Reserving 29T-30T, same range. |
mdettweiler: I've downloaded and verified your file.
|
Reserving 5600-5800G, 485-490k range.
|
5600-5800G complete, 62023 factors found:
[url]http://www.sendspace.com/file/w42lsy[/url] |
5400-5600 done:
[url]http://www.sendspace.com/file/4ou2hw[/url] 64503 factors |
Historian, agent1: I've downloaded and verified your files.
|
29T-30T (range 1) done, 47677 factors: [URL]http://uwin.mine.nu/TPS/480000-485000/factorupload/tpfactors_n480K-485K_p29T-30T.txt.bz2[/URL]
Reserving 30T-31T. Note @Lennart: I accidentally started uploading the file to [URL]http://uwin.mine.nu/TPS/tpfactors_n480K-485K_p29T-30T.txt.bz2[/URL] instead, but canceled it partway and it didn't all get there. Can you please delete that one? Thanks. :smile: |
mdettweiler: I've downloaded and verified your file.
|
4750G-5400G complete, about 228000 factors found.
Part 1 (n=485000-486700): [url]http://www.sendspace.com/file/pfy0ts[/url]
Part 2 (n=486700-488400): [url]http://www.sendspace.com/file/w0ojzv[/url]
Part 3 (n=488400-490000): [url]http://www.sendspace.com/file/xv7v1p[/url]
Reserving 5800-6400G. |
Oddball: I've downloaded and verified your file.
Also taking 120T-140T just to keep cores busy. My previous range should be finished in a few hours. |
[quote=gribozavr;216224]Oddball: I've downloaded and verified your file.
Also taking 120T-140T just to keep cores busy. My previous range should be finished in a few hours.[/quote] Could you possibly check the removal rate somewhere within the 120T-140T range? From what I could tell by a very rough estimate in my ranges near 30T, we're not that far from optimal depth for k<100K in the n=480K-485K range and I wouldn't be surprised if 140T is sufficient to start LLRing in earnest. |
Finished 66T-90T.
At p=120T removal rate is around 1 k/sec. |
[QUOTE=mdettweiler;216254]Could you possibly check the removal rate somewhere within the 120T-140T range? From what I could tell by a very rough estimate in my ranges near 30T, we're not that far from optimal depth for k<100K in the n=480K-485K range and I wouldn't be surprised if 140T is sufficient to start LLRing in earnest.[/QUOTE]
previous estimates suggested an optimal depth of 3P, so there's some way to go yet. edit:- also, removing k's from sieve file (for llring) gives zero speedup of sieving. you're gonna wanna remove n at a time for llr. |
[quote=axn;216286]previous estimates suggested an optimal depth of 3P, so there's some way to go yet.
edit:- also, removing k's from sieve file (for llring) gives zero speedup of sieving. you're gonna wanna remove n at a time for llr.[/quote] Ah, I forgot about the 3P estimate. However, since we're only dealing with optimal depth for the k<100K portion of the file, i.e. 1% of it, our optimal depth would be 3P*.01=.03P=30T. Given the actual observed removal rates that would seem to be rather low, so my guess is that the original 3P estimate was therefore slightly lowballed. I see that over in the LLR reservation thread for this range the testing is being arranged by n, so that would be in line with the optimal procedure. Well, at least for k<100K; once we're done with that and ready to sieve the remainder further I suppose removing the lower portion wouldn't help much as far as sieving speed goes. Given that, it would be a bit more optimal to continue sieving all the way to optimal for the k<10M; however, that would hold up LLRing for quite a while, so the tradeoff will need to be weighed accordingly. |
[QUOTE=mdettweiler;216289]Ah, I forgot about the 3P estimate. However, since we're only dealing with optimal depth for the k<100K portion of the file, i.e. 1% of it, our optimal depth would be 3P*.01=.03P=30T. Given the actual observed removal rates that would seem to be rather low, so my guess is that the original 3P estimate was therefore slightly lowballed.[/quote]
Not really. That's what I am trying to point out. Breaking off a chunk of k (across all n's) will not help with the sieving. Breaking off a chunk of n's (across all k's) will. So there is no point in treating k < 100K as a special case. I understand that people think that k < 100K can be tested faster with LLR so somehow their optimal sieve depth should be lower, but that is not true. When the whole range is sieved to the collective optimal depth, you'd have sieved k < 100K for free. So breaking off chunks before reaching 3P is _never_ a good idea here. [QUOTE=mdettweiler;216289]however, that would hold up LLRing for quite a while, so the tradeoff will need to be weighed accordingly.[/QUOTE] If focus is on starting LLR soon, the "correct" procedure would be to concentrate the sieving for fewer n's -- say the first 1000, and get that to optimal sieve depth (still should be around 3P -- sieve doesn't lose much efficiency from sieving 1000n instead of 5000n). |
Reserving 6400-6550G.
Calculating the optimal sieve depth is quite tricky - TPS is looking for only one twin in a range, not all of the twins in a range. For example, if a twin is found at n=482391, all of the sieving done for n>482391 is useless, since those candidates will not be tested. |
[quote=axn;216294]Not really. That's what I am trying to point out. Breaking off a chunk of k (across all n's) will not help with the sieving. Breaking off a chunk of n's (across all k's) will. So there is no point in treating k < 100K as a special case.
I understand that people think that k < 100K can be tested faster with LLR so somehow their optimal sieve depth should be lower, but that is not true. When the whole range is sieved to the collective optimal depth, you'd have sieved k < 100K for free. So breaking off chunks before reaching 3P is _never_ a good idea here. If focus is on starting LLR soon, the "correct" procedure would be to concentrate the sieving for fewer n's -- say the first 1000, and get that to optimal sieve depth (still should be around 3P -- sieve doesn't lose much efficiency from sieving 1000n instead of 5000n).[/quote] Ah, okay, I see what you mean. I'm still thinking in terms of a sieve where the primary variation is in n (a la NPLB, RPS, etc.) rather than in k as it is here. :rolleyes: In that case, then, I would suggest that we concentrate all our sieving resources on the n=480K-485K portion. Right now it's just me and gribozavr on that range, with everybody else focusing on 485K-490K; since the goal seems to be to start LLRing as soon as possible, I'd think the former range would be the place to be. |
The reservations and stats have been updated.
Gribozavr, could you tell us the number of candidates in both ranges and the number of factors found since the sieve file was posted? I'm just wondering how many of them are duplicates (two factors for the same candidate). |
[QUOTE=Historian;216295]Calculating the optimal sieve depth is quite tricky - TPS is looking for only one twin in a range, not all of the twins in a range. For example, if a twin is found at n=482391, all of the sieving done for n>482391 is useless, since those candidates will not be tested.[/QUOTE]
I do not see it that way at all. This would be a true statement when the focus is on a single n (390000 or bust, for example :smile:). But when you're doing a range of n, you need not stop when you find a twin. You can just keep going to ever higher n's -- and that would basically mean, just keep on testing. |
Oddball: Around 3.5% of duplicates. Removed 7.9% of factors.
21876778 480000-484999_11may2010.txt
20148312 480000-484999_27may2010.txt |
[QUOTE=gribozavr;216358]
Removed 7.9% of factors. 21876778 480000-484999_11may2010.txt 20148312 480000-484999_27may2010.txt[/QUOTE] Interesting. What are the stats for the 485000-490000 range? |
Factored 7.1% of numbers.
25947380 485000-489999__11may2010.txt
24099433 485000-489999__28may2010.txt |
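Both of those percentages are just line-count deltas between successive sieve files. A quick sketch of the arithmetic, using the counts quoted in this and the preceding posts:

```python
# Candidate counts (number of k's) quoted in the posts above.
counts = {
    "480000-484999": (21876778, 20148312),  # 11 May -> 27 May
    "485000-489999": (25947380, 24099433),  # 11 May -> 28 May
}

def removed_pct(before, after):
    """Percent of candidates eliminated between two sieve-file snapshots."""
    return 100.0 * (before - after) / before

for rng, (before, after) in counts.items():
    print(f"{rng}: {removed_pct(before, after):.1f}% removed")
# -> 7.9% for range 1 and 7.1% for range 2, matching the posts
```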
Also taking 140T-160T.
|
[QUOTE=gribozavr;216358]Oddball: Around 3.5% of duplicates. [/QUOTE]
I see. That's less than I thought (my original guess was around 10-20%). Anyway, 5800G-6400G is complete with over 173000 factors found.
Part 1 (n=485000-486700): [url]http://www.sendspace.com/file/v4gmyz[/url]
Part 2 (n=486700-488400): [url]http://www.sendspace.com/file/vqpknt[/url]
Part 3 (n=488400-490000): [url]http://www.sendspace.com/file/b6t0sh[/url]
Reserving 6550-7150G, same range. |
Oddball: I've downloaded and verified your files.
|
I reserve 7150-7350G, range 2.
|
Finished 120T-140T.
|
30T-31T (range 1) done, 92058 factors: [url]http://www.sendspace.com/file/3i10xg[/url] (Lennart's site is offline at the moment)
One interesting oddity about this range: it has about 92K factors, whereas my earlier 29T-30T range just a little farther up this thread has only 47K factors. Looking back through the archived sieve reservations thread, I see:
55806 in 25T-26T
53300 in 26T-27T
52073 in 27T-28T
49418 in 28T-29T
Thus, a logical progression of steadily decreasing yield as p increases, which is exactly what one would expect. Yet 30T-31T remains an anomaly, with almost twice the factors. gribozavr, could you possibly check my 30T-31T range to get a duplication rate, and compare it with the duplication rate in 29T-30T (or any of my other 25T+ files--their yields were similar so I'd expect little difference in duplication between them)? I wonder if part of the 30T-31T file got duplicated somehow, though I can't imagine how. Meanwhile, reserving 31T-35T--may as well take the rest of that gap and polish it off in one fell swoop. :wink: |
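The steadily decreasing yield follows the standard sieve heuristic: each prime p eliminates a twin candidate with probability about 2/p (one chance per side), so survival to depth p falls off like 1/ln(p)^2 and the expected factor count in [p1, p2] is roughly N*(1 - (ln p1 / ln p2)^2). A back-of-envelope check against the counts quoted above (N is the 11 May candidate count for range 1; this is a rough model, not an exact prediction):

```python
import math

N = 21_876_778  # k's in 480000-484999_11may2010.txt

def expected_factors(p1, p2, n=N):
    # Each prime p removes a twin candidate with probability ~2/p
    # (one chance per side), so survival scales like 1/ln(p)^2.
    return n * (1 - (math.log(p1) / math.log(p2)) ** 2)

for t in range(25, 31):
    est = expected_factors(t * 10**12, (t + 1) * 10**12)
    print(f"{t}T-{t+1}T: ~{est:,.0f} factors expected")
```

This predicts roughly 55,500 factors for 25T-26T, tapering to roughly 46,200 for 30T-31T, so the reported 92,058 for 30T-31T is almost exactly twice what the model expects.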
[QUOTE=mdettweiler;216670]30T-31T (range 1) done, 92058 factors: [url]http://www.sendspace.com/file/3i10xg[/url] (Lennart's site is offline at the moment)
One interesting oddity about this range: it has about 92K factors, whereas my earlier 29T-30T range just a little farther up this thread has only 47K factors. [/QUOTE] You have managed to copy the whole list of factors twice. After the first 46037 lines, the factors start again from 30T. |
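A doubled factor file like this can be screened for before uploading: if the second half of the file's lines exactly repeats the first half, the list was concatenated twice. A minimal sketch (the filename in the usage comment is hypothetical):

```python
def looks_doubled(path):
    """Return True if the second half of the file's lines exactly
    repeats the first half (i.e., the factor list was copied twice)."""
    with open(path) as f:
        lines = f.read().splitlines()
    half = len(lines) // 2
    return len(lines) > 0 and len(lines) % 2 == 0 and lines[:half] == lines[half:]

# Hypothetical usage:
# if looks_doubled("tpfactors_n480K-485K_p30T-31T.txt"):
#     print("factor list appears to have been copied twice")
```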
[quote=axn;216682]You have managed to copy the whole list of factors twice. After the first 46037 lines, the factors start again from 30T.[/quote]
Wowsers! That's very weird. The range was processed from start to finish uninterrupted, and I didn't even open the tpsieve directory in the meantime. The only thing I can think of is some very strange as-yet-unnoticed bug in tpsieve... :ermm: |
Taking 160T-200T.
|
[QUOTE=gribozavr;216741]Taking 160T-200T.[/QUOTE]
gribozavr, did you download and verify mdettweiler's latest file? If so, Oddball could clean the thread up. 6400-6550G complete for range 2, 40583 factors found: [url]http://www.sendspace.com/file/j00jhi[/url] |
mdettweiler, Historian: I've downloaded and verified your files.
Finished 140-160T. |
Taking 200T-300T, I guess.
|
7150-7350 done:
[url]http://www.sendspace.com/file/t505fm[/url] 48462 factors |
agent1: I've downloaded and verified your file.
Finished 160T-200T. Taking 300T-400T. |
Roughly how long does a range of 1T take?
|
I get around 68M-72M p/sec on my dual-core i5 laptop. If my math is right, that works out to a tad over 4 hours for 1T, or about 8 continuous days for 50T.
|
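That arithmetic checks out. A quick sketch (70M p/sec is the midpoint of the throughput quoted above; actual rates will vary by machine and range):

```python
def sieve_seconds(p_range, p_per_sec=70e6):
    """Wall-clock seconds to sweep a range of p values at the given
    tpsieve throughput (p tested per second)."""
    return p_range / p_per_sec

hours_for_1T = sieve_seconds(1e12) / 3600
days_for_50T = sieve_seconds(50e12) / 86400
print(f"1T: ~{hours_for_1T:.1f} hours")   # ~4.0 hours
print(f"50T: ~{days_for_50T:.1f} days")   # ~8.3 days
```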
[quote=Joshua2;217420]I get like 68M - 72M p/sec on my dual core i5 laptop. I calculated a tad over 4 hours then for 1T if my math was right. Or 8 continuous days for 50T[/quote]
OK, I will reserve some soon.
6550G-7150G complete, about 150000 factors found.
Part 1 (n=485000-486700): [url]http://www.sendspace.com/file/t6qwb1[/url]
Part 2 (n=486700-488400): [url]http://www.sendspace.com/file/w5bbe3[/url]
Part 3 (n=488400-490000): [url]http://www.sendspace.com/file/i7ybj8[/url] |
Oddball: I've downloaded and verified your files.
|
7350G-8000G complete, about 148000 factors found.
Part 1 (n=485000-486700): [url]http://www.sendspace.com/file/x1v13l[/url]
Part 2 (n=486700-488400): [url]http://www.sendspace.com/file/wymrwt[/url]
Part 3 (n=488400-490000): [url]http://www.sendspace.com/file/4neags[/url]
Reserving 8T-10T. |
p=31T-35T (range 1) complete, 170446 factors: [URL]http://uwin.mine.nu/TPS/480000-485000/factorupload/tpfactors_n480K-485K_p31T-35T.txt.bz2[/URL]
Reserving 400T-405T, same range. |
[QUOTE=mdettweiler;217827]p=31T-35T (range 1) complete, 170446 factors: [URL]http://uwin.mine.nu/TPS/480000-485000/factorupload/tpfactors_n480K-485K_p31T-35T.txt.bz2[/URL]
Reserving 400T-405T, same range.[/QUOTE] That closes all gaps and completes the 200T milestone :showoff: Thanks everyone! |
Oddball, mdettweiler: I've downloaded and verified your files.
|
While I'm waiting for my k=19 reservation request at RPS to be approved or denied, I figure I'd do some useful short-term work in the meantime.
So, I'll get the 10T-10.1T range. It should be done in a few hours. |
[QUOTE=The Carnivore;218238]
I'll get the 10T-10.1T range. It should be done in a few hours.[/QUOTE] Here you go: [url]http://www.sendspace.com/file/a9yy7y[/url] There's about 17,200 factors in that 10T-10.1T file. |
The Carnivore: I've downloaded and verified your file.
|
400T-405T (range 1) complete, 16426 factors: [URL]http://uwin.mine.nu/TPS/480000-485000/factorupload/tpfactors_n480K-485K_p400T-405T.txt.bz2[/URL]
Removal rate was approximately 17 seconds/factor. Considering that the LLR tests will take at least 5 minutes on a modern machine, we still have a ways to go. :smile: So...reserving 405T-410T! |
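The comparison above is the usual stopping rule: sieving stays worthwhile while finding a factor (which removes one candidate) costs less CPU time than LLR-testing that candidate would. A sketch using the figures from this post (17 sec/factor observed; the 5-minute LLR time is the post's own rough estimate):

```python
def keep_sieving(sec_per_factor, sec_per_llr_test):
    """True while removing a candidate via sieving is cheaper
    than removing it via an LLR test."""
    return sec_per_factor < sec_per_llr_test

sec_per_factor = 17       # observed removal rate near p=400T
sec_per_llr = 5 * 60      # assumed LLR test time on a modern machine

print(keep_sieving(sec_per_factor, sec_per_llr))  # True
print(f"sieving is ~{sec_per_llr / sec_per_factor:.0f}x more efficient")  # ~18x
```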
mdettweiler: I've downloaded and verified your file.
|
405T-410T (range 1) done, 15983 factors: [URL]http://uwin.mine.nu/TPS/480000-485000/factorupload/tpfactors_n480K-485K_p405-410T.txt.bz2[/URL]
Reserving 410T-415T. |
mdettweiler: I've downloaded and verified your file.
Finished 300T-400T. I'll now bring 490-495 to 11T because I feel that 4 GB of RAM would allow me to sieve 485-495 combined. |
[quote=gribozavr;218847]mdettweiler: I've downloaded and verified your file.
Finished 300T-400T. I'll now bring 490-495 to 11T because I feel that 4 GB of RAM would allow me to sieve 485-495 combined.[/quote] AFAIK you won't lose any efficiency if you just start sieving 490-495 from 11T and let others bring it up to that point, if you want to sieve several ranges together. I don't think the number of remaining candidates matters to tpsieve. Overall, it would be best for those who can sieve multiple ranges at once to do so, and for those who can't to cover the parts that can't be combined. |
8T-10T complete, about 388000 factors found.
Part 1 (n=485000-486700): [url]http://www.sendspace.com/file/z0lala[/url]
Part 2 (n=486700-488400): [url]http://www.sendspace.com/file/1tektj[/url]
Part 3 (n=488400-490000): [url]http://www.sendspace.com/file/4hwuby[/url] |
Oddball: I've downloaded and verified your files.
490000<n<495000 up to 11T finished. Taking 11T-15T for 485000<n<490000, 490000<n<495000. |
gribozavr, could you upload the new files (480K-485K, 485K-490K, and 490K-495K)? They were uploaded more than 30 days ago, so the sendspace links may expire soon. Also, the new files will be smaller than the old ones.
Finally, could you give us some stats comparing the number of candidates on the new files with the number of candidates on the old files? It'll be nice to see how much we've done since last month. |
Also taking 15T-50T for 485K-490K, 490K-495K.
|
[QUOTE=Oddball;219176]gribozavr, could you upload the new files (480K-485K, 485K-490K, and 490K-495K)? They were uploaded more than 30 days ago, so the sendspace links may expire soon.[/QUOTE]
The sendspace links will stay up as long as enough people are downloading them. The original links are still there, but the less popular txt.gz and txt.xz compressions are no longer available. |
[QUOTE=Historian;219218]the less popular txt.gz and txt.xz compressions are no longer available.[/QUOTE]
I'll remove the alternate compressions from the first post then. Thanks for letting me know. edit: Only the sendspace links have expired. The rapidspread ones are still there, so I'll leave those links up. |
Files:
480000-484999_19jun2010.zip [url]http://www.sendspace.com/file/zdh3kd[/url] [url]http://www.rapidspread.com/file.jsp?id=gccabwtb48[/url]
480000-484999_19jun2010.txt.gz [url]http://www.sendspace.com/file/r6z5l3[/url] [url]http://www.rapidspread.com/file.jsp?id=4wvj3lmktq[/url]
480000-484999_19jun2010.txt.xz [url]http://www.sendspace.com/file/gyqeej[/url] [url]http://www.rapidspread.com/file.jsp?id=i2bduwl8jn[/url]
485000-489999__19jun2010.zip [url]http://www.sendspace.com/file/rxdu7c[/url] [url]http://www.rapidspread.com/file.jsp?id=lt3isi0iok[/url]
485000-489999__19jun2010.txt.gz [url]http://www.sendspace.com/file/22uop5[/url] [url]http://www.rapidspread.com/file.jsp?id=ws7ti7yzjq[/url]
485000-489999__19jun2010.txt.xz [url]http://www.sendspace.com/file/8pv6tr[/url] [url]http://www.rapidspread.com/file.jsp?id=0yfjio3zz1[/url]
Stats (number of k's):
21876778 480000-484999_11may2010.txt
18977478 480000-484999_19jun2010.txt
13.2% removed
25947380 485000-489999__11may2010.txt
23215929 485000-489999__19jun2010.txt
10.5% removed |
410T-415T (range 1) done, 15780 factors: [url]http://www.sendspace.com/file/wrypid[/url] (Lennart's site is down ATM)
Reserving 415T-420T. |
10.1T-11T complete.
Part 1 (n=485000-486700): [url]http://www.sendspace.com/file/6f9gw0[/url]
Part 2 (n=486700-488400): [url]http://www.sendspace.com/file/p3wcnu[/url]
Part 3 (n=488400-490000): [url]http://www.sendspace.com/file/77y7yn[/url]
Parts 1 and 2 were done with last month's old file, but part 3 was done with the new file. |
mdettweiler, Oddball: I've downloaded and verified your files.
|
p=415T-420T (range 1) done, 13510 factors: [URL]http://uwin.mine.nu/TPS/480000-485000/factorupload/tpfactors_n480K-485K_p415T-420T.txt.bz2[/URL]
Reserving 420T-425T, same range. |
gribozavr, could you upload the file for k<100,000 and n=480000-481000? I'm going to take a break from sieving so that I can do some LLR tests.
Yes, I know that it's a bit early to start LLRing, but I'll try my luck and give it a shot anyway. After all, you can't find a twin if you don't test the candidates. edit: As a reminder, make sure that the file is as current as possible (the only k's remaining should be ones without any factors less than 200T or between 300T and 420T). |
mdettweiler: I've downloaded and verified your file.
Oddball: Here's n_480000-480999__k_1-100000__26jun2010.zip: [url]http://www.sendspace.com/file/jlz6ao[/url] |
Range 1, p=420T-425T finished, 13152 factors: [URL]http://uwin.mine.nu/TPS/480000-485000/factorupload/tpfactors_n480K-485K_p420T-425T.txt.bz2[/URL]
Reserving 425T-430T. |
mdettweiler: I've downloaded and verified your file.
Ranges 2, 3: 11T-50T finished. Reserving 50T-100T. |
Taking 100T-120T for n=485000-495000.
|
Joshua2, could you tell us how your 200T-300T range is going? I'm doing some LLR work, and the search would progress faster if the factors in that large range were removed from the file.
|
425T-430T (range 1) complete, 13188 factors: [URL]http://uwin.mine.nu/TPS/480000-485000/factorupload/tpfactors_n480K-485K_p425T-430T.txt.bz2[/URL]
Not reserving another range right away. I'll be going on vacation from the 4th to the 10th and will only have indirect access to all my machines, so to simplify things I'm just putting everything on my personal PRPnet server (doing Conjectures 'R Us work) while I'm away. When I get back I plan to pick up where I left off with TPS sieving. :smile: |
mdettweiler: I've downloaded and verified your file.
Taking 120T-200T for n=485000-495000. |
50T-100T, 100T-120T (n=485000-495000) finished.
|
Taking 200T-300T for n=485000-495000.
|
Finished 120T-200T for n=485000-495000.
|
I just checked Joshua2's profile and found that he hasn't logged on since June 10th. I'm allowing him a few more weeks to account for things like long vacations and other delays, but if he doesn't respond by the end of the month, his 200T-300T reservation will be canceled. The range doesn't have to be finished by that time, but we need another status update to be sure it's still being worked on.
On a separate note, my 430T-433T range will be done by Monday or Tuesday next week. |
[QUOTE=Oddball;220836]I just checked Joshua2's profile and found that he hasn't logged on since June 10th. I'm allowing him a few more weeks to account for things like long vacations and other delays, but if he doesn't respond by the end of the month, his 200T-300T reservation will be canceled. The range doesn't have to be finished by that time, but we need another status update to be sure it's still being worked on.[/QUOTE]
Try PMing him. If he has email notification of PMs on, he will read the PM as long as he checks his emails. |
[QUOTE=10metreh;220839]Try PMing him. [/QUOTE]
PM sent. |
430T-433T complete
Part 1 (n=480000-482500): [url]http://www.sendspace.com/file/42c09e[/url]
Part 2 (n=482500-485000): [url]http://www.sendspace.com/file/l2abxt[/url]
About 7900 factors found. |
Taking 300T-400T for n=485000-495000.
|
Taking 433T to 434T for 480000<n<485000
|
[QUOTE=Robert_47;221409]Taking 433T to 434T for 480000<n<485000[/QUOTE]
Welcome to TPS! |
Oddball: I've downloaded and verified your files.
|
433T-434T for 480000<n<485000 complete
[URL]http://www.sendspace.com/file/qrh98u[/URL] 2619 factors |
Taking 434T-440T for 480000<n<485000
|
Reserving 440T-445T for n=480K-485K.
|
Robert_47: I've downloaded and verified your file.
Finished 200T-300T for n=485000-495000. |
Taking 400T-450T for n=485000-495000.
|
Range 1, 440T-445T done, 12663 factors: [URL]http://uwin.mine.nu/TPS/480000-485000/factorupload/tpfactors_n480K-485K_p440T-445T.txt.bz2[/URL]
Reserving 445T-450T. |
434T-440T for 480000<n<485000 complete, 15391 factors.
[URL]http://www.sendspace.com/file/npni4x[/URL] |
Reserving 450T-455T for n=480K-485K
|
mdettweiler, Robert_47: I've downloaded and verified your files.
300T-400T for n=485000-495000 finished. |