20190102, 07:53  #859 
"/X\(‘‘)/X\"
Jan 2013
3×977 Posts 
I find manually distributing lists to be a bit too much work.
If my systems were dedicated to DC-range TC, I'd probably get through 5+ a day. At the LL wavefront it would be closer to 2 or 3. How quickly do they come in?

As far as query tuning, I like your scoring idea. I'd add a new column on the exponents table (assuming there is one). When a matching LL or PRP result is submitted, set the score to 0.0. If a mismatch comes in, have the server set the score to 1.0. Anything otherwise interesting can have its score set with a manual update query. Then when someone's CPU, which has the "interesting" checkbox set, requests an exponent, a simple index on (score desc, exponent asc) can make the `select ... from exponents where score > 0 and exponent > $cpu_min_eligible_exponent order by score desc, exponent asc` query a fast index scan.

I should find out if there's a free version of SQL Server so I can play with it. I really enjoy optimizing SQL.

Edit: there is a free version, but it's limited to 10 GB, so not sufficient to store a replica of all the prime data. It could still be used for experiments, though.

Last fiddled with by Mark Rose on 20190102 at 07:58 
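To sanity-check the scoring idea, here's a minimal runnable sketch using SQLite from Python (all table, column, and variable names here are hypothetical stand-ins, and SQL Server syntax would differ slightly):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical exponents table with a triple-check "score" column.
cur.execute("""
    create table exponents (
        exponent integer primary key,
        score    real not null default 0.0
    )
""")
# Composite index so the hand-out query becomes a fast index scan.
cur.execute("create index ix_score on exponents (score desc, exponent asc)")

cur.executemany("insert into exponents values (?, ?)",
                [(100003, 0.0), (100019, 1.0), (100043, 0.5), (100057, 1.0)])

# A mismatched result came in: the server flags the exponent.
cur.execute("update exponents set score = 1.0 where exponent = 100003")

# Hand out the most interesting exponent this CPU is eligible for.
cpu_min_eligible_exponent = 100000
row = cur.execute("""
    select exponent from exponents
    where score > 0 and exponent > ?
    order by score desc, exponent asc
    limit 1
""", (cpu_min_eligible_exponent,)).fetchone()
print(row[0])  # -> 100003
```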
20190102, 15:53  #860  
Serpentine Vermin Jar
Jul 2014
37×89 Posts 
For the LL results, it's currently around 280 MB (although the indices on it are nearly twice that, for performance). Another somewhat larger table is the one that holds factors... lots of exponents under 1e9 with factors (and many with multiple), and those factors are stored as varchar since they can get ginormous. So that one is around 2.4 GB (1 GB for indices).

The slow part of picking out exponents needing triple checks is the aggregate/counting that has to happen during the query. In essence it looks like (in a very simple form): Code:
select exponent
from ll_results
where result_state = unverified
group by exponent
having count(exponent) > 1
Anyway... something to think about for me... some type of exponent "score" system... 
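The shape of that aggregate can be demonstrated in miniature with SQLite from Python (table, column, and state values are hypothetical stand-ins for the real schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical ll_results table; result_state 0 = unverified, 1 = verified.
cur.execute("""
    create table ll_results (
        exponent     integer,
        residue      text,
        result_state integer
    )
""")
cur.executemany("insert into ll_results values (?, ?, ?)", [
    (100003, "ab12", 0), (100003, "cd34", 0),  # two mismatched results -> needs TC
    (100019, "ee56", 1), (100019, "ee56", 1),  # matching, verified pair -> done
    (100043, "ff78", 0),                       # lone first-time result -> just wait
])

# Exponents with more than one unverified result are triple-check candidates.
rows = cur.execute("""
    select exponent from ll_results
    where result_state = 0
    group by exponent
    having count(*) > 1
""").fetchall()
print([r[0] for r in rows])  # -> [100003]
```

Note that a `distinct` in front of `exponent` would be redundant here: `group by exponent` already yields one row per exponent.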

20190102, 17:51  #861  
"/X\(‘‘)/X\"
Jan 2013
3×977 Posts 


20190102, 22:42  #862  
Serpentine Vermin Jar
Jul 2014
110011011101_{2} Posts 
I had some old junk in there before to look at any current assignments and tell me how far along they were, when they had last checked in, etc. When my workers were starving for things to do, I did poach some assignments for triple checks if that assignment hadn't reported in for a month, or whatever.

Bear in mind that quite a few of these triple checks were for larger exponents where maybe one result was suspicious, so it was done again as a "first time check" which mismatched, and maybe it was previously assigned but had expired, so that expired assignment got converted to a live double (really a triple) check. I didn't feel bad about poaching assignments that hadn't reported in for months or even years.

I also streamlined a hideous WHERE clause I was using and turned it into a custom join, which really helps (I'm not matching huge datasets that I whittle down with a WHERE anymore). When I'm doing these types of queries for my own use I don't typically bother to optimize much unless it's really horrible. So yeah, it went from 8 seconds to 1 second, and there went my lunch break. LOL

I even took another look at a different query that looks for self-verified work and got that running in about half the time... it shouldn't have been taking as long as it did (it only returns a pair of results now anyway) but I was sloppy when I wrote it, so it was taking twice as long as it should have. That one's trickier since it's doing a join on the same table but has to look for cases where the user is the same but with a different shift and/or date for the result, and those are the *only* two results used to verify. Kind of a pain, but it only takes 4 seconds now, and it's a pretty big table it's working with, so I'm happy with it. 
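That self-join for self-verified work can be sketched like this in SQLite from Python (the schema and the tie-breaking on rowid are assumptions; the real table and columns surely differ):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical ll_results table; shift is the run's random shift count.
cur.execute("""
    create table ll_results (
        exponent integer, user_id integer, shift integer, result_date text
    )
""")
cur.executemany("insert into ll_results values (?, ?, ?, ?)", [
    (100003, 7, 123, "2009-05-01"),
    (100003, 7, 456, "2018-11-02"),  # same user matched his own earlier test
    (100019, 7, 111, "2017-01-01"),
    (100019, 8, 222, "2018-01-01"),  # verified by a different user -> fine
])

# Same user, different shift and/or date, and those two results are the
# *only* ones for the exponent.  rowid ordering avoids counting each pair twice.
rows = cur.execute("""
    select a.exponent
    from ll_results a
    join ll_results b
      on a.exponent = b.exponent
     and a.user_id  = b.user_id
     and a.rowid    < b.rowid
     and (a.shift <> b.shift or a.result_date <> b.result_date)
    group by a.exponent
    having (select count(*) from ll_results c
            where c.exponent = a.exponent) = 2
""").fetchall()
print([r[0] for r in rows])  # -> [100003]
```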
(PS: that other query takes 4 seconds because, based on the self-verified work it finds, I'm looking up the factoring effort done on it to help me generate a valid worktodo entry. It's surprising how many people have double-checked their own first-time tests and haven't done a single bit of P-1 on them.)

Last fiddled with by Madpoo on 20190102 at 22:48 

20190102, 23:18  #863 
6809 > 6502
Aug 2003
2^{5}·13·23 Posts 
Any way you can make a super-secret link for some mods to run the query, say, once a week? We could keep the list up to date.

20190102, 23:28  #864 
"/X\(‘‘)/X\"
Jan 2013
3×977 Posts 
So without the actual query to modify, I'll have to show a more generic approach. But you can do something like this:
Code:
select exponent, ...
from (
    select exponent
    from ll_results
    where result_state = unverified
    group by exponent
    having count(exponent) > 1
) as needs_tc
join factoring_effort
  on needs_tc.exponent = factoring_effort.exponent
left join assignments
  on needs_tc.exponent = assignments.exponent
 and assignments.assignment_id = (
     select max(assignment_id)
     from assignments
     where exponent = needs_tc.exponent
       and expired is null
 )
where factoring_effort.bits > magic_formula_or_case_statement
  and assignments.assignment_id is null
order by exponent

-- assignments should have an index on (exponent, expired, assignment_id)
20190103, 00:28  #865  
If I May
"Chris Halsall"
Sep 2002
Barbados
2566_{16} Posts 
"Subselects" / "subqueries" are a very powerful tool in a SQL geek's toolbelt! Joins are sometimes a×b in size, if not constrained.... 

20190103, 02:04  #866 
"/X\(‘‘)/X\"
Jan 2013
3×977 Posts 
I realize now that the subquery predicate on the assignments join can be dropped completely: the IS NULL check in the outer WHERE clause takes care of it.
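The simplified form is a classic left-join anti-join: join only against live (non-expired) assignments, then keep the rows where nothing matched. A small SQLite demonstration from Python (schema is a hypothetical stand-in):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("create table needs_tc (exponent integer primary key)")
cur.execute("""
    create table assignments (
        assignment_id integer primary key,
        exponent      integer,
        expired       text    -- null while the assignment is still live
    )
""")
cur.executemany("insert into needs_tc values (?)", [(100003,), (100019,)])
cur.executemany("insert into assignments values (?, ?, ?)", [
    (1, 100003, None),          # live assignment -> skip this exponent
    (2, 100019, "2018-12-01"),  # expired assignment -> still hand it out
])

# Left-join anti-join: keep only exponents with no live assignment.
rows = cur.execute("""
    select n.exponent
    from needs_tc n
    left join assignments a
      on n.exponent = a.exponent and a.expired is null
    where a.assignment_id is null
    order by n.exponent
""").fetchall()
print([r[0] for r in rows])  # -> [100019]
```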

20190103, 03:39  #867 
If I May
"Chris Halsall"
Sep 2002
Barbados
2·4,787 Posts 

20190103, 17:51  #868 
Sep 2018
45_{16} Posts 
I would also love a way to just assign machines to triple checks if needed!

20190104, 15:02  #869 
6809 > 6502
Aug 2003
2^{5}·13·23 Posts 
So I took this one:
Code:
DoubleCheck=47168477,73,1
And the 2009 result was ok; the 2018 result was bad. That is a rarity for me. 