Quote:
Originally Posted by EdH
Were the grossly overdone (10x) smaller B1 values wasted?
How do I tell what t-value I'm currently at?
What t-value should I strive for?
Thanks!

An old rule of thumb for GNFS jobs was to run ECM to a t-value of 0.31 * input size (in decimal digits). For this C162, that's 50.22. More recent Bayesian estimates of ECM effort have shown this to be a bit too much, so something in the vicinity of a t50 will suffice.
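The rule-of-thumb arithmetic is trivial, but here it is as a one-liner sketch (the function name is mine, not anything from GMP-ECM):

```python
# Old rule of thumb: target ECM effort of roughly t(0.31 * digits)
# before switching to GNFS. Newer Bayesian estimates suggest a bit less.
def target_t_level(digits: int) -> float:
    return 0.31 * digits

print(target_t_level(162))  # C162 -> 50.22, i.e. roughly a t50
```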
Invoking `ecm -v` with a B1 bound will show you how many curves at that bound are needed for a variety of t-levels. For instance, GMP-ECM 7 indicates that 24,000 curves at 3e6 are equivalent to a t45 (the level usually reached with curves at 11e6), and 240,000 are a t50. At 11e6, 39,500 curves make a t50. So your 3e6 curves are about 10% of a t50, your 11e6 curves are 16% of a t50, and your 43e6 curves are worth 9% of a t50 (a full t50 being 8,700 curves at that bound, again per ECM 7.0). As for "wasted": overkill on smaller curves is an inefficient way to find largish factors, but there is definitely a chance of doing so. Compared with a fastest plan, the super-extra 3e6 curve count maybe wasted 20 or 30% of the computrons spent beyond the usual t40 number of curves.
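The bookkeeping here is just "curves run divided by curves needed for a t50, summed over B1 bounds." A small sketch, using the curves-per-t50 figures quoted above from GMP-ECM 7's `-v` output; the `done` counts are hypothetical examples chosen to match the ~10%/16%/9% fractions discussed:

```python
# Curves needed for a full t50 at each B1 bound (per GMP-ECM 7's -v output,
# as quoted in the post).
CURVES_PER_T50 = {3e6: 240_000, 11e6: 39_500, 43e6: 8_700}

def t50_fraction(curves_run: dict) -> float:
    """curves_run maps B1 bound -> number of curves completed at that bound.
    Returns the total work expressed as a fraction of a t50."""
    return sum(n / CURVES_PER_T50[b1] for b1, n in curves_run.items())

# Hypothetical curve counts matching the fractions in the post:
done = {3e6: 24_000, 11e6: 6_320, 43e6: 783}
print(f"{t50_fraction(done):.0%} of a t50")  # prints "35% of a t50"
```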
Adding these up, you've done just over a third of a t50, so another 5000 or 6000 curves at 43e6 would be enough to justify proceeding to GNFS. Note that "enough" is both a rough and a broad optimum; some folks feel strong regret if GNFS turns up a factor that ECM "could have" or even "should have" found. Those people should likely do a bit more ECM to reduce the incidence of that regret.
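The "5000 or 6000 more curves" figure falls straight out of the same arithmetic: take the fraction of a t50 still missing and multiply by the curves-per-t50 count at 43e6 quoted above. A sketch, assuming ~35% done:

```python
# Remaining curves at B1 = 43e6 to finish out a t50, given ~35% already done.
CURVES_PER_T50_AT_43E6 = 8_700  # per GMP-ECM 7's -v output, as quoted
fraction_done = 0.35            # roughly "just over a third"

remaining = (1 - fraction_done) * CURVES_PER_T50_AT_43E6
print(round(remaining))  # about 5,655 curves, i.e. "5000 or 6000 more"
```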
fivemack wrote a Bayesian-analysis tool for ECM that takes already-run curves as input and outputs which curves should be run before GNFS. Alas, I can't find the thread at present.