Quote:
Originally Posted by JuanTutors
Modification 1 of Stage 2 calculates the following:
1. Calculate M. (First 1% of the PRP test.)
2. Calculate H^{q}. (First multiplication, from an array, and part of the original Stage 2 algorithm)
3. Multiply M by H^{q} and subtract 1. (Second multiplication, and the only new multiplication. The subtraction is essentially free.)
4. Multiply by H^{q} − 1. (Third multiplication, and part of the original stage 2 algorithm)
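The per-q work in the quoted steps 2–4 can be sketched as follows. This is a minimal illustration, not the actual implementation: the names (`stage2_step`, `H_q`, `Pi`) are hypothetical, `H_q` stands for the precomputed H^q from step 2, and all arithmetic is mod N, the number being tested.

```python
def stage2_step(M, H_q, Pi, N):
    """One q of the modified stage 2 (names hypothetical).

    M is the partial PRP residue from step 1, H_q is H^q mod N
    pulled from a precomputed array (step 2), and Pi is the
    running product accumulated across all q's.
    """
    t = (M * H_q - 1) % N      # step 3: M * H^q, then subtract 1 (essentially free)
    t = (t * (H_q - 1)) % N    # step 4: multiply by H^q - 1
    Pi = (Pi * t) % N          # fold the result into the accumulator
    return Pi
```

Counting the lookup of H^q itself, that is the four multiplications per q tallied in the reply below the quote.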

Current P-1 stage 2 uses 1 multiplication to handle two q's. In your new scheme, it will take 4 multiplications to handle 1 q (H^q, M*H^q, (H^q − 1) * (M*H^q − 1), and the accumulation into Π).
Quote:
Originally Posted by JuanTutors
The time cost of this Modification 1 of Stage 2 algorithm is 50% larger, plus the cost of arriving at iteration m = 10^6 based on this procedure, which according to my computer's speed adds about 25% more time depending on the length of Mp. Thus, in total, including Stage 1, this P-1 test is about 38% more time expensive:

That makes stage 2 roughly 8x costlier. Since stage 2 is already longer than stage 1, overall this will take about 5x more time than regular P-1.
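The rough arithmetic behind the 8x and 5x figures: 4 multiplications per q versus 0.5 per q gives the stage 2 factor, and the overall factor follows once stage 2 is assumed somewhat longer than stage 1. The 1.0 : 1.3 runtime split below is an illustrative assumption, not a measured value.

```python
# Cost accounting for the proposed scheme (the stage 1 : stage 2 split is assumed).
mults_per_q_old = 0.5   # current stage 2: one multiplication covers two q's
mults_per_q_new = 4.0   # proposed scheme: four multiplications per q
stage2_slowdown = mults_per_q_new / mults_per_q_old

stage1, stage2 = 1.0, 1.3   # assumed relative runtimes, stage 2 a bit longer
overall_slowdown = (stage1 + stage2 * stage2_slowdown) / (stage1 + stage2)
print(stage2_slowdown, overall_slowdown)   # ~8x stage 2, ~5x overall
```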
Quote:
Originally Posted by JuanTutors
The gain is all the unknown factors of each 2^m+q, which have on average 13.45 distinct prime factors each, a huge step from the single prime factor q. Note that 2^m+q may have common factors with E. However, since 2^m+q has over 3*10^{5} digits, that does not prevent us from gaining many prime factors.

Sadly, most of those 13.45 factors are completely useless. A factor q divides a p-1 with probability 1/q (well, technically 1/(q − 1)). That means, for example, a 10-digit q is only 1% as good as an 8-digit q, and a 20-digit q is 1/10 billion as good as a 10-digit q. So, while we gain a lot of factors, there is not a chance in our lifetime of ever seeing one appear in a found factor's P-1 decomposition. You're better off just increasing the B2 bound in traditional stage 2.
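The scaling claim is easy to sanity-check: if a prime q divides p − 1 with probability about 1/(q − 1), the usefulness of a candidate factor falls off linearly with its size. A quick sketch (the representative q sizes are assumptions, picked as the smallest 8-, 10-, and 20-digit powers of ten):

```python
# Relative value of a prime q: probability it divides p - 1 is ~1/(q - 1).
def relative_value(q):
    return 1.0 / (q - 1)

q8, q10, q20 = 10**7, 10**9, 10**19   # smallest 8-, 10-, 20-digit values
ratio_10_vs_8 = relative_value(q10) / relative_value(q8)
ratio_20_vs_10 = relative_value(q20) / relative_value(q10)
print(ratio_10_vs_8)    # ~0.01: a 10-digit q is ~1% as useful as an 8-digit q
print(ratio_20_vs_10)   # ~1e-10: a 20-digit q is ~1/10-billion as useful
```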