Having finally run something in ProvingGround involving the main learning method, here are some observations.
Simple Andrews-Curtis run
Starting with an initial distribution, evolution took place using the basic Andrews-Curtis moves. I ran two loops with two steps each, then resumed and ran 3 more loops. The raw results are at first run results
Conclusions

In terms of the highest weights, the method worked well even with such a short run. For example, the presentation $\langle a, b; \bar{a}, b\rangle$ was one of those with high weight.

In general, the ordering matched the expected behaviour.

However, virtually all theorems were remembered rather than deduced.
 This stems from two weaknesses:
 too high a weight for theorems relative to proofs, so recall was favoured over deduction.
 not enough methods of deduction, specifically the exclusion of multiplication/inversion moves.
 These can be corrected by:
 adding more moves: multiplications/inversions and transpositions, or allowing composite moves.
 lowering the probability of word continuation when generating presentations (just a runtime parameter).
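The word-continuation parameter mentioned in the last point can be sketched as follows. This is a hypothetical illustration of the generative model, not ProvingGround's actual API: at each step a word is extended with probability `p_continue`, so lowering that probability shifts weight toward shorter presentations and away from simply remembering long ones.

```python
import random

def random_word(num_gens, p_continue, rng=random):
    # geometric-length word: with probability p_continue, append a random
    # generator or its inverse; otherwise stop. Expected length is
    # p_continue / (1 - p_continue).
    word = []
    while rng.random() < p_continue:
        g = rng.choice(range(1, num_gens + 1))
        word.append(g if rng.random() < 0.5 else -g)
    return word
```

With `p_continue = 0.7` on two generators, a typical call might yield a word like `[1, -2, 2]` before free reduction; setting it lower concentrates the distribution on short words.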