BGonline.org Forums
GNUbg optimal cache size for analysis
Posted By: Philippe Michel In Response To: GNUbg optimal cache size for analysis (leobueno)
Date: Thursday, 19 May 2011, at 7:42 p.m.
Wondering if this is a typical result, that is, for evaluations (as opposed to rollouts), *bigger* cache is indeed *better*.
I think it is, and it would apply to rollouts as well, with two reservations:
- it must be well within your machine's memory size (in your example, 336 MB is fine if you have 1+ GB of memory; if you have 512 MB or less, it is too high)
- if what you do doesn't take much time, you will evaluate so few positions (relatively speaking) that the larger cache won't make a difference. Your example took 20-30 minutes. A 2-ply analysis taking 20-30 seconds might have gone: 32s 26s 25s 25s 25s 25s... or something like that. This is a concern with the example given by Michael Petch in another answer.
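The two points above can be illustrated with a toy model (not GNUbg code; the skewed position distribution and cache sizes are assumptions for the sketch): an LRU cache only beats a smaller one once enough positions have been evaluated for evictions to matter, and repeated runs get faster as the cache warms.

```python
# Toy model of an evaluation cache: a larger LRU cache yields a lower
# miss rate on a long, skewed stream of position lookups.
import random
from functools import lru_cache

def miss_rate(cache_size, n_lookups, n_positions, seed=0):
    """Fraction of lookups that miss an LRU cache of the given size."""
    misses = 0

    @lru_cache(maxsize=cache_size)
    def evaluate(position):
        nonlocal misses
        misses += 1            # body only runs on a cache miss
        return hash(position)  # stand-in for a neural-net evaluation

    rng = random.Random(seed)
    for _ in range(n_lookups):
        # Skewed distribution: some positions recur far more often than
        # others, as transpositions do in a real analysis.
        evaluate(int(n_positions * rng.random() ** 3))
    return misses / n_lookups

small = miss_rate(1_000, 200_000, 50_000)
large = miss_rate(50_000, 200_000, 50_000)
print(small > large)  # the larger cache misses less often
```

On a short analysis (small `n_lookups`), the two miss rates converge, which is the second reservation above.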
with no neural net pruning
This, on the other hand, looks like a bad idea, or at least very expensive paranoia.
With pruning your analysis should be about twice as fast, and the risk of discrepancies versus no pruning is something like:
- less than 1% risk of 0.001 or higher difference (for any move or cube decision)
- less than 0.1% risk of 0.01 or higher difference
In terms of ER/PR, it should amount to 0.01-0.02. Rather marginal compared to the inaccuracy of the evaluation itself.
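A back-of-envelope check of that ER/PR figure, under two assumptions not stated in the post: PR is the mean error per decision scaled by 500 (the usual GNUbg convention), and the quoted risk bounds are treated as point masses at their thresholds.

```python
# Expected extra error per decision from pruning discrepancies,
# using the two risk bounds quoted above as point estimates.
p_small, err_small = 0.01, 0.001    # "<1% risk of a 0.001 or higher difference"
p_large, err_large = 0.001, 0.01    # "<0.1% risk of a 0.01 or higher difference"

mean_extra_error = p_small * err_small + p_large * err_large  # per decision
pr_impact = 500 * mean_extra_error   # assumed PR scaling: mean error x 500
print(round(pr_impact, 4))  # 0.01, the low end of the quoted 0.01-0.02
```

Since these are upper bounds on the risk, the true impact should be at or below that figure, consistent with calling it marginal.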
BGonline.org Forums is maintained by Stick with WebBBS 5.12.