BGonline.org Forums
GNUbg optimal cache size for analysis
Posted By: leobueno
Date: Thursday, 19 May 2011, at 1:34 a.m.
I would like to maximize the speed at which GNUbg does batch analysis of matches and was wondering what the optimal cache size should be.
Mindful that bigger is not always better, I did a simple test.
I run 4-ply cubeful evaluations with no neural-net pruning and a huge move filter, using 2 threads on a fairly slow Dell dual-core.
As a benchmark, I used a 3-point match that ended in one game on a gammon, with a 2-cube turned on the 6th move. I analyze only my own side of the game.
I do a batch analysis of a bear-off position plus the benchmark match. The bear-off position is analyzed almost instantaneously, then the benchmark match. I then compare the timestamps on the two output files; the difference in minutes is the time it took to analyze the benchmark.
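For reference, the timestamp-difference measurement described above can be sketched in a few lines of Python; the file paths are hypothetical placeholders for the two analysis output files:

```python
import os

def elapsed_minutes(first_output, second_output):
    """Whole minutes between the modification times of two output files.

    The first file (bear-off position) is written almost immediately,
    so the gap to the second file approximates the benchmark's runtime.
    """
    delta = os.path.getmtime(second_output) - os.path.getmtime(first_output)
    return round(delta / 60)

# Hypothetical usage:
# print(elapsed_minutes("bearoff.html", "benchmark.html"))
```

This only has one-minute resolution, which is coarse but adequate for runtimes of 20+ minutes.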
Results:

CACHE   TIME (min.)
  0     32
  5     26
 10     25
 21     24
 42     23
 84     22
168     21
336     20
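The returns diminish quickly: most of the gain comes from the first small cache sizes. A quick Python sketch quantifies this, with the timings copied from the table above:

```python
def percent_saved(baseline_min, cached_min):
    """Percent reduction in analysis time relative to the no-cache baseline."""
    return 100 * (baseline_min - cached_min) / baseline_min

# Cache size -> analysis time in minutes, from the table above.
timings = {0: 32, 5: 26, 10: 25, 21: 24, 42: 23, 84: 22, 168: 21, 336: 20}

baseline = timings[0]
for cache, minutes in timings.items():
    print(f"cache {cache:>3}: {minutes} min "
          f"({percent_saved(baseline, minutes):.1f}% faster than no cache)")
```

Note that the smallest cache already saves 18.75%, while doubling from 168 to 336 buys only one more minute.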
I am wondering if this is a typical result, that is, whether for evaluations (as opposed to rollouts) a *bigger* cache is indeed *better*.
BGonline.org Forums is maintained by Stick with WebBBS 5.12.