BGonline.org Forums
Can GNUbg be improved through distributed training?
Posted By: leobueno
Date: Wednesday, 29 August 2012, at 8:14 p.m.
I understand that GNUbg uses a neural network engine. My limited knowledge of NNs indicates that one needs to "train" the NN, in this case by having it play against itself so that it "learns".
I was wondering if we could collectively improve the performance of GNUbg by volunteering our PCs to train the NN, so that instead of doing the training on one big, fast and very expensive supercomputer, you have the job distributed among dozens or hundreds of plain old PCs.
I have seen these types of projects done in the past, for example, one that simulated protein folding and another that processed cosmic data.
Thoughts?
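The scheme described above could be sketched roughly as follows. This is only an illustration of the idea, not anything from GNUbg itself: the function names (play_games, average_updates, training_round) are hypothetical, and the per-PC "updates" here are placeholder random numbers standing in for real TD-learning gradients from self-play games.

```python
import random

def play_games(weights, n_games, seed):
    """Simulate one volunteer PC: run n_games of self-play locally and
    return a weight update. Random noise stands in for real gradients."""
    rng = random.Random(seed)
    return [rng.uniform(-0.01, 0.01) for _ in weights]

def average_updates(updates):
    """Central server: average the updates reported by each PC
    (a simple synchronous data-parallel scheme)."""
    n = len(updates)
    return [sum(u[i] for u in updates) / n for i in range(len(updates[0]))]

def training_round(weights, n_workers):
    """One round: farm out self-play to n_workers PCs, then combine."""
    updates = [play_games(weights, n_games=100, seed=w)
               for w in range(n_workers)]
    delta = average_updates(updates)
    return [w + d for w, d in zip(weights, delta)]

weights = [0.0] * 5                       # toy network with 5 weights
weights = training_round(weights, n_workers=10)
```

In a real volunteer-computing setup the main challenges would be the ones such projects always face: verifying results from untrusted machines, keeping workers' copies of the network reasonably in sync, and deciding how to merge updates that arrive at different times.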
BGonline.org Forums is maintained by Stick with WebBBS 5.12.