Also, older 32-bit CPUs are capped at 4 GB of RAM, making execution of larger models impossible.
Slower, not impossible. I don’t think any of the chess or Go models have model sizes >1 GB, and even if they did, you don’t have to load the entire model into RAM: they’re just feedforward CNNs, so you only need to be able to fit one layer at a time. With appropriate tricks you could probably even slowly train the models, along the lines of https://arxiv.org/abs/2002.05645v5
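For concreteness, here is a minimal sketch of the fit-one-layer-at-a-time idea in plain NumPy. The per-layer .npy file layout and the toy fully-connected net (standing in for a real CNN) are assumptions for illustration, not anything from the linked paper:

```python
import os
import numpy as np

def save_layers(layer_dir, sizes, seed=0):
    """Write each layer's weights to its own file (hypothetical layout)."""
    rng = np.random.default_rng(seed)
    os.makedirs(layer_dir, exist_ok=True)
    for i, (fan_in, fan_out) in enumerate(zip(sizes, sizes[1:])):
        np.save(f"{layer_dir}/w{i}.npy",
                rng.standard_normal((fan_in, fan_out)).astype(np.float32))
        np.save(f"{layer_dir}/b{i}.npy", np.zeros(fan_out, dtype=np.float32))

def stream_forward(x, layer_dir, n_layers):
    """Forward pass that never holds more than one layer's weights in RAM."""
    for i in range(n_layers):
        w = np.load(f"{layer_dir}/w{i}.npy")  # read just this layer from disk
        b = np.load(f"{layer_dir}/b{i}.npy")
        x = np.maximum(x @ w + b, 0.0)        # ReLU
        del w, b                              # free it before loading the next
    return x

if __name__ == "__main__":
    sizes = [64, 256, 256, 1]  # toy network dimensions
    save_layers("/tmp/layers", sizes)
    out = stream_forward(np.ones((1, 64), dtype=np.float32),
                         "/tmp/layers", len(sizes) - 1)
    print(out.shape)  # (1, 1)
```

Peak memory is roughly the largest single layer plus the activations, so a multi-gigabyte model could in principle run in a few hundred megabytes, just slowly, since every forward pass re-reads the weights from disk.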
Right. My experiment used 1 GB for Stockfish, which would also work on a 486 machine (although at the time, that much RAM was almost unheard of...)