Btw, a multilayer perceptron (which is a permutation invariant model) with 230000 parameters and, AFAIK, no data augmentation used, can achieve 88.33% accuracy on FashionMNIST.
I doubt that this is the best an MLP can achieve on F-MNIST.
I will put it this way: SONNs and MLPs do the same thing, in different ways, so they should reach comparable accuracy. If this SONN can get near 90%, so should an MLP.
It is likely that nobody has bothered to try "without convolutions" because it is so old-fashioned.
Convolutions are for repeated locally aggregated correlations.
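For reference, here is a minimal sketch of how a single-hidden-layer MLP on flattened 28x28 FashionMNIST images lands near that 230k parameter budget. The hidden size of 288 is my own illustrative choice, not the exact architecture behind the 88.33% figure:

```python
# Parameter count for a plain fully connected network on FashionMNIST.
# Images are 28x28 = 784 inputs once flattened; the model is permutation
# invariant in the sense that pixel order carries no built-in structure.

def mlp_param_count(layer_sizes):
    """Weights plus biases for consecutive fully connected layers."""
    return sum(n_in * n_out + n_out
               for n_in, n_out in zip(layer_sizes, layer_sizes[1:]))

# A single hidden layer of 288 units gives roughly 230k parameters:
# 784*288 + 288 + 288*10 + 10 = 228970
print(mlp_param_count([784, 288, 10]))  # 228970
```

The point is just that the cited result needs only one modest hidden layer, with no spatial prior at all.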