I hate to wheel this out again but evolution-broadly-construed is actually a very close fit for gradient methods. Agreed there’s a whole lot of specifics in biological natural selection, and a whole lot of specifics in gradient-methods-as-practiced, but they are quite akin really.
please wheel such things out every time they seem relevant until such time as someone finds a strong argument not to, people underrecommend sturdy work imo. in this case, I think the top comment on that post raises some issues with it that I’d like to see resolved before I’d feel like I could rely on it to be a sturdy generalization. but I appreciate the attempt.
Separately, I’m not a fan of ‘evolveware’ or ‘evoware’ in particular, though I can’t put my finger on exactly why. Possibly it’s because of a connotation of ongoing evolution, which is sorta true in some cases but could be misleading as a signifier. Though the same criticism could be levelled against ‘ML-ware’, which I like more.