eh, this just seems like a repeat of arguments against greedy reductionism. Parsimony is good except when it loses information, but if you’re losing information you’re not being parsimonious correctly.
If there were a good way of distinguishing between losing information and losing noise, that would be useful.
So: Hamilton’s rule is not being parsimonious “correctly”?
probably not. I’m not exactly sure what you mean by this question, since I don’t fully understand Hamilton’s rule, but in general evolutionary stuff only needs to be close enough to correct rather than actually correct.
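(For anyone else who doesn’t fully know Hamilton’s rule: the usual textbook statement is that an altruism-promoting trait is favored by selection when r * b > c, where r is the genetic relatedness between actor and recipient, b is the fitness benefit to the recipient, and c is the fitness cost to the actor. The parsimony worry here, presumably, is that compressing the whole situation into those three numbers throws away the details of how the population is actually structured.)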
Losing information isn’t a crime. The virtues of simple models go beyond Occam’s razor. Often, replacing a complex world with a complex model barely counts as progress—since complex models are hard to use and hard to understand.