But we’re only interested in some aspects of self-maintenance; we’re not interested in how well individual molecules stay in their places (except when we’re measuring the hardness of materials). A fully general measure wouldn’t know which parameters are interesting and which are not.
Much the same goes for “integrated information theory”—without some external conscious observer informally deciding what counts as “information” (or as “integration”) to make the premise seem plausible (and carefully picking plausible examples), you just have a temperature-like metric, of no interest whatsoever were it not for the outrageous claim that it measures consciousness. A metric that is ridiculously huge for, e.g., turbulent gases—or, if we go down to the microscale and consider atoms bouncing around chaotically, for gases in general.
Again, I think you’re misunderstanding. The metric I’m proposing doesn’t measure how well those self-maintenance systems work, only how many of them there are.
Yes, of course we’re only really interested in some aspects of self-maintenance. Let’s start by counting how many aspects there are, and move on to categorizing them once that first step has produced some hard numbers.
Ahh, OK. The thing is, though… say, a crystal puts atoms back in place if you displace them slightly (and a liquid doesn’t). And so on: all sorts of very simple apparent self-maintenance, done without a trace of intelligent behaviour.
What’s your point? I’ve already acknowledged that this metric doesn’t return equally low values for all inanimate objects, and it seems a bit more common (in new-agey circles at least) to ascribe intelligence to crystals or rivers than to puffs of hot gas, so in that regard it’s better calibrated to human intuition than Integrated Information Theory.