Cute idea, but I think you won’t get many upvotes because the post felt longer (and probably more technical) than the idea could sustain.
One unavoidable issue with defining consciousness, which has to be handled with some delicacy, is that people don’t have separate mental buckets for “consciousness” and “the mental properties of humans that I care about.” Sometimes we like to say that we intrinsically care about consciousness (as if the two were independent), but really consciousness and what we care about are all muddled together.
In one direction, this means that as soon as you offer a definition of consciousness, it seems obvious that there’s a “consciousness monster” that maximizes the definition, and that feels interesting because, having labeled the thing you’re defining “consciousness,” it feels like you intrinsically care about it.
In the other direction, it means that as soon as you offer a simple definition of consciousness, everyone who applies common sense to it will go “Wait, but this definition doesn’t include properties of humans that I care about, like emotions / pain / dreams / insert your favorite thing here.”
Agree with almost all of your points.
The goal of writing this post was “this is a slight improvement on IIT”, not “I expect normal people to understand/agree with this particular definition of consciousness”.