What you described as computation could apply to literally any two things in the same causal universe. But you meant two things that track each other much more tightly than usual. It may be that a rock is literally conscious, but if so, then not very much so. So little that it really does not matter at all. Humans are much more conscious because they reflect the world much more, reflect themselves much more, and [insert solution to Hard Problem here].
It may be that a rock is literally conscious, but if so, then not very much so. So little that it really does not matter at all.
I dunno. I think if rocks are even a little bit conscious, that’s pretty freaky, and I’d like to know about it. I’d certainly like to hear more about what they’re conscious of. Are they happy? Can I alter them in some way that will maximize their experiential well-being? Given how many more rocks there are than humans, it could end up being the case that our moral algorithm is dominated by rearranging pebbles on the beach.
Humans are much more conscious because they reflect the world much more, reflect themselves much more, and [insert solution to Hard Problem here].
Hah. Luckily, true panpsychism dissolves the Hard Problem. You don’t need to account for mind in terms of non-mind, because there isn’t any non-mind to be found.
I think if rocks are even a little bit conscious, that’s pretty freaky, and I’d like to know about it.
I meant, I’m pretty sure that rocks are not conscious. It’s just that the best way I’m able to express what I mean by “consciousness” may end up apparently including rocks, without me really claiming that rocks are conscious like humans are—in the same way that your definition of computation literally includes air, but you’re not really talking about air.
Luckily, true panpsychism dissolves the Hard Problem. You don’t need to account for mind in terms of non-mind, because there isn’t any non-mind to be found.
I don’t understand this. How would saying “all is Mind” explain why qualia feel the way they do?
I’m pretty sure that rocks are not conscious. It’s just that the best way I’m able to express what I mean by “consciousness” may end up apparently including rocks, without me really claiming that rocks are conscious like humans are—in the same way that your definition of computation literally includes air, but you’re not really talking about air.
This still doesn’t really specify what your view is. Your view may be that strictly speaking nothing is conscious, but in the looser sense in which we are conscious, anything could be modeled as conscious with equal warrant. This view is a polite version of eliminativism.
Or your view may be that strictly speaking everything is conscious, but in the looser sense in which we prefer to single out human-style consciousness, we can bracket the consciousness of rocks. In that case, I’d want to hear about just what kind of consciousness rocks have. If dust specks are themselves moral patients, this could throw an interesting wrench into the ‘dust specks vs. torture’ debate. This is panpsychism.
Or maybe your view is that rocks are almost conscious, that there’s some sort of Consciousness Gap that the world crosses, Leibniz-style. In that case, I’d want an explanation of what it means for something to almost be conscious, and how you could incrementally build up to Consciousness Proper.
I don’t understand this. How would saying “all is Mind” explain why qualia feel the way they do?
The Hard Problem is not "Give a reductive account of Mind!" It's "Explain how Mind could arise from a purely non-mental foundation!" Idealism and panpsychism dissolve the problem by denying that the foundation is non-mental, while eliminativism dissolves it by denying that there's any such thing as "Mind" in the first place.