Thanks for this great sequence of posts on behaviourism and related issues.
Anyone who does not believe mental states are ontologically fundamental—i.e., anyone who denies the reality of something like a soul—has two choices about where to go next. They can try reducing mental states to smaller components, or they can stop talking about them entirely.
Here’s what I take it you’re committed to:
- by ‘mental states’ we mean things like beliefs and desires;
- an eliminativist has to stop both talking about them and using them in explanations;
- wherever beliefs and desires go, rationality goes with them—you can’t have a rational agent without what amount to beliefs and desires;
- you are advocating eliminativism.
Can you say a bit about the implications of eliminating rationality? How do we square doing so with all the posts on this site about what is and isn’t rational? Are these claims all meaningless or false? Do you want to maintain that they all can be reformulated in terms of tendencies or the like?
Alternatively, if you want to avoid this implication, can you say where you dig in your heels? My prejudices lead me to suspect that the devil lurks in the details of those ‘higher level abstractions’ you refer to, but I’m interested to hear how that suggestion gets cashed out. Apologies if you have answered this already and I have missed it.
Can you say more about how you got that second bullet item?
It’s not clear to me that being committed to the idea that mental states can be reduced to smaller components (which is one of the options the OP presented) commits one to stop talking about mental states, or to stop using them in explanations.
I mean, any economist would agree that dollars are not ontologically fundamental, but no economist would conclude thereby that we can’t talk about dollars.
This may owe to a confusion on my part. I understood from the title of the post and some of its parts (including the last paragraph) that the OP was advocating elimination over reduction (i.e., contrasting these two options and picking elimination). I agree that if reduction is an option, then it’s still fine to use mental states in explanations, as per your dollar example.