Well, for me it is a habit (I was eating meat before I realized it is wrong) and convenience (society constantly provides me with easy opportunities to pay other people to torture and kill animals for me, and even to avoid thinking about it).
If I had a magic button that would rewrite my habits, I would push it, even if it meant that no one else would be influenced by it.
Seems like you are trying to redefine “wanting” to mean what people actually do. There are problems with this, but they’re kinda difficult to pin down. Something about people not being automatically strategic, or not being utility maximizers...
It’s conceivable that you forget that eating meat is unethical every time you order or buy it, but that’s unlikely. If you really believed “I shouldn’t buy meat” when you had the opportunity to buy it, you wouldn’t buy it. (It doesn’t mean you wouldn’t find meat appealing, only that you wouldn’t buy it.)
People don’t always want to do what they should do. Sometimes they have habits that cause them to forget that they’re doing something wrong. Sometimes they’re inconsistent, and may not realize that they’re acting contrary to what they’d do if they were consistent. But people can’t simultaneously believe “I shouldn’t do X” and “I should do X”, as in “I shouldn’t eat meat” and “I should buy this chicken”.
But people can’t simultaneously believe “I shouldn’t do X” and “I should do X”, as in “I shouldn’t eat meat” and “I should buy this chicken”.
I don’t think that’s what Viliam_Bur is saying.
Look at this problem that I encounter every day: I believe that I should learn how to program, and yet I end up playing Team Fortress. This can be because of laziness, or akrasia, or because I’ve spent my willpower on other things.
The same thing can happen for broader moral considerations. If you believe eating meat is wrong, you can still end up eating meat because you have a strong craving, because you don’t feel like arguing with the person cooking your meal, or for any number of other reasons.
For perfect rationalists, acting in accordance with their moral values is easy and takes no effort. Unfortunately, this isn’t the case for humans.
I don’t think akrasia can apply to the area traditionally considered to be morality. If you believe doing something would be evil, that feels different from it being merely suboptimal and harmful to yourself. For example, you like playing TF2, even though playing it may be suboptimal at times; but habit or not, you’d instantly stop if, say, the player avatars in TF2 were real beings that experienced terror, pain, and suffering in the course of gameplay. It stands to reason that eating meat would be the same.
I searched for “I want to be vegan but love meat.” It was in Google autocomplete and has plenty of results, including this Yahoo Answers page, which explicitly mentions that the poster wants to be a vegetarian for ethical reasons.
I don’t think that’s a counterexample. If I had a billionaire uncle who willed me his fortune, I could say something like “I like money but I don’t want to commit murder”—and then I wouldn’t commit murder. Liking the taste of meat and still abstaining from it because you think eating it is evil is similar.
The point of it wasn’t to say that people like meat. The point was that people experience or expect enough akrasia about giving up meat that they search Google and ask people on question sites for help.
I used to believe, like you, that if you believe something is morally good then you will do it. That axiom used to be a cornerstone in my model of morality. There was actually a stage in my life where my moral superiority provided most of my self-esteem and disobeying it was unthinkable. When I encountered belief in belief, I couldn’t make sense of it at all. I was further confused when the people exhibiting it didn’t admit to the inconsistency after I explained it to them.
But besides that, I don’t think humans evolved to have that kind of consistency. I believe that humans act mostly according to reinforcement. Morality does provide a form of reinforcement, in the sense that you feel good when you act morally and worse otherwise; however, if there were a sufficiently strong external motivator, such as extreme torture, you would eventually give in, perhaps rationalizing the decision.
I would suggest that the people who have commented here read this post if they haven’t yet, because there have already been two arguments over definitions here (first about consistency, then about the definition of “genuine belief”), and there is a reason that is frowned upon. You should also see Belief in belief for a better understanding of how people can act contrary to their stated morals and behave in contradictory ways. (It typically comes up a lot with religious people, who don’t try to be as moral as they can be despite viewing it as good.)
I don’t believe that you believe you should learn how to program. I do believe you think it would be good for you to learn to program, but you also enjoy TF2 and choose to play it instead. If it happened occasionally, I’d believe that you could occasionally forget your real goals; but if it happens all the time, I question whether your real goals are what you say they are.
This is even more the case with eating meat. If you genuinely believe that eating meat harms others and that this is morally relevant, it’s more than just the self-inflicted harm of not learning to program; it’s actually doing evil that hurts something. If you believe something is morally wrong, you will act accordingly: catching yourself in a state of “I’m in the habit of doing something evil” would be such a horrifying realization that it would instantly shatter the habit.