I don’t think it’s lumping everything together. It’s criticizing the rule “Act on what you feel in your heart.” That applies to a lot of people’s beliefs, but it certainly isn’t the epistemology of everyone who doesn’t agree with Penn Jillette.
The problem with “Act on what you feel in your heart” is that it’s too generalizable. It proves too much, because of course someone else might feel something different and some of those things might be horrible. But if my epistemology is an appeal to an external source (which I guess in this context would be a religious book but I’m going to use “believe whatever Rameses II believed” because I think that’s funnier), then that doesn’t necessarily have the same problem.
You can criticize my choice of Rameses II, and you probably should. But now my epistemology is based on an external source and not just my feelings. Unless you reduce me to saying I trust Rameses because I Just Feel that he’s trustworthy, this epistemology does not have the same problem as the one criticized in the quote.
All this to say, Jillette is not unfairly lumping things together and there exist types of morality/epistemology that can be wrong without having this argument apply.
The problem with “Act on what you feel in your heart” is that it’s too generalizable. It proves too much, because of course someone else might feel something different and some of those things might be horrible. But if my epistemology is an appeal to an external source (which I guess in this context would be a religious book but I’m going to use “believe whatever Rameses II believed” because I think that’s funnier), then that doesn’t necessarily have the same problem.
‘Act on an external standard’ is just as generalizable—because you can choose just about anything as your standard. You might choose to consistently act like Gandhi, or like Hitler, or like Zeus, or like a certain book suggests, or like my cat Peter who enjoys killing things and scratching cardboard boxes. If the only thing I know about you is that you consistently behave like someone else, but I don’t know like whom, then I can’t actually predict your behavior at all.
The more important question is: if you act on what you feel in your heart, what determines or changes what is in your heart? And if you act on an external standard, what makes you choose or change your standard?
The problem with “Act on what you feel in your heart” is that it’s too generalizable. It proves too much, because of course someone else might feel something different and some of those things might be horrible.
From the outside, it looks like there’s all this undefined behavior, demons coming out of the nose, because you aren’t looking at the exact details of what’s going on with the feelings that are choosing the beliefs. Though a C compiler given an undefined construct may cause your program to crash, it will never literally cause demons to come out of your nose, and you could figure that out if you looked at the implementation of the compiler. It’s still deterministic.
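To make the compiler point concrete, here’s a minimal sketch (my own illustration, not from the thread): signed overflow is undefined by the C standard, yet any given implementation still does something particular with it.

```c
#include <limits.h>
#include <stdio.h>

int main(void) {
    int x = INT_MAX;
    /* Undefined behavior per the C standard. On a typical two's-complement
     * target compiled without optimization, this just wraps to INT_MIN --
     * deterministic, and no demons. A different compiler (or -O2) may do
     * something else, but always something you could trace through the
     * implementation. */
    x = x + 1;
    printf("%d\n", x);
    return 0;
}
```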
As an atheistic meta-ethical anti-realist, my utility function is basically whatever I want it to be. It’s entirely internal. From the outside, someone who has a system where they follow something external and clearly specified could shout “Nasal demons!”, but demons will never come out of my nose, and my internal, ever so frighteningly non-negotiable desires are never going to include planned famines. It has reliable internal structure.
The mistake is looking at a particular kind of specification that defines all the behavior, and then looking at a system not covered by that specification, but which is controlled by another specification you haven’t bothered to understand, and saying “Who can possibly say what that system will do?”
Some processors (even x86) have instructions (such as bit rotate) which are useful for significant performance boosts in stuff like cryptography, and yet aren’t accessible from C or C++; to use them you have to perform hacks like writing the machine code out as bytes, casting the buffer’s address to a function pointer, and calling it. That’s undefined behavior with respect to the C/C++ standard. But it’s perfectly predictable if you know what platform you’re on.
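Here’s a sketch of what that hack can look like, as my own illustration: it assumes x86-64, the System V calling convention, and a POSIX-ish system whose mmap() permits a writable-and-executable mapping (some hardened OSes forbid that).

```c
/* The byte string encodes "mov ecx, esi; rol edi, cl; mov eax, edi; ret",
 * i.e. a 32-bit rotate-left using the rol instruction that C can't name. */
#include <stdint.h>
#include <stdio.h>
#include <string.h>
#include <sys/mman.h>

static const uint8_t rotl_code[] = {
    0x89, 0xF1,   /* mov ecx, esi   (rotate count -> cl) */
    0xD3, 0xC7,   /* rol edi, cl    (the rotate instruction itself) */
    0x89, 0xF8,   /* mov eax, edi   (result -> return register) */
    0xC3          /* ret */
};

int main(void) {
    /* Machine code has to live in executable memory. */
    void *buf = mmap(NULL, sizeof rotl_code,
                     PROT_READ | PROT_WRITE | PROT_EXEC,
                     MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
    if (buf == MAP_FAILED) return 1;
    memcpy(buf, rotl_code, sizeof rotl_code);

    /* Cast the buffer's address to a function pointer and call it. */
    uint32_t (*rotl)(uint32_t, uint32_t) =
        (uint32_t (*)(uint32_t, uint32_t))buf;
    printf("%08x\n", rotl(0x80000001u, 1));  /* prints 00000003 */
    return 0;
}
```

Nothing here is sanctioned by the C standard; it’s the platform’s documentation, not the language’s, that makes the result predictable.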
The utility functions of people who aren’t meta-ethical anti-realists are not really negotiable either. You can’t give them a valid argument that will convince them not to do something evil if they happen to be psychopaths. They just have internal desires and things they care about, and they care a lot more than I do about having a morality which sounds logical when argued for.
And if you actually examine what’s going on with the feelings of people whose feeling-driven epistemology chooses their beliefs, instead of just shouting “Nasal demons! Unspecified behavior! Infinitely beyond the reach of understanding!”, you will see that the non-psychopathic ones have a mostly-deterministic internal structure to their feelings that prevents them from believing they should murder Sharon Tate. And psychopaths won’t be made ethical by reasoning with them anyway. I don’t believe the 9/11 hijackers were psychopaths, but that’s the holy book problem I mentioned, and a rare case.
In most cases of undefined C constructs, there isn’t another carefully-tuned structure doing the job of the C standard in making the behavior something you want, so you crash. And faith-epistemology does behave like this (crashing, rather than running hacky cryptographic code that uses the rotate instruction) when it comes to generating beliefs that don’t have obvious consequences to the user. So it would have been a fair criticism to say “You believe something because you believe it in your heart, and you’ve justified not signing your children up for cryonics because you believe in an afterlife,” because (A) they actually do that, and (B) it’s a result of their having an epistemology which doesn’t track the truth.
Disclaimer: I’m not signed up for cryonics, though if I had kids, they would be.
my utility function is basically whatever I want it to be.
I very much doubt that. At least with present technology you cannot self-modify to prefer dead babies over live ones; and there’s presumably no technological advance that can make you want to.
my utility function is basically whatever I want it to be.
If utility functions are those constructed by the VNM theorem, your utility function is your wants; it is not something you can have wants about. There is nothing in the machinery of the theorem that allows for a utility function to talk about itself, to have wants about wants. Utility functions and the lotteries that they evaluate belong to different worlds.
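For reference, here is the textbook shape of the VNM representation (my own summary, supplied for context): the preference relation lives on lotteries over outcomes, while the utility function lives on the outcomes themselves, so the construction gives u no way to take wants (including itself) as arguments.

```latex
% \Delta(X) is the set of lotteries (probability distributions) over
% the outcome set X; \succsim is the agent's preference relation.
\[
  \succsim \;\subseteq\; \Delta(X) \times \Delta(X)
\]
% If \succsim satisfies completeness, transitivity, continuity, and
% independence, there exists u : X \to \mathbb{R} such that
\[
  L \succsim M \iff \mathbb{E}_{L}[u(x)] \ge \mathbb{E}_{M}[u(x)],
\]
% with u unique up to positive affine transformation. The domain of u
% is X, the outcomes; u is not itself an outcome, so the theorem's
% machinery never lets a utility function evaluate or refer to itself.
```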
Are there theorems about the existence and construction of self-inspecting utility functions?