Can I make a pro-babyeater argument?
Here is a dialogue between an imaginary six-year-old child named Dennis and myself.
Me: Hi Dennis, do you like broccoli?
Dennis: No, I hate it!
Me: But it’s good for you, right?
Dennis: I don’t care! It tastes awful!
Me: Would you like to like broccoli?
Dennis: No, I can’t stand broccoli! That stuff is gross!
Me: What if I told you some magic words that would make it so that every piece of broccoli you ever ate would taste just like chocolate if you said them? Would you say the magic words?
Dennis: Well...
Me: You like chocolate, don’t you?
Dennis: Yes, but...
Me: What?
Dennis: Your questions are too hard.
I think everyone has conflicts between their different wants. I want to do well in my classes, but I don’t want to study. And yet I can’t think of any conflicts between my metawants: If I could choose to like studying just as much as I like my favorite computer game, I would make that choice. The wants offered to the humans in the babyeaters story seem fairly sensible from a utilitarian perspective. They promote peace throughout the galaxy and mean lots of fun for everyone. What’s not to like?
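If it helps to make the distinction concrete, here is a minimal Python sketch (every name in it is hypothetical, purely illustrative) of an agent whose first-order wants conflict with each other while its metawants all point the same way: it dislikes studying but would choose to like it.

```python
from dataclasses import dataclass

@dataclass
class Agent:
    wants: dict       # first-order: how much I actually enjoy doing X
    metawants: dict   # second-order: how much I'd *like* to enjoy doing X

    def want_metawant_conflicts(self):
        """Activities the agent dislikes but wishes it liked."""
        return [x for x in self.wants
                if self.wants[x] < 0 and self.metawants.get(x, 0) > 0]

# Hypothetical numbers, just to illustrate the structure.
me = Agent(
    wants={"studying": -2.0, "computer game": +5.0},
    metawants={"studying": +5.0},  # I'd choose to like studying this much
)

print(me.want_metawant_conflicts())  # ['studying']
```

The conflict lives entirely at the first-order level; nothing in the structure forces `wants[x]` and `metawants[x]` to agree, which is all that "wanting to want" requires.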
I wish someone would do a post on metawants. Personally I view them with deep suspicion.
What about metawants (a.k.a. second-order desire) do you want to see a post on?
Well, their ontological, epistemological, and ethical statuses, for three. Specifically, how it’s possible to want X and simultaneously want to not want X (while remaining more or less sane/rational). Whether metawants have any special status when making utilitarian ethical calculations. That sort of thing. Even the history of thought on the subject (e.g. Buddhism, where the stated (and only?) metawant is to eliminate all first-order wants).
I’ll see what I can do. There was a fair bit about second-order desire in my self-knowledge class, and if people would be interested in a distillation/summary of it, I’ll provide one.
I get the argument, but I assign a high value to self-determination. Like Arthur Dent, I don’t want my brain replaced (unless by choice), even if the new brain is programmed to be ok with being replaced. Which ending did you pick in Deus Ex 2? I felt guilty gunning down JC and his brother, but it seemed the least wrong (according to my preferences) thing to do.
A rather vacuous statement, no?
I felt guilty gunning down JC and his brother, but it seemed the least wrong (according to my preferences) thing to do.
Isn’t it funny* that human nature gives us qualms about behaving immorally in a sufficiently realistic simulation, yet lets us hear cold numbers about enormous real disutility (genocides, natural disasters, etc.) and feel nothing? That’s speaking for myself, incidentally, not casting aspersions on you.
*(where by “funny” I mean “designed by a blind idiot god”, naturally)
I don’t think you’re being very fair to your new brain. Do you?
I haven’t played Deus Ex 2, sorry.
Since things that were good for us in the ancestral environment (fat and sugar) tend to taste good, and things that might be bad (suspect plants) taste yucky, Imaginary Dennis’s reaction makes adaptive sense. Do you want to want to eat poison?