It seems to me that your argument is essentially that expected-utility-maximization is indistinguishable from believing-truth. I don’t see any particular need to address or dispute that, as your error is elsewhere.
The original post starts with the rather telling words:
Do we really have to be rational all the time in order to teach rationality? Breaking the rules of reality within the realm of a work of fiction…
Accepting for the sake of argument your earlier tying-together of deciding-well and believing-well, you seem to be incorrectly assuming that believing true things is somehow at odds with writing fantasy fiction. I don’t think that believing truth necessarily requires saying truth, and if your audience understands that it’s fiction, there aren’t any ethical issues related to deceit either.
That’s fine if fiction is pure escapism. But when fiction is used to convey a message about the real world, whether it’s an Aesop or, even worse, a lesson about how reality works on a physical level, that’s when I start to feel queasy.
I agree with this comment, but now I’m not sure what you were trying to get at in the original post.
(I’ll focus on just the first paragraph.)
If Miss Frizzle could do it, why couldn’t we? Do we really have to be rational all the time in order to teach rationality? Breaking the rules of reality within the realm of a work of fiction and making the protagonists (or the audience, if it’s a videogame) figure the new rules out for themselves… Actually, now that I think of it, videogamers are very used to adapting themselves to entirely new sets of physics on a weekly basis… but no one has ever made them stop and think about it for a while, AFAIK.
What I read was something along these lines:
Counter to first intuition, telling stories set in a world whose natural laws differ from our own may sometimes be useful for teaching worthwhile facts and skills.
I don’t see that this has anything to do with decision theory, so I wouldn’t have used the word “rational” to say it.
I didn’t think we were disputing definitions.