The model has no leeway. It must assume that humans are behaving optimally, and therefore that there is some intrinsic value in lottery tickets and seatbelt-free driving that should be preserved into the far future.
1. You’re driving your friend to the hospital. Do you speed?
2. I’m not saying seatbelt-free driving is always rational, but on what grounds is it irrational?
3. Which would you rather have: $1 (with certainty), a 10% chance of $10, or a 1% chance of $100? More generally, of all pairs ($x with probability y, where x·y = 1), which do you want the most?
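The three options in question 3 are deliberately constructed to have the same expected value, so any preference among them must come from attitudes toward risk rather than from expectation alone. A minimal sketch (the lottery names and the choice of a square-root utility are mine, purely for illustration) of how a concave utility function separates them:

```python
import math

# Each lottery is a list of (probability, payout) pairs.
# All three are from the question above; each has expected value $1.
lotteries = {
    "$1 for sure":       [(1.0, 1.0)],
    "10% chance of $10": [(0.1, 10.0), (0.9, 0.0)],
    "1% chance of $100": [(0.01, 100.0), (0.99, 0.0)],
}

def expected_value(lottery):
    return sum(p * x for p, x in lottery)

# Expected value cannot distinguish them:
for lot in lotteries.values():
    assert abs(expected_value(lot) - 1.0) < 1e-9

# But a risk-averse (concave) utility, e.g. u(x) = sqrt(x), ranks
# the certain dollar first and the longest-shot lottery last.
def expected_utility(lottery, u=math.sqrt):
    return sum(p * u(x) for p, x in lottery)

ranked = sorted(lotteries, key=lambda n: expected_utility(lotteries[n]),
                reverse=True)
print(ranked)  # ['$1 for sure', '10% chance of $10', '1% chance of $100']
```

A risk-seeking (convex) utility would reverse the ranking, which is one way to make lottery-ticket buying come out "rational" under some assignment of values.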
> 2. I’m not saying seatbelt-free driving is always rational, but on what grounds is it irrational?
Individual actions are not a priori irrational, because we’re always talking about a conflict between at least two things. Furthermore, you can always describe humans as perfect rational agents—just use the microphysical description of all their atoms to predict their next action, and say that they assign that action high utility. (This is basically Rohin’s point)
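The rationalization trick above can be made concrete: for any observed behavior, there is a utility function under which that behavior is exactly optimal. A toy sketch (the function names and the seatbelt example actions are mine, for illustration only):

```python
# Sketch: any observed action can be "rationalized" by a utility
# function that assigns maximal utility to whatever the agent did.

def rationalizing_utility(observed_action):
    """Return a utility function under which observed_action is optimal."""
    def utility(action):
        return 1.0 if action == observed_action else 0.0
    return utility

actions = ["wear seatbelt", "skip seatbelt"]
observed = "skip seatbelt"

u = rationalizing_utility(observed)
best = max(actions, key=u)
assert best == observed  # the agent counts as "perfectly rational" under u
```

The point is that this construction is vacuous: it works for every possible behavior, so "is this agent rational?" has no answer until you fix a particular way of describing the agent, which is what the next paragraph is about.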
Ways of choosing actions are what we think can be irrational (there are problems with this but I’d rather ignore them), but these ways of choosing actions are only associated with humans within some particular way of describing humans (the intentional stance). Like, if you’re describing humans as collections of atoms, your description will never label anything as a value conflict or an inconsistency. You have to describe humans in terms of values and choices and so on before you can say that an action “conflicts with their values and is therefore irrational” or whatever.
Long story short, when I say that driving without a seatbelt is usually dumb because people don’t want to die, there is no further or more universal sense I know of in which anything is irrational. I do not mean that you cannot assign values to humans in which driving without a seatbelt is always right—in fact, the problem is that I’m worried that a poor AI design might do just such a thing! But in the values that I actually do assign to humans, driving without a seatbelt is usually dumb.
Just realized I didn’t distinguish between “being in a moving vehicle and not wearing a seatbelt” and “driving a moving vehicle and not wearing a seatbelt”. (There not being seatbelts on passenger seats is kind of a feature on buses.)