“Computationally tractable” is Yudkowsky’s framing and might be too limited. The kind of thing I believe is, for example, that an animal without a certain brain complexity will tend not to be a social animal, and is therefore unlikely to have the sort of values social animals have. And animals that can’t do math aren’t going to value mathematical aesthetics the way human mathematicians do.
Ah ok, that makes sense. That’s more about being able to understand what the goal is, not about the ability to compute which actions would achieve it.