Yeah, that’s right. There’s some sort of goal/specification/desire that picks out the particular behavior we want out of the large space of possible behaviors. However, that goal/specification/desire need not be internal to the AI system, and it need not be long-term.
(Side note: that space of possible behaviors isn’t actually that giant)