Among many other things, and most relevantly: we don’t know what we want. We have to hack ourselves just to approximate having a utility function. This is fairly predictable from how evolution operates: consistency in complex systems is something evolution is very bad at producing.
An artificial agent would most likely be built to know what it wants, and could easily have a utility function.
The consequences of this one difference are profound.
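To make the contrast concrete, here is a minimal sketch of what “having a utility function” buys an artificial agent. Everything in it is invented for illustration (the action names, the numeric scores, the `choose_action` helper); the point is only the structure: the agent’s preferences live in a single, explicit, queryable function, so the same inputs always yield the same ranking.

```python
# A minimal sketch, not a real agent design. The utility values and
# action set are hypothetical; the structural point is that preferences
# are one consistent function rather than a tangle of competing drives.

from typing import Callable, Iterable

def choose_action(actions: Iterable[str],
                  utility: Callable[[str], float]) -> str:
    """Pick the action the agent's utility function ranks highest."""
    return max(actions, key=utility)

# Hypothetical utility function: a fixed numeric score per action.
utility = {"rest": 1.0, "gather_data": 3.5, "self_improve": 5.0}.get

print(choose_action(["rest", "gather_data", "self_improve"], utility))
# -> "self_improve": the ranking never shifts with mood or framing,
#    which is exactly what humans lack.
```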