I had a secular upbringing. I’m not disappointed that the universe has no meaning (i.e., no purpose given by a creator), but I do sometimes find it frustrating to be an optimizer who doesn’t know what it is they’re supposed to optimize.
It is as if a capricious creator-god assembled a hodge-podge of heuristics and underdetermined, conflicting preferences, and as an afterthought endowed it with the desire to have a comprehensible set of values.
Man, this comment thread is so profound.
Indeed, a very alien god.
I’m sure anyone who’s really thought about the problem of extrapolating volition has been rather frustrated with this at some point. Not only is it absurdly difficult to figure out what we want; it’s incredibly difficult to figure out how to design something that figures out what we want. This chain of dependencies ends up requiring vast knowledge of mathematics, psychology, philosophy, cosmology, and other fields that no sane agent should need to have mastered just to introspect on the question “In the end, what am I trying to do?”
Doesn’t this contradict another comment of yours?
In what way do they contradict each other? Please explain.
If you know you have no utility function, why do you feel you’re an optimizer?
I guess I used the word “optimizer” because:
I personally enjoy optimizing things, for example, writing the fastest and/or most elegant code, or finding the best deal on a purchase.
I’m similar to a utility maximizer in many respects (e.g., forming beliefs by deduction and induction, considering the consequences of actions, etc.), except that I don’t seem to have a utility function.
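To make the comparison concrete: a textbook utility maximizer has some fixed utility function U over outcomes and picks whichever action maximizes expected utility, roughly

$$a^* = \arg\max_{a \in A} \sum_{o} P(o \mid a)\, U(o),$$

where A is the set of available actions and P(o | a) is the probability of outcome o given action a. I seem to run the arg-max machinery happily enough; I just don’t have any single U to plug into it.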
Here you say you’re an optimizer, and there you say you have no utility function. Or am I misunderstanding something?
One of them assumes that you’re optimizing something, and the other says that you aren’t.
Why do you feel you’re an optimizer at all? I don’t.