Some people on LW have claimed that reflective stability is essential. My impression is that Robin Hanson always rejected that.
This seems like an important clash of intuitions. Eliezer appears to claim that his utility function requires reflective stability, while Robin denies that it is a normal part of human values. I suspect this disagreement stems from some deeper disagreement about human values.
My position seems closer to Robin’s than to Eliezer’s. I want my values to become increasingly stable, and I consider it acceptable for my values to change moderately as I get closer to creating a CEV. My desire for stability isn’t strong enough, relative to my other values, that I need guarantees about it.