Absolutely, granted. I guess I just found this post to be an extremely convoluted way to make the point of “if you maximize the wrong thing, you’ll get something that you don’t want, and the more effectively you achieve the wrong goal, the more you diverge from the right goal.” I don’t see that the existence of “marketing worlds” makes maximizing the wrong thing more dangerous than it already was.
Additionally, I’m kinda horrified about the class of fixes (of which the proposal is a member) which involve doing the wrong thing less effectively. Not that I have an actual fix in mind. It just sounds like a terrible idea: “we’re pretty sure that our specification is incomplete in an important, unknown way. So we’re going to satisfice instead of maximize when we take over the world.”
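To spell out the distinction I’m objecting to, here’s a minimal sketch (purely illustrative; the proxy utility, threshold, and action set are made-up names, not anything from the post). The satisficer doesn’t fix the mis-specified proxy at all, it just pushes on it less hard:

```python
import random

def maximizer(actions, proxy_utility):
    """Pick the action scoring highest on the (possibly mis-specified) proxy."""
    return max(actions, key=proxy_utility)

def satisficer(actions, proxy_utility, threshold):
    """Pick any action whose proxy score clears the threshold.
    The proxy is still wrong; we just optimize it less aggressively."""
    good_enough = [a for a in actions if proxy_utility(a) >= threshold]
    return random.choice(good_enough) if good_enough else maximizer(actions, proxy_utility)

# Illustrative stand-ins for whatever the agent actually optimizes over.
actions = ["a", "b", "c"]
proxy_utility = {"a": 0.2, "b": 0.7, "c": 0.9}.get

print(maximizer(actions, proxy_utility))        # "c": the proxy's best action
print(satisficer(actions, proxy_utility, 0.5))  # "b" or "c": merely "good enough"
```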