But the theory fails because this fits it but isn’t wireheading, right? It wouldn’t actually be pleasing to play that game.
I think you are right.
The two errors, in practice, push in opposite directions with respect to hedonistic extremism. They are similar in form insofar as both fit the abstract notion "undesirable outcomes due to lost purposes when choosing to optimize what turns out to be a poor metric for approximating actual preferences".
Meh, yeah, maybe? Still seems like other, more substantive objections could be made.
Relatedly, I’m not entirely sure I buy Steve’s logic. PRNGs might not be nearly as interesting as short mathematical descriptions of complex things, like Chaitin’s omega. Arguably collecting as many bits of Chaitin’s omega as possible, or developing similar maths, would in fact be interesting in a human sense. But at that point our models really break down for many reasons, so meh whatever.
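To make the PRNG point concrete: a pseudo-random stream has a tiny algorithmic description, however complex its output looks. A minimal sketch, using a simple linear congruential generator (the constants are the classic Numerical Recipes parameters; nothing here is specific to Steve's argument):

```python
def lcg(seed, n, a=1664525, c=1013904223, m=2**32):
    """Yield n values from a simple linear congruential generator.

    The whole generator is a few lines, yet it determines an
    arbitrarily long stream -- which is why PRNG output is
    algorithmically cheap despite looking complex, unlike the
    bits of Chaitin's omega, which are incompressible.
    """
    state = seed
    for _ in range(n):
        state = (a * state + c) % m
        yield state

# The stream is fully determined by the seed: rerunning with the
# same seed reproduces it exactly.
stream = list(lcg(seed=42, n=5))
```

The contrast with omega is exactly that no such short program exists for its bits, which is arguably what would make collecting them interesting.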