I haven’t read Tuxedage’s writeups in their entirety, nor am I likely to, so I’m at a loss for how emotional turmoil and psychological warfare could be evidence that the gatekeeper doesn’t think there’s something more important at stake than winning the game.
That said, I’ll take your word for it that in this case they are, and that Tuxedage’s transcripts constitute a counterexample to my model.
I’m only speaking of things written in the OP:

Losing felt horrible. By attempting to damage Alexei’s psyche, I in turn, opened myself up to being damaged. I went into a state of catharsis for days.
...and such.
That said, I’ll take your word for it that in this case they are, and that Tuxedage’s transcripts constitute a counterexample to my model.
No, don’t do that, I made a mistake.
I guess I just thought that “you should open the box to convince people of the danger of AI” type arguments aren’t emotionally salient.
But that was a bad assumption; you never limited yourself to just that one argument, but spoke of meta in general. You’re right that there exist arguments that might go meta and be emotionally salient.
I suppose you could think of some convoluted timeless decision theory reason for you to open the box. History has shown that some people on LW find timeless blackmail threats emotionally upsetting, though they seem to be in the minority.
there exist arguments that might go meta and be emotionally salient
Oh, absolutely. Actually, the model I am working from here is my own experience of computer strategy games, in which I frequently find myself emotionally reluctant to “kill” my units and thus look for a zero-casualties strategy. All of which is kind of absurd, of course, but there it is.