Can you unpack your thinking, there? I certainly wouldn’t want to behave with respect to Clippy in ways that would be bad ideas to behave with respect to a paperclip maximizer, but I’m not quite clear on what you’re discouraging and why.
I’m discouraging people from thinking of arbitrarily weird scenarios where one of the possible payoff matrices is bad for humanity.
I oppose this influence. I welcome considerations of any and every weird scenario with arbitrary payoff matrices for anyone at any time.