I read the email and the post, and the feeling of this “I do actually really care about people being able to coordinate to not take the site down. It’s an actual hard thing to do that actually is trying to reinforce a bunch of the real and important values that I care about in Petrov day” wasn’t really articulated anywhere.
I’m pretty sympathetic to this. I’m a LessWrong admin, and last year on Petrov Day, when someone talked about selling codes, I considered my price. $10,000 is a meaningful sum to me, and I think that was my number. I don’t remember what his final figure was, but I recall Habryka stating a higher number and me then getting an explanation of why this exercise was much more important than I initially thought (more than $10k). If that explanation of its importance was written somewhere, I don’t recall where.
So yeah, kinda fair on that front. I should probably apologize for giving you quite as hard a time about it – it’s just that I’ve internalized the importance.
Wow. I honestly don’t get it—do you have a link to the previous discussion that justified why anyone’s taking it all that seriously?
IMO, this was a completely optional, artificial setup—“just a game”, in Chris’s words. When I got the e-mail, I wondered if it was already down, and was surprised that it wasn’t (though maybe I just didn’t notice—it never seemed down to me, but I go straight to /allPosts without ever looking at the front page).
There was none of the weight of Petrov’s decision, and no tension about picking one or the other—no lasting harm for pressing the button, no violation of norms (or being executed for treason, or losing WWIII) by failing to do so if it were necessary. And no evidence one way or the other what the actual territory is. Really, just a game. And not even a very good one.
The fundamental cooperation to take down the site had ALREADY HAPPENED. When someone wrote the code that would do so if someone pressed the button, that’s FAR FAR stronger than some rando actually pressing the button.
Here was my analysis last year —
https://www.lesswrong.com/posts/vvzfFcbmKgEsDBRHh/honoring-petrov-day-on-lesswrong-in-2019?commentId=ZZ87dbYiGDu6uMtF8
In fairness, my values diverge pretty substantially from a lot of the community here, particularly around “life is serious” vs “life isn’t very serious” and the value of abstract bonds/ties/loyalties/camaraderie.
Thanks. I am not convinced, but I have a better idea of where our perspectives differ. I have to admit this feels a bit like a relationship shit-test, where an artificial situation is created, and far too much weight is put on the result.
I’d be interested to hear various participants’ and observers’ takes on the actual impact of this event, in terms of what they believe about people’s willingness to support the site or improve the world in non-artificial conditions.
Hmm. Appreciate your reply. I think there’s a subtle difference here, let me think about it some.
Hmm.
Okay.
Thrashing it out a bit more, I do think a lot of semi-artificial situations are predictive of future behavior.
Actually, to use an obviously extreme example that doesn’t universally apply, that’s more-or-less the theory behind the various Special Forces selection procedures —
https://bootcampmilitaryfitnessinstitute.com/media/tv-documentaries/elite-special-forces-documentaries/
As opposed to someone artificially creating a conflict to see how the other party navigates it — which I’m not at all a fan of — I think exercises in shared trust have both predictive value for future behavior and build good team cohesion when overcome.
“I’d be interested to hear various participants’ and observers’ takes on the actual impact of this event”
Me too, but I’d ideally want the data captured semi-anonymously. Most people, especially effective people, won’t comment publicly “I think this is despicable and have incremented downwards various confidences in people as a result” whereas the “aww it’s ok, no big deal” position is much more easily vocalized.
(Personally, I’m trying to tone down that type of vocalization myself. It’s unproductive on an individual level — it makes people dislike you for minimal gain. But I speculate that the absence of that level of dialogue and expression of genuine sentiment potentially leads to evaporative cooling of people who believe in teamwork, mission, mutual trust, etc.)
Reasonable minds can differ on this and related points, of course. And I’m very aware my values diverge a bit from many here, again around stuff like seriousness/camaraderie/cohesion/intensity/harm-vs-care/self-expression/defection/etc.