Yes, a new point. Basically: “effective organizations cut corners” is a mild infohazard.
I do not in fact know the right amount of corner-cutting to do, which is strong evidence this is not in fact an infohazard! I’d like to at least see some numbers before you declare something immoral and dangerous to discuss! I’m tempted to strong-downvote such a premature comment, but instead I will strong-disagree.
But when this meme becomes popular, it motivates organizations to get sloppy, excusing the sloppiness by “as you see, we care about effectiveness so much that we don’t have any time left for the stupid concerns of lesser minds”. And then… people get hurt, because it turns out that some of the rules actually existed for a reason (usually as a reaction to people getting hurt in the past).
Is this true? Maybe? It definitely causes orgs to get sloppy, but sloppiness in certain areas so you can focus on the areas that matter is exactly what triage is. Obviously the quote you gave is the wrong mindset to have, but why not just say that that mindset is stupid, while noting that mindsets of the form “Yes, our legal paperwork is not in fact in order, because it would cost us $180k/year for a full-time lawyer and this paperwork is never actually checked, so we’re OK with the risk”, or “Berkeley has terrible building regulations, so we’re going to build a nice shack; this is illegal, but it’s not visible from the street, and we expect the shack to be really cool, so we’ll build it anyway” seem smart, to list a few clear & obvious examples?
Cutting corners should be seen as a bad thing that is sometimes necessary, not as a good thing that should be celebrated. Otherwise bad actors (especially) will pass our tests with flying colors.
I don’t think there’s a single rule you can apply to all instances of cutting corners. Sometimes it’s the right decision, sometimes not. When it’s the right decision it should be praised; otherwise it should not be.
I’d like to at least see some numbers before you declare something immoral and dangerous to discuss!
Discussing hypothetical dangers shouldn’t require numbers. Discussing hypothetical dangers is probably not itself so dangerous that it should be off the table when there are no numbers.
This is correct in general. For this particular discussion? Maybe. Numbers may be too strong a requirement for changing my mind; at least a Fermi estimate would be nice, and any kind of evidence, even personal, supporting Viliam’s assertions will definitely be required.
The important part isn’t the assertions (which honestly I don’t see here), it’s asking the question. As with advice, it’s useless when taken as a command without argument, but as a framing it asks whether you should be doing a thing more or less than you normally do, and that can be valuable by drawing attention to that question, even when the original advice is the opposite of what makes sense.
With discussion of potential issues of any kind, having norms that call for avoiding such discussion or for burdening it with rigor requirements makes it go away, and so the useful question of what the correct takes are remains unexplored.
I think it would be proper to provide a specific prediction, so here is one:
Assuming that we could somehow quantify “good done” and “cutting corners”, I expect a negative correlation between these two among the organizations in EA environment.
I’m glad for the attempted prediction! It seems not very cruxy to me, though. Something more cruxy: I imagine that people are capable of moderating themselves to an appropriate level of corner-cutting, so I expect a continuum of corner-cutting levels. But you expect that small amounts of cutting corners quickly snowball into large amounts, so you should expect a pretty bimodal distribution.
[edit] A result that would not change my mind: if we saw a uni-, bi-, or multimodal distribution, but each of the peaks corresponded to a different cause area, I would say we’re just picking up the different levels of corner-cutting typical of the several different areas people may work in.
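To make the crux concrete, here is a purely hypothetical sketch of what the two hypotheses predict about the data. All numbers below are made up for illustration, and the simple “middle-mass” check stands in for a proper statistical modality test:

```python
import random

random.seed(0)

# Hypothetical "corner-cutting scores" in [0, 1] for 500 orgs, under two models.
# Model A (self-moderation): orgs settle near one appropriate level -> unimodal.
continuum = [random.gauss(0.4, 0.1) for _ in range(500)]

# Model B (snowballing): small amounts of corner-cutting grow until orgs end up
# either careful or very sloppy -> a bimodal mixture of two clusters.
snowball = ([random.gauss(0.15, 0.05) for _ in range(250)]
            + [random.gauss(0.85, 0.05) for _ in range(250)])

def middle_mass(scores, lo=0.35, hi=0.65):
    """Fraction of scores in the middle band. A unimodal distribution centered
    there keeps most of its mass in the band; a bimodal one leaves a gap."""
    return sum(lo <= s <= hi for s in scores) / len(scores)

print(f"continuum middle mass: {middle_mass(continuum):.2f}")  # high
print(f"snowball  middle mass: {middle_mass(snowball):.2f}")   # near zero
```

On real data one would want a proper modality test and would need to condition on cause area, since (per the caveat above) mixed cause areas could themselves produce multiple peaks.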
But you expect that small amounts of cutting corners quickly snowball into large amounts.
I don’t expect the existing organizations to get more sloppy.
I expect more sloppy organizations to join the EA ecosystem… and to be welcomed to waste the resources and burn out people (and not produce much actual value in return), because the red flags will be misinterpreted as a sign of being awesome.
I am not sure whether this will result in a bimodal distribution, but I expect that there will be some boring organizations that do their accounting properly and also cure malaria, and some exciting organizations that do a lot of yachting and hot-tub karaoke parties… and when things blow up, no one will be able to figure out how many employees they actually had, or whether those employees were actually paid according to a contract that doesn’t even exist on paper… because everyone was like “wow, these guys are thinking and acting so much out-of-the-box that they are certainly the geniuses who will save the world”, when actually they were just some charismatic guys who probably meant well but didn’t think too hard about it.
I’d expect that to depend heavily on the definition of “good done” and “cutting corners”. For some definitions I’d expect a positive correlation and other definitions I’d expect a negative correlation.