This is correct in general. For this particular discussion, it may well apply. Numbers may be too strong a requirement for changing my mind; at least a Fermi estimate would be nice, and any kind of evidence, even personal, supporting Viliam’s assertions will definitely be required.
The important part isn’t the assertions (which, honestly, I don’t see here); it’s asking the question. It’s like with advice: taken as a command without an argument, it’s useless, but as a framing it asks whether you should be doing a thing more or less than you normally do, and that can be valuable simply by drawing attention to the question, even when the original advice is the opposite of what makes sense.
With discussion of potential issues of any kind, having norms that call for avoiding such discussion or for burdening it with rigor requirements makes it go away, and so the useful question of what the correct takes are remains unexplored.
I think it would be proper to provide a specific prediction, so here is one:
Assuming we could somehow quantify “good done” and “cutting corners”, I expect a negative correlation between the two among organizations in the EA environment.
I’m glad for the attempted prediction! It doesn’t seem very cruxy to me, though. Something more cruxy: I imagine that people are capable of moderating themselves to an appropriate level of “cutting corners”, so I expect a continuum of corner-cutting levels. But you expect that small amounts of cutting corners quickly snowball into large amounts, so you should expect a pretty bimodal distribution.
[edit] A way this would not change my mind: if we saw a uni-, bi-, or multimodal distribution, but each of the peaks corresponded to a different cause area. In that case I would say we’re just picking up the different baseline levels of corner-cutting from the several different areas people may work in.
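The confound in that edit can be made concrete with a toy simulation (every number and area name below is hypothetical, purely for illustration): if each cause area has its own continuous, unimodal distribution of corner-cutting scores, the pooled distribution across areas can still look multimodal, so a multimodal histogram alone wouldn’t distinguish the two hypotheses.

```python
import random

random.seed(0)

# Hypothetical setup: three cause areas, each with a continuous,
# roughly normal distribution of "corner-cutting" scores, but with
# different area-level means. No snowballing dynamic is simulated.
areas = {
    "area_a": (2.5, 0.5),   # (mean, stdev) of corner-cutting score
    "area_b": (5.5, 0.5),
    "area_c": (8.5, 0.5),
}

pooled = []
for mean, stdev in areas.values():
    pooled.extend(random.gauss(mean, stdev) for _ in range(300))

# Crude histogram with unit-width bins.
counts = {}
for x in pooled:
    b = int(x // 1.0)
    counts[b] = counts.get(b, 0) + 1

# A bin is a "peak" if it is non-trivially populated and exceeds
# both neighbors; this ignores tiny spurious bumps in the tails.
peaks = [b for b, c in counts.items()
         if c >= 10
         and c > counts.get(b - 1, 0)
         and c > counts.get(b + 1, 0)]

# One peak shows up per cause area, even though each area's own
# distribution is unimodal -- a multimodal pooled histogram does not
# by itself demonstrate a snowballing dynamic.
print(sorted(peaks))
```

So to make the crux testable, one would want to look at the distribution within a single cause area, or control for cause area before counting modes.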
But you expect that small amounts of cutting corners quickly snowball into large amounts.
I don’t expect the existing organizations to get more sloppy.
I expect more sloppy organizations to join the EA ecosystem… and be welcome to waste the resources and burn out people (and not produce much actual value in return), because the red flags will be misinterpreted as a sign of being awesome.
I am not sure whether this will result in a bimodal distribution, but I expect that there will be some boring organizations that do their accounting properly and also cure malaria, and some exciting organizations that do a lot of yachting and hot-tub karaoke parties. And when things blow up, no one will be able to figure out how many employees they actually had, or whether those employees were paid according to the contract, which doesn’t even exist on paper… because everyone was like “wow, these guys are thinking and acting so much out-of-the-box that they are certainly the geniuses who will save the world”, when actually they were just some charismatic guys who probably meant well but didn’t think too hard about it.
I’d expect that to depend heavily on the definitions of “good done” and “cutting corners”. For some definitions I’d expect a positive correlation, and for others a negative one.