I did explicitly note that there are things one can do with higher OOM funding that EA can’t do now, even if they wouldn’t be as efficient per dollar spent, and that this is the counterargument to there being TMM.
In general, I notice I’m more optimistic that someone capable of being a founder (or even early employee) will do more good directly by going out and creating new companies that provide value, like PayPal or Tesla, than by entering a charitable ecosystem (EA or otherwise), whether or not their main motivation is the money. There are of course places where this doesn’t apply because they’re playing zero-sum games, so the ‘cause area’ question still matters, but that’s part of the whole Doing Thing thing. And I worry that a lot of the ‘don’t know what else to do’ reflects several things at once: feeling an obligation to seek money and power as an implication of a moral framework, which will tend over time to change people into caring primarily about money and power, and damages them in other ways; frameworks and world models that don’t see most of what produces value/good in the world as worthwhile except as a means to gather resources; and a requirement that all action be narratized lest it not be considered to have value. That’s why they don’t see ‘what else to do.’
I’d also note that SBF has exactly the Doing Thing energy and mindset, always has, and also managed to earn the money quickly and by being right rather than by navigating social dynamics, which are reasons I’m optimistic there. But it also seems to me that something key is missing if going around getting people to donate their money looks similar to creating a crypto exchange (and a subset of that problem, but an important one, arises if it doesn’t look like that but does look like joining a trading firm). If it looks like founding Tesla or Amazon, I want to say ‘halt and catch fire.’
I’d also note that I stand by my claim that the best charity in the world is Amazon.com, and that if Jeff Bezos cared exclusively about the world, it’s not that he shouldn’t be going into space, it’s that he should suck it up and keep running Amazon.
Then again, a lot of people in the world don’t know what to do, so it’s not at all a unique problem.
The Moral Mazes issue is very real here even if one is founding a positive-sum enterprise, in that even when founders originally started with good intentions (which often isn’t the case), the person who comes out at the other end with money and power is by default incapable of using it for the ends they originally had in mind; they’ve changed, and believe that the only way to succeed is to be motivated by other things instead (this is me wimping out from writing ‘bad intentions’ here, and it’s important to know that what my brain actually believes goes here is partly an active sign error, but it’s not needed for this context and I worry it would distract). If one is building an organization whose explicit purpose is power and extracting money without otherwise creating value, then these problems are going to be much, much worse.
Also important is my expectation that convincing such folks to later give money won’t result in that money going to the places that you (or whoever the best careful thinkers are) decide are best according to non-distorted principles. Instead, it’s going to be a bunch of MOPs introducing a bunch of naive/dumb money into an ecosystem full of extractive agents and bad incentives, which is going to make all those problems worse. Or it’s going to be controlled by people whose habits of being are about increasing money and power, which will also make a bunch of problems worse rather than better.
When I say ‘I don’t know where to start,’ that’s not me being condescending; it’s a reflection of a ton of inferential distance and what seem like massively overdetermined intuitions that are hard to share. For example, I wrote a book-long sequence trying to share some of them, and have others I don’t know how to put into writing yet. So I’ll wrap up here by noting that this is a subset of the things I could say, and that what I could say is probably another book.