I have really different priors than it seems like a lot of EAs and rationalists do about this stuff, so it’s hard to have useful arguments. But here are some related things I believe, based mostly on my experience and common sense rather than actual evidence. (“You” here is referring to the average LW reader, not you specifically.)
Most important abilities for doing most useful work (like running a hedge fund) are mostly not fixed at e.g. age 25, and can be greatly improved upon. FTX didn’t fail because SBF had a lack of “working memory.” It seems to have failed because he sucked at a bunch of stuff that you could easily get better at over time. (Reportedly he was a bad manager and didn’t communicate well, he clearly was bad at making decisions under pressure, he clearly behaved overly impulsively, etc.)
Trying to operate on 5 hours of sleep with constant stimulants is idiotic. You should have an incredibly high prior that this doesn’t work well, and trying it out and it feeling OK for a little while shouldn’t convince you otherwise. It blows my mind that any smart person would do this. The potential downside is so much worse than “an extra 3 hours per day” is good.
Common problems with how your mind works, like "can't pay attention, can't motivate myself, irrationally anxious," aren't always things where you either need to find a silver-bullet quick fix or else live with them forever. They are typically amenable to gradual, directional improvement.
If you are e.g. 25 years old and you have serious problems like that, now is a dumb time to try to launch yourself as hard as possible into an ambitious, self-sacrificing career where you take a lot of personal responsibility. Get your own house in order.
If you want to do a bunch of self-sacrificing, speculative burnout stuff anyway, I don’t believe for a minute that it’s because you are making a principled, altruistic, +EV decision due to short AI timelines, or something. That’s totally inhuman. I think it’s probably basically because you have a kind of outsized ego and you can’t emotionally handle the idea that you might not be the center of the world.
P.S. I realize you were trying to make a more general point, but I have to point out that all this SBF psychoanalysis is based on extremely scanty evidence, and having a conversation framed as if it is likely basically true seems kind of foolish.
I agree with the first two points and partly the 4th point. Also the P.S. (I tried to hedge with words like “likely” but I didn’t really proofread this a lot).
The 5th point seems like it could apply to me specifically, but like… I don't really know how I'd solve my ego problem, and it's still not clear whether or how bad that is in my situation (again, one of my broader points). I know this is likely to be a defense mechanism but… I'm okay with it? Is there decision theory about whether I should try to become less inhuman?
If that resembles you, I don't know if it's a problem for you. Maybe not, if you like it. I was just expressing that when I see someone appearing to do that, like the FTX people, I don't take very seriously their claim that the way they're going about it is really good and important.
Alright, well thanks for engaging with it and me!