Let's say the Singularity is likely to happen in 2045, as Kurzweil predicts, and you want to maximize the chances that it's positive. The idea is that you should get to work making as much money as you can to donate to SIAI, or start researching fAGI (depending on your talents). What you do tomorrow doesn't matter; what matters is your average output over the next 35 years.
This is important because a strategy where you have an emotional breakdown in 2020 fails. If you get so miserable that you kill yourself, you've failed at your goal. You need to make sure that this fallible agent, XIXIDu, stays at a very high level of productivity for the next 35 years. That almost never happens if you're not fulfilling the needs your monkey brain demands.
Immediate gratification isn't a terminal goal, as you've figured out, but it does work as an instrumental goal on the path to a greater goal.