Following this logically, to win the most you make the best bets, and you need more resources (more time to live, more money) so that you can make more total bets and thus win more.
This means rationalists should be in favor of life extension, getting rich as individuals, and getting personal access to the most powerful artificial general intelligence tools that can be controlled. (This is why AI pause advocacy, at least at the GPT-4 capability level, seems ‘weird’ for a ‘rational’ individual to advocate for. A much stronger model can likely be controlled, and if you think it can’t, how do you know this?)
“This means rationalists should be in favor of life extension, getting rich as individuals, and getting personal access to the most powerful artificial general intelligence tools that can be controlled.”
Uhhhh, yes, they should do this instead of becoming obsessed with this type of stuff. Though ‘can be controlled’ is certainly load-bearing.