You’re such a traditionalist!
More seriously, accusing rationalists of hauling the Amish and their mothers to camps doesn’t seem quite fair. Like you said, most rationalists seem pretty nice and aren’t proposing involuntary rapid changes. And this post certainly didn’t.
You’d need to address the actual arguments in play to write a serious post about this. “Don’t propose weird stuff” isn’t a very good argument on its own. You could argue that proposing weird stuff went very poorly with communism, or come up with some other argument. Actually, I think rationalists have come up with some. It looks to me like the more respected rationalists are pretty cautious about doing weird, drastic stuff just because the logic seems correct at the time. See the unilateralist’s curse, and Yudkowsky’s and others’ pleas that nobody do anything drastic about AGI even though they think it’s very likely going to kill us all.
This stuff is fun to think about, but it’s planning the victory party before planning how to win the war.
How to put the future into kind and rational hands seems like an equally interesting and much more urgent project right now. I’d be fine with a pretty traditional utopian future or a very weird one, but not with joyless machines eating the sun, or worse yet, every sun they can reach.