No offense to Fred, but he’s a bitter loner. Idealistic nerd wants to make the world awesome, runs out and tells everyone, everyone laughs at him, idealistic nerd gives up in disgust and walks away muttering, “I’ll show them! I’ll show them all!”
Also, he thinks this project is really, really important, worth declaring war against the rest of the world and killing whoever stands in the way of our becoming cooler. (As you say, whether he thinks we can also kill people who don’t actively oppose it is unclear.) This is a dangerous idea (see the zillion glorious revolutions that executed their critics and plunged happily into dictatorship), though it is less dangerous when your movement is made of complete individualists. As it happens, becoming superhumans will not require offing any Luddites (though it does require offending them and coercing them by legal means), but I can’t confidently say it wouldn’t be worth it if it were the only way, even after correcting for historical failures.
By the same token, group rationality is in fact the way to go, but individual rationality does require telling society to take a hike every now and then.
FAI as not the ultimate transhuman, but the ultimate institution/legal system/moral code
It certainly shouldn’t be a transhuman. Eliezer’s preferred metaphor is more like “the ultimate laws of physics”, which says quite a bit about how individualistic you and he are.