It’s kind of an obvious, major interest to any rationalist.
I’m wary of this conflation of rationality and a specific set of values. Rationality is a tool for achieving your values; it doesn’t specify what your values should be.
You’re just going to blast away the entire epistemic half of rationality? :)
nope
Immortality is good as an instrument for realising many values: you will have more time to reach them. Also, excluding values from rationality means that we treat them as a set of arbitrary, irrational axioms.
It may seem rational only if we think of rationality as pure mathematical logic. But "ratio" means intelligence in Latin, if we check the meaning of the word. So, in order to define rationality we need a definition of intelligence, which is almost equivalent to having a recipe for AI. And since we don't know how to create AI, we also cannot claim to know what exactly it means to be rational. Here begins a kind of circular logic that could undermine many LW goals: we cannot take an absolutely rational approach to creating AI and FAI, because we don't know what rationality is before we know how AGI works.
So if someone's claimed understanding of rationality leads to an irrational conclusion like that one ("fighting death is not important"), it could mean that his understanding of rationality is wrong: it does not add up to normality.