You could make a consequentialist case for it, of course.
Certainly true, and disturbing, especially for those of us who feel that consequentialism is in some way “correct”. Since far-future people are virtually guaranteed to hold values radically different from ours, and would likely have the ability to directly modify our (to them, frighteningly evil) values, wouldn’t we (per murder-Gandhi) want to spread a deontological system that forbids tampering with other people’s values, even if we feel that, in general, consequentialism based on our current society’s values is more morally beneficial? That is, would we prefer that some small spark of our moral system survive into the distant future, at the expense of abandoning it in the here and now?