I would expect the existential risk reduction returns from encouraging long-term thinking by getting people to sign up for cryonics to be dwarfed by the returns from encouraging long-term thinking directly, and I would expect those returns to be dwarfed by the returns from encouraging rational long-term thinking on especially important topics.
That would make cryonics a self-serving reward that utilitarians grant themselves after doing some good deeds.
It’s not hypocritical if we acknowledge that our values are partially but not completely selfish.
Yes, I can imagine that position. I was more curious to see whether anyone else was going to try to make a utilitarian case for it.