A more interesting question for me is that of a silent 't': does immortality imply immorality?
We may try to quantify it. Suppose an agent creates a virus that has a 10 per cent chance of giving him immortality and a 1 per cent chance of causing human extinction. Is it moral for him to proceed? Clearly not, even from a selfish point of view: if there are 1000 such agents, extinction becomes practically inevitable (the chance that none of them triggers it is 0.99^1000, roughly 0.004 per cent).
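To make the arithmetic behind that claim explicit, here is a minimal Python sketch; the function name is illustrative, and it assumes each agent's 1 per cent extinction risk is independent of the others:

```python
# Chance that at least one of n independent attempts triggers extinction,
# when each attempt carries probability p of doing so.
def extinction_probability(p: float, n: int) -> float:
    return 1 - (1 - p) ** n

print(extinction_probability(0.01, 1))     # 0.01: a single agent's 1 per cent risk
print(extinction_probability(0.01, 1000))  # ~0.99996: with 1000 agents, near-certain
```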
So an aggressive and selfish quest for immortality is clearly immoral and would turn a person into a kind of social cancer cell. But in reality the situation is the opposite.
You need to give immortality to as many people as possible if you want it to be a tested, cheap, and predictable technology. Think of the iPhone: it is cheap, high quality, and reliable because of economies of scale.
So I think that fighting for life extension is the second most important and positive cause after the prevention of x-risks (and it seems to be underestimated by EA).