You seem to be making the point that our[1] recommendation of cryonics encourages an unfounded belief that one day a benevolent superintelligence will revive the corpsicle patients. That criticism might fairly be aimed at zealous and silly transhumanists, but not at Less Wrong. Here you will be told that signing up for cryonics gives you only a 5% chance of living forever. You’ll be told that there’s a pretty good chance of superintelligence existing in the future, but at least even odds that it won’t be benevolent. And Eliezer, who came up with the Sysop scenario in the first place, has explicitly warned against wasting time thinking about such things. You won’t find that kind of shiny eschatology here.
[1] It’s fair to say that Less Wrong advises signing up for cryonics, although there isn’t a consensus on this point.