I was a bit excited to find a site devoted to rationality, and was rather disappointed to learn that no, it wasn’t.
I wrote a little hymn about it a while ago. It starts with “Our AI, who art in the future”, and you can imagine that it goes downhill from there.
I’m sorry, what is the intended point here? Because one can write a hymn parodying strong AI claims, we should therefore take those claims less seriously?
In fact, a cursory search of the net showed at least one topic that you guys preemptively banned from discussion because some future AI might find it and then torture you for it. If that isn’t a religious taboo, I don’t know what is.
Many people are not in favor of discussing the basilisk, not because of any issue with a potential AI, but because of the danger that mentally vulnerable people will be disturbed by the notion. But in any event, you are pattern-matching in an unhelpful way. The fact that something resembles something done by religions doesn’t make it intrinsically wrong. Note, for example, that large amounts of computer programming and maintenance look heavily ritualistic if you don’t know what they are.
The singularity is not going to happen. Nanomachines as they are popularly imagined will never exist. Cryonics, today, is selling hope and smoke, and is a bad investment. You’ve got people discussing “friendly AI” and similar nonsense without understanding that they’re really talking about magic, and that all this philosophizing about it is pretty silly.
So these are all conclusions, not arguments. And speaking as someone who agrees with you on a lot of this stuff, you are being both highly irrational and unnecessarily insulting in how you lay out these claims.
Cryonics is a sucker’s bet. Even if there were some possibility it worked, the odds of it working are far lower than those of other routes to immortality.
What other routes are you comparing it to? You mention a few methods of life extension, but none of them is likely, by itself, to add more than a few centuries at most.
Instead, cryonics is just a way to sell people hope. Christians make peace with the idea of death by telling themselves they will be going to a better place, that they will be okay, yet Christians avoid death as much as anyone else does. The same is true of cryonics. The rational thing to do, if avoiding death is important to you, is to work toward avoiding or mitigating it as much as possible. Are you? If the answer is no, is it really so important to you? Or is paying that money for cryonics just a personal way of making peace with death?
Don’t confuse not having a certain goal set with disagreeing with you about what will most likely accomplish that goal set.