Since XiXiDu and multifoliaterose’s posts have all been made during the Singularity Summit, when everyone at SIAI is otherwise occupied and so cannot respond, I thought someone familiar with the issues should engage rather than leave a misleading appearance of silence. And giving a bit of advice that I think has a good chance of improving XiXiDu’s contributions seemed reasonable and not too costly.
The future is the stuff you build goodness out of. The properties of the stuff don’t matter; what matters is the quality and direction of the decisions made about arranging it properly.
There is not enough stuff to sustain a galactic civilization for very long (relative to the expected time over which the universe can sustain intelligence). There is no way to alter the quality or direction of the fundamental outcome to overcome this problem (given what we know right now).
If you suggest a plan with obvious catastrophic problems, chances are it’s not what will be actually chosen by rational agents (that or your analysis is incorrect).
That’s what I am inquiring about: is it rational, given that we adopt a strategy of minimizing suffering? Or are we going to create trillions of beings to have fun for a relatively short period and then have them suffer or commit suicide for a much longer period?
The analysis is incorrect? Well, ask the physicists.
Moral analysis.
Yes, I think so too. But I haven’t seen any good arguments against negative utilitarianism in the comments yet. (More here)
You’ve lost the context. Try not to drift.
Is this really worth your time (or Carl Shulman’s)? Surely you guys have better things to do?
If you tell me where my argumentation differs from arguments like this, I’ll know whether it is a waste or not. I can’t figure it out myself.