Talking about the “reason” for adaptations is biology 101.
I know that. Most of the time I use the same language. But that’s because I trust the people I’m talking with to know that I’m speaking metaphorically. I also trust them to understand enough basic morality to know that just because something is extremely common in nature doesn’t mean it’s morally good. The reason I am not doing that when talking to you is that I am not convinced I should extend you that trust. You constantly confuse the descriptive with the normative, and the “common in nature” with the “morally good.”
I do not take issue with the majority of the factual statements you make. What I take issue with is the appalling moral statements you make. I get the impression that you are upset at Eliezer because he wants to preserve the values that make us morally significant beings, even if doing so will stop us from evolving. You act as though evolving is our “real” purpose, and as though the things people actually value, like creativity, novelty, love, art, friendship, etc., are not important. This is the exact opposite of the truth. Evolution is useful only insofar as it preserves and enhances our values such as creativity, novelty, love, art, friendship, etc.
Again, if you really think maximizing entropy is your real purpose in life, would you torture 50 children to death if it would get some sadistic aliens to make a far-off star go nova for you? Detonating one star would produce far more entropy than those children would over their lifetimes, but I still bet you wouldn’t torture them, because you know it’s wrong. The fact that you wouldn’t do this shows you think doing the right thing is more important than maximizing entropy.
I do not take issue with the majority of the factual statements you make. What I take issue with is the appalling moral statements you make.
Gee, thanks for that.
Again, if you really think maximizing entropy is your real purpose in life, would you torture 50 children to death if it would get some sadistic aliens to make a far-off star go nova for you?
People often seem to think that entropy-maximisation principles imply that organisms should engage in wanton destruction, blowing things up. However, that is far from the case. Causing explosions is usually a very bad way of maximising entropy in the long term—since it tends to destroy the world’s best entropy maximisers, living systems. Living systems go on to cause far more devastation than exploding a sun ever could. So, wanton destruction of a sun is bad—not good—from this perspective.
Causing explosions is usually a very bad way of maximising entropy in the long term—since it tends to destroy the world’s best entropy maximisers, living systems.
That’s why I said “far-off” star. I was trying to imply that the star was so far away its destruction would not harm any living things. Please don’t fight the hypothetical.
In any case, the relevant part of the question isn’t “Would you blow up a star?” That was just an attempt to give the hypothetical some concrete details so it sounded less abstract. The relevant question is “Would you torture fifty children to death in order to greatly increase the level of entropy in the universe?” Assume that the increase would be greater than what the kids would be able to accomplish themselves if you allowed them to live.
This is ridiculous. Are you actually proposing entropy maximisation as a reduction of “should”, normative ethical theory, etc., or do you just find it humorous to waste our time?
Causing explosions is usually a very bad way of maximising entropy in the long term—since it tends to destroy the world’s best entropy maximisers, living systems. Living systems go on to cause far more devastation than exploding a sun ever could.
Are you sure? A black hole is the system with the most possible entropy among those with a given mass. Your point would only be valid if interstellar civilizations were easy to achieve, and given that we don’t see any of those around, I don’t think they are.
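The black-hole comparison can be sanity-checked with a back-of-envelope calculation. A minimal sketch, assuming the Bekenstein–Hawking entropy formula S = 4πGM²k_B/(ħc) and rounded CODATA values for the constants:

```python
import math

# Rounded CODATA constants, SI units
G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
hbar = 1.055e-34    # reduced Planck constant, J s
c = 2.998e8         # speed of light, m/s
M_sun = 1.989e30    # solar mass, kg

def bh_entropy_in_kb(mass_kg: float) -> float:
    """Bekenstein-Hawking entropy of a black hole of the given mass,
    expressed in units of the Boltzmann constant k_B."""
    return 4 * math.pi * G * mass_kg**2 / (hbar * c)

s = bh_entropy_in_kb(M_sun)
print(f"S_BH(1 solar mass) ~ 1e{math.log10(s):.0f} k_B")
# A solar-mass black hole carries roughly 1e77 k_B of entropy, while the
# Sun itself is commonly estimated at only ~1e58 k_B, about nineteen
# orders of magnitude less.
```

On these numbers, collapsing matter into a black hole dwarfs anything ordinary stellar processes (or living systems) produce, which is the quantitative point behind the objection.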
Causing explosions is usually a very bad way of maximising entropy in the long term—since it tends to destroy the world’s best entropy maximisers, living systems.
So, if the nova’s explosion did not destroy any living systems, you would happily trade the 50 kids for the nova explosion?