A problem here seems to be that creating a being in intense suffering would be ethically neutral.
Well, don’t existing people have a preference that there not be such creatures? You can have preferences that are about other people, right?
Sure, existing people tend to have such preferences. But hypothetically they might not, and the mere possibility is enough to bring down an ethical theory if you can show it would generate absurd results.
This might be one reason why Eliezer talks about morality as a fixed computation.
P.S. Also, doesn’t the being itself have a preference for not-suffering?