I’m pretty sure (but don’t know how to test) that I am not capable of simulating another mind to the point where it has moral value above epsilon. But if I were, then it would be wrong of me to do that and torture it.
I think I hold freedom of thought as a terminal value, but torture also violates at least one of my terminal values, and it feels like torture is worse than some versions of forbidding torture-by-thought. But there might be no practical way to implement such a prohibition. If the only way to make sure that nobody commits thoughture is to have an external body watching everybody’s thoughts, then that might be worse than letting some people commit thoughture because there is no way to catch them.
(But one person is probably capable of committing an awful lot of thoughture. In Egan’s Permutation City, bar crefba perngrf n fvzhyngrq pybar bs uvzfrys va beqre gb unir vg gbegherq sbe cbffvoyl-rgreavgl.)
I have no idea how to even start thinking about how to test whether I am capable of simulating another mind with moral value.