‘Pressing a button’ is one act, and ‘pressing both buttons’ and ‘pressing neither button’ are two others. If you press a button randomly, it isn’t morally relevant which random choice you made.
What does it mean to choose between G and B, when you have zero relevant information?
(shrug) It means that I do something that either causes G to be pressed, or causes B to be pressed. It means that the future I experience goes one way or another as a consequence of my act.
I have trouble believing that this is unclear; at this point I feel you’re asking rhetorical questions to express your incredulity rather than to genuinely extract new knowledge. Either way, I think we’ve gotten as far as we’re going to get here; we’re just going in circles.
I prefer a moral system in which the moral value of an act relative to a set of values is consistent over time, and I accept that this means it’s possible for there to be a right thing to do even when I don’t happen to have any way of knowing what the right thing to do is… that it’s possible to do something wrong out of ignorance. I understand you reject such a system, and that’s fine; I’m not trying to convince you to adopt it.
I’m not sure there’s anything more for us to say on the subject.