....Ha! Guys, I should have made this clearer. I don’t need counseling. I more-or-less fixed my problem for myself. By which I mean I could do with having it expressed for myself a bit more succinctly in a snappy, catchy sentence or two, but essentially, I got it already.
My point in bringing it to this audience was, “Hey, pretty sure generalizing fundamental techniques of human rationality shouldn’t cause existential angst. Seems like a problem that comes from an incomplete application of rationality to the issue. I think I figured out how to solve it for myself, but has anyone else ever had this problem, and how did you solve it?”
And we’re talking about a situation in which a being discovered that its values were internally inconsistent, and the same logic that identifies wireheading as “not what I actually want” extended to everything, leaving the being with nothing ‘worth’ living for, but still capable of feeling pain.
So it wouldn’t make any sense for it to care at all how its death affected the state of the universe after it was gone. The point is that there are NO states that it actually values over others, other than ending its own subjective experience of pain.
If it had any reason to value killing itself to save the world over killing itself with a world-destroying bomb (so long as both methods were somehow equally quick, easy, and painless to itself), then the whole reason it was killing itself in the first place wouldn’t hold.
The questions I mean to raise here are: is it even possible for a being to have a value system that logically eats itself from the inside out like that? And even if it were, I don’t think human values would fit into that class. But what’s the simplest, clearest way of proving that?
Well, I recognized that… :P