Would you sell possessions to buy weapons to attack a person who runs an online voluntary community and changes the rules without consulting anyone?
If the two situations are comparable, I think it’s important to know exactly why.
Also note that manacling you in a dungeon isn’t just eliminating your ability to freely choose things arbitrarily; it’s preventing you from having satisfying relationships, access to good food, meaningful life’s work, and other pleasures. Would you mind being in a prison that enabled you to do those things?
Yes. If this were many years ago and I weren’t so conversant on the massive differences between the ways different humans see the world, I’d be very confused that you even had to ask that question.
No. There are other options. At the moment I’m still vainly hoping that Eliezer will see reason. I’m strongly considering just dropping out.
I feel like asking this question is wrong, but I want the information:
If I know that letting you have freedom will be hurtful (like, say, I tell you you’re going to get run over by a train, and you tell me you won’t, but I know that you’re in denial-of-denial and subconsciously seeking to walk on train tracks, and my only way to prevent your death is to manacle you in a dungeon for a few days), would you still consider the freedom terminally important? More important than the hurt? Which other values can be traded off? Would it be possible to figure out an exchange rate with enough analysis and experiment?
Regarding this, what if I told you “Earth was a giant prison all along. We just didn’t know. Also, no one built the prison, and no one is actively working to keep us in here—there never was a jailor in the first place, we were just born inside the prison cell. We’re just incapable of taking off the manacles on our own, since we’re already manacled.”? In fact, I do tell you this. It’s pretty much true that we’ve been prisoners of many, many things. Is your freedom node only triggered at the start of imprisonment, the taking away of a freedom once had? What if someone is born in the prison Raemon proposes? Is it still inherently wrong? Is it inherently wrong that we are stuck on Earth? If no, would it become inherently wrong if you knew that someone is deliberately keeping us here on Earth by actively preventing us from learning how to escape Earth?
The key point being: What is the key principle that triggers your “Freedom” light? The causal action that removes freedoms? The intentions behind the constraints?
It seems logical to me to assume that if you have freedom as a terminal value, then being able to do anything, anywhere, be anything, anyhow, anywhen, control time and space and the whole universe at will better than any god, without any possible restrictions or limitations of any kind, should be the Ultimately Most Supremely Good maximal possible utility optimization, and therefore reality and physics would be your worst possible Enemy, seeing as how reality is currently the strongest Jailer that restricts and constrains you the most. I’m quite aware that this is hyperbole and most likely a strawman, but it is, to me, the only plausible prediction for a terminal value of being free.
This should answer most of the questions above. Yes, the universe is terrible. It would be much better if the universe were optimized for my freedom.
All values are fungible. The exchange rate is not easily inspected, and thought experiments are probably no good for figuring them out.
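As a toy illustration of the “exchange rate” framing (the value names, weights, and scenario numbers here are entirely hypothetical, my own invention, not anything either of us has committed to), fungible terminal values amount to weighted components of a single utility function, where the weights are the exchange rate — and, as noted, those weights aren’t easily inspected from the inside:

```python
# Toy sketch: terminal values as weighted components of one utility
# function. All names, weights, and scenario scores are hypothetical
# illustrations, not claims about anyone's actual values.

def utility(outcome, weights):
    """Sum the weighted terminal values; the weights are the 'exchange rate'."""
    return sum(weights[v] * outcome[v] for v in weights)

# Hypothetical exchange rate -- in practice not easily inspected.
weights = {"freedom": 5.0, "avoided_harm": 1.0}

# Scenario A: manacled in a dungeon for a few days, but kept alive.
dungeon = {"freedom": 0.0, "avoided_harm": 10.0}

# Scenario B: left free, but walks onto the train tracks.
tracks = {"freedom": 10.0, "avoided_harm": 0.0}

# With these particular weights, freedom wins the trade-off:
assert utility(tracks, weights) > utility(dungeon, weights)
```

The point of the sketch is only that *given* the weights, any trade-off is mechanical; the hard empirical question in the thread is what the weights are, which thought experiments probably can’t pin down.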
You’re right, this does answer most of my questions. I had made incorrect assumptions about what you would consider optimal.
After updating on this, it now appears much more likely to me that your terminal valuation of the freedom node is such that it gets triggered by more rational algorithms that really do attempt to detect restrictions and constraints, in more than a mere feeling-of-control manner. Is this closer to how you would describe your value?
I’m still having trouble with the idea of considering a universe optimized for one’s own personal freedom to be the best thing (by default I tend to think about how to optimize for the collective summed utilities of sets of minds, rather than for one mind). It is not what I expected.
“freedom as a terminal value” != “freedom as the only terminal value”
True, and I don’t quite see where I implied this. If you’re referring to the optimal universe question, it seems quite trivial that if the universe literally acts according to your every will with no restrictions whatsoever, any other terminal values will instantly be fulfilled to their absolute maximal states (including unbounded values that can increase to infinity) along with adjustment of their referents (if that’s even relevant anymore).
No compromise is needed, since you’re free from the laws of logic and physics and whatever else might prevent you from tiling the entire universe with paperclips AND tiling the entire universe with giant copies of Eliezer’s mind.
So if that sort of freedom is a terminal value, this counterfactual universe trivially becomes the optimal target, since it’s basically whatever you would find to be your optimal universe regardless of any restrictions.