“It’s hard to see on an emotional level why a genie might be a good thing to have, if you haven’t acknowledged any wishes that need granting.”
Why not?
Personal wishes are the simplest ones.
Fulfilment of minimal needs, plus a really high level of security, might be the first thing to ask for. That leaves plenty of time for your wishes to come to you naturally. Maybe even effortlessly.
The wishes of your friends come with all the limitations of your (and then their) security. Now we've got some kind of working recursion.
“I suppose there could be some section of the procedure where you’ve got to do a septillion operations...”
Just so—and even far worse than that. To get a “working” set of wishes, I’d want to simulate the results of each option really well first.
“Boy, if I had human intelligence I sure could get a lot more bananas.”
Right—and even worse… again. There is nothing wrong with the bananas I’ll order on the first iteration! The problem starts with the couple of slaves that any halfway decent utopist proposed for the “humblest farmer”. It goes all the way downhill from there.
Well. I do know that I’ll ask the supercomputer for some virtual[-reality] worlds. And I do know the ethics of evolution (only “evil” worlds would develop life and minds of their own… and it looks like a good idea to have evolution “turned on” for that very purpose). But at the point where every person we count as a “real person” has their own “virtual” world full of “virtual” persons—that’s where it gets really complicated and weird. Same with “really strong AI” and advanced robots. We get to the same “couple of slaves” on an entirely new level, that’s what we do.