My symbiotic-ecology-of-software-tools scenario was not a serious proposal for the best strategy toward Friendliness. I was trying to increase the plausibility of getting SOME return at SOME cost, even granting that AIs could produce value.
I seem to have stepped onto a cached thought.
I’m afraid I see the issue as clear-cut: you can’t get “some” return; you can only win or lose (the probability of getting there is, of course, more amenable to small nudges).
Making such a statement significantly raises the standard of reasoning I expect from a post. That is, I expect you to be either right, or at least a step ahead of the person you are communicating with.