Eliezer,
What do you think of the following idea, which I have already posted on the SL4 mailing list:
Bootstrap the FAI by first building a neutral, obedient AI (OAI) that is constrained in such a way that it takes no action other than answering questions. Once you have that, could it then be easier to build an FAI? The OAI could be a tremendous help in answering difficult questions and proposing solutions.
This reminds me of the process of bootstrapping compilers: http://en.wikipedia.org/wiki/Bootstrapping_(compilers) http://cns2.uni.edu/~wallingf/teaching/155/sessions/session02.html
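For comparison, here is a minimal sketch of the compiler-bootstrapping ladder in Python (my own illustration, not taken from the linked pages). "Compiling" is modelled simply as turning source text into a namespace of definitions, and names like FULL_COMPILER_SRC and stage0_compile are purely illustrative: a hand-written stage 0 builds the full compiler once, after which the compiler can rebuild itself, which is the usual fixed-point check.

```python
# Toy bootstrapping ladder (illustrative only): the "full compiler" is written in the
# same language it compiles, so a minimal hand-written stage 0 can build it once,
# after which it can rebuild itself.

FULL_COMPILER_SRC = '''
def compile_program(src):
    """Compile a chunk of source into a namespace of the names it defines."""
    namespace = {}
    exec(compile(src, "<program>", "exec"), namespace)
    return namespace
'''

def stage0_compile(compiler_src):
    """Minimal hand-written bootstrap step: just enough to build the full compiler."""
    namespace = {}
    exec(compiler_src, namespace)
    return namespace["compile_program"]

stage1 = stage0_compile(FULL_COMPILER_SRC)             # compiler built by stage 0
stage2 = stage1(FULL_COMPILER_SRC)["compile_program"]  # compiler rebuilt by itself

# Fixed-point check: both stages compile an ordinary program to the same result.
assert stage1("answer = 1 + 2 * 3")["answer"] == stage2("answer = 1 + 2 * 3")["answer"] == 7
```

The parallel to the OAI proposal is that each stage only has to be good enough to build the next one, not to do the whole job itself.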