Alright, what I got from your post is that if you know the definition of an FAI and can instruct a computer to design one, you’ve basically already made one. That is, having a precise definition of the thing massively reduces the difficulty of creating it — for example, when people ask ‘do we have free will?’, defining free will greatly reduces the complexity of the problem. Is that correct?
Alright, what I got from your post is that if you know the definition of an FAI and can instruct a computer to design one, you’ve basically already made one.
Yes. Although to be clear, the most likely path probably involves a very indirect form of specification based on learning from humans.
Ok. So why could you not replace ‘encode an FAI’ with ‘define an FAI’? And you would place all the restrictions I detailed on that AI. Or is there still a problem?