I wrote the post while tired last night, probably not a good idea.
The specific numbers were not the point (you could make them much smaller across the board and I wouldn't object). What I was trying to get across is the general shape of the problem, and the way the actions appropriate for each world interfere with one another.
Do you think our knowledge of AI is so limited that we shouldn't even try to think about shaping its development?