So we are considering a small team with some computers claiming superior understanding of what the best set of property rights is for the world?
No. That would be worked out by the FAI itself, as part of calculating all of the implications of its value systems, most likely using something like CEV to look at humanity in general and extrapolating their preferences. The programmers wouldn’t need to, and indeed probably couldn’t, understand all of the tradeoffs involved.
If they really are morally superior, they will first find ways to grow the pie, then come back to changing how it gets divided up.
There are large costs to that. People will die and suffer in the meantime. Parts of humanity’s cosmic endowment will slip out of reach due to the accelerating expansion of the universe, because you weren’t willing to grab the local resources needed to build probe launchers to get to them in time. Other parts will remain reachable, but will have decreased in negentropy due to stars having continued to burn for longer than they needed to. If you can fix these things earlier, there’s a strong reason to do so.