All of this being said, what do you think SIAI should do differently? FAI has to be built. It has to be built by a team of humans. SIAI will do its best to make sure those humans are aware of problems like this, and that's where it would be nice to know some concrete steps they could take to mitigate the risk. But aside from that, are there any other alternatives?
One important question is how long to delay attempting to build FAI, which requires balancing the risks that you’ll screw it up against the risks that someone else will screw it up first. For people in the SIAI who seriously think they have a shot at making FOOM-capable AI first, the primary thing I’d ask them to do is pay more attention to the considerations above when making that calculation.
But given that I think it’s extremely likely that some wealthier organization will end up steering any first-mover scenario (if one occurs, which it may not), I think it’s also worth putting some work into figuring out what the SIAI could do to get those people (whoever they end up being) to behave altruistically in that scenario.