The thing is, humans do that by… well, not being formal systems. Which pretty much requires you to keep a good fraction of the foibles and flaws of a nonformal, nonrigorously rational system.
You’d be more likely to get FAI, but FAI itself would be devalued, since now it’s possible for the FAI itself to make rationality errors.
More likely, really?
You’re essentially proposing giving a human Ultimate Power. I doubt that will go well.
I dunno. Humans are probably less likely to go horrifically insane with power than the base chance of getting FAI at all.
Your chances aren’t good, just better.