Why bother with friendly AI? Surely it will stumble upon the built-in objective rules of morality too. Hm, but it may not follow them, and might instead tile the universe with paperclips. This might sound crazy, but why don’t we follow the AI’s lead on this? Maybe “paperclip” the universe with utopia instead of making giant cheesecakes or piles of pebbles or turning all matter into radium atoms, or whatever else “objective morality” prescribes?