Nothing needs to be justified in the philosophical sense of the term.
I think justification is important, especially in matters like AI design, since a uFAI could destroy the world.
In the case of AI design in general, consider the question "Why should we program an AI with a prior biased towards simpler theories?" I don't think anyone would deny that a more detailed answer than "It's our best guess right now" would be desirable.
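For concreteness, here is a minimal sketch (my own illustration, not anything from the comment; the hypotheses and description lengths are made up) of one standard way such a bias gets encoded: an Occam-style prior that gives each hypothesis weight proportional to 2^(-description length in bits), so simpler theories start out more probable.

```python
# Sketch of an Occam-style prior: weight each hypothesis by
# 2^(-description length), then normalize. The hypotheses and
# bit-lengths below are hypothetical, purely for illustration.

hypotheses = {
    "constant": 3,   # e.g. "the sequence is all zeros"
    "linear":   8,   # e.g. "the sequence increases by one"
    "lookup":  40,   # e.g. "memorize the observed data verbatim"
}

# Unnormalized weights: shorter (simpler) descriptions get more mass.
weights = {h: 2.0 ** -bits for h, bits in hypotheses.items()}
total = sum(weights.values())

# Normalize into a proper prior distribution.
prior = {h: w / total for h, w in weights.items()}

for h, p in prior.items():
    print(f"P({h}) = {p:.4f}")
```

The open question the comment points at is exactly why this weighting scheme, rather than some other, is the right starting point.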