You are completely missing the point. If we’re all going to agree that AI is possible, and agree that there’s a completely crappy but genuinely existent example of AGI right now, then it follows that getting AI up to dangerous and/or beneficial levels is a matter of additional engineering progress.
Progress in (1) the sense of incrementally throwing more resources at AIXI, or (2) forgetting AIXI and coming up with something more parsimonious?

Because, if it’s 2, there is no other AGI to use as a starting point for incremental progress.
Is that what they tell you?