I would love to see someone actually do this.
1. It might get people to take the idea of self-replicating AI seriously.
2. The abundance provided by something like this might convince people that they don't actually need AGI to solve all of the world's issues.
#2 is why I’m coming up with this scheme despite my substantial p(doom). I think we can do something like this with subhuman (hence non-dangerous) levels of AI. Material abundance is one of the things we expect from the Singularity. This provides abundance without superhuman AI, reducing the impetus toward it.