Humans are trained on a tiny unique subset of available training data. I would expect multiple instances of a set of AI software trained on close to the same set of data to think very similarly to each other, and not provide more creative capability than a single AI with more bandwidth.
That’s a good point. I guess I don’t expect this to be a big problem because:
1. I think 1,000,000 copies of myself could still get a heck of a lot done.
2. The first human-level AGI might be way more creative than your average human. It would probably be trained on data from billions of humans, so all of those different ways of thinking could be latent in the model.
3. The copies can potentially diverge. I’m expecting the first transformative model to be stateful and able to meta-learn. This could be as simple as giving a transformer read and write access to an external memory and training it over longer time horizons. The copies could meta-learn on different data and different sub-problems and bring different perspectives to the table.
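To make the divergence idea in point 3 concrete, here's a toy sketch (purely illustrative, not a real architecture): copies share the same frozen weights but each gets its own external read/write memory, so processing different data streams makes them diverge. The class and method names here are hypothetical stand-ins, and the "memory" is just a dict rather than a learned retrieval mechanism.

```python
# Toy sketch: identical copies of a model diverge via separate external memories.
# `ExternalMemoryAgent` and its methods are hypothetical, for illustration only.

class ExternalMemoryAgent:
    def __init__(self, shared_weights):
        self.weights = shared_weights  # identical across copies (the trained model)
        self.memory = {}               # per-copy external memory: a read/write store

    def write(self, key, value):
        self.memory[key] = value       # persist something learned during deployment

    def read(self, key):
        return self.memory.get(key)    # recall it when facing later problems

    def solve(self, problem):
        # A real system would condition a transformer on retrieved memories;
        # here we just record which sub-problem this copy has worked on.
        self.write(problem, f"notes-on-{problem}")
        return self.read(problem)

weights = {"layer0": [0.1, 0.2]}       # stand-in for frozen pretrained weights
copy_a = ExternalMemoryAgent(weights)
copy_b = ExternalMemoryAgent(weights)

copy_a.solve("protein-folding")        # each copy meta-learns on different data
copy_b.solve("chip-design")

# The copies started identical but now hold different memories.
print(copy_a.memory == copy_b.memory)  # False: they have diverged
```

The point of the sketch is just that the shared weights stay fixed while the per-copy state accumulates differently, which is all that's needed for the copies to bring different perspectives to the table.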