Thank you for making the point about existing network efficiencies! :)
The assumption, years ago, was that AGI would need 200x as many artificial weights and biases as a human’s 80 to 100 trillion synapses. Yet we’re now seeing models beat our MBA exams using only a fraction of that count! The article above pointed to the difference between “capable of 20%” and “impacting 20%”, and I would guess we’re already at the “20% capability” mark in terms of the algorithms themselves. Whenever a major company wants to, it can already reach human-level results with narrow AI whose parameter count is roughly 0.05% of our synapse count.
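Just to put those percentages in perspective, here’s a quick back-of-envelope sketch using the figures above (the 80 to 100 trillion synapse range, the 200x assumption, and the 0.05% ratio come from my comment; the rest is plain arithmetic):

```python
# Back-of-envelope only; the inputs are the figures quoted above, not measurements.
synapses_low, synapses_high = 80e12, 100e12  # human brain: 80 to 100 trillion synapses

# Old assumption: AGI needs ~200x as many weights and biases as we have synapses.
agi_assumed = (200 * synapses_low, 200 * synapses_high)

# Claim: narrow AI reaches human-level results at ~0.05% of the synapse count.
narrow_ai = (0.0005 * synapses_low, 0.0005 * synapses_high)

print(f"Old AGI assumption: {agi_assumed[0]:.1e} to {agi_assumed[1]:.1e} parameters")
print(f"0.05% of synapses:  {narrow_ai[0]:.1e} to {narrow_ai[1]:.1e} parameters")
```

So the 200x assumption works out to something like 1.6e16 to 2e16 parameters, while the 0.05% figure lands around 40 to 50 billion, which is roughly the scale of today’s large language models.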
I wonder whether, with no meaning assigned to your bot’s blurbs, GPT found its own, new meanings? That makes me worry about hidden operations...