“these trillions of people also cared, very strongly, about making giant cheesecakes.”
Uh oh. IMO, that is a fallacy. You introduce a quite reasonable scenario, then inject some nonsense, without any logic or explanation, to make it look bad.
You should explain when, on the way from a single sentient AI to voting rights for trillions, the cheesecakes came into play. Is it that all sentient beings are automatically programmed to like creating big cheesecakes? Or something equally bizarre?
Subtract the cheesecakes and your scenario is quite OK with me, including 0.1% of the galaxy for humans and 99.9% for AIs. 0.1% of the galaxy is about 200 million stars...
BTW, it is most likely that without sentient AI there will be no human (or human-originated) presence outside the solar system anyway.
Well, so far, my understanding is that your suggestion is to create a nonsentient utility maximizer programmed to stop research in certain areas (especially research into creating sentient AI, right?). Thanks, but I believe I have a better idea.