The gazillions of other copies of you are not relevant unless they exist in universes that are exactly like yours from your observational perspective.

That said, your point is interesting, but it just gets back to a core problem of the Simulation Argument (SA) itself: how do you count up the set of probable universes and properly weight them?

I think the correct approach is to project into the future of your multiverse, counting the future worldlines that could simulate your current existence, weighted by their probability.

So if it's just one AI in a box without much computing power, you shouldn't take it very seriously; but if it looks like that AI is going to win and control the future, then you should take it seriously.
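The weighting scheme above can be sketched as a toy calculation. This is only an illustration with made-up numbers, not a claim about actual probabilities: each candidate future worldline contributes its probability times the number of simulations of your current observer-moment it could run, and your credence of being simulated compares that expected count against the single unsimulated original.

```python
# Toy sketch with hypothetical numbers: weight each future worldline by
# (probability of that future) x (simulated copies of you it would run).
worldlines = [
    (0.90, 0),          # boxed AI never gains real computing power
    (0.10, 1_000_000),  # AI wins, controls the future, simulates you
]

# Expected number of simulated copies of your current observer-moment.
expected_sim_copies = sum(p * n for p, n in worldlines)

# Credence that you are one of the simulated copies rather than the
# one original, under naive self-location reasoning.
p_simulated = expected_sim_copies / (expected_sim_copies + 1)
print(p_simulated)
```

On these (arbitrary) numbers, even a 10% chance of the AI winning dominates the count, which is the sense in which a likely-to-win AI should be taken seriously while a weak boxed one should not.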