I think labeling goods as “fungible” or “non-fungible” is a category error. Everything trades off against something.
Either you value your fingers more than what [some specific amount of money] will buy you or you don’t. If you value your fingers more, then keeping them is the right call for you.
Chaos in complex systems is guaranteed but also bounded. I cannot know what the weather will be like in New York City one month from now. I can, however, predict that it probably won’t be “tornado” and near-certainly won’t be “five hundred simultaneous tornadoes level the city”. We know it’s possible to build buildings that can withstand ~all possible weather for a very long time. Likewise, I imagine that the kind of puppet-master you’re describing could build systems that operate within predictable bounds robustly and reliably enough to more or less guarantee broad control.
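The “guaranteed but bounded” point can be made concrete with a standard toy example (my own illustration, not from the original discussion): the logistic map in its chaotic regime. Two trajectories starting almost identically diverge until they are effectively uncorrelated, yet every iterate provably stays inside [0, 1].

```python
# Illustration: chaos is sensitive to initial conditions yet bounded.
# Uses the logistic map x_{n+1} = r * x * (1 - x) with r = 4, a
# well-known chaotic regime whose trajectories never leave [0, 1].

def logistic_trajectory(x0, r=4.0, steps=100):
    """Iterate the logistic map from x0 and return the full trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.2)          # one starting point
b = logistic_trajectory(0.2000001)    # a nearly identical starting point

# Sensitive dependence: the tiny initial gap is amplified exponentially,
# so long-run prediction of the exact state is hopeless.
divergence = max(abs(x - y) for x, y in zip(a, b))
print("max divergence:", divergence)

# Boundedness: despite the chaos, neither trajectory ever escapes [0, 1].
print("both bounded in [0, 1]:",
      all(0.0 <= x <= 1.0 for x in a + b))
```

So “I can’t predict the state” and “I can predict the envelope the state lives in” are entirely compatible, which is the property the weather analogy is pointing at.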
Caveat: the transition from seed AI to global puppet-master is harder to predict than the end state. It might plausibly involve psychohistorian-style nudges informed by superhuman reasoning and modeling. But I’d still expect the optimization pressure a superintelligence brings to bear to render the final outcome of that transition grossly overdetermined.