It is not very important, but since you mentioned it:
The interval of convergence of the Taylor series of 1/(1-z) at z=0 is indeed (-1,1).
But “1/(1-z) = 1 + z + O(z^2) for all z” does not make sense to me.
1/(1-z) = 1 + z + O(z^2) means that there is a constant M such that |1/(1-z) - (1 + z)| is no greater than M*z^2 for every z close enough to 0.
It is a statement about the behavior of 1/(1-z) - (1 + z) as z tends to 0, not about z ranging over (-1,1).
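To make this concrete (the constant M = 2 and the neighbourhood |z| ≤ 1/2 below are just one possible choice), one can compute the remainder explicitly:

$$\frac{1}{1-z} - (1+z) = \frac{1-(1+z)(1-z)}{1-z} = \frac{z^2}{1-z},$$

so |1/(1-z) - (1+z)| ≤ 2z^2 whenever |z| ≤ 1/2 (take M = 2), whereas no single M works on all of (-1,1), since z^2/(1-z) is unbounded as z approaches 1.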