It could also be understood that hitting the AI condition means that human labor productivity becomes 0. Part of the disparity problem would be that we might need to find a new rationale for giving humans resources aside from economic necessity.
It could also be understood that hitting the AI condition means that human labor productivity becomes 0.
I don’t agree with this. Using the formula “labor productivity = output volume / labor input use” (which I grabbed from Wikipedia, which is maybe not the best source, but it seems right to me), if “labor input use” is zero and “output volume” is positive, then “labor productivity” is +infinity.
Division by zero is undefined and is not guaranteed to correspond to +infinity in all contexts. In this context there might be a difference between the limit as labor approaches 0 and the case where labor is completely absent. This is the difference between paying your employees 2 cents and not having employees at all. If you don’t have to acquire the inputs, you waste zero resources bargaining for them, and a firm that pays no wages is not enriching anyone (at least not through wages to workers).
If you are benefiting from a gift that you don’t have to work for at all, it is not like you “work at infinite efficiency” for it. You can get “infinite moon dust efficiency” by not using any moon dust. But if your goal is to make a margarita, moon dust is useless rather than supremely helpful.
Division by zero is undefined and is not guaranteed to correspond to +infinity in all contexts. In this context there might be a difference between the limit as labor approaches 0 and the case where labor is completely absent.
True, and in this context the limiting value when approaching from above is certainly the appropriate interpretation. After all, we’re talking about a gradual transition from current use of labor (which is positive) to zero use of labor. If the infinity is still bothersome, imagine somebody is paid to spend 1 second pushing the “start the AGI” button, in which case labor productivity is a gazillion (some enormous finite number) instead of infinity.
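The limit argument can be sketched numerically (a toy illustration of my own, not taken from the thread): holding output fixed, productivity under the “output volume / labor input use” formula grows without bound as labor input shrinks toward zero, while exactly zero labor leaves the ratio undefined.

```python
# Toy sketch: labor productivity = output volume / labor input use.
# All numbers here are made up for illustration.
def labor_productivity(output, labor_hours):
    """Output per hour of labor; grows without bound as labor -> 0."""
    if labor_hours == 0:
        # The ratio is undefined at exactly zero labor, matching the
        # "division by zero" objection above.
        raise ValueError("labor productivity is undefined at zero labor")
    return output / labor_hours

# Fixed output, shrinking labor input as automation takes over,
# down to "1 second spent pushing the 'start the AGI' button".
output = 100.0
for hours in [1000.0, 1.0, 1.0 / 3600.0]:
    print(f"{hours:>12.6f} hours -> productivity {labor_productivity(output, hours):,.0f}")
```

The last line prints an enormous but finite number, which is the “gazillion” interpretation: the limit from above, not a literal 0/0.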
If you are benefiting from a gift that you don’t have to work for at all, it is not like you “work at infinite efficiency” for it.
You seem to be arguing against the definition of labor productivity here. I think though that I’m using the most common definition. If you consider for example Our World In Data’s “productivity per hour worked” graph, it uses essentially the same definition that I’m using.
If AGI becomes available then it would replace the labor of AI researchers too. (At least it would if we assume that AGI is cheaper to operate than a human. But that seems almost certain, since humans are very expensive.)
In any case I’m not sure it really makes sense to talk about the productivity of people who aren’t employed. I’m considering the economy-wide stat here.