I think there’s a distinction between the environment being in ~equilibrium and you wrestling a resource out of that equilibrium, versus you being part of a greater entity which wrestles resources out of the equilibrium and funnels them to your part?
That’s a good point, though I’d word it as an “uncaring” environment instead. Let’s imagine though that the self-improving AI pays for its electricity and cloud computing with money, which (after some seed capital) it earns by selling use of its improved versions through an API. Then the environment need not show any special preference towards the AI. In that case, the AI seems to demonstrate as much vertical generality as an animal or plant.
That seems reasonable to me.