So I guess I’d want to create FAI that never integrates any of its experiences into itself in a way that we (or it) would find precious, or unique and meaningfully irreproducible.
It’s only a concern about the initial implementation. Once things get rolling, FAI is just another pattern in the world, so it optimizes itself according to the same criteria as everything else.