How about “if I contain two subagents with different goals, they should execute Pareto-improving trades with each other”? This is an aspect of “becoming more rational”, but it’s not very well described by your maxim, because the maxim includes “your goal” as if that’s well-defined, right?
Unrelated topic: Maybe I didn’t read carefully enough, but intuitively I treat “making a plan” and “executing a plan” as different, and I normally take the word “planning” to refer just to the former, not the latter. Is that what you mean? Because executing a plan is obviously necessary too…
Shooting from the hip: The maxim does include “your goal” as if that’s well-defined, yeah. But that’s fair, because the maxim is a convergent instrumental goal, and a system which doesn’t have goals at all doesn’t have convergent instrumental goals either. To put it another way: it’s built into the definition of “planner” that there is a goal, a goal-like thing, something playing the role of a goal, etc.
Anyhow, I would venture to say that insofar as “my subagents should execute Pareto-improving trades” does not in fact further my goal, it’s not convergently instrumental; and insofar as it does further my goal, it’s a special case of self-improvement, rationality, or some other shard of P2B.
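Purely to make “Pareto-improving trade” concrete, here’s a toy sketch; everything in it (the utility functions, the holdings, the proposed trade) is invented for illustration, not anything from the post:

```python
# Toy sketch: two subagents with different goals, and a trade that is
# executed only if it is Pareto-improving (neither subagent ends up
# worse off, and at least one ends up strictly better off).
# All numbers here are made up for illustration.

def utility_a(apples: int, oranges: int) -> float:
    # Subagent A mostly values apples.
    return 2.0 * apples + 0.5 * oranges

def utility_b(apples: int, oranges: int) -> float:
    # Subagent B mostly values oranges.
    return 0.5 * apples + 2.0 * oranges

def pareto_improving(before: tuple, after: tuple) -> bool:
    # No subagent loses, and at least one gains.
    return all(a >= b for a, b in zip(after, before)) and any(
        a > b for a, b in zip(after, before)
    )

# Current holdings: A has (1 apple, 3 oranges), B has (3 apples, 1 orange).
a_now, b_now = (1, 3), (3, 1)
# Proposed trade: A gives B two oranges in exchange for two apples.
a_new, b_new = (3, 1), (1, 3)

before = (utility_a(*a_now), utility_b(*b_now))   # (3.5, 3.5)
after = (utility_a(*a_new), utility_b(*b_new))    # (6.5, 6.5)

if pareto_improving(before, after):
    a_now, b_now = a_new, b_new  # execute the trade: both subagents gain

print(before, "->", after)
```

Note that the sketch presupposes each subagent has a well-defined utility function, and whether the overall system’s “my goal” is furthered by the trade is a separate question; that gap is exactly what the disagreement above is about.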
Re point 2:
We take “planning” to include things that are relevantly similar to this procedure, such as following a bag of heuristics that approximates it. We’re also including actually following the plans, in what might more clunkily be called “planning-acting.”